5070 faster ~10% over 4070 Super: 33 fps vs 30 fps
5080 faster ~15% over 4080 Super: 34.5 fps vs 30 fps
5090 faster ~20% over 4090: 36 fps vs 30 fps
*Raw performance increase. More optimistic estimates: +5% / +10%.
4000 series VRAM lifespan at the day of release = 4 years left
5000 series VRAM lifespan at the day of release =
@@lcg-channel It's going to probably be 30-40% in rt games because they have packed in more rt cores as well as better rt cores. This is based on their benchmarks. Got to wait for reviews.
Oh for sure. Each generation of GPUs introduces the issue of what is considered good for the generation in general vs what is considered good for each individual person. The 5070 will not be the performance king that Nvidia is marketing it as. However, for someone in your position, it’ll be a nice upgrade.
@ yup. As much as I absolutely want a 5090, I'm pretty sure I would be sleeping on the couch if I got one (and all the other parts I would need with it). In fact, my wife would probably move the couch to the garage and make me sleep in there, and we are still under the polar vortex here in Ohio. 😂😂😂
I might go with a cheaper 5070 Ti as a placeholder until the next gen comes out, or wait and see what AMD cooks up, cos this gen feels not much better than last gen, to the point that a 4090 might be the better buy as the price will probably drop.
@@Kuroganemk2 Weird how people say a 30% uplift is not much better than last gen, but then say they are buying amd which had a literal 30% decrease in performance and vram lol
The 5090 won't be sold out for years to come. That said, imo, just like with the 40 series, the 5090 is the best-looking value if you're gaming at 4K/240Hz or doing heavy workloads and need a card to last for years without feeling the need to upgrade every gen or mid-gen.
LMFAO... Next: the 6090 is software only. That's right folks, we won't make a hardware video card anymore; all of you can now just buy our new 6090 software and bang, you have the fastest, state-of-the-art, mind-blowing GPU. All with DLSS 6 and Frame Gen 6.0.
that will likely happen. you will buy a specific AI-supporting card and the rest is software. You don't need graphics anymore, everything will be hallucinated, generated through neural models and pixels. Like a dream, a simulation of nothingness.
2nd comment, on a more serious note... It's great that you want to dive in on the independent testing, and having to go out and buy them shows your dedication to the community. Thanks again for the well broken down vid.
Thanks for actually stating it as it is with the 5070 vs the 4090... I've seen channels do nothing but spit rainbows and sunshine over that BS marketing statement and only briefly mention how it does that. It is really frustrating that because of Nvidia's deceptive and downright scummy marketing, people are actually believing it and defending Nvidia...
There is no believing it. It WILL be able to use DLSS 4 MFG and match your 4090 in gaming fps. That is not wrong. Everyone knows it is with those features. Raster is not some elite generation of frames that makes it "real" vs what AI cores generate; they just do it differently.
@@jmangames5562 This is delusional. The reason 120 fps is and feels faster than 30 fps is because the real rendered frames have ¼ the latency -- including the inputs that affect the outcome of the next frame. AI/FG frames don't help latency - in fact they increase latency, making your dismal 30 fps even worse in this regard.
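A back-of-the-envelope illustration of that latency point (all numbers below are illustrative assumptions, not measurements):

```python
# Sketch: why multi frame gen raises displayed fps but not responsiveness.
# base_fps, mfg_factor and the overhead figure are assumed values.

def frame_time_ms(fps: float) -> float:
    """Milliseconds between frames at a given frame rate."""
    return 1000.0 / fps

base_fps = 30        # frames the GPU actually renders per second
mfg_factor = 4       # 1 rendered frame + 3 generated frames ("4x" MFG)
fg_overhead_ms = 7   # assumed cost of generating/pacing the extra frames

displayed_fps = base_fps * mfg_factor          # 120 fps shown on screen
input_interval_ms = frame_time_ms(base_fps)    # input only sampled per rendered frame

print(f"Displayed: {displayed_fps} fps")
print(f"Input still sampled every {input_interval_ms:.1f} ms "
      f"(plus ~{fg_overhead_ms} ms assumed frame-gen overhead)")
print(f"Native {displayed_fps} fps would sample input every "
      f"{frame_time_ms(displayed_fps):.1f} ms")
```

The generated frames smooth the motion, but the game still only reacts to input at the rendered rate, which is the gap this comment is pointing at.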
I've just seen some UE5 Demos on the RTX 5090 with DLSS 4 MFG and you can see clearly the rubberbanding, so it's only playable with a pad or super slow mouse movement.
"Who cares" is Nvidia's entire consumer base and the gaming media. They would love to convince everyone of this mentality, that's the marketing push; so far the reception doesn't seem to show it working at all. Make no mistake, if their sales don't dip this generation they'll continue nerfing specs and "compensating" with software in the future. 70-class cards matched the previous generation's flagship performance for the last 10 years straight; now you're getting a card with raster somewhere between a 4070 and a 4070 Ti at best, maybe not even matching a 4070 Super.
we all 100% want 240fps that feels like 24fps ahh smooth so smooth and higher resolution textures that take up double the vram but are blurry and look less sharp than games from 10 years ago
Multi frame gen sounds absolutely amazing. Generating three frames between each real frame has the potential to provide a smoother and more fluid experience. It’s exciting to imagine how advancements like this could evolve gaming and graphics technology. Nvidia continues to push the boundaries of what AI can achieve, finding creative ways to enhance performance and optimize hardware. This trend shows how innovation can lead us in new directions, blending AI with traditional rendering techniques. While some might prefer native resolution and FPS, there’s no denying the potential of AI-driven features to revolutionize the gaming experience. Embracing progress while balancing raw performance and creativity opens up amazing possibilities for the future of graphics and gameplay.
I like FG. It has its use-cases. But the problem is that they are equating it to "performance", when it is not. Upscaling helps with performance. FG only helps with motion fluidity. If anything, it makes "performance" ever so slightly worse. The problem is in the way they are marketing it.
Another thing people need to take into account is that the frame number on paper says nothing about frame stability and image quality on the actual screen. If I get to run my games at 250fps with RT and Ultra everything, but it looks like vaseline smeared all over my monitor, I'm not paying you even $1 for that GPU
I have multi frame generation from 2x up to 4x in any game or video with my 7900 XTX using the Lossless Scaling program, and it works and delivers. Very, very good. Much better than anything I've tried... 8 euros well spent.
Already knew it was going to be slower, possibly slower than the 40 Super series. They compare it to the original 40 series to make it seem like it's an "improvement". In raw performance there will be no improvement without 4x fake frames.
The biggest issue is you are required to use DLSS for the 5070 to even reach said performance of a 4090, which on paper isn't a bad thing. However, DLSS was created for systems and/or parts that couldn't reach that higher-end performance in the first place... so in essence they're trying to market a feature that is meant for lower-end systems on top-of-the-line, allegedly new and improved hardware.
Nvidia is now an AI company, not a hardware company. From the specs, the 5080 is only going to be marginally faster than a 4080 in pure rasterisation, like 10-15%.
@@thetheoryguy5544 The 5090 will be 25-30% faster than the 4090, but it also has 30% more CUDA cores and can draw way more power; the rest of the stack isn't going to see that jump.
@@BOKtober the 50xx series is manufactured with 4NP process which results in a natural ~25% performance gain so any 50xx series graphics board will see at least that jump in performance.
So for someone like me who is looking at replacing their old 1060 would the 5070 be worth it for its price or would I be better off getting something in the 40xx once they become cheaper due to the new cards?
I think people overlook that we have 25% better ray tracing performance on most cards, and the Far Cry 6 data is WITH RT ON, so not even that data really tells us anything about the performance gain; it could be as low as 15%.
Generation-over-generation rasterization performance improvements for GPUs within the same tier (e.g., 4070 Ti to 5070 Ti) typically range between 20% and 35%. Claims of significantly higher gains are often overstated, as such advancements are constrained by current manufacturing technology. Substantial performance increases would only be achievable through a major breakthrough in manufacturing processes. For instance, if the 50 series were built on a 4 nm process and the next generation transitioned to a 1 nm process, this could yield a significantly higher performance uplift and noticeably improved power efficiency.
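To see how those per-generation numbers compound, here is a quick sketch (the +25% per generation is a hypothetical figure used purely for illustration):

```python
# Sketch: compounding generation-over-generation raster uplifts.
def compounded_uplift(per_gen_gain: float, generations: int) -> float:
    """Total performance multiplier after N generations of the same relative gain."""
    return (1.0 + per_gen_gain) ** generations

for gens in (1, 2, 3):
    print(f"{gens} generation(s) at +25% each -> "
          f"{compounded_uplift(0.25, gens):.2f}x overall")
# 1.25x, ~1.56x, ~1.95x: doubling raw raster takes roughly three such
# generations, which is why single-generation claims far above ~35% look
# suspect without a real process-node jump.
```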
If multi frame generation is what people want, why not use Lossless Scaling? I just started using it and I'll use its multi frame gen that works on every single game with my 4090, unlike DLSS multi frame gen. Idk if I see a point in upgrading to the 50 series unless you have a mid-range 20 series card or something.
Your 5070 vs. 4090 argument is like saying an EV (car) will not match an ICE because the EV doesn't have 8 cylinders, 3 turbochargers, super-sparkplugs, etc., basically the components you've become accustomed to, when all that matters is the result in a particular domain.
It all depends on the end cost put out by manufacturers. If the 5070 Ti is in the same price range as the 4070 Ti Super, then of course you go for the 5070 Ti, because the "raw power" is imo around 10-15% better. The problem will most likely be that the end price will be higher than the 4070 Ti Super's by probably 20%.
Until we get some third-party benchmark tests, all this is highly speculative, but:
- "per core" uplift from Lovelace to Blackwell could be relatively minor - just look at how many more cores they put onto the 5090 and how large that chip is. If the new cores were that much of an improvement, would they really have needed 21,760 CUDA cores vs the 4090's 16,384?
- total number of cores on non-5090 cards isn't that much higher when compared to their Lovelace equivalents (5070 = 6,144, 4070 = 5,888, 4070 Super = 7,168 ... 5070 Ti = 8,960, 4070 Ti = 7,680, 4070 Ti Super = 8,448)
- outside of the 5090, amounts of VRAM are identical between 40xx and 50xx, as are memory bus widths, with the only uplift coming, apparently, from the switch from GDDR6X to GDDR7.
So outside of the 5090 - a sentence that will probably become a mantra with this gen - 50xx performance could be pretty "meh" indeed. It also doesn't fill me with confidence that one of the earliest leaks mentioned Nvidia comparing the 5070 to the 4070 Ti (not to the Ti Super) in internal comparison charts, something they repeated during the launch event. If the 5070 would beat the Ti Super, surely they would announce that fact. Speaking of which: it's also quite disheartening to see how they used all the AI stuff during that presentation to give a probably false impression of the 5070's capabilities. 5070 = 4090 levels of performance? Yeah... right.
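For anyone who wants to eyeball those core counts as ratios, here is a quick sketch using the figures quoted above (announced/leaked numbers, so treat them as provisional):

```python
# Sketch: CUDA-core ratios between 50-series cards and 40-series counterparts,
# using the counts quoted in the comment above.
cores = {
    ("5090", "4090"): (21760, 16384),
    ("5070", "4070"): (6144, 5888),
    ("5070", "4070 Super"): (6144, 7168),
    ("5070 Ti", "4070 Ti Super"): (8960, 8448),
}

for (new, old), (n_new, n_old) in cores.items():
    ratio = n_new / n_old
    print(f"{new} vs {old}: {ratio:.2f}x the cores ({(ratio - 1) * 100:+.0f}%)")
# Outside the 5090 (+33% cores), the deltas are single-digit or even negative,
# so clocks, memory and architecture have to carry most of any claimed uplift.
```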
If it gives better fidelity with just a 6%-of-a-second (~60 ms) penalty, I think it's good. The only people who will not like 50-60 ms of latency are pro/competitive gamers, which is what percentage of the market?
What this video didn't specifically point out is that none of the 5000 series cards beats my 4090 except for the 5090. For me, anything other than the 5090 will get me worse performance overall, despite what NVIDIA and its CEO Jensen Huang claim.
So it is like a 25% increase in performance for a 25% increase in price from 4090 to 5090. It is basically a 4090 Ti. It is not an actual upgrade to the family unless it is the same price with increased performance.
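Rough perf-per-dollar arithmetic behind that point (the $1,599/$1,999 MSRPs and the ~30% uplift are announced or commonly estimated figures, not measured results):

```python
# Sketch: performance per dollar, 4090 -> 5090, under assumed numbers.
old_price, new_price = 1599, 1999   # launch MSRPs in USD (assumed)
assumed_uplift = 0.30               # assumed raster gain, 4090 -> 5090

price_ratio = new_price / old_price
print(f"Price increase:         {price_ratio - 1:.0%}")
print(f"Perf per dollar change: {(1 + assumed_uplift) / price_ratio - 1:+.0%}")
# ~25% more money for ~30% more performance moves perf/$ only a few percent,
# which is why it reads more like a "4090 Ti" than a new tier.
```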
So if I had purchased a 4090 back in the day, and I'm now seeing a new graphics card need all the bells and whistles in a game to get the same performance, should I be pissed that the new card, which is way cheaper, is technically giving me the same performance... from the same company?
I think the real tell will be seeing how these new-gen architectures perform... the raw stats alone won't tell the whole picture. I'm eager to see some real testing when these cards become available. I'll check back then. Great video though, summing up what we know so far.
What do you think of the new Nvidia RTX 50 Series GPUs?
HELP SUPPORT MY WORK AND GET DISCORD ACCESS FOR ONLY $1: www.patreon.com/ErockOnTech
Help me grow my Instagram: instagram.com/erockontech
Feel like it's just a marketing scam but that's business. I have a 4070 Ti Super, BIOS-flashed then overclocked to match a 4080, and I'm cool with that. I'll wait till the 60 series.
Seems the biggest change down the line is AI TOPS, which is what's being utilized to create fake frames I imagine... but if you can't tell the difference between fake and natural, does it matter?
even the AI TOPS are not apples to apples, they are lying to or "misleading" people... FP8 on the 4090 vs FP4 on the 5090
If you're not into the new tech you might want to get out of gaming, cause it's headed in this direction. We are maxed out on raw performance, it's getting too expensive, and we know y'all don't buy expensive GPUs.
I doubt it would be as slow as you say, otherwise the NVIDIA CEO would not dare to compare it with the RTX 4090. He would not lie; he is not dumb enough to ruin his business reputation.
Multi frame gen sounds absolutely disgusting. Three frames are generated for every one real frame. I can't imagine the amount of latency and artifacts this will have. Nvidia is once again using these cheap AI gimmicks to try and justify cutting corners on the hardware and raw performance. I absolutely despise this trend. We are going completely backwards in terms of video game graphics and hardware/raw performance simply for the sake of shitty overrated AI. This is pathetic at best. Native resolution and real FPS will always be better than this fake AI crap. Raw native rendering all day, every day.
single fg was already enough latency, how the hell will this feel?
They aren't cutting corners, they are just developing the chips around AI performance metrics rather than the typical ones.
As you still buy a card.
Whether we like it or not, this is a milestone we'll have to get through if we eventually want AI that isn't "shitty overrated AI". The tech will eventually get good enough to be nearly artifact-free, even if that might be another 10 years from now. But I do agree it sucks that we're in this transitional period and we're experiencing the worst part of it...
@@gringoboy701 I really wanna get an AMD card for its FPS per dollar. I don't care much for the RT and AI upscaling; however, AMD has always been a little strange/problematic with editing software and other workloads. I want to get into content creation and Nvidia just seems better at handling that. I just don't wanna fork over an arm and a leg for a good card from Nvidia 😭
on RAW performance... the advancement of the 5000 series over the 4000 is MINIMAL.
disgusting.
30% is not minimal lol
@@Dempig On one card.
@@Dempig Yep, the 4090 was "30% better" than the 4080. It's either believe the specs or the charts for the 5080. I personally think the new chip design and memory speed increase should push it further than the 5% some people claim.
@@Dempig especially for the same price. For the first time in the bitcoin/AI era we have better price/perf.
@@Dempig do it over the refreshed cards and not the base ones
The real question is why a monster like the 4090 can't use multi frame gen, because I'm not buying the "it doesn't have enough tensor cores to handle it" bs
Yes that is it. Massive increase to AI cores to cover the 2 additional frames. Nvidia provided the rest of the updated features in DLSS4 to previous gen.
its not bs bro... people like you should just leave the internet fam..
@@RED--01 can i have your 4090 when you get the 5090?
If the 5070 can do Multi Frame Gen with 988 AI TOPS, I am pretty sure the 4090 at 1321 AI TOPS can do it no problem. But this 5th generation is just a scam software update.
Why can't the AMD 9800 CPU be as fast as the AMD 9950 CPU in video processing apps?
This is basically the 20 series all over again.
Exactly. 4090 will have a long life like the venerable 1080.
@@wolfen69 well now that we have lossless scaling , most cards really
@@wolfen69 dude anything that has 80 or 90 on it will have a long life. tf are you talking about.
@@sfguzmani there are better and worse generations, 4000 and 1000 were definitely the better ones. "tf are you talking about"
@@sfguzmani whoa, chill. I was responding to a comment about this being the 20 series all over again.
This is a classic example of Nvidia's bullshit marketing. Deception, lies, everything overpriced, and for a good part of the 5xxx series launch, I have zero doubt that availability is going to be pretty much non-existent. The fact they're marketing this card as a 4090 replacement is straight up disgusting.
We have this situation because nvidia has won the mindshare game. Sadly most buyers have no clue they're being ripped off. It's a vicious cycle. AMD keeps kicking themselves in the dick instead of pricing their products to gain any marketshare.
@@jerrycurls88 It really is a lose lose situation for consumers. Nvidia has deceptive marketing and AMD has garbage marketing lol
@@dirtydoge756 yup, both companies have normalized "mid-range" to mean 500-600. I'm old enough to remember 250 would buy a nice mid range GPU.
worst of all, for 2000 dollars
The reason Huang's claim went viral is not artificial intelligence; it's genuine stupidity.
The more you buy, the more you save 🤣
He used AI for the marketing? :P
Amen!
Investor stupidity and that’s about it, anyone who’s used Frame Generation knows it’s not worth a shit.
Artificial Intelligence, no, Genuine Stupidity is fuckin funny lmao
I will buy all your 4090s for $600, hit me up...
I'll pay $650 to help a fellow brother 😁
@@Albreckno no, I’ll pay $650, and throw in a complimentary free pizza
I'll sell u my 3080 for 1000$ since it's more real frames than the 4090 Frames.
@@briananeuraysem3321😂😂😂
@@rkwjunior2298 ufff 🙈🙊
DLSS and framegen don't count when we're talking about performance. They introduce artifacting, smearing, and all sorts of image quality problems, as well as extra latency. They're post-processing like throwing filters on an image and cost significantly less than actually generating the image. We're not going to let nvidia get away with equating these things with actual card performance so they can make a product more cheaply and claim it's better, like saying 5070 = 4090. Not to mention this is entirely in software, so they're choosing not to support MFG on the 40 series of cards just so they can sell this crappy 50 series of cards claiming "it's so much better than the 40 series!"
Have you watched the latest Digital Foundry and Hardware Unboxed videos on DLSS 4 and FSR 4? This stuff is getting better and better with every iteration. The latest GPUs will likely provide enough horsepower for traditional rendering at 4K or 1440p depending on price point. It's when you enable path tracing that the AI software will show its worth, making incredibly taxing effects work pretty damn well. I know this cos I've played Indy with full PT on my 4080. Without PT you get a very solid 4K experience. With it you use the software features to get an incredible experience with occasionally noticeable artifacting. But these artifacts are reducing over time. In the DF video they showed that latency isn't even increased greatly when going from 2x FG to 3 or 4 thanks to further software features. My only issue with what Nvidia is doing is that their prices are too high and they are too stingy with VRAM (the 5080 should have 24GB of VRAM at the same price). But their overall approach is most likely correct I feel. We shall wait and see though how the reviews come out and how well future games implement these features.
@@formulaic78 The trend is "make upscalers not terrible and then it's okay if we can only render 10 fps because we can use ML to upscale and generate frames to get 100+ fps!" which furthers the industry trend of using UE5 oversimplifications to produce games that run worse and don't generally look better. The latency on a game running 10 fps boosted to 150+ with DLSS/FG is unacceptable. Just go play a game from 10 years ago that looks hilariously just as good as a modern game, and notice you can crank the settings to max and get 200 fps on modern hardware with native rendering. No artifacting, input delay as low as your hardware can handle. The correct comparison is not "oh it's only 7 more ms of input lag to get 4 frames instead of 2", it's "oh it's only 60ms more latency to use this garbage over optimized native rendering." Being inefficient on purpose and handwaving it as "ML will fix it" is making games with worse performance and which look worse, all while jacking up the prices and saying actually it's the opposite.

We used to have game generations where new engines came out with crazy visual breakthroughs not using RT/PT and didn't require DLSS, and a generation or two later the hardware ran it faster than anything on the market could handle, with few exceptions except poorly optimized games and engines. We don't get crazy visual breakthroughs anymore, and we don't get that kind of performance anymore; instead the focus is on "look at how close we can get to native rendering using 300W of ML cores!" If you had optimized engines, drivers, and games, you could do the native rendering faster without the ML cores at all.

RT is also a gimmick because it's a theoretically ideal rendering technique, but it's extremely computationally expensive. It's used in CGI where frame rendering times of minutes to hours per frame are acceptable, on server farms running 50x what your gaming PC can do, because that's what is necessary to do it at a high resolution with sufficient quality. These GPU RT implementations run at a hilariously low resolution and use ML upscaling to try to make it look acceptable. The previous generation lighting techniques got 90% there with an insignificant fraction of the work. That's how software optimization works. Imagine research teams at nvidia/amd optimizing those techniques to be 99% instead of 90% then doing them with hardware support. That's the methodology that got us from computers being barely able to render text in the 70s to what you see today, and it's being thrown away for gimmicks that make everything worse.
As an interesting note, this is generally true of all software development. Not only does it take longer than it used to, the output isn't any better, and despite modern hardware being thousands of times better than it was 20 years ago, you don't experience that generational leap anymore. Windows used to take 5-10 minutes to boot. Then it was 1-2 minutes. Then it was 15 seconds. Now it's back to 1-2 minutes to boot and start all of your applications because their metric is purely the login screen. Huge applications used to take minutes to start, which dropped to single digit seconds, and that's going back up. The hardware is getting faster. They can't just run some ML on their code and approximate the instructions, so they can't hide the inefficiency in marketing gimmicks. This is not the way.
@@formulaic78 4K native is 80 series or XT/XTX; problem is the 80 series runs out of VRAM.
LSFG 3 just came out with a x20 Frame generation feature (yes, twenty, not just x4)
Which makes my 3060 about 10 times faster than the 4090!!!*
*using Nvidia """math"""
🤯
Combine it with FSR frame gen for a total 40x! FREE PERFORMANCE!
3060 isn't capable of generating even a single frame
@@divertiti AMD's frame gen does. Also that lossless meme app. The point is taking the piss out of Nvidia's marketing, comparing one GPU using FG vs one that doesn't (or in the 50 series case, using 4x FG vs the 40 series' 2x FG). It's always some apples to oranges comparison in the marketing slides.
I tried x20 and holy Jesus the artifacting is mad crazy LMAO
I find it interesting that Nvidia can claim all these things, while at the same time, 90 percent of gamers rely on the experts on youtube to really do their homework in uncovering the truth. Why can't Nvidia just be open and honest from the start?
Because they have shareholders waiting to get their cut. Being honest and dropping all the marketing BS ain’t good for business.
Glad to see people seeing through this whole smoke screen and goalpost moving. This lineup is going to be god awful, and I'll be avoiding the entire 50 series like the plague.
If my current GPU dies on me for whatever reason, I'm going either AMD or Intel, and I'll be fine with the slight reduction in performance. F*** Nvidia.
I wanted to get the 5080 but it might be worse than a 4090 with less Vram lol
There was no smoke screen, if the gpu has features then it’s completely fine to include that in its performance
Human beings may not be perfect, but a computer program with language synthesis is hardly the answer to graphics rendering.
@@RandomPersonOnTheInternet1234 that's the thing, it's not the GPU that has the features, it's the software. Nvidia is creating it and artificially excluding access for older-gen cards. If Nvidia allowed the 30 series to have frame gen, the 40 series would be even less appealing than it was; now they're doing it again with the 50 series and their new "revolutionary" multi frame generation that the 40 series won't be able to use, because F you, gib us money.
@@caramelpepsi2582 weirdest thing. Nvidia is a business, not a charity. Don't like their practices, don't buy em. You will though.
Gamers won't stick to principles if it means leaving performance on the table.
So either way, people are going to be mad at NVIDIA, either for false advertising or for not improving their performance that much besides the AI and upscalers xD
Yeap if it’s affordable they complain about scalpers. If it’s expensive they complain they can’t afford it. If it has massive rasterization performance uplift they complain that there is no innovation. If Nvidia focuses on innovation they claim it’s just a scam with gimmicks.
Pretty much. They will still be nice and loyal though. Taking a principled stance and switching to AMD would mean less frames. Can't have that.
@@Unit-kp8wm it all depends if it's playable... 50/60ms latency with everything on is not so great... that's almost 100ms...
what you forget is, the 50xx series is manufactured on 4NP instead of 4N, which results in a ~25% natural performance gain, so the exact same graphics board with the same GPU at the same clock speed, just manufactured on 4NP, will work ~25% faster!
also most people need to let go of the idea that performance can only be achieved in one way. those "fake frames" aren't real fake frames. they are real real frames :)
it doesn't matter how the frames are rendered. the AI is using the real frames to generate other real frames. it doesn't matter to anybody once the quality differences aren't noticeable anymore. and i predict that the MFG technology can also be used as a latency-decreasing technology within the next two generations of DLSS FG, by integrating the generated frames into the engine's workload to lower the latency with calculated input workload in between every generated frame, while still increasing the fps significantly. they're already working on that.
If I had a 4090 I can't see spending $2,000 to upgrade to a 5090. Essentially a 20% upgrade.
Nvidia is hung up on the 'AI' thing and most people do not care about it. Just make an AI card and if people want AI they will buy it.
Imagine if the Tensor core's real estate were extra CUDA cores. These cards could use a little more VRAM too on the higher end. If Intel can do it so can Nvidia. We have to wait for AMD and benchmarks across the board to get the whole picture of what is truly going on.
Except for ray tracing, it's 66%
It's not $2000. I can trade my 4090 in to Newegg for $1350 with their trade-in program.
@@Tesseramous in Far Cry they're using only RT, and it's only a 27% RT improvement, not 66%
I have a 4090. I'm not upgrading to a 5090. I upgraded from 1080Ti to 3080 cause it was nearly a 100% upgrade, and almost the same jump from 3080 to 4090. But 30% (maybe) for 2300€? No way.
@@johari-d3q I'm just going by the rt core numbers, but you may be right if it doesn't translate to actual performance
I was underwhelmed with the 3090 for trying to get 4K gaming performance. I was then blown away with the 4090’s performance for 4K.
The 5000 series looks like a joke, I’ll be waiting until the 6000 series to upgrade.
Your comment makes no sense. The 5090 is 30% faster (without AI) than the 4090, which you were impressed by... lmao
@@user-sv2wy6gx7uYeah, and the 4090 was 45 to 60% better than the 3090 Ti. In some cases nearing 70% better.
5000 is all about gimmicky crap like DLSS4 and AI bullshit with even MORE fake frames.
The 5090 has a higher power draw than the 4090, so watt for watt it might not even be 25-30% better.
Pathetic. But go off King.
@@Bottlecap Dude is so mad his 4090 gets beat by a $550 card lol
@@Bottlecap Wait till the benchmarks come out. This card is going to be a monster even without MFG. Not to mention 32GB of GDDR7.
@@Dempig Nah a 4090 is not getting beat by a 5070. That's marketing BS and borderline lying.
the dude who sold me his complete 4090/7800x3d build on fb marketplace for $2200 in preparation for the 5090 is definitely punching air HARD rn
If he just sold the gpu and cpu I doubt he cares cause the 5090 is still gonna be a big jump but if he sold his whole build yikes…
@reekid8257 nah it was the whole computer not just the gpu.
@ that’s actually tragic😭✌️
3:00 nah uhh bro let me stop you right there
Jensen knows he's lying
everyone and their momma knows he's lying
they just hope and pray that you, the customer, are appeased by said lie. END of STORY
I first messed with DLSS while tinkering with Marvel Rivals settings. I got 90+ fps at 1440p but something just didn't feel right; it was like my refresh rate was 30 Hz or something, as my monitor is set to 144. Once I disabled it and got a clean 60 fps it felt normal again. Can't imagine dealing with this by default.
how dare you not praise daddy nvidia.........
lol
I had a similar experience with rivals
I have the single frame generation on my 4090 and have never used it in any games. I am more interested in raw performance, with no DLSS or frame generation, between the 40 and the 50 series Nvidia video cards.
That is the best way to test the performance.
One thing you're missing is the architecture. It's generally what provides the biggest increase in performance. I'm not saying the 5070 will be like a 4090 in pure rasterization, I'm purely pointing out that there will be a large uplift over the 4070.
NVIDIA RTX 4000 series (Ada Lovelace architecture) is built on TSMC's 4N process (a 5nm-class node).
RTX 5000 series (Blackwell architecture) is built on TSMC's 4NP process.
You know the point of these vids is getting likes and views, which happens better with rage bait, brainwashing all these young people on a budget about how evil and bad these fake frames are and how they don't count, fucking lol, while in reality the only thing people are going to see is a massive increase in frames. This is herd behaviour at its finest. 99% of the people crying over these cards would be happy if they actually had one😊
how is the 5070 gonna come close, even with all the new frame generation, with literally 2000 fewer cuda cores than the 3080?
The new AI cores will really impact performance. But it's not native. In short, it won't really come close; these are just Nvidia's marketing claims.
Nvidia is well known for their marketing fluff these days. There was no way the 5070 could compare directly to the 4090. I don't quite understand why the 5080 only has 16GB of VRAM......I imagine a $1200 5080 TI will come out with 20 GB of VRAM in the future of course.
24 gb
Yes it will have 24GB @ 256-bit bus using 3GB modules instead of the 2GB it currently uses.
what game consumes 16gb vram?
By the time they release the 5080 ti or whatever it will be less than a year before the 6000 series releases. I would rather have a ~30% raw performance uplift and new features of a next gen card over a 5% performance uplift and a little bit more vram
I asked Grok AI for the spec difference between the 5070 and the 4070 Super and it has the 5070 slightly beating the 4070 Super
Love how the 4090 vs 5090 comparison very quietly omits a critical, longstanding measurement: actual TFLOPS (not RT) lmao.
I wonder how long before high-end GPUs and CPUs hit the wall in North America where you have to move from a 120V outlet to a 240V outlet? Most home circuits are 15A, or 1,800W of load capacity. It's 1,800W, but everything else in the room is also taking away from that wattage, like monitors, speakers, lights, etc.
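A rough budget of how that circuit gets used (the component wattages are assumptions for illustration; the 80% factor is the usual continuous-load guideline for North American branch circuits):

```python
# Sketch: how much of a 120V / 15A circuit a high-end gaming setup can consume.
circuit_watts = 120 * 15                   # 1800 W theoretical
continuous_budget = circuit_watts * 0.8    # ~1440 W usable for continuous loads

loads_watts = {
    "PC (high-end GPU + CPU, peak)": 1000,  # assumed
    "monitor(s)": 80,                       # assumed
    "speakers / peripherals": 50,           # assumed
    "room lighting": 60,                    # assumed
}

total = sum(loads_watts.values())
print(f"Continuous budget: {continuous_budget:.0f} W")
print(f"Estimated draw:    {total} W")
print(f"Headroom left:     {continuous_budget - total:.0f} W")
```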
These 5000 series cards are 4000 series with AI
Good video man! Can't wait to see all the cards on the channel! I'm getting a 5080 so pretty hyped for it!
based on some personal calculations the 5070 will have half the raw performance of a 4090 (just my guess tho)
I'm guessing the 5090 got a clock decrease partially because it was already hitting the 600w limit. I wonder if partner cards will have an extra power connector for overclocking headroom (OCd 5090 breaking 1000w?).
High end 4090s already hit 600W. Same with high end XTX.
where is the rtx 5060 announcement nvidia? did they just leave out the mid range cards without us noticing?
I'm not impressed with the latency issues when using DLSS frame generation, so I don't have high hopes for multi-frame generation either.
5:30 The AI TOPS figure for 5090 is not comparable to the figure for the 4090.
Nvidia is using FP4 to measure the 5090 whereas 4090 was FP8, so if you change to the same calculations for both cards then 5090 is 1676 TOPS of FP8 compute vs 4090 at 1321 TOPS.
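Putting those numbers side by side (the FP4 figure is simply double the FP8-equivalent quoted above; all values are Nvidia's own marketing numbers, not independent measurements):

```python
# Sketch: normalizing advertised AI TOPS to the same precision before comparing.
# FP4 throughput is roughly 2x FP8 on the same tensor hardware, so an FP4
# figure has to be halved before lining it up against an FP8 figure.
rtx5090_fp4_tops = 3352   # advertised for the 5090 (FP4)
rtx4090_fp8_tops = 1321   # advertised for the 4090 (FP8)

rtx5090_fp8_equiv = rtx5090_fp4_tops / 2   # ~1676 FP8-equivalent TOPS
gain = rtx5090_fp8_equiv / rtx4090_fp8_tops - 1

print(f"5090 ~{rtx5090_fp8_equiv:.0f} FP8-equivalent TOPS vs 4090's "
      f"{rtx4090_fp8_tops}: about {gain:.0%} more, not the ~2.5x the raw "
      f"slide numbers imply")
```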
Wow typical Nvidia scammage.. I knew those figures looked fishy. Thanks for pointing that out
Istg Jensen hired a bunch of tech bros to make ai generated replies to defend nvidia in the replies 🤣
all benchmark tests should also measure average latency
The jacket forgot to boost the 5070 with a 24GB VRAM buffer to support his claim "5070 = 4090" :)
also, the "Lossless Scaling" utility costs around $5. everyone can have hundreds of generated frames without buying Nvidia's 5000 generation. of course, worse responsiveness and artifacts are the bonus. I, for one, never ever enable FG.
Reflex 2.0 is exclusive to rtx5000 😢?
Not exactly. Reflex 2 will initially launch only on the 50 series cards. However, Nvidia says it will also eventually come to the 40 series cards.
I can't wait for the actual benchmarks to come out. I'm on a 1070 and I think the 5070 Ti would be a nice upgrade.
The only time it's worth buying Nvidia is when your current GPU is over 3 generations old. But if you want a true 5070 Ti then go for the 5080 instead.
I'm right there with you. I have a 1080 and am now looking at getting the 5090.
1070? A 3070 would be a big upgrade.
I am upgrading from a GTX 1070 this year. I have been on the GTX 1070 since March 2017 :) On October 19th, 2024 I upgraded my PC build, but I kept the GTX 1070 to wait for the new series. I will buy an RTX 5070 Ti; I won't wait a whole year for an RTX 5080 SUPER with 24 GB of VRAM.
@@YTisProMentalillness Why pay $250 more (with AIB cards probably more like $300 to $350) when the 5080 has the same 16GB of VRAM as the 5070 Ti? It would justify the price difference if it was 20GB, because the 5080 is not that much faster than a 5070 Ti, not like a 5090 would be over the 5080.
As a 4070 owner there is no incentive that makes me want to upgrade to a 5070. Only 4% more CUDA cores and 12GB VRAM is shockingly bad!
yes, no sense upgrading from a 4070 viewpoint. only AI things would benefit a lot, using models and AI tools.
It's impossible for the 5070 to be as good as the 24GB 4090. Hands down a scam claim. We have seen with the 3060 Ti, 3070/3070 Ti and 3080 10GB that VRAM is a huge deal now in new titles. As soon as the 5070 reaches a title that needs more than the 12GB it has, the card starts to stutter and becomes crap for high fps, hence why so many people, including me, sold their 3060 Ti because 8GB of VRAM limited the card's potential. In my opinion it's impossible; as soon as the VRAM requirements get even higher in the future, the 4090 will still crush the 5070 and even the 5070 Ti.
It's funny because people say it's impossible for a 5070 to match a 4090 (which I agree with, obviously it's only with framegen), but a lot of people saying that also believe AMD's 9070 XT will beat the 7900 XTX lol. That's equally as ridiculous.
It's DLSS 4. They are selling you a slightly improved 40 series with GDDR7 and DLSS 4. It's a great improvement over DLSS 3.5, and I think by the time DLSS 5 is a thing it might as well be a must have, but DLSS 4 just isn't good enough fidelity-wise and quality-wise to justify the overpriced GPUs. The 5070 should be ~500 or even 450.
but you know the 4NP process manufacturing results in a natural 25% performance gain, right? ;)
I get the feeling the 4000 series is going to spike in price briefly.
everything will spike i guess..........
This was very helpful thank you!
Glad it was helpful!
There's a reason rasterization is not shown as a comparison. If they thought it was 20-30% better they would have put it on the chart.
Raster is not shown because it's not relevant as a field of cutting edge graphics anymore.
@@steel5897 Exactly, rasterization is antiquated tech, not relevant in assessing progress
It is confirmed 40% faster than last gen bro
@@steel5897 Raster is the thing that runs the game; everything depends on it. Not worrying about raster is like building a car and not considering the engine.
@@catmeow11111 raster only matters for 1080p budget gamers
Great video, this is why my next GPU upgrade would be to an AMD GPU over NVIDIA, or a 4090 when prices drop.
Thank you! Yeah I’m excited for AMD to announce the new cards officially for performance claims.
It's AMD for me. No upscaling, no fabricated frames, no extra latency, thank you very much. Plus the software covering both CPU and GPU in one app is really nice.
My 4070 Ti Super lost value because of the 50 series; you AMD guys are in serious trouble, it's value-plummet time
@@gingerbread6967 I don't sell my old stuff. I pay it forward.
@@doublecrossedswine112 Thats fine, I give away most of my stuff for free too, that doesn't change the fact that current GPUs are all going to lose value
@@gingerbread6967 As an AMD user I'm used to that lol. Its how its always been for us.
Do you play at 1080p or something? Because AMD's best card for the next 2 years, the 7900 XTX, requires upscaling even at 1440p in UE5.
As an XR fanboy/enthusiast, I’m really curious how well the 5090 handles a **super** heavy modded up SkyrimVR Wabbajack list and how well it can handle a fully loaded VRChat world (80 people with “very poor avatars“) with lots of in world effects
No one is going that deep; some people just overcomplicate it for no reason. If you're doing insane stuff beyond consumer GPU levels, get an NV workstation
I know everyone gets up in arms over frame gen, DLSS, etc. I personally don't care if a frame is rendered natively or generated by AI, as long as it looks good. It's not like I'm ingesting these frames lmao
Is what matters about a frame only how it looks (which is already questionable)?
Exactly, DLSS 4 is looking like a winner, that's why Nvidia is focused on it. Frankly, it looks like AMD is in trouble
That's the problem. It's not going to look like you think, especially if you play multiplayer.
@@gingerbread6967 It looks terrible in that DF video; there are glitches, artifacts, and shimmering everywhere.
@@JimmyChino55 No it doesn't, I laugh.
5:00 There is a small mistake: the text in white that states RTX 5090 should be in green
It's so annoying that people can't understand this.
Well they mentioned that a while back, so you would figure they would have known
Can't understand what? That these cards are 30-40% better than last gen and as an XTX owner I literally have to go to Nvidia to upgrade? Yeah whatever dude
Actually many of us do understand it, and we see it WILL be the future of gaming; we don't mind it and maybe even like it. We prob also don't play crap online FPS multiplayer games either.
@@jmangames5562 Yeah well, I have been playing only FPS games like CoD2, CoD4, CS:GO, and now CS2, with some racing games like Assetto Corsa. I don't care about it honestly; I never use any of those features anyway. I prefer native and uncapped fps; the only thing I tried using was Anti-Lag 2, but I don't notice any difference, so I just don't use it. I can see how this seems like a good feature for those who play story-mode games and just want to see higher numbers. I just hope this doesn't "destroy" the GPU market for those of us who prefer raw, real power, with everyone switching to software features instead of actually using proper parts to create a powerful product.
@@jmangames5562 Well, I'm holding onto my 4000 series a little while longer anyway. What gets me is people act like they didn't know about the advancement in AI. They have been talking about it all year, so I wasn't expecting anything way more powerful.
9070XT FTW. Why have 5070 when you can have a 9070?!
5070 ~10% faster than the 4070 Super: 33 fps vs 30 fps
5080 ~15% faster than the 4080 Super: 34.5 fps vs 30 fps
5090 ~20% faster than the 4090: 36 fps vs 30 fps
*Raw performance increase. More optimistic estimates add +5% to +10%
4000 series VRAM lifespan at the day of release = 4 years left
5000 series VRAM lifespan at the day of release =
No it's 20-30% raw perf increase across the board.
@@_godsl4yer_ At least. Possibly more with RT-heavy games. FG and MFG are bonus options.
Nobody has real world benchmarks.
@@lcg-channel It's probably going to be 30-40% in RT games because they have packed in more RT cores as well as better RT cores. This is based on their benchmarks. Got to wait for reviews.
@@_godsl4yer_ 10 + an optimistic 10% = 20% 👍
I'm sure you're absolutely correct. That said, I'm 100% sure it will run circles around my 3060
Oh for sure. Each generation of GPUs introduces the issue of what is considered good for the generation in general vs what is considered good for each individual person. The 5070 will not be the performance king that Nvidia is marketing it as. However, for someone in your position, it’ll be a nice upgrade.
@ yup. As much as I absolutely want a 5090, I'm pretty sure I would be sleeping on the couch if I got one (and all the other parts I would need with it). In fact, my wife would probably move the couch to the garage and make me sleep in there, and we are still under the polar vortex here in Ohio. 😂😂😂
@@Arlen.Kundert Haha 🤣 I’ve been on the couch a time or two myself. I can relate haha.
AI = DLSS. It's 1/4 the speed of the 4090, but then they add the 4x fake frames to close the gap. Sigh.
Yep I’ll be sticking with my 4090 for quite awhile. 7900xtx is also a very solid choice.
I agree for sure.
I think I want the 5080, seems close to a 4090 in raw power. I don't think I can afford a 5090.
4090 will match a 5080ti.
The 5080 is the real normal person's flagship. The 5090 is now just too much for most people; it's insane, but priced out of most budgets.
I might go with a cheaper 5070 Ti as a placeholder until the next gen comes out, or wait and see what AMD cooks up, because this gen doesn't feel much better than last gen, to the point that a 4090 might be the better buy since its price will probably drop.
@@Kuroganemk2 Weird how people say a 30% uplift is not much better than last gen, but then say they are buying AMD, which had a literal 30% decrease in performance and VRAM lol
@@Dempig The problem isn't the 30% uplift, it's the $2,000+ price. If it were $1,200 with a 30% uplift, people would be super simping.
5090 specs and formfactor make this a no brainer. The 5090 will be sold out for years to come. PERIOD
The 5090 won't be sold out for years to come. That said, IMO, just like with the 40 series, the 5090 is the best-looking value if you're gaming at 4K/240Hz or doing heavy workloads and need a card that will last for years without feeling the need to upgrade every gen or mid-gen
LMFAO... Next up: the 6090 is now software only. That's right folks, we won't make a hardware video card anymore; all of you can now just buy our new 6090 software and bang, you have the fastest, state-of-the-art, mind-blowing GPU. All with DLSS 6 and Frame Gen 6.0.
That will likely happen. You'll buy a specific AI-supporting card and the rest is software. You don't need graphics anymore; everything will be hallucinated, generated through neural models and pixels. Like a dream. A simulation of nothingness
Me : still intending to use my 3080 for the next 10 years because of how much I paid
seems to me Nvidia is maxed out on hardware and now has to go to scam AI crap..
Raw native performance is what matters in a GPU - always has been, _not_ fake frames, upscalers, and other gimmick bonus features.
Okay. And what are you going to do? Reprimand NVIDIA?
@@Kyanzes Yeah, by not buying from them for yet _another generation in a row!_
2nd comment, on a more serious note...
It's great that you want to dive in on the independent testing, and having to go out and buy them shows your dedication to the community.
Thanks again for the well broken down vid.
I think you need a GPU to test before you can release benchmarks
I saw an nVidia guy say that DLSS 4 is going to be available for all RTX cards. I was confused.
It will. Only multi frame generation is exclusive to 5000 series
Perfect, that's all I really care about. I don't use FG on my 4090 anyway.
Yeah the upscaler update and stuff will come to other GPUs.
Thanks for actually stating it as it is with the 5070 vs the 4090... I've seen channels do nothing but spit rainbows and sunshine over that BS marketing statement and only briefly mention how it does that. It is really frustrating that because of Nvidia's deceptive and downright scummy marketing, people are actually believing it and defending Nvidia...
Thanks for glazing my ego so I feel good with my 4090 bahahaha
There is no believing it. It WILL be able to use DLSS 4 MFG and match your 4090 in gaming fps. That is not wrong; everyone knows it's with those features. Raster is not some elite form of frame generation that makes it "real" vs what the AI cores generate; they just produce frames differently.
@@jmangames5562 This is delusional.
The reason 120 fps is and feels faster than 30 fps is that the real rendered frames have ¼ the latency, including the inputs that affect the outcome of the next frame. AI/FG frames don't help latency; in fact they increase it, making your dismal 30 fps even worse in this regard.
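A quick illustration of that point, using a simplified model where fresh input only lands when a real frame is rendered (real pipelines are more complicated, so treat the numbers as illustrative only):

```python
# Simplified sketch: display smoothness follows the shown fps, but input
# responsiveness follows the *rendered* fps. Numbers are illustrative only.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

print(f"native 120 fps : new input every ~{frame_time_ms(120):.1f} ms")  # ~8.3 ms
print(f"native 30 fps  : new input every ~{frame_time_ms(30):.1f} ms")   # ~33.3 ms

# 4x multi frame gen showing 120 fps from a 30 fps render: the screen updates
# every ~8.3 ms, but fresh input still only lands every ~33.3 ms, plus whatever
# extra delay the frame generation step itself adds.
```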
5070ti is the best bang for buck imo. It’s comparable in raw performance to the 5080 super
I think the 5070 Ti is the best option as well.
NVIDIA deserves bankruptcy.
Actually, its AMD who will have trouble selling GPUs soon
I think you misspelled Intel
I've just seen some UE5 demos on the RTX 5090 with DLSS 4 MFG and you can clearly see the rubber-banding, so it's only playable with a pad or super slow mouse movement.
Honestly who gives a fuck; as long as the AI works I could care less. Getting 100+ frames for $600 regardless of how it's done, I'm good.
How much could you care less?
average fool
Who cares? Nvidia's entire consumer base and the gaming media. They would love to convince everyone of this mentality; that's the marketing push, and so far the reception doesn't suggest it's working at all. Make no mistake, if their sales don't dip this generation they'll continue nerfing specs and "compensating" with software in the future. 70-class cards matched the previous generation's flagship performance for the last 10 years straight; now you're getting a card with raster somewhere between a 4070 and a 4070 Ti at best, maybe not even matching a 4070 Super.
I got some snake oil I think you might be interested in...
Does the 5070 Ti generate AI images 2x faster than the 4070 Ti Super?
r.i.p. gpu 1999-2025
killed by its own father
Did you test any 50 series card next to a 40 series one ?
We all 100% want 240 fps that feels like 24 fps, ahh smooth, so smooth, and higher-resolution textures that take up double the VRAM but are blurry and look less sharp than games from 10 years ago
No problem, get an AMD
Multi frame gen sounds absolutely amazing. Generating three frames between each real frame has the potential to provide a smoother and more fluid experience. It’s exciting to imagine how advancements like this could evolve gaming and graphics technology. Nvidia continues to push the boundaries of what AI can achieve, finding creative ways to enhance performance and optimize hardware. This trend shows how innovation can lead us in new directions, blending AI with traditional rendering techniques. While some might prefer native resolution and FPS, there’s no denying the potential of AI-driven features to revolutionize the gaming experience. Embracing progress while balancing raw performance and creativity opens up amazing possibilities for the future of graphics and gameplay.
I like FG. It has its use-cases. But the problem is that they are equating it to "performance", when it is not. Upscaling helps with performance. FG only helps with motion fluidity. If anything, it makes "performance" ever so slightly worse. The problem is in the way they are marketing it.
Another thing people need to take into account is that the frame number on paper says nothing about frame stability and image quality on the actual screen. If I get to run my games at 250fps with RT and Ultra everything, but it looks like vaseline smeared all over my monitor, I'm not paying you even $1 for that GPU
I get multi frame generation from 2x up to 4x in any game or video on my 7900 XTX with the Lossless Scaling program, and it works and delivers. Very, very good. Much better than anything else I've tried... 8 euros well spent.
Lossless scaling is gimmicky
Already knew it was going to be slower, possibly slower than the 40 Super series. They compare it to the original 40 series to make it seem like it's an "improvement". In raw performance there will be no improvement without the 4x fake frames
The biggest issue is that you are required to use DLSS for the 5070 to even reach said 4090 performance, which on paper isn't a bad thing. That being said, DLSS was created for systems and parts that couldn't reach that higher-end performance in the first place... so in essence they're trying to market a feature meant for lower-end systems as the headline of allegedly new and improved top-of-the-line hardware.
Nvidia is now an AI company, not a hardware company. From the specs, the 5080 is only going to be marginally faster than a 4080 in pure rasterisation, like 10-15%
More like 25/30%
@@thetheoryguy5544 The 5090 will be 25-30% faster than the 4090, but it also has 30% more CUDA cores and can draw way more power; the rest of the stack isn't going to see that jump
@@BOKtober The 50xx series is manufactured with the 4NP process, which results in a natural ~25% performance gain, so any 50xx series graphics board will see at least that jump in performance.
Seems like they focused mostly on AI TOPS performance. But what does that metric do for gamers?
So for someone like me who is looking at replacing their old 1060 would the 5070 be worth it for its price or would I be better off getting something in the 40xx once they become cheaper due to the new cards?
I think people overlook that we have 25% better ray tracing performance on most cards, and the Far Cry 6 data is WITH RT ON, so not even that data really tells us anything about the raster performance gain; it could be as low as 15%
Generation-over-generation rasterization performance improvements for GPUs within the same tier (e.g., 4070 Ti to 5070 Ti) typically range between 20% and 35%. Claims of significantly higher gains are often overstated, as such advancements are constrained by current manufacturing technology. Substantial performance increases would only be achievable through a major breakthrough in manufacturing processes. For instance, if the 50 series were built on a 4 nm process and the next generation transitioned to a 1 nm process, this could yield a significantly higher performance uplift and noticeably improved power efficiency.
If multi frame generation is what people want, why not use Lossless Scaling? I just started using it, and I'll use its multi frame gen that works on every single game with my 4090, unlike DLSS multi frame gen. Idk if I see a point in upgrading to the 50 series unless you have a mid-range 20 series card or something
Why is multi frame gen bad if we already use frame gen or other such features in all the games we play?
So it looks like the 5070 is now what the 4070 should have been, half of 4090 performance, if you don't count the added fake-frame performance.
Your 5070 vs. 4090 argument is like saying an EV won't match an ICE car because the EV doesn't have 8 cylinders, 3 turbochargers, super spark plugs, etc., basically the components you've become accustomed to, when all that matters is the result in a particular domain.
It all depends on the end cost set by manufacturers. If the 5070 Ti is in the same price range as the 4070 Ti Super, then of course you go for the 5070 Ti, because the "raw power" is IMO around 10-15% better. The problem will most likely be that the end price is higher than the 4070 Ti Super by probably 20%.
Until we get some third party benchmark-tests, all this is highly speculative, but:
- "per core"-uplift from Lovelace to Blackwell could be relatively minor - just look at how many more cores they put onto the 5090 and how large that chip is. If the new cores were that much of an improvement, would they really have needed 21,760 CUDA cores vs the 4090's 16,384?
- the total number of cores on non-5090 cards isn't that much higher when compared to their Lovelace equivalents (5070 = 6,144, 4070 = 5,888, 4070 Super = 7,168 ... 5070 Ti = 8,960, 4070 Ti = 7,680, 4070 Ti Super = 8,448)
- outside of the 5090, amounts of VRAM are identical between 40xx and 50xx, as are memory bus-widths with the only uplift coming, apparently, from the switch from GDDR 6X to 7.
So outside of the 5090 - a sentence that will probably become a mantra with this gen - 50xx performance could be pretty "meh" indeed. It also doesn't fill me with confidence that one of the earliest leaks mentioned Nvidia comparing the 5070 to the 4070 Ti (not to the Ti Super) in internal comparison charts, something they repeated during the launch event. If the 5070 would beat the Ti Super, surely they would announce that fact. Speaking of which: It's also quite disheartening to see how they used all the AI-stuff during that presentation to give a probably false impression of the 5070's capabilities. 5070 = 4090 levels of performance? Yeah... right.
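A rough way to sanity-check that per-core worry is to see how much of any claimed uplift the core counts alone already explain. This is a sketch only: it ignores clocks, memory bandwidth, and power limits, and uses the publicly listed CUDA core counts.

```python
# Rough sanity check: how much uplift do the core counts alone suggest?
# Ignores clock speed, bandwidth, and power limits, so treat as a rough ceiling.
cuda_cores = {
    "RTX 4090": 16384, "RTX 5090": 21760,
    "RTX 4070": 5888,  "RTX 5070": 6144,
    "RTX 4070 Ti Super": 8448, "RTX 5070 Ti": 8960,
}

def core_uplift(new: str, old: str) -> float:
    return cuda_cores[new] / cuda_cores[old] - 1

for new, old in [("RTX 5090", "RTX 4090"),
                 ("RTX 5070", "RTX 4070"),
                 ("RTX 5070 Ti", "RTX 4070 Ti Super")]:
    print(f"{new} vs {old}: +{core_uplift(new, old):.0%} cores")
# ~+33%, ~+4%, ~+6% respectively: if the 5090's claimed gain is mostly core count,
# the per-core improvement left over for the rest of the stack looks small.
```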
If it gives better fidelity with just a 6%-of-a-second penalty, I think it's good. The only people who will not like 50-60 ms of latency are pro/competitive gamers, and what percentage of the market is that?
I have a question: why don't you compare the 50 series to the 40 Supers?
What this video didn't specifically point out is that none of the 5000 series cards beats my 4090 except for the 5090. For me, anything other than the 5090 would be worse performance overall, despite what NVIDIA and its CEO Jensen Huang claim.
So it's like a 25% increase in performance for a 25% increase in price from the 4090 to the 5090. It's basically a 4090 Ti. It's not an actual upgrade to the family unless it's the same price with increased performance.
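For what it's worth, a quick perf-per-dollar check under that comment's own assumption (~25-30% raw uplift, which no one has benchmarked yet), using the announced MSRPs of $1,599 and $1,999:

```python
# Quick perf-per-dollar sketch using announced MSRPs ($1,599 and $1,999) and the
# assumed 25-30% raw uplift; none of this is independently benchmarked yet.
msrp_4090, msrp_5090 = 1599, 1999
price_ratio = msrp_5090 / msrp_4090              # ~1.25

for perf_uplift in (0.25, 0.30):
    value_change = (1 + perf_uplift) / price_ratio - 1
    print(f"+{perf_uplift:.0%} perf -> perf per dollar: {value_change:+.0%}")
# Roughly flat to +4%: more a new top tier at a new top price than a better deal.
```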
Is it safe to say this won't make the 40 series' price go down?
I'd imagine it would decrease, but only after a while of the 50 series being released.
Just subscribed bro. I like what you’re teaching
So you pay $500+ for a software update?
So say I purchased a 4090 back in the day, and now I'm seeing a new graphics card, with all the bells and whistles turned on in a game, get the same performance. Should I be pissed that the new card, which is wayyy cheaper, is technically giving me the same performance... from the same company?
You mean the 5080? Will be slower.
@@AallonTapsa Gotcha
Well everyone is coping by saying it isn't real performance so just do that lol.
@@dolpoof2335 yeah thats what I am hearing as well.
It's literally an upgrade over last generation, what do you mean?
Yes, it's adding fake frames, but the specs are still higher than the 40 series
Which would you buy: a used MSI Suprim 4080 for 720€, or wait and buy a 5070 Ti for around 900€?!
*none*
I think the real tell will be seeing how these new-gen architectures perform... the raw stats alone won't tell the whole picture. I'm eager to see some real testing when these cards become available. I'll check back then. Great video though, summing up what we know so far.
Stay with my ASUS 4070 Ti 12GB OC, or upgrade, and to what?
5070 TI or 5080 for any real noticeable performance upgrade.