The 5070 is NOT as fast as the 4090...marketing BS explained
- Published Feb 9, 2025
- NVIDIA claims the 5070 is 4090 performance for cheap...here's what they didn't tell you!
Learn more about the Viewsonic XG2736 and XG2535 eSports Gaming Panels at
XG2736-2K - vsfinch.es/47u...
XG2536 - vsfinch.es/3Tw...
○ Get your JayzTwoCents Merch Here! - www.jayztwocen...
○ Join this channel to get access to perks:
/ @jayztwocents
○ Join the official JTC Discord! / discord
○○○○○○ Items featured in this video available at Amazon ○○○○○○
► Amazon US - bit.ly/1meybOF
► Amazon UK - amzn.to/Zx813L
► Amazon Canada - amzn.to/1tl6vc6
••• Follow me on your favorite Social Media! •••
Facebook: / jayztwocents
Twitter: / jayztwocents
Instagram: / jayztwocents
SUBSCRIBE! bit.ly/sub2Jayz...
I'm so excited to see AI-generated frames on my monitor playing my AI-generated game while on a Discord call with my AI-generated girlfriend
truly living the dream
Lol everything is AI
Instant realization: how can I trust you are not an AI?? 🤨🤨
whilst you are commenting on an AI-generated channel 😁
And you won't own any of it. You will rent your entertainment hardware, you will subscribe to discord and your girlfriend.
@@longaugust sounds like the pro i hire at night😁
Still, tons of reels/shorts about the death of the 4090 are already flooding the internet. Casual gamers see "5070 is as powerful as 4090" and that's all. They won't fact-check.
I see the 5070 being as powerful as the 4090 because after some time, the 4070 eventually became as good or better than the 3090
Don't correct them. I'll happily buy their cheap 4090 when they upgrade to a 5070
@@jordanmntungwa3311 The logic lmao.
@@MageLeaderInc The type of person to buy a 4090 is not the type of person who's going to "upgrade" to a 5070. They'll upgrade to 5090 or 5080/5080ti etc
@@MageLeaderInc indeed, the 5070 being as powerful as the 4090 is very impressive and everyone should buy it's stock. Sell the 4090, you don't need that outdated trash. $549 is a bargain.
On the NVIDIA diagram it says '5070 = 4090,' but in the performance diagram they compare '5070 vs 4070.'
It can only compare with the extra fake frames. So it doesn't have the horsepower, just the ability to fake the same output in some cases.
It's bad how they distort consumers' understanding by presenting only these special situations and not raw performance.
Hope those "influencers" (I hate that term) can present this in a more sensible way.
It's total garbage that companies are allowed to grossly mislead customers when false advertising is supposed to be against the law.
@@glenrisk5234 They aren't; it's against federal law in the U.S. under the antitrust laws. Making false claims is fraud. Both our political parties are so corrupt that all big business is allowed to violate the laws openly. Dems are corrupt and "T" is a billionaire and part of the boys' club, only out for themselves.
Worst part is that the 5070 specs are not good compared to the 4070S.
Ya, but in the 5070 vs 4070 diagram they purport that the 5070 is 100% faster than the 4070. You just gotta read the fine print to see that it's an FPS-to-FPS comparison using multi-framegen on the 5070.
I never use framegen. I want my game to run at at least 60 FPS before framegen so it's responsive. And I really don't need more frames above that, especially if it's not coming with improved latency. And in multi-framegen's case, I'm actually getting worse latency with framegen enabled.
Is anyone surprised that a company would say something is better than it actually is
Nvidia is a special kind of slimy tho
Well, intel undersold the performance of the b580. Nvidia is the king of the market right now, they have no need to obfuscate.
Safe and effe.... I mean faster than the 4090!
@@Plutonium239MXR The B580 is worse than a 4060 when paired with a 5600 instead of the 9800X3D they used in tests
Nope! 😛
So, that's like saying that they make a 4-cylinder car's exhaust sound like it has 16 cylinders, but it still runs like a little 4-banger.
4 cylinders, 16 valves; if you focus on the valves you'll forget about the cylinder(s).
Sounds like an exhaust leak to me lmao
makes me think of those exhaust mods that end up accidently sounding like a wet fart machine lol
So only 1/4 is actual in-game rendered footage and the rest is an "AI" interpretation 😐
Perfectly put 👍
Another is the old phrase "putting lipstick on a pig" sure it looks better but it's still a pig
Hear me out... What if devs actually fully optimized games for pc
What are you talking about! That's extra work with little return monetarily. That doesn't save money. But this frame generation does save money if Nvidia and game makers can use it as a crutch. /S
That's called a console
@@siganarchy consoles rely HEAVILY on upscaling. There is not a single modern game that will run at 4k 120fps at native resolution on consoles, no matter the TFLOPS of the GPU. And if it does, it will look like garbage.
The PS5 Pro's GPU has around 16 TFLOPS, which is comparable to a 3060 Ti. Try running a modern AAA game on that card with GOOD-looking settings at 4k and hitting at LEAST 90 fps without relying on upscaling.
You won't be able to. Almost every dev is switching to UE5... and UE5 is an optimization NIGHTMARE. DLSS/FSR and frame gen are just excuses to not properly optimize their shit.
Take DOOM Eternal: a great-looking game, and it will run at BEYOND 100 fps on a 3060 Ti, max settings @4k, NATIVE. Why? Because the game is properly optimized.
But most devs/publishers don't want to allocate resources to optimization; they just slap on that sweet DLSS/FSR patch and call it a day.
@@UNLKYHNTR Games are better optimized for console. All that other crap you wrote is irrelevant.
lol
"Hey guys, we've gotten a lot better at faking better performance" is what I get from it. I can't wait for the reviews to show what the actual uplift is for native resolution rendering.
100% this. I f*cking hate this post pandemic/AI obsessed GPU era. It's all just smoke and mirrors and a huge PR effort to mislead customers and obfuscate facts without actually lying to the point of being held legally accountable. The new 50 series seems powerful enough to not even need this shady, slimy, greasy, misleading used car salesman style marketing. Just give us raw raster numbers, and raw raster+RT numbers. Fake frames and upscaling can be addressed in a different segment.
me thinks that the fc6 chart is exactly that.
No more than 33%. When you look at Nvidia's pages for the 4090 and 5090, both have the DLSS section. On the 4090's page it shows 21 fps native with full RT in Cyberpunk, and on the 5090's it shows 28 fps.
But that's assuming both numbers are from the same version of the game and drivers. It's likely the 4090 has some old number that didn't take updates/optimizations into account, which would reduce the difference between the cards.
You will have to realize this is the future. There is no more raw processing approach; at least, it wouldn't make financial sense. Performance is performance. FPS are FPS. You can make a case for whether upscaling has the same image quality, that's valid. But we are getting to the point where it is indistinguishable or even better. There is no going back. Accept it. Why would I buy a $3000 GPU that might give me real rendering when I can buy a $500 one that would get me there, at almost no cost to image quality?
@@dv5466 but when you're pushing 240fps, but your GPU is only rendering 60fps... you're literally not gaining any performance... it just looks prettier. basically scamming users with smoke and mirrors, and fake frames. this is not the future, this is false advertising and lies and inflated numbers to make dumb consumers go "ooooh more eff pee ess is good"
what good is 240 fps when games will still react at 60fps. that's right... it's not.
The issue is that you'll still get better pure performance from the 4090 than you will from the 5070. so you won't get anything useful from saving the 2500 and buying the new gen. it's all fake frames with no genuine performance uplift.
"RTX 5070 with RTX 4090 performance for $549" press X to doubt
lol the 5070 can't even beat the 4070 Super in most games that don't support DLSS 4
X
@@shabpnd481 pretty much. I said the same and got a lot of "hate" for saying it. But I guess everyone thinks DLSS 4 is gonna be in every single game they're gonna play too; props to the devs for patching every single game from the past 3-5 years lmao. To this day I encounter new games that only support FSR3 instead of DLSS, or rely on DLSS2. It will be a very interesting release. IMO the 5070 is a rebranded 5060 in terms of performance, so they can make use of an entry-level 5060 later on.
@diomedes7971well put! Best reply here.
@diomedes7971 Brother are you still pinned here?
So that’s like clicking on a “You’re 18” when you actually aren’t.
Remember when games could render decent fps natively? Pepperidge Farm remembers
Game devs are lazy af nowadays bc the market allows it
@@BIOHAZARDCURE Part of me wonders how intentional it is. So Nvidia can sell their next line of GPUs
Yup. Gamers are such little Bs these days... 20 years ago people would have rioted over this nonsense
What is this weird obsession people have that GPU’s must render everything natively for it to be “real”? Basically every aspect video game optimisation is layers upon layers of “cheating” to create the illusion of higher fidelity…
Even the human brain “cheats” by only “rendering” a tiny portion of your vision with full “resolution” in your FOV. Did you know you have a blind spot in your vision that your brain literally fills in with made up information? If the future of video gaming is primarily neural rendering, why does it matter what the original native performance is?
@@MarylandDevin You are so right - these games already almost demand AI bullshit to run decently, and with this next iteration generating more than 1 frame per actual frame, it's going to leave everyone who doesn't or can't run that tech in the dust.
If they want to sell me fake frames. I’m gonna pay with fake money.
Yeah, just put 3 fake dollar bills between each real dollar bill :D
Sup illegal person
All money is fake; it is a concept invented by society to quantify trade value...
It doesn't matter if they are fake or not, as long as it looks better to your eyes. If that makes gaming feel smoother... why not, it's not a bad tech. BUT... do not sell on that! I'm talking to you, nVidia. Like Phil said, latency is also important.
crypto?
FRAMES PER SECOND🚫
FAKES PER SECOND ✅
Good one 😂
We could call it FFPS ..Fake frames per second...
Will it be better than the 3060 too?
I’m extremely confident that the 5070 will not actually be better than the 4090.
i doubt that even 5080 will beat 4090 in real battle
Agreed
@@lo0nyk with a 256 bit bus? No way
it will be ABOUT 20% better than the 4070 Ti Super or whatever the hell it's called, at least according to Hardware Unboxed.
lol the 5070 can't even beat the 4070 Super in most games that don't support DLSS 4
So you're telling me we hit a ceiling and are just faking it with frame generation to show advancement.
But DLSS has always been one of the selling points of NVIDIA. For a long time now, FSR and DLSS have been focal points in this tech space and have widely been accepted as the new norm.
That's not what he is saying. He is just explaining the BS marketing. The ceiling got higher with the 5090.
@@chronozeta just not $400 worth of additional ceiling.
I've heard enough 'doing more is impossible' from NVIDIA's conference that I think they stopped trying and are going for low hanging fruit. Is it really a ceiling if you don't try?
@@jordanmntungwa3311 True, but when it gets to the highest end cards line 4090 & 5090, their core audience there is the professional space. 3D artists, game asset modelers, & even pro gamers just to still consider gamers in that market. The main issue is that you don't get to use DLSS in workload tasks, just gaming. This means that they are being disingenuous to their target demographic for the 90 cards. It's not wrong to say that 5070 can game at 4090 frame rates when the right settings are enabled, but 4090 users in the professional space can't accept the 5070 as a valid substitute for their cards and it's worrisome that some people looking to upgrade their professional workstation may purchase these thinking so if they don't watch videos like this one.
Nvidia pulled the biggest party pooper move. Got us fired up. Then you realize it’s just a marketing scam.
I miss the days of the 1070/1080/1080ti, when GPUs were GPUs and not the inbred offspring of ChatGPT and a rusty toaster.
What are you yapping about? Modern cards crush those cards, and the framegen and upscaling are literally just a bonus; the cards can render without them and don't rely on AI for basic operation. You sound like an old man blabbering about the old times.
@@HELLF1RE9 It's just pure emotional ranting, not really based in reality
@@HELLF1RE9 Then explain why, without DLSS, the 4090 could be brought to sub-60fps @4k by games like Alan Wake 2, the new Indiana Jones game, or Black Myth: Wukong? Game optimization aside, the card shouldn't need frame gen to reach 60+ fps.
There's also the price point as well. That's my biggest gripe. Even if I had a spare $2k for the 5090, I'd have issues justifying that purchase.
@@HELLF1RE9 bro u don't get what he said, like, at all
@@Bruhngus420 Honestly, it's more or less a gripe about performance to price point. The value of the 5090... let's face it, isn't that high. On the 10, 20, 30 series, the performance to price was good (the whole scalping BS aside). The 40 series? Not as much. And they're not going to move a ton of 5090s; their biggest sellers are going to be the 5060, 5070, and possibly the 5080.
Im so tired of upscaling and frame generation
Why ???
@@yaxo11 Graphical artifacts. Old games are clear and sharp; new games, not so much.
@@yaxo11 it makes any movement a blurry soupy mess but at least screenshots with no movement are pretty
well the entire industry is going there led by nvidia so it's happening.
@@redditredditredditredditreddit you should try DLSS instead of FSR
Thank you for this video. People are eating this up and lowballing 4090 sellers right now. Nvidia pulls the same thing every year. I remember the 3070 "beating" a 2080 Ti as well.
Saw a guy offering to sell out contracts trading a new 5080 he will buy for a used 4090 and some people were responding asking for contact details lol
It was on Facebook, go figure.
It was on par with 2080 ti tho
It was definitely on par with 2080ti tho. Especially once drivers matured. What cut the legs off from underneath the 3070 at the end of its life was the insufficient VRAM. The core had the power to keep on going but lacked the memory capacity to feed it data, once PS5/XbX multiplats expecting 12gb started coming out.
@@rarespetrusamartean5433 Don't worry! The scalpers will get most of the ones the OEMs don't, so they'll all be $800+ by the time they are released.
@@РаЫо In most situations, the Nvidia RTX 4070 is considered to be roughly equal to or slightly faster than the RTX 3090 as well
- 4090, did you bring me that frame I asked you for?
- Better, I have a picture of the frame you requested!
- Ok... What about you, 5090?
- I brought you my entire photo album of your favourite frame... You love frames, don't you!
Much moar better
It is a drawring of a frame
5090: "So the Mona Lisa here has a moustache because I had to try to replicate it super fast, I thought she always had one but she didn't, then this Mona Lisa here is a dude, my mistake again but it's still somewhat resembling the Mona Lisa, then this one is negative color Mona Lisa, this one is Mona Lisa missing an eye..." flipping through the album super fast.
Convincing huh? No?
Okay fam, this video was a wild ride! The breakdown on the 5070 vs. 4090 had me shook, like, we really gotta see through the marketing smoke and mirrors. It's all about performance at the end of the day, right? Loved the way you highlighted how appearances can be deceiving-kinda like my cooking skills! 😂 Let’s keep questioning the hype and diving into the realness of tech because it ain't all glitter and glam! Keep it coming! ✨💖
I work in marketing and I fucking hate when companies do this. They should be sued for false advertisement bc this is completely lying to consumers and going to lead to many people getting burnt. Not to mention how bad they are with scalpers, the 5070 will never sell for anywhere near that 549 price. 1000 on the market if we’re lucky.
Where's the false advertising? Their performance graphs have all the features enabled at the bottom of the graph . In the presentation, Jensen said "with AI features enabled".
NVIDIA's marketing has been full of tricks since day one of the company.
@@clarkisaac6372 every company does the same thing. If you're going to rely on marketing and not your own research, you probably deserve whatever scam you fall for. You can't believe anything you read anywhere these days, why would you believe any company lol.
@@sketchyx8307 oh grow up. "Performance" in this context has a decades-long defined meaning, and "fake frames that are not based on user input" does not form part of it.
They just put the 4090's worst-case scenario vs the 5070's best-case scenario. That's not false advertising; it sucks, sure, but if someone gets burned by that, they deserve it. You never trust first-party graphs; you wait to see the real reviews and comparisons. To me, if someone gets fooled by an obvious half-truth, it's their own fault for putting so much trust in a company's word, which is only there to make money and hype a product.
What’s more concerning is the people actually defending this B.S.
Don't buy it and go enjoy your actual GPU. If you're writing here and you worry about people that defend this b.s., you have some problem. Do your thing and let us worry about this b.s.
@ You should probably worry more about grammar and spelling instead of GPU’s.
50 cent army out in force.
You will have to realize this is the future. There is no more raw processing approach; at least, it wouldn't make financial sense. Performance is performance. FPS are FPS. You can make a case for whether upscaling has the same image quality, that's valid. But we are getting to the point where it is indistinguishable or even better. There is no going back. Accept it. Why would I buy a $3000 GPU that, yeah, can give me "native" rendering when I can buy a $500 GPU that would get me there at almost no cost to image quality? The market of people who will buy a $3k GPU to have "native" rendering will be slim. NVIDIA knows this.
PC gaming's death knell was when the whole mainstream "PCMR" era brought on a bunch of console-gamer converts. They brought with them their zero standards in visual clarity and hardware performance, and it was inevitably all downhill from there.
AI frame generation is ass, and I NEVER use DLSS of any kind. Like you said, inputs are still going to feel like the native fps, so it's essentially useless. We want native raw power to push 4k (at minimum) with fully maxed RT options on, at 120fps. We're in, what, the 4th generation of 4k-capable cards, and none of them can handle max ray tracing settings in native 4k and still get at least 60 average fps. It's shameful.
Frame generation and upscaling - the modern equivalent of smearing Vaseline over the lens or squinting.
Just like most of the other post-processing effects that actually make things look worse: depth of field, motion blur, chromatic aberration, lens flare, bloom, etc.
Now including frame generation, upscaling, and that thing where you render a game at higher than native resolution then downscale it to your display's actual resolution.
Or shooting air bubbles into the peanut butter to "whip" it (make it lighter per jar) and then charging extra
Okay Nativecels.
When it comes to graphic in games, everything is smoke, mirrors and shortcuts to achieve the result.
Can we go back to MSAA? It looked so much better than the smeary mess of TAA
I'm against fake frames.
same
SAME! Always hated it.
I remember when they added DLSS to satisfactory and it made the game look like absolute ass
You won't be when we have 1000Hz, 2000Hz, or 8kHz monitors. Imagine running games at 250fps natively but using frame generation to reach the monitor's refresh rate; you won't feel any latency difference compared to running at 8000fps natively.
This is a step in the right direction, you might not see it yet, but save this comment and check on it 12-14 years from now.
@@STXNCIC Faking shit is NEVER a "step in the right direction". You're delusional.
From my experience using frame generation in marvel rivals, when it says "80-120" frames but it FEELS like the actual 20-30 frames in responsiveness, it's terrible. It's a terrible feeling and the disconnect between visual frames and responsiveness is another layer of discomfort on top of the discomfort of it being at such a low frame rate.
It depends on the game. I played Crysis 2 Remastered using AMD's driver frame generation. It went from a locked 60 fps to 120 fps; overlay said this added 10ms extra latency, but inputs felt just as fine as without frame generation enabled.
@@GDRobbye The issue is us morons who play competitive games. We are usually the latency-sensitive ones.
@@micobugija6284 Doubt. If you played a game with 50ms of full system latency, you wouldn't be able to accurately say when it switches to 55ms. Which is about the latency current framegen adds. It's a difference of 1/200th of a second.
You literally cannot play fps games of any sort with it, the dead zones are insane.
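The responsiveness complaints in this thread come down to simple frame-time arithmetic. Here's a minimal sketch of that math (my own simplified model, not NVIDIA's published figures), assuming interpolation-based frame generation buffers one rendered frame before displaying in-betweens:

```python
# Rough latency model for interpolation-based frame generation.
# Assumptions (mine, simplified): the GPU still renders only at the
# base rate, and interpolating between two rendered frames means
# holding one back, so roughly one base frame time of extra latency.
# Real overhead varies per game and per vendor.

def frame_time_ms(fps: float) -> float:
    """Time between frames in milliseconds."""
    return 1000 / fps

base_fps = 60      # what the GPU actually renders
gen_factor = 4     # 4x multi frame generation

shown_fps = base_fps * gen_factor        # what the fps counter reports
extra_latency = frame_time_ms(base_fps)  # buffering cost of interpolation

print(shown_fps)                           # 240
print(round(frame_time_ms(shown_fps), 2))  # 4.17 ms between displayed frames
print(round(extra_latency, 2))             # 16.67 ms added vs. native 60 fps
```

The counter quadruples, but the extra frames don't sample input; the game still reacts at the 60 fps base rate, plus the buffering cost.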
I’m over here sitting on my 2080ti still, and to be honest. I’m still very happy with it.
If you dial back some settings you will have decent performance. I have the 3080 and will wait for the 6000 generation or get an AMD if the price/performance is right.
I'm still on my 2070 and modern games are starting to piss me off XD. On one hand you get a few gems like that Robocop game, which seems like it will run on anything but looks like it was rendered on a supercomputer and is just stunning. Then you have 99% of other games acting like everyone has a 4090 with unlimited VRAM. The RE4 demo kept crashing on me because I only have 8GB of VRAM! But somehow that used to be more than enough... Hell, even Cyberpunk ran perfectly fine on my computer with RT and everything when it came out, but now it's like a slideshow with everything turned down or off because they decided to update it and require higher specs. All this "AI" stuff is just going to make games worse, because they will be made with it in mind.
Well my 1060 is getting a bit long in the tooth 😅
I have a 3090 (24GB VRAM), I play at 1440p, and I do a lot of AI rendering with Stable Diffusion.
I was planning on upgrading to a 5090, but I can still run everything quite well with my 3090, and the funny thing is... for AI and Stable Diffusion I don't find the 3090 "slow" at all. It's still quite a good card for that too.
@@PaletoB Replace it with any other AMD card, like a 6800 XT; all the 2nd-hand ones are still great. Nvidia cards are still overpriced even 2nd hand.
We all remember the claim of a 4070 being three times faster than a 3090 and how that turned out. At this point nothing Nvidia says can be trusted. However, without wanting to sound arrogant, it's not meant for us, so to speak; it's for people who hear "5070 as good as 4090 for such a cheap price," and that is all they will ever hear, and they'll go and buy it. While Nvidia may be an untrustworthy company, they are not stupid, and they know it will work on enough people to make a fortune.
The funny thing is the 4070 is worse than the 6800 XT
Not seeing Jay in the thumbnail when it comes to technical data and important statistics brings a lot of confidence.
Sooooo, does that mean your FP Shooters will look smoother but you’re gonna miss your shot because your location will be different than what you’re seeing?
If it's pure frame gen issues, then you'll miss due to input lag
Bingo
If you go to the 5:30 mark and watch at 0.2x speed, look at the amount of ghosting you see when spinning the camera around. I really wonder what that will look like playing a high-speed FP shooter where you are constantly spinning the camera in all different directions.
No, it generates frames in between 2 render frames.
I think your game will react with higher latency to your mouse input. i find it unplayable at 60 fps.
If you go to the web page for the RTX 4090 and scroll down, you'll see Cyberpunk side by side with DLSS/FG off and on. 4090 got 20 FPS with DLSS/fg turned off.
The RTX 5090 page also has a Cyberpunk video side by side DLSS/FG off and on. 5090 got 26 FPS with DLSS/FG off.
So the RTX 5090 got from 20 fps to 26. It's still bad, but I think people should understand that those fps at 4k max settings with path tracing are super good; before, it would take minutes or hours to render a frame, and now you are doing it in a fraction of a second.
So 30% faster is the real number. Good info
20 FPS? Wait a minute…does this mean most of the frames are fake?
@@Name-ho4ttobviously lol they are fake frames
So you are getting 6 fps more for like $400 more in raster? 😂
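The arithmetic this thread is doing can be sketched in a few lines (a rough illustration using the native numbers quoted above, 20 fps on the 4090 and 26 fps on the 5090, plus an assumed 4x multi frame generation of 1 rendered frame and 3 generated ones; this is my framing, not NVIDIA's official math):

```python
# Raw-vs-marketed uplift, per the product-page numbers quoted above.

def raw_uplift_pct(old_fps: float, new_fps: float) -> float:
    """Percent improvement in natively rendered frames."""
    return (new_fps - old_fps) * 100 / old_fps

def displayed_fps(native_fps: float, gen_factor: int = 4) -> float:
    """Frames shown on screen with multi frame generation enabled."""
    return native_fps * gen_factor

print(raw_uplift_pct(20, 26))  # 30.0 -> the "real" generational uplift
print(displayed_fps(26))       # 104  -> the kind of number marketing quotes
print(1 / 4)                   # 0.25 -> fraction of frames actually rendered
```

Which is the point of the thread: a ~30% native gain becomes a headline-friendly 4x number once three quarters of the displayed frames are generated.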
Frame Generation is "Motion Smoothing", it is NOT rendering "Performance".
Nvidia's claim runs very close to an illegal level of false advertising.
This is as shocking as Nvidia tying the price of its gpus to the A.I. market
I never eat without a brick of salt when it comes to marketing, they want our money and will lie for it.
I love when people say "oh, frames are now AI!" cuz when they think a little about what that means, their argument falls apart (nothing on the screen is real, so why care if it's AI or not)
exactly! as if those "RAW" frames were actual film in the old days. :'D
if it's better than the 4070 at the price of a 2080 Ti OC then I'm getting it
ppl have a hate boner for ai
4 times of fake frames? We should be allowed to buy 50 series cards with 1/5 of their MSRP and pay the rest 4/5 with generated fake money.
it's one real to three fake, so we should buy at 1/4 of the price and pay the rest in Monopoly money.
@@huh0123 Are US dollars anything other than Monopoly money that the government pretends is real?
Unoriginal AMD you got the math wrong
I got a whack of old monopoly money lying around, think mail order would work?
You'll never get the suggested MSRP outside the slide, therefore 1/5 .
How is this not bordering on false advertising?
That's the thing, my friend. They spend a looooot of money on engineers, marketing, and lawyers to make sure they stay riiiiiight on that line.
It's technically not, because "the fps counter says so". But in reality, it is. It's bending the rules/gray area/smoke and mirrors, whatever you want to call it. BS. LOL
Because if you did your research instead of just commenting, you would know that Nvidia is technically right about it.
When Jensen publicly stated the 5070 will equal the 4090 in performance, he very clearly specified that all the 'crutches' should be enabled (upscaling + multi fake frame gen). Even he knows to (try to!) avoid crossing that line.
HOWEVER, yes, I STRONGLY believe he DID cross that line, simply because the 99% of buyers NOT 'in the know' will falsely believe the deliberate marketing headline misinformation... that the 5070 = 4090.
--> PLEASE let there be a class-action lawsuit... if only for the bad publicity Nvidia will get!
@@fightnight14 "Technically the bullet killed him. I didn't" still gets you hung.
Hi Phil. Always nice to see you in front of the camera, and your presenting skills keep improving.
They can't do it in hardware, so they'll do it in software. They are really taking the "fake it 'til you make it" business mantra to heart.
AMD has FSR and frame gen, so do you have them turned off? Stop crying about something u can't get
I take your point but it is hardware accelerated. As in you won't see the same improvement on a 40 series that you will on the 50 series even if they're both running DLSS 4. Hardware improvement for sure but that hardware doesn't actually improve raw performance at all.
Ohh, they can do it in hardware. But why would they? They are greedy and saving money by propping it up with software. From a business standpoint it's amazing, but as a consumer you're essentially paying for just a small bump in raster and a big push of frame gen.
Imagine being a tech enthusiast and saying software evolution is bad??? LMAAAAAAAAO
@@Nobody-su9km Nobody said either of those things. LMAO, indeed.
it's basically just the image smoothing filter on TVs that Scorsese told us to turn off. My console playing friend figured this out years ago, "who needs a gaming PC when I got Image Smoothing?"
After 150fps pc gaming I never used my PS4. It feels like there is a massive lag even if the gameplay is smooth.
Thank you so much for the video. There are already many videos on YouTube claiming that if you have a 40-series graphics card, it's not worth upgrading to the 50 series; don't throw away your money.
I turned frame gen on once and noticed that it screws up movement, shadows, some parts of the UI, markers, and other shit... never again, that thing can fuck off.
Raw 60fps is all I need.
It also feels...weird. I know there's more input lag but instead of there being no new frame, there's wrong frames, which feels much worse than there being no frame at all.
I usually have FG off if my 4090 can actually do 4k 120fps+ with or without DLSS3 quality, or even somewhere around 80-90fps in the most demanding games. RT is killing performance, and bad optimization of games is even worse, like the recent UE5 games. In some games it makes the game look visually stunning but run like crap, and vice versa.
When FG is on, the input lag is noticeably higher because the base frame rate is lower. But this can also be affected by the game engine creating even more input lag, which can/will happen on any current and future AI card like the 5090 and so on.
The only reason to have FG is when using RT, which is the performance killer on any card.
Yes, FG will always feel smoother, but input lag will feel worse, and Reflex 2 will not help as much because games are more demanding, giving a lower base frame rate, which in turn gives worse FG frames.
@@TheEdmaster87 that someone is even considering optimization on a 4090 is ridiculous...
@@TheEdmaster87 apparently reflex 2 is supposed to be able to accept mouse inputs even on generated frames, with something they are calling 'frame warping', thus 120 fps should feel like 120 fps instead of 30 fps.... obviously needs to be tested though.
My 4090 chuckles at the audacity 😂
So does my 3090...
@SlimeyGuitarStrings 😁👌
This is the time to buy a RTX 4090
idk if you will respond to this but i dont have enough money for a 4090 so would it be better to get a 5070 ti or a 4070 ti?
@milk_savy Get a used 4070 Ti. Look for VRAM
Boycotting Nvidia is the way to go
Also notice these comparison charts are comparing the 5070 to the 4070, not the 4090. Check the top right corner.
"Phil, do NOT move from that chair we still have 34 more videos to make" 🤣
I don't mind playing at low FPS, most of my older games only run correctly at 30, but I will never ever use frame gen.
Upscaling is one thing, a whole new frame, that's a step too far.
I think it's completely worth it. You rarely notice the artifacts just playing the game, I more notice the entire image becomes a bit less sharp. Not as bad in modern games just because they're already smeared by TAA or DLSS. Really wish we'd go back to the good old days of razor sharp MSAA.
And from using Lossless Scaling, adding more than one fake frame really doesn't make the artifacts that much worse. I honestly even use it a lot on videos, because for some reason you still find people uploading 30fps videos today.
@@Skylancer727 smaa msaa ssaa, anything but taa please! I don't play video games standing still.
Literally your previous video title says "RIP 4090 owners"... You're part of the problem.
This YouTube channel exists to generate revenue. They chase trends. The new Nvidia cards are getting a lot of press, so covering any aspect of them is guaranteed to get clicks. The topic of AI frame generation just so happens to be the next harvest.
I don't see that; I searched back 3 months.
@@godw1ll99 It's in the title of the 5090 announcement video before this one, not the thumbnail.
Unless YouTube is doing another test thing again. I know you can let YouTube rotate 3 different thumbnails so you get a better idea of what style performs best for you. I didn't think it also applied to video titles, but maybe it can, and you just got served a non-RIP version?
@@Nareimooncatt Ah, right, I see that. I don't see how that's the same, though, because it doesn't simply say "RIP 4090 owners." OP is being misleading by omitting that it refers to the 50-series cards as a whole, not specifically the 5070 vs. 4090, so the marketing-BS point doesn't apply to that video or its title. It's an entirely different discussion/point of contention.
If we are to look at the content of that video, it's entirely fair to say "RIP 4090 owners." Then again, that sentiment has come up with every new launch.
Yep, everyone's surprised because they bought an overpriced GPU. The 3090 and 4070 are the same — it's normal that they have the same performance :)
Still running a 1080, looking to upgrade.
Looking forward to your reviews and benchmarks!
Best case scenario, this card will be a single player beast which you can comfortably play AAA 4k on high refresh rates and barely notice a difference. On multiplayer and competitive shooter games, you just can’t trust the AI inaccuracies, BUT those games are usually optimized for higher frames anyways, so you should never feel forced to enable it (the raw power would be enough in this case).
I figure mouse lag while panning the camera around will be the obvious annoyance with anything frame-gen related.
I've always said that frame generation is fake frames. And it really is. I want raw performance numbers with no frame generation.
Then buy the 5090
@@L9MN4sTCUk it has the same issue as well
I guess that means to go Radeon
Then invent a new kind of semiconductor to replace silicon. Silicon has been hitting the wall for a while now.
The reason they're jumping through hoops and doing these tricks is that you can't just pack twice as many transistors onto a chip at the same cost with half the power draw and heat every 18 months anymore.
Nvidia makes the best GPUs, especially in the flagship segment, and the only one to blame is AMD, because they don't have flagship GPUs — and when they did, they were terrible. Wait for the benchmarks; Nvidia will dominate as always, like they have for the previous 25 years.
As a 4090 owner, I am gonna skip the next generation of cards.
Well, the 5090 is the only upgrade for you, so yeah, no reason to upgrade.
Also on a 4090, and waiting for LG OLED TVs with HDMI 2.2 and 240Hz. A 5090 at 4K 240Hz will be great even for old games like Max Payne 3.
I will be upgrading mine. Hello Ebay, here is my 4090.
Same. I can kind of see the appeal but not for the staggering amount of cash involved.
A 4090 can't even max everything at 4K in Alan Wake 2 — it struggles — so if you need a really high-end PC, a 5080 or 5090 is the only upgrade choice.
Meanwhile, my 2070 Super: "I'm tired boss."
Don't sell your 40-series card, or even a 3090 Ti, just yet; tests will reveal the actual performance.
When competitive gamers / streamers start using this and find out real fast that something is wrong, the viewers and community at large might finally figure it out.
What would be wrong? What are you talking about?
@@britainvernon9286 Responsiveness problems when playing competitively. Like the video said, it doesn't feel like those fake frames even if it looks like it. The feel of the play isn't there, and in esports that will be a problem.
It actually sounds like they tried to mitigate that with additional warping based on your camera movement, taken from keyboard and mouse input. Don't know how successful they were, but it might feel almost like server lag rather than being obviously from the AI frame gen.
@@britainvernon9286 Input lag. I advise you not to go down that rabbit hole — it's a huge one.
Just wanted to say I hope you guys and your families are all safe and doing okay out there right now — terrible situation with the fires. Thinking of you all. Prayers from the UK. Stay safe.
So basically, with DLSS off, all those shiny 50-series graphics card will be the Emperor's New Clothes?
Yes
And with multi-frame generation on, the 4090 would shut down the 5070 based on the numbers we see so far. Which is why they will not let 40-series owners enable it
Except 5090
@@Informatic1 Do you think the Lossless Scaling app will be an option when combined with a 4090?
Running every new upscaler they have — the same upscalers they've locked the 4090 out of because of money — it might be as "powerful."
Cyberpunk, Alan Wake, and Wukong have a big thing in common: they look beautiful and don't require extremely low-latency inputs from players. If I get 4K res & 240+ fps, why should I care about more than 60Hz of responsiveness? I'm not playing CS2.
Technically my 1060 is as fast as an RTX 9090 if you just close your eyes.
First the game devs use DLSS & Frame Gen as a crutch for bad optimisation, now nVidia is doing the same. I'm shocked.
With "multi-frame gen" the game devs will only get lazier. 🤦
*With NO RT, NO DLSS, NO FRAME GENERATION: the 5070 Ti is close to the RTX 4080 Super. It achieves more FPS in games than the RTX 4080 (non-super).*
Yeah... like the 3090 being "the first 8K gaming card." Are people still going to believe this, or the whole AI absurdity?!
I'm betting on yes, because otherwise the prices of AMD vs. Nvidia stocks wouldn't be so massively different.
AMD fumbles presentations currently, but delivers decently in p/p.
Nvidia is stellar at inserting marketing BS in presentations and delivers better RT performance + the entire ability to use AI which afaik just doesn't run at all on AMD from what I've read on many github AI repos.
The last part of that is definitely not enough to drive their stock so goddamn high. Marketing BS has to be a reason why they are doing so well. They're professional liars.
Never mind that they didn't innovate the actual hardware AT ALL. They increased the die size by 30% and the TDP by 30%, and they're getting 20-30% better performance than last gen... shocker.
Every Nvidia flagship GPU has monopolized the market because AMD hasn't existed in this segment for more than 15 years, hahaha. Even the 7900 XTX was only on par with a 4070 if you enable ray tracing on both...
@diomedes7971 Once you realize the market isn't a collective of sophisticated finance gurus and actually just a crackhead casino where barely anyone reads about or understands anything, it makes so much more sense
@diomedes7971 they didn't fool anyone. Idk what "firms" you're referring to. If they're AI-centric firms they pretty much depend on nvidia and if they're not I don't know what they have to do with this unless they're financial firms.
If they are financial firms they don't need to be fooled to invest in nvidia, they just need to think their stocks will go up.
@@Informatic1 somewhat...but the market for individual companies is so damn volatile that even if you know with absolute certainty that a company is doing the exact right thing when it comes to how they develop products, you will still not be able to predict that their stocks will go up.
There are just too many variables.
If someone can take into account EVERY possible variable that comes into play when deciding an investment in an individual company, it most certainly isn't me.
All you had to do was look at the raw performance increase in Cyberpunk between the 4090 and 5090: it went from 20 to 30 fps without frame gen, lol.
Around 30% increased performance for what — a 40-50% increase in price? Then again, if you have a 4090 and are eyeing the 5090, you should wait a few generations before upgrading.
@@cythose4059 A 20 fps to 30 fps gain IS a 50% gain.
@@bumble3572 Another commenter said only a 6 fps difference in C2077 [can be seen at the bottom of each card page], so yeah, it could be 30%.
@@bumble3572 In reality it was a bump from 20-21 to around 27-28 fps.
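For what it's worth, the disagreement above is just arithmetic: the uplift percentage depends entirely on which endpoint numbers you trust. A quick sketch (the fps values are only the figures quoted in this thread, not verified benchmarks):

```python
def relative_gain(before_fps: float, after_fps: float) -> float:
    """Percentage uplift going from before_fps to after_fps."""
    return (after_fps - before_fps) / before_fps * 100

# 20 -> 30 fps really is a 50% uplift...
print(relative_gain(20, 30))  # 50.0
# ...but 21 -> 27 fps is under 30%, so both camps can be "right".
print(round(relative_gain(21, 27), 1))  # 28.6
```

So "50% gain" and "around 30% gain" are both consistent with the numbers floating around; the endpoints are what's in dispute.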
I think calling that "performance" should be illegal. It's like saying a random car has the same performance as a Lamborghini, when one car's speed is measured as it falls through a building and the other's is measured actually driving on a road.
Companies need to be put in their place with actual laws.
Performance ≠ reading a frame counter.
Same as quoting speed with zero context.
NVIDIA’s Company Mission: Fake It to Make It!
Aint readin allat mate
I do not even watch live reveals or reveals in general because it is mostly just bs marketing. I wait for reviews from multiple content creators before I decide.
Give this man more talking-point videos like this — if he wants it, ha. Good work.
When Jensen claimed 4090 performance in the 5070 I said aloud "that is bullshit isn't it"
Literally same. I heard those words and spit out my drink laughing. I even rewound to listen again, because it sounded so outlandish. But... wow. He actually said what I thought he said.
My next words were "BuuuuulllllllllSHIT". And it was loud enough that my roommate texted me to quiet down lol
But it's not — it's stronger than the 4090.
@@GhostMan407 good joke
I guess that's why you didn't listen to the next part, where he says "with the AI features like frame gen."
@@sketchyx8307 Nvidia isn't going to kiss you. You don't need to shill for them.
The game will actually always only feel as good as your latency allows.
Always refreshing to get a Phil video ❤
I can’t wait to put a 5090 in my $10,000 gaming command center. Just like every other normie gamer.
Yeah, what was he yapping about? Am I poor or something because I "only" have a €2000 PC at home, and not a fully gold-covered, watercooled PC? Wtf is wrong with this dude :D
@@Bastyyyyyy You have to add rgb lights, a lot of them!
When you take all the fake frames and fake resolution out of the way, the generational improvement is almost non-existent. Especially when factoring price and power requirement increases.
That, and a 70-class card still having 12GB of VRAM in the year of our lord 2025, even though it's GDDR7, is ridiculous. Consoles are starting to have more available memory than that.
Consoles are running 30 fps in 2025 — what are you talking about? Consoles are bad for gaming; they're like a decade behind in technology.
Yeah I think we'll be looking at 15% tops, in _real_ performance gain tier-to-tier.
Yes, Nvidia purposely set the 5090 to 32GB of VRAM and the lower-end 5080/5070 to 16GB and 12GB — which is so 2020 — just to make the 5090 look like the ultimate enthusiast card and make people believe 32GB is the way to go. But adding that much VRAM just keeps the price up, not down.
I have the 24GB 4090 and have never come close to 20GB of usage at maxed-out 4K gaming, but I can imagine that 12GB is not and will not be enough at the same settings, which is why they're relying on AI to do the work. But the prices are out of this world, and I believe it's because AMD isn't keeping up.
@@Vennux Friend, I'm talking about total available memory, not overall performance. It's obvious that a top-end gaming PC is far better than a PS5 overall, exactly because a console is already old technology by the time it comes out.
However, the PS5 has 12.5GB of addressable memory for developers. The PS5 Pro increases that to 13.7GB. Even if it's the slower GDDR6, that's still more than a 5070 has. Since consoles are usually the lowest common denominator when developing games, your fancy 5070 might already be memory-capped in today's more demanding games, let alone in 4-5 years.
Nvidia is doing with memory size today what Intel did with 14nm a decade ago. Lack of competition lets the company slow innovation so they can always have some 5% improvement to sell each generation, even if they can do more.
@@TheEdmaster87 I get you. My 3090 also is more than enough for most things and I won't be replacing it anytime soon. 24GB VRAM is more than enough for almost anything nowadays and most AAA games are so unoptimized that even a 5090 will need upscaling and frame generation to play the most demanding stuff at reasonable framerates. I don't like that and don't want it to be the future of gaming.
So essentially, since I have a 144Hz 4K monitor: provided FG can boost me to 144Hz when my GPU can't get there on its own, there's no benefit whatsoever to going beyond that, because latency isn't improved at all — and exceeding your monitor's refresh rate only helps if latency improves too.
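That intuition can be put into a toy model (a simplified interpolation assumption with illustrative numbers — not Nvidia's actual pipeline): perceived smoothness tracks the displayed frame rate, while input latency still tracks the base rendered frame rate, because generated frames never sample your inputs.

```python
def frame_gen_stats(base_fps: float, multiplier: int) -> tuple[float, float]:
    """Displayed fps and rough input latency (ms) under frame generation.

    Simplified model: generated frames interpolate between two real frames,
    so input is sampled only once per *base* frame, and interpolation holds
    the newest real frame back by roughly one base frame time.
    """
    displayed_fps = base_fps * multiplier
    base_frame_ms = 1000.0 / base_fps
    latency_ms = 2 * base_frame_ms  # render one frame + hold it to interpolate
    return displayed_fps, latency_ms

# 30 fps base with 4x multi-frame gen: looks like 120 fps but responds like ~15 fps.
print(frame_gen_stats(30, 4))   # displayed 120 fps, latency ~67 ms
# A native 120 fps render shows the same smoothness at ~17 ms.
print(frame_gen_stats(120, 1))
```

Under this sketch, two setups with the same displayed 144 fps can have wildly different responsiveness, which is exactly why "beyond the refresh rate" gains from FG don't buy you anything.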
You mean a corporation lied to us, standing in a hotel basement wearing a forced outfit? No way, man, they never lie.
I don't mind the frame generation in games. I mind how well it renders 3D projects in Blender.
It's not even at 50% of the 4090's CUDA core count, yet people thinking it has a chance of outperforming one is crazy 😂
I noticed that PC latency in the LTT video comparing the 4090 to the 5090 was nearly identical, meaning the performance uplift was almost entirely from DLSS 4. This really implies that raster performance might be the same or similar.
Rewatch his vid, go to 5:30, and set playback to 0.2x speed. That's where he flips the camera around standing on the stairs, and the amount of ghosting from DLSS not adjusting fast enough to the shifting perspective is bad. Can't imagine what that looks like when you're playing a fast-paced shooter yourself.
It's a new version of vsync.
@@_PatrickO Would you care to elaborate?
so lol 😂 @@ronniebots9225
@@ronniebots9225 You know that recording a monitor and then uploading it to YouTube heavily impacts that too, and YouTube isn't 240 fps either...
No Nvidia fanboy here, just saying your point has a flaw too :)
Nvidia will always claim that new gen is so much better but we all know that is just marketing BS, like if you agree👍
Have you heard of Lossless Scaling on Steam? AI FG technology has already been around, and they basically took this from Nvidia to improve their own tech. I use it on a daily basis, and it does not feel like 60 fps at all. It has 2x, 3x, and 4x modes to generate frames, not to mention it further optimizes your PC for the lowest latency possible. I've gotten Lossless to have almost no latency between frames.
I really think you guys should do a video about it.
Laughing at people who sell their 4090 rn to get a 5070. 😂
Nobody is doing that — please show me data on who did.
Why would anyone sell a card to buy a card of the same performance? Hello?
@@4KGameScape Because if they sold their $1600 4090 for... let's say $1200 to entice a buyer, and then bought a $550 5070, they would then have $650 to spend on whatever they want while still having the performance of their old card. It's a profit play. I'm not defending that a 5070 is as good as a 4090, I'm just telling you the mindset of why a person who thought the cards were equal in performance would sell their 4090: For profit.
If anyone has a 4090 SUPRIM they need to sell that junk ASAP
I am appalled that they're not getting sued for this. It's like saying "my car is 300 hp, but it pulls like 800 hp!" Total false advertising. The 5070 will barely reach a 4080's raw rendering performance, let alone a 4090's, LOL.
I smell a class action lawsuit for flagrant false advertising
I never thought I'd see a day when we were arguing over "real" rendered computer generated images and "fake" computer generated images. 😂
Thank you world. You've perplexed and surprised me once again.
I don't like that all GPU R&D is going into generative AI instead of raster performance like it used to. Maybe I'm too stuck in my ways but I prefer real rendered frames.
I think the reason for this would be because gpu’s would be insanely more costly than just working on the software side but this can only happen because AMD is severely lacking and can’t compete so nvidia just has free reign to charge what ever and not innovate.
Real frame = real headshot; generated frame = a headshot only in your head, because you missed without realizing it 😐😓
@@theanglerfish I'd imagine it might get to the point where it feels like bad hit reg: you're clearly hitting the character, but it doesn't register, because that frame is 3 frames old... 😅
It's essentially junk fps — it doesn't give real value, kind of like Doritos compared to a steak.
The sad fact is we're reaching the limits of shrinking silicon, and within a few generations we won't be able to fit more transistors onto it. That's why AI and software improvements are so important. Until we can switch to something besides silicon, like graphene or living-cell computers based on human brain matter, AI improvements are all we've got.
smells like a class action lawsuit to me. purposefully and knowingly misleading the consumer.
Basically how I understand this: it's like if I were driving the fastest car that exists, except someone put in a worse engine and also took away the speedometer. So I might think I'm driving 150 kilometers (93 miles) per hour while in reality I'm driving 90 kilometers (56 miles) per hour?
yes, a speedometer might show you are doing 120 mph, but in reality you are only doing 40 mph with the new "engine"
Yeah, you're going slower, but now the trees on the roadside are running at you to make you think you're going quicker!
No, it's more like there's a weaker engine for a fraction of the cost, but there are other massive efficiency improvements — to parts other than the engine's horsepower — that let the car reach similar results in a realistic environment.
No, the 5070 isn't going to be as strong raw as the 4090. But if the game supports DLSS 4 and RT at all, you're going to get similar game performance.
The 5070's raw performance isn't even on par with the 4080 or 4080 Super, let alone the 4090.
Can't complain it's cheaper 😂
It used to be that every previous gen's 80-class card matched the next gen's 70-class card, performance-wise.
Seems like clarification should be added to claims. "The 5070 can emulate a frame rate equal to a 4090 in gaming". As a 3D artist, NVIDIA horsepower is not a framerate thing and a 5070 will not touch a 4090 in Blender Cycles rendering.
Note for Phil: latency is not the same with and without FG. They always enable Reflex with FG and had it off when FG was off. Hence the similar latency
Actually, the new Reflex at least makes sense. The old solution just shifts your current frame, which doesn't really help, though it does lower the latency number. The new Reflex (or Warp, or whatever they call it) accounts for your in-game movement too and then updates the frame. So this one could actually be useful — maybe it will improve responsiveness, and 120 FPS generated from 30 FPS will at least feel like 60. Who knows. Can't wait for benchmarks to test all these claims.
My 4090 with no DLSS enabled at my monitors refresh rate of 4K 144Hz all max settings runs at 144 FPS all day long without breaking a sweat. No upgrades needed this time.
On older games, maybe; on new games it can't. There are people still doing videos on the 4090 and 4080, and you can tell devs don't care about the games anymore.
More like publishers don't give time for devs to finish the product.
The most unoptimized games come from shovelware/asset-flip devs that don't care about anything at all — you're right on that. And then there are the massive publishers like Ubi and EA that don't give devs the time games need... Not saying smaller publishers don't fall into the same trap, but still.
Thanks for explaining this to some of us who were left scratching their heads.
Graphic cards come out too often. Just like smartphones. It’s very scammy. We have like 3-5 games on the market that need a high end gpu. Everything else is pretty mid.
No game should need an RTX 4090 or 5090 to play at 4K, yet there are games even an RTX 4090 can't run at 4K — or even 1440p — on max settings. It's called ASA, and it's just the beginning: next-gen games are going to be playable only with DLSS/FSR from now on.
It’s not scammy. You don’t have to upgrade every year.
@@Yakobito Exactly. Only morons think this, and that's who this marketing works on. I've had my 5800X3D for years now, been on AM4 since 2017, and after nearly 4 years upgraded my 6800 XT to a 7900 XTX that I'll probably keep for another 4 years. I only upgrade when I need to.
@@Odin3v Rofl... for what game would you have needed to upgrade a 6800 XT? The 6800 XT is perfectly capable of playing anything today, so you disprove your own statement.
@@Apophis371 Use your brain. Love how you say I disproved myself when you don't know a damn thing about what I was playing or doing, or that I simply wanted an upgrade. I'm playing at 5120x1440 to 7680x2160 resolution and was running out of VRAM in a few new games. The 6800 XT was perfectly playable if you're not running super-ultrawide or 4K. I wasn't happy with barely 60 fps in most games on a 120Hz panel.
Maybe at 1080p, on one game, on a specific motherboard, with some throttle, and a frame rate cap on…
The reason I don't believe it is that no company on the planet would vaporize its previous product line like that and tell you that you can get better for a cheaper price.
AI is the death of gaming. We all need to stop with this frame gen. Raw performance only, please.
Nvidia has lost all respect from me since the 30 series. 40 was a joke, 50 is spitting in your face..
The 1000 series was the worst of them all — it's what enabled them to be what they are now, and the market to be what it is now.
4090 is not a joke
@@rattlehead999 The 1000 series was the best price-to-performance jump from the 900 series... wtaf are you waffling on about?
@@dieglhix compared to previous pricing yh that card is a bargain 😭
@@dieglhix At that power draw and MSRP it wasn't good either.
That latency problem is a really good point I haven't heard mentioned before. In a builder game (like Planet Coaster) that would be fine, but completely useless in a shooter.
Great video
No, it's not true that the 5070 is as fast as the 4090.
Just like it wasn't true that the 4090 was the best investment ever, as the CEO said.
That being said, the 4090 was... no, is... a real powerhouse. I mean that in a good way — a GPU beast.
Thank you for another, as always, great video!
This just sounds like when developers enable motion blur to cover up a game running poorly and give the illusion of fluidity.