You are right in the sweet spot for turning on FSR3: you will get a negligible latency penalty but gain a great frame rate that your monitor can actually display.
That is why I still use "just" a 1080p monitor. I'd rather have native high frame rates than rely on this. If you are a budget or mid-tier gamer, just stay at FHD. The rest is only for people with nothing else in their lives, or rich people.
This is a solid strategy for not falling into the upscaling-everything trap. It's only when you want to go up in size that you need more resolution, for all the other reasons you use your PC.
Yes, it will, but a lot of those older cards will not get the quality they may be expecting out of the performance lift from FSR3, due to their low starting performance. You want to get close to that 60fps target or you will just be better off using other methods of increasing your frame rate.
You might not be playing much going forward. I played God of War using DLSS on quality on a 3090 and it played fine and looked great at 4k. (I just got a 4090)
@@JamesSmith-sw3nk I don't play singleplayer games or mainstream games. I only play Squad, iRacing, Rust, Age of Empires 2 and 4, R Factor 2, Assetto Corsa. I don't really care about having many games, I just want to play the ones that I like already. I don't need upscaling for anything
Me, the one who bought an RTX 4070. Who has dealt with ray tracing since the 20 series. Upscaling as well. I can still get high fps even without frame generation, but one thing to note is that the 1% lows suffer without it. I can still run ultrawide, just not anything higher than 1080p vertically. At least the video is somewhat stupid-proof. But I recognize most people expect too much, from the low end all the way to the high end.
In a lot of the traditional raster titles and earlier RT titles the 4070 can still tear it up, but these newer games are really giving it a hard time if you want to be on 1440p+ resolutions.
That is what anyone getting 30fps in all these super-heavy AAA games will have to do: double it first with FSR2 or lowered settings, then try to double it again with FSR3.
I double my framerate in basically all ray/path traced games with my 4070, thanks to DLSS Quality and Frame Generation, which is a must, considering I'd be getting low 50's at 1440p native. Shit's smooth, so you can tell me I need a 4080 all day long; doesn't really matter. Now, if we're talking FSR3, sure, but that goes for any tier of card. That tech is a shimmer and stutter fest.
A 4070 is a tier or two above the cards I was alluding to. It's more the 4060 and even 4060 Ti users that may find they lack the power to fulfill the promise of FSR3, especially if they wanted to try throwing a game on their 4K TV.
As long as you're not trying to path trace you will be in the above-60fps camp for most games, but as path tracing becomes the new Ultra setting, you will find yourself in the struggling-to-hit-60 camp.
@ultrawidetechchannel you're awesome, thanks man. So with that said, I just learned what path tracing means. So what you're saying is, if more games start to come out like Alan Wake 2, that's the time for my generational jump to the latest GPU. I ask because I'm a first-time PC builder, so the 3080 Ti is my first desktop GPU that I actually bought in a store. Normally I've been a laptop and pre-built gamer, starting with a 1070 then a 2070 in laptop form. Now I've hit a full desktop GPU with the 3080 Ti, so I see these upset users and I try to sympathize, though I fortunately don't struggle like them. So I ask as a way of learning the PC culture, thank you
@@michaelrivera4299 Ya, when you start seeing most of the AAA titles come out with the same tech Alan Wake 2 is using, that's when Ultra settings are off the table for you and you might start feeling like you're missing out.
@ultrawidetechchannel bro you're amazing, I hope your channel explodes. I'm subbing right now because of how good your communication skills are; obviously when you get to those crazy 10k comments it's hard to keep up. Either way, you're the best dude
I know it's been a very long time since this transition happened, but we need to swallow the hard truth: no amount of software is going to bring a 7-year-old PC back from the dead. It's time to upgrade that hardware.
I agree but it's still a shock to the system when the PC you have had that could keep up with every new release for the last 7 years now performs like trash on every game that comes out.
You can FG from 45fps to be in the vicinity of 60; you can even FG from a locked 30 and it's still an improvement over 30; and you can FG without FSR2, which can radically improve your quality. On top of all that, FSR/DLSS will improve the life of the card, because in the past the only answer was to wait 2 years to buy the next gen. Having DLSS/FSR still prolongs the lifespan of a card. More shameless people will be able to get even more from that; I've played Cyberpunk PT and Alan Wake PT on an AMD card with FSR+AFMF at a decent resolution and framerate. The industry is way too dogmatic about the number 60 here. And a new gen will come, pressing for 24GB RAM and 16GB VRAM in January, but then it'll stop till 2028. There'll be plenty of time. Also, Nvidia is improving very slowly now (4060 = 1080 Ti).
Every fps over 30 you can get will cut back on the portion of the screen that ends up looking bad, and its for each person to determine what they can accept visuals vs framerate wise. My view is that especially below 60 you are better off using upscaling and lower settings than going for frame gen, but even above 60 I would push upscaling and lower settings to my visual threshold before going for frame gen.
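The prioritization described above can be formalized as a tiny decision helper. This is my own sketch of the stated heuristic, not an official guideline: exhaust upscaling and settings reductions up to your visual threshold before reaching for frame generation.

```python
# My own formalization of the heuristic above (not an official guideline):
# push upscaling and settings reductions to your visual threshold first,
# and only enable frame generation once the base frame rate is near 60.
def next_step(fps: float, upscaler_maxed: bool, settings_at_threshold: bool) -> str:
    """Suggest the next knob to turn when chasing more frame rate."""
    if not upscaler_maxed:
        return "increase upscaling (e.g. Quality -> Balanced)"
    if not settings_at_threshold:
        return "lower settings toward your visual threshold"
    if fps >= 60:
        return "enable frame generation"
    return "accept the frame rate or upgrade hardware"

# Below 60 with everything already maxed out, frame gen is still a poor fit:
print(next_step(45, upscaler_maxed=True, settings_at_threshold=True))
```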
Consoles do not use all 16 GB of VRAM for the GPU because that memory is shared with the CPU; at best they can use 12 GB for video with the other 4 GB reserved for the CPU, and even that is starving the CPU. To get around this Sony lowers the clock speeds, and thus the performance, of the CPU. Nor do they render at native 4K: they use an upscaler and render at a much lower resolution, which also lessens the VRAM requirements. That is part of the reason most PS5 games, even with upscaling, cannot do 60 FPS and are locked to 30 FPS on highest settings: they are CPU-bound by both clock speeds and memory limitations. The CPU in a PS5 is even weaker than a desktop 3700X because of clock, voltage and TDP limitations, and the GPU is only slightly better than a 5700 XT. If you set your PC game to the same settings a PS5 uses (medium with some low settings), then you are likely to not need more than 8 GB of VRAM. If you crank the textures up to Ultra, of course you will need more, but if you use the medium texture setting like a PS5 almost always does, then 8 GB will get you by in 95% of games.
While the memory is shared on the consoles, the game code is often written first for consoles and is often less efficient and more memory-intensive on PC. While 12GB cards won't really have issues until you try to play at native 4K (which the consoles also can't manage most of the time), 8GB cards are now struggling even at 1440p, and FSR 3 and DLSS 3 actually require more memory when used, unlike FSR 2 and DLSS 2, which lower memory usage. Sure, you may avoid issues playing at console settings and resolution scaling on many GPUs, but I don't think most people pay as much as or more than a console costs for a GPU only to match its visual settings.
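The console memory arithmetic in this exchange can be sketched in a few lines. The 12 GB / 4 GB split is the commenter's estimate rather than an official spec, and the render-target scaling is a rough rule of thumb:

```python
# Back-of-the-envelope sketch of the console memory split discussed above.
# The 12 GB / 4 GB split is the commenter's estimate, not an official spec.
TOTAL_SHARED_GB = 16   # one unified pool shared by CPU and GPU
CPU_RESERVED_GB = 4    # estimated CPU/OS share of that pool

gpu_budget_gb = TOTAL_SHARED_GB - CPU_RESERVED_GB  # ~12 GB left for video

# Upscaling shrinks render-target memory roughly with pixel count
# (framebuffers only; texture memory does not shrink with resolution).
def render_target_scale(internal_px: int, output_px: int) -> float:
    return internal_px / output_px

print(gpu_budget_gb)                                            # 12
print(round(render_target_scale(2560 * 1440, 3840 * 2160), 2))  # 0.44
```

So rendering internally at 1440p instead of native 4K cuts framebuffer memory to under half, which is one reason console-style upscaling also eases the VRAM picture.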
If you consider that most AAA games coming out in the past half year or so are barely playable even on crazy gaming rigs WITH DLSS 2 and FSR 2 on, then yeah, you're fucked even with a current-gen mid-class GPU.
When it comes to path tracing, the 4090 is the only GPU that can hold up to that kind of punishment. As for the Starfields of this world that are demanding, but not because of path tracing, anything 4070 Ti / 7900 XT and above will be good if you want to play at 1440p or higher resolutions.
At no point did I ever see anyone at Nvidia promote any graph or chart where they would be using DLSS Ultra Performance below 8K resolution (and usually it's DLSS Auto, which means 4K Performance, 1440p Balanced, 1080p Quality). What a misleading video. Yikes.
In the early DLSS 3 marketing slides that Nvidia sent out they never specified the DLSS upscaling settings used; they never even mentioned it being used. So while they may be limiting themselves to Performance mode, since they never say so explicitly you just can't be sure. You must admit that needing to render at 1/4 of your monitor's resolution before you can even use the next technology is not an ideal situation.
@@ultrawidetechchannel DLSS Performance is not required to see performance increase, it's just what they settled on with DLSS Auto for 4K and it also happens to not look bad at 4K. Usually these graphs are at 4K, hence DLSS Performance. At this point I think it's obvious that if they don't mention DLSS factor, they are using the "Auto" lookup table (4K DLSS Performance, 1440p DLSS Balanced, 1080p DLSS Quality). Could just be me who got used to it.
Please don't clickbait. They work fine for lower-end RTX cards; this is misinformation. You can get 60 on a 4060, just lower some settings from ultra to high or medium. Alan Wake 2 on PS5 uses low and looks beautiful. You also won't notice it as badly as you claim while you're playing a game in motion with frame generation, if you're starting from 40-60fps, at least with Nvidia's tech.
When Nvidia's marketing materials are all around using Max settings or Ultra Raytracing and showing games going from 22fps to 90fps+, then the promise they are selling you is DLSS3 will allow your card to play at ultra settings no compromise, which is what I'm debunking here. The truth is just like you say. You can get to 60, to use DLSS3, but you have to compromise by lowering settings and turning on upscaling. Your game will look worse than native ultra but that may be a worthwhile trade depending on the game and the person playing it.
Ultra settings look barely different from high or even medium settings in modern titles; it's irrelevant, and PC gamers need to recognize that the consoles' baseline hardware is basically a 3060 Ti or a Radeon 6700 (non-XT) running things on low-medium. PC gamers need to wake up to the reality that they aren't 2 generations ahead anymore. This whole "you can't do DLSS3 on anything but a 4090" narrative is a lie, considering I played Starfield with DLSS at 67% and with frame gen on my 4K 120Hz screen, with Hardware Unboxed's settings at 1800p, on an RTX 4070. It was fine and looked beautiful; you do not need ultra. Also, the argument that native looks better than DLSS is usually irrelevant too, because I don't think the average person will notice a difference on a 4K panel, maybe on a 1080p one, but why are you gaming at 1080p in 2023? Just lower your settings and move to 1440. I guess it's sort of valid that Nvidia is marketing their cards on ultra settings, and that's why we've relegated the 4060 and the 4070 to 1440 and 1080 respectively, but I feel like that's predatory towards consumers? @@ultrawidetechchannel
Come on 4090 users you know all you need is 30 fps. Stop capping. But seriously isn't upscaling suggested you get to 1440p first for the same reason, too little information in a 720p or even 1080p image to properly upscale?
Nvidia marketing tends to suggest that 1080p internal render resolution is the sweet spot for upscaling to higher resolutions, whether that be 1440p or 4K.
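A quick way to see why 1080p internal render keeps coming up: the common upscaler quality modes map output resolution to internal resolution by fixed per-axis ratios. The scale factors below are the widely reported values; treat them as approximations, not vendor-guaranteed numbers.

```python
# Internal render resolutions for the common upscaler quality modes.
# Per-axis scale factors are the widely reported values (Quality ~1/1.5,
# Balanced ~1/1.724, Performance 1/2, Ultra Performance 1/3); treat them
# as approximations, not vendor-guaranteed numbers.
SCALE = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.724,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(width * s), round(height * s)

# 4K Performance renders internally at exactly 1080p, the sweet spot above:
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
# 1440p Quality lands in the same 1080p-class pixel-count neighborhood:
print(internal_res(2560, 1440, "Quality"))      # (1707, 960)
```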
I don't own stock in either company but I don't understand why you think I'm trying to sell you an Nvidia card and not an AMD one. I spend a fair chunk of the video exposing Nvidia's deceptive marketing. All I'm trying to do is warn you that if you're doing everything you can to hit 60 right now and you still can't, then FSR 3 or DLSS3 isn't going to work as well as either company claims it will.
Great video. Here's the secret though: BFI, or black frame insertion. If you can find a monitor or TV with a BFI feature running 60Hz or 100Hz BFI, you can get a game running at 50fps, double the framerate to 100fps with DLSS/FSR3, and then use black frame insertion to increase the motion resolution to over 250fps-like motion, without any input lag penalty or added artifacts. Or you can find a monitor or TV with 60Hz BFI, making 60fps look like 150+fps in terms of motion quality, and it's free. I recommend the LG C1 OLED TV for this.
BFI is definitely cool tech that definitely increases motion clarity, but not many BFI monitors can do variable refresh rate, so you need to hit fixed fps targets if you want no tearing. BTW, 50fps would only net you about 80fps with FSR3/DLSS3 due to the processing overhead; you would need more like 70fps to ensure a 100fps frame rate.
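The overhead numbers in that reply can be reproduced with a simple fixed-cost model. This is my own illustration, not AMD's or Nvidia's published data: assume frame generation adds a fixed cost to each rendered frame and then presents two frames per render. A cost of roughly 5 ms matches both figures quoted above.

```python
# A simple fixed-overhead model of frame generation (my own illustration,
# not vendor data): each rendered frame pays a fixed extra cost, and two
# frames are presented per render. ~5 ms reproduces the figures above.
def fg_output_fps(base_fps: float, overhead_ms: float = 5.0) -> float:
    frame_time_ms = 1000.0 / base_fps
    return 2 * 1000.0 / (frame_time_ms + overhead_ms)

print(round(fg_output_fps(50)))  # 80  -> 50 fps in, only ~80 fps out
print(round(fg_output_fps(70)))  # 104 -> ~70 fps base needed to clear 100
```

The model also shows why frame gen never quite doubles the frame rate: the fixed cost eats a larger share of the frame budget the faster your base frame rate gets.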
I should probably have repositioned my microphone when I switched to the conversation portion of the video, but when you’re recording yourself with no real way to monitor what is going on until you review all the footage, it’s just hard to catch these small issues before it becomes too much work to be worth fixing.
Tbh, despite its price I really think the only GPU worth getting this generation is the 4090. Every other GPU just won't age as well in comparison, imo. Except for the 4090, all GPUs are struggling to get playable framerates in today's games, and that's with upscaling tech and frame gen. Just look at Alan Wake 2, and if you think that's a one-off, nah, that's what is to be expected going forward. Imo, the 4090 is the only GPU poised to give decent performance for the near future. All other GPUs just fall short, unfortunately.
The RX 7900 series has no problem whatsoever delivering a locked 60 fps at 1440p Ultra, and even a locked 144 fps in certain games. Definitely worth it, taking into account that an RX 7900 XT costs around 1000€ in my country, whereas a 4090 is 2000€ and up.
@@kristapsvecvagars5049 The 7900 XTX isn't a bad card at all, but if you want ray tracing it's pretty bad in the majority of today's games. And going forward, the demands of these games are only going to get heavier and heavier, even in raster. Alan Wake is a good example of what's to come. I don't think the 7900 XTX will be able to hang as long as you would like it to. Tomorrow's games will likely be too much for it at 1440p 144Hz. You might even need significant upscaling to have a locked 60fps experience at 1440p in future games.
Unless you have the money anyway, I would honestly just wait until the next console gen (and games for it) are released before spending big bucks on a big upgrade now. I would recommend just getting something that's at most 2x as powerful as the PS5 if you want to play at 4K 60fps, and just drop settings. I mean, if we're honest, even in some PS4 titles it's already really hard to see the visual difference between Hyper Giga Ultra and High settings.
I think most 4090 users will not need to upgrade for this console life cycle. Path tracing is about as hard as it's going to get this gen, until it becomes practical to increase the ray count when more powerful hardware comes out.
I have such disdain for DLSS because it sells a lie to the masses when they'd be better off buying a stronger GPU with more VRAM for their money. I have a 4090, but I recommend AMD GPUs for people on a budget who are below the purchasing threshold of fast ray tracing Nvidia GPUs. Ray tracing and frame gen shouldn't be at the top of your list if you can't afford a GPU with raw rasterization horsepower.
The marketing for the tech has always oversold it, but DLSS and even FSR Quality are worth using. After that, though, targeted lowering of settings is probably the best course of action. Focusing on raster performance first is a good strategy for anyone trying to get the most out of their dollar.
@@ultrawidetechchannel yes, for people buying GPUs at $500 or less, AMD has the best value per $. People keep GPUs for years so getting the most VRAM and fastest overall rasterization for the money will have benefits down the road. 7800 XT costs the same as a 4060 Ti but often outperforms a 4070. The choice should be easy.
This is why future proofing is never a thing and people need to understand that things can change between each generation. It's never been more clear than with this gen.
Except things like the PSU and motherboard.
If you have good rumors you can be really smart with future proofing. I've helped set up multiple PCs for friends this year and we've done some clever tricks: a 6700 XT at the low end, X3D CPUs. They see what's going on, and they're safe and keep downloading more fps each month.
While there is no such thing as future proofing, overprovisioning now to maintain an acceptable level of performance for the life cycle of the current consoles is doable. I think most 4090 users will not need to upgrade for this console life cycle. They probably still will because that is just the kind of people they are, but I doubt they will "need to".
@@ultrawidetechchannel Very true. Though with so many new technologies coming with new GPUs now, things can change very quickly. So for all we know there will be something new next generation, implemented either in the GPU or in games, which will require new hardware to run optimally. Similarly to Alan Wake 2, for instance, where you to a certain degree need the new tech to be able to play the game properly. It's very unlikely, but impossible to truly know. I think this is the direction we are going: more and more reliance on these technologies to push the boundaries of what can be done within games. That's what I meant by it, but you are right :)
Overspeccing does increase longevity, but I feel it did so a lot more before A.I. Things progress so rapidly now, and I doubt they will slow down now that the ball has started rolling. Though if A.I chews up the GPU market, like it has been doing so far... we might not even be looking at dedicated gaming systems in the near future anyway. Console or streaming seems more likely.
It's ray-tracing that requires GPUs to need DLSS 2.0 and 3.
Remove RT and all 4000 series cards will perform well within their native resolution.
Aside from games like Starfield this is mostly accurate, but with consoles using raytracing, even in some performance mode scenarios, it feels kind of sad if your gaming pc can't do it as well.
Good stuff. Been using these artificial frames to take 60 FPS to 120 and it's also a pretty compelling argument for that camp. So at least AMD 6000+ and all Nvidia 4000 users DO actually reap benefits from this tech. But yeah, the argument about breathing new life into old cards is very misleading.
Thanks, ya, the Nvidia marketing department is really trying to convince you a 40-series upgrade is going to solve your problems, even if you can't really benefit from the tech on the bottom rung.
Another brilliant and much needed video. I play full native at 4K on a 13700K / 7900XTX system, with no tracing of any kind. That means I don't need to use up-scaling and frame generation software tricks. If I need more frames, I get them by turning something OFF or DOWN, not by turning something ON. I despise how these magic acts have been pushed at sometimes gullible gamers as the second coming.
Ya, I want them to be pushing for more, higher-quality pixels, not always just more pixels faster regardless of quality.
@@ultrawidetechchannel Well said.
Very solid explanation! My experience is the same. I have a 4070 Ti OC, and with anything less than a basic 4070 you will not benefit greatly from DLSS3.
Thanks. Ya it’s hard to draw the line between frame rate and visual acceptability.
@@ultrawidetechchannel 60+ fps is doable with all 40-series GPUs at their respective resolutions. A 4060 will get 80+ at 1080p high in almost all games; turn frame gen on and boom, 130 fps with the same input lag.
I get that you're trying to simplify things by saying DLSS 3 and FSR 3 use DLSS 2 and FSR 2 to upscale, but I don't think that's true. They only added frame generation as a new feature in the new version of their upscaler. So using DLSS 3 without upscaling is still DLSS 3, not 2.
Yes, you can use frame generation (DLSS 3) without upscaling (DLSS 2), but the 'how it works' slide I used is directly from Nvidia, and they only show it in combination with DLSS 2. They, like me, believe that you should push DLSS 2 as far as you are comfortable taking it before turning on frame generation. So while you don't need DLSS upscaling to use DLSS frame generation, you should definitely be using DLSS 2 upscaling first.
As a side note, every single marketing example Nvidia has uses DLSS 2 in conjunction with DLSS 3, but they always label it only DLSS 3. Which is technically not wrong, because DLSS 3 is everything: the upscaling and the frame gen, and now with 3.5 it also includes Ray Reconstruction. So it's one technology, but the way they talk about it on stage is as if DLSS 3 were only frame generation, when it still includes the upscaling under that umbrella. So when they show you those slides, it makes the frame generation look more impressive than it actually is.
@@ultrawidetechchannel My understanding was that DLSS 3 was an improved version of upscaling over DLSS 2, with additional benefits added on.
But perhaps I'm wrong and it just includes the same old DLSS 2 upscaling with added features?
Eh, I disagree. I managed to play Phantom Liberty with path tracing and DLSS FG + RR on my 4070 Ti at 3440x1440. Is it perfect? Far from it, but it's very playable, with frame rates ranging from 50-75 ish... It's definitely not the same as having real frames, but still smoother than the actual real frames my hardware gets at those graphics settings and res...
Not to mention I'm also aware of the other problems, but in the middle of action games they don't matter, the same way people don't notice bad graphics when playing esports titles...
The reason I made this video is the way that Nvidia, and to some extent AMD, are marketing this tech: they sell it as a no-compromise solution for getting ultra-settings gameplay at high refresh rates. In reality, just as you describe, there are quite a few compromises, but that doesn't mean you can't end up with an improved experience for your gaming situation.
Cool but when I turn on DLSS3 it looks and feels better. Why would I care about what you're saying if it looks better and feels better?
If it looks and feels better to you then you shouldn't care what anyone else says. Though I would be surprised if you were saying that about a game you could only get 30-40fps in before resorting to using frame gen.
I have a 5600X paired with an RX 6800 and a monitor that can do 1080p at 240, 1440p at 144 and 4K at 60. I'm using the preview drivers from AMD right now and my frame rates are better. I'm sure there is fidelity loss somewhere, but when I'm playing a game that before this only gave me 60-72 fps and at times dipped below 60, I'm extremely happy that it can now hold a 70-105 fps range. I'm a gamer, not a pixel watcher. So when I'm blasting the baddies on screen I'm not really that interested in reflection quality, or how sharp a poster or sticker or some other background object isn't being rendered at full 4K native resolution. I do appreciate the info, it's interesting, but I'm here to game. I remember playing Sega console games and thinking they were the shit, so comparing to today, I think you get my point. Enjoy the games you're playing and stop sweating the small stuff.
That 60-72 fps is right where you need to be to have a pretty good time with FSR3 or DLSS3. It definitely puts you into 'you have to be looking for it' territory for the artifacting.
Love your channel and videos. You made a very good point about frame gen. Can you make a series of revisit reviews for 4000 series cards using frame generation? DLSS Quality + Frame Generation is also a good idea to improve Raytracing Performance.
Thanks, frame gen is something I would like to explore more. Not sure how fast that video will come out though.
Very good video and explanation of the tech they bring to us 👍
Thanks, glad you liked it.
I don’t jump through all these hoops. I just game at 1080p with high end GPU and never worry about FPS or lowering graphics.
That is a valid tactic for avoiding the upscaling arms race.
I have a "decent" system (for 2020-2023) with the Ryzen 5950X, 64GB DDR4-3600CL16 (OC'd to 3800CL16), and the Radeon RX 6900XT 16GB (undervolted for best performance.)
Playing at 4k with high settings I can get 60fps on most games, and high refresh levels on many. (Of course, RT is limited. But, that means it is around RTX 2080ti - RTX 3070 levels... Only.)
I want to experience true High Refresh Gaming. But anything above 60fps was lost to me on my "mere" 4k-60 monitor.
I just had to buy a $800 (Canadian) 4k-144Hz monitor.
Now I can enjoy smooth, over-60Hz gaming. Some games at native 90-144Hz, some with FSR2, some (I'm talking to you, Forspoken!) with FSR3/FG.
Not noted by most reviewers/YouTube tech people is the monitor cost of this crazy setup. (Expensive CPU, 5800X3D/7800X3D is better; expensive GPU; AND expensive monitor.)
Monitor bottlenecking is a real issue that is rarely discussed. While some 1080p high refresh monitors are cheap, and some 1440p ones too, not many 4k high refresh monitors are, and getting the best of the best in the monitor game will cost you as much as a strong gaming system.
FSR3 is the only one that supports everybody, even old Nvidia GPUs, and can give them frame gen (even if it shouldn't work)
I applaud the rising-tide-raises-all-ships approach that AMD is taking with FSR 3, but I'm afraid the way the marketing is being done for these technologies, it's going to leave a lot of people very disappointed and frustrated.
I have a 6700XT and get around 80-90 fps with a lil fsr2 and ultra settings on cyberpunk. Luckily it may help my 240hz oled stretch its legs fr
You are right in the sweet spot for turning on FSR3: you will get a negligible latency penalty and a great frame rate that your monitor can realize.
BASED@@ultrawidetechchannel
That is why I still use "just" a 1080p monitor. I'd rather have native high frames than rely on this. If you are a budget or mid-tier gamer, just stay at FHD. The rest is only for people with nothing else in their lives, or rich people.
This is a solid strategy to not fall into the upscale-everything trap. It's only when you want to go up in size that you need more resolution, for all the other reasons you use your PC.
1440p is the sweet spot
But fsr 3 will have frame generation for many cards
Yes, it will, but a lot of those older cards will not be getting the quality they may be expecting out of the performance lift they get from FSR3, due to their low starting performance. You want to get close to that 60fps target, or you will just be better off using other methods of increasing your frame rate.
I have resorted to not playing anything that requires upscaling
You might not be playing much going forward. I played God of War using DLSS on quality on a 3090 and it played fine and looked great at 4k. (I just got a 4090)
@@JamesSmith-sw3nk I don't play singleplayer games or mainstream games. I only play Squad, iRacing, Rust, Age of Empires 2 and 4, R Factor 2, Assetto Corsa. I don't really care about having many games, I just want to play the ones that I like already. I don't need upscaling for anything
This is a growing sentiment, one that I myself lean toward, but instead of none, my line is Quality DLSS or FSR.
@jamesSmith-sw3nk DLSS quality is really getting indistinguishable from native in most cases.
Me, the one who bought an RTX 4070, who has dealt with ray tracing since the 20 series, and upscaling as well. I can still get high fps even without frame generation, but one thing to note is that the 1% lows suffer without it. I can still run ultrawide, just not anything higher than 1080p vertically. At least the video is somewhat stupid-proof. But I recognize most people expect too much, from the low end all the way to the high end.
In a lot of the traditional raster titles and earlier RT titles the 4070 can still tear it up, but these newer games are really giving it a hard time if you want to be at 1440p+ resolutions.
Try enabling FSR3 frame gen + AMD Fluid Motion Frames to triple the double of your FPS 😎🙃
That is what anyone getting 30fps in all these super heavy AAA games will have to do: double it first with FSR2 or lowered settings, then try to double it again with FSR3.
I double my framerate in basically all ray/path traced games with my 4070, thanks to DLSS Quality and Frame Generation, which is a must, considering I'd be getting low 50's at 1440p native. Shit's smooth, so you can tell me I need a 4080 all day long; doesn't really matter.
Now, if we're talking FSR3, sure, but that goes for any tier of card. That tech is a shimmer and stutter fest.
A 4070 is a tier or two above the cards I was alluding to. It's more the 4060 and even the 4060 Ti users that may find they lack the power to fulfill the promise of FSR3, especially if they wanted to try throwing a game on their 4k TV.
So what group do I sit in if I have a 3080ti
As long as you're not trying to path trace, you will be in the above-60fps camp for most games, but as path tracing becomes the new Ultra setting, you will find yourself in the struggling-to-hit-60 camp.
@ultrawidetechchannel You're awesome, thanks man. So with that said, I just learned what path tracing means. So what you're saying is, if more games start coming out like Alan Wake 2, that's the time for my generational jump to the latest GPU. I ask because I'm a first-time PC builder, so the 3080 Ti is my first desktop GPU that I actually bought at a store. Normally I've been a laptop and prebuilt gamer, starting with a 1070, then a 2070, in laptop form. Now I've hit a full desktop GPU with the 3080 Ti, so I see these upset users and I try to sympathize, though I fortunately don't struggle like them. So I ask as a way of learning the PC culture, thank you
@@michaelrivera4299 Ya, when you start seeing most of the AAA titles come out with the same tech that Alan Wake 2 is using, that's when Ultra settings are off the table for you and you might start feeling like you're missing out.
@ultrawidetechchannel Bro, you're amazing, I hope your channel explodes. I'm subbing right now because of how good your communication skills are. Obviously when you get to those crazy 10k comments it's hard to keep up. Either way, you're the best dude
@@michaelrivera4299 my goal is to reply to every comment while I still can.
I know it's been a very long time since this transition happened, but we need to swallow the hard truth: no amount of software is going to bring a 7-year-old PC back from the dead. It's time to upgrade that hardware.
I've updated a 3600 to a 5800X3D and a 6700XT to a 7900XT on crazy price cuts - the rest of the PC is the same.
I agree, but it's still a shock to the system when the PC that could keep up with every new release for the last 7 years now performs like trash on every game that comes out.
@michahojwa8132 With some of the sales I have seen for the 7900XT, that is probably the most bang-for-the-buck system you could have built.
You can FG from 45fps to be in the vicinity of 60, you can even FG from a locked 30 and it's still an improvement over 30, and you can FG without FSR2, which can radically improve your quality. On top of all that, FSR/DLSS will improve the life of the card, because in the past the only answer was to wait 2 years and buy the next gen. Having DLSS/FSR still prolongs the lifespan of a card. More shameless people will be able to get even more from that - I've played Cyberpunk PT and Alan Wake PT on an AMD card with FSR+AFMF at a decent resolution and framerate. The industry is way too dogmatic about the number 60 here. And a new gen will come, pressing for 24GB RAM and 16GB VRAM in January, but then it'll stop till 2028. There'll be plenty of time. Also, Nvidia is improving very slowly now (4060 = 1080 Ti).
Every fps over 30 you can get will cut back on the portion of the screen that ends up looking bad, and it's for each person to determine what they can accept, visuals vs framerate wise. My view is that, especially below 60, you are better off using upscaling and lower settings than going for frame gen, but even above 60 I would push upscaling and lower settings to my visual threshold before going for frame gen.
Very well put
Thank you.
Consoles do not use all 16 GB of the VRAM available for the GPU, because that memory is shared with the CPU. At best they can use 12 GB for video, with the other 4 GB reserved for the CPU, and even that is starving the CPU. To get around this, Sony lowers the clock speeds, and thus the performance, of the CPU. Nor do they render at native 4K; they use an upscaler and render at a much lower resolution, which also lessens the VRAM requirements. That is part of the reason most PS5 games, even with upscaling, still cannot do 60 FPS and are locked to 30 FPS on highest settings: they are CPU bound both by clock speeds and memory limitations.
The CPU in a PS5 is even weaker than a desktop 3700X because of clock, voltage and TDP limitations, and the GPU is slightly better than a 5700XT. If you set your PC game to the same settings a PS5 is using (Medium with some low settings), then you are likely to not need more than 8 GB VRAM. However, if you crank up the textures to Ultra, well, of course you will need more, but if you use the medium textures setting like a PS5 almost always does, then 8GB will get you by in 95% of games.
While the memory is shared on the consoles, the game code is often written first for consoles and is often less efficient and more memory intensive on PC. While 12GB cards won't really have issues until you try to play at native 4k (which the consoles also can't manage most of the time), 8GB cards are now struggling even at 1440p, and FSR 3 and DLSS 3 actually require more memory when used, unlike FSR 2 and DLSS 2 which lower memory usage.
Sure, you may avoid issues playing at console settings and resolution scaling on many GPUs, but I don't think most people pay as much or more than a console costs on a GPU to only match its visual settings.
If you consider that most AAA games coming out the past half year or so are barely playable even on crazy gaming rigs WITH DLSS 2 and FSR 2 on, then yeah, you're fucked even with a current-gen mid-class GPU
When it comes to path tracing, the 4090 is the only GPU that can hold up to that kind of punishment. As for the Starfields of this world, which are demanding but not because of path tracing, anything 4070 Ti/7900 XT and above will be good if you want to play at 1440p or higher resolutions.
Great analysis.
Thank you
At no point did I ever see anyone at Nvidia promote any graph or chart where they would be using DLSS Ultra Performance below 8K resolution (and usually it's DLSS Auto, which means 4K Performance, 1440p Balanced, 1080p Quality).
What a misleading video. Yikes.
In the early DLSS 3 marketing slides that Nvidia sent out, they never specified the DLSS upscaling settings used; they never even mentioned it being used. So while they may be limiting themselves to Performance mode, since they never say so explicitly, you just can't be sure. You must admit that needing to render at 1/4 of your monitor's resolution before you can even use the next technology is not an ideal situation.
@@ultrawidetechchannel DLSS Performance is not required to see performance increase, it's just what they settled on with DLSS Auto for 4K and it also happens to not look bad at 4K. Usually these graphs are at 4K, hence DLSS Performance.
At this point I think it's obvious that if they don't mention DLSS factor, they are using the "Auto" lookup table (4K DLSS Performance, 1440p DLSS Balanced, 1080p DLSS Quality). Could just be me who got used to it.
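For anyone keeping score, the commonly cited per-axis DLSS scale factors work out to internal render resolutions like the sketch below (these factors are the widely reported ones, not something pulled from Nvidia's slides, so treat the exact numbers as assumptions):

```python
# Rough sketch: internal render resolution for common DLSS modes.
# Per-axis scale factors are the commonly cited ones (Quality ~66.7%,
# Balanced ~58%, Performance 50%, Ultra Performance ~33.3%).
SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height, mode):
    """Return the internal render resolution for a given output res and mode."""
    s = SCALES[mode]
    return round(width * s), round(height * s)

for mode in SCALES:
    print(mode, internal_res(3840, 2160, mode))
# Performance at 4K comes out to 1920x1080 - half the pixels per axis,
# so only 1/4 of the output pixel count is actually rendered.
```

This is why the "Auto" lookup table described above makes sense: at 4K even Performance mode still starts from a full 1080p image, while at 1080p output you need Quality mode just to keep the internal resolution above 720p.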
Please don't clickbait. They work fine for lower-end RTX cards. This is misinformation. You can get 60 on a 4060, just lower some settings from ultra to high or medium. Alan Wake 2 on PS5 uses low and looks beautiful. You also won't notice it as badly as you claim you will while you're playing a game in motion with frame generation, if you're starting from 40-60fps, at least with Nvidia's tech.
When Nvidia's marketing materials are all built around Max settings or Ultra ray tracing, and show games going from 22fps to 90fps+, the promise they are selling you is that DLSS3 will let your card play at ultra settings with no compromise, which is what I'm debunking here. The truth is just like you say: you can get to 60 to use DLSS3, but you have to compromise by lowering settings and turning on upscaling. Your game will look worse than native ultra, but that may be a worthwhile trade depending on the game and the person playing it.
Ultra settings look barely different than high or even medium settings in modern titles; it's irrelevant, and PC gamers need to recognize the consoles' baseline hardware is basically a 3060 Ti or a Radeon 6700 (non-XT) running things on low-medium. PC gamers need to wake up to the reality that they aren't 2 generations ahead anymore. This whole "you can't do DLSS3 on anything but a 4090" narrative is a lie, considering I played Starfield with DLSS at 67% and with frame gen on my 4k 120Hz screen, with Hardware Unboxed's settings, at 1800p on an RTX 4070. It was fine and looked beautiful. You do not need ultra. Also, the argument that native looks better than DLSS is usually irrelevant too, because I don't think the average person will notice a difference on a 4k panel, maybe on a 1080p one, but why are you gaming at 1080p in 2023? Just lower your settings and move to 1440p. I guess it's sorta valid that Nvidia is marketing their cards on ultra settings, and that's why we've delegated the 4060 and the 4070 to 1440p and 1080p respectively, but I feel like that's predatory toward consumers? @@ultrawidetechchannel
I have a 1660 super. Not sure why I'm here.
FSR3 can technically run on that GPU; just don't expect an in-tier upgrade to a 4060 to let you really benefit fully from DLSS 3.
Come on 4090 users you know all you need is 30 fps. Stop capping.
But seriously, isn't it suggested you get to 1440p first for upscaling for the same reason - too little information in a 720p or even 1080p image to properly upscale?
Nvidia marketing tends to suggest that a 1080p internal render resolution is the sweet spot to upscale from to higher resolutions, whether that be 1440p or 4k.
I do have a 580, I can use FSR3. I feel like this guy is lying to save his Nvidia shares
I don't own stock in either company but I don't understand why you think I'm trying to sell you an Nvidia card and not an AMD one. I spend a fair chunk of the video exposing Nvidia's deceptive marketing.
All I'm trying to do is warn you that if you're doing everything you can to hit 60 right now and you still can't, then FSR 3 or DLSS3 isn't going to work as well as either company claims it will.
The annoying clone was on point with what you read on the internet 😂
Thanks, just had to use what fanboys said to me whenever I recommended the competition's card over their brand.
Great video. Here's the secret though: BFI or black frame insertion. If you can find a monitor or TV with a BFI feature running 60hz BFI or 100hz BFI, you can get a game running at 50fps, double the framerate to 100FPS with DLSS/FSR3, and then use black frame insertion to increase the motion resolution to over 250FPS-like motion without any cut to input lag or added artifacts.
Or you can find a monitor or TV with 60Hz BFI, making 60FPS look like 150+FPS in terms of motion quality, and it's free. I recommend the LG C1 OLED TV for this.
BFI is definitely cool tech that increases motion clarity, but not many BFI monitors can do variable refresh rate, so you need to hit fixed fps targets if you want no tearing. BTW, 50fps would only net you about 80fps with FSR3/DLSS3 due to the processing overhead; you would need more like 70fps to ensure a 100fps frame rate.
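To put rough numbers on that overhead claim: if you model frame gen as adding a small fixed cost per rendered-plus-generated frame pair (the 5 ms figure here is an assumption for illustration, not a measured value), the arithmetic lines up with those estimates:

```python
# Sketch of why frame gen doubles the frame count but not the frame rate.
# overhead_ms is an ASSUMED fixed cost per real+generated frame pair.
def effective_fps(base_fps, overhead_ms=5.0):
    base_frametime = 1000.0 / base_fps        # ms to render one real frame
    pair_time = base_frametime + overhead_ms  # real frame + generation cost
    return 2000.0 / pair_time                 # two displayed frames per pair

print(round(effective_fps(50)))  # ~80 fps, not the naive 100
print(round(effective_fps(70)))  # ~104 fps, clears a 100 fps target
```

Under this assumed model, a 50fps base lands around 80fps after frame gen, and you need roughly a 70fps base to hold a fixed 100Hz BFI target, which matters precisely because BFI displays usually can't do VRR.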
speak louder please you are not in a church
I should probably have repositioned my microphone when I switched to the conversation portion of the video, but when you’re recording yourself with no real way to monitor what is going on until you review all the footage, it’s just hard to catch these small issues before it becomes too much work to be worth fixing.
Tbh, despite its price I really think the only GPU worth getting this generation is the 4090. Every other GPU just won't age as well in comparison, imo. Except for the 4090, all GPUs are struggling to get playable framerates in games today, and that's with upscaling tech and frame gen. Just look at Alan Wake 2, and if you think that's a one-off, nah, that's what is to be expected going forward. Imo, the 4090 is the only GPU poised to give decent performance for the near future. All other GPUs just fall short, unfortunately.
RX 7900 series has no problem whatsoever to deliver locked 60 fps @1440p Ultra and even locked 144 fps for certain games. Definitely worth it, taking into account that an RX 7900 XT costs around 1000€ in my country, where as a 4090 is 2000€ and upwards.
@@kristapsvecvagars5049 The 7900 XTX isn't a bad card at all, but if you want ray tracing it's pretty bad in the majority of today's games. And going forward, the demands of these games are only going to get heavier and heavier, even in raster. Alan Wake is a good example of what's to come. I don't think the 7900 XTX will be able to hang as long as you would like it to. Tomorrow's games will likely be too much for it at 1440p 144Hz. You might even need significant upscaling to have a locked 60fps experience at 1440p in future games.
Unless you have the money anyway, I would honestly just wait until the next console gen (and also games for it) are released before spending big bucks on a big upgrade now. I would recommend to just get something that's at most 2x as powerful as like the ps5 if you want to play at 4k 60fps and just drop settings. I mean if we're honest even on some PS4 Titles it's already really hard to see the visual difference between Hyper Giga Ultra and High Settings.
nonsense.
I think most 4090 users will not need to upgrade for this console life cycle. Path tracing is about as hard as it's going to get this gen, until it becomes practical to increase the ray count when more powerful hardware comes out.
I have such disdain for DLSS because it sells a lie to the masses when they'd be better off buying a stronger GPU with more VRAM for their money. I have a 4090, but I recommend AMD GPUs for people on a budget that are below the purchasing threshold of fast Ray tracing Nvidia GPUs. Ray tracing and frame gen shouldn't be on the top of your list if you can't afford a GPU that has raw rasterization horsepower.
The marketing for the tech has always oversold it, but DLSS and even FSR Quality are worth using. After that though, targeted lowering of settings is probably the best course of action. Focusing on raster performance first is a good strategy for anyone trying to get the most out of their dollar.
@@ultrawidetechchannel yes, for people buying GPUs at $500 or less, AMD has the best value per $. People keep GPUs for years so getting the most VRAM and fastest overall rasterization for the money will have benefits down the road. 7800 XT costs the same as a 4060 Ti but often outperforms a 4070. The choice should be easy.