DLSS 3 Frame Generation - major selling point, or worthless marketing gimmick?

  • Published 22 Dec 2024

COMMENTS •

  • @Ctrl-Alt-Bruno
    @Ctrl-Alt-Bruno 2 роки тому +1122

    The real problem is when developers launch games that REQUIRE it to run properly even on high-end computers. Other than that, it's a nice feature to have.

    • @iurigrang
      @iurigrang 2 роки тому +37

      That can also be said of true performance uplifts, so I can’t really see it as a downside.

    • @maaax1173
      @maaax1173 2 роки тому +54

      It's not a problem if it runs fine on Medium/High settings while still looking great. In fact, that should be desired: Ultra settings should be for top-end and future PCs, whereas Low settings should run well on a 5-year-old mid-range system

    • @icy1007
      @icy1007 2 роки тому +16

      It is not required in any game...

    • @jaredj9099
      @jaredj9099 2 роки тому +10

      This is it! DLSS 3 is amazing and will beat FSR 3 when it releases. But I hate games that need it to hit 60 fps

    • @jinx20001
      @jinx20001 2 роки тому +32

      I understand your grievance here, but I actually completely disagree... I think developers need to push graphics as far as they can in some cases. We need games that push boundaries, and if they can do that knowing the technology is there that will let them reach those levels and make them playable, I'm all for it... I think it's pretty clear now that games like Portal RTX would not exist without DLSS 3; they simply would not have bothered, because who could play it at 6 fps? The technology literally gives developers a reason to show us what they can do, and I'm happy about that... I do not want to play console-quality graphics on my £2000 graphics card, I want a game now and then that goes fuck it, let's go for it.

  • @Static_Age
    @Static_Age 2 роки тому +356

    I love how this channel is not just reviews, but quality education on GPUs and games / settings. I have learned so much here.

    • @GregoryShtevensh
      @GregoryShtevensh Рік тому +12

      That's the natural teacher coming out in him lol

  • @cpuccino
    @cpuccino 2 роки тому +342

    As someone who plays the witcher on a 4k 144hz monitor, I’m definitely thankful this feature exists. I was very skeptical at first cause yknow “fake frames” but after getting a 4090 and experiencing it first hand, it’s pretty nuts

    • @1GTX1
      @1GTX1 2 роки тому +32

      It's great, but for millions of people in the future it will be a $300 RTX 4050 at 60 fps with DLSS and frame generation, so lower image quality and bad latency

    • @sonnieslim5973
      @sonnieslim5973 2 роки тому +81

      @@1GTX1 none of that is true lol

    • @1GTX1
      @1GTX1 2 роки тому +2

      @@sonnieslim5973 What do you mean

    • @aleksfreeman3627
      @aleksfreeman3627 2 роки тому +8

      I agree with you! Playing The Witcher at 2K 240Hz, and FG helps a lot; without it, frames get fucked even on a 4090...

    • @cpuccino
      @cpuccino 2 роки тому +21

      @@1GTX1 sorry can you explain what you mean by that? if you mean the 4090 will be a xx50 someday then yeah for sure. That’s a good thing though? technology gets better, by the time that happens everyone that’s bought a 4090 would’ve switched to the xx90 already

  • @TheAbvsyork
    @TheAbvsyork 2 роки тому +99

    Yeah I can see that frame generation will be better at higher fps because each frame persists on the screen for a shorter amount of time so the image quality degradation shouldn't be as noticeable. It will be interesting to see how they continue to improve it as well and this really does feel like a game changer.

    • @ToonamiAftermath
      @ToonamiAftermath 2 роки тому +16

      I've tried it in low-fps and high-fps scenarios, and honestly it's more important in low-FPS scenarios. That said, yeah, it doesn't really take anything away at high FPS; in my experience it's just almost always better

    • @TheAbvsyork
      @TheAbvsyork 2 роки тому +4

      @@ToonamiAftermath so I guess lower fps is when you would need the fps boost the most but the higher the average fps maybe you wouldn't notice image quality issues as much?

    • @xeridea
      @xeridea 2 роки тому

      If there are flickering HUD elements, it doesn't matter what your FPS is. Also, if you are already at high FPS, why introduce glitches when it is already smooth?

    • @TheAbvsyork
      @TheAbvsyork 2 роки тому +1

      @@xeridea Not high fps, but around 50 or so. What I'm saying is you need the boost at lower fps, yes, but the higher the fps, the less you will notice the visual degradation. Or let's say 70 fps that could potentially dip below 60 in CPU-intensive areas, so you turn it on and average a lot higher. You probably won't notice, because the time between frames is much shorter and each frame is on the screen for a far shorter amount of time
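
A quick worked example of the frame-persistence arithmetic in the comment above (a minimal Python sketch; the 2x multiplier and the sample frame rates are illustrative assumptions, not measurements):

```python
# Frame persistence: each displayed frame stays on screen for roughly 1000 / fps milliseconds,
# so an artifact in a generated frame is visible for less time at higher output frame rates.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (40, 50, 70):
    fg_fps = base_fps * 2  # idealized 2x multiplier from frame generation
    print(f"base {base_fps:>3} fps -> {frame_time_ms(base_fps):5.1f} ms/frame | "
          f"with FG {fg_fps:>3} fps -> {frame_time_ms(fg_fps):5.1f} ms/frame")
```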

    • @clownavenger0
      @clownavenger0 2 роки тому

      @@TheAbvsyork If I am getting 50 FPS I don't want my game to feel closer to a 40FPS experience when it comes to responsiveness. I will most likely turn down one of the ultra settings to get a 60 FPS experience. If a game is running at 70-80 I don't mind a slight loss to responsiveness but at that point I feel like the visual downgrade would be the bigger issue.

  • @wicksleysnipes1476
    @wicksleysnipes1476 2 роки тому +172

    Great objective breakdown. I like how Daniel’s “math teacher” side shines through (IMO).
    As someone who works in finance, I like my data to be objective, measurable/quantifiable, and without bias (or with full transparency of existing bias).
    Great work!!
    Thank you, sir.
    And thank you for educating our youth and us.

    • @istheway-gu6yo
      @istheway-gu6yo 2 роки тому +2

      nobody asked lil bro

    • @user-xn3kt6bn5r
      @user-xn3kt6bn5r 2 роки тому +23

      @@istheway-gu6yo nobody asked if anyone asked big fella.

    • @wicksleysnipes1476
      @wicksleysnipes1476 2 роки тому

      @@istheway-gu6yo Nobody told, Biatch. Perhaps it’s for the Algo. I’m a retired WildLand firefighter and I’m sure I could handle your kind. Do your worst, baby boy.

    • @wicksleysnipes1476
      @wicksleysnipes1476 2 роки тому +6

      @@istheway-gu6yo Nope, you didn’t ask.

    • @qubes8728
      @qubes8728 2 роки тому +2

      Like you I need facts. Anything less and I feel the truth is being hidden. If I feel like the truth is being hidden I walk away.

  • @dragonicprime1736
    @dragonicprime1736 2 роки тому +102

    DLSS 3 is amazing. It's supposed to be used with single-player games, not competitive ones, so the extra latency doesn't matter. It also helps with alleviating CPU bottlenecks, so in games like Spider-Man that are very CPU heavy, you can get extra smoothness by turning on one setting and having an image that's extremely similar to the original, just smoother

    • @visceraeyes525
      @visceraeyes525 2 роки тому +4

      input latency still carries over to single-player games, so you're getting a less responsive experience in single player, which people who play at 100+ fps WILL notice

    • @dragonicprime1736
      @dragonicprime1736 2 роки тому +20

      @@visceraeyes525 yes I know it carries over. I'm just saying that it doesn't matter. It's not a 30 second delay. It's milliseconds. You also shouldn't be turning it on if you're already getting over 100fps. There's no point. You feel it, but you don't need the most responsive experience for single player games. Just sit back, relax and enjoy the game

    • @visceraeyes525
      @visceraeyes525 2 роки тому

      @@dragonicprime1736 you will still notice the artifacts from it though as it still ultimately degrades image quality

    • @dragonicprime1736
      @dragonicprime1736 2 роки тому +20

      @@visceraeyes525 The artifacts are minimal from my experience. The game that has the most is spiderman. Other games, they're less noticeable. So if you're actually playing instead of pixel peeping, they're not noticeable

    • @DavidFregoli
      @DavidFregoli 2 роки тому +2

      and multiplayer games are already made to run fast anyway so it's really a non-issue

  • @TalonsTech
    @TalonsTech 2 роки тому +110

    Having actually used Frame Generation in a few games, I will say it's an absolute game changer.

    • @donomar789
      @donomar789 2 роки тому +27

      Yep absolutely think the same way. It was awesome to have as an option in Witcher 3. Instead of 70-80 fps with a 4080, I now get 120-130 and im happy 😊

    • @mercurio822
      @mercurio822 2 роки тому +21

      Only if you get 120+ FPS. If you are below 100 FPS you will notice artifacts galore. If I already get 100+ FPS there is no reason to turn on DLSS 3, and if I'm below 100 FPS there will be noticeable artifacts. It's a lose-lose in both cases; it's useless.

    • @donomar789
      @donomar789 2 роки тому +8

      @@mercurio822 im sure that many people will have differing opinions but as i said, even with a baseline of 70 fps, frame generation worked wonders with witcher 3

    • @TerraWare
      @TerraWare 2 роки тому +6

      @@donomar789 Yeah, I feel 60 or maybe 50 should be the minimum to enable it; it's almost unnoticeable, speaking from personal experience. It's awesome tech to have that will only get better and gain more and more support

    • @donomar789
      @donomar789 2 роки тому +5

      @@TerraWare yup, I hope that AMDs FSR 3 will be comparably good so that people with amd GPUs can experience it as well. And competition is always a good thing of course 🙂

  • @18philbob
    @18philbob 2 роки тому +47

    Super useful breakdown of Frame Generation across multiple use cases. Probably won't be able to take advantage of it myself until the next generation of GPUs, but this sounds great for the types of computationally difficult games I tend to play (single player and often more CPU limited)

  • @FilledWithChi11
    @FilledWithChi11 2 роки тому +63

    After having used frame gen in Requiem I can say it's definitely nice to have, and I don't see any artifacts at all. I play at native 4K with a 4080 and get around 60 fps without frame gen, and with it on I get about 80-100 fps. It's definitely nice to have and lets me use native 4K + DLAA rather than upscaling.

    • @TerraWare
      @TerraWare 2 роки тому +15

      Yeah exactly. It was perfect in that game. Like I keep saying, and Daniel is 100% right, you have to try it for yourself to be able to judge it. It's like VR and HDR in that it can't be accurately portrayed in a YT video or a review.

    • @Drip7914
      @Drip7914 2 роки тому +5

      @@TerraWare STOP COMMENTING AND GO PLAY ON YOUR 4090 😂

    • @TerraWare
      @TerraWare 2 роки тому +4

      @@Drip7914 Not a bad idea

    • @visceraeyes525
      @visceraeyes525 2 роки тому +1

      Just because you don't see artifacts doesn't mean they aren't there lol. You probably also don't ever notice stutter like shader compilation in games, do you? You're obviously used to bad input latency anyway since you play at 60fps

    • @FilledWithChi11
      @FilledWithChi11 2 роки тому +29

      @@visceraeyes525 And you obviously have never experienced frame gen.

  • @JulianJayme
    @JulianJayme 2 роки тому +49

    Doesn't feel like a gimmick to me. I've been playing Witcher 3 Next Gen at 4K max raytracing. It's so choppy that it's unplayable without frame generation, but with DLSS Quality and Frame Generation, it feels very smooth and I almost never notice the AI generated frames, and I don't notice any input latency.

    • @adamdunne6645
      @adamdunne6645 2 роки тому +6

      I noticed the latency but it's still surprisingly good. I actually said I would probably never use it before I tried it.

    • @stevetriple8
      @stevetriple8 2 роки тому +6

      @@adamdunne6645 Agree, the latency difference is so negligible.. you can notice it ever so slightly but it doesn't affect the experience at all in a game like that.

    • @Thor_Asgard_
      @Thor_Asgard_ 2 роки тому +5

      I dont like it at all, but the thing i hate the most is that i need to use it on my 4090 to have acceptable framerates in witcher 3 next gen ^^

    • @Richard-tj1yh
      @Richard-tj1yh 2 роки тому +9

      create an artificial problem, then make people buy a solution for it, lol

    • @Pisscan
      @Pisscan 2 роки тому +1

      AMD wants to know your location 💀

  • @iChannelz88
    @iChannelz88 Рік тому +13

    My biggest issue with this is when it’s technically possible to run frame generation on a 2000-3000s card but they make it a software limitation to be able to use it. That makes the benchmarks skewed in my opinion

  • @Dave-nm3xc
    @Dave-nm3xc 2 роки тому +6

    Your channel has become one of my favorite tech channels. Keep up the great work!

  • @CookieAntonio
    @CookieAntonio 2 роки тому +23

    Tried out Portal RTX on my 4080 at 4K, it was an amazing experience, especially on controller. Absolutely no discernible difference from DLSS 2 other than a smoother experience!
    Time to try out Portal RTX on my laptop 3050 lmao
    UPDATE: MY LAPTOP MELTED AT 10FPS

    • @pwrsocket
      @pwrsocket 25 днів тому

      Portal on controller should actually be illegal

    • @CookieAntonio
      @CookieAntonio 25 днів тому

      @ i grew up playing it on xbox 360 hampter

  • @JoseCoig
    @JoseCoig 2 роки тому +52

    I would argue that fps has not always been a direct representation of input lag, anyway. Depending on how they were programmed, some games can feel much floatier and less responsive at 60 fps than others at 30 fps.

    • @larion2336
      @larion2336 2 роки тому +8

      That's almost always caused by some other thing being enabled; in particular v-sync which introduces a ton of input lag, or certain mouse acceleration issues can cause the mouse to feel off. The relation between FPS and latency should still be the same regardless.

    • @valrond
      @valrond 2 роки тому +15

      Yes, and no. Yes, the game can be more or less responsive. But in the same game, the responsiveness will always be better at 60 fps than at 30 fps.

    • @vmafarah9473
      @vmafarah9473 2 роки тому

      It's the pre-rendered frames that cause this.

    • @JoseCoig
      @JoseCoig 2 роки тому

      ​@@larion2336 But that's the point. That's true in some games where input processing is locked to render timing. But it doesn't need to be that way. When input processing is not locked to graphics output, the effect of fps on input lag is much smaller. The final delay to display the actual frame still exists, of course, but the game world can react to your input independently even while it's not still being displayed.

  • @Plague_Doc22
    @Plague_Doc22 Рік тому +6

    Opinion prior to video: It's worth it in single-player games that don't require quick reactions, since the latency is there. Kinda similar to how V-sync used to suck for that reason. BUT you gain massive frames, making it feel smoother, which makes it very useful for most single-player stuff. (will post an update after the video to see if my opinion has changed)
    Updated opinion: Completely forgot about Nvidia Reflex, which could potentially offset the delay. Seems like implementation might be the biggest factor. I imagine it could be really, really good for getting smoothness if you're hitting fps in the 40-60 range where you really notice the drops. Being able to maintain 60+ could be huge. So in a sense it feels very much like DLSS 2.0, where you use it for that extra hump of performance over a threshold.

  • @TechGamer-pq1gu
    @TechGamer-pq1gu 2 роки тому +19

    If your original FPS was around 40 or lower and you enable Frame Generation, the latency will suck but the game will look smoother; the input delay just gets worse. Now, if you have 60 or more FPS before you enable Frame Generation, the latency impact can be tolerable depending on the game.

    • @jjglaser
      @jjglaser 8 місяців тому

      So if I have 220 fps without, and then I get 280 fps with frame gen in MW3 (I tested it), what kind of latency increase am I getting? A tiny one?

    • @Jewls838
      @Jewls838 5 місяців тому +3

      ​@@jjglaser You won't notice it if you play casually. As someone who plays competitive, I can tell and feel the latency increase, and it's not worth the fps bump for me

  • @b130610
    @b130610 2 роки тому +17

    I'm really hoping that FSR 3, or some future version of DLSS frame gen, integrates the time-warp reprojection that has been used in VR for years now. Technology like that actually increases the responsiveness of games, and it's pretty light on hardware afaik. It could really improve game feel for keyboard and mouse users.

    • @Wobbothe3rd
      @Wobbothe3rd 2 роки тому

      Many games have had something like timewarp reprojection built in long before VR, going back 12 years.

    • @b130610
      @b130610 2 роки тому +10

      @@Wobbothe3rd really? That's cool, I hadn't heard of that. Which games? I'd be curious to check that out.

  • @davidkatsmam
    @davidkatsmam 2 роки тому +33

    It is a killer feature when it is well implemented. In Portal RTX it is really perfect; I can't see any artifacts in this game.

    • @toivopro
      @toivopro 2 роки тому +1

      Like Quake rtx was the bomb?

    • @clownavenger0
      @clownavenger0 2 роки тому +5

      thats due to the simple geometry in that game. You won't be getting path tracing in cyberpunk anytime soon unless you want a 5FPS experience.

    • @dominicstocker5144
      @dominicstocker5144 2 роки тому +4

      @@clownavenger0 the difference in polygon count doesn’t result in as big of a difference in performance compared to raster, though. Also, they bumped that count up tremendously with the new assets

    • @NonOfYourRizznessSir
      @NonOfYourRizznessSir Рік тому

      ​@@clownavenger0 What do you think about path tracing now in Cyberpunk? Now that it is out if I may ask. :)

    • @clownavenger0
      @clownavenger0 Рік тому

      @@NonOfYourRizznessSir its cool and runs well on my 30 series card. the mod that came out within 24 hours is also very useful to boost frame rates.

  • @Chilledoutredhead
    @Chilledoutredhead 2 роки тому +30

    My take on this is it will get better. Much like the leap from dlss 1 to 2, frame generation 2.0 will probably be awesome

  • @valrond
    @valrond 2 роки тому +14

    Good video, Daniel. I got my 4090 a couple of days ago but I have barely been able to use it.
    The problem with DLSS 3 is that it is an "I win more" card. That is, it works best when you don't need it. You have to be over 60 fps already, and in games that aren't fast paced, like Spider-Man, which is precisely where you want and need the extra fluidity; well, perceived fluidity.
    Also, I'm not sure whether I'd rather have 87 real fps at 290W or 120 fake fps at 400W of power, as you showed in A Plague Tale: Requiem.

    • @路人甲-g9s
      @路人甲-g9s 2 роки тому +2

      Exactly.

    • @simptrix007
      @simptrix007 2 роки тому

      I would say it's a must-have for me in Portal RTX.

    • @ODIOPOWER
      @ODIOPOWER 2 роки тому +3

      power limit your 4090. 3% lost performance with 30% less power draw

    • @gamingmax9455
      @gamingmax9455 Рік тому +1

      True

    • @gurkal89
      @gurkal89 Рік тому +1

      At least one person here who gives an honest statement.

  • @SaccoBelmonte
    @SaccoBelmonte 2 роки тому +16

    After trying it, it is quite awesome. Would I like to have that amount of real frames? Yes... but since GPUs cannot possibly do that, FG is a great thing to have and it does make a big difference. Latency-wise, as long as you're at 60 real fps or more, I don't feel the latency.

  • @mynameisawesomeman
    @mynameisawesomeman 2 роки тому +6

    Interesting to know how Witcher 3 even tanks on 4090 when you turn on RayTracing because it's CPU limited. I have a 3080 Ti and also noticed that no matter what settings I change (e.g. crank up DLSS 2.0, or turn down graphics options) it doesn't make a huge difference when RT is on; makes me want a 4000 series card to do frame generation!

  • @jbettcher1
    @jbettcher1 2 роки тому +34

    It's a killer feature for a problem that Nvidia created: ray tracing. And currently only high-end 40-series cards have this technology. This feature seems a lot more beneficial in the midrange, where you'd actually be taking gameplay from unplayable to playable. When they show DLSS 3 frame rates versus previous gens, it's a pure marketing gimmick to justify the price increases.

    • @jambothebairn
      @jambothebairn Рік тому +1

      "taking gameplay from unplayable to playable" there's the problem, Nvidia are a greedy company, they don't want people holding on to their older GPU's

    • @Magnulus76
      @Magnulus76 Рік тому +1

      Raytracing isn't practical for midrange cards even today with the latest midrange GPUs.

    • @bosneb
      @bosneb Рік тому

      Raytracing is a problem? I'd say screen space reflections are more of a problem because they make my eyes bleed with incessant occlusion artifacts, and/or constantly reflecting objects that are in the foreground.

    • @Magnulus76
      @Magnulus76 Рік тому +1

      I think that's why they paired the 4060 with FrameGen. 4060 is a midrange card (without the typical midrange prices).

    • @bosneb
      @bosneb Рік тому +3

      ​@@Magnulus76 Oh definitely. I was just referring to the OP calling raytracing a "problem" that nvidia created. Raytracing certainly requires excessively overpriced hardware, but when I'm seeing reflections of stuff that's not on screen, or when it's filling in the occlusion gaps in SSR, I think of it as an improvement, not a problem. If someone doesn't want to pay the premium to run it, games still run without it, so the only problem I see is FOMO.

  • @mcgmgc
    @mcgmgc 2 роки тому +32

    It is good; I've used it in a bunch of games. It feels like real FPS and I don't notice any input lag, probably because it's mostly in single-player games, so it doesn't matter at all even if there is something. Some games like Portal RTX and Witcher 3 are pretty much unplayable without it, so I'd say it is a very good feature. When people compare the 7900 XTX and 4090 they love to just leave out DLSS 3 for some dumb reason, as if it doesn't exist.

    • @seminooblex8057
      @seminooblex8057 2 роки тому +1

      @@mattkostandin Well, the 4090 can still use some help with fps in The Witcher 3 at 4K with ray tracing all the way up

    • @jinx20001
      @jinx20001 2 роки тому +2

      Reviewers do the same thing; in fact reviewers are even worse when it comes to GPU reviews. They literally have to ignore half of a graphics card's features to come to a value conclusion, in order to make it easier for themselves. RT doesn't even get a mention in most reviews when it comes to value, because apparently it adds no value at all... or they just can't be bothered to change the way they have done value charts for the last 15 years; I know which is more likely.

    • @JoPeTuYaTroJoueY
      @JoPeTuYaTroJoueY 2 роки тому

      Frame generation adds less than 10ms of input lag (sometimes 10-11ms). In the Witcher 3 Next Gen update, for example, without it it's 50ms, with it it's 58ms.

    • @Drip7914
      @Drip7914 2 роки тому +1

      @@JoPeTuYaTroJoueY Yeah, and that's actually one of the worst ones. In Spider-Man it's like 3, and 4 in F1. Frame gen is getting hate from people who will never experience it lol

    • @kamr7216
      @kamr7216 2 роки тому +1

      ​@@mattkostandin I have 4080 (combined with i7 13700k). I tested frame generation in Witcher 3 (with the latest patch) and it keeps the game above 60 fps even in the most intensive parts of Novigrad. Without it the fps are in the 40s. The only noticeable downside is that the text above npcs is more distorted when galloping around them. Dlss is at quality, without it, fps will drop below 40; the image is somewhat softer with it on (sharpening helps), any other AA option looks worse though.

  • @pajasan
    @pajasan 2 роки тому +8

    I know how this whole enthusiasm can make you feel good. It's almost like reading the specs plus enthusiastic comments about some cool product you'd like to purchase, but then you learn what the not-so-positive people have to say and look at the return-rate statistics, which give you a cold-shower feeling. Some say 1000R screens are immersive while ignoring 200 ping; some have it the other way around, like me. This whole "feature set" makes me a little concerned it will make devs lazier: instead of proper (alert, buzzword) optimizing, they'll just slap these features on and call it a day.

  • @mynameisawesomeman
    @mynameisawesomeman 2 роки тому +12

    The latency induced by DLSS frame generation is probably similar to that you'd experience with Vsync, as they work in similar ways. Yes, it's true that Vsync increases input lag, but not many people, except hardcore FPS players would turn off Vsync and put up with screen tearing.

    • @clairifedverified2513
      @clairifedverified2513 Рік тому +3

      Honestly, I would really not describe myself as hardcore, and yet I find games feel horrible with Vsync on. It always throws me off, and I've had to quit games that don't let me turn it off because it feels so awful. I want the game as responsive as possible, and screen tearing is a small price to pay for the game being playable for me.

    • @chaoticsoap
      @chaoticsoap Рік тому +2

      Vsync is abysmal, fastsync is ok... We do have freesync though

    • @yinitialize4930
      @yinitialize4930 Рік тому +1

      Vsync is so bad; compared to FreeSync it's day and night

    • @KaneAmaroq
      @KaneAmaroq Рік тому

      It's not actually that similar in my experience. It's just barely, slightly worse than the input delay you'd get running at the framerate that the frame-gen predicts with.
      So running the game at say, 144 fps with frame-gen on will feel like if you ran the game at around 66 fps or so. The reason why is because he's actually not quite right about his assertion. It does not delay a frame to generate a new one between, it PREDICTS what the next frame will be, AS the GPU is rendering a new, real frame.
      That said, I don't think Frame-Gen is useful outside of high refresh-rate scenarios, 120 fps at the minimum. Below that and it does start to feel sluggish.

    • @PeterJohnson76
      @PeterJohnson76 Рік тому

      I use a 4k samsung TV as a monitor, running at 60Hz, get no tearing with vsync off.

  • @juan5h
    @juan5h 2 роки тому +6

    One thing that I notice is that, in addition to increasing latency, power consumption increases by about 60 watts. On the other hand, if The Callisto Protocol on a 4090 makes you wish you had DLSS 3, that only speaks to how poorly optimized the game is, and that's what's left for those with mid- or low-end video cards

    • @jose131991
      @jose131991 2 роки тому

      This is my conspiracy theory on why most of the latest releases have terrible CPU utilization: to push the 4000-series cards (which is good for the PC industry regardless) and also to ease work for devs so they don't have to properly optimize their games' CPU usage and can fall back on ngreedia's crutch! It's a master plan!

    • @ThunderingRoar
      @ThunderingRoar Рік тому +1

      @@jose131991 but Callisto protocol is amd sponsored title???

    • @jaytay420
      @jaytay420 7 місяців тому

      ​@@jose131991 whatever u smoking clearly is the good stuff

  • @patricklagace731
    @patricklagace731 2 роки тому +9

    I've used it on the 4090 and, to me, the tech is a nice addition for sure. Like you said in your video, very few people have seen it live, so saying it's a gimmick without testing it for yourself in your favorite games is not really fair. That being said, you have to take it from Nvidia's perspective. To me, it's clear that Nvidia implemented DLSS 1.0+ at the same time as RT to reduce the FPS penalty and it's pretty much the same with the new 4000 class of GPU's. RT taxes the CPU and cards like the 4090 can't be fully used to it's full potential because of it. Add the brut performance of the card at 1440p and it's CPU bound in most games ! For the most part, DLSS 3 is useful for RT titles which are CPU bound. I really doubt we'll see DLSS 3 in non-RT games.

    • @Keivz
      @Keivz 2 роки тому +2

      It’s already in plenty of games without RT.

    • @patricklagace731
      @patricklagace731 2 роки тому

      @@Keivz You're correct. But my point is that it will be much less useful in those non-RT games, with the horse power of the new gen cards. We'll see...

    • @Physics072
      @Physics072 Рік тому

      I have a 4070 FE and hate frame generation in any form. I use DLSS mostly for AA in titles like Tomb Raider. DLSS does help with aliasing even at 1080p, 120 or 240Hz. Fake frame generation, sorry, it's a gimmick.

  • @stawsky
    @stawsky 2 роки тому +5

    For me, DLSS 3 works very well with Witcher 3, for example, and this will possibly be a good choice for most RPG games. The problem is how to enable it... I was looking for ages until somebody finally wrote that you have to enable Hardware Acceleration in Windows, otherwise the option does not appear in the game's graphics options; at least that was my experience. Now I can finally play with DLSS 3.

  • @DuskDD
    @DuskDD 2 роки тому +6

    I have an RTX 4080 and I can honestly say I feel almost no issues with image quality or game responsiveness with DLSS 3 enabled. It also feels much smoother on a high-refresh-rate monitor with G-Sync enabled. I believe Nvidia will improve frame generation technology in the near future...

  • @marsovac
    @marsovac 2 роки тому +4

    What matters is the tearing that happens with DLSS 3 above the sync framerate. That's a deal breaker. Also, the latency reported in the Nvidia overlay in the video is nonsense. On the left side you have Reflex off, while ideally you would have Reflex on on both sides. Keep Reflex on in the driver. Compare the best case for both sides, not the worst case for the left side and the best case for the right side. You also want Reflex on without DLSS if you happen to have a low framerate. You know, to make 30 FPS feel more responsive.

    • @marsovac
      @marsovac 2 роки тому +1

      tldr: if comparing apples to apples it is simply a tradeoff of latency for framerate and you would feel it if you were at low framerate to begin with

    • @cheese186
      @cheese186 2 роки тому

      You can control the tearing by setting an fps limiter to slightly below half of your display's refresh rate so it stays within G-Sync/FreeSync range.
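
A tiny sketch of the cap arithmetic being suggested here (assuming an idealized 2x frame-generation multiplier; the 3 fps safety margin is just an example value):

```python
# Cap the base frame rate slightly below half the display's refresh rate so the
# frame-generated output (~2x the base) stays inside the G-Sync/FreeSync range.
def suggested_base_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    return refresh_hz // 2 - margin_fps

for refresh in (120, 144, 165):
    cap = suggested_base_cap(refresh)
    print(f"{refresh} Hz display -> cap base fps at ~{cap}, FG output ~{cap * 2} fps")
```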

    • @Keivz
      @Keivz 2 роки тому +2

      Tell us you've never used it without telling us you've never used it.
      You'll always get tearing above your refresh rate, FG or not. That's why you use Reflex to keep your framerate under your max refresh rate. Also, in the video he has Reflex on in his comparison with FG both on and off. But do realize the Witcher 3 didn't have Reflex until DLSS 3 was added, and no one complained about the lag then. Regardless, you're not going to notice a difference in a single-player game.

  • @GodKitty677
    @GodKitty677 2 роки тому +2

    Thanks for the video, it's given me more information about DLSS 3 frame generation. The issue I have is that I am locked to 4K@60fps on a 4K 60Hz monitor in the games I play. 1:06 contradicts his conclusion about latency: native 4K 12ms, DLSS Quality 11.6ms, and DLSS frame generation 8.2ms.
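
One way those overlay numbers can co-exist: DLSS upscaling shortens the render time per frame, and frame generation typically enables Reflex, which can offset the extra frame that frame generation holds back. A toy latency model follows, with purely illustrative numbers rather than measurements from the video:

```python
# Toy model: latency ~ render time + the extra hold that frame generation adds
# (roughly half of the base frame time, since the next real frame is delayed).
def latency_ms(render_ms: float, frame_gen: bool) -> float:
    hold_ms = render_ms / 2 if frame_gen else 0.0
    return render_ms + hold_ms

native_render = 16.7    # hypothetical native 4K render time (~60 fps)
upscaled_render = 9.0   # hypothetical render time with DLSS upscaling

print("native 4K, no FG:   ", round(latency_ms(native_render, False), 1), "ms")
print("DLSS Quality, no FG:", round(latency_ms(upscaled_render, False), 1), "ms")
print("DLSS Quality + FG:  ", round(latency_ms(upscaled_render, True), 1), "ms")
```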

  • @Sirfrummel
    @Sirfrummel 2 роки тому +3

    This was an outstanding breakdown of the tech and pros/cons. Thanks for your work on this.

  • @soapa4279
    @soapa4279 2 роки тому +2

    DLSS3 itself seems to work pretty well. But for my use case specifically, on an ultrawide in Witcher 3 I get flickering during cutscenes on the sides because they shrink the cutscenes to 16:9. It's so damn distracting I had to disable it.

  • @ToonamiAftermath
    @ToonamiAftermath 2 роки тому +11

    I've tried frame generation with my 4090 extremely underclocked to simulate what it would be like on lower hardware and it still looks way better than just having low framerate. If every game had it I would pretty much always use it. The net gain is almost always more than what you lose in latency, it really does a good job at fooling you that more is there, like 80% compared to a real extra frame

    • @RatBagDad
      @RatBagDad 2 роки тому

      100%. It's an absolute godsend for MS Flight Simulator in 4K VR. Without it you can struggle to manage a locked 60 in highly built up airports. And it's MSFS, you don't notice artifacts because of the inherent sedate nature of the visuals.

    • @dariomladenovski7047
      @dariomladenovski7047 Рік тому

      but does it make the game look worse?

  • @jinx20001
    @jinx20001 2 роки тому +17

    I like it. As a 4090 owner, I'm surprised to say that when it's been available it's been useful. Very difficult to spot any issues when playing... literally game changing in some cases like Portal RTX. But I'm sure there will be those without a 40-series card who will tell us how much of a trash gimmick it is because Digital Foundry froze the frames for them, so that's it, that's all they needed to see lol.

    • @路人甲-g9s
      @路人甲-g9s 2 роки тому +2

      If a 4090 is giving you 60+ fps natively in all the games, why bother with frame generation?

    • @jinx20001
      @jinx20001 2 роки тому +6

      @@路人甲-g9s because i use a 120hz OLED display and 120hz is smoother than 60hz... but to be clear ill not use it if i dont need to use it, for example... i do play f1 2022, i have a full simulator setup and i simply dont need to use frame generation because framerates are already high, its just a nice feature to have if you want to use it.
      another side note on f1 2022... everytime you hear youtubers complain that in f1 2022 you can see the names having issues above the car you can take 1 thing away from that... that being they probably dont play the game, because if they did then they would know that you have the option to turn the names off and that is the best option because the names obscure your view of upcoming corners. So f1 2022 is actually a bad example of DLSS 3 not working well because most people who play the game switch the names off to see the corners lol just an interesting side note there.

    • @路人甲-g9s
      @路人甲-g9s 2 роки тому +2

      @@jinx20001 I don't blame DLSS 3 as an additional feature that NV offers. But they're actually charging customers for it as a big selling point. Like you said, why would I pay for a feature I don't need to use in F1 2022, since it's already high fps? Actually, for all the competitive games, like fighting, racing, and FPS games, when you play against humans, DLSS 3 won't be used. And those games could already hit 200+ fps with a 4090...

    • @jinx20001
      @jinx20001 2 роки тому +5

      @@路人甲-g9s i dont really understand what the issue is? if you dont want to use it then you dont have to use it... its nvidias job to sell you a product, that is literally what they want to do and to do that they tell you about all the features it has, what would you prefer them to do just not include it or not mention it?
      You could say they are charging more money because they have extra features but they are not forcing you to buy, its totally and utterly your choice. im happy to pay the extra to have access to the extra features, if you are not then dont buy it, you can pay 200 less and skip all the features if you think thats better value for you (comparing 4080 to 7900xtx ofcourse), i think 200 extra to have much better RT performance, DLSS and FSR and DLSS 3 aswell as better efficiency is a good 200 well spent but thats just me, you certainly can have a different opinion.

  • @DaBrain2578
    @DaBrain2578 Рік тому +5

    It doesn't really matter as long as it is software locked to the 4000 series.
    It is nice to see it works on the 2000 series with mods now, but for real relevance, Nvidia has to offer it on all RTX cards officially.
    Otherwise my money is on FSR 3.0

  • @TheAJKid
    @TheAJKid Рік тому +1

    I think it's great. I don't notice any difference other than smoother gameplay locked at my refresh rate (144Hz). I'm sure in competitive games it may be noticeable, but it's literally happening in milliseconds, which is nearly imperceptible even if you tried. They also have Nvidia Reflex in most games with these DLSS settings, which combats any input latency. I think it's a non-issue UNTIL it's being relied on to make games run at a basic level.

  • @oOMeowthOoPlushie
    @oOMeowthOoPlushie 2 роки тому +4

    My opinion towards this frame generation thing is, they are definitely some early stage of unfinished technology, it is like how you buy into the ray tracing technology back in RTX2000 series, you are buying into the ray tracing technology just to find out you need to turn off ray tracing option in the game because it tanks your FPS so much. Similarly for this frame generation thing, you will feel so left out in about 2+ years time when the technology improve so much more rapidly.
    Don't have the mindset of "I'm going to YOLO overpay right now, to have that 16 or 24GB of VRAM, and whatever DLSS3 tech, L2 cache, ray tracing cores, tensor cores just to futureproof", you can't futureproof your GPU purchase, things get so rapidly outdated nowadays, especially with Intel joining the triopoly, competition will heat up and take things to the next level. Unless, you play those games that are not dependent on these graphics technology, then you can sit on your GPU for many years.

  • @randomguy_069
    @randomguy_069 2 роки тому +2

    The Callisto Protocol issue is more because of UE4. And honestly, the improvement in CPU-limited situations from DLSS 3 is only good for high-end cards. What's the use of it when your mid-range card is already running at 99% GPU utilisation?

    • @7bastiec12
      @7bastiec12 2 роки тому +1

      Not really; even playing at 2K you can get CPU-limited with a mid-range card, and not all future games will have the best graphics but will probably still have CPU threading problems

  • @beaurunnels9030
    @beaurunnels9030 2 роки тому +3

    I'm far more interested in whether this will help boost lower-end GPUs' performance rather than their top GPUs, because that's the only area where I see this having value. 90 fps to 130 fps isn't as noticeable as 30 fps to 60+ fps.

  • @Azhureus
    @Azhureus Рік тому +1

    Bro, I love your videos, so detailed yet simple to understand, thanks man !

  • @AdamantMindset
    @AdamantMindset 2 роки тому +5

    It is good; sadly most people hating on it are probably on the AMD bandwagon, which I can relate to, as I was heavily biased and wanted AMD to win all the time. Now I'm a 3070 user and love what DLSS has to offer. No plans on getting a 4000-series card since I just got this one, but I'm definitely getting the 5000 series, and I believe frame generation will improve even more

    • @sammiller6631
      @sammiller6631 2 роки тому +3

      No, there are non-AMD critics too, but that's more a critique of 4090 owners thinking everything is a "game changer" without acknowledging that features work differently on lower cards like a 3060 or 1660. 4090 owners have a vested interest in not seeing the flaws in such an expensive purchase, so they dismiss any discussion from "peasants".

  • @kenshinhimura6280
    @kenshinhimura6280 Рік тому +2

    Imagine paying an absurd amount of money for a 4K graphic card, and having to turn on a fake frames generator that gives you several visual glitches lol. Totally worthless.

  • @blackwater4100
    @blackwater4100 2 роки тому +25

    Absolute insane feature. DLSS and frame generation just work and are fantastic!

    • @Pisscan
      @Pisscan 2 роки тому

      AMD fanboys disagree 💀

  • @ImperialDiecast
    @ImperialDiecast Рік тому +1

    You don't buy the 4000 series for the extra raw power over the 3000 series, you buy it for the software update called frame generation. This means the weakest 40-series cards like the 4050 and 4060 will be incredible deals, since they have access to FG just like the expensive ones, but they are the ones actually needing/benefiting from it while maintaining a low MSRP. Maybe this is the reason why Nvidia has still not released them: they want to clear midrange 30-series stock as much as possible before releasing the low-end 40-series cards.

  • @Zumboria
    @Zumboria 2 роки тому +9

    I used frame generation for my entire play through of plague tale requiem on a 4090, never noticed any difference from regular gameplay. In fact there were a few times when I saw an artifact or graphical glitch thinking it was frame generation only to turn it off and see it was a glitch in the game. I used vsync as well and had frames locked to 141 max on a 144hz monitor, frame generation worked perfect with it and response time was steady at around 21-26ms vs regular 7-9ms which was unnoticeable in a third person game on controller.
    I should note that there was issues with using vsync when I first received the 4090 as the game would appear to extremely stutter if it and frame generation was turned on, however this disappeared in a driver update.

    • @Bootyhole-bandit
      @Bootyhole-bandit 2 роки тому +1

      Does it genuinely feel like running your standard high refresh gameplay?

    • @Zumboria
      @Zumboria 2 роки тому +1

      @@Bootyhole-bandit Well yes and no, at the end of the day it does increase latency as I said I went from 7-9 ms to 21-26 as reported by NVIDIA overlay. At the beginning of the game there’s a tutorial where it teaches you to use the sling and I replayed this section several times turning frame generation on and off to test if I could feel the difference. Using a mouse I definitely could feel the difference but it was very slight and something that you would naturally forget about as your playing the game and not directly comparing. Using a controller there was 0 tangible difference to me.
      The key thing to note was even on mouse where I could feel the difference slightly the overall visual experience was improved drastically in my opinion. I don’t remember exact frame rates but without frame generation with dlss quality at 1440p widescreen I’d hover around 110-120 fps (13900kf). With gsync enabled this should have been fine but requiem suffers from extreme stutters when loading a new area that bugged the hell out of me, it would jerk the camera with the stutter as well so it did affect gameplay. Turning on frame generation completely resolved the issue. It’s not that the performance hit didn’t happen anymore it was that the additional fake frames weren’t affected by this performance hit and prevented a visible stutter in gameplay. It also kept the frame rate at a locked 138 as per NVIDIA reflex which in my opinion combined with the stutter issue was well worth the trade of 21-26ms latency.
      TLDR; No it does not feel exactly the same, but the latency difference is extremely slight when considering the trade for a fps locked smooth visual experience with the prevention of stutters and extreme dips in fps that normally occur in the game.

    • @Bootyhole-bandit
      @Bootyhole-bandit 2 роки тому +1

      @@Zumboria Holy crap man, the 4090 must be the biggest leap Nvidia has made. Either way, my 3090 dips below 60fps in Requiem at DLSS Quality (1440p to 4K) and my frame time is far larger than 20ms lol

    • @Anon1370
      @Anon1370 2 роки тому +1

      @@Bootyhole-bandit You know, when I got my 3090 people were telling me it was overkill while they were buying 3070s; now look 😆 the card ain't shit

    • @Bootyhole-bandit
      @Bootyhole-bandit 2 роки тому +1

      @@Anon1370 when the 5090 comes out the 4090 will look like shit

  • @djejnyc
    @djejnyc 2 роки тому +1

    Wow, great analysis of DLSS! Thanks for the insight, getting my 4080 today, looking forward to testing this out.

  • @ZeroZingo
    @ZeroZingo 2 роки тому +9

    On my computer Spider-Man gets 50 fps at the bottom of swings; DLSS 3 doubles this to 100. The game looks noticeably smoother. I can't feel any extra latency; I'm using a controller. I have tried turning it on/off to spot any artifact, super pixel-peeping. It's extremely hard to spot any. I definitely need to stare at where I think the artifact will pop up.

    • @demrasnawla
      @demrasnawla 2 роки тому +7

      tbh even on this video at full speed I could see the visual artifacts in Spiderman

    • @ZeroZingo
      @ZeroZingo 2 роки тому

      ​@@demrasnawla First, you can't record this and upload it to youtube, and get the result you get when playing. Second, everybody is different. To you it might look awfull, but to many it does not. And certanly not as bad as the difference beetween 100 and 50 FPS. It happens but it's seldom you read about somebody actually having used DLSS3 and dont like it.

    • @mehe
      @mehe 2 роки тому +1

      @@demrasnawla Not really fair to the live experience, considering the YouTube playback is 60fps and the game runs at 100+; the DLSS-generated frames that you see in the playback stay on screen twice as long as they do on the computer

    • @Keivz
      @Keivz 2 роки тому +3

      Agreed. Try as I might, I couldn’t find artifacts in Spider-Man. Every time I thought I saw one it was normal game engine pop in/lod swap. It’s a game changer.

    • @Drip7914
      @Drip7914 2 роки тому

      @@demrasnawla are you superman??? I couldn’t see Anything lol

  • @pauldubreuil7857
    @pauldubreuil7857 2 роки тому +3

    On a 27in monitor (1440p X 1.78) the visuals in SpiderMan are fine for me with DLSS3 (and im pretty sensitive to visual aesthetics), on a massive 4K monitor that might be different. On my side it just allowed me to crank all the RT and gfx on max and truly enjoy the game without a hiccup. For the Witcher 3 it's literally unplayable without FG if ppl want all the bells and whistles, but the game is still a bit stuttery and I'll hold off till they do more patches. I can't wait to try CP2077 with DLSS3 and for Nvidia to generally improve the tech even further. Fantastic video imo on DLSS3 and I totally agree.

  • @rikachiu
    @rikachiu 2 роки тому +8

    In Portal RTX with FG on, if you enable mouse acceleration in the options and increase mouse sensitivity, that latency feel pretty much disappears even if you are trying to feel for it.

    • @TheMaestro116
      @TheMaestro116 2 роки тому +5

      But then you also have to deal with mouse acceleration...

    • @rikachiu
      @rikachiu 2 роки тому +2

      @@TheMaestro116 What is there to deal with when it negates the latency in Portal, essentially bringing it back to base level, and now you get to enjoy double the fps with all the RTX enabled? Until you have sat down playing Portal RTX with a 4090 on an LG C2 48-inch display, I am not sure how else I could ever explain how amazing it looks and feels.

    • @sammiller6631
      @sammiller6631 2 роки тому +2

      "latency feel pretty much disappears" sounds like confirmation bias. Either it disappears or it doesn't. Using weasel wording for wiggle room does not inspire confidence.

    • @rikachiu
      @rikachiu 2 роки тому +2

      @@sammiller6631 try it

    • @swapnilgohil7280
      @swapnilgohil7280 2 роки тому

      @@sammiller6631 do you own a 4000 card?

  • @sprite0nation
    @sprite0nation 2 роки тому +2

    Playing Witcher 3 on max settings with RT in 1440p and the Frame Generation is amazing. Basically pushing my true 60-80 fps to silky smooth 120+ fps. Now when I get into Novigrad, my true fps drops to something like 40+ fps (hard CPU bottleneck), you can notice some issues. But I will still take generated 60 fps over true 30 fps any day.

  • @leonardo.muricy
    @leonardo.muricy 2 роки тому +5

    The big problem with this tech is how much of a niche it is. If it's really only useful in slow-paced single-player games, and you need to be at 60+ fps already before enabling it, then is it really that important? High fps in that sort of game does not matter much, so the only difference is the fluidity, which is something very subjective.

    • @PRiMETECHAU
      @PRiMETECHAU 2 роки тому

      depends on if you like 60fps or 100+ fps..
      The same tricks can be done with async re-projection, but almost no games use that atm outside of SOME VR (but it works great when used).

  • @donaldpetersen2382
    @donaldpetersen2382 2 роки тому +1

    Wait I thought the video encoding deletes the frames in-between to hit 60 fps, not accelerate the playback.

  • @stevetriple8
    @stevetriple8 2 роки тому +12

    After reading many comments on youtube and many reddit threads, I've come to the conclusion that people that have tried frame generation for themself (including myself) almost unequivocally find it to be a killer feature that does a lot more good than it does bad, even more so than DLSS 2. However, it seems a lot of people who haven't tried it seem to think it's a gimmick or that it's cons outweigh the positives. It might just be one of those situations like when people used to say 120hz was a gimmick until they actually tried it. Time will tell but I for one think it's definitely a great feature and has only improved my gaming experiences.

    • @ocha-time
      @ocha-time 2 роки тому +3

      Difference being 120hz is actually (theoretically) outputting 120 frames a second, frames actually being generated by the PC and output to your monitor. Not employing a handful of hacks to kinda sorta do the same thing theoretically almost with only MINOR graphical glitches and inconsistencies. And you're paying $1K+ for the feature. Instead of demanding actual frame output for your money. It's a weird world we're living in... Don't get me wrong, I think these types of frame hacks are fantastic for low-end rigs where they don't have and aren't expected to have the horsepower to actually output the frames you're looking for, but this "crank it up to maximum then fake half the frames as good as you can to sorta give the impression of a smoother image output" is regressive at the high end, and the tax is atrocious. My worry is when I'm looking at a "This card can do THIS" graph, I now have to check to see if this is actual rasterized gaming performance, or just their fluffed numbers for marketing purposes.

    • @QuantumConundrum
      @QuantumConundrum 2 роки тому

      What a great callback, this is absolutely like when people said 120hz is a gimmick!

    • @Dave_AI
      @Dave_AI 2 роки тому

      Yes, and worst comes to worst, people have an extra feature to play with.

    • @Drip7914
      @Drip7914 2 роки тому

      It’s mostly minimum wage brokies and kids with minimum wage parents who couldn’t buy a 4090 even if it was 700. Real people love this tech

    • @visceraeyes525
      @visceraeyes525 2 роки тому +2

      It's useless for competitive multiplayer games because of the input latency and the fake frames that don't give you real enemy positions on screen, and it's useless on the cards that have it right now because they're high end and don't need fake frames.

  • @brothatwasepic
    @brothatwasepic 2 роки тому +1

    Hi guys could someone compare and contrast DLSS frame generation vs that "smooth" extra fps feature that many TV's have built-in?

  • @ondrejsobek6565
    @ondrejsobek6565 2 роки тому +6

    Funnily enough, I recently finished Spider-Man, where I played with DLSS 3 from start to finish (20 hours for the campaign), and during actual gameplay as well as cutscenes I could honestly not notice anything wrong with the image quality on my 42" LG C2 OLED TV which I am using as a monitor. The main reason I used frame generation was the CPU bottlenecks, because even with an overclocked i9-12900K and fast manually tuned memory I was still getting drops to ~90 FPS or so, which is a very noticeable drop from 120. DLSS 3 solved this and gives me a stable 116 FPS (limited to keep G-Sync enabled) even when swinging through the city at street level where all of the cars / NPCs have to be rendered.
    Even more significant impact can be observed with Witcher 3 - a combination of DLSS 2 and 3 allows me to run pretty much 4K @ 120 FPS with maxed out settings. Just disabling DLSS 3 would drop me below 60 FPS in certain parts of Novigrad which is completely unplayable for me (I am a snob who needs 100+ FPS always). NV overlay says the latency is about the same and I cannot tell any meaningful difference (mouse and keyboard).
    The RTX 4090 costs an arm and a leg (2300 Euro for my Asus TUF here in Europe), but damn, playing all games at 4K HDR OLED and 120 FPS is really a completely different experience than what I had with my previous RTX 3080 and Asus PG279Q LCD.

  • @TheBogism
    @TheBogism 2 роки тому +2

    I use DLSS 3 in the Witcher 3 and Spider-Man. Not noticeable, since I use a controller for both those games. Haven't noticed any image distortion. In all honesty I think DLSS 2 distorts images worse, since it can give a grainy effect to the outline of textures. That is completely eliminated with DLSS 3.

  • @ThZuao
    @ThZuao 2 роки тому +4

    I think "frame interpolation" would describe the tech better, but I believe that monicker was already taken.

    • @Pisscan
      @Pisscan 2 роки тому +1

      Fake frame interjection … FFI 💀

    • @proxis9980
      @proxis9980 2 роки тому +3

      It's wrong... it's not interpolating the images... it's taking one image + engine state + the next engine state... if it were "frame interpolation" you would have to wait for the second image/frame to be rendered, which you don't... it alters rendered frame 1 based on engine data that would lead to frame 2, in parallel with the "normal" hardware starting to actually render frame 2. That alteration is way faster than the render of frame 2, so frame 1.5 can be output before frame 2 reaches (and needs) the frame buffer...
      Frame interpolation is an already-used technique on higher-end TVs, and it explicitly creates a shift map between those 2 images. That name suggests something different IMO... maybe the best name would be "intermediate frame updating"? Or "frame remapping" or something along those lines

    • @Wobbothe3rd
      @Wobbothe3rd 2 роки тому

      It's not interpolation.

  • @N5O1
    @N5O1 Рік тому

    1:52 How does it decrease responsiveness? It just generates new frames, but the input lag is the same as if you played at the original FPS

    • @danielowentech
      @danielowentech  Рік тому

      It slightly delays showing you the next "real" frame in order to generate and display the "in between" frame.
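A minimal sketch of the timing Daniel describes, with illustrative numbers only (assuming a 60 fps base and that the interpolated frame is shown halfway between two real frames):

```python
# With frame generation, the next real frame is held back so an interpolated frame can be
# shown between it and the previous one; real frames reach the screen ~half a frame late.
base_fps = 60.0
frame_ms = 1000.0 / base_fps                       # ~16.7 ms between real rendered frames
render_done = [i * frame_ms for i in range(1, 4)]  # real frames finish at 16.7, 33.3, 50.0 ms

schedule = []
for t in render_done:
    schedule.append((t, "generated frame (interpolated from the previous and this real frame)"))
    schedule.append((t + frame_ms / 2, "real frame, shown ~half a frame later than without FG"))

for when, what in sorted(schedule):
    print(f"{when:6.1f} ms  {what}")
```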

  • @vladimirmarinov6730
    @vladimirmarinov6730 2 роки тому +6

    Frame Generation is literally a "game changer".

    • @dennisgordon7767
      @dennisgordon7767 2 роки тому +1

      :D

    • @Anon1370
      @Anon1370 2 роки тому

      What about if we had frame-generation fps without the frame generation and just had raw performance like we did in the past? Yeah, nah, too much to ask for.

    • @kellykelly8276
      @kellykelly8276 Місяць тому

      Money is a game changer. If you've got money you can buy any powerful shit you wanna.
      14900 + 4090 and I don't need any DLSS shit and stuff at 2K

  • @MegaLol2xd
    @MegaLol2xd 2 роки тому

    8:16? What? How can it be CPU limited? My 7700K did mighty fine getting 60 to 80 fps at 3440x1440 with DLSS Quality; the limit was the 2080 Super.

  • @hrodgarthevegan
    @hrodgarthevegan 2 роки тому +10

    I have a 4090. I use frame gen in Darktide. It's revolutionary! It's bigger than upscaling and ray tracing combined. In fact, it lets you use ray tracing even more easily.
    I want it in as many games as possible, provided it's enabled correctly. I'm pissed off that CDPR has it in Witcher 3 but not in Cyberpunk 2077, when it's already working in reviewers' versions.

    • @dagrimmreepa
      @dagrimmreepa 2 роки тому

      Same!

    • @clownavenger0
      @clownavenger0 2 роки тому +2

      Cyberpunk is an FPS with LOTS of HUD details. Might really be a flop in that game.

  • @walkmetothemoon
    @walkmetothemoon Рік тому +1

    Really clear and useful explanation. I think I'll aim at having it off as much as possible, but will definitely try it in a few games to experience by myself how they feel. Thanks for the time and effort you put into it.

  • @Volentek
    @Volentek 2 роки тому +5

    This is exactly why I bought a 4080. I'll be able to play everything maxed out for years to come because of this feature alone.

    • @iraklimgeladze5223
      @iraklimgeladze5223 Рік тому +1

      If you are ready to pay $1100 for a card, the next one will be $1300. The more you are willing to overpay, the more they will increase the price

    • @mckinleyostvig7135
      @mckinleyostvig7135 2 місяці тому

      AMD gave up and there's nobody else. If you want to buy high-performance GPUs there's no choice. @@iraklimgeladze5223

    • @Knightfire66
      @Knightfire66 Місяць тому

      Go have fun with your fake "200 fps" and your "4K" image with 120 ms latency. I am happy with my 100 fps at ~10 ms on my QHD monitor with my AMD :D

    • @Volentek
      @Volentek Місяць тому

      @Knightfire66 Someone's jelly 🤣. I've been having the gaming experience of my life with this card and my Alienware AW3423DW and will continue to do so. And I couldn't care less about a minimal increase in latency when all I play are single-player games.

    • @Knightfire66
      @Knightfire66 Місяць тому

      @@Volentek It's good that you're having fun; that's the most important thing. But I don't care, we're talking about facts here.

  • @skataneric
    @skataneric 2 роки тому +2

    It has to be experienced in person because we all differ in how we feel latency. For example, with framerate, I am one of those people who doesn't notice a real difference past 75 Hz on a monitor. So while I do have a 144 Hz monitor, it's wasted on someone like me, and I tend to prefer gaming on my 75 Hz ultrawide. But I do think it would be more noticeable when, say, the native performance is putting out 30-ish frames and DLSS is pushing out 50-60, because it will still feel like 30 fps input-wise. The midrange cards are where DLSS 3 is really going to be tested, not so much these halo high-end cards.

  • @Kakaroti
    @Kakaroti 2 роки тому +15

    The fact that you need a $1600+ GPU to enjoy a remaster of a 2015 game is the real joke here.

    • @Knightfire66
      @Knightfire66 Місяць тому

      FPS junkies. What do you expect?

  • @kotboyarkin5032
    @kotboyarkin5032 2 роки тому +1

    DLSS 3 is the best thing to happen since DLSS 2. DLSS 3 is a killer feature. The increased input lag is not something you will notice, and there are no noticeable artifacts.
    As for your question "why doesn't it always double it": for the FPS to double you need to be CPU bottlenecked and thus have free GPU resources available, because DLSS 3 isn't free for the GPU. If your GPU usage is already 99% before enabling DLSS 3, then you won't get double the FPS. If your GPU usage is ~60%, then you will get 2x FPS with DLSS 3, but your GPU usage will increase significantly and can even sit at 99% after enabling it.
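
    That headroom argument can be sketched with a toy model. The per-frame costs below are made-up numbers for illustration, not measurements: when the GPU is the limit, the generated frames eat into the time available for real frames, so the total lands below 2x; when the CPU is the limit and the GPU has spare capacity, the doubling comes through almost for free.

```python
# Toy model of frame generation scaling (all costs are made-up, illustrative numbers).

def with_and_without_fg(real_frame_ms: float, gen_frame_ms: float, cpu_limit_fps: float):
    # Without FG: the framerate is capped by the slower of GPU and CPU.
    base_fps = min(1000.0 / real_frame_ms, cpu_limit_fps)
    # With FG: each real+generated pair costs (real + gen) ms of GPU time,
    # and the real-frame rate still cannot exceed the CPU limit.
    real_rate = min(1000.0 / (real_frame_ms + gen_frame_ms), cpu_limit_fps)
    return round(base_fps), round(2 * real_rate)

# GPU already near 100% busy: 100 fps -> ~154 fps, noticeably less than 2x.
print(with_and_without_fg(real_frame_ms=10.0, gen_frame_ms=3.0, cpu_limit_fps=300))
# CPU-bound at 60 fps with GPU headroom to spare: 60 fps -> 120 fps, a clean 2x.
print(with_and_without_fg(real_frame_ms=10.0, gen_frame_ms=3.0, cpu_limit_fps=60))
```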

  • @K11...
    @K11... 2 роки тому +9

    I was extremely skeptical of frame generation, to the point where I didn't even bother to try it for a long while. I was expecting artifacts and latency similar to what you get on modern TVs that try to do the same thing; however, it turns out it is actually amazing and I find myself wanting it in more games now. Used it in the last half of A Plague Tale: Requiem and now using it in The Witcher 3, with a 4090.

    • @maegnificant
      @maegnificant 2 роки тому +2

      I think it looks awful

    • @anuzahyder7185
      @anuzahyder7185 2 роки тому +2

      If you say awful then I guess you haven't used it, or you are an AMDonkey. I was really skeptical as well, but AMD frame gen is brilliant

    • @maegnificant
      @maegnificant 2 роки тому +2

      @@anuzahyder7185 amd has no frame gen. I tried dlss 3 with spiderman and it looked bad.

    • @anuzahyder7185
      @anuzahyder7185 2 роки тому +1

      @@maegnificant i think in spiderman it’s okay. Not that great but other games it’s surprisingly great

    • @maegnificant
      @maegnificant 2 роки тому +1

      @@anuzahyder7185 maybe other people are less sensitive to it, but I find it distracting

  • @kmg501
    @kmg501 2 роки тому

    Let's step back for a second, are there other ways to increase motion fluidity?

  • @freddumartin7878
    @freddumartin7878 2 роки тому +3

    Glad you're talking about DLSS because it's been absent from all the Nvidia 4000 / AMD 7900 reviews... It just makes it more obvious that there is zero reason to consider the AMD GPUs over the 4080/4090.

    • @Pisscan
      @Pisscan 2 роки тому +1

      But fsr 3.0… 💀

    • @chacharealsmooth941
      @chacharealsmooth941 2 роки тому +3

      There is. Money and value.

    • @freddumartin7878
      @freddumartin7878 2 роки тому +1

      I would have liked to see benchmarks with DLSS and FSR, I believe this is how most people are playing modern games. Imo value is not a question anymore with $1,000+ GPUs, if you can spend that money you just get a 4090.

    • @ZXLink
      @ZXLink 2 роки тому +4

      @@freddumartin7878 You're buying a $14000 car, you might as well spend the extra $8000 to get the version with the heated seats.

    • @snowboyken
      @snowboyken 2 роки тому

      @@ZXLink Nice comment!

  • @popcornfilms1
    @popcornfilms1 Рік тому

    Beautifully explained, thank you
    Always a pleasure

  • @randomguy_069
    @randomguy_069 2 роки тому +3

    So DLSS 3 Frame Generation is good for games when you already are getting 60 FPS or so and want to see more fluid motion in single player games. If you are planning to use it on 4050 and think that it will "double" the frame rates from 40 to 80, it will not be a nice experience. Just as Hardware Unboxed said, Frame Generation is good if you already have high FPS, if you are barely running 30 FPS don't think that it will improve your experience by a lot.

    • @Anon1370
      @Anon1370 2 роки тому +1

      That's exactly what people are planning to do with the 4050. Oh well, let them find out.

    • @randomguy_069
      @randomguy_069 2 роки тому +1

      @@Anon1370 Yeah. DLSS 3 is not worth it. Maybe DLSS 4 will be better, but until that happens at least for another generation DLSS 2 will be the most used and favoured DLSS on the Nvidia GPUs.

  • @tburn8888
    @tburn8888 2 роки тому +2

    Hi, is it like the TruMotion option we find on TVs for smoothing video? If so, why do we need a 4000 series card to do it if a less capable TV SoC can do it?

    • @Keivz
      @Keivz 2 роки тому +1

      For one, most TVs can't interpolate content that's already 60 fps. Secondly, there is massive input lag with TruMotion and it is nearly unusable for most people. And third, it has a ton of artifacts. I'm also not sure you can use VRR with it on, but I doubt it.

  • @neilstack4194
    @neilstack4194 2 роки тому +11

    Yeah, Nvidia loves mass-marketing pseudo bar graphs to trick you; DLSS 3, while good, can be misleading in certain titles. Happy Xmas Dan

  • @malazan6004
    @malazan6004 Рік тому +1

    It is fantastic and keeps improving! I was not really that sure on it but yeah after a bunch of testing it's a great feature.

  • @VanquishR
    @VanquishR 2 роки тому +10

    Fantastic review. You went through everything that many bigger reviewers didn't bother to look at. Even though I'm on team red right now, I am interested in seeing how frame generation will help in the future.

  • @Darkswordz
    @Darkswordz 2 роки тому +2

    What's going on with those CPU temps?!

  • @ChekzthisoutTV
    @ChekzthisoutTV 2 роки тому +7

    It's always a gimmick. It's not as smooth as native.

  • @TheExtremeElementz
    @TheExtremeElementz 2 роки тому +1

    Do you have the option to use DLSS 2 with a 40 series GPU? I wonder how that compares.

    • @mattkostandin
      @mattkostandin 2 роки тому +3

      You can use both at the same time. Works fine that way

  • @Dj-jc4ki
    @Dj-jc4ki 2 роки тому +4

    Haven't seen enough videos on frame generation. I think you did a great job and were very informative about the pros and cons and the overall experience. I think overclocking the CPU from its base clock to its boost clock would help the bottleneck since it's heavier this time around, just an idea. Great job again; I stumbled upon your videos via the Intel XeSS one and have enjoyed your content ever since.

  • @tuskeye
    @tuskeye 4 місяці тому

    What I want to know is whether it affects the timing for perfect dodges, perfect guards, or parries in single-player RPG games. For example, DMC or Wukong.

  • @Potaters12
    @Potaters12 2 роки тому +5

    I own a 4090 and can confirm it is NOT a marketing gimmick. Absolute game-changer. The latency addition is there, but no worse than what v-sync would add on. I have not noticed any of the visual glitches yet, and I think you'll only notice them if you take a screenshot and study the image. It gives you free frames without softening the image like DLSS upscaling does.

    • @marklatham1414
      @marklatham1414 2 роки тому +1

      Right, when reviewers zoom in and pixel peep they will find the artifacts. When actually sitting down to enjoy a game with frame generation on, you don't notice it. What you do notice is the gained fluidity in the visuals. A net gain indeed.

  • @slc9800gtx
    @slc9800gtx Рік тому +1

    I have an Nvidia card that has frame generation. I tried it in a few games such as Cyberpunk 2077 and I like it a lot. It smooths the image on the screen, so it looks very good, as well as increasing fps by about 50 percent. As everyone should know, it should not be used in competitive shooters, because you do not want anything affecting latency. But in 1st- or 3rd-person games, I feel it is great. Get a demo or a game and try it yourself. I thought frame generation was maybe a gimmick until I tried it myself, but I found that I like it.

  • @asaneouji15
    @asaneouji15 Рік тому +5

    This technology is amazing. What people don’t realize is that it only adds things and doesn’t remove anything. As long as it looks good while in “motion” it shouldn’t matter if the game generated it or the ai did. It looks incredible :D and I love it! It’s a whole different experience

    • @GMindset959
      @GMindset959 Рік тому +1

      It's a marketing ploy that is used to fool the casual customers.

    • @Wispurr0008
      @Wispurr0008 Рік тому

      Yeah the AI tech is so cool! I was really impressed. I thought path tracing was so cool and it sucked to see that GPU's can't handle it as a base yet.. but they created a usable fix that works! Like you said, as long as it looks good, it doesn't matter how the frames are made. AI only gets better with time too!

    • @Wispurr0008
      @Wispurr0008 Рік тому +1

      @@GMindset959 Naw. They are marketing it, but it's not a ploy.

    • @chrisanytime1
      @chrisanytime1 Рік тому +1

      It removes responsiveness by adding extra latency lol

    • @Wispurr0008
      @Wispurr0008 Рік тому +1

      @@chrisanytime1 Yes, it does increase latency, but having played with it myself, in single-player games like Cyberpunk it's hard to notice or even care about.
      There aren't any multiplayer games like Fortnite where I've needed max settings while being competitive. I wouldn't use it for multiplayer games, but I'll also never need to.

  • @LorentGuimaraes
    @LorentGuimaraes 7 місяців тому

    Please do a video comparing the impact of Vram on FG performance. For example, how much more perf does a GPU get with FG if it has more Vram. Does it not matter, is there a threshold?

  • @neverenoughguitars8276
    @neverenoughguitars8276 2 роки тому +3

    I think this is great for demanding single player games like cyberpunk and plague tale where the highest responsiveness isn't that critical.

    • @mr.grumps3544
      @mr.grumps3544 2 роки тому

      Cyberpunk doesn't even need frame generation to hit 60fps with all raytracing on with a 4000 series gpu. That game is actually surprisingly well optimized now.

    • @neverenoughguitars8276
      @neverenoughguitars8276 2 роки тому

      @mr.grumps3544 60fps? Pshhh. I'll turn on frame generation and get 100fps, thank you very much.

  • @andyguzman6232
    @andyguzman6232 Рік тому +2

    NVIDIA is marketing it as a game changer and it isn't. It is a feature that improves the experience only in very specific situations. For most PC games this will actually hurt image quality and input lag, especially at lower framerates, which means using this on a 4070 or 4060 will almost always hurt image quality to the point where it's not worth using. Also, if I have to turn this on to get good framerates after paying more than 1,000 dollars for a GPU, things are worse than I thought.

  • @Fezzy976
    @Fezzy976 2 роки тому +3

    Google "black frame insertion".
    Nvidia basically copied that and is trying to sell it as revolutionary new tech when it's not new and it's not revolutionary.
    It's just going to make developers lazy with game optimization. Who would care about fixing or releasing an optimized game if they can just add in FG and have that fix their crappy, stuttery code?
    The Witcher 3 update is a prime example. The game doesn't look that much better, but the performance without FG and upscaling is a complete stuttery mess even on a 4090. This is a game that was originally released in 2015!
    These kinds of tech should be baked into a TV for things like upscaling, black frame insertion, etc. That way GPUs will be judged solely on their power and not their gimmicky features.

    • @AndrewB23
      @AndrewB23 2 роки тому

      Black frame insertion literally inserts a black frame, and the fake 240 or even 600 Hz TVs just duplicate the last frame. That's not the same as this; it actually renders a different frame.

  • @packetcreeper
    @packetcreeper 2 роки тому

    Loads of useful information. Thanks for a wonderful video!

  • @imo098765
    @imo098765 2 роки тому +5

    Seen it in person; I don't think DLSS 3 is a gimmick. It really does make the animations more fluid, and for someone who wants better-looking animation at what is most of the time an imperceptibly small cost to latency, it's a fantastic tech.
    However, I believe DLSS 2 is far more of a game changer and I would rather turn that on with Reflex + Boost,
    but I prefer the balance it brings

    • @kickassguy211
      @kickassguy211 2 роки тому

      It becomes a gimmick when Nvidia tries to market it as an actual performance gain.

    • @Drip7914
      @Drip7914 2 роки тому

      @@kickassguy211 if the latency increase is negligible then I don’t see a problem with that

    • @kickassguy211
      @kickassguy211 2 роки тому

      @@Drip7914 There is no problem with the tech itself. The problem "lies" with Jensen misleading people into thinking your performance doubles with DLSS 3 turned on, or the fact that their chart shows the 4090 being 3x to 4x faster than the 3090 with the use of DLSS 3, which isn't true at all.

    • @Drip7914
      @Drip7914 2 роки тому

      @@kickassguy211 it already doubles the 3090 so 3-4x with dlss3 isn’t a lie at all lol

    • @kickassguy211
      @kickassguy211 2 роки тому

      @@Drip7914 The 4090 is at best 60 percent faster than the 3090 on average, and 50 percent faster than the 3090 Ti, nowhere near doubling it. Second of all, I don't think you understood what Daniel explained about DLSS 3; that's why you're still confusing it with real, actual performance. Just because your frames double with DLSS 3 doesn't mean you're playing at double the frame rate; you're still playing at the exact same framerate as before you turned on DLSS 3, plus added latency. The only real improvement you see is the visual fluidity of the animation on your display; the input responsiveness will still feel like it's running at whatever framerate you started off with. Hopefully that makes sense.
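
      One way to see that distinction is to separate the rate at which frames appear on screen from the rate at which the game actually samples your input; only the real frames do the latter. A small sketch of that split follows, with the 45 fps figure chosen purely as an example.

```python
# Displayed framerate vs. input-sample rate with frame generation (illustrative).
# Generated frames only smooth the motion; input is still read on real frames.

def rates(base_fps: float, frame_gen: bool) -> dict:
    shown = base_fps * 2 if frame_gen else base_fps
    return {
        "frames_shown_per_second": shown,
        "input_samples_per_second": base_fps,                  # unchanged by FG
        "ms_between_input_samples": round(1000.0 / base_fps, 1),
    }

print(rates(45, frame_gen=False))  # 45 shown, input sampled every ~22.2 ms
print(rates(45, frame_gen=True))   # 90 shown, input still sampled every ~22.2 ms
```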

  • @astrea555
    @astrea555 2 роки тому +1

    Daniel, you have an LG C1 yet never use its black frame insertion mode, and it's a shame. You're wasting the TV's perfect response time capabilities.
    I'm very curious to see you try DLSS 3 with black frame insertion / OLED Motion Pro on High at both 60 and 120 FPS. The sample-and-hold motion blur will be removed, and this should let you see the DLSS 3 artifacts a lot better because of the huge boost in motion clarity that this BFI mode offers. You need to disable VRR for BFI to function, but in my strong opinion it's always worth it.

    • @Anon1370
      @Anon1370 2 роки тому

      I use BFI on my LG G2. The G-Sync setting seems broken for me; it doesn't work, and certain games do a little stutter dance.

  • @RobertoLaurelli
    @RobertoLaurelli 2 роки тому +8

    As someone who actually uses it, your final words nailed it: slowing it down to show whether it's good does not represent how good it is in game. DLSS 3 is absolutely a game changer and I want it in every game going forward; then let the user decide to keep it on or turn it off (spoiler: in practice most will just keep it on).

  • @cal5217
    @cal5217 2 роки тому +1

    Hi Daniel, I noticed some screen tearing. Is this caused by the recording software, not having gsync enabled, or is it being caused by using DLSS3?

    • @airelleavils8394
      @airelleavils8394 2 роки тому +2

      There's a note in the vid saying it's the difference between the unlocked framerate of his game vs the 60 fps capture card.

    • @cal5217
      @cal5217 2 роки тому

      @@airelleavils8394 Ok thanks, missed that.

  • @bigmack70
    @bigmack70 2 роки тому +5

    Killer feature in witcher 3, portal, and Warhammer Darktide. Was nice quality of life in Requiem.

  • @pascalfilion3518
    @pascalfilion3518 Рік тому

    That was an awesome video, thank you. I just bought a 4090, coming from a 6900 XT, and you gave me all the information I needed.

  • @DavideDavini
    @DavideDavini 2 роки тому +4

    So, it’s not a gimmick or a game changer. It’s just a thing you can do in very specific situations where it could help.

    • @airelleavils8394
      @airelleavils8394 2 роки тому

      I wanna see where this goes so I'm not dismissing dlss 3 like I dismissed dlss when it first came out.

    • @pedestrian_0
      @pedestrian_0 2 роки тому

      Isn't a "thing you can do in situations where it could help" a game changer? It easily beats those videos that tell you to try 12 different things to help you gain performance.

    • @DavideDavini
      @DavideDavini 2 роки тому

      @@pedestrian_0 not when the situations are very specific.
      But, here I’m going off of what YTers say. Ain’t got no 40 series to test it for myself.

  • @austinchasteeny
    @austinchasteeny 2 роки тому

    Man, this is so cathartic; I've been arguing for days in these comments with people who can't fathom that ray tracing increases CPU load.

  • @MarcinQ_53
    @MarcinQ_53 2 роки тому +3

    I have a 4090. I was worried about the latency, but when playing Spider-Man MM with DLSS 3 I'm able to do perfect dodges with no problem, which means (imo) that the increase in latency isn't a problem at all, at least in single-player games. Of course there are some artifacts, especially if you know where to expect them; however, 135 fps with everything maxed out (1440p) is good enough. I bet that if you let someone with zero knowledge about graphics play, they won't even notice that something is "wrong". Thank you for your videos Daniel. They were so helpful when I was deciding whether going for a 4090 was a good idea.

  • @morrays1996
    @morrays1996 2 роки тому +2

    The argument that you'd want to turn this off in a competitive game is funny to me, because both the 4090 and 4080 run circles around esports titles at max settings, let alone the low settings most competitive sweats run their games at. They don't need frame generation in competitive games; this would exclusively be used on story-driven titles/campaigns lol.