DLSS 3 - My predictions about the feature

COMMENTS • 512

  • @MonkingFlame 1 year ago +856

    I can't wait for DLSS 6.0, where you play any game just by giving it a prompt and the game is generated on the fly by the neural network.

    • @dwaynehawkins 1 year ago +12

      It could be like that, yes... games/movies/VR/... will be generated on the fly, in real time.

    • @zzzyyyxxx 1 year ago +22

      This but unironically

    • @wuzz111 1 year ago +9

      6.0 is generous, at least it will be 60.0

    • @wonkehcheetah1138 1 year ago +11

      I would so just try to utterly break the Network. I'd be like: "Computer, generate Elmo's Number adventure, but make it an open world third person RPG that crosses over into the Doom universe. Replace Big Bird with Count Chocula, and deep fry all the textures."

    • @buglocc 1 year ago +1

      So playing a game with normally rendered frames?

  • @TheLifeFruit 1 year ago +120

    Hey, nice video! After watching countless RTX 4000 DLSS 3 videos with pretty much the same information, this video differed and gave me an actual idea of how DLSS 3.0 may work. Thanks a lot, Philip

    • @HyperOpticalSaint 1 year ago

      IMO Nvidia is not so dumb that DLSS 3 would look anywhere close to the videos in this one, but the responsiveness issue simply cannot be avoided unless the AI is smart enough to build a mini time travel machine inside each card.

    • @marceelino 1 year ago

      @@HyperOpticalSaint So it's useless, then. Usable only in single-player strategy games or sim games.

    • @ChrisM541 1 year ago

      DLSS 3 injects FAKE FRAMES to give a FAKE higher FPS.
      --> DON'T BELIEVE NGREEDIA'S MARKETING BULLSH#T...!!!!
      Your eyes will be opened to the depth of Nvidia's anti-consumer practices once the reviews are out... practices we already know, such as rebranding a 4070 as a much, MUCH more expensive 4080 12GB...!!!!

    • @sayorimiko 1 year ago +1

      @@marceelino So it's not useless, considering the other uses you mentioned. Also for gamers who want to experience scenic visuals in exchange for a bit of latency. Not everyone plays games that demand very low input lag, so I don't know why people are saying that DLSS 3 is a failure.

    • @marceelino 1 year ago

      @@sayorimiko well the uses are minimal as you are getting the same value from FSR 1.0.

  • @kllrnohj 1 year ago +222

    The latency hit would make this unusable for VR, where that lag becomes something that causes nausea instead of just a mild annoyance depending on the game

    • @reptarien 1 year ago +44

      Unless it was somehow in the single digits of milliseconds, completely agree. For VR, we just need more actual raw power, faking it can only get us so far.

    • @xavierpony192 1 year ago +39

      Many games utilize ASW, which is frame rate interpolation.
      Some Quest 2 games already run at 60 fps and get interpolated up to 120 Hz.
      On PC the same thing happens: when your PC can't keep up with 90 fps, it switches to 45 fps and frame rate interpolation is used.

    • @xsct878 1 year ago +24

      Oculus already has frame insertion when frames drop below a threshold and it's pretty decent.

    • @flamingscar5263 1 year ago +18

      VR has had this tech on the Oculus side of things since the original Rift.
      It's called ASW (Asynchronous Space Warp), and what it does is: if your FPS can't match your headset's refresh rate, it locks your framerate at half the refresh rate (so 45 for 90 Hz, 60 for 120 Hz, etc.) and then interpolates every other frame (see the sketch after this thread). I'm not 100% sure if ASW uses AI/neural networks or not.

    • @rtyzxc 1 year ago +21

      No, it's not an issue: VR reprojects head movements so there's no latency for that. For objects like hands there are artifacts.
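
A minimal sketch of the half-rate scheme described in this thread, assuming a 90 Hz headset; the threshold, function names and the "warp" stand-in are illustrative, not Oculus's actual implementation:

    # ASW-style half-rate frame synthesis, illustrative only
    REFRESH_HZ = 90.0

    def choose_mode(measured_fps):
        """Return (app render rate, whether synthesis is active)."""
        if measured_fps >= REFRESH_HZ:
            return REFRESH_HZ, False      # app keeps up: no synthesis needed
        return REFRESH_HZ / 2.0, True     # lock to half rate, synthesize the rest

    def present(real_frames):
        """Every real frame is followed by one synthesized in-between frame."""
        shown = []
        for f in real_frames:
            shown.append(f)                   # real frame
            shown.append("warp(" + f + ")")   # extrapolated filler frame
        return shown

    print(choose_mode(72.0))       # (45.0, True): 45 real + 45 synthetic = 90 shown
    print(present(["f0", "f1"]))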

  • @checker297 1 year ago +484

    I feel this is similar to how supermarkets no longer stock 4-packs of beans and force you to buy them individually, so overall you pay more. If DLSS 3.0 is just a marketing headline, these new GPUs are terrible value. It will be interesting to see how AMD responds, as their cards were really competitive last gen.

    • @devetefan 1 year ago +14

      This guy beans

    • @GewelReal 1 year ago +37

      Aldi 3.0

    • @WayStedYou 1 year ago +5

      We still have 4 packs of beans in Australia but they are usually worse value than buying a single larger can.

    • @LillyP-xs5qe 1 year ago +2

      Lidl still sells 4-packs.

    • @BillCipher1337 1 year ago

      They already did, by lowering the 6900 XT to €780 compared to €1100 for the 12 GB 4080, with more or less similar raster performance going by the Nvidia benchmarks of RE8, AC Valhalla and Division 2.

  • @KiemPlant 1 year ago +18

    Input lag was the first thing that popped into my mind when they said they were going to interpolate frames. I hope there's an option where you can easily disable that and only upscale the image, especially in competitive games that are already well optimized.

    • @VioletViolent 1 year ago +1

      It seems that frame generation is a separate toggle for DLSS 3.

  • @WayStedYou 1 year ago +295

    Don't let this distract you from the fact that the shader gap between the "4080" 12 GB and 16 GB is larger than the gap between the 3080 and 3090 Ti, which all used the same 102 die.
    Jensen trying to pad his wallet even more.

    • @AGuy-vq9qp 1 year ago +2

      If you don't like it don't buy it.

    • @redfoxdog1 1 year ago +7

      my wallet is augmented

    • @hOREP245 1 year ago

      @UCEKJKJ3FO-9SFv5x5BzyxhQ my arse

    • @sinom 1 year ago

      @UCEKJKJ3FO-9SFv5x5BzyxhQ I think they meant that the 3080 and 3090 Ti used the same die, not that the two 4080s use the same die. According to the TechPowerUp GPU database, the 3080 and 3090 Ti both use the GA102 die.
      Edit: he changed his comment while I was writing mine. Now mine doesn't make as much sense.

    • @RobBCactive 1 year ago

      @@2kliksphilip Looking around, it's believed to be AD102, AD103 and AD104, the last for the $899 RTX "4078"; interestingly, VideoCardz thinks the 4080 uses 22.5 Gbps GDDR6X, which is faster than the other models.

  • @HolyOllie 1 year ago +78

    God, I can't express how grateful I am for your tech videos. Thank you, Philip

    • @LSK2K 1 year ago

      You're welcome.

  • @SigSelect 1 year ago +23

    I'm not so sure about input latency being an acceptable trade-off for VR users. John Carmack talks at length about how the biggest hurdle in getting VR to a usable quality was just reducing input latency, since delays are so noticeable and intolerable with an HMD. DLSS might not take VR back out of that acceptable range, but it may be the case that, like in competitive games, input latency for any VR game becomes a very sensitive benchmark. You can already see it when you look up benchmarks for VR setups: frametime is an oft-included stat.

    • @cube2fox 1 year ago +1

      VR apps may care about high frame rate less because it's smoother and more because a higher frame rate lessens input lag, since you don't wait as long for a new frame. But DLSS 3 does the opposite: it makes the input lag longer than it would have been at the lower (non-interpolated) frame rate.

    • @paul1979uk2000 1 year ago

      @@cube2fox I'm not sure; when it comes to VR, we tend to be a lot more sensitive to what we see, hence why higher frame rates are a must. DLSS 3 adding higher latency might be a lot more noticeable in VR than in standard gaming.

    • @SianaGearz 1 year ago

      If they want to target VR, they'll probably have to add a predictive resolve mode to the algorithm, as opposed to interpolative resolve for quality-sensitive applications. But it's possible to do that without introducing latency you don't already have with framewarp, or just a little, in the 2 ms range, the tiniest fraction of a frame.
      But also, absolute latency maybe isn't really THAT much of an issue. You can just hide it by reprojection, so your view is updated at the minimum possible latency on head turn, with no added latency, while game events and movement receive a frame of additional latency. It might just work and not be noticeable. Worth trying!
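
A toy version of the reprojection idea above: warp the last rendered frame to the newest head pose right before scan-out, so head-turn latency stays minimal even when the frame content is a frame old. Real reprojection warps in 3D; this 2D pixel shift and the FOV number are only illustrative:

    import numpy as np

    FOV_DEG = 90.0   # assumed horizontal field of view

    def reproject(frame, yaw_at_render_deg, yaw_now_deg):
        """Shift the image horizontally by the yaw change since the frame was rendered."""
        px_per_deg = frame.shape[1] / FOV_DEG
        shift = int(round((yaw_now_deg - yaw_at_render_deg) * px_per_deg))
        return np.roll(frame, -shift, axis=1)   # edges would be filled or vignetted in practice

    frame = np.arange(12.0).reshape(3, 4)       # stand-in for a rendered image
    print(reproject(frame, yaw_at_render_deg=0.0, yaw_now_deg=30.0))   # shifted by ~1 px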

  • @Enivoke 1 year ago +318

    In other words, allow the 3000 series to use DLSS 3 and you'll see less performance difference between the 30 and 40 series.
    Not to mention Nvidia is trying to sell an x70-class GPU at an x80 price.

    • @MrZodiac011 1 year ago +16

      Well, it depends. For ray tracing there are like 4x as many RT cores as the 30 series, so for RT performance it'd still be well ahead, and they did show the difference between DLSS 3 and 2. It's also interesting that people keep saying it's a 70-class GPU at an 80-class price, because we will eventually have a 70-class GPU, and they've done stuff like this before: the 1060 3GB and 1060 6GB were really more a 1060 and a 1060 Ti, because the 6GB had more of other things, so it was faster in general. I also don't know why this is considered a bad thing, because if they had made this a 4070 at its price, people would have been more angry; this means the eventual 4070 will be cheaper.

    • @kamikazilucas 1 year ago +7

      An 80 Ti price, really.

    • @user-wq9mw2xz3j 1 year ago +9

      @@MrZodiac011 The 1060 3GB and 6GB were much closer; in fact, there was no big performance difference, contrary to what people claimed. These cards have a much bigger CUDA core difference.
      And just compare it to last gen: the "4080" 12GB doesn't even beat the 3090 Ti.
      The 3070 beat the 2080 Ti, the 2070 beat the 1080 Ti, the 1070 beat the 980 Ti, the 970 beat the 780 Ti.

    • @Japixx 1 year ago +3

      Salty 30 series owner?

    • @Funkoh 1 year ago +3

      @@Japixx No, we just don't have as much disposable income as you, and are unable to spend at least $899 on a GPU with lackluster performance compared to previous generations. Furthermore, we wish for a relatively pricey GPU released only a few years ago to be compatible with software which seems very beneficial and amazing, yet is locked behind exclusivity deals.

  • @xyoxus 1 year ago +4

    When filming videos on smartphones, we already have everything filtered and processed while recording.
    So yes, I'm 100% with you on your last sentence.

  • @markackerman9485 1 year ago +6

    0:22 "Why hasn't this been done for games already?"
    It kinda has, for VR: Asynchronous Space Warp with Meta headsets, Motion Smoothing on SteamVR headsets.
    It's actually a pretty good experience on the Valve Index to do fixed-framerate motion smoothing at 72 Hz to 144 Hz. I didn't like anything lower than that.
    And when I play on Oculus I disable it entirely, because I'm already adding a lot of latency with Virtual Desktop streaming.

  • @jamesthorogood1479 1 year ago +1

    First baked beans, now DLSS.
    Thank you for blessing our day, Philip.

  • @SianaGearz 1 year ago +1

    I did an experiment once, because YouTube didn't support 60fps at the time but I wanted to upload smooth-looking synthetic footage.
    So I captured at 60fps and wrote an AviSynth script which up-interpolated it to 150 or 180fps (I forget), then filtered it down to 30. So it added synthetic motion blur, basically. And it looked decent!
    Here's what I'm thinking. Let's say you don't have VRR; say you have a TV running at 60 Hz. It looks good, but your GPU can only hit a real 60 most of the time, with occasional drops down to 56 you'd like to hide, and maybe you don't have enough CPU or compute throughput to make it happen without taking too much of a compromise elsewhere. In that case, I think frame resynthesis from motion, similar to how VR does it (generally running at something like 90 Hz there), might be the ticket.
    It absolutely makes sense to integrate it into DLSS, since the resynthesis can leverage the same motion vector, lower-resolution and accumulation data, just doing this framerate-independently instead of frame-synchronously.
    I have a hunch that the limitation of DLSS 3 to the upcoming series of GPUs is artificial, though.
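
The averaging step of the experiment above is simple enough to sketch; the interpolation itself is stood in for by ready-made frames (AviSynth, or ffmpeg's minterpolate filter, would produce them in practice):

    import numpy as np

    def blur_downsample(frames_180fps, group=6):
        """Average every `group` interpolated frames: 180 fps in, 30 fps out,
        with synthetic motion blur baked into each output frame."""
        out = []
        for i in range(0, len(frames_180fps) - group + 1, group):
            out.append(np.mean(frames_180fps[i:i + group], axis=0))
        return out

    # 18 fake 1x1 "frames" standing in for a bright object sweeping across the screen
    frames = [np.full((1, 1), float(v)) for v in range(18)]
    print([f.item() for f in blur_downsample(frames)])   # [2.5, 8.5, 14.5]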

  • @HoneyTwee 1 year ago +12

    Definitely in the same boat as you, Philip: knowing about the wonders of frame interpolation, but never thinking it could be applied in real time to games like this.
    But this is honestly the most comprehensive video I've found on DLSS so far; not even big journalists have broken it down as well as you have here. So hats off. Great vid. 🎉

    • @HoneyTwee 1 year ago +3

      @@2kliksphilip Yeah, we'll have to see. Hopefully you plan on covering the DLSS 3 situation more as it develops; I'm really looking forward to that.
      And the AI detail/colour overlay weirdness that the GTA example shows off, really looking forward to stuff like that.
      But I am going to stand on the positive side of things here, because I think if you're used to, say, 60fps and turn it on to get 120fps with the latency of 60fps, you'll enjoy it.
      If you're used to 120fps and want to play a harder-to-run game with DLSS 3 to get 120fps, you're probably going to find it laggy at best and sickening at worst.
      But time will tell. 🙂

    • @eniff2925 1 year ago

      @@2kliksphilip Reflex only helps in situations where one is GPU limited (GPU usage = 100%); it doesn't decrease latency at all in CPU-limited scenarios. Setting a framerate cap to limit GPU usage has the same effect on input latency as Reflex does (see the sketch after this thread).
      I have seen ideas about this a while back on some forums, but I never thought Nvidia would come up with it out of the blue.
      There will be a delay added that cannot be circumvented in any way. My guess is it will be at least 2 frames.

    • @HoneyTwee 1 year ago

      @@eniff2925 Just checked out the Digital Foundry DLSS 3 first-showcase video. They went over latency in scenarios where they are CPU limited and GPU limited (before turning on DLSS 3 frame generation).
      And whilst Nvidia were very careful about what they allowed Digital Foundry to show in that video (FPS numbers being a no-no, for example), the latency is better than native (due to native being pretty bad, as it's a low frame rate) and about on par with DLSS 2 latency without Nvidia Reflex.
      It looked really good latency-wise. The only thing to keep in mind is that whenever they showed DLSS 3 frame generation, they also had DLSS upscaling in performance mode, so they were always working with quite a high FPS (100fps turned into 200fps in Spider-Man, for example); using frame generation to go from 40fps to 80fps might be much more of a problem.
      And of course the latency isn't really changed at all when you have DLSS 3 frame generation and Nvidia Reflex both on. So whilst latency doesn't get worse, you're not getting half the benefit of the higher FPS, which is less latency. So will 200fps in Spider-Man with DLSS 3 actually feel like 200fps? No, it will just look like it. Weird tech, but super cool. Hopefully it feels great to use for those 10 people who can afford a £950 RTX 4070 ☠️

    • @eniff2925 1 year ago +1

      @@HoneyTwee I also watched it. They had DLSS 2 active, which more than doubled the framerate in some cases; that was the reason for the decrease in latency. Turning DLSS 3 on top of that increases latency quite heavily, but it is offset by DLSS 2. Compared to native, DLSS 3 should increase latency by at least 2 frames according to my logic. Reflex is not connected to this at all; it cannot help directly in this use case. Reflex was created in response to battlenonsense's findings about input latency drastically increasing in GPU-bound scenarios. That input lag can be completely eliminated by setting an FPS cap, so Reflex is not very beneficial if you know what you are doing.

    • @HoneyTwee 1 year ago

      @@eniff2925 True, it would have been cool to see more about the latency penalties of DLSS 3, but it seems they were quite limited in what they could show.
      But if I'm not mistaken, the latency they reported with DLSS 3 + DLSS 2 + Reflex was still slightly better than DLSS 2 on its own, and a lot better than native. So I assume Reflex is making quite a lot of difference, even if the overall latency improvement compared to native is probably mostly coming from DLSS 2.
      Honestly though, DLSS 3 looked good from what they showed. I don't think it's make-or-break to have it yet, so I'll probably still get an AMD card like an RX 7800 XT if it's £850-ish and around a 4080 16G, but DLSS 3 looks like really promising technology that I will be glad to have by the time it's in more games, and hopefully by then AMD will have a competitor to it themselves.
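
Back-of-envelope numbers for the Reflex point raised in this thread: when the GPU is the bottleneck, the CPU runs ahead and submitted frames sit in a queue, each adding roughly one frame time of click-to-photon latency; a frame cap (or Reflex-style pacing) keeps that queue empty. Figures are illustrative, not measurements:

    def latency_ms(frame_time_ms, queued_frames):
        # one frame to render, plus one frame time per frame already waiting
        return (queued_frames + 1) * frame_time_ms

    frame_time = 1000 / 60                            # GPU-bound at 60 fps
    print(latency_ms(frame_time, queued_frames=2))    # ~50 ms with a 2-deep render queue
    print(latency_ms(frame_time, queued_frames=0))    # ~16.7 ms with the queue kept empty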

  • @adrianoperillo 1 year ago +1

    Why are you so weird and yet so cool? I can't help liking this sense of humor of yours so much.

  • @milanmestka416 1 year ago +1

    Laughed my as* off at the motion extrapolation demo :D Anyway, nice informative video!

  • @Mezurashii5 1 year ago +7

    I think it would be best to have a way to integrate the tech into the rendering pipeline, so that the frame generation uses an unfinished frame + the last frame instead of the 2 last frames.
    The paint-drawing-to-realistic-image tech you showed at the end is probably a step in that direction.

  • @poppymuffinseed 1 year ago +10

    Honestly, I just hope the artificial frames thing is optional and DLSS at its core keeps improving!

    • @mariuspuiu9555 1 year ago

      I think you can switch between 2.0 and 3.0.

    • @malazan6004 1 year ago

      I doubt it, unfortunately... this new fake-frame tech is the thing going forward vs improving 2.0.

  • @NiktoTheNobody 1 year ago +6

    Based on the available information, I believe DLSS 3.0 could be useful on strong PCs with very high refresh rate displays, 240 Hz or 360 Hz. If you have a PC that can already run games at, say, 120 or 144 FPS without drops at a decent resolution, which is already smooth in regards to latency, you could then run the new DLSS to reach the higher frame rate, achieving smooth gameplay and supremely smooth visuals.
    However, I think anything below 120 FPS is too low for this kind of use, and I don't think it's going to work well in regards to latency; as was mentioned in the video, it will be worse the worse the base FPS is. This is also one of the reasons why I think the mid-range GPUs were not shown yet: they can't run the games natively at a good enough resolution/framerate for the new DLSS to provide good enough results. I wouldn't be surprised if, in the end, they don't have this feature available, perhaps with the exception of the 4070.

  • @corebite 1 year ago +10

    I'm not sure, but this sounds a lot like Oculus's Asynchronous Space Warp (ASW), which halves your frames and interpolates the other half. It works well in slow games or racing games, but less so in fast games, as you run into the same artifact issues explained in your video.

  • @Content_Deleted 1 year ago +1

    Couldn't stop laughing my ass off at 2:19, which made it impossible to hear what you were saying. Philip pls fix. Jack Torrance could get phased out of the next Shining if you're not careful xD

  • @JonathanDavidsonn 1 year ago +1

    One thing I really found unique about why this is the new DLSS 3.0 is that it piggybacks on the pipeline, which already contains motion vectors for the 3D scene, is easy to update and maintain, and naturally slides into the DLSS framework without having to add more code to the project. It's all encapsulated in one single package, rather than having two DLSS branches to integrate.
    Other frame interpolators have to predict and implement many other algorithms to create a "fake" 3D perspective in order to keep the frame well interpolated in some 3D games. That creates those weird visual effects Philip points out, but with DLSS's framework, they get the motion vector data by checking the velocity between the vertex data on each frame inside the vertex buffer.
    So that's why it's part of DLSS 3.0: DLSS at its core utilises motion vectors within its algorithm, so why bother creating an entirely new shader package for devs to integrate alongside DLSS 2.0 when they can just focus on shipping the entire DLSS 3.0 umbrella? I think they're trying to extend DLSS from just an upscaling algorithm to more of a performance optimisation toolkit in the future.
    I feel like the main drawback of DLSS 3.0's interpolation isn't going to be weird visual artifacts, but more motion blur, because it knows the motion of a scene with each frame and it isn't quite tuned to completely sharpen the image. The main issues we'll have are with interpolating transparent/fading objects, inconsistent performance, and weird race conditions when it comes to frame times in some titles.
    I can also imagine DLSS 3.0 being REALLY good for games which are frame-locked to 60, where breaking the frame limiter messes with the physics/timesteps of the game, as it lets you get a smoother feel rather than the crappy locked framerate :) This probably won't impact the timestep calculations, and it lets you get a much smoother experience without breaking your entire gameplay experience. So emulators might benefit A LOT from this.
    Thanks for reading my tedtalk. I was interested in becoming a graphics programmer while I was studying at university but gave up that path; if you've got questions about things like this, feel free to ask c:
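
The core arithmetic behind the engine-provided motion vectors described above is tiny: the screen-space difference between where a point lands this frame and where it landed last frame. Engines render this per pixel into a motion-vector buffer that DLSS consumes; the positions here are just for illustration:

    import numpy as np

    def motion_vectors(prev_screen_pos, curr_screen_pos):
        """Per-point screen-space motion in pixels: where each point moved from."""
        return curr_screen_pos - prev_screen_pos

    prev_p = np.array([[100.0, 200.0], [320.0, 240.0]])   # projected positions last frame
    curr_p = np.array([[104.0, 198.0], [320.0, 244.0]])   # projected positions this frame
    print(motion_vectors(prev_p, curr_p))                 # [[ 4. -2.] [ 0.  4.]]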

  • @gaydogs 1 year ago +4

    ive been thinking for a while that the game/pc hardware industry will go down a sort of AI based approach for a while. on one hand, i think DLSS 3 could really help with some of those cases mentioned (though, i think id turn frame generation off for most uses, dont mind the almost free SSAA though), but i believe its only a matter of time until most things are handled with AI intervention, like your example on style transfer.
    especially as the temporal stability increases on those sorts of AI, they will be used more and more to add or enhance details like grass. instead of the traditional way of rendering individual blades and calculating the shadows and AO, you could render a few and supply the AI with information on how to fill the space, approximating the shadows and AO and rendering grass far cheaper than before.
    but of course, handing things off to AI comes at the cost of the image representing the same thing for everybody. something about the idea of a pixel not representing the geometry behind it, rather an AIs approximation of what it thinks it should look like, upsets me. i dont know why. its sort of silly when i think about it, but it does.

  • @TheHalfmanofOz 1 year ago

    That slow mo footage. Majestic.

  • @jonathonjollota8661 1 year ago

    I LOVE THE SELF VIDEO FOR PROOF. It proves the point perfectly, yet makes me just laugh and enjoy your content even more.
    I hope they add DLSS 3.0 to CS:GO so you can redo the input lag and hitbox videos again =)

  • @Navhkrin 1 year ago +27

    I would have preferred if they simply focused on upscaling and threw all that performance at upscaling instead. I don't like the added latency, which is what I'm trying to minimize with DLSS in the first place.
    What they could have done with DLSS is tell the game engine to render only the color (no shadows or GI). This would give them a geometrically accurate representation of the game world. They could then feed this to the AI to generate a realistically illuminated version, with very noisy ray tracing added in for consistency.
    Rendering only the color is much faster than a full render pass; you can test this with UE5 pretty easily. It also won't suffer from the input lag of frame interpolation.
    Another trick is foveated ray tracing. There are only tech demos of this available atm, but I think it is an amazing idea.

    • @HeloisGevit 1 year ago +2

      Why would you want very noisy ray tracing? That's terrible.

    • @RobBCactive 1 year ago +1

      I'm not sure they have done it as suggested. Flight Simulator was CPU bound, and input lag isn't a big deal there. But Digital Foundry has clips from Cyberpunk; past DLSS caused input lag, so I'd be surprised if Nvidia made the same mistake twice.
      Anyway, I know 3rd-party reviewers will investigate the latency angle to check it out.

    • @Navhkrin 1 year ago

      @@HeloisGevit Because they have really powerful AI compute to cover it up. I mean, right now they are rendering completely artificial frames; very noisy ray tracing can help massively with this.

  • @fly-hii 1 year ago +1

    babe wake up, new kliksphilip upscaling video

  • @MerryBytes 1 year ago +6

    Nvidia refers to the two separate functions of DLSS 3 as Super Resolution and Frame Generation, and in fact it seems to basically be DLSS 2 but with that new Frame Generation part, still relying on many of the same inputs (motion vectors and depth info, combined with an "optical flow field"). Since the Super Resolution part of DLSS 3 is still backwards compatible with RTX Turing and Ampere, presumably the two features can be toggled individually. Or at least I hope so, in case someone likes one or the other instead of both.
    Either way, while the prices can suck a dick (especially for people in Europe, who will get screwed over extra hard), I genuinely hope Nvidia pulls off real-time frame interpolation successfully. We are continuously moving to a philosophy of "render smarter, not harder", and this would be a massive step forward. I won't be buying a 4000 series card, but I'm excited to see this technology. I wasn't expecting this sort of thing until a few GPU generations away.

    • @MLWJ1993 1 year ago +1

      I'm definitely curious how well this works in practice, since currently available frame interpolation is not good for real-time purposes.

    • @speedstyle. 1 year ago +1

      Smart TVs already do real-time motion interpolation, and Nvidia has made fancy hardware for it, so I'm sure it will be high quality. But for interactive content I don't know if interpolation will ever be good; I'm hoping this is a step towards extrapolation.

  • @dimaz3 1 year ago +10

    The real reason Nvidia is focusing on DLSS 3 and gatekeeping it is that the power difference between the two gens is actually not as big, and they need to hook customers somehow. If AMD comes up with a similar solution, they'll walk it back and enable it on the old gen, just like FreeSync and G-Sync.

    • @ladrok97 1 year ago +1

      @@huleyn135 It's 50% in games Nvidia picked, so it's probably the max jump.

  • @lucybell6815 1 year ago +17

    EDIT: I was corrected; the claims made in this comment aren't true, the presentation contradicts them.
    There are a few reasons I believe that DLSS 3 is, in fact, predicting frames as you proposed at the beginning of this section, and not placing them in between already generated frames as you seem to believe is the case.
    Firstly, the article about DLSS 3 on the Nvidia website has two images which show the DLSS 2 and 3 pipelines, including the framerate and input lag after DLSS has done its magic. These pictures both show an improvement in input lag when DLSS 3 is used compared to DLSS 2. That may be Reflex working its magic even if DLSS 3 is still adding lag by not predicting, but I find that unlikely. Of course, that's just my not massively informed opinion, so that alone is not particularly swaying.
    My second reason is that there is precedent for versions of framerate interpolation that predict the next frame to reduce input lag: those seen in VR. Oculus Spacewarp and Valve's solution both attempt to do this exact thing with mixed quality. It has been a couple of years, I think, since the release of these two methods, so it's not entirely surprising to me that a new set of graphics cards, with additional hardware dedicated to extrapolating useful information from previous frames (the Optical Flow Accelerator mentioned in the article), could do this with better results than the slightly older VR solutions.
    Of course, it's very early days, and the existence of methods that can predict doesn't necessarily mean that DLSS will. Perhaps we don't have enough information to make a strong statement either way, but those are the reasons I believe it is predicting frames, and why I think it is incorrect to suggest that it's placing frames in between previously generated frames.
    If I lack critical information found elsewhere, or misinterpreted some part of the article, or maybe even misunderstood the video, please someone let me know.

    • @vyor8837 1 year ago

      @@2kliksphilip It's been confirmed in an interview that they are placing it between the two frames.

  • @muizzsiddique 1 year ago +4

    Also, DLSS 3/motion interpolation is temporal upscaling.

  • @johnclark926 1 year ago

    Looks like your predictions were all correct. Well done!

  • @theopenrift 1 year ago +2

    As someone who despises frame interpolation on televisions, I really, REALLY hope that developers implement the image upscaling and FPS aspects of DLSS separately, because this FPS interpolation thing undermines THE WHOLE POINT of having higher framerates: more responsiveness, less input lag.

    • @matthewcullen3486 1 year ago

      I despise frame interpolation on TVs as well, yet this version genuinely excites me because of how fundamentally different it is.

  • @AVerySillySausage 1 year ago +1

    The point about it only being used to produce frames when needed is interesting. I have a 4K 120 Hz display that I use for most games and would only want extra frames generated in order to hit that 120fps mark when needed. If I can already get 120fps without any generated frames, then that is what I want to see. But I'm not sure it will work that way. So by turning on DLSS 3, I'm essentially limiting my real framerate to far below what it usually would be, and I'm paying an input lag penalty whenever my machine could in theory be maintaining 120fps without DLSS 3.

  • @bulutcagdas1071 1 year ago +23

    I can't wait for the subscription-based, DRM-locked graphics cards of the future.

    • @egesanl1 1 year ago +1

      The future is now, old man!
      Three words:
      Nvidia on Linux.
      The way they make drivers for Linux is absurd.

    • @tablettablete186 1 year ago

      NVIDIA GRID, *cough cough*...

    • @bulutcagdas1071 1 year ago

      @@egesanl1 I have used their Linux drivers on a laptop, and while they are certainly more clunky and not open source (compared to AMD), I didn't get the vibe that they were all that different from the Windows ones.

    • @bulutcagdas1071 1 year ago

      @@tablettablete186 Those are the server-grade cards, right? I don't know much about them, but honestly I wouldn't be surprised if they did have subscription-based drivers; that's not all that uncommon in the enterprise world.

    • @tablettablete186 1 year ago

      @@bulutcagdas1071 Server and professional grade. This is needed if you want to use the virtualization capabilities of the cards, for example.

  • @dwaynehawkins 1 year ago +38

    If you have 30fps and DLSS 3.0 brings it up to 90fps, you'll still have the input lag of playing at 30fps.
    So yes, you said it right: in order for the "motion flow" to work in a pleasant way, you'll need a reasonable input fps to begin with.

    • @nobodynoone2500 1 year ago

      So, totally not worth it, ever. Got it.

    • @raresmacovei8382 1 year ago +1

      Well, no.
      DLSS reduces resolution and improves performance,
      so 30 fps becomes 45.
      Then DLSS interpolates, so 45 fps will feel like 45 fps and look like 90.

    • @archinatic678 1 year ago +1

      If it is actually waiting a full frame, as talked about in the video, you'll have even more input lag.

    • @HeloisGevit 1 year ago +1

      It's like how DLSS 2 works in the spatial dimension, but now in the temporal dimension: better results the larger the data set you feed it to begin with. So it does a great job when fed 1440p and asked to output 4K, less so when fed 720p and asked to output 1080p, etc. Likewise, DLSS 3 should do a great job when fed 60 and outputting 120, not so much when fed 30 and outputting 60.
      Which means, if you can achieve a native 1440p 60fps without DLSS, expect DLSS 3 to give you 4K 120fps that's of comparable quality to native 4K 120fps at a quarter of the performance demands. That's huge.

    • @complover116 1 year ago

      In fact, you will have MORE input lag, since interpolation needs to know the NEXT frame to insert a generated one, not just the current one!
      And of course, all of this is artificially locked not only to NVIDIA GPUs, but to the 4xxx series specifically...
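
A timeline sketch of the latency point made in this thread: interpolation doubles what the display shows, but each real frame must wait for its successor before anything between them can be shown, so responsiveness stays at (or slightly below) the lower rate. Numbers are illustrative:

    def interpolated_stream(render_fps, n_frames=4):
        ft = 1000 / render_fps               # render frame time in ms
        events = []
        for i in range(1, n_frames):         # frame i-1 can be shown once frame i exists
            done = (i + 1) * ft              # frame i finishes rendering here
            sampled = (i - 1) * ft           # when frame i-1's input was sampled
            events.append((done, f"show real frame {i-1} (input sampled at {sampled:.1f} ms)"))
            events.append((done + ft / 2, f"show interpolated frame {i-1}.5"))
        return events

    for t, what in interpolated_stream(render_fps=30):   # 30 fps rendered, 60 fps shown
        print(f"{t:7.1f} ms  {what}")                    # ~67 ms from input to real frame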

  • @kabu7616 1 year ago

    I think the best use case is to set a target frame rate (60/120/…), and when the framerate dips below that threshold DLSS kicks in and adds additional frames to the existing ones to keep it as smooth as possible.
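
A sketch of that threshold idea; note this dips-only logic is hypothetical, as DLSS 3 as announced doesn't advertise such a mode:

    TARGET_FPS = 120.0

    def frames_to_generate(real_fps):
        """How many synthetic frames per real frame to roughly reach the target."""
        if real_fps >= TARGET_FPS:
            return 0                          # already smooth: show only real frames
        return max(1, round(TARGET_FPS / real_fps) - 1)

    for fps in (144, 110, 60, 40):
        print(fps, "->", frames_to_generate(fps), "generated frame(s) per real frame")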

  • @Yogsther 1 year ago +1

    I feel like the biggest point of playing at a high fps is the lower input lag; this maybe makes sense for playing at a very low frame rate with a controller.

  • @-41337 1 year ago +1

    I think how useful this feature will be in VR is overstated. The fact is that image latency has a massive impact on the VR experience and is actually one of the greatest challenges in bringing cheap VR to the mass market. Latency in VR causes VR sickness.

  • @D1.y 1 year ago

    What a time to be alive

  • @riv4lm4n 1 year ago +2

    This is neat and all, but going by the trend of the last couple of years, it will be used as an excuse for bad game optimization and other stupid things, rather than being a nice bonus on top of a well-made game.

  • @AndreInfanteInc 1 year ago +1

    The thing you describe (generating a fake frame when the game can't keep up) is already a built-in feature of the Oculus runtime called Asynchronous Space Warp (ASW). It's extrapolative instead of interpolative, though; interpolative is a total no-go in VR, as extra latency makes you very sick very quickly.

  • @Rainquack 1 year ago +3

    2:40 - ***scrolls down to the comments, to hide in fear***

  • @hundvd_7 1 year ago

    0:18 And -Door Stuck-, YouTube's best video

  • @gabrielrizzo6969 1 year ago +1

    I really don't wanna hate, but the bit at 3:20 about how it would help weak machines doesn't apply now, since DLSS 3.0 is tied only to the RTX 4xxx series cards; maybe in 30 years.
    But the other part, about enabling insanely high res and framerates on powerful machines, is very cool.
    DLSS 3.0 sounds VERY cool, but I just hope AMD makes a version of these algorithms untethered from new GPUs.

  • @matthewcullen3486 1 year ago

    Input lag is not the measure of framerate; the user experience is, and that is subjective. Some prefer smoothness, some prefer input speed, and we tailor ourselves around that preference. Many of us like both, and this technology seems to support both. For example, something like G-Sync will have more input lag than without, but for many it will be worth it for the immersion of frame smoothness and consistency. Same deal here. And this is coming from a guy that hates TV interpolation, yet understands this is a different beast.

  • @rumbleinthejungle3358 1 year ago

    Thanks for the upload

  • @GamingXPOfficial 1 year ago +1

    This will be insanely good for VR. I don't care about it for regular games, since I'm not dealing with delay in my comp games, but for VR I don't care much about the delay.

  • @gamingmarcus 1 year ago

    I feel like we're going down the route of TVs advertising 5000 Hz through interpolation.

  • @justindressler5992 1 year ago

    VR headsets use it to avoid nausea. Oculus calls it Space Warp; the industry calls it reprojection. Some implementations are better than others; it depends on whether they use raw motion vector data. They all introduce some level of artifacting and latency.

  • @erikm9768 1 year ago +1

    4:28 The thing with VR, though, is that although 100-ish fps is ideal, lag is very noticeable when you move your head around and can induce motion sickness if it's present even a little bit, so I don't think this is ideal for VR either.

  • @ogrodniczek3836 1 year ago +5

    I don't understand. Why bother getting 120 fps if it doesn't feel like 120 fps should?

    • @HeloisGevit 1 year ago +1

      Because your eyes get to enjoy a 120fps image, which is way smoother to look at. It won't feel any worse than what you could achieve without it, so what's the downside?

    • @iurigrang 1 year ago +1

      Not all the benefit of 120 fps is low-latency input; a lot of it is a smooth-looking image. And in that respect, it will look like 120 fps should.

    • @iurigrang 1 year ago

      Also, something not discussed: Nvidia Reflex usually decreases input lag by more than a 60 fps frame, so using it at 60+ fps will probably still result in a lag reduction.

    • @BillCipher1337 1 year ago

      In games with low interaction, like Flight Sim, it will be great.

  • @MFKitten 1 year ago +1

    What I personally miss is VR technology making it back into regular gaming. Asynchronous reprojection and all that stuff, for example, is brilliant.

    • @mariuspuiu9555 1 year ago

      We are on the cusp of hardware being good enough and cheap enough.

  • @AnimeUniverseDE 1 year ago

    A lot of TVs have a motion interpolation feature, but I think Samsung deserves a special mention here. They are the only TV brand to offer a low-latency, gaming-optimized motion interpolation feature, called "Game Motion Plus". It works only if you send the TV a non-VRR 60 Hz signal, and only has ~30 ms of input delay.

  • @kameleon_rap 1 year ago

    3:00 I can't freakin' believe it's the voice of that guy xD

  • @doerkdev1l 1 year ago +3

    As somebody put it so fittingly under a different video on the topic: DLSS is so amazing, it even upscales the prices!

  • @AngryApple 1 year ago

    My prediction: because DLSS already introduces more latency to frame rendering than native resolution, with the newer Tensor cores they've reduced the upscaling latency to such a degree that inserting an artificial frame doesn't increase the latency to a noticeable extent. The new Optical Flow Accelerators are MUCH faster than what's inside Turing.
    So yes, of course it will be a fake 120FPS without the responsiveness of a 120FPS image. It looks smoother and feels smoother regardless.

  • @conyo985 1 year ago

    They should give an option to disable frame interpolation so that you still get the benefit of image upscaling.

    • @Mintor94 1 year ago +1

      You get that by switching from DLSS 3 to 2, pretty much.

  • @sashamakeev7547 1 year ago +1

    I guess it shouldn't be aiming at generating new frames to get from 30 to like 60, BUT should instead be working towards converting an existing ~60 fps to 144/165 fps, since with a high-refresh monitor I tend to lower graphics to get at least 144. Maybe DLSS would make it unnecessary to lower graphics while still getting high fps with little to no input delay.
    My only hope is that DLSS 3 won't make the image as smudged and smoothed as DLSS does now.

  • @ErrorCDIV 1 year ago +1

    Got excited about future videos of yours when DLSS 3 was announced.

  • @Aidiakapi 1 year ago

    The latency in VR has been mentioned plenty of times, so I don't think this helps. Besides, VR already has reprojection to deal with stutters in the source material, because the image *has* to move in response to your head moving, unless you want many people to get sick.
    On an unrelated note, you'll have less artifacting with NVIDIA's approach than with traditional frame interpolation, because it doesn't have to estimate the motion vectors; it gets them from the engine.

  • @GrandHighGamer 1 year ago

    I think the value here, given it's a feature of a flagship £1600 card, is in doing ridiculous 4K 144fps gameplay, rather than aiming it at 60fps where the input lag issue would presumably be much more noticeable (along with any visual artifacts).

  • @dadgamer6717 1 year ago

    Thanks as always

  • @luisaazul 1 year ago +2

    Frame interpolation will just add a ton of input lag; it doesn't belong in interactive media like games.

  • @Henrix1998 1 year ago

    About VR: they won't love this at all. The added latency is way more noticeable in VR compared to monitors.

  • @callumsmile 1 year ago +2

    Could you use DLSS to interpolate frames for games with locked frame rates, such as old emulated games? That would be a real game changer.

    • @TheDravic 1 year ago

      No, because those games don't provide the suite of data that DLSS 2 needs to function (and by extension, DLSS 3), for instance motion vectors (see the sketch after this thread). These need to be fed to the GPU, so they require the game developer to implement DLSS.

    • @callumsmile 1 year ago

      @@TheDravic But could newer emulators like Dolphin add these things, or would Nvidia not allow that?

    • @TheDravic 1 year ago +1

      @@callumsmile I don't know how the emulator would be able to add motion vectors, among other things; it's a pretty complex concept for such old engines, unfortunately.
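
For context on why emulators can't easily do this, here is roughly the per-frame data a DLSS 2 integration supplies (color, depth, motion vectors and camera jitter, per Nvidia's public integration material); the field names and shapes below are invented for illustration, not the real API:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class DlssFrameInputs:
        color: np.ndarray             # low-resolution rendered frame
        depth: np.ndarray             # per-pixel depth buffer
        motion_vectors: np.ndarray    # per-pixel screen-space motion, from the engine
        jitter: tuple                 # sub-pixel camera jitter used for this frame

    h, w = 540, 960
    frame = DlssFrameInputs(
        color=np.zeros((h, w, 3), dtype=np.float32),
        depth=np.zeros((h, w), dtype=np.float32),
        motion_vectors=np.zeros((h, w, 2), dtype=np.float32),
        jitter=(0.25, -0.25),
    )
    print(frame.color.shape, frame.motion_vectors.shape)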

  • @RuskiRozpierdalacz 1 year ago +1

    Oh damn, I didn't pay enough attention to the presentation, and thought it predicts the new frame based on the last few frames; with a high enough input framerate (90 and above), I believe that could work. But I rewatched that part, and you are right. This implementation seems terrible for anything other than MS Flight Sim or similar games. Also, I think it would actually make more sense if implemented in the driver layer, to help old single-threaded games that are highly bottlenecked by CPUs (we saw this in Crysis). My guess is they are working on it already.

    • @TheDravic 1 year ago

      It could be a combination of both, and 2kliksphilip doesn't talk about that possibility at all here.
      You could have DLSS 3-generated predictive frames, based on the last couple of sequential frames, that are only generated and sent to your monitor while the next DLSS 2 frame is being prepared, this way maximizing the benefits of both interpolation and extrapolation by being smart about it. Latency still goes up, but not as much; artifacting can still happen, but not nearly as badly; and you still get to see tons of "real" frames, as real as DLSS 2 frames are.
      This is likely what's actually going on with DLSS 3, but I can't say for sure until we get more information.
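
A crude pixel-space stand-in for the "predictive" half of that idea: extrapolate the next frame by continuing the change between the last two, so nothing waits on a future frame (no added latency), at the price of artifacts wherever the guess is wrong. Real systems warp by motion vectors instead of raw pixel differences:

    import numpy as np

    def extrapolate(frame_prev, frame_curr):
        guess = frame_curr + (frame_curr - frame_prev)   # linear continuation per pixel
        return np.clip(guess, 0.0, 1.0)

    f0 = np.array([[0.2, 0.4]])
    f1 = np.array([[0.3, 0.5]])
    print(extrapolate(f0, f1))    # [[0.4 0.6]], shown while the real next frame renders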

  • @clessili 1 year ago +1

    I'm not sure about the added latency. As I understand it, GPUs already queue up a bunch of frames in advance, and 3D engines have always sacrificed latency in favour of raw framerate (with things like async compute and multicore rendering), since it's the only metric g4m3rz care about. Also, an interesting thing is that it will not be more taxing for the CPU, since it won't even be aware of those fake frames.

    • @solaris5303 1 year ago +1

      > As I understand it GPUs already queue up a bunch of frames in advance
      Only when the GPU is processing frames slower than the CPU is able to submit them, which is (fortunately) becoming less and less common.

  • @omeruney3535 1 year ago

    Basically: use DLSS 3/frame generation if your performance doesn't cap out your monitor's refresh rate, for the smoothest experience, but don't use it if your monitor can't make use of the new frames, as there is no benefit from it at that point and it will only increase input lag.

  • @LucasCarter2 1 year ago +32

    My prediction is that I won't ever be able to use it, because I think I'm done with Nvidia after my 3080 not getting support for DLSS 3.

    • @chupamelasbolasregem 1 year ago

      How? As far as I know, the 40 series has a dedicated part of the die for that.

    • @BillCipher1337 1 year ago

      And asking $1200 for the new 4080.

    • @speedstyle. 1 year ago

      It uses new application-specific hardware (optical flow accelerators) to interpolate without too much computation or artifacting. But with the input lag etc., I don't see why you'd even want it.

  • @Demmrir 1 year ago

    You mentioned VR, so I have to pop in and say that predicting future frames based on past frames (which is better than what DLSS 3 does) is exactly what Asynchronous Spacewarp (Oculus's tech) does. Based on head movement, previous frame data, etc., it generates brand-new frames to double FPS without introducing any latency. Of course, the downside is visual artifacts, as mentioned. But it does mean that DLSS 3 is way less revolutionary than, say, DLSS 2 was.

  • @timtemple8953 1 year ago

    3:56 Hey look, the DLSS 3 propeller disappeared... don't worry about that... just pay $900 for a 70-series GPU; based on its FPS it's definitely the next 80 series... put in your preorder before we actually send review samples out, though... they won't last long...

  • @EVPointMaster 1 year ago +4

    2:32 This is much less of a concern with DLSS frame generation, because it gets extra information from the game, like motion vectors, so it knows how things are moving between frames as well.
    Also, like you said, the framerates are already fairly high in comparison to the videos that interpolation would usually be used on, and because it only seems to be generating 1 frame in between, any interpolation error will be corrected in the next frame, making it much less noticeable in motion.
    If you want to see what the artifacts look like, though, watch Digital Foundry's DLSS 3 preview and go frame by frame before Spider-Man jumps out of the window.

    • @ChrisD__ 1 year ago

      That, and for reducing input lag they might also be using player input in the interpolation process, like the proper frame interpolation the Quest 2 uses, where input is resampled right before the image is rendered.

  • @xybnedasdd2930 1 year ago

    Reflex can decrease latency by 15-30+ ms. That's 1-2 frames' worth at 60 FPS. So using DLSS 3 at 60 to get to 120 could easily give you the same or better latency than using a console, or than not using Reflex.

    • @xybnedasdd2930 1 year ago

      @@2kliksphilip Yes, exactly: DLSS 3 increases it (naively by 2 frames, though Nvidia claims it's a bit more than one; not sure about the specifics there), and Reflex decreases it, so in Nvidia's words it should end up roughly equivalent to not using DLSS 3 and not using Reflex (or slightly worse).
      Just using Reflex will give better latency, of course.
      Though I would dare claim that the vast majority of users would not have a problem with latency such that they would see a benefit in "downgrading" to DLSS 2 and still using Reflex, as at the end of the day the latency still ends up being (noticeably) better than using a console, and we know how much (or how little) people care about it there.

  • @TheTonVeron 1 year ago +1

    Hmm, you mention generating new frames when there's a stutter in VR, but that sounds almost the same as the motion smoothing Oculus/Meta is doing. Asynchronous Spacewarp does help with small stutters, but it can be very noticeable.

    • @mar2ck_ 1 year ago

      SteamVR also does this. Async warp just takes the last frame and warps it to match where the headset is a frame later. There are lots of artifacts because, AFAIK, no motion vector data is being used, just the headset motion. DLSS has access to that plus the next frame, so it has a lot more data to work with.

  • @jforce321 1 year ago

    DLSS 3 is just a way for them to advertise performance gains in a way we've never seen before. More frames used to always mean lower input delay, since each frame reached the screen sooner. Now they can basically fake performance gains while keeping the overall input lag of what the game was before they added the frames.

  • @leoliu3134 1 year ago

    IMO this feature will be LESS suitable for low framerates, unless it's a slow-paced game like MSFS, because frame interpolation naturally performs worse as the distance between frames increases.

  • @Verpal 1 year ago +2

    Personally, I had been playing around with the Optical Flow SDK for quite some time before the announcement, testing it on my RTX 3060, and it looks... not good enough, and I heard Turing is even worse. If the optical flow accelerator in Ada only increases the speed of computation but not the actual quality, yeah, no thanks, I'm not going to use those "fake frames". Alternatively, if the quality only becomes acceptable when you are already doing 120fps, I would rather take my 120fps game with lower input lag than a 200+ fps game with no actual perceivable benefit.
    With all that being said, if NVIDIA can pull off another magic trick and manage to achieve acceptable quality from a 60fps source, good job; DLSS 3.0 will be immensely useful if that's the case.

  • @_Egitor 1 year ago

    I love it when AI touches things :)

  • @edragyz8596 1 year ago

    They've shown it off in Overwatch at around 360-400 FPS, and the render latency was 4-6 ms, when a normal 360-ish game like CSGO runs at 2-4 ms. That's pretty solid, but still not native.
    In Cyberpunk they were getting more latency than I was, despite having 50% more frames: 50 FPS vs 90 FPS, and 59 ms vs 66 ms. Not great.

    • @NiktoTheNobody 1 year ago

      It's a feature that's useful only if you can already run a game at high frame rates but aren't quite hitting your refresh rate on a high-RR display, for example if you hit the 120-144 range but have a 240 Hz display. Gameplay feels smooth and you get even smoother video output. And of course, it's completely useless for anything competitive multiplayer, where you actually need more real frames as opposed to simply smoother visuals.

  • @lickrish3930 1 year ago +2

    The new DLSS is the old VR trick; it might work fine on a headset, but it's going to feel awful on a monitor or a TV.

  • @hentosama 1 year ago

    Therapist: "15 to 60 fps interpolation 2kliksphilip doesn't exist"
    This video: 2:57

  • @Mistereee 1 year ago +7

    DLSS is getting wild

  • @KalkuehlGaming 1 year ago

    Let's see if DLSS 3 is optimized enough. I guess it will be like ray tracing on the 20 series: a feature that will work well for future generations.
    Time will tell.

  • @paulbunyangonewild7596 1 year ago

    "DLSS: features that would be most beneficial on lower-end hardware, reserved for only the latest and greatest."
    Feels like giving fabulously wealthy celebrities free stuff.

  • @resurgam_b7 1 year ago +6

    At what point does calculating all the AI-generated frames start to take more time than just rendering more frames the old-fashioned way? Maybe I just don't have a deep enough understanding of what math the computer is doing for each frame, but it seems like there would be a point where rendering a new frame costs less computation than analyzing already-rendered frames and generating a fake one.
    I feel like this technology could be used to great effect on prerendered media, CGI in movies and such, where they currently spend days or weeks rendering complex scenes at high framerates; they could use this to cut down the render time. In that situation they could balance the fidelity loss against the time gain and come out with a shot that still looks perfect, but which took substantially less time for the computer to crunch, allowing for a more efficient workflow.

    • @shukterhousejive 1 year ago

      The AI techniques use the parts of the GPU die that are only there for compute acceleration. Instead of maxing out the die with graphics cores, they mix in the other cores and tack stuff like DLSS onto them for gaming use. In theory they might've been able to get more raw performance from using all graphics cores, but Nvidia wants to make as much money as possible from the high-margin scientific and datacenter markets, so putting non-graphics parts on their low-end cards as a consumer outlet for those markets is probably a good strategy for them.

    • @egesanl1 1 year ago

      For that, the CGI render time must actually cost something. Today, just throwing more money at it solves the problem, since it is not that much money; at least not for the mainstream media we have.
      Might be useful for smaller-budget media, though.

    • @Willie6785 1 year ago

      One thing you may not be taking into account when it comes to just doing it the old way is that each frame contains all the updated information from the game itself, so it has to do shadow calculations, sometimes physics, etc., just to generate that new frame. This method takes two finished images and finds a middle ground in between, without having to do all the aforementioned calculations, so it's much less taxing, especially for more visually complex games.

  • @JessicaZane4realz 1 year ago +10

    This is so confusing. I'm getting it though.

    • @goob8945 1 year ago

      Kliksphilip videos always make these things easier to understand, for me at least.

    • @fawkkyutuu8851 1 year ago

      You're getting the $1600 4090?

  • @alistermunro7090 1 year ago

    It's part of DLSS 3 because it requires the same data that DLSS already uses, e.g. motion vectors.

  • @ashishsb9075 1 year ago +1

    Now waiting for PS VR2 predictions. ✌

  • @ddnguyen278 1 year ago

    It's like those TVs which have framerate upscaling: they interpolate frames, though in this case DLSS is probably predicting them. However, it's not going to get around the issue that when a real frame comes in during this "fake" frame, it's going to have to delay that frame until it's done, or otherwise screen-tear. The reason games don't do this is that it messes with frame pacing and creates a jello-like control feel. There was a GDC paper about frame interpolation ages ago, but it never panned out. Nvidia just needed another "feature" to sell people on, and this was the lowest-hanging fruit in their bucket. Ray tracing is played out since AMD has that too. The problem is Nvidia stuffed their cards full of Tensor cores for deep learning, and they just don't have a use in gaming... so people have to pay more for essentially less gaming performance, thus the need for all these upscaling technologies, i.e. DLSS.

  • @Veptis 1 year ago

    My main issue is the pacing. You take frame0 and frame1 to generate frame0.5... which is older than frame1, which you could show instead. It might look more 'fluid', but it will be more delayed, with more input lag, Nvidia Reflex or not.
    What I hoped this might have been is the engine rendering only some of the streams (for example, only depth and motion vectors) for frame0.5 and then full graphics for frame1 again, with the hope of combining the full frame0 and the reduced information of frame0.5 into an alright-looking frame0.5 that gets done before frame1 is fully rendered. Hence the idea of having fake frames in between that are actually available before a newer frame is ready.
    From Nvidia's words and the testing videos, and with no research paper, it's still unclear to me what is actually happening. Perhaps you'll figure it out in the future.

  • @VADemon 1 year ago

    At first nVidia subverted the benchmarks (cheating in Futuremark's 3DMark).
    Then nVidia subverted trust in forum participants (that's where "nVidia shill" comes from).
    Now they subvert FPS as a performance metric. And that's why you can't turn it off.
    As a technical reason behind this: DLSS 3 or whatever other post-processing was so slow, they needed a boost. Here it is.
    Though honestly, there are two reasons for high framerates:
    - input lag (this is bad for it)
    - motion fluidity (this is good for it)
    For example, Unreal Engine already HAS an unreal amount of input lag. I can guess other modern engines are similar. No way I'd add even more to it!

  • @aussieknuckles 1 year ago

    I understand a lot now.

  • @fhjunior6183 1 year ago

    Thanks for the vid

  • @Keivz 1 year ago

    I don't know if the maths works out to approximately double the frame rate if the interpolated frame is generated after two frames are rendered. At best I'd think you'd get a 50% boost in frame rate.
    Example with a base of 60 fps:
    1. 16.67 ms to render frame 1
    2. 16.67 ms to render frame 3
    3. 0 ms (for the sake of argument) to render frame 2 (the interpolated frame)
    4. Frames 1, 2, and 3 are displayed on screen for 11.1 ms each, given the 33.33 ms taken to render the three frames in total.
    An 11.11 ms frametime equals 90 fps, so a boost from 60 fps to '90 fps'. Nvidia is claiming a bigger boost, but I dunno how they're accomplishing that.
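
Working through that arithmetic: counting one generated frame per pair of rendered frames over a single two-frame window does give 3 frames per 33.33 ms, i.e. 90 fps; but in steady state every rendered frame gets one generated partner, so the display sees 2 frames per 16.67 ms render interval, which would be the doubling Nvidia claims (paid for with the extra latency discussed elsewhere in these comments):

    render_fps = 60
    window_ms = 2 * 1000 / render_fps          # 33.33 ms for two rendered frames

    first_window = 3 / window_ms * 1000        # 3 frames in the first window -> 90 fps
    steady_state = 2 * render_fps              # 1 real + 1 generated per render interval
    print(f"{first_window:.0f} fps over one window, {steady_state} fps in steady state")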

  • @Nogardtist 1 year ago

    Shout-out to the people still using GTX 1000 series cards, which most likely don't have access to Nvidia DLSS, let alone ray tracing.
    AMD's FidelityFX Super Resolution surprisingly works on my craptop's 1050 Ti, in only a select few games; as for the results, well, I haven't tried it enough to say.

  • @42crazyguy 1 year ago

    It's really telling about the performance of this new generation that they're putting so much time into filling the gaps of bad performance.

  • @BrightPage174 1 year ago

    2:30 The most cursed Snapchat filter

  • @VioletMiracle 1 year ago

    According to NVIDIA, we need real-time ray tracing more than we need real-time rendered frames.

  • @deathdoor 1 year ago

    Imagine if, in a few years, it becomes common to recommend that Nvidia users disable "motion smoothing", just like "we" have to do with TVs.