[4K] Nvidia RTX DLSS Analysis: AI Tech Boosts Performance by 40% - But What About Image Quality?

  • Published 19 Sep 2018
  • Is this too good to be true? The tensor cores in the new Nvidia RTX cards allow 4K gaming to be accelerated by 40 per cent in terms of performance. How does it work and is there a catch? John, Rich and Alex share their thoughts.
    Subscribe for more Digital Foundry: bit.ly/DFSubscribe
    Join the DF Patreon, support the team and get pristine quality video downloads! www.digitalfoundry.net
  • Games

COMMENTS • 1K

  • @Kal360 5 years ago +333

    I remember a simpler time when we were shifting from 720p to 1080p and people on forums claimed AA isn't needed at higher resolutions like 1080p...

    • @Michael.Virtus 5 years ago +48

      You are right, but it's still a fact that aliasing becomes less and less of a problem the higher the resolution gets. I remember playing at 800x600 on my 15" monitor and it was so bad without AA. Now at 1440p I rarely turn AA on; it's not so bad without it. I prefer maximum fps (at 144 Hz). AA will always look better, but the problem is the performance hit. Some people are okay playing at 60 fps/Hz, but I am done with that standard. A high-refresh-rate monitor is amazingly smooth on the eyes.

    • @JayJapanB 5 years ago +17

      Back when people would just read 1080p on the back of their 360 game and be happy with their life...

    • @webinatic216 5 years ago +6

      I super sample 4k to 1080p just because the enemies are then more visible from a distance. It's good enough.

    • @nomercy8989 5 years ago +14

      I remember the exact same thing when 4k became the big new thing. That we wouldn't need AA any more...

    • @axolet 5 years ago +26

      You have to consider that games from that era were less detailed and more blocky. Aliasing is more prevalent with small, detailed objects such as blades of grass and tree leaves, and with long draw distances, etc.

  • @v4tv 5 years ago +597

    I am more impressed with DLSS so far than I have been with RT.

    • @289kman 5 years ago +18

      That's because ray tracing is early and not ready yet. Forget about ray tracing for now.

    • @Luredreier 5 years ago +29

      +xGeoThumbs Likewise.
      Don't get me wrong, it's not going to make me buy Nvidia hardware anytime soon.
      But it's definitively impressive technology.
      And I hope AMD brings something similar to the table soon too. =)

    • @kable_U 5 years ago +1

      DLSS could be impressive. There has been no evidence of this so far though, and there won't be until it gets implemented into actual games.

    • @v4tv 5 years ago +13

      Noobish It seems to me that competition from a rival company is good for the consumer.

    • @HybOj 5 years ago +7

      too bad DLSS is blurry as hell

  • @CreativGaming 5 years ago +340

    What I like about Digital Foundry is that they make NO JUDGMENTS and don't tell people how to SPEND their money. They use objective reasoning when examining new technology. They pretty much just give you the entire picture and let you decide for yourself.

    • @TheLoucM 5 years ago +33

      Exactly this. I hate when tech channels turn into buying guides.

    • @fcukugimmeausername 5 years ago +14

      Just like Toms Hardware!

    • @smokeydops 5 years ago +18

      Yeah, these guys are cool; not so much "reviewers", but analysts. I like it.

    • @madfinntech 5 years ago +3

      Exactly! The very reason I keep watching them, even after they say things like "smooth 30 fps".

    • @Flamevortex-ky6xh 5 years ago +3

      30fps is smooth with good frame pacing and motion blur; that's why the new Spider-Man game without motion blur makes my head hurt.

  • @stewardappiagyei6982 5 years ago +37

    Hello Richard, everyone from Digital Foundry here.

  • @Raph920 5 years ago +139

    I hope to learn how much of an improvement DLSS is at 4K vs no AA at all.

    • @Luredreier 5 years ago +11

      Probably about 30-40%
      Essentially this is like comparing 1440p *with* high end AA vs 4k without high end AA in terms of GPU workload...

    • @BecomeMonketh 5 years ago +1

      If it's faster than TAA by 40%, then it will be faster than no AA by 50% ish.

    • @liamness 5 years ago +16

      Even with 4K you would probably want some kind of AA, if only to get rid of shimmer.

    • @icy1007 5 years ago +4

      I'd rather have native 4K with no AA.

    • @Psychonaut-im3zz 5 years ago +1

      DLSS makes 4k relevant (it gives a superior, crisper image; a de facto Nvidia standard). Other AA solutions are done in software; now you have an on-the-fly hardware AA solution. It is superior by definition. If anything, it is the secret sauce for ray tracing (aside from the rasterization speedup). Go a step further: NVLink (2x 2080/2080ti) is the secret sauce for ray tracing.

  • @antraxbeta23 5 years ago +48

    Maybe my eyes are bad, but in a lot of DLSS samples the image is blurry.

    • @Mr.Beavis667 5 years ago +2

      I thought the same...

    • @mjc0961 5 years ago +8

      Your eyes are fine. Upscaled images are always blurry.

    • @gabrielst527 5 years ago +2

      It's upscaling (Nvidia said it in the presentation PowerPoint), but it uses AI to make the lower-resolution image look better. I think this could work if you made a 1440p picture look 4K (since the difference is not that huge on a gaming monitor), but on a really big screen I think the quality loss with DLSS would be noticeable.

    • @sacb0y 5 years ago +4

      Depends on the scene: some things looked blurrier, mostly object edges; other stuff looked a lot clearer, like the backgrounds; and things like hair looked a lot cleaner.

    • @marverickbin 5 years ago

      They use the 4K rendered images to train the network to supersample the images. A better metric would be comparing the quality of the original 4K images used for training against the DLSS results.

  • @DatGrunt 5 years ago +126

    I'm more excited about DLSS than raytracing but the price. Like god damn. AMD please step up the competition.

    • @Luredreier 5 years ago +14

      +DatGrunt Yeah, I *really* hope that AMD implements some variation of DLSS for their own cards soon.
      Would be a great upgrade for me and my RX 480 =)

    • @paul1979uk2000 5 years ago +2

      It wouldn't surprise me if AMD are already working on something like DLSS for the next gen of consoles and PC GPUs. Ray tracing is a different story altogether, and I really hope AMD works on that as well. The only real indicator that they might be is that Microsoft presentation a few months back where Microsoft talked quite a bit about ray tracing; it suggests the next Xbox console could have ray tracing in it, which bodes well for both PC and console gamers, because for this kind of tech to be adopted, consoles will need to support it as well as AMD hardware.

    • @DatGrunt 5 years ago +2

      There's also Intel releasing their dedicated GPUs like in 2020. I hope they're not underwhelming.

    • @lawthirtyfour2953 5 years ago +1

      Been waiting for raytracing for decades. I can look past the very first games taking performance hits and see how amazing it is.

    • @Dick_Valparaiso 5 years ago

      Paul Aiello I definitely think AMD is working on their own versions of DLSS and RTX. They're gonna need as many techniques as possible to utilize in the next console gen. I think AMD's partnership with MS and Sony is a big motivator for them to innovate. We might not be seeing the fruits of that "motivation" with Vega atm. However, Sony are working directly with AMD on Navi. So, when you think about the Pro's heavy utilization of checkerboard rendering (some on the X as well), DLSS is the obvious next step in that evolution. ...I'm also curious to see if Nvidia's new tech is utilized in the rumored next iteration of the Switch.

  • @huskuas643 5 years ago

    Extremely informative video, thanks a bunch for this one. And to your editor, I really commend them for using original Unreal BGMs for this video, many, many props to that.

  • @machielste1 5 years ago +7

    Finally a deep dive into what DLSS actually looks like and how it performs. You guys rock!

  • @OrangeRock 5 years ago +3

    So.. Does dlss work on 1080p?
    Just curious..
    Since in 4k it renders at 1440p
    Does it run 720p at 1080p dlss?
    Or does it only support 4k?

  • @symol30872 5 years ago

    Great video, very informative (Also love the sneaky Unreal music in the background 😉)

  • @Raigavin 5 years ago

    Great analysis, appreciate the effort taken to share your thoughts.
    One small side note, though: the video warping effect between screen transitions hurts my eyes, and the transitions just happen to come right when I'm focusing on the minute details in the video :( .
    Again, thanks for the great video.

  • @Xilefian 5 years ago +46

    It kind of makes sense that frame reconstruction handles transparency better than TAA, as it's a treatment on the final image (there's no concept of transparency), whereas TAA is applied earlier on using information from the previous frame and current motion vectors, which are completely ruined by transparent surfaces (which frankly should write zero motion, as transparency really does screw up screen-space motion).
    My curiosity is: how is shader aliasing handled by DLSS? Would it emphasise the issue or leave it alone?
    As for how this works, I imagine it's just training an AI to reconstruct the 64x super sample image from a 1/64th sized input image - like any other AI training set.
    I get criticised for mentioning this often, but I think the Waifu2x AI would be a good example to look at - this AI was trained by showing it a low resolution piece of artwork and seeing if it can reconstruct the full resolution source - once it gets all tests 100% perfect, the AI is assumed to be good at up-scaling.
    EDIT: 17:45 shows that shader aliasing is still an issue.
    EDIT2: Single frame was a guess at how the AI works - only NVIDIA knows what kind of data set the AI is trained with. It could be that it takes multiple frames of previous input, but that just seems too bandwidth-crazy to me. Single frame is good enough; I know this from doing single-frame up-scaling tests with Waifu2x in the past (with the idea of "this would be great for games in the future" - well, I guess the future is here...)
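
    For intuition, here is a minimal sketch of that Waifu2x-style training recipe in PyTorch: a tiny network learns to reconstruct a higher-resolution frame from a down-sampled input. The network, names, and synthetic data are illustrative assumptions, not NVIDIA's actual DLSS model or training set.

    ```python
    # Hypothetical sketch of the training idea described above: teach a small
    # convolutional network to rebuild a high-res frame from a low-res input.
    # NOT NVIDIA's DLSS model, just the generic super-resolution recipe.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TinySR(nn.Module):
        """2x upscaler: bilinear resize plus learned residual detail."""
        def __init__(self):
            super().__init__()
            self.body = nn.Sequential(
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 3, 3, padding=1),
            )

        def forward(self, lo):
            up = F.interpolate(lo, scale_factor=2, mode="bilinear", align_corners=False)
            return up + self.body(up)   # learn only the missing detail

    model = TinySR()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for step in range(100):
        # Stand-in data: in reality the target would be a heavily supersampled
        # ground-truth frame and the input its lower-resolution version.
        hi = torch.rand(8, 3, 64, 64)
        lo = F.avg_pool2d(hi, kernel_size=2)
        loss = F.l1_loss(model(lo), hi)      # reconstruction error
        opt.zero_grad(); loss.backward(); opt.step()
    ```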

    • @Luredreier 5 years ago +5

      Yeah.
      The tech is impressive.
      Looking forward to AMD's implementation.
      I'll probably hold off on upgrading my GPU till they get the same tech. ^^

    • @Xilefian 5 years ago +4

      It's just a shame that very little has been done with AMD's rather beefy compute hardware to allow space for things like AI and neural network training. The hardware is there - just that the API isn't. They bet on OpenCL which just wasn't as easy to use compared to CUDA - and now OpenCL is largely dropped.
      I think AMD have been awaiting a strong Vulkan compute effort, which currently is marred by SPIR-V compilers being a little too disconnected from already existing tools (the situation is improving every week, though - it's a slow community effort). After that it's a case of finding a model that best fits AMD's hardware.
      I've heard that AMD have had a lot of success with developing private tools that can compile a Caffe (Deep learning framework) CUDA application for a specialised AMD GPU compute workload and have excellent performance, but I'm yet to see any of AMD's efforts published as usable tools. It's likely that AI workloads don't fit current AMD GPU hardware and more development is needed.

    • @AkshayKumarX 5 years ago +2

      I've been using the Waifu2x context menu utility for a long time and it serves its purpose well. It's quite impressive that they've managed to make new hardware that applies deep learning super sampling techniques to real-time high resolution frames, and does it rather well in what seems to be its first iteration, I would say.
      I'm assuming you already know about this, but if you don't, look up "Two Minute Papers" on YouTube and take a look at some of the recent videos if you want to know more about deep learning based super resolution images and more. Great stuff :)

    • @giancarloriscovalentin1865 5 years ago

      Amazing Deep Learning technique

    • @raytracemusic 5 years ago +1

      woah - cool stuff imma check

  • @Aveok 5 years ago +9

    With the 1080 Ti the leap was so huge you didn't even know where to start explaining the advantages. With the RTX, even DF has to search for some small benefits to justify the purchase.

    • @miguelpereira9859 5 years ago

      That's the result of Moore's law going out the window, most likely.

  • @NateHancock 5 years ago +1

    I love the discussion and insights made in this video. Great job DigitalFoundry!

  • @biglevian 5 years ago +39

    Sounds promising. My next GPU upgrade will be in 2-3 years, so I hope RayTracing and DLSS are ready and available in Midrange then.

    • @TroublingStatue 5 years ago +4

      Yeah, the only _released_ game that has RTX right now is Shadow of the Tomb Raider. And even then it wasn't available at launch; it's coming later in an update.

    • @Luredreier 5 years ago +2

      +biglevian
      Yeah, and I hope AMD gets something equivalent to DLSS too by that time...

    • @TerryMartinART 5 years ago

      What's AMD making?

    • @Luredreier 5 years ago +2

      +Terry Martin We don't know exactly what they're making for 2-3 years into the future.
      Another version of Vega should come out this year (probably only for prosumers etc), and Navi is expected next year (2019); according to the rumors it's intended as a midrange option.
      So I'm hoping for a card like the 2080 or 2070 at lower prices as the top of the line from AMD, although I wouldn't be surprised by a card similar to a 1080 costing the same as a current-day 580.
      After Navi all we can see on their roadmap is "Next Gen" (roughly 2020).
      The rumors point towards something like the multi-die approach used for Ryzen.
      But no one really knows for sure what they have in mind...
      Machine learning has been a thing for a while, and there are already some Vega cards with additional machine learning instructions added.
      I wouldn't be surprised if this improves further with Navi, although I don't know if they'll improve the machine learning capabilities enough to support something like DLSS already by Navi, or if that is something we won't see till the "Next Gen" cards.

    • @TerryMartinART 5 years ago

      Thank you for that info.

  • @doublevendetta 5 years ago +8

    I'm a fan of reconstruction techniques like this. I'm a fan of enabling DRS in more games. I'm a fan of having a NON-dynamic scaling slider for those who want it. What I'm NOT a fan of is the pricing of these cards, especially after seeing what is essentially complete confirmation of my suspicions about their typical raster performance over Pascal. The new tech is cool, but it's not widely implemented yet, and this very much feels like paying an "early adopter" tax on the order of several hundred dollars. Until we see something at the driver level along the lines of Nvidia's DSR algorithm, I can't see these as a good buy in terms of value per dollar for most anyone.
    Relying on developers to individually implement all of this means the big-budget ones will probably get things out at a decent clip. But AS an indie developer who's kept my eye on all of these things, I can tell you that given the current, very short list of dev software that even SUPPORTS this, we couldn't implement almost any of it into our workflow even if we wanted to. And the pricing scheme puts any of these higher-tier pieces of silicon outside of what I would call "sensible spending," even for our most optimistic budget projections.

  • @thibautnoah2599 5 years ago +3

    That was really interesting, especially the part where you talk about how DLSS could let RTX run without too much strain at 4K.

  • @Khaotika 5 years ago +2

    When you're pointing out the blurriness of TAA, have you taken into account that the blur might be intentional?
    For example, at the beginning of the FFXV benchmark demo, when they roll out the car and there are rocks and bushes in the background: assuming they didn't completely alter how it should look from the public version, the blur is an intentional DoF effect that's present in the public version even without AA.

  • @b01scout96 1 year ago +2

    Today my RTX 4090 reached over 11,000 points in the FFXV benchmark (maxed, UHD, DLSS). Compared to about 3,000 points in this video, this is astounding! 😀
    And DLSS 2.0 significantly increased the image quality. A shame the game never received the new version, though.

    • @w061a09 8 months ago

      And it's even less likely now with what happened to Luminous lol

  • @christian-johansson 5 years ago +15

    Question: Since DLSS renders at 1440p, wouldn't it be more fair to compare DLSS performance to 1440p TAA? Even if DLSS outputs 4K pixels, it's not really 4K level of detail, right? Or does DLSS infer missing detail from a lower res image *as well* as removing artifacts?

    • @pipyakas 5 years ago +10

      No, it uses a base resolution of 1440p but fills in the missing information of a 4K frame via deep learning, not traditional rendering. So technically you could say it's rendering 4K but culling a significant number of the pixels to render, like how checkerboarding culls half of them on consoles.
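
      As a rough sanity check on those numbers, the pixel arithmetic works out like this (plain arithmetic, not figures from the video):

      ```python
      # Back-of-the-envelope pixel counts for the comparison above.
      native_4k = 3840 * 2160    # 8,294,400 pixels
      base_1440p = 2560 * 1440   # 3,686,400 pixels

      print(f"DLSS shades ~{base_1440p / native_4k:.0%} of a native 4K frame")  # ~44%
      # Checkerboard rendering, by comparison, shades 50% of the pixels per frame.
      ```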

    • @Elios0000 5 years ago +2

      It would blow away upscaled 1440p on a 4K monitor; it wouldn't even be close.

    • @terradrive 5 years ago +11

      Just look at the image differences in this video: the DLSS 1440p upscaled to 4K has sharper textures than the native 4K TAA. That's really good upscaling by the deep learning.

    • @mjc0961 5 years ago +4

      Quy Nguyen "techically you can say it's rendering 4k"
      No you can't, because it's not rendering 4K.

    • @Brotoles 5 years ago +4

      The thing is, although it renders internally at 1440p, it's not just a simple upscaling. The final image has more detail than the original 1440p one because of the "inferences" from the tensor cores.

  • @OxRashedxO 5 years ago +18

    Unreal 1 music in the background! Great choice.
    👍🏻👍🏻👍🏻👍🏻👍🏻👍🏻

  • @davidcunningham6399 5 years ago +1

    Hi DF. Love the videos, but I'm noticing the penchant for "For sure" has recently been replaced with "Heavy". Technically, could you be more specific as to what element or load you are referring to? I can only assume DLSS or the other techniques in this particular video, but it could be memory, CPU, everything... or some appreciative slang term. Thanks. Do keep up the otherwise excellent work!

  • @xenoaltrax485 5 years ago +1

    Is pixel counting of screen captures still valid considering that multi-resolution shading has been in use since Maxwell and now you have variable-rate shading with Turing? How can you tell if the region you're counting pixels in wasn't subjected to a different shader resolution?

  • @ardentlyenthused338 5 years ago +3

    @DigitalFoundry Since you say DLSS is working off a 2K image and then "inferring" to 4K, I think it could be insightful to include a 2K benchmark. In this video you look at how much of a performance improvement you get relative to 4K, which is absolutely interesting, but it would also be informative to know how much of a performance penalty you pay compared to 2K; again, since the ultimate image is 2K + DLSS ~ 4K. How much does that DLSS part cost? Just saying, if you're looking for more DLSS videos or investigations to do while you wait for ray tracing stuff to come out, there's another question you could delve into!

  • @msironen 5 years ago +4

    I expect these 4K purists will also refuse to listen to any kind of compressed audio. It's gotta be that 1.4 Mbps CD Audio from the 80s (or better yet SACD) just so you won't miss any of those least significant zeros in the bitstream.

    • @NoobGamingXXX 5 years ago +2

      They can wait 25 years for hardware that will support their purist ideology.

  • @191desperado 5 years ago

    NOBODY else does this kind of material. This is why I’m joining your Patreon ASAP!

  • @sev2300 5 years ago +1

    When do we get to play games with the visuals of the first demo?

  • @madfinntech 5 years ago +11

    8:55 As a PC gamer and someone who prefers the actual resolution, whatever it might be, I have an open mind about reconstructed images IF the technique doesn't hinder quality. All these other techniques have major flaws that make it obvious they're not the real deal. I'll take it IF it can fool me into thinking it's the real deal. It doesn't matter what the technique is, the results matter, and something like the PS4 Pro's checkerboard technique doesn't pass as 4K to me.

    • @mjc0961 5 years ago +1

      "something like PS4 Pro's checkerboard technique doesn't pass as 4K to me"
      Of course not. It is objectively not 4K. It's a lower resolution image upscaled to 4K, which is not 4K no matter how many fancy techniques.

    • @Malus1531 5 years ago +1

      mjc0961 Isn't that true of DLSS too, just with a better upscaling technique?

    • @Hayreddin 5 years ago

      @@Malus1531 Not really. DLSS uses a 1440p input to produce a 4K output by inferring the missing details; it's not simple denoising/upscaling. That's why it's able to produce sharp borders and text: it recognizes them as such and "knows" that borders and text should always look sharp and clean. There might be other cases where the AI isn't as sure and might produce artifacts instead, maybe putting fake details where there should be none or over-smoothing something that should look crisper. The good thing is that this comes from an Nvidia supercomputer that's constantly being trained, and every time a new game is fed to the AI it gets a little better at doing its thing, so I guess game developers could provide better DLSS with patches over time, or it could be something Nvidia does via driver updates/GeForce Experience. I personally like the "painterly" look of DLSS and I'd like to see it in action in a real interactive game soon.

    • @Malus1531 5 years ago +1

      @@Hayreddin So how does that make anything I said false? It's still just a more advanced upscaling. All upscaling means is converting a lower resolution to a higher one, like from 1440p to 4K as you just said. And it's still inferior to native 4K, just like regular upscaling or checkerboard, only more advanced.

    • @Hayreddin 5 years ago

      @@Malus1531 I never said it was superior or even equal to native 4K, but saying it's "just more advanced upscaling" is not entirely correct either, because the image is not upscaled per se but "reimagined" in 4K. I can see it may just be a question of semantics, though, and my intention wasn't to prove you wrong or start an argument.

  • @l0lzor123 5 years ago +5

    Couldn't really see the image quality difference between DLSS and TAA until 9:37; damn, Noctis looks blurry there.

    • @anogrotter1985 5 years ago

      Keep in mind DLSS performs better because it's actually a lower resolution, so even if it looks the same it's really impressive.

    • @Luredreier 5 years ago

      +l0lzor123
      Yeah, it's impressive what they can do with lower-resolution rendering with this tech.

    • @sheikhrayan9538 5 years ago

      Taa looks so much worse, omg lol

  • @AdnanAhmed 5 years ago +1

    Having my tea and Richard talking about tech jargon - this is the life.

  • @shinranews 5 years ago

    What am I missing? I have an RTX 2080 card, and when I go to enable DLSS in FFXV Windows Edition there's no option for it anywhere. How do I enable it?

  • @rainZSlayer 5 years ago +3

    Lo and behold, "faux K" arrives to the PC master race. Such ground breaking technology!!

    • @HarryPearce7 5 years ago

      Console drones can't even perceive faults in their absolute parlor trick of a rendering method that is checkerboard

    • @rainZSlayer 5 years ago

      HarryPearce Harr harr... Faux K is the new revolution. Bow down to pcmr 😂

    • @e5m956 4 years ago

      I agree with Chatterjee. I'm PC master race and don't like the idea of rendering at a lower resolution and upscaling! I pay for 4K hardware, I want full-fledged 4K! In Shadow of the Tomb Raider with no AA at 4K the image looks much sharper than with DLSS, albeit with no AA you get some shimmering (which is also annoying), but AA blurs the nice sharp 4K image! :( But still, PC will have a much better framerate, and at least you have the option to use what you want (DLSS, no AA, TAA, this resolution, that resolution, turn this setting down, turn that setting up), whereas consoles' graphics settings are locked and you're stuck with what you've got. So yeah, even with console-like upscaling, PC is still the master race hehe. :)

  • @erdincylmaz4529 5 years ago +7

    It is much blurrier than TAA.

    • @killermoon635 5 years ago +2

      Look at 5:50: the background looks less blurry with DLSS.
      Also, the hair has no jagged edges with DLSS, unlike TAA.

  • @Zackemcee1 5 years ago +1

    Will DLSS support older gens? What about resolutions lower than 4K? (I might have missed some captions in the video, so don't lash out at me if I did) xD

  • @Theinvalidmusic 5 years ago

    Nice use of the Dusk Horizon theme from Unreal in the background. Still an absolute fave

  • @ashenone3427 5 years ago +4

    Will DLSS work with 1440p and 1080p resolutions? That's what I'm wondering.

    • @pottuvoi2 5 years ago +1

      Yes, it should work on lower resolutions as well.

    • @mduckernz 5 years ago

      Yes. It should work equally as well, possibly even better (since I expect the algorithm is trained on a single very high resolution data set, so the lower the resolution, the more "detail" can be inferred into that set of pixels) since the resolution delta is larger

  • @THEREALKEMMZ 5 years ago +3

    Is DLSS only on the 2080 series cards? It would be nice to have it for the older cards as well.

    • @mduckernz 5 years ago +2

      Yes, because the operations used in performing DLSS will only run efficiently on Turing.
      You _could_ run DLSS on Pascal (or an AMD GPU, for that matter) - the algorithm is compatible, it's just math - but tensor operations (on n-dimensional matrices) are slow on older hardware, since it only supports vectors and matrices, not tensors. You can emulate them perfectly fine, but at significantly slower speed; so much slower that you would be better off just rendering at a higher resolution.
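
      Concretely, the primitive Turing's tensor cores accelerate is a small fused matrix multiply-accumulate (D = A·B + C on 4x4 FP16 tiles, accumulated in FP32). A NumPy sketch of what older hardware has to assemble out of ordinary multiply and add instructions:

      ```python
      # Emulating the tensor-core primitive: D = A @ B + C on 4x4 FP16 tiles
      # with FP32 accumulation. Turing does this as one hardware operation;
      # older GPUs must build it from scalar/vector math, hence the slowdown.
      import numpy as np

      def mma_4x4(a, b, c):
          """Fused multiply-accumulate: FP16 inputs, FP32 accumulator."""
          return a.astype(np.float32) @ b.astype(np.float32) + c

      a = np.random.rand(4, 4).astype(np.float16)
      b = np.random.rand(4, 4).astype(np.float16)
      c = np.zeros((4, 4), dtype=np.float32)
      d = mma_4x4(a, b, c)
      ```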

    • @steveballmersbaldspot2.095 5 years ago

      It wouldn't run well as it is designed for hardware with tensor cores.

  • @allbrancereal_ 5 years ago

    At 17:45 you can see clear shimmering/missing texture at the center of the DLSS footage in the distance.

  • @jeffhampton6972 5 years ago

    This was really informative, thank you. Also, as OxRashedxO would agree, the Unreal music was a fantastic surprise. Hell yes!

  • @buzhichun 5 years ago +4

    In 10 years games will only render 4 actual pixels, then upscale with Deep Learning to 12K or whatever resolution we're using then. You heard it here first.

    • @ketrub 5 years ago

      you're not funny nor clever

    • @buzhichun 5 years ago +1

      jeez, who shat in your cornflakes this morning?

    • @ole7736 5 years ago

      I found it funny and clever.

  • @Xenthorx 5 years ago +4

    TAA looks like no AA at 1:50

  • @Brotoles 5 years ago

    About the Infiltrator demo, assuming the frame time graph is completely synced, it's interesting to notice that the frame time spikes happen on different frames with DLSS than with TAA... It seems that because of the temporal element of TAA, the frame time spikes happen one frame after the DLSS ones...

  • @whatistruth101 5 years ago

    Thank you for your insight and input

  • @mhosni86 5 years ago +3

    Are you telling me that the technology that Deckard used in Blade Runner is already happening? XD

    • @mduckernz 5 years ago

      Yes, but it's been around for a long time. This is not new technology. I've been using neural network based image and video upscaling for over 5 years now... NVIDIA has just built hardware acceleration for it and packaged it into a convenient API. That's all

  • @mitthjarta5 5 years ago +3

    Called it.
    Also "deep learning" is more a buzzword (not one I'm fond of). From my understanding it's applying a ruleset of filtering functions on clusters of pixels, based on relative properties between samples (a la how adaptive sharpening and many 'quick AA' methods work), upon recognising a clusters characteristics, it'll apply the most suitable filter pattern for it by traversing a LUT (The filter-pattern LUT being the only "Deep Learning" aspect as far as I'm aware, a pattern set generated by doing comparative analysis on a goliath datasets of 1440p ^ 64x down-sampled frames (if generic probably generated in part by the current demos), that's where the deep learning came in, it's probably a dataset that'll get updated in tandem with drivers, and thus improve over time IF it has any shortcomings), What makes it vastly better than TAA is the amount of raw silicon dedicated to the task, and the ability to apply such a large set of filters that are cluster specific. The "Deep Learning" aspects are actually quite bruteforce if the data sets are generated by super computers. And require entire blocks of compute ICs to be applied in real-time.
    Where TAA and most temporal injection methods fail in comparison is they are very generic and holistic, and have far lower overheads. Even without the DL datasets, the same compute silicon could offer less complex adaptive filtering techniques that improve upon TSAA.
    It should be said it's not "new" per se, it's just a more bruteforce and extensive application of similar techniques that AMD and Nvidia have previously done in the past, EQAA(GCN) and CSAA(NV) sampling methods have been around for years and do something **similar** . But those are MSAA methods, and fall apart at higher resolutions, and are fundamentally incompatible with deferred shading as i understand it.
    What I've noticed especially by the stills NV themselves have promoted is it's clearly upsampled, it has telltale scaling footprints imo (a visible "smeary" loss in texel definition). which is how I called it ahead of time upon it's reveal. It IS quite nice especially with the performance implications. but I'm against any vendor exclusive features when they are so fundamental regardless of vendor.
    Which is why I'm hoping it's not just relegated to RTX SKUs exclusively going forward. There is a reason 8xTSAA in DOOM(2016) with Vulkan performed better on AMD than any other filtering method, because it use async to offload the AA to otherwise IDLE compute cores (in a very similar way to how this is being accomplished). Nvidia now also have the might of async compute at their disposal, which is great, but they're trying to squander it's application, so it won't also benefit AMD.. And using marketing buzzwords and jargon to justify those exclusionary tactics. Yes they put in a nice amount of RnD etc, and should be commended on that, but I'm ambivalent towards their clear anti-competetive leanings.
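
    To make the comment's "LUT of filter patterns" idea concrete, here is a toy sketch: classify each pixel neighbourhood by a local property, then apply whichever kernel a lookup table assigns to that class. This is purely hypothetical intuition for the description above, not how NVIDIA has documented DLSS.

    ```python
    # Toy "LUT of filters": pick a convolution kernel per pixel based on the
    # local gradient magnitude. Hypothetical illustration only.
    import numpy as np
    from scipy.ndimage import convolve

    SMOOTH = np.full((3, 3), 1 / 9)                            # flat regions
    SHARPEN = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]])  # edge regions
    FILTER_LUT = {0: SMOOTH, 1: SHARPEN}

    def adaptive_filter(img, edge_threshold=0.1):
        gy, gx = np.gradient(img)
        edge_class = (np.hypot(gx, gy) > edge_threshold).astype(int)
        # Compute both candidate filters, then select per pixel by class.
        out = {k: convolve(img, kern) for k, kern in FILTER_LUT.items()}
        return np.where(edge_class == 1, out[1], out[0])

    filtered = adaptive_filter(np.random.rand(64, 64))
    ```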

    • @giancarloriscovalentin1865 5 years ago +1

      Agree

    • @mduckernz 5 years ago +2

      That's a very narrow view of DL.
      DL is a term for a cornucopia of different techniques. It is not any one thing, or even one process. It is true that it encompasses the things you mentioned, but it also encompasses far more than that.
      If I had to give a high-level summary it would be "a deeply layered, sometimes recursive, architecture of learning algorithms, usually with some form of reinforcement learning". Notice this says nothing about the kind of data it processes (nothing about pixels, for instance), nor anything about how this data is processed.
      The technique that NVIDIA seems to be using with DLSS appears to be a neural network based approach. This is not new technology; it is rather old, in fact. I've been using AI NN-based upscaling for images and video for well over 5 years now (there are even plugins for MPlayer and I think VLC that do it... I forget the name, but if you search "neural network upscaling MPlayer" you will find it) and it works very well. What NVIDIA have done is simply accelerate it in hardware and package it into a nice API. That's it. Not a massive innovation, just clever use of old tech.
      However, with regard to what you said about how this will be used to lock out AMD: absolutely agreed. This was very likely _one of_ their primary goals. I can admire the technology but despise its ulterior motives at the same time.

    • @mitthjarta5 5 years ago +1

      I agree completely,
      But the terminology of "deep learning" is very one-sided marketing. The umbrella term "machine learning" already exists to bundle those things together; DL has primarily been pushed by marketing and PR, not so much by computer scientists. It's nowhere near as bad as "blockchain" as a recent trendy buzzword, but having followed machine learning for years (auxiliarily; I have a local friend in the field), the term "deep learning" never came up until around 2013, when it was 'invented', and the field had been establishing itself for almost a decade at that point. The terms I heard commonly were machine learning, neural networks, self-learning algorithms, etc.
      "Machine learning" was the easy-to-understand catch-all term being used; if you look up all the established Wikipedia articles around the subject, that is the vernacular. Even "self-learning" was used heavily in the press. But Nvidia themselves are the ones who pushed "deep learning" into the vernacular, so heavily that it became somewhat pervasive, and if anyone heard a story about "deep learning", especially a few years back (when it was being associated with Tesla due to their partnership), the only results that came up when searching were predominantly Nvidia's own material. It was essentially a tactic of SEO for CEOs.
      I think it's a perfectly serviceable term, but I'm not fond of it because of the above. It's like how TensorFlow became an industry-standard toolchain for dataflow programming; Google even went and created a TPU, "Tensor Processing Unit", without discouraging others from making similar ASICs, and then Nvidia went and started using "tensor cores" to bank on its synonymity, even though 'tensor cores' are fundamentally just GPGPU cores (albeit with optimizations), not a true TPU ASIC design. It's not an isolated incident; they do this aggressive, marketing-driven hijacking of technology all the time.
      I'm glad you commented, though, to clarify the extent of machine learning within DLSS; I did come across too dismissively, and ultimately I agree there is enough to justify its use. And I'm glad you stated your stance on the ethics of vendor lock-in. I'm genuinely fearful of a repeat of Microsoft in the hardware/'emerging technologies' space.

  • @recordatron 5 years ago

    Okay, so having watched this I'm now more interested in the cards than I was. DLSS in conjunction with RT does seem to make a lot of sense, and I'm now a lot more interested to see where these cards go.

  • @TerryMartinART 5 years ago

    Does ReShade offer any PC mods that are similar to interlaced or checkerboard rendering? RE7 is the only game I have played on PC that had "Interlaced". Honestly, I couldn't tell a big difference in quality at 4K with RE7's interlaced setting, just double the performance.

  • @Bloodred16ro 5 years ago +8

    DLSS looks like a disappointment to me. By far the biggest attractions of 4K are the sharpness and the fine detail you can see, especially when close to a PC monitor. DLSS provides a blurred image where both of those qualities are essentially gone. Why would I use a 4K monitor in order to play a 1440p upscale of a game? Why not use a 1440p monitor at native resolution and use DLSS 2X for AA instead? What's the point of aiming for 4K if you're going to give up on the sharpness and detail anyway? Sure, it's better than a traditional upscaler and probably better than checkerboard techniques, but the FFXV screenshots I've seen make it very clear that a lot of detail is lost (even when compared to TAA, which in and of itself sometimes adds some blur; I'd love to see a comparison against a shot with no AA or one with real SSAA), as does the Infiltrator comparison in this video.
    The only real appeal I see to regular DLSS (not 2X) is the idea that it might be used as a sort of trade-off for ray tracing, as in you give up some sharpness and detail and get better lighting/AO/shadows in return at a resolution where ray tracing is otherwise much too slow to use.

  • @madfinntech 5 years ago +4

    I really hope this DLSS and other AI features will be utilized in gaming in the future and I'm not only saying this because I bought the 2080 Ti. I bought it for 4K AND 60 fps gaming but it would be cool if these other features would take off as well.

    • @Ford-wt8rn 5 years ago +1

      Same here. It could increase the longevity of these cards too... if Cyberpunk 2077 uses DLSS in, say, 2020, the 2080 Ti would probably be able to play it at higher settings with great framerates, HDR, etc. Of course I'm speculating, but it really could be a game changer. I'll take a checkerboard-like image, maxed out, with high framerates, over lowering the settings, native 4K, and lower framerates for really demanding games. And we all know games will get more demanding these next couple of years...

  • @PatrickLofstrom 5 years ago +1

    I loved the plug at the end. Vuvuzelas are BACK!

  • @Engel990 5 years ago +1

    Holy shit, the Unreal music in the background... just after you leave the ship. I fucking recognize that anytime, anywhere, no matter how softly it is played.

  • @eldarmusayev7653 5 years ago +5

    @21:57 While using the tensor cores to upscale the ray-traced image seems like a good idea, those tensor cores are already busy doing something else: denoising the ray-traced image. Maybe it's possible to run both the upscale model and the denoise model in parallel and assign them half the tensor cores each, but I've never seen a single GPU able to run two models in parallel, and I would wager that this would require a separate unified model, one trained using an image that is both lower-res and noisy from RTX, with supersampled, cleaned-up ray-traced frames as the training target, introducing lots more room for errors in the end result.
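
    For what it's worth, issuing two independent networks on separate CUDA streams can let their kernels overlap on one GPU; whether they truly run concurrently depends on the scheduler and free execution units. A hedged PyTorch sketch (the two tiny models are placeholders, not real denoise/upscale networks):

    ```python
    # Two independent models queued on separate CUDA streams so their kernels
    # may overlap. Placeholder models; real overlap is up to the GPU scheduler.
    import torch
    import torch.nn as nn

    denoise = nn.Conv2d(3, 3, 3, padding=1).cuda()
    upscale = nn.Conv2d(3, 3, 3, padding=1).cuda()
    frame = torch.rand(1, 3, 1440, 2560, device="cuda")

    s1, s2 = torch.cuda.Stream(), torch.cuda.Stream()
    with torch.no_grad():
        with torch.cuda.stream(s1):
            a = denoise(frame)    # queued on stream 1
        with torch.cuda.stream(s2):
            b = upscale(frame)    # queued on stream 2, may overlap with s1
    torch.cuda.synchronize()      # wait for both streams to finish
    ```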

    • @TheLoucM 5 years ago

      The Star Wars demo runs both at "the same time".

    • @JimBob1937 5 years ago +1

      Not all devs seem to be using the tensor cores for denoising. For those that are, I see no problem: the different work will merely be scheduled amongst the cores. Why do you feel cores can only be used by one thing? Setting aside the ASIC cores we're talking about, CUDA cores are by their nature able to be scheduled for mixed workloads; that's kind of the whole point of the unified core architecture. With ASIC cores, you'll just be queuing the workload among them; with the work being parallelizable, it can shard off onto the number of cores needed and available, and it's highly unlikely the denoising will require many of those cores. So I don't think this is an accurate criticism.

    • @JayJapanB 5 years ago

      Dice weren't using the Tensor cores for anything.

    • @eldarmusayev7653 5 years ago

      @jarrod1937 At least for people using various machine learning libraries, you can't parallelize tasks and assign one to half a GPU. You just assign it a GPU and off it goes.
      If denoising and supersampling are two separate models, they'd need to run sequentially. That means extra slowdown. That's why.

    • @eldarmusayev7653 5 years ago

      @JayJapanB Dice were definitely using tensor cores for denoising the ray traced reflections. RTX includes denoising as a vital part of the process. Unless you've got a source where Dice devs explicitly say they were using ray tracing without tensor cores.

  • @MrUngoBrumboBlab 5 years ago +7

    Some Dude: Hey James Cameron, what is AI going to be used for in the future? Maybe killer robots or house maids or something?
    James: Nah, it'll be used to make video games look better... (╯°□°)╯︵ ┻━┻

  • @nihren2406 5 years ago +2

    So what you're saying is that DLSS solves the hair aliasing problem in Final Fantasy XV, at least for the most part? That was something that always bugged me even on PC. In cutscenes there was always a lot of shimmering on the characters hair, like watching a bunch of razor blades go down each strand. It's worse on some characters, but still there regardless. Now granted I wasn't playing at 4K, but I had AA cranked to max and that only seemed to have a little effect on it.

    • @pottuvoi2 5 years ago

      It does seem like it in the video.
      Knowing at least partly how DLSS works, it's actually not surprising that it works so well on hair dithering.
      The AI receives a 64xSSAA image as part of its training, and dithering basically turns into a 64-value alpha gradient.
      It's a very distinctive part of the source images, so it certainly makes sense that the AI would learn to help with it.

  • @jkotka 5 years ago

    I don't have a 4K monitor to check this with, but at 11:00, are those moiré effects on his jacket pocket because of the YouTube downscaling, or does DLSS really produce such clear artifacts?

    • @JayJapanB 5 years ago

      That's one of the "one frame after a cut" shots, so no DLSS is active on the DLSS side.

  • @kenshii_jim 5 years ago +20

    At 1080p, FFXV probably has one of the worst and most aggressive TAA implementations I have experienced (right beside MH World). I literally said "wtf is this oily layer on my screen" in the first 10 minutes of gameplay. Saying it's not the best is being nice.

    • @killermoon635 5 years ago +1

      TAA makes the game look better. There are so many jagged edges on the trees if you turn TAA off.

    • @zblurth855 5 years ago +1

      oooo so that's why I have oil in MHW

    • @gabrielst527 5 years ago

      The problem is that Nvidia intentionally left TAA on to make native 4K look bad; this test should be done with native 4K with no AA vs 4K upscaled with DLSS.

    • @kenshii_jim 5 years ago

      Yeah Gab Stu, you're right. TAA makes it less sharp, so we don't notice as much of a difference.

  • @Aggrofool 5 years ago +87

    Nvidia's pricing will turn me into a console gamer

    • @sebastianhoyos1961 5 years ago +7

      I mean, I could get checkerboard 4K for much cheaper on a console...

    • @tomaszpapior2143 5 years ago +9

      As a PC player who bought a PS4 Pro a year ago, I now spend like 80% of my gaming time on this machine. Uncharted, Gran Turismo and Horizon Zero Dawn are so good.

    • @budthecyborg4575 5 years ago

      Crypto did that to me last year.
      The Xbox X is the only hardware I have that can output a 4K 60 Hz image over HDMI (my 980 Ti is basically equivalent to the Xbox, but only has HDMI 1.4), so the console is the only way I can use a 4K TV unless I buy a new GPU, which at this point isn't going to happen until AMD comes out with a chip faster than the 1080 Ti.
      Hopefully that's only a year away. In the meantime the Xbox X runs most games at the same fidelity as my PC would manage anyway.

    • @tomaszpapior2143 5 years ago +2

      @@budthecyborg4575 The X is a good choice as well. I got the PS4 Pro mostly because of games not available on PC. I was planning to buy a PC a few months later, once the crypto madness died down, but I haven't decided yet, as there are still too many games for me to play on the PS4 and PC part prices are crazy right now.

    • @budthecyborg4575 5 years ago +1

      I've been a big fan of Forza for the last decade. Xbox One has not disappointed.
      Xbox X is still the only way to get Halo: MCC, and now that we have it running in 4K, that means co-op runs in dual 32:9 1080p screens! Compared with the price of those things on PC, my $300 4K TV split across the middle would be worth about $2,000!

  • @DNAReader 5 years ago

    @DigitalFoundry TAA is temporal super-resolution, using previous frames and player movement to provide anti-aliasing through interpolation. DLSS is a series of quasi-Gaussian filters (convolutional filters, like those of a neural network) learned from ground truth (4K MSAA 8x) compared to no-AA 1440p. The filters are a learned filtering process, accelerated through Nvidia's ASIC, the tensor cores (hence why they say "inferencing").
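
    A minimal sketch of the temporal accumulation idea behind TAA as described here: reproject the accumulated history with per-pixel motion vectors, then blend a little of the new frame in each step. Function names and the blend factor are illustrative, not from any particular engine.

    ```python
    # TAA-style temporal accumulation: reproject last frame's history using
    # motion vectors, then exponentially blend in the current frame.
    import numpy as np

    def reproject(history, motion):
        """Fetch each pixel's history from where it was last frame."""
        h, w = history.shape
        ys, xs = np.indices((h, w))
        src_y = np.clip(ys - motion[..., 1].round().astype(int), 0, h - 1)
        src_x = np.clip(xs - motion[..., 0].round().astype(int), 0, w - 1)
        return history[src_y, src_x]

    def taa_step(history, current, motion, alpha=0.1):
        """Mostly history, a little current frame: smooths aliasing over time."""
        return (1 - alpha) * reproject(history, motion) + alpha * current

    history = np.zeros((4, 4))
    current = np.random.rand(4, 4)
    motion = np.zeros((4, 4, 2))   # transparent surfaces should write zero motion
    history = taa_step(history, current, motion)
    ```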

  • @JulianDanzerHAL9001 5 years ago

    I don't like resolution scaling personally, but as an anti-aliasing method this seems pretty interesting.
    The disadvantage is that, of all graphics features ever, this is probably the hardest to predict. If you want to know how well it does, you really can't use benchmarks or even short test runs; you pretty much have to play through an entire game and all possible scenarios just to see how it behaves in all cases.

  • @Rithysak101 5 years ago +24

    The thumbnail misled me into thinking you're talking about Versus XIII :(

    • @batmangovno 5 years ago +2

      Yeah, I would've loved to see FFvXIII's darker artstyle realized with current-gen top tech.

    • @Rithysak101 5 years ago +1

      @@batmangovno The more I think about it, the more it saddens me :(

    • @KagatoAsuka 5 years ago +1

      There are mods for FF15 that swap in some Versus XIII assets. I know it's not the same, but still pretty cool.

  • @Anita028 5 years ago +3

    That gorgeous Unreal soundtrack in the background is giving me a hard time trying to concentrate on what they're talking about... I just want to see more Unreal... What a soundtrack, for what a game.

  • @Maschinenzimmer777 5 years ago

    What's the game in the second half of the vid? Looks amazing; loving the art style as well as the tech behind it, but I couldn't place the name.

  • @DELTAGAMlNG 5 years ago +1

    Is DLSS only for 4K, or will it work for people running 1440p monitors?

  • @darak2 5 years ago +3

    The technology seems cool, but the 1440p base resolution is very apparent in scenes with high texture details. No upscaling magic is going to recreate details that were never there to begin with. FF is not a good test case since its texture quality is actually pretty poor (and so is its post-processing stack), not to mention an on-rails demo is the best case scenario for an algorithm based on training.
    Until this is released in a real game and we're able to test its real performance implications and quality over really dynamic and interactive scenes, any review is pointless.

    • @marverickbin 5 years ago

      But in fact the details are there, because they use 4K images in the network training.
      Maybe this game is not the best example, and the training architecture has to improve, or more images have to be fed to the network. But the technology allows it to infer what is not there. It's like being able to draw something you've already seen from another angle: the information is there in your brain's network.

    • @darak2 5 years ago +1

      DLSS will be able to produce a better blend of multiple frames of information in order to prevent aliasing, but it won't be able to add missing information that is not present in the frames generated at runtime. There are two reasons why this is guaranteed. The first is a dataset size issue: you'll need to store somewhere all the missing pixels from the original 4K pictures for each and every frame, data which won't be available otherwise at runtime; the size would be prohibitively large. The second is speed. The more individual cases (texture details) the algorithm must react to, the bigger the network will grow and the slower it will run. Some existing superscaling neural networks are able to infer non-existing details, but those run extremely slow and unlike DLSS they are trained for extremely specific cases such as celebrity faces in a particular angle (and even then, the results are horrible 50% of the time).
      Neural networks are a great technology, but they are not magic. Even if you don't believe me, you have plenty of examples of the missing texture details in the existing DLSS captures.

  • @keyboard_g 5 years ago +7

    So the conclusion is that faux 4k is easier to do than real 4k. Ground breaking stuff.....

    • @glorious240fps6 5 years ago

      You didn't understand a single thing in this video.

    • @themodfather9382 5 years ago

      @@glorious240fps6 Yeah, he did. No matter how you slice it, it's a 1440p image. Just like the Xbox 360 used to upscale a 576p image to 1080p. I love the Xbox 360, but let's be honest about what it is.

    • @glorious240fps6 5 years ago

      @@themodfather9382 It's a new technology, and it's getting better and better over time. Look at Metro Exodus's DLSS now vs. BF5's DLSS before.

  • @strongforce8466 5 years ago

    Super cool vid, thank you for your great work! DLSS sounds very promising; it seems all the attention was on RTX, but this is pretty cool actually!

  • @themodfather9382 5 years ago +2

    It's not 4K, it's 1440p upscaled. The information is simply not there. Remember upscaling DVD players? Same thing.
    I think it's a cool feature, but people are saying it looks the same as 1800p upscaled. Using a 4K monitor, we should be comparing: 4K, 1800p, 1440p, and 1440p + DLSS upscale. People have been upscaling resolutions for over 20 years, so let's not pretend this is a new thing.

    • @flarpman2233 5 years ago +1

      Flarp here.
      Anyone who can't spot the difference between native 4K and 1440p upscaled is either making things up or blind. Furthermore, there is no 'performance boost' at 4K from doing this; you are gaining performance by running at a resolution below 4K.

  • @janklaassen6404 5 years ago +33

    DLSS makes a painterly look where texture detail disappears. It's like the noise reduction filter on your phone when shooting with little light.

    • @Luredreier 5 years ago +3

      +Abbablablah Bablhablah
      What frames are you talking about?
      Can you link the time stamp you're referring to?

    • @GraveUypo 5 years ago +1

      I dunno, looked good to me for the most part.
      But I'm skeptical. I want game image quality comparisons, not just demos. With demos it's way too easy to make a very specific neural network that cleans them up almost perfectly.
      1:58 oh, I see it now. Looks almost like anisotropic filtering was turned off.

    • @GrantZ90 5 years ago +5

      Just take a look at 1:41, for example. Blurry as hell. All the detail in the textures is gone.

    • @Darkmasta05 5 years ago +6

      @@GrantZ90 Hi, this timestamp is the one frame that is not yet treated. The detail loss is attributable to the resolution difference (1440p vs 2160p), not DLSS.

    • @LisaSamaritan 5 years ago +6

      It might go both ways. Look at the logo on the table @ 9:49. Clearly more detail with DLSS.
      They show the same frame @ 10:57.

  • @FF-jf8yg 5 years ago +8

    DLSS is more impressive than RT, but RT is still in its infancy so......

  • @ardentlyenthused338 5 years ago +1

    If this kind of performance boost is representative, and enough games support it, DLSS could be a game changer, and ultimately one of the key technologies that helps achieve acceptable performance with ray tracing enabled. For instance, if you take the average 30% performance boost of the 2080 Ti over the 1080 Ti (simply from being a faster architecture) and combine that with the 40% increase from DLSS, you're looking at over an 80% performance increase relative to the 1080 Ti (1.3 x 1.4 = 1.82, or about 82%). That's a HUGE gain, one that will partly or perhaps entirely offset any performance penalty incurred by ray tracing.
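
    The compound-gain arithmetic from that comment, spelled out (the inputs are the comment's estimates, not measurements):

    ```python
    # Compounding two independent speedups multiplies them.
    arch_gain = 1.30   # ~30% from the faster architecture (comment's estimate)
    dlss_gain = 1.40   # ~40% from DLSS (from the video's FFXV numbers)

    total = arch_gain * dlss_gain                 # 1.82
    print(f"~{total - 1:.0%} over the 1080 Ti")   # ~82%
    ```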

  • @UbiquitousDIY 5 years ago +1

    Given DLSS is constructing images to some extent, how does this affect competitive games (Siege/CSGO etc.) which need the most raw and immediate visual info/frames for the best possible gameplay?

  • @ScreamerRSA 5 years ago +102

    2080ti ~30% performance increase, ~70% price increase. nVidia, the way its meant to be played.

    • @madfinntech 5 years ago +9

      Did someone rip the Tensor and RT cores out of the one you bought? I didn't know the 1080 Ti had those.

    • @ronnymueller1918 5 years ago +21

      The way it's meant to be paid

    • @kevinmlfc 5 years ago +4

      Yes, and it can't hit 4K@60 in a 2-year-old game.

    • @mduckernz 5 years ago +4

      The way _you're_ meant to be played.

    • @lendial 5 years ago +1

      Prediction: 3080 Ti, ~15% performance increase, at 2,000 USD.

  • @clarknova1567 5 years ago +131

    Way too expensive. The performance per dollar metric is really important here because you are getting less at the cost of more. The two features that make this series of cards aren’t even ready at launch.

    • @jacobsmith1173
      @jacobsmith1173 5 years ago +39

      But you have just described the top-tier card of every generation ever. I mean this with respect: if your attitude towards these cards is about performance per dollar, I am not sure why you would go to flagship card videos within two minutes of upload, on the day the card was released (or was supposed to release). Are you expecting the card to be the cheapest on the market? Or to offer the best bang for buck?

    • @varungandhi8796
      @varungandhi8796 5 years ago +8

      Well, the performance per dollar will probably get better as new, more optimized drivers come out

    • @purona2500
      @purona2500 5 years ago +6

      You VASTLY overestimate the performance increase that drivers give

    • @wile123456
      @wile123456 5 years ago +10

      Jacob Smith, you must suffer from amnesia then, because every new gen has offered one tier higher performance (970 = 780, 1070 = 980/Ti) for the same price as that tier usually cost, or at least close to it. That's how it has been for years. Why is that? Well, because the previous gen always gets cheaper when the new gen launches, so for the new gen to be a good sell it has to at least match the old MSRP of the previous gen. And that has held at launch. The price of the new gen will drop as well, just like this one, but at least before there was value to the cards outside of two gimmick features only a handful of games will support (and don't right now).
      Now you are paying 800 dollars for a 2080 compared to the 1080's 500-dollar MSRP, and 1200 dollars (the same as the overpriced Titans) for the 2080 Ti, compared to 700 dollars for the 1080 Ti. Then you factor in that the 1080 is 400 dollars now and the 1080 Ti is 600 dollars, and you have possibly the worst price-to-performance ratio of any generation of video cards EVER.
      Now if you used your brain instead of gut fanboyism you would be able to see that, but sadly Nvidia's brand name is too strong. We the consumers, everyone watching this video, lose, because we are paying more for less.

    • @jacobsmith1173
      @jacobsmith1173 5 years ago +10

      +Varun Gandhi But that is kind of missing the point. The 2080 Ti is just a rebranded Titan card. These cards aren't for people concerned with frames per dollar; they are marketed at people who want the maximum possible performance. The best thing about the rebranding is that consumers know they aren't going to release a rebranded xx80 Ti version down the line for $500 less. These are also for people wanting to get the most out of 4K; as recent years have shown, hitting 60fps has been just within reach. Now we seem to blow right past it, giving plenty of headroom.

  • @zervox8427
    @zervox8427 5 years ago

    17:30
    I hear the mention of "eliminating TAA artefacts", but no mention of all the flickering, in motion, of pixels affected by lights with DLSS?

  • @StavrOgnev
    @StavrOgnev 5 years ago

    In Infiltrator the FPS drops in scenes where the texture buffer is wiped before new scenes load. I believe the picture's focus shifting to something primitive is intentional, so no scenes actually lag during intense action.

  • @dreammfyre
    @dreammfyre 5 years ago +28

    Quantum image reconstruction. Spooky pixels at a distance.

    • @Dictator93
      @Dictator93 5 years ago +3

      LOL - no joke, when looking over DLSS I actually said "spooky action at a distance" to describe some of the behaviour :D
      -Alex

    • @Luredreier
      @Luredreier 5 years ago

      +re hash
      Yeah, I noticed that.
      But I think it's well worth the cost for Nvidia users, given the performance benefits.
      An RTX 2060 running a game supporting DLSS will probably give AMD a real beating there.
      And one where Nvidia has actually earned the victory, unlike with all the GameWorks nonsense...

    • @franzusgutlus54
      @franzusgutlus54 5 years ago

      if you don't look at the monitor, the image is blurry and crisp at the same time!

    • @st0nedpenguin
      @st0nedpenguin 5 years ago +2

      If you don't look at the screen the game is both running and not running.

  • @zNoteck
    @zNoteck 5 years ago +6

    The Unreal music in the background tho

  • @RaufZero
    @RaufZero 5 years ago

    guys, haven't you swapped the labels by accident? DLSS on the left looks blurry compared to the picture on the right with TAA
    10:24 watch this

  • @deathrow989
    @deathrow989 5 years ago

    Alex is such a credit to the Digital Foundry team; it seems like you guys have the dream team

  • @zeldacuz
    @zeldacuz 5 years ago +3

    DigitalFoundry gives DLSS its stamp of approval? Nice.
    DLSS is the future, it seems.

  • @KeyToAnime
    @KeyToAnime 5 years ago +5

    That Versus XIII thumbnail makes me very triggered. ._.

  • @robertodelatorre3829
    @robertodelatorre3829 5 years ago +2

    My question is, why do you need any kind of anti-aliasing at a resolution such as 4K? I've played The Witcher 3 at 4K 60fps with AA off, and I don't see any aliasing.

    • @pottuvoi2
      @pottuvoi2 5 years ago

      Yes, a lot depends on how surfaces are lit and shaded, but some form of AA is necessary.

    • @robertodelatorre3829
      @robertodelatorre3829 5 years ago

      It might be an opinion, but I disagree. I don't know where people are finding aliasing when rendering at 4K, and I'm incredibly sensitive to jaggies.

    • @pottuvoi2
      @pottuvoi2 5 years ago

      It depends a lot on the game and how it renders things.
      Large specular surfaces do need AA, either in the shader or a spatial/temporal pass.
      Shader aliasing is similar to how color textures would look without mipmapping: a single sample can end up standing in for what is in a 100x100 pixel area of the texture.
      Prefiltering was the solution for simple color textures; for normal and roughness maps, prefiltering has to be done in a slightly different manner and is a form of shader antialiasing. (Collect the normals in an area and incorporate them into the specular lobe in the mipmaps, and/or adjust roughness; see the sketch below.)
      This also helps an object's surface keep its look from very close up to far distances.
      For edges, AA is needed, especially when edges become thin and we start seeing roping, and objects flicker in and out of existence.
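
      To make the normal/roughness prefiltering idea above concrete, here is a rough numpy sketch of one mip-downsampling step in the spirit of the Toksvig approach (the function name and details are illustrative, not taken from any particular engine):

        import numpy as np

        def prefilter_normal_roughness(normals, roughness):
            # Average each 2x2 block of unit normals (H and W assumed even).
            # The mean is generally shorter than unit length, and that
            # shortening encodes the normal variance lost by downsampling.
            h, w, _ = normals.shape
            n_avg = normals.reshape(h // 2, 2, w // 2, 2, 3).mean(axis=(1, 3))
            length = np.linalg.norm(n_avg, axis=-1)

            # Toksvig-style variance estimate from the shortened normal.
            variance = (1.0 - length) / np.maximum(length, 1e-4)

            # Fold that variance into the downsampled roughness so distant
            # specular highlights widen instead of sparkling.
            r_avg = roughness.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
            r_new = np.sqrt(np.clip(r_avg ** 2 + variance, 0.0, 1.0))

            # Renormalize the averaged normal for the next mip level.
            n_new = n_avg / np.maximum(length[..., None], 1e-4)
            return n_new, r_new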

    • @Sonicboomffx1
      @Sonicboomffx1 4 years ago

      Play a game like AC Syndicate and it is extremely noticeable even at 4K with AA, and especially without. So it is needed in some games; others don't really need it, or only need a small amount of AA.

  • @TheWarTube
    @TheWarTube 5 years ago +2

    Actually impressed by the results, but I'd like to see how DLSS behaves with lower sample counts (like upscaling from 720p to 1080p) to see if the quality level can keep up.

  • @desmond3245
    @desmond3245 5 years ago +6

    There is no point in using TAA in the first place at 4K. So comparing TAA with DLSS at 4K to make DLSS seem better makes no sense. Why not compare DLSS with no AA at 4K?

  • @fatboymachinegun
    @fatboymachinegun 5 years ago +190

    PC gamers:
    Fake 4k on consoles? Wow what a joke.
    Fake 4k on PC: Wow this technique is so advanced!

    • @RinkuHeroOfTime
      @RinkuHeroOfTime 5 years ago +17

      tbh i really don't care for 4k

    • @GraveUypo
      @GraveUypo 5 years ago +28

      basically that.
      but honestly, i want fake 4k on both. i want to run games at native 1080p and have them not look smeary on a 4k screen. that way i can use low-end hardware and still get good performance.

    • @Solidouz
      @Solidouz 5 years ago +11

      Lol so true.

    • @EcchiRevenge
      @EcchiRevenge 5 years ago +39

      It's advanced.
      Nobody said it's preferred over true 4k, etc.
      It's just another option that the PC Master Race has, sometimes for hitting that 144fps mark (which consoles don't even have the CPU to do).
      Nothing new there; it's not like PC games can't have dynamic resolution.

    • @QuietWookie
      @QuietWookie 5 years ago +8

      it is advanced tho.

  • @rvalent9366
    @rvalent9366 5 years ago +2

    i hope the GTX 2060 will have some tensor cores (no RT) so we can still have DLSS, and maybe Variable Rate Shading via AI?

  • @mathburn1
    @mathburn1 5 years ago

    @5:53 what happened to Noctis's inner shirt? An artifact?
    Also there's shimmering @10:26 - @10:28, @11:55 - @11:57, and @17:33 - @17:52 (bright spot in the middle of the screen) on the DLSS side. And I think it looks blurrier as well.
    But the performance gain is really something worth the sacrifice.

  • @varshoee
    @varshoee 5 years ago +29

    The only tech channel we need on YouTube for CPU and GPU analysis is DigitalFoundry; always professional and informative content is presented here.

    • @evanvandenberg2260
      @evanvandenberg2260 5 years ago +10

      Nima V and Hardware Unboxed, and Gamers Nexus, tbh we are kinda spoiled. 😋

    • @EnduroTrailRiding
      @EnduroTrailRiding 5 years ago +6

      Where's their benchmarked review video then? There are far better channels out there that have done that already today and now prove more credible. I assume Digital Foundry is on Nvidia's payroll to not release a benchmark comparison video highlighting how crap these new cards truly are in terms of performance upgrade vs last gen.

    • @Luredreier
      @Luredreier 5 years ago +4

      +Nima V
      I very much like their in-depth look at frame pacing.
      But their quality comes at a cost in quantity.
      They can't really check as many titles or as many variables as other reviewers like Hardware Unboxed do.
      And at the end of the day it's Hardware Unboxed, Gamers Nexus and Actually Hardcore Overclocking that I use the most for my purchasing decisions regarding motherboards, CPUs and GPUs.
      PSUs are a different story.
      But yes, Digital Foundry is definitely a great supplementary source of information.
      But the "only tech channel we need"?
      Definitely not.

    • @MarceloTezza
      @MarceloTezza 5 years ago +6

      You are so wrong on so many levels; Digital Foundry was once something, now it's just a bunch of blind guys.

    • @GraveUypo
      @GraveUypo 5 years ago +1

      Kinda have to agree with Marcelo Tezza here.
      quality's dropped HARD from what it once was. their analyses are getting shallower and shallower, not to mention biased. this video was pretty low quality; it was more a conversation than a technical analysis. i learned almost nothing i didn't already know, and i haven't even done much research into dlss.

  • @tepkisiz
    @tepkisiz 5 years ago +22

    Come on, it is obvious that DLSS is rendering at 2K; you can easily see the detail loss in textures. DLSS is cool as a new AA method, but it is stupid to compare it to 4K TAA in terms of performance. Compare it to 2K TAA if you have to.

    • @madfinntech
      @madfinntech 5 years ago +16

      1440p, not 2K.

    • @Luredreier
      @Luredreier 5 years ago +1

      +Seko Us
      What frames were you looking at?
      The one where they paused just after a change of frame (the only one where I genuinely saw it being 2k instead of 4k, as the NN hadn't had time to catch up with the frame change yet)?
      Because other than those frames, I didn't notice anything that looked any worse than real 4k.

    • @tepkisiz
      @tepkisiz 5 years ago +2

      1:58, 16:30, and all the other static side-by-side comparisons. Check the ground and other static areas in the background, and watch at 4K; the lower resolution will be more obvious in native 4K, as YouTube is already compressing the image.

    • @Luredreier
      @Luredreier 5 years ago +4

      +Seko Us
      Alright, I'll admit that 1:58 *does* look better with 4k TAA, but I doubt I'd notice that in a moving picture.
      16:30 looks about equal to me between the two.
      And in several other parts of the video DLSS actually looks *better* to me.
      But part of the point here isn't whether a 2k picture turned into 4k with DLSS or a native 4k TAA picture looks better,
      but whether it looks better than the 2k picture it started from, or close enough to the 4k TAA to accept the disadvantages.
      The idea being that you can buy a 4k monitor and play games on it with a GPU designed for 2k resolution, or keep playing at 4k with a 4k-capable GPU long after the demands of games make native 4k impossible.
      So no, I don't think it's stupid to compare it with 4k TAA, as you *can* see several cases where it *does* look better.

    • @jakub7244
      @jakub7244 5 years ago

      Check the car details in the comparison images; DLSS just washes everything out like you did it in Photoshop in your first year of college. You get sharp edges but textures that look like stretched 720p... but if they can improve this it's gonna be nice, though still far from native 4k.

  • @MarioManTV
    @MarioManTV 5 years ago

    Seeing only predetermined benchmarks running with DLSS is a bit worrying. Neural networks work best on inputs they've seen before. It's when they see new data and handle it properly that we actually see any real "learning." Prerendered footage running with this technique is akin to giving the RTX card the answers to the test in advance (see the sketch below).
    I look forward to a wider release so we can look at actual gameplay footage in action.
    P.S. Would it be possible to get a more dynamic DLSS scene if you used Ansel? The FFXV benchmark has Ansel freecam support, so if DLSS doesn't get disabled there, you should be able to push DLSS a bit harder by taking it off the beaten path.
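
    The "answers to the test" worry is the classic train/test generalization gap. A toy Python sketch of the same effect, with a deliberately overfit polynomial standing in for a network graded on footage it trained on (purely illustrative, not DLSS's actual network):

      import numpy as np

      rng = np.random.default_rng(0)
      x_train = rng.uniform(-1, 1, 12)
      y_train = np.sin(3 * x_train) + rng.normal(0, 0.05, x_train.size)

      # Enough polynomial degrees to effectively memorize the 12 samples.
      coeffs = np.polyfit(x_train, y_train, deg=11)

      x_new = rng.uniform(-1, 1, 1000)  # inputs the model has never seen
      err_seen = np.abs(np.polyval(coeffs, x_train) - np.sin(3 * x_train)).mean()
      err_new = np.abs(np.polyval(coeffs, x_new) - np.sin(3 * x_new)).mean()
      print(err_seen, err_new)  # error on unseen inputs is typically far larger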

  • @jonathondipalma2203
    @jonathondipalma2203 5 years ago

    Was waiting for this xx

  • @CharcharoExplorer
    @CharcharoExplorer 5 years ago +108

    The RTX 3000 series will be a much better product at 7nm. Especially if gamers do not vote for these prices and instead wait. But they are spineless and do not make long-term decisions, who am I kidding lol?

    • @wile123456
      @wile123456 5 years ago +8

      Yes, sadly most gamers nowadays are just sheep buying brand names like it's an iPhone or clothing. Most of them don't even watch these videos. Nvidia knows its market and is abusing it to the fullest extent, making these insane price hikes while people still make the cards sell out. It's hard to keep faith in this industry when these buyers are screwing over everyone else.

    • @ToothlessSnakeable
      @ToothlessSnakeable 5 years ago +1

      It won't be 7nm; apparently AMD is getting priority on that from TSMC

    • @bitscorpion4687
      @bitscorpion4687 5 years ago

      10nm Samsung in 2020 is what you should expect...

    • @cremvursti
      @cremvursti 5 years ago +4

      At $1200 I'm pretty sure not many will pick one up. Nvidia isn't Apple and the RTX 2080 Ti isn't the iPhone X, so I'm pretty sure they won't sell like crazy. Nvidia probably already knows this, and pricing the new cards so steeply suggests they don't even expect to sell as many; this generation is more like a minor one, the middle step between two bigger generations. That means the bulk of sales will still come from the GTX 10xx generation, at least until AMD releases new cards as well, at which point we might see some changes in pricing policies.

    • @madfinntech
      @madfinntech 5 years ago

      You don't say! They aren't out now though.

  • @DerMitWieWoOhneNamen
    @DerMitWieWoOhneNamen 5 years ago +36

    1:58 The DLSS image is extremely blurry compared to the traditional method. That's not closer to 4k, it's farther away from a higher resolution... Look at the hair, for example.

    • @wile123456
      @wile123456 5 years ago +31

      Well, that's because they paused at a frame change, so it doesn't have a previous frame to use for reconstruction (it's also why TAA doesn't remove the edges on the right either; see the sketch below).
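
      A minimal Python sketch of why the first frame after a cut looks raw, assuming a simple exponential history blend (illustrative only; real TAA/DLSS also reproject the history with motion vectors, which this omits):

        def temporal_accumulate(history, current, alpha=0.1):
            # Scene cut: no valid history, so the output is just the raw,
            # unreconstructed new frame -- the frame reviewers pause on.
            if history is None:
                return list(current)
            # Otherwise blend the new frame into the accumulated history;
            # over a stable scene this converges over several frames.
            return [(1.0 - alpha) * h + alpha * c for h, c in zip(history, current)]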

    • @erdincylmaz4529
      @erdincylmaz4529 5 years ago +7

      bla bla bla, dlss is shit, nice Nvidia ass-licking session in this video.

    • @bitscorpion4687
      @bitscorpion4687 5 years ago +8

      5:47 seems pretty good to me

    • @Luredreier
      @Luredreier 5 years ago +9

      +Monchi AwesomeSauce
      I'm an AMD fan and I'm probably *never* going to buy an Nvidia card.
      However, give credit where credit is due.
      DLSS is a *good* technology that I'm looking forward to AMD implementing themselves.
      Sure, it's not perfect; the first frame after a major change like the one they paused at is at a lower resolution, but quite frankly without such a pause you're *never* going to notice that.
      And a 40% performance uplift compared to regular AA at a given resolution is a no-brainer.
      I'm going to recommend that friends of mine who *are* Nvidia users (and fans) buy something like this (or possibly the next generation) to benefit from this technology, as I think this is sort of Nvidia's equivalent of the first GCN processors on our side.
      An HD 7970 GHz Edition is still a good GPU today because of all its forward-looking technology; many of those features were not used until only a couple of years ago.
      But yes, a GPU from June 22nd, 2012 can *still* be used to game at 1080p in 2018.
      That's impressive.
      And I think this thing might be something equivalent on the Nvidia side, although it might be worth waiting for the second-gen card with this tech for Nvidia fans.

    • @hugh31098
      @hugh31098 5 years ago +9

      Have you watched the whole video? The hair with TAA looks sharper because there are a lot more artifacts.
      DLSS definitely handles aliasing better, even compared to such a blurry technique as Temporal Anti-Aliasing. And it also gives extra performance.

  • @user-dz3ph7dl4m
    @user-dz3ph7dl4m 5 years ago +1

    the reason nVidia demonstrated DLSS vs TAA is that TAA introduces blurring / softening. that lower DLSS pixel count will be less noticeable when side by side with TAA... when available, please compare to native 4k, DigitalFoundry. while TAA might look nice at 1080p, it defeats the purpose of applying it when rendering at 4k imo.

  • @raytracemusic
    @raytracemusic 5 years ago

    has there been any talk of DLSS being provided in some way outside of gaming (e.g. maybe as a CUDA-accelerated plugin for image editing programs)? - sorry, Tensor or whatever they're called :p

  • @DJHeroMasta
    @DJHeroMasta 5 years ago +5

    *This* Is What I'm 2nd Most Excited About With RTX Cards.... *DLSS* 😍.

  • @Solidouz
    @Solidouz 5 years ago +15

    Man, you guys are damage controlling so hard for the 2080. The 2080 is a bad product, no saving it. I canceled my 2080 preorder and got a 1080 Ti for much less. Maybe next iteration. Or a 2080 Ti at MSRP will be worth it.

    • @alonofisrael4802
      @alonofisrael4802 5 years ago

      YES YES YES DAMAGE CONTROL

    • @youtuberobbedmeofmyname
      @youtuberobbedmeofmyname 5 years ago

      @@alonofisrael4802 it is. lol this channel is bass ackwards

    • @glorious240fps6
      @glorious240fps6 5 years ago

      @@alonofisrael4802 lol braindead AMD fanboy.

    • @glorious240fps6
      @glorious240fps6 5 years ago

      @@youtuberobbedmeofmyname brainless AMD fanboy crying and accusing a channel of being biased, lol.

    • @glorious240fps6
      @glorious240fps6 5 years ago

      @Solidouz Snake lolz, so now telling the truth is damage controlling?? Ok, brainwashed idiot. Not everything about RTX cards is bad; the only bad thing is the price. Other than that, it's a kickass card with great features, and it's sad to see those features absent at launch. Only one game has ray tracing and DLSS is MIA. But these features are actually good; AMD's CEO herself said so and praised Nvidia for being early adopters. AMD is also joining the party.
      You are clearly brainwashed by AMD shill channels that made you think the RTX 2080 is just bad and there's no saving it.

  • @BUDA20
    @BUDA20 5 years ago

    Low pixel crawl is visible in the DLSS Infiltrator demo at 16:53, on the metal structure on the building above the statues.

  • @d0jima
    @d0jima 5 years ago

    ahaha, the air horn during the plug...