GeForce RTX Unboxing + Turing Performance Technologies Explained!
- Published 13 Sep 2018
- Rich sits down quickly to unbox some Nvidia RTX Founders Edition GPUs and to talk about new enhancements in the Turing architecture, designed to deliver higher performance. We put this video together today rather quickly and unscripted - we have a ton of benchmarking and analysis to carry out, so excuse a slight lack of polish!
Subscribe for more Digital Foundry: bit.ly/DFSubscribe
Join the DF Patreon, support the team and get excellent quality video downloads: www.digitalfoundry.net
1:35 and then I realized I had never seen a YouTuber's socks before.
You clearly haven't seen any of Linus's videos. Over there you also get sandals to go with them
You've never watched a kiss/lick/touch my body challenge?
@@nowonmetube what
You-Tube socks, indeed...
Head over to Super Bunny Hop and get yourself some George Socks.
2 kidneys is overrated anyway
I heard *liver* is in high demand on the black market.
Gross morons
are you 12 or something?
ROFL!!! I know right! 😜😜😜
True! Paying so much money to play videogames seems silly
Basically, tech noobs in the comments will have to eat their words after release.
People complaining about the prices, when the 2080 (£715) is set to perform on par with the Titan XP (£1,500) and the 2080 Ti (£1,050) is set to perform on par with the Titan V (£3,000).
Basically, people wanted a faster horse and NVIDIA gave them a car and the people have no idea how it works or what to do with it. In the long run it will be seen as the breakthrough that it is but for now everyone is understandably cautious.
Totally agree. The fact Ray Tracing is even remotely possible in 1080p 60fps is a miracle. This kind of stuff will make games look more like movies.
Indeed. I can imagine seeing indie games looking amazing with Ray tracing. It may be a game changer with this tool.
@Iguana Divergent
Well all we saw in the demos was selective use to make reflections look nice. None of them lit the whole scene with rays. The mainstream engines don't exist with that feature available yet.
I remember a real-time ray tracing demo doing the rounds on YT 3 or 4 years ago. It was very noisy in motion and then took a couple of seconds to render the scene at full "resolution" once the scene was static. It was clear then that if the required performance gains could be made, it was possible. Looks like Nvidia have managed to find most of those gains, along with a way to clear up the remaining noise nicely.
5 years later and you were totally right! Now Intel, AMD, Apple, Sony and Microsoft are all looking to do what Nvidia did in 2018 with Turing on their own hardware. The PS5 Pro is the latest example.
I don't mind the slight lack of Polish; I'm happy enough with content being in English
Why would Richard speak in Polish tho.
nevermind, I just woke up.
Nie Kurwa?! 😭
I shall Finish this comment quickly
Nice video! I'm very excited to see real benchmarks! :D
Educational video with some good info about the architecture. Keep up the good work.
In the Cyberpunk 2077 trailer, I've noticed there are some low quality shadows around the edge of the screen. I wonder if they have anything to do with the variable rate shading or something similar. Maybe I'm wrong...
Just shadows? Probably simply the allocation of splits. Look up PSSM and think on from there. Essentially, enveloping the camera frustum with each split is really not the only viable way to go about generating each split. As long as all splits together still cover the frustum, the trade-off is that the density of the shadow map can be higher where it matters, but there will be some low-resolution shadows somewhere off to the side.
uhuh yeah i understood some of these words
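The split allocation mentioned above can be sketched in a few lines. This is a toy version of the "practical split scheme" commonly used with PSSM/cascaded shadow maps, blending logarithmic and uniform split distances; the near/far planes and the blend factor here are made-up example values, not engine code.

```python
# Toy sketch of the practical split scheme for cascaded shadow maps:
# each cascade's far distance blends a logarithmic and a uniform split.

def cascade_splits(near, far, num_cascades, lam=0.75):
    """Return the far distance of each cascade, nearest first."""
    splits = []
    for i in range(1, num_cascades + 1):
        f = i / num_cascades
        log_split = near * (far / near) ** f   # logarithmic term: dense up close
        uni_split = near + (far - near) * f    # uniform term: even coverage
        splits.append(lam * log_split + (1 - lam) * uni_split)
    return splits

# Example: 4 cascades covering 0.1 to 100 units.
print(cascade_splits(0.1, 100.0, 4))
```

Raising `lam` pushes more shadow-map resolution toward the camera, which is exactly the kind of trade-off that leaves coarser shadows in the distance or off to the side.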
A lot of great information was put into this video. Great job DF!
IMO 10 series is still pretty relevant and will remain for a lot more time
And that is exactly what Nvidia wants your opinion, and many other people's, to be... After all, how else were they going to get rid of that backstock of 500 to 800 thousand 10-series cards with the new hotness hitting? Looks like overpricing the RTX cards (which they can get away with thanks to the lack of competition) is working out nicely for that goal.
Depends on what your definition of "a lot more time" is. If you mean at least another 2 years, then yes, but no more than that.
My 980ti is 10% slower than a 1080 with an overclock. It's got a year or 2 left in it. Every card is relevant, it just depends on the individual.
I just picked up a 1080 Ti, so I hope so! :)
EvilPotato I feel like everything above Gtx 970 is still pretty relevant.
Fascinating can’t wait to be doing a build in which I can consider this
I appreciate you guys doing more content like this ASMR unboxing. Looking forward to more.
Give it about 6 to 12 months before we see any difference. I'm still a bit unsure how to take this tech; right now it's new and Nvidia are keeping too much under wraps before its release. We now need to see what AMD have on the cards.
2:45 I can't believe that nasty yellow fly gets to touch an RTX before I do !
Can I plug my Switch into the USB-C port of the card to get RTX in Zelda?
lol
The real questions
Only if you also have the $20 RTX Season Pass and the $30 RTX Amiibo Pack.
Yes
That would be the dream, having a new Switch Dock that connects to an external graphics card
Are you guys going to do an MCC analysis on Xbox One X?
A what? :)
T. B. The Master Chief Collection got a 4k update :)
@@richarddaronco1018 it got a lot more than that :)
Yesssssss, feel like I've been waiting for ages bahahaha.
Hardly anybody owns the X, why cater to such a small audience? psvr has sold more ffs
Genuinely interested and intrigued. Can't wait for actual tests...
Okay, sooo optimizing for ray traced graphics is very different from optimizing for rasterized graphics. One of the weird quirks of ray traced graphics is that using alpha in an image to give foliage detail can actually be less efficient than using individual polygons for the individual leaves. I'm interested to see what other techniques arise from a future in ray tracing [which I hope to see become the norm].
I'm just talking about ray tracing in general. I play around a ton with blender cycles and eevee. I'm assuming that other raytrace focused cards from NVidia's competitors will probably have a slightly different feature set than the 2080s. Still, LOD like the 2080s offer is something to drool over.
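The foliage point above can be made concrete with a toy cost model (this is illustrative pseudologic, not real ray tracing code): a ray crossing a bush made of stacked alpha-tested cards has to evaluate the texture at every crossing before traversal can continue, whereas opaque leaf polygons let traversal stop at the first hit. The functions and numbers are assumptions for illustration only.

```python
# Toy cost model: any-hit evaluations for alpha-tested cards vs opaque geometry.

def alpha_tested_cost(hits):
    """hits: for each card the ray crosses, whether the sampled texel is opaque.
    Every crossing costs one any-hit evaluation; traversal only stops
    once an opaque texel is actually found."""
    cost = 0
    for opaque in hits:
        cost += 1
        if opaque:
            break
    return cost

def opaque_geometry_cost(hit_anything):
    """Opaque triangles need no shader work during traversal:
    the closest hit terminates the ray immediately."""
    return 1 if hit_anything else 0

# A ray grazing a bush of 8 cards, transparent everywhere it samples:
print(alpha_tested_cost([False] * 8))   # 8 evaluations
print(opaque_geometry_cost(True))       # 1 evaluation
```

This is why modeling leaves as real polygons, counterintuitively, can beat alpha cards once rays are involved.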
Did anyone else notice the little bollox perched on the top of the card at 2:46?
People keep saying how heavy the cards are. Does it come with support bracket?
Will draw calls automatically be much better with RTX, or does it have to be specifically optimized? Usually draw calls are an issue in MMOs with tons of players and objects on screen.
That graph @5:20 makes me want to see an RTX 2080 Ti vs Titan V comparison sooo badly - how much the new tech is really worth vs the usual high CUDA core count. Also, I'd love to see a 1080p + DLSS vs native 4K comparison, especially in a 2080 vs 1080 Ti fight.
4K is only going to get more and more performance intensive as time goes on. If DLSS doesn't sacrifice that much image quality then I can see that as a huge leap. It would make those 4K/144Hz monitors seem viable in a few short years instead of ten.
I think the whole purpose of RTX GPUs is to run games at 1080p, where they are able to handle RT @60FPS, and then boost image quality with DLSS close to native 4K (like CBR on consoles do).
Looking forward to those performance metrics, your tear downs of the performance is super interesting
Does anyone know if the RAM stacks with NVLink? Like, will 2x NVLink RTX 2080 Ti have 22GB of VRAM at its disposal? I have tried to search this question but so far I haven't gotten a concrete answer...
Can you test the Davinci Resolve Denoising 4K performance with NVlinked cards?
Richard Leadbetter, did you use to write for magazines like Mean Machines and C+VG back in the day? :)
@DigitalFoundry
That's great that even with fewer CUDA cores, each CUDA core performs about 50% better.
So it's a better card overall.
So maybe now, we'll be seeing things like the GTX 1080 Ti being tweaked to come in a smaller form factor, less power requirement and lower prices?
The only unboxing video that's worth watching.
I really want to see this variable rate shading, and how it could work on older titles.
variable rate shading + tobii eye tracking to fully render where you are looking at a given moment
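The eye-tracking idea above is essentially foveated rendering: shade coarsely wherever the player isn't looking. Here is a hypothetical sketch of picking a per-tile shading rate from distance to the gaze point; the tile coordinates, radii and rate values are made-up assumptions, not any real VRS API.

```python
# Hypothetical gaze-driven shading-rate selection for screen tiles.
import math

def shading_rate(tile_center, gaze, r_inner=200.0, r_outer=500.0):
    """Return shaded samples per pixel: 1.0 = full rate,
    0.25 = one shade per 2x2 block, 0.0625 = one per 4x4 block."""
    d = math.dist(tile_center, gaze)  # pixel distance from gaze point
    if d < r_inner:
        return 1.0       # fovea: full detail
    if d < r_outer:
        return 0.25      # near periphery: 2x2 coarse shading
    return 0.0625        # far periphery: 4x4 coarse shading

gaze = (960, 540)  # player looking at the centre of a 1080p screen
print(shading_rate((960, 540), gaze))    # 1.0
print(shading_rate((1500, 540), gaze))   # 0.0625
```

Hardware VRS exposes a small fixed set of rates like these per screen tile, so the savings come almost for free once you know where the eye is.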
No, it's not a waste of money, because this is changing the way games can be designed graphically. Yes, it only works at 1080p with less than 60fps, but you need to understand that ray tracing is being rendered in REAL TIME, which can make your gaming experience more competitive through reflections and the shadows of the opposing team or AI. In R6 I use shadows a lot when playing: when an enemy in the light is nearby I can see his shadow move and base my response on that. With ray tracing, if an enemy is in front of a mirror and you can see the reflection, you can see where he's looking and attack. Stop complaining, it's still first gen! When the second gen releases it'll be much more optimized, and hopefully game designers will optimize their games for it.
It's right there and we can't benchmark it yet. How can you resist lool.
sooo... when will we get the benchmarks?
Can Digital Foundry please answer whether NVLink will allow GPU RAM stacking with a multi-RTX GPU setup? E.g. RTX 2080 Ti x2 in NVLink = 11+11 = 22GB VRAM?
i'm not him but that's right
@@Naburish That would make these cards video/3D rendering beasts if true!
Very curious to see what the numbers are going to be.
DLSS is the most interesting thing about these cards to me. But to gain 2x performance it must be running half resolution of what they are comparing it to right? I’m very interested to see how picture quality stands up against the full res comparison and TAA.
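The "half resolution" reasoning above can be sanity-checked with simple arithmetic: if shading cost scales roughly with pixel count, a ~2x speedup implies rendering well under half the pixels of native 4K before upscaling. The internal resolution below is a commonly guessed figure, not a claim about what DLSS actually renders.

```python
# Back-of-envelope pixel-count check for the DLSS speedup claim.

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)   # 8,294,400 pixels
internal = pixels(2560, 1440)    # assumed internal render resolution

print(native_4k / internal)  # 2.25x fewer pixels to shade
```

So a 2x gain is plausible from resolution alone; the open question, as the comment says, is how the reconstructed image holds up against native 4K with TAA.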
What is the name of the song playing in the background of this video ? I think i have heard this song in one of the gran turismo trailers.
Do current games even use GPUs for integer? I was under the impression that they were pretty exclusively floating point, and that game engines use the CPU for integer. Should we care about better integer on GPU?
Is there any potential of dlss being trained on video content? I would love to upscale some of this 1080p nonsense I've got laying about :)
9:47 what hardware?
no benchmarks. Dont waste your time
No shit, the embargo date is Sept 19; the only thing lifting today is unboxing and, more importantly, architecture details.
Based brainlet for not wanting to see what's under the hood and only wanting to see raw numbers
I wish we'd got more on the architecture of the new RT cores and Tensor cores :(
no drivers. don’t waste your time.
Siana Gearz check anandtech or nvidias Turing whitepaper.
Do you think it will be good for video conversion (DVDFab etc.)?
My question: is an 8GB memory buffer enough for future games at 4K? I was hoping for at least 10GB of memory, so I'm disappointed...
Great video explaining the new architecture
Can't wait to get my hands on mine!
iPhone Xs or RTX 2080ti?
Which one would you get if you only had $999?
I'm all for the RTX. I understand how this ray tracing can affect the industry. This technology will be available not only to big companies but also to indie studios. This could also streamline development. The only thing I just can't agree on is the price. I think we need AMD to step up their game and shake up Nvidia's bold pricing.
Turn the flashing lights off on the router behind the TV by using the button on the back of it.
Hi, please advise: will a 650-watt PSU be enough for an RTX 2080 + i7 8700?
Depends on the rest of your components to some extent, but probably yes. I've got an 8700k @5ghz with an overclocked 1080ti, at peak load it pulls around 460W from the psu.
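The reply above can be backed up with a rough headroom estimate. The RTX 2080's ~215W and the i7-8700's ~65W are approximate public TDP figures (an overclocked CPU pulls considerably more, as the 460W peak quoted above shows); the remaining component draws are assumptions.

```python
# Rough PSU load estimate for an RTX 2080 + i7-8700 build.

def psu_headroom(psu_watts, parts):
    """Sum estimated component draws and return (headroom, total load)."""
    load = sum(parts.values())
    return psu_watts - load, load

parts = {
    "rtx_2080": 215,             # approximate board power
    "i7_8700": 65,               # stock TDP; overclocks pull far more
    "motherboard_ram_fans": 60,  # assumed
    "drives_usb_misc": 40,       # assumed
}
headroom, load = psu_headroom(650, parts)
print(load, headroom)  # 380 270
```

Even with generous margins for boost spikes and overclocking, a quality 650W unit leaves comfortable headroom for this pairing.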
I still haven't heard anyone talk about Nvidia denoising. The Tensor core is downplayed as some edge-case device that won't be utilized, but I think this is incorrect. We still don't have enough rays to be able to do true real-time ray tracing at $1,200 (maybe the RTX 8000 at $10,000 could). To achieve a good compromise we need to limit the ray tracing to a single sample and use a trained deep learning algorithm to denoise the image. Even then, use could be limited to rays for culling, rays for shadows, reflections etc.
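The single-sample-plus-denoise idea above can be illustrated with a toy experiment: a 1-sample estimate is unbiased but noisy, and even a dumb spatial filter trades noise for blur. Real denoisers (including the learned ones that would run on Tensor cores) are far smarter; this is synthetic data, not a renderer.

```python
# Toy 1D denoising demo: box-filtering noisy per-pixel radiance estimates.
import random

random.seed(0)
true_value = 0.5  # the radiance a converged render would produce
# one noisy "ray-traced" sample per pixel along a flat surface
noisy = [true_value + random.uniform(-0.4, 0.4) for _ in range(100)]

def box_filter(img, radius=2):
    """Average each pixel with its neighbours within `radius`."""
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

def mean_abs_error(img):
    return sum(abs(v - true_value) for v in img) / len(img)

filtered = box_filter(noisy)
print(mean_abs_error(noisy), mean_abs_error(filtered))
```

Averaging neighbours cuts the error substantially on flat regions; the hard part, which the learned denoisers tackle, is doing that without smearing edges and detail.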
Will my i5-8600k work fine with the 2080?
what do you think richard should i buy .
Absolute unit. In awe at the size of this lad
2:46 the tiny insect: "god bless this graphic card!"
8:20 wouldn't it make sense to get your deep learning algorithm to practice on material from super high res footage of the same game?
Big up the DF mandem! Respect to the OG Richard. Spendin my fridays watching DF vids whilst snorting ket of a bints arse crack. Bless up yourself.
that's how it's done ;) my man !
Big Friday lol
What does this mean
wait is DLSS rendering the image at a lower resolution?
Yep, it's just a form of ai upscaling.
Anyone know if there'll be MXM version of these cards?
Isn’t DLSSAA using deep learning to identify what part of the image is needed to be supersampled to look psychovisually similar to SSAA, hence the performance increase? From the video it came through as some kind of AI extrapolation.
The new technology in these cards is going to be amazing. It's not only the first generation of cards with ray-tracing, but it brings to the table significant improvements in performance per CUDA core, expands performance gains with other techniques (AI - DLSS and psychovisual redundancy - Adaptive Rendering). Great stuff, because that means that those performance gains don't require more transistors and extra power consumption/cooling power/noise.
The fact that these cards are 'only' 35-40% faster than Pascal on an apples-to-apples comparison may seem a tad small for the price, but that's the price you pay for AMD f*cking up with Vega and Nvidia having to make do with 12nm and huge die sizes. They'll probably release a 7nm refresh next year and then the performance will improve even further and price will probably come down.
Very excited about this GPU line. I'd buy one if there was a way for it to drive FreeSync.
2:45 - A serious GPU enthusiast
I had no idea so much technology went into Rooster Teeth Expo.
I'll see myself out.
Mesh Shading + Variable Rate Shading + Content Adaptive Shading + Motion Adaptive Shading = the usual epic pop-ins on a blurry layer with a thin stuttery coating and glorious lag on top.
Sooo when can we see the review? No one is mentioning when that embargo ends. The one in this video said September 14th, which is today, so that's obviously just for this video.
19th of this month
How much does a PC cost that can play AAA games at native 4K/Ultra settings/60FPS?
What kind of benchmarks don't show fps?
People forget that power is nothing without the architecture to manage it. The upcoming results will be interesting and not as straight cut as a percentage sign.
Awesome video, but please consider toning down the background music.
So DLSS is basically just interpolation, right? Did I understand that right?
I am also confused about this. Will this create more artifacting than the traditional TAA?
Well... to really make it blow up the lower value cards ought to have all these features as well.
I am guessing part of the NDAs has lifted, as my YouTube feed is filled with RTX vids... but unfortunately not the information I would like... though not that it matters, as I could not afford one.
2:44 dude is that a mosquito on the card? Lmao
3:30 to see Rich deathgrip the cards delicate gold bits
literally love the design
Does anybody know if these cards will work in an Alienware Graphics Amplifier?
It might be horribly bandwidth limited. Last I checked the connection for those was still PCIe x4, or have they improved it?
I didn't watch any other tech channels for info on the new cards, DF opinion is enough for me.
I love that this new tech is in the new consumer cards. Ray tracing will be huge for all developers soon. That said, I'm sticking with Pascal until the 2100 series, when the kinks are worked out and more games are using RT.
This is some of the best 20-series coverage out there right now.
I need a new GPU, and I think I'm gonna wait until AMD puts out its new line and it gets reviewed on DF; then, comparing the best of the two brands, I'll make my move. Until then... saving, saving and saving :D
Say what you want about the 20 series cards, but they're bringing some exciting technological advancements with them. Like everything, they've got to start somewhere and usually the first go has some flaws. Which can only be improved from here on out!
That being said, I'm definitely sticking with my 1080ti. There's no reason to upgrade at all.
that gpu looks really rigid and sturdy. gj
Next time you should benchmark the box and provide an in depth analysis of the box art.
Voice volume is too low compared to the background music
“This is quite a unit.” Come on man, this is too easy! Lmao
Awesome video as always, Richard. I think the new generation of consoles is going to make a huge leap in graphical fidelity; all these new technologies need time to settle down, and when the PS5 comes I think it is going to make a real difference.
Well those will be AMD, this is Nvidia. Hard to say where AMD is comparatively atm
It seems very unlikely that the next generation of consoles will have any kind of hardware-based ray tracing abilities, unless GPU technology takes a major leap in performance by 2020, and I don't see that happening. At least not for the type of off-the-shelf, affordable technology that the consoles will have inside them.
And that's okay imo. Ray tracing seems underwhelming currently and developers need to find new techniques and years of development to hone in on the technology. They're just getting started.
Playstation bum boy alert
I think you'll probably be disappointed with the leap in graphical performance. If their render target is 3840x2160 a lot of the additional horsepower will be consumed with rendering a native 4K image. Considering that these consoles are likely going to be priced at ~$400 to ensure steady mass adoption, the tech in them will be rather modest. Now, what holds great promise is their CPUs. The Jaguar processor in the PS4 and Xbox 1 is some terrible, tablet-grade APU from 2012-2013. The leap from that to something like a Zen or Zen 2 derivative would be ridiculous. Next gen games may not look much better, but the amount of simulation and complexity they'd be capable of would dwarf current offerings.
I don’t see any opportunity for SLI
Wait a second, they won't even support HDMI 2.1, with features like the non-G-Sync-related variable refresh rate?
Love how he's not wearing shoes. :)
Damn! I used to think my 3dfx voodoo was big back in the day!
2x Voodoo 2 12MB in SLI, that was really big. I still have them.
I had a Voodoo in the 90's and my mate had a PowerVR. That was the golden generation of PC gaming.
3dfx made a card so huge it needed its own external power supply. You had to plug a cable into the back from the wall socket.
I agree 100%. 90s were the golden age of PC gaming. Online gaming was on the rise, file sharing craze forced some to go broadband, modding becoming a community, and etc.
So basically the performance boost over the 1080 and 1080 Ti will vary drastically from game to game. Still not sure if that's worth the money; it's like having a PS4 Pro or an Xbox One X, where you are at the mercy of the developer: if they decide to actually make decent use of the hardware, the game improves; otherwise you just get a slight fps boost with the same shit quality as the base console.
Lol how is that any different than now
It will be about 30% faster than last generation if only brute force is used without any optimization.
@@Klempus lol you're crazy to think that will be gained at 4K
Nick Jaime, time will show, but that is literally what Nvidia is showing in its diagrams - nearly 40fps in the Infiltrator demo at 4K using a 1080 Ti vs nearly 60fps in the same demo at 4K using a 2080 Ti. Engineers from Nvidia are also talking about a 25% to 35% boost in performance. So yeah, I must be crazy.
@@Klempus yeah, Nvidia is saying that and no one else is claiming that much gain. Even the people who say it looks good are skeptical of that much gain. Yes, the optimization of the CUDA cores will help, but gains per game aren't going to be that high. Depending on the development teams we could see a good jump, but 4K is a big GPU hog.
Wow the reference card is gorgeous. I actually prefer this over all the AIB partner cards.
It wasn't that long ago you could build a top-end PC for the price of the Ti. Crazy times. Sticking with my 970 and Xbox One X for 4K for now. Ultra 4K 60fps+ will be out of reach for the foreseeable future for me.
Wichard is back! I love you!
My Developer hands are ready to code!
I'll grab a nice 2nd-hand 1070 once a game is released that is actually worth playing.
edit: probably nothing before Cyberpunk as it seems.
Can't wait to use all that cutting edge tech in those 4 games that will support it. Also, consoles are the base standard for most multiplatform games so as long as they don't use it, we are not gonna see many devs spending time on it.
it just works
specs please
Was anyone else waiting for one of those cards to go flying in a Richard hand motion mishap!?