NVIDIA in 2019: Not enough VRAM
NVIDIA in 2020: Not enough VRAM
NVIDIA in 2021: Not enough VRAM
NVIDIA in 2022: Not enough VRAM
NVIDIA in 2023: Not enough VRAM
NVIDIA in 2024: Not enough VRAM
NVIDIA in 2025: Hello mid-range gamers, how about a brand new GPU with 8GB of VRAM and a sub-10% performance uplift?
The whole point of jamming dedicated ray tracing hardware down everyone's throats and pushing the gaming industry there was to reset the goalposts, so they could tell you you needed to keep upgrading and chasing performance rather than continuing to push raster toward native 4K, high-FPS performance at reasonable prices. So of course they're going to drip-feed performance, so you need to keep buying more.
@@EhNothingdevs asked for ray tracing. We will be able to do really cool stuff with it (with very few rays) once we can assume every GPU has some basic RT support. This will probably take a couple of extra years.
I mean, Steam literally says 50+% of users are still on 1080p. No reason for entry-level GPUs to target anything more. Older 720/768p monitors are disappearing from the market and 1080p is becoming cheap AF; it will still take time for 1440p to become entry level. The actual sad fact is the price of "entry level" cards. The 60-series was meant to be between $200–250, but the RTX 20 series moved that, while the RTX 30 series didn't have any decent entry-level sub-$250 chips.
@@DBTHEPLUG Haha, no. I bought a GTX 1060 and then a GTX 1070. No need for an RTX 2069. But I did upgrade to a second-hand RTX 3060 Ti for EUR 300 and a second-hand RX 6600 XT for EUR 200. I am waiting for the generation that has a mainstream GPU with 80%+ better performance than my RTX 3060 Ti. It's just frustrating to wait for that, generation after generation.
Yeah, RTX in 2018 was basically a scam, but I'm still impressed by how well the five-year-old card runs brand new games. This video talks about a card that is literally one console generation behind; it doesn't do the card justice at all.
@@leandrrob Ray tracing is a scam in general unless you have $1500+ to spend on a 4080 or better. It was meant to push the capability of video cards forward, but the hardware never quite met the basic requirements.
Nvidia: "HUB doesn't focus enough on RT" Tim: "OK, how about we see if you kept the promises your marketing made at launch?" Nvidia: "n-not like that..."
Yeah but imagine comparing a 6 year old card, the weakest card of that generation, against the latest modern RT games, half of which could use better optimization, and expect it to manage. Even the RT core arch has changed.
@@be0wulfmarshallz Mate, those were crap from the get-go, aside from the 2080 Ti, which was actually powerful and, for once, wasn't a massive FU to Titan owners.
@@be0wulfmarshallz There weren't any RTX titles at launch, RTX back then was a "future proof" argument, totally appropriate to take NVIDIA at their word here.
@@be0wulfmarshallz What about things like Watch Dogs and The Witcher 3? Those are older games that still didn't deliver even a 1080p 30FPS experience with RT on AND DLSS helping out.
Not that surprised, though. The GTX/RTX 60 and RX 600 series are the low-end series of GPUs. You'd need to go to the mainstream tier (the 70/700 series) to start to get any decent uplift. This is compounded by the fact that the last few generations of GPUs had extremely poor uplift gen to gen for the same tier, e.g. the 2060 to the 3060 to the 4060, or AMD's 6800 XT to 7800 XT. Typically there should be about a 25% jump between tiers in the same generation and a 50% jump from gen to gen in the same tier. In fact, the only tier that has continuously gotten a normal gen-to-gen uplift (i.e. ~45–50%) is the Nvidia 90 series (previously known as the 80 Ti) enthusiast tier. And with that came a 2x price adjustment (starting with the 1080 Ti to 2080 Ti). Can't wait to see what it is with the 5090... probably 3x the price.
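As a rough illustration of why that stings, here's a back-of-the-envelope sketch; the per-generation percentages are assumptions for illustration, not measured benchmark data.

```python
# Rough back-of-the-envelope sketch (not HUB data): how gen-on-gen uplift
# compounds for a hypothetical 60-class card, comparing the "typical" ~50%
# per generation expectation against a recent ~15% reality.
def compound_uplift(per_gen_gain: float, generations: int) -> float:
    """Total relative performance after N generations of equal gains."""
    return (1.0 + per_gen_gain) ** generations

baseline = 100  # arbitrary performance index for the starting card
for label, gain in [("historical ~50%/gen", 0.50), ("recent ~15%/gen", 0.15)]:
    total = baseline * compound_uplift(gain, 3)  # e.g. 2060 -> 3060 -> 4060 -> 5060
    print(f"{label}: index {total:.0f} after three generations")
# historical ~50%/gen: index 338 after three generations
# recent ~15%/gen:     index 152 after three generations
```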
I said this back when it launched. The 2080ti could barely run ray tracing so the fact that they advertised a 2060 to do it should have been illegal. It's basically lying.
Rofl, "barely run RT"? BS. I own a 2080 Ti; it runs RT, and everything else, just fine. Never had any slowdowns, stutters, or any issues whatsoever, even with the initial launch of Cyberpunk 2077. 😂
Blaming Nvidia makes no sense at all. It's perfectly capable of running RTX games, as demonstrated by Quake 2 RTX, for example. The problem is when you take an obscenely heavy, poorly optimized renderer that has "organically evolved" over 20 years and slap yet another very demanding feature on top of it.
Remember that Tom's Hardware article back in 2018? "Just buy it - when your whole life flashes before your eyes, how much of it do you want to not have ray tracing?" ...yyyeah, about that...
I can already see the 5070 marketing slides:
THE NEW RTX 5070 - A RAY TRACING POWERHOUSE AT AN AFFORDABLE PRICE
Cyberpunk 2077, RT Overdrive ON [|||||||||||] 62 FPS* (1080p, DLSS Performance, DLSS Frame Generation ON)
only $699
Every time I compare ray tracing with traditional rendering, I feel like the performance hit isn't entirely warranted for what is often a minor visual improvement. If there is a meaningful improvement, however, then it's a different story.
I played through a few fully path traced games on a 3080: Half-Life 1, Quake 2, Portal RTX. But even the Quake 2 remaster without RT has better lighting than Q2RTX. I also use RT shadows in WoW because they give a decent visual improvement and there is enough GPU performance for my 90 FPS target (the game is mostly CPU-bound anyway). Other than that, I haven't used RT to a worthwhile extent.
At the point when a behemoth company no longer cares whether a particular segment purchases its products or not, it basically has a license to do whatever it wants.
Well, people still buy them. They're at the point where they can no longer produce enough to meet demand, so the only way they can make more money is by selling at higher prices.
There is. They take up precious die space and they only benefit gamers. Nvidia wants to sell compute and AI performance to professionals who pay "CPU-like margins" for the die space they're getting.
@@andersjjensen I thought the only reason the cards _have_ the "RTX" and tensor cores was as a by-product of their AI development? I.e. it was a way to market existing technology.
@@bricaaron3978 Well, ray tracing when the 20 series came out was basically a rarity... not many games had RT. It wasn't until the 40 series that a lot of games had some kind of ray tracing... and that was years later. Now, should the 2060 be tested on games literally coming out this year? Especially unoptimized ones? Yeah, I dunno if THAT's a fair thing. Nobody today would say "yeah, let's imagine comparing the lowest-end card to video games 4+ years from now, damn, it doesn't do well."
lmao that tom's hardware article was just insane "When you die and your whole life flashes before your eyes, how much of it do you want to not have ray tracing?"
It will always be just out of reach. Monte Carlo rendering is extremely taxing. As soon as hardware progresses, they'll just up the number of rays and bounces that are used. The whole point is that your hardware is never good enough.
@@mryellow6918 We *might* see cards able to run ray tracing without hacks like dithered rendering and temporal effects in 20 years without a significant shift in hardware methods. 4090 is nowhere close. The path tracing in CP2077 is using nowhere close to the amount of rays needed to actually make a scene look good. It is dithered and blurred to make it look somewhat acceptable - as long as there is no movement in the scene. Once there is movement, the dithering and temporal effects need to be recalculated and you get visual noise (which is what the hack of ray reconstruction is trying to solve).
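To put rough numbers on the "never enough rays" point, here is a small illustrative calculation; the sample counts and bounce depths below are assumptions chosen for illustration, not measurements of any particular game or renderer.

```python
# Illustrative ray-budget arithmetic: why "just add more rays" never ends.
# Offline renderers use hundreds to thousands of samples per pixel; real-time
# path tracing gets by on a couple, plus heavy denoising and temporal reuse.
def rays_per_second(width, height, samples_per_pixel, bounces, fps):
    # One primary ray per sample, plus one secondary ray per bounce per sample.
    rays_per_frame = width * height * samples_per_pixel * (1 + bounces)
    return rays_per_frame * fps

realtime = rays_per_second(1920, 1080, 2, 2, 60)      # real-time-style budget (assumed)
offline  = rays_per_second(1920, 1080, 1024, 8, 60)   # "clean image" budget (assumed)
print(f"real-time-ish budget: {realtime / 1e9:.1f} Grays/s")        # ~0.7 Grays/s
print(f"offline-quality at 60 fps: {offline / 1e9:.0f} Grays/s")    # ~1100 Grays/s
```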
It won't. If everything else stopped getting better, i.e. resolution, texture quality and fps, then yes, RT would catch up. On the plus side, in a few gens you should be able to come back to some old games and get decent performance.
@@DSP_Visuals ding ding ding, tell the man what he's won. The whole point of chasing ray tracing is to tell you you always need to be upgrading and your performance is always bad. Just ignore it entirely and buy what offers the best performance for the price for raster and you'll be golden.
In fairness, we all knew the first gen of RTX was meant to pay for future GPU improvements, but I suspect many Nvidia devotees didn't expect them to keep bringing out products without enough VRAM. Forget the 2060; charging $400+ for an 8GB 4060 Ti is just scamming people.
Well, they asked 600 dollars for an RTX 3070 Ti, and that also had 8GB and was only good for 1080p rasterized gaming or 1440p with DLSS. It can't do ray tracing. I feel sorry for anyone who bought that card; at least with the 4060 Ti, reviewers warned us not to buy it.
I ran a quick test in UE 5.5 on an RTX 2060: hardware RT via MegaLights can double the framerate vs. zero RT, but the shadows are much noisier (although they also look more correct, especially in areas with overlapping light from multiple lights). MegaLights is apparently unusable on a GTX 1060 without hardware RT. In Blender there's also a noticeable jump in speed when using the RT cores, even on a 2060. So the feature does have real value in some cases, despite not being worth using in the majority of current games.
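For the Blender part of that comment, here is a minimal sketch, from memory of Blender's Python API, of the switch that actually engages the RT cores: pointing Cycles at the OptiX backend instead of CUDA. Treat the attribute names as assumptions and verify against your Blender version.

```python
# Minimal sketch (Blender Python console): enable the OptiX backend so Cycles
# uses the RT cores on an RTX card instead of the plain CUDA path.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"      # RT-core accelerated backend; "CUDA" would skip them
prefs.get_devices()                      # refresh the detected device list

for device in prefs.devices:
    device.use = (device.type == "OPTIX")  # enable only the OptiX (RTX) device(s)

bpy.context.scene.cycles.device = "GPU"  # render the current scene on the GPU
```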
I have a Liquid Devil 6800 XT and that usually means my options are 4K native high/max settings w/ RT disabled, 1440p with low RT and FSR quality/balanced, or 1080p with high/max RT. I almost universally prefer the higher resolution with no RT for both visuals and performance.
@@K31TH3R This is what I call "eyelash rendering": stuff where the engineers and devs focus on things that make very little actual difference to gamers and gameplay but are technologically "impressive." Like, if they spent a huge chunk of cash and time working out how to render photorealistic eyelashes, you are... never gonna notice. Is RT nice to look at? Sure. But if it quarters my frame rate, melts my GPU, and necessitates playing at resolutions I haven't touched since Gotye had a hit, then no.
It's well worth it. I would take 30 fps with RT if it's implemented well over 60 fps any day. Any higher fps would be a no brainer. 50 fps instead of 100 fps is an even easier deal to make.
Developers and game engines got so good at shaders and other techniques such that even after 5-6 years of raytracing being around - I don't feel like I'm missing anything by not using it.
I do miss shadows though: the ugly cascades and lack of microdetail. I'm a fan of UE5's VSM. But Lumen is worse than baked lighting because denoisers are awful, except Nvidia's proprietary ray reconstruction denoiser. And having nice GI like Metro is always great for immersion, but when it comes to the cost of trading fps for looks, it's always fumbled. Six years in and RT is sort of an afterthought. But Metro and Avatar have proven RT can be amazing, if, IF a dev builds the game around it from the ground up.
Not quite correct, since even a 4090 struggles at a 1080p internal render in some games, but it should be able to keep fps above 60 in Wukong without resorting to DLSS Performance.
@@CrazySerb As a 4090 owner myself, I have yet to play a game needing DLSS Performance. What "struggle" means to you vs. someone else, who knows. Many seem to have variable definitions, where 60fps for some cards = good, and 60fps for other cards = struggles. One thing it all has in common is that it often comes from internet experts with zero experience, and/or social media analysts.
The idea of turning every other visual setting down to barely on and enabling ray tracing so we can say ray tracing works is patently absurd, as is the whole ray tracing scam. A single "effect" that costs even the best GPUs 1/3 or more of their performance is just silly. Turning on fake resolution and fake frames to make up for it is even sillier. RAISE SHIELDS!
You are clearly misinformed. Ray tracing isn't an effect; it is a completely new way of rendering. It takes about 60 to 1000 times longer and uses at least 4x the memory when done on a CPU compared to classic rendering. The fact that Nvidia was able to implement it with barely a 30% perf drop (rather than a 30x perf drop) is extremely impressive from a technical point of view. And guess what, nobody is asking you to turn on RT or ultra settings. You can enjoy your games on medium. If you are playing games only because of ultra, and they don't entertain you on medium, I pity you; you need to find new games, because you are just wasting your time with the ones you are currently playing.
@@igelbofh Note that I put the word _effect_ in quotation marks, indicating to the literate that the writer knows the term is not perfectly accurate. No matter how impressive it may be that Nvidia has an RT implementation that only costs 30%, that's still too much. When using RT becomes the standard -- always on, this is how we do it, sucker -- and everyone suffers a 30% hit all the time, we'll all regret allowing Nvidia to push it so hard just so they could 1) wring more money out of us and 2) beat AMD at something relevant to gamers. As for playing the wrong games, I suggest that if you have to go to medium settings to achieve decent FPS, you might be the one to whom the concept applies. If you're buying a high-end display and a high-end graphics card to play on medium and get more FPS, that's your choice. But why do you think the higher settings are there? The answer is easy -- to give us options. I can play at ultra and get 60-90FPS; you can play at medium and get over 144FPS. We can both enjoy the game as we see fit. At least, that is, until RT replaces rasterization. Then we're all screwed.
@@rangersmith4652 I am still playing on a GTX 960 and a 60Hz 1920x1200 projector at 260". And everything I play, including first-person shooters, is fine (with occasional dips to 45fps). And sometimes I'm even in the mood to play Jagged Alliance from the mid-90s. Given the limitations of the human visual system, one is wasting money on more than 1080p at less than 27", at more than 4 ft away from the display, at more than 90fps, unless you are a professional competitive gamer and it's how you make your income. What most people are made to believe they should buy is a complete marketing demand-gen psyop. What you should pay for, though, is viewing angles above 140 degrees, a colour dE of less than 1.2 and at least 150% sRGB coverage, and perhaps a curved monitor if you are going above 30". Plus a colorimeter for calibrating in your environment. RT is not going to replace rasterization for technical reasons any time soon; if that happens it will be another demand-gen psyop. Even if we stop writing new raster rendering engines today, the ones we have are really well optimized and further optimization is nigh impossible. On the Ngreedia side, agreed.
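For anyone wanting to sanity-check the viewing-distance claim, the standard pixels-per-degree formula gives a rough idea; the screen widths below are approximate 16:9 values and ~60 PPD is the commonly quoted "can't resolve individual pixels" threshold, so treat the exact numbers as ballpark only.

```python
# Quick pixels-per-degree (PPD) sanity check of the "1080p at 27 inches from
# 4 feet is enough" claim. Higher PPD than ~60 means most people can't resolve
# the individual pixels at that distance.
import math

def pixels_per_degree(h_pixels, screen_width_in, distance_in):
    # Horizontal field of view subtended by the screen, in degrees.
    fov_deg = 2 * math.degrees(math.atan(screen_width_in / (2 * distance_in)))
    return h_pixels / fov_deg

# A 27" 16:9 panel is roughly 23.5" wide; 4 ft viewing distance = 48".
print(f'1080p @ 27" from 4 ft: {pixels_per_degree(1920, 23.5, 48):.0f} PPD')  # ~70
print(f'1440p @ 27" from 4 ft: {pixels_per_degree(2560, 23.5, 48):.0f} PPD')  # ~93
```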
I feel like the 3060 12GB will end up as the best mid-tier GPU for a while now, especially if AMD and Nvidia keep producing 8GB 60-class cards for the same or similar price.
Nvidia came up with DLSS upscaling because of RT; that was the entire reason. They wanted to push RT but everyone was complaining about the garbage performance, so they had to come up with something. Good for them, AI had a boom and they could take advantage of it. I remember for a time, before DLSS was a thing, Nvidia's CEO was saying "with RT, frame-rate numbers don't matter anymore! It's all about the graphics, it will change everything!"
Reality has always been like that. xx60- and 50-tier cards are capable of enabling RT, but the user isn't obliged to use it. The main thing is obviously DLSS. I had an RTX 3050 and I knew it could use RT, but I only used DLSS so I could have boosted fps at my 1080p res.
@@LoricSwift Or buy the brand because it still supports RT?! Everyone knows 60-class GeForce RTX cards can enable it, but it's pointless because they are the most entry-level, such as the 4060 or 4060 Ti; RT on only makes sense on 70-class cards like the RTX 2070 Super, 3070 or 4070.
I think Metro Exodus Enhanced proves that rt games can run well if they're built for rt from the ground up. But that would require developers to put that effort in and to be willing to leave out a huge portion of gamers with no / weak rt cards.
This! To this day Metro Exodus EE shows what's possible if you put in the work and make difficult business decisions. While Cyberpunk has an impressive RT Overdrive mode, it was full of artifacts, unlike Metro EE. Imagine how much better that mode could have been if the game hadn't needed thousands of hours spent supporting raster and the previous generation of consoles.
@@BlackParade01 Any UE5 game is built for RT as well... Lumen is an optimized form of ray tracing, and the new UE5 MegaLights is another optimized form of ray tracing.
Enabling MegaLights (which runs solely on hardware RT) doubles the framerate in UE 5.5 vs. no RT if there are dozens of shadow-casting lights in the scene. It significantly boosts performance even on a 2060, but the noise can be problematic on lower-end cards. So the assumption that enabling RT lowers performance might be wrong in future games.
@@kazioo2 That is only because Unreal Engine is hot garbage which even Epic themselves don't know how to work on, all while removing the ways you could optimize the engine for better performance, which has been a complaint for a good while. If you want to see what happens when devs actually care to make a good engine, check out how NFS 2015 looks with its Frostbite engine; it isn't far off graphically from path-traced Cyberpunk 2077 while asking for significantly less compute from a GPU.
I will ask here again: How come Metro Exodus Enhanced LOOKS and RUNS so beautifully on my 7900xt with RT Ultra and games like Cyberpunk and Alan Wake 2 are so heavy? Especially Alan Wake runs and looks awful if I enable RT on full.
@@panospan3565 Ask the developers. I doubt many in a comments section are going to know what's going on in the engine enough to know why without making assumptions.
RT is _absolutely_ the future, but we haven't seen the hardware performance uplift we should have seen to make it more viable. Baked lighting can be very convincing, but it's a lot of work, and when something gets missed it sticks out like mad. RT would let developers skip a LOT of that work with no concerns about missing anything, not to mention it gives another level of realism *when done right,* and it's that "done right" that's usually the issue with how half-assed and meaningless it's often applied in today's games. When anti-aliasing first appeared it absolutely tanked framerates too, and the naysayers were making the exact same argument that it's a useless scam feature no one should use. Fast forward to today and some games won't let you disable it at all. RT isn't a scam; the pathetic performance uplift we've seen since its inception is.
@@zodwraith5745 I agree. It's not the technology but the marketing which is the scam. I have the feeling that these lower-end cards are just there to be bought up by people who are not very deep into the scene and who call it a day when they can't use something like RT.
Hi, "Some" here. It's 1000% the biggest boondoggle in gaming history. Massive performance loss for negligible benefit in the few cases it's even implemented. It's Hairworks 2.0, and virtually all the media outlets, including HUB, lap it up like good boys and girls and say to buy Nvidia cards over AMD "because ray tracing and DLSS." Don't forget their Wukong "benchmarks" showing the 4060 Ti outperforming the 7900 XTX. At least GN still has credibility.
instead of pricing, i am more worried about what kind of gimmick Nvidia is going to introduce for their RTX5000 cards lmfao (and of course lock it exclusively to the RTX5000, because Nvidia is being Nvidia as usual)
Got to love how the algorithm works. I just subbed to you guys; glad I did it. I own an MSI RTX 2060 Gaming Z. I bought it secondhand for around $130 quite some time ago. I still have the original box in mint condition, manuals, and all the goodies. The first game I ever played with it was Control. Nowadays, I stick with Warframe, Rust, BFs, and a few others. I played with the RTX settings for a while in Control and other games; a few hours later, I was like, "Nah, this isn't how it's supposed to be." Anyway, most of the games I've played over the years have been without ray tracing enabled, and I have to admit the experience I've had with this card (still using it) has been phenomenal for me. I don't demand over 120 fps; I don't use ray tracing. The games I play run great at 1080p with over 60 fps, no problem. I was using an old RX 580, so the jump in performance has been, and still is, very enjoyable. I'm buying a new GPU around mid-2025. I already knew this card wasn't what they said it was. But, oh boy, this 2060 has made me feel happy and given me a tremendous amount of joy these last few years without any troubles. I take care of it. I clean it. I repaste it. I replace the thermal pads often. I can't imagine what the next card will bring. But I know it's going to be awesome! Thanks for all the information and data, guys.
Pretty stupid. My 2070 Super was my first "high end" card ever, so I'm not feeling too bad about my purchase in that regard, but the RT capability is lacking most of the time even though I'm on a 1080p monitor.
Good thing I warned everyone I knew to get an AMD 5700 XT instead of this POS at the time. All of them are still enjoying the vastly superior performance today.
Yeah, except for some important features that you cannot use, like DLSS, H.265 encoding (AMD's encoder is absolute trash), tons and tons of features for professional and hobbyist software like Blender that are not supported by AMD (like hardware acceleration for ray tracing in the viewport of 3D software), stable "it just works" drivers, etc. (I could go on for an entire essay, but enjoy your feature-crippled card at almost the same price as a better product.) Buy cheap, buy twice.
@TheNerd The 5700 XT runs games as well as or sometimes better than a 1080 Ti nowadays. The 2060 has been performing around 25–30% slower for years, and even worse because of the VRAM limitation. I know you are coping, but man, you're really delusional in this case. This AMD GPU is amazing and way more durable than its competition at the time.
@@TheNerd What is it with you people and the whole "driver issues" shebang? My 5700 XT has been completely solid since I got it back in 2020; hell, even my MSI 390X from 2015 never gave any driver issues. Am I just not playing the right games for this to be happening? Honest question, no BS, because it feels an awful lot like BS every time I see drivers brought up as an issue against AMD when I just haven't seen it. Do I need this fancy technology that halves my fps for twice the cost? No. Do I do professional work and need software like Blender? No, I'm a gamer at my most demanding. Do I need encoding? I don't think so; this isn't going into my NAS box, so I'm not worried about DVD playback. I got what works and fulfills my requirements. Buy according to your needs, not hype.
@@TheNerd "Buy cheap, buy twice"... you're not wrong: all those 5700 XT owners sold their GPU for 4–5x what they paid for it during the crypto boom and got a significant upgrade over your mediocre 20 series GPU.
Outside the puny circle of ultra-settings elitists, there is a large portion of gamers who cannot afford expensive high-end 16GB+ cards and are content to play on High + DLSS or native Medium. You make it sound like it's the consumer's fault that they bought sub-16GB cards when the market only opens up for them at the 8GB card price range.
I feel like the few examples that work well on a 2060 go to show that it had potential as ray tracing on a budget, but DLSS ironically backfired to the point where developers optimize their minimum spec requirements for DLSS performance mode, which is just not worth using at 1080p. As it is now, the 3060 serves as the actual practical baseline for RTX.
@@stephenallen4635 I have a 4080 Super and a regular 4070; ray tracing is perfectly usable on both at the right resolution. The 4070/3080 I'd say are the minimum for decent RT performance; they can run it relatively well at 1080p, and with the help of DLSS at 1440p.
Performance mode at 1080p = I wanna play the game, but my GPU can't run it at playable fps any other way. As for the baseline: the 4060 with frame gen, and soon the 5060. Don't gaslight yourself that the 3060 was good in the first place, launching at $329; it was on par with the 2060 Super. Same goes for the 4060. It's barely "OK" at 1080p60 without RT enabled. If the 5060 requires RT with frame gen just as the 4060 did, then we will know exactly how the marketing of RT worked all the way back in 2018, when we had just 4 games "having RT" and the 2080 Ti couldn't handle CP2077 at 60fps at max graphics.
LTT recently released a video on a parallax 3D monitor and mentioned that about 150 games support the effect. I find it hilarious that a similar number of games support RTX lighting and shadows as support an obscure 3D monitor.
Lmao, most of those "150" games don't have any real bespoke support for the 3D tech. They just tested them. The monitor can inject the stereoscopic effect to some extent but your mileage can vary drastically from game to game.
Like the Razer haptic feedback thing Dawid just tested. Some of the games that "support" it, like Hogwarts, just give you a buzz when you cast a spell. I'd hardly call that "supported".
I always wondered about the other PCIe slots and if you could put in a discrete unit with more tensor and shader cores for example but then people wouldn’t buy the higher end cards lmao
What you're describing isn't SLI; you mean dedicated render workloads, like Nvidia PhysX used to offload. Only a tiny fraction of users ever used that dedicated feature. An RT implementation like that would be impossible: the second GPU would run asynchronously, so every effect like reflections and shadows would get a terrible delay. The CPU would also need to handle more draw calls, which reduces performance. If you did this with expensive path tracing, you would enter a bright room without lighting that only goes dark after a few seconds; that's not worth it. It could solve the insane FPS drop, but with a big image quality drop.
@@slickzMdzn Funny thing with that: even when SLI was a thing, high-end cards were still purchased, and some people even purchased multiple cards for 3- or 4-way SLI.
The 1080 Ti will always be the GOAT. Unfortunately, we will never see a card like the 1080 Ti again that was all horsepower, innovative, and affordable. I'm glad my 1080 Ti is still part of my GPU repertoire, and with its upscaling update not long ago, it still keeps up quite well in 2024. But, as a PC owner myself, I have to be honest and say that we have no one but ourselves to blame for the crappy, arm-and-a-leg GPUs after Nvidia's 10 series, because people fell for the hype and bought without questioning, giving Nvidia the green light to overprice crappy products.

Today's cards, in my eyes, are simply a V4 with a twin turbo (DLSS & FG), no longer emphasizing affordability and the power of the engine, where the twin turbo would be just a plus but not needed. I never turn RT on in any game, simply because at this point in time it is pointless. I mean, the 4090 can barely do 60 fps at native on high or ultra, and that is very disappointing, especially priced at 2-plus grand. If PC owners were simply more stern, today's GPUs from Nvidia would be far more powerful at native settings on ultra; yes, that would have taken longer to create, but with more bang for the buck, or simply priced properly, since there seems to be a new card every year or two now.

I never buy the newest tech, always stick to one generation behind, always wait 2 years before I buy (if the upgrade is needed) so all the updates are completed on that tech, and always wait for half its price once the new tech comes out. So, when the 50 series comes through and people again blindly give both arms and legs this time around, the 40 series should all be priced properly, as they should have been from the start, at half their price. And then, sure, it's worth upgrading.
The 1080 Ti had incomplete async compute support. That's why every Turing card save for the 2060 beats it nowadays. In some games even the 2060 beats the 1080 Ti at lower resolutions.
@@avatarion LOL show me one game where a 2060 beats a 1080Ti. I've never seen it even in the 1080Ti revisit videos late this year. The 1080Ti even crushes the RTX 2080 in a lot of games and the RTX 2070 Super in pretty much every game.
@@ArmadaOne Off the top of my head, in Far Cry 6 at 1080p the 2060 is faster. Like I said, async compute is incomplete in Pascal. Turing benefits greatly from games that are optimized from newer hardware, Pascal not so much. You won't find a single DX12 game where the 1080 Ti beats the 2080. It's not uncommon to find the 2060 Super beating the 1080 Ti, and in 99% of cases the 2070 Super does it as well.
@@avatarion Funny how both this channel's benchmarks and the GamersNexus benchmarks prove you wrong. I even rewatched both videos I mentioned to be sure, and just like I said, the 1080 Ti is way faster than the 2060, beats the 2070 with ease as well, and is faster than even the 2080 Super in quite a few modern titles. So I have proof, you have nothing. Thanks for playing, you lose again. Also, I don't even see how a 6GB graphics card can even play Far Cry 6 unless you really dial down the settings. I had an RTX 3080 Ti 12GB, and when I installed Far Cry 6, the game warned me that I needed 16GB of VRAM to use the HD texture pack. So the only way a 2060 would even work in Far Cry 6 is without the HD textures, at low or maybe medium settings at best.
@@ArmadaOne The 1080 Ti beats the 2060 in most cases, but in a select few cases like Far Cry 6 it actually loses. That's just a fact. Case closed. This is all due to async compute being incomplete in Pascal and Maxwell. Turing benchmarks made in 2018 are not comparable to benchmarks made in 2020+ when games shifted to the new console gen and, on PC, started being optimized for newer architectures. Turing cards have gained on Pascal since 2020. The 2060 used to be equal to a 1070 Ti, but now it beats the 1080 handily. The 2080 started out equal to the 1080 Ti, but now it easily beats it in every new game.
I think Jim from AdoredTV, when he was still active, warned maybe 4 years ago that ray tracing performance does not scale linearly with hardware. That is, the RT cores and the die space they occupy need to grow increasingly large for smaller and smaller gains in rendering power. So we will probably not see a breakthrough in it for a while, until GPU engineers figure out a way around that.
@@chrys9256 Yes, maybe. What I meant, though, is that for a meaningful increase in RT performance we might need ridiculously large dies, which at current chip market and production costs (even assuming high yield rates) will never be cheap and/or power efficient. Let's say a 700 mm² total die area, even composed of several smaller chiplets, is probably not the best way forward.
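A rough way to see why huge monolithic dies get expensive is the textbook Poisson yield model; the defect density used below is an assumed illustrative value, not a foundry figure, so the percentages are only meant to show the trend.

```python
# Illustrative yield arithmetic: yield falls off exponentially with die area,
# so good dies per wafer crater as you scale the chip up for more RT hardware.
import math

def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
    """Simplified Poisson yield model: Y = exp(-D * A)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

for area in (200, 400, 700):
    print(f"{area} mm^2 die: ~{poisson_yield(area) * 100:.0f}% yield")
# 200 mm^2: ~82%, 400 mm^2: ~67%, 700 mm^2: ~50% (with the assumed defect density)
```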
I've never turned ray tracing on for a single game, as I value performance too much versus the small visual improvement from RT; however, I do play at all ultra settings.
I do the same. I'm on a 3060 laptop, so it's not worth all the other compromises to turn on RT. But even in the games where I can get it to run and hold 60fps, I tend to turn it off and get 100+fps instead. A subtle boost in visuals isn't worth it. I can see how RT is useful in the long run, especially for developers, and it will make games look better, but I still think it's a few years away, mainly because affordable GPUs can't do it justice with their gimped VRAM.
6 years and still a meme. We already had a much more efficient, higher quality and temporally accurate way to render a game, It's called rasterization.
@@ume-f5j I don't think he's wrong, especially not about temporal accuracy. Modern games are trading too much fidelity/visual clarity for overly computationally intensive shader effects and RT, which means they effectively have 10% better lighting at the expense of -50% or lower render resolution. Studios are not finding the right compromises between effects and fidelity for the available hardware like they did before upscaling/TAA became prevalent. If the game isn't plagued by aliasing, chances are it's full of temporal ghosting, denoising artifacts and performance issues once you start increasing the resolution. While there are a few exceptions, I almost always prefer the performance and visuals of 5-10 year old games downsampled from 5K or 8K versus using upscaling in a modern title. Forza Motorsport vs. Assetto Corsa with mods is a great example. Forza has all the modern RT tech and upscaling to make it run well, but the 10 year old Assetto Corsa with CSP/Pure shaders and a high quality track running at a native 5K easily goes toe to toe with Forza in visuals while often doubling the framerate. Graphics tech is in a transition phase right now, and it's really not in a good state. RT is the next step for sure, but we jumped on it about 10 years too early, and the compromises are not good.
I had to buy a 2060 12GB in 2022 when they re-released it; it was the only GPU available for over a year at my local Micro Center. My older GPU failed and I had to regress backwards through my inventory during the pandemic. It hurt to pay so much money for a 2060 12GB, and now I feel almost a little stuck with it. Although at least the modern cards aren't exponentially better.
Who's surprised? RT is a gimmick and will be until greedy companies give us sufficient hardware. Even a 4090 has problems at 4K. RT is a constant money grab.
Nah, it completely transforms some games. Some games have average implementations, or settings that aren't worthwhile turning on, but you could say that about a lot of things.
RT is not a gimmick. It's just not a feature end-users touch, but experience. It's a tool for developers to make the visuals they create have more realistic lighting based on actual physics and math. It's pure eye candy. It can be much better than rasterized lighting effects that are artistically drawn to be realistic. But as with all things, budgets are budgets... for developers and end users. Once developers actually develop with RT in mind, there are significant time savings since they don't have to guess where shadows should go, how items would be lit by a lamp at a table, etc. If developers stop doing lighting through traditional means, RT is the only game in town.
Well, it's no longer a gimmick; developers "sponsored by Nvidia" will now make games with sub-par raster and hide everything behind RT, and players will then be forced to play with RT on. That's why RT-capable cards have low VRAM, so that you'll buy the highest-end ones just for RT.
The 3090 was the first card that came out and actually pulled it off in a meaningful way. IMO a 3080 Ti or 4070 Super is the bare minimum to get into serious RT; maybe a 5060 Ti will get there as the first 60-class card... but I doubt it.
Serious 1440p RT indeed can only be achieved on a pretty powerful card; a 4070/3080 is the minimum level here. But there are a few RT games, like Metro Exodus EE or Spider-Man, which even a 3060 can run at 1440p and decent quality settings.
I agree. As a 4070 super user, I can use all the RT features on all the games I've tested so far with DLSS. I think with a weaker card you might have to disable some stuff or to run them at lower quality for a great experience.
The difference wasn’t that big between a 3080 ti and a 3090. It was those times where a 90 class wasn’t needed for high end gaming performance due to the small increase relative to the next card down.
Nvidia laughing at gamers. Just remember. AMD was giving gamers 16GB when nvidia was giving 8GB on the 3070.... Nvidia just wants to screw and squeeze people for everything they got . 5070 will have 12GB and 7070 will probably STILL have 12GB.
Aha... and where are those 16GB cards today? No one bought them. I wonder why that is... I can tell you: they were unreliable, they had MASSIVE driver problems, they even had to recall some and in the end no one bought them.
AMD is just unreliable, bro; in my case it's not even compatible with the Autodesk/Adobe suites that I work with. I have a friend where, whenever Apex Legends updates, something stops working; Overwatch won't work properly until it restarts like 3 times; Ark won't even launch... and he has a 7800 XT. AMD cards will only provide good experiences with partner games, and the last one of those was Starfield, so you can do the 2+2, right?...
@@TheNerd No matter how much you want to trash AMD, the fact remains, they gave you 16gb back then, Nvidia gave you a measly 8GB. Now, Nvidia giving you 12GB and AMD giving you 16GB, Nvidia giving you 16GB and AMD giving you 20GB. So, the trashing argument is dead. Because the current AMD equivalent are great cards.
RT took years to have a practically night and day difference in visuals in games, Metro Exodus EE was probably one of the first alongside CP2077, thus by the time the 3xxx series came out the early RT GPUs were mostly obsolete at the low-mid range (even with DLSS). 2060 was DoA for RT from the start because it was just a marketing gimmick. Only now with the 4xxx and soon to be 5xxx cards is it really something worth considering despite the hit on framerate because the maturing of the technology but also because games are now being designed with it enabled to begin with.
Any idea how to remove the excessive white/grey wash from Metro Exodus EE? Tried playing it with RT on a 3060ti and distinctly remember it being too bright, to a point that the original dark grim atmosphere was lost.
It would be interesting to see this with the 2080 Ti and 3070 as well. Nobody can say that those were bottom-of-the-stack products that stood no chance at actual ray tracing.
Honest question here: How come Metro Exodus Enhanced LOOKS and RUNS so beautifully on my 7900xt with RT Ultra and games like Cyberpunk and Alan Wake 2 are so heavy? Especially Alan Wake runs and looks awful if I enable RT on full.
Because, contrary to popular belief, Metro Exodus is not using RT to do most of the new updated effects, like the god rays, the better fog and other stuff; that's just misleading marketing and hype from certain gamers. The fact is that game is still mostly rasterization. At the other end, Alan Wake 2 and Cyberpunk make heavy use of ray tracing and, on ultra, of path tracing, which is a more faithful but heavier kind of ray tracing, heavy even for Nvidia's high-end GPUs, though it affects AMD GPUs more.
@Argoon1981 I see what you're saying and you probably know more than me, but Metro Exodus Enhanced really looks better than the normal version. I can see it myself in the lights, the soft shadows, the GI; there is a difference from the normal version. Even Tim says in his last video that Metro Exodus EE is one of the very few games that changes significantly with RT. And it holds a smooth 60 fps, which for me is enough. On the other side, Alan Wake struggles with light RT, and if I enable full RT it is almost unplayable and it also looks bad; it needs the Nvidia denoiser or something, I don't know, which is suspicious in itself.
I mean they kinda already have, Nvidia’s step down from the 4090 is the 4080 super which still costs 1k with 16GB of vram, in comparison to AMDs offerings which the 7900XT and XTX have 20 and 24GB respectively and the XTX is the only card reaching 4080 prices, the 7900XT can easily be found for $700 sometimes less than that
@@shadowcastyt AMD is still not providing enough ray tracing performance. I'm uncertain whether they will be cost-equivalent for ray tracing next generation, and having enough of it to make it worth having in the 8600 XT seems unlikely.
@@anthonylipke7754 If you're eyeballing a 5060- or 8600 XT-class card for this round... forget about RT. I was baffled that in the "4090 vs XTX" video Tim included settings that resulted in 35FPS 1% lows as "good RT configurations" for the 4090. That's 4K with DLSS Quality (so 1440p internal rendering). If we take upscaling out of the equation and look at what resolution a 4090 can deliver a never-below-60FPS experience, the answer is... 1080p (or 1440p DLSS Quality). So the TL;DR is that if you have a brain you'll be going for raster-per-dollar and VRAM. Setting texture quality to Ultra and everything else to Medium, when the card ages, will look a lot better than setting everything to Low and enabling RT for two effects.
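For context on the "4K DLSS Quality is really 1440p internal" point, here's a tiny sketch using the commonly documented DLSS 2 render scales; treat the exact factors as assumptions, since games can and do override them.

```python
# Internal render resolution per DLSS mode (commonly documented per-axis scales).
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440): "4K" is 1440p internally
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540): 1080p Performance is 540p
```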
@@jsullivan2112 In the semi conductor space you don't have to until lithography hits the quantum wall. People always want more performance and the next node is 2-3 years out. Once that arrives you take advantage of the increased density, and presto, last gen looks bad.
RT is a scam, honestly. It was never even made to look good. It was made to save development time on setting up lighting, and offloading it to ray tracing. Then, it was **marketed** as a better graphics preset. At the cost of half of your framerate. People believe shadows can't look good without RT, because NVIDIA killed PCSS. People believe real-time reflections are not possible without RT, where we had functioning mirrors in DOOM 3 in 2004.
Indeed. The industry came up with new "features" to lower GPU performance, because GPUs were getting fast enough that people wouldn't need to buy new hardware. Of course it's a scam. Other tricks they pull: gimp older GPUs in the drivers, get devs to not optimize games, and convince people they need 4K resolution for gaming.
@@p4radigm989 not only does it benefit NVIDIA and makes people buy into new hardware (selling 4K and, it's even ridiculous to remember it now - 8K, was never going to work, and never did). It also allows developers to offload dev time and cost on YOU, while they "forget" how to make decent shadows and reflections from 2012-2018 and make both worse on purpose.
The only good thing about RT is rendering images in 3D software. Also, those old planar reflections used to work because models didn't have the same level of quality as now, and developers used them in tight spaces with a close and limited number of objects; you can't just mirror hundreds of buildings in a puddle. SSR barely has any performance impact right now.
@@yaldabaoth2 I am in one. Two 100-watt LED lights, a 120-watt LED, and one 60-watt LED about 10 feet away; the others are very close and well placed. I bet you're pre-diabetic; needing a black background with white, orange or red text is a quickly growing demographic. GamersNexus knows this and goes well out of their way to help, as do TechYesCity and others.
I have a 2060 hooked up to a 1080p60 TV, and I just treat it as if it doesn't have ray tracing. For rasterized gameplay it still feels sufficient. I probably wouldn't have gone with the 2060, except that it was the most capable card I could get at the time that would fit in the small ITX case I used.
For me 40-50 fps is fine for most singleplayer titles. So for an entry level GPU that launched 6 years ago it's actually not too bad that it hits that target about half the time. Most of the performance issues seem to stem from vram which is a shame. I expect a 2070 or a 3060 to fare much better.
Fun fact: the RTX 2060 does an excellent job with one of the best RT implementations in my (and HWU's) opinion: Metro Exodus Enhanced Edition. So maybe it's still a matter of poor optimization and implementation of ray tracing in most titles?
You rather mean one of the earliest RT implementations. And frankly, it looks quite poor. Control, on the other hand, still looks quite impressive even against new titles.
@@aladdin8623 Metro EE has great RTGI and works well on weak hardware, while the typical modern game has a few puddles with reflections that halve your FPS. Feel the difference.
@@aladdin8623 Quite poor? WTF are you smoking? Unlike most RT games, Metro Exodus EE completely reworked the illumination for the better; meanwhile most games with RT enabled just turn the floor wet and make the walls glossy like glass 😂😂😂
@@BlindTrustProject No, it is not running at 60; that is what your fake frame-rate numbers want you to think, and it is not even running at true 1080p... That's what PC gaming has become, when many gamers think fake frame rates and fake resolutions are "just fine." I bet GPU makers are really happy with this new breed of PC gamers.
@@BlindTrustProject If you are running frame gen and getting 60fps, it's not running fine. Frame gen is a smoothing technology that needs a minimum of 60fps before being enabled to produce okay-quality frames; input lag will be worse than native 60 or 30 or whatever fps you were getting before frame gen, and it looks like shit frame-genning under 60fps.
2:30 That graph is incorrect, btw. Steve made a video correcting that mistake, saying that he used the 1080p data for all the GPUs from the 1050 Ti to the 570 but 1440p data for the 2060.
The fact that I’m a happy gamer without ray tracing using an RTX 2060 speaks volumes. I appreciate what the card has given me and I can play RE4 Remake and Doom Eternal without any issues.
I got this card less than 6 months after release, which is huge where I live (Argentina), because hardware here goes for literally twice MSRP due to taxes and shipping and so on. I ran face first into the subject of this video. It was the first and last time I adopt a new technology without waiting a generation or two of iteration for it to be properly defined. I just got a 4080 Super, which is a card that'll actually let me use these Nvidia features properly. I also upgraded my CPU from a Ryzen 5 2600X to a Ryzen 7 5700X3D. I was just about to buy an AM5 motherboard, one of the new AMD CPUs and DDR5 RAM when I remembered this hard-earned lesson about the newest technologies. Seems to have been the right move, as the new AM5 CPUs are not yet performing well enough to be cost efficient. You live and you learn!
I bought the 2060 for 1440p when it first came out. Ray tracing was not a factor. I was using it up until summer of 2024 to play Elden Ring, and while it was not great, it was playable, the worst problem being frequent chunky hiccups loading new areas due to the low VRAM. I'm using a 4070 and I still don't enable ray tracing.
I had a 2070 super at the time, while using a 1080p monitor. Even with this GPU, I always turned off RT. The huge performance impact never justified the image quality improvement.
Even when I was buying my 2070 Super I knew turning on RT was out of the question because of the performance hit it takes for almost nothing in return, it's just crazy that people expected good performance with RT on 2060 :D
Thank you for saying what I've been thinking all this time: RT continues to be pushed as the future of gaming, and it is, but it was pushed so soon in an effort of Nvidia to get a competitive advantage over AMD that it required (and still does) crutches like DLSS and Frame-gen to be remotely viable... and before it gets there, they are pushing Path Tracing that struggles to run at native 1080p at the medium to high end! And while I won't question the usefulness of DLSS (less so Frame Gen), it's really been a bummer to see it being showcased as a mainline feature instead of asking, like you do here, for proper improvements in the raw performance that would allow raytracing to become mainstream. Speaking of, I would love to see included, when testing with DLSS/FSR, what is the native resolution it's being scaled from, just to get the full picture. Great work with the videos otherwise!
Speaking of the RTX 20 series: two years ago I bought a used 2080 Ti for only $300, before the VRAM drama, and it is still very good for playing games at 1440p. DLSS and 11GB of VRAM extended the life of this GPU significantly. You get a better experience than the PS5 Pro: raster performance is similar, but RT performance and upscaling are better. Undervolted to 1800MHz at 0.825V, it runs silent and consumes 180–200W.
Great card. It has DLSS, which means you don't have to deal with the struggles TAA has in newer releases. If you do want to do some RT, you have better RT performance than a 7800 XT in games with hardware RT. And since DLSS Quality looks better than "native" TAA, you can also get a decent frame rate. Of all the games Techpowerup has done performance reviews on in the last year, none have needed 11+GB of VRAM for 1440p ultra; only one actually needed over 11GB at 4K ultra, and almost half had RT always on. Their testing, along with Digital Foundry's, had DLSS looking better than "native" TAA in all tested games. New engines are more efficient with resources; asset streaming and decompression have made SSDs required and made loading 4 copies of every asset for LODs a thing of the past.
Things become truly dire for the 2060 when you combine this performance data with Tim's previous look at the visual fidelity of ray tracing. You can count the ray tracing implementations that run well AND look good on the 2060 on one hand.
@@mojojojo6292 It came out 7 years later... that's the bare minimum the 4060 can have without being completely VRAM-limited. HUB has already shown multiple times that almost the entirety of the 40 series (except maybe the 4090 and the 4060 Ti 16GB) is VRAM-limited; they're treating VRAM as planned obsolescence.
@@mojojojo6292 Yeah, I don't think the memory bus width is the problem with the 4060, just the performance it lacks. The 4060 Ti, though, lacks memory bandwidth; that should have been a minimum of 160-bit, or gone up to 192-bit like the 4070.
@@lucazani2730 Complete nonsense. I have 4070 and it's never vram limited and I game at 4k. DLSS quality or balanced in the more demanding games reduces vram requirements.
@@dominicshortbow1828 They would have needed to put extra memory on it then. Bus width is 100% dependent on how many memory chips are on the card: 32 bits per chip. Which the series should have had, to be fair; the 4060 should have been 12GB, the 4070 should have had 16GB and the 4080 should have had 20GB. Hopefully the 50 series fixes this, but with 3GB chips being available for GDDR7 the bus widths will probably still be very low, while overall bandwidth will still be fine because the memory is much faster again.
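To make the "32 bits per chip" point concrete, here's a tiny sketch of how capacity and bus width are coupled when a card uses standard 2GB GDDR6 chips one per channel; clamshell configurations and the 3GB GDDR7 chips mentioned above are the exceptions to this simple arithmetic.

```python
# Simple GDDR6 layout arithmetic: each memory chip provides a 32-bit channel,
# so with 2GB-per-chip parts the total VRAM dictates the bus width (and vice versa).
def bus_width_bits(total_vram_gb: int, gb_per_chip: int = 2) -> int:
    chips = total_vram_gb // gb_per_chip
    return chips * 32

for name, vram in [("4060 (8GB)", 8), ("4070 (12GB)", 12), ("hypothetical 16GB card", 16)]:
    print(f"{name}: {bus_width_bits(vram)}-bit bus with {vram // 2} x 2GB chips")
# 4060 (8GB): 128-bit, 4070 (12GB): 192-bit, hypothetical 16GB card: 256-bit
```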
Thank you for this video. It proves a point i was making back in the day when these released: there was no point in getting a 20 series apart from the 2080ti and maybe the 2080s for RT, and even then you had to be an enthusiast. Even the regular 2080 soon was too slow to really be able to handle RT, plus it too was crippled in terms of VRAM. These first and second gen RT cards got sold by the millions and all people did was try RT for a few minutes before going back to double the FPS for almost no graphical impact. Same is true on AMDs side of course.
03:00 Did they increase prices because gamers were buying expensive GPUs, or was it more because crypto miners bought up all the cards (and resold them later in crappy condition for big bucks) while manufacturers couldn't keep up with demand?
Hey! It would be helpful if you guys just added the year in which each game launched next to its name; it would allow us viewers to better understand how well the card has aged over time versus newer games. Loving the content!
I had this card for a while before switching to the 6700 XT, which is sooo much better! I bought it because it was the only GPU I could afford; I never bought it for the ray tracing, and I don't regret buying it.
1:44 That is the exact model I am using now! I still use my old 2017 OG 2060 6GB. Games like Starfield make it cry and join SpaceX for takeoff. Games like RDR2 make it crash with "insufficient memory" errors or something like that. For my next tech refresh I'm going to aim for at least an 8–12GB GPU.
Heck, I recently played Jedi Survivor and even my 3090 didn't have enough for Ray Tracing in acceptable framerates in that game. lol So yeah, I'd say it's Nvidia's fault for mainstreaming this superfluous tech, especially way before GPUs could actually run it. But we also gotta admit, ray tracing or no ray tracing, AAA games have become dangerously unoptimized, even when they don't release completely broken.
@@ZeroHourProductions407 That doesn't even make any sense? Ray tracing has a big impact on CPU usage. It doesn't matter if the CPU doesn't "understand it"
@@ZeroHourProductions407 Raytracing and path tracing, before real-time raytracing on GPU's, was a CPU only thing for decades, so why wouldn't CPU's "understand" raytracing?
@Argoon1981 because it would take an entire week per frame for a CPU. Pixar needed Crays with thousands of processor cores to even render stuff like _Toy Story_ in the drug-addled attention span of a Hollywood movie producer.
Unpopular opinion: RTX is for DLSS only. Ray tracing not only tanks every game but in many cases also produces worse outcomes (subjective: it might look closer to reality, but the question you have to ask is, is it fun to play?).
The RTX 2060 is now a 6-year-old midrange card. For example, Pixel Shaders were introduced in 2001 with the GeForce 3. Who would have expected to achieve 60fps with max settings in Full HD by 2007? Probably no one. The only real shortcoming of the RTX 2060, in my eyes, is the measly 6GB frame buffer.
It's more of an issue that we got a terrible generation over generation improvement this time because Nvidia artificially raised their product tiers as a cash grab where the ONLY maybe worthwhile card gen over gen was the 4090, which was insane. If the 4070 (which is actually the 4060 or 4060 Ti) was in line with historical price point, no one would be worried about upgrading.
The problem IMHO isn't that the card is old and not capable of running modern titles with RT anymore, but that it never was. If it had had a good run, but after 6 years was just too old to keep up, fine; not everyone can be a 1080 Ti. But for the first couple of years, Nvidia's marketing's and their fanboys' meme was that buying RT cards was some sort of future-proofing for when we finally get to the point where basically every game has, or even requires, it. And that was actually quite a successful argument; everyone brings it up whenever Nvidia cards are much worse value than the competition in pure raster. Trouble is, by the time there finally arrived an arguably decent selection of games that use RT, and now even a handful of them in a way that gives a meaningful visual upgrade, the card is obviously too weak for it. There was never a time when it wasn't; there were just no games with actual meaningful RT that could demonstrate it.
Why go back and test the lowest ever RT card 6 years later?!?!? Why? I owned a 2080ti which to me is a 2k 120 or 4k 30 card. I have a 4090 for the past 2 years, why go back 6 years later to test an entry level RT card? Why? I love this channel as it’s my fave by far and the only one I trust along with GN, but why 6 years later…nobody’s buying a 2060, why Tim?
I sold my 2060 KO last year and sidegraded to an RX 6600. I actually made a little money in the swap and gained some performance, 2GB more VRAM, lower power consumption and quieter fans. Very happy with that decision; the RTX features were useless on the 2060.
Just a reminder: the R9 390 came out with 8GB of VRAM in 2015 for the price of a 970 and the performance of a 970, and that card had 3.5GB + 512MB of slow VRAM. Intentionally gimped; this is what monopolies do. Nvidia never changed, and yet I never get people who buy their low or mid-range cards. If you're going to go Nvidia, only get the top-line card, otherwise you are intentionally screwing yourself. And before someone calls me or assumes I'm an AMD fanboy, I'm sitting with two Titan XPs in my rig. I don't give a f, I'm just pointing out the obvious.
Here's my problem: here AMD is like 10% cheaper than Nvidia, BUT old Nvidia (fanboy) cards flood the second-hand market. It's literally cheaper to buy a 3060 12GB than a 7600 XT 16GB. And yes, I thought Intel should also be cheaper... nope, a 770 16GB costs the same as an AMD 6800 16GB.
I admit to being ray tracing curious when the 2060 was released, but following your advice I skipped it and went with the 2070 Super, then realized I don't really care about ray tracing.
Paying the early adopter tax on a mid-priced product never bodes well. Especially when the previous generation jump (9xx to 10xx) saw the 1060 delivering 980 performance and this thing struggling to put some distance between it and the 1070 (while losing 2GB Vram).
The 2060 is what you get when you let one hardware company define what the new hot features are. Used to be the games/apps that drove that, then nVidia came along and decided they'd do it and set the price for it too. People played along and now we have an unusable low end, a mediocre mid range, an upper mid range that's decent, but invariably overpriced, and a high end designed to separate people from their money with performance they can rarely actually take advantage of.
With Ray Tracing, you are going for excellent visuals. It doesn't make sense to use low graphics settings that degrade visuals in order to turn on a feature to improve visuals. Ray Tracing only makes sense on graphics cards powerful enough to run Ray Tracing at high graphics settings. And at high RT settings as well, as with the half measure RT games the RT effects are barely noticeable, if at all. The overlap area of the Venn diagram between cards that can do it and games that offer it is still super small. Probably just Cyberpunk 2077 and Alan Wake 2, and only on a 4070 Ti Super, 4080, 4080 Cheaper, or 4090. I don't anticipate running any games with RT until maybe 2028.
The perfect duo of lying, coping fanboys. "My i7 4790K still runs everything" and "My RTX 2060 gets 60FPS with RT in [insert game that doesn't have RT]".
1. "My i7 4790K still runs everything" - show me a game that doesn't run if it's a lie. Low framerate is till technically "running" ;D 2. Today I bumped performance of a scene in UE5.5 from 40 FPS to 60 FPS by switching to hardware RT with Megalights on RTX 2060. To be fair I was purposefully using a lot of lights with shadows and the noise was quite bad and it had upscaling enabled, but it's real, so it counts ;P
I had an i5-4690k, it got bottlenecked by my 2080 Ti so I moved up to Ryzen 7 2700X. Still wasn't worth running RT on anything. The cope by Nvidia fanboys is strong but the fact is that RT has been and continues to be a meme for most users.
And this is exactly why ray-tracing didn't count into my buying it back in 2019. Everything else lined up for it as a mid-range GPU to get me playing Destiny 2 on PC. I've since moved on to a Radeon RX 7800XT, play more at 1440p than 1080p, and upgraded the CPU to the 5700X two years ago. Ray-tracing is interesting, but still not important to me right now.
Hard disagree. In titles where it makes a big difference, as soon as RT is on it looks far better than ultra rasterized settings. Control with a few settings to medium and with RT looks much better than Ultra and no RT. Metro Exodus Enhanced Edition at high-medium looks far better than the regular one at max settings.
@@tensorup6595 not really, it depends entirely on the game and the skill of the devs. I find that when developers have good traditional rasterized lighting skills, the customized non ray traced lighting still looks better than ray tracing, and generally less dark, even if it's technically less realistic.
@@geerstyresoil3136 lighting and shadows are mostly on par between raster and RT, but every game with a ton of reflective surfaces looks waaaaaaay better with RT enabled. CP77, Spider-Man or Watch Dogs for example are totally different games with RT enabled. With RT enabled the games feel way more lifelike, but I also tend to disable RT because I can't play with the terrible framerate a 3080 gives me with RT enabled. I also tried FSR3 mods but it looks and feels terrible.
@@tensorup6595 100 percent. We have games like BMW where the high preset with full RT looks and runs better than Cinematic without hardware RT. The difference it makes in that game is night and day. It makes you able to see the detail of the textures. Games typically take around 5 years to make, so it's no surprise that we are now seeing games developed with RT in mind, not just RT tacked on over top of the baked lighting. Half the games TPU did performance reviews for this year only have RT lighting, with no way of using raster lighting.
Great material Tim. Congrats. To others - a lot of perf can be won with OC/UV (around 15%) and by tuning detail settings (in between the presented presets). In Cyberpunk, when GI is set to medium it looks like ultra but nearly doubles performance - though you need to turn off some raster settings to stay within 6GB of VRAM. And starting from 40fps you can enable FG and push the card to almost 75fps, which usually looks close to native. This is still an amazing card - it just needs 1080p and some tweaking to fit in VRAM.
Remember, Nvidia canceled this guy for saying this when this card released
Nvidia can't cancel anything.
@@tracesmith3572 Nvidia tried to blacklist Hardware Unboxed for not caring about ray-tracing in their review.
@@tracesmith3572 You're going to need a "change in editorial direction."
@@no-barknoonan1335 which ironically they did
@@imo098765 They did not.
NVIDIA in 2019: Not enough VRAM
NVIDIA in 2020: Not enough VRAM
NVIDIA in 2021: Not enough VRAM
NVIDIA in 2022: Not enough VRAM
NVIDIA in 2023: Not enough VRAM
NVIDIA in 2024: Not enough VRAM
NVIDIA in 2025: Hello Mid Range gamers, how about a brand new GPU with 8GB of VRAM and a sub 10% performance uplift
Nvidia in 2016/2017: Loads of VRAM
Wonder why 1000 Pascal series sold so well and lasted a long time, surely a coincidence
The 10 series had plenty of VRAM; since then it's been barely enough to get by
So true, and it's really sad to see because otherwise their cards are really good. The least they could do is offer options with double the VRAM.
wish we could easily swap out VRAM like RAM on motherboards
Rtx5060 8gb after 6 years of 8gb 😂😂😂😂😂
The meme at launch: "rtx 2060 is 1080p raster, 480p ray tracing"
Nobody was expecting that it would still apply with the 4060, damn.
The whole point of trying to jam dedicated ray tracing hardware down everyone's throats and push the gaming industry there was to reset the goal posts so they could tell you you needed to keep upgrading and chasing performance, rather than continuing to push raster for native 4K high FPS performance for reasonable prices. So of course they're going to drip performance, so you need to keep buying more.
@@EhNothingdevs asked for ray tracing. We will be able to do really cool stuff with it (with very few rays) once we can assume every gpu has some basic rt support. This will probably take a couple of extra years.
I mean steam literally says 50+% users are still on 1080p. No reason for entry level gpus to target anything more. Older 720/768p monitor are starting to be not available anymore & 1080p is becoming cheapAF, it will still take time for 1440p to be entry level.
The actual sad fact is the price of "entry level" cards. 60-series was meant to to be b/w 2-250$. But rtx20 series moved that, while rtx30 series didn't have any decent entry level sub 250$ chips.
@@manon-gfx Those cores are used for AI performance, that's why they pushed them, not for games.
@@EhNothing This is simply a stupid and misinformed comment.
RTX 2060 introduced at GTX 1070 pricing.
2 years later and 2 gigabytes less
The more you buy the more you save
"Just buy it!" - Tom's Hardware
Nobody forced you to buy that crap. Stop blaming your irresponsibility on others.
@@DBTHEPLUG Haha, no. I bought a GTX 1060 and then a GTX 1070. No need for a RTX 2069.
But I did upgrade to a second hand RTX 3060Ti for EUR 300 and a second hand RX 6600XT for EUR 200.
I am waiting for the generation that has a mainstream GPU with 80%+ better performance than my RTX 3060Ti.
It's just frustrating to wait for that for generation after generation.
yeah, rtx in 2018 was basically a scam, but im still impressed how well the 5 years old card run brand new games, this video talks about a card that is literally one console generation away, does not make justice to the card at all
@@leandrrob Ray Tracing is a scam in general unless you have 1500+ to spend on a 4080+. It was meant to push the capability of video cards, but not quite meet the basic requirements.
Holy shit, it's been 6 years? After school, it feels like you blink and 10 years pass.
seriously.
Yep. Welcome to adulthood 😂
Blink 5 times and you find peace 😂😂
**rapid blinking intensifies**
It's definitely how time works lol
Nvidia: "hub doesn't focus enough on rt"
Tim: "ok, how about we see if you kept the promises your marketing made at launch?"
Nvidia: "n-not like that..."
They took a military-grade flamethrower to take out a hornet's nest.
Yeah but imagine comparing a 6 year old card, the weakest card of that generation, against the latest modern RT games, half of which could use better optimization, and expect it to manage. Even the RT core arch has changed.
@@be0wulfmarshallz Mate, those were crap from the get go aside from 2080Ti that was actually powerfull and for once, wasn't a massive FU to Titan owners.
@@be0wulfmarshallz There weren't any RTX titles at launch, RTX back then was a "future proof" argument, totally appropriate to take NVIDIA at their word here.
@@be0wulfmarshallz What about things like Watchdogs and The Witcher 3? Those are older games that still didnt deliver even a 1080P 30FPS experience with RT on AND DLSS helping out.
When the 2060 released, there was NO RT in games, so it was enough at launch to run all 0 games with RT...
Checkmate gamers.
Yea and then 6 months later Nvidia released the RTX 2060 Super with 8GB VRAM 🤦♂
You know what, right now, is like what ray tracing was back then?
The NPU for AI software on Arrow Lake...
😭😂😂
Haha I was thinking this too.
RTX2060 vs the back catalogue of raytraced games? Man, it never stood a chance with the first lot!
Reddit-tier comment
And the 4060 doesn't stand a chance with current new games without frame gen.
Not that surprised, though. The GTX/RTX 60 and RX 600 series are the low end series of GPUs. You'd need to go the mainstream, (the 70/700 series) to start to get any decent uplift. This is even more compounded by the fact that the last few generations of GPUs had extremely poor uplift "gen to gen" for the same tier. E.g.: the 2060 to the 3060 to the 4060, or the AMD 6800XT to the 7800XT high-end tier series. Typically it should for about 25% jump between tiers in the same generation and a 50% jump from gen to gen in the same tier.
In-fact, the only tier that has continuously gotten any normal "gen to gen" uplift i.e.: ~45% - 50% uplift was the Nvidia 90 series (previous known as the 80Ti) enthusiast tier. And with that came a 2x price adjustment (starting with the 1080Ti to 2080Ti). Can't wait to see what it is with the 5090... Probably 3X price..
@@gorillagroddgaming you must have a 2060
Maybe it could handle Quake RTX
I said this back when it launched. The 2080ti could barely run ray tracing so the fact that they advertised a 2060 to do it should have been illegal. It's basically lying.
Rofl, "barely run RT"? BS. I own a 2080 Ti, it runs RT, and everything else, just fine. Never had any slowdowns, stutters, or any issues whatsoever, even with the initial launch of Cyberpunk 2077. 😂
I'm still using an RTX 2080Ti, and I've never played any RTX games with it, apart from DLSS. Waiting for the 5090 to play all games in 4K rezzzz.
Blaming Nvidia makes no sense at all. It's perfectly capable to run RTX games, as demonstrated by Quake 2 RTX, for example. The problem is when you take an obscenely heavy, poorly-optimized renderer that has "organically evolved" over 20 years and slap yet another very demanding feature on top of it.
@@donkeymoo1581 Even Console players prefer 60. Sony themselves said that most Playstation players prefer performance mode.
@@DeadNoob451 most console players would be happy with 1080p and 60 fps, but the consoles want to compete with the PC.
Remember that Tom's Hardware article back in 2018? "Just buy it - when your whole life flashes before your eyes, how much of it do you want to not have ray tracing?"
...yyyeah, about that...
Damn, forgot about that nonsense. It sure was a weird time when ngreedia bought the biggest reviewers for this RT scam.
Is your profile pic from Blue Submarine No.6 and if so, where did you get that picture and the background on your profile from?
@@moe8935 Yes! And it's a commission I ordered from a wonderful pixel artist. More info in my channel description
I can already see the 5070 marketing slides
THE NEW RTX 5070 - A RAY TRACING POWERHOUSE AT AN AFFORDABLE PRICE
Cyberpunk 2077, RT Overdrive ON
[|||||||||||] 62 FPS*
(1080p, DLSS Performance, DLSS Frame Generation ON)
only $699
In 3.5 years of having a 3080 I think I have used ray tracing once for Cyberpunk. 🤷
Every time I compare ray tracing with traditional rendering, I feel like the performance hit isn't entirely warranted for what might be a minor visual improvement. If there is a meaningful improvement, however, then it's a different story.
I played through a few fully path traced games on 3080. Half-Life 1, Quake 2, Portal RTX. But even Quake 2 remaster without RT has better lighting than Q2RTX.
I also use RT shadows in WoW cuz they give a decent visual improvement and there is enough GPU performance for my 90 FPS target (the game is mostly CPU bound anyway).
Other than that, I haven't used RT to a worthwhile extent.
You missed out on a lot of good stuff then.
Black Myth: Wukong looks very nice with the full ray tracing options turned on. Works with a 3080.
@@mathdeep did I...or did I just not bother turning RTX on😐
At a certain point, when a behemoth company no longer cares whether a particular segment purchases their products or not, that basically grants them the license to do whatever they want.
It's the customers' fault; they keep buying. And even as a YouTuber, if you dare to say bad things about Nvidia you will get dislikes and frustrated kids' comments.
Well, people still buy them. They are at the point where they can no longer produce enough to meet demand, so the only way they can make more money is by selling at higher prices.
They would never be a behemoth company if the fanboys would not buy no matter what they get back for it.
Nvidia is drip feeding us those RT Cores.....like there's a shortage of them.
and the same for cheap VRAM
There is. They take up precious die space and they only benefit gamers. Nvidia wants to sell compute and AI performance to professionals who pay "CPU like margins" for the die space they're getting.
@@andersjjensen I thought the only reason the cards _have_ the "RTX" and tensor cores was as a by-product of their AI development? I.e. it was a way to market existing technology.
@@bricaaron3978 Well, ray tracing when the 20 series came out was basically a rarity... not many games had RT. It wasn't until the 40 series that a lot of games had some kind of ray tracing... and that was years later. Now... should the 2060 be tested on games literally coming out this year? Especially unoptimized ones? Yeah, I dunno if THAT's a fair thing. Nobody today would say "yeah, let's imagine comparing the lowest end card to video games 4+ years from now, damn it doesn't do good."
Nvidia features be bussin
Nvidia give their GPUs enough VRAM challenge: Impossible
It isn't technically impossible, you only need to pay them $1k+
It doesn't matter how bad it was, remember - "Just buy it"...
consume my consumers
Wasn't it "It just works" ?
The more you buy the more you save 😂
lmao that tom's hardware article was just insane
"When you die and your whole life flashes before your eyes, how much of it do you want to not have ray tracing?"
RT is still a tech demo that hurts performance too much; once there is no significant FPS drop from enabling it, it'll be viable.
It will always be just out of reach. Monte Carlo rendering is extremely taxing. As soon as hardware progresses, they'll just up the number of rays and bounces that are used. The whole point is that your hardware is never good enough.
So a 4090
@@mryellow6918 We *might* see cards able to run ray tracing without hacks like dithered rendering and temporal effects in 20 years without a significant shift in hardware methods. 4090 is nowhere close. The path tracing in CP2077 is using nowhere close to the amount of rays needed to actually make a scene look good. It is dithered and blurred to make it look somewhat acceptable - as long as there is no movement in the scene. Once there is movement, the dithering and temporal effects need to be recalculated and you get visual noise (which is what the hack of ray reconstruction is trying to solve).
It won't. If everything else stopped getting better, i.e. resolution, texture quality and FPS, then yes, RT would catch up. On the plus side, in a few gens you should be able to come back to some old games and have decent performance.
@@DSP_Visuals ding ding ding, tell the man what he's won. The whole point of chasing ray tracing is to tell you you always need to be upgrading and your performance is always bad. Just ignore it entirely and buy what offers the best performance for the price for raster and you'll be golden.
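(An illustrative aside on the Monte Carlo point above: the reason ray and bounce counts have to keep climbing is that Monte Carlo noise only shrinks with the square root of the sample count, so quadrupling the rays merely halves the noise. Below is a minimal, purely hypothetical Python sketch - a toy integrand standing in for real light transport, not any engine's actual renderer - that demonstrates the 1/√N error falloff.)

```python
# Toy Monte Carlo estimate showing why path-traced noise is expensive to reduce.
# The "scene" is fake (x^2 over [0, 1]); only the error scaling is the point.
import math
import random

def estimate_radiance(num_rays: int) -> float:
    """Average the contribution of randomly sampled 'rays'."""
    total = sum(random.random() ** 2 for _ in range(num_rays))
    return total / num_rays

true_value = 1.0 / 3.0  # analytic mean of x^2 over [0, 1]

for n in (16, 64, 256, 1024):
    runs = [estimate_radiance(n) for _ in range(200)]
    rmse = math.sqrt(sum((r - true_value) ** 2 for r in runs) / len(runs))
    print(f"{n:5d} rays/pixel -> RMSE ~ {rmse:.4f}")

# Error falls roughly as 1/sqrt(N): 4x the rays only halves the noise, which is
# why real-time path tracers lean so heavily on denoisers and temporal reuse.
```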
In fairness, we all knew the first gen of RTX was meant to pay for future GPU improvements, but I suspect many Nvidia devotees didn't expect them to continue to bring out products without enough VRAM.
Forget the 2060, charging $400+ for an 8gb 4060ti is just scamming people.
Nvidia gimps the mid range so people will buy high end.
Scamming? Aren't you the one giving out YOUR money?
There's nothing fair about that
Well, they asked 600 dollars for a RTX 3070 Ti and that also had 8 GB and was only good for 1080p rasterized gaming or 1440p with DLSS. It can't do RayTracing. I feel sorry for anyone who bought that card, at least with the 4060 Ti reviewers warned us not to buy it.
I turned on RT in one game and my framerate halved. Not worth it on a 3070 either.
I made a quick test in UE 5.5 on an RTX 2060 and hardware RT via Megalights can double the framerate vs zero RT, but the shadows are much noisier (although they also look more correct, especially in areas where multiple lights overlap). Megalights is apparently unusable on a GTX 1060 without hardware RT. In Blender there's also a noticeable jump in speed when using the RT cores, even on a 2060. So the feature does have real value in some cases, despite not being worth using in the majority of current games.
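(On the Blender remark above: the speedup from the 2060's RT cores comes from pointing Cycles at the OptiX backend. Here is a minimal sketch of how that can be toggled through Blender's Python API - run from inside Blender; device names and counts will vary per machine.)

```python
# Minimal sketch: switch Cycles to the OptiX backend so Blender can use the
# RT cores on an RTX card. Run inside Blender's Python console or a script.
import bpy

cycles_prefs = bpy.context.preferences.addons["cycles"].preferences
cycles_prefs.compute_device_type = "OPTIX"   # OptiX uses the RT cores; CUDA does not
cycles_prefs.get_devices()                   # refresh the detected device list

for device in cycles_prefs.devices:
    # Enable every non-CPU device that was detected.
    device.use = (device.type != "CPU")

bpy.context.scene.cycles.device = "GPU"
print([(d.name, d.type, d.use) for d in cycles_prefs.devices])
```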
I have a Liquid Devil 6800 XT and that usually means my options are 4K native high/max settings w/ RT disabled, 1440p with low RT and FSR quality/balanced, or 1080p with high/max RT. I almost universally prefer the higher resolution with no RT for both visuals and performance.
@@K31TH3R This is what I call "eyelash rendering" stuff: where the engineers and devs focus on things that make very little actual difference to gamers and gameplay, but are technologically "impressive". Like, if they spent a huge chunk o' cash and time on how to render photorealistic eyelashes, you are... never gonna notice.
Like, is RT nice to look at? Sure. But if it quarters my frame rate, melts my GPU, and necessitates me playing at resolutions I haven't touched since Gotye had a hit, then no.
It's well worth it. I would take 30 fps with RT if it's implemented well over 60 fps any day. Any higher fps would be a no brainer. 50 fps instead of 100 fps is an even easier deal to make.
Developers and game engines got so good at shaders and other techniques such that even after 5-6 years of raytracing being around - I don't feel like I'm missing anything by not using it.
I do miss shadows though: the ugly cascades and lack of microdetail. I am a fan of UE5's VSM. But Lumen is worse than baked lighting because denoisers are awful, except Nvidia's proprietary ray reconstruction denoiser. And having nice GI like Metro is always great for immersion, but when it comes to the cost of trading fps vs looks, it's always fumbled. 6 years in and RT is sort of an afterthought. But Metro and Avatar have proven RT can be amazing, if, IF a dev builds the game around it from the ground up.
mmm...there is a good case that a lot of engines are not making the optimizations they need to.
Ray tracing has always been about making 80 class cards for 1080p again.
lol my rtx 4080
Not quite correct, since even a 4090 struggles at 1080p render in some games, but it should be able to keep fps above 60 in Wukong without the DLSS Performance hit.
@@CrazySerb Those are not 'games', they are tech demos.
@@CrazySerb As a 4090 owner myself, I have yet to play a game needing DLSS Performance. What "struggle" means to you vs someone else... who knows. Many seem to have variable definitions, as in 60fps for some cards = GOOD, and 60fps for other cards = struggles. One thing it all has in common is that it often comes from internet experts with 0 xp, and/or social media analysts.
@@Mcnooblet 60fps is budget entry level. 120fps is new target.
The idea of turning every other visual setting down to barely on and enabling ray tracing so we can say ray tracing works is patently absurd, as is the whole ray tracing scam. A single "effect" that costs even the best GPUs 1/3 or more of their performance is just silly. Turning on fake resolution and fake frames to make up for it is even sillier. RAISE SHIELDS!
You are clearly misinformed. Ray tracing isn't an effect. It is a completely new way of rendering. It takes about 60 to 1000 times longer and uses at least 4x the memory when done on a CPU compared to classic rendering. The fact that Nvidia was able to implement it with barely a 30% perf drop (rather than a 30 times perf drop) is extremely amazing from a technical point of view. And guess what, nobody is asking you to turn on RT or ultra settings. You can enjoy your games on medium. If you are playing games only because of ultra, and they don't give you entertainment on medium, I pity you - you need to find new games, because you are just wasting your time with the ones you are currently playing.
@@igelbofh RT on, RT off - no visual difference for me and for 99.9999% of gamers.
@@igelbofh Note that I put the word _effect_ in quotation marks, indicating to the literate that the writer knows the term is not perfectly accurate. No matter how impressive it may be that Nvidia has an RT implementation that only costs 30%, that's still too much. When using RT becomes the standard -- always on, this is how we do it, sucker -- and everyone suffers a 30% hit all the time, we'll all regret allowing Nvidia to push it so hard just so they could 1) wring more money out of us and 2) beat AMD at something relevant to gamers.
As for playing the wrong games, I suggest that if you have to go to medium settings to achieve decent FPS, you might be the one to whom the concept applies. If you're buying a high-end display and a high-end graphics card to play on medium and get more FPS, that's your choice. But why do you think the higher settings are there? The answer is easy -- to give us options. I can play at ultra and get 60-90FPS; you can play at medium and get over 144FPS. We can both enjoy the game as we see fit. At least, that is, until RT replaces rasterization. Then we're all screwed.
@@rangersmith4652 I am still playing on a GTX 960 and a 60Hz 1980x1200 projector at 260". And everything I play, including first person shooters is fine (with occasional dips to 45fps) . And sometimes I am even in the mood to play jagged alliance from the mid 90s. Given the limitations of the human vision processing system, one is wasting money at more than 1080p in less than 27" at more than 4 ft away from the display at more than 90fps, unless you are a professional competitive gamer, and it's how you make your income.
What most people are made to believe they should buy is complete marketing demand gen psyop. What you should pay for though is viewing angles above 140 degrees, color dE of less than 1.2 and at least 150% sRGB coverage, and perhaps a curved monitor if you are going at more than 30". Plus a colorimeter for calibrating in your environment.
RT is not going to replace rasterization for technical reasons any time soon. If that happens it will be another demand gen psyop. Even if we stop writing new raster rendering engines today, the ones we have are really well optimized, and further optimization is all but impossible.
On the Ngreedia side - agree.
@@igelbofh LOL show me one game where full RayTracing has a less than 30% performance drop. I'll wait.
I feel like the 3060 12GB will end up as the best mid tier GPU for a while now, especially if AMD and Nvidia are going to keep producing 8GB 60-class cards for the same or similar price.
As long as both give like +10% FPS for +9.99999% $$$
R.T.X low/mid range =
No Rays
No Tracing
No Xtreme
It's always been my opinion that with the -50 and -60 level RTX cards, their primary selling feature is DLSS, not ray-tracing.
Nvidia came up with DLSS upscaling, because of RT, that was the entire reason.
They wanted to push RT but everyone was complaining about the garbage performance, so they had to come up with something; good for them, AI had a boom and they could take advantage of it.
I remember for a time, before DLSS was a thing, Nvidia CEO was saying "with RT, frame-rate numbers don't matter anymore! Is all about the graphics, it will change everything!"
It really always has been.
Reality has always been like that.
xx60 and xx50 tier cards are capable of enabling RT, but the user isn't required to use it. The main thing is obviously DLSS.
I had an RTX 3050 and I knew it could use RT, but I only used DLSS so I could have boosted fps at my 1080p res.
Don't brand them as 'RTX' then...
@@LoricSwift or do brand them, because they still support RT?! Everyone knows a 60-class GeForce RTX can enable it, but it's pointless because they are the most entry level, such as the 4060 or 4060 Ti. RT on makes sense on cards of the 70 class, like the RTX 2070 Super, 3070 or 4070.
I think Metro Exodus Enhanced proves that rt games can run well if they're built for rt from the ground up. But that would require developers to put that effort in and to be willing to leave out a huge portion of gamers with no / weak rt cards.
Might as well revert to the days when some people had only B&W TVs and could not benefit at all from programming in color. Yes, such a time existed.
This! To this day Metro Exodus EE is showing what's possible if you put in the work and make difficult business decisions. While Cyberpunk has an impressive RT Overdrive mode, it was full of artifacts, unlike Metro EE. Imagine how much better that mode could have been if the game didn't have to spend thousands of hours supporting raster and the previous generation of consoles.
Alan Wake and Star Wars Outlaws are also built for RT from the ground up.
@@BlackParade01 Any UE5 game is built for RT as well.. Lumen is an optimized form of ray tracing, and the new UE5 Megalights is also another optimized form of ray tracing
FPS > RT. Every single time.
Enabling Megalights (which runs solely on hardware RT) doubles the framerate in UE 5.5 vs no RT if there are dozens of lights with shadows in the scene. It significantly boosts performance even on a 2060, but the noise can be problematic on lower end cards. So the assumption that enabling RT lowers performance might be wrong in future games.
@@kazioo2 ya no.
@@kazioo2 that is only because Unreal Engine is hot garbage which even Epic themselves don't know how to work on, all while removing the ways you could optimize the engine for better performance, which has been a complaint for a good while
if you want to see what happens when devs actually care to make a good engine, check out how NFS 2015 looks with its Frostbite engine and realize that it isn't far off graphically from path traced Cyberpunk 2077 while asking for significantly less compute from a GPU
Nah. Good RT > fps. Every single time.
How to tell others you have never seen good RT live without directly telling it.
Some say RT has been one of the biggest scams on gamers
I will ask here again: How come Metro Exodus Enhanced LOOKS and RUNS so beautifully on my 7900xt with RT Ultra and games like Cyberpunk and Alan Wake 2 are so heavy? Especially Alan Wake runs and looks awful if I enable RT on full.
@@panospan3565 Ask the developers. I doubt many in a comments section are going to know what's going on in the engine enough to know why without making assumptions.
RT is _absolutely_ the future, but we haven't seen the hardware performance uplift we should have seen that would make it more viable. Baked lighting can be very convincing but it's a lot of work and when something gets missed it sticks out like mad. RT would enable developers to skip a LOT of that work with no concerns of missing anything, not to mention it gives another level of realism *when done right,* and it's that "done right" that's usually the issue with how half assed and meaningless it's often applied in today's games.
When antialiasing first appeared it absolutely tanked framerates too and the naysayers were making the exact same argument that it's a useless scam feature no one should use. Fast forward to today and some games won't let you disable it at all.
RT isn't a scam, the pathetic performance uplift we've seen since its inception is.
@@zodwraith5745 I agree. It's not the technology but the marketing which is the scam. I have the feeling that these lower end cards are just there to be bought up by people that are not very deep into the scene and call it a day when they can't use something like RT.
Hi, "Some" here. It's 1000% the biggest boondoggle in gaming history. Massive performance loss for negligible benefit in the few cases it's even implemented. It's Hairworks 2.0, and virtually all the media outlets, including HUB, lap it up like good boys and girls and say to buy nvidia cards over AMD "because ray tracing and DLSS." Don't forget their Wukong "benchmarks" showing the 4060 Ti out performing the 7900XTX. At least GN still has credibility.
NVIDIA marketing: "2060 Ray Tracing - better visuals in game" / reality: "That slideshow really looks impressive!"
What should happen:
RTX-5060 = $300, RTX-4070-Super Performance
RTX-5070 = $500, RTX-4080-Super Performance
RTX-5080 = $749, RTX-4090 Performance
What's going to happen:
RTX-5060 = $499, RTX4060-Ti performance +10%
RTX-5070 = $899, RTX4080 Performance
RTX-5080 = $1499, RTX4090 Performance -10%
The full 50 stack is in trouble if the 5080 can't even beat a GPU that's more than 2 years old.
Trump Tariffs: those are some nice prices, lemme throw in an extra 30%
The black market has been light on kidneys.
Not only that
5060 8gb
5070 12gb
5080 16gb
That is exactly why I bought a 4090 PC for a good deal recently. I doubt the 5080 will be better and won't have 24GB.
instead of pricing, i am more worried about what kind of gimmick Nvidia is going to introduce for their RTX5000 cards lmfao (and of course lock it exclusively to the RTX5000, because Nvidia is being Nvidia as usual)
Got to love how the algorithm works. I just subbed to you guys; glad I did it.
I own an MSI RTX 2060 Gaming Z.
I bought it secondhand for around $130 quite some time ago. I still have the original box, in mint condition, manuals, and all the goodies.
The first game I ever played with it was Control. Nowadays, I stick with Warframe, Rust, BF's, and a few others.
I played with the RTX settings for a while with Control and other games; a few hours later, I was like, "Nah, this isn't how it's supposed to be."
Anyways, most of the games I've played over the years have been without Ray Tracing enabled, and I have to admit the experience I've
had with this card (still using it) has been phenomenal for me. I don't demand over 120 fps; I don't use Ray Tracing.
The games I play run great in 1080p with over 60 fps, no problem.
I was using an old RX580. So the jump in performance has been, and still is, very enjoyable.
I'm buying a new GPU around mid-2025.
I already knew this card wasn't what they told it was.
But, oh boy, this 2060 has made me feel happy and given me a tremendous amount of joy these last few years without any troubles.
I take care of it. I clean it. I paste it. I thermal pad it often.
I can't imagine what the next card will bring. But I know it's going to be awesome!
Thanks for all the information and data guys.
Raytracing was the future. I wonder how people who bought the rtx 2060 for Raytracing in future titles feel today .
Pretty stupid. My 2070 Super was my first "high end" card ever, so I'm not feeling too bad about my purchase in that regard, but the RT capability is lacking most of the time even though I'm on a 1080p monitor.
Would be interesting to see how the 12gb version is holding up. Back when it released there was almost no need for it
Good thing I warned everyone I knew to get a AMD 5700xt instead of this POS at the time. All of them are still enjoying the vastly superior performance today
yeah, except some important features that you cannot use, like DLSS, H265 encoding (AMD's encoder is absolute trash), TONS and TONS of features for professional and hobbyist software like Blender that are not supported by AMD (like hardware acceleration for ray tracing in the viewport of 3D software), stable "it just works" drivers, etc. (I could go on for an entire essay, but enjoy your feature-crippled card at almost the same price as a better product)
Buy cheap, buy twice.
@TheNerd 5700xt runs games as good or sometimes better than a 1080ti nowadays. 2060 has been performing around 25-30% slower for years, and even worse because of the vram limitation. I know you are coping, but man you're really delusional in this case. This AMD gpu is amazing and way more durable than its competition at the time.
@TheNerd You say buy cheap, buy twice but the reality is that 2060 buyers have needed a new gpu much earlier than 5700xt owners.
@@TheNerd what is it with you people and the whole "driver issues" shebang? My 57xt has been completely solid since I got it back in 2020, hell even my msi 390x from 2015 never gave any issues driver wise.
Am I just not playing the correct games for this to be happening? Honest question, no bs, because it feels an awful lot like bs every time I see drivers brought as an issue against AMD when I just haven't seen it.
Do I need this fancy technology that halves my fps for twice the cost, no.
Do I do professional work and need software like blender, no, I'm a gamer at my most demanding.
Do I need encoding, I don't think so, this isn't going into my NAS box so I'm not worried about dvd playback.
I got what works and fulfills my requirements. Buy according to your needs, not hype.
@@TheNerd "Buy Cheap, Buy Twice"... you're not wrong, all those 5700XT owners sold their GPU for 4-5x what they payed for it during the crypto boom and got a significant upgrade over your mediocre 20 series GPU.
A VRAM-cripple ends up being an FPS-cripple sooner or later. Remember this, you 6, 8, 10 and 12GB Nvidia customers.
When the card is too fast, VRAM can be the bottleneck. A 5060 8GB or a 5070 12 GB would be terrible.
RT and FG use VRAM so it is even worse.
Yeah, my 1080 Ti (11GB) is finally starting to die.
Outside the puny circle of ultra settings elitists, there is a large portion of gamers who cannot afford expensive high end 16GB+ cards and are content to play on High + DLSS or native Medium. You make it sound like it's the consumer's fault that they bought below-16GB cards, when the market only offers them options at the 8GB card price range.
@@Ka5him vote with your wallet; Nvidia reduced VRAM from 12GB to 8GB going from the 3060 to the 4060 and people still bought it
True, upscaling to 4K I might be forced to turn off ray tracing on my 3080 in 2026, 6 years after buying it... I think the longevity has held up fine.
I feel like the few examples that work well on a 2060 go to show that it had potential as ray tracing on a budget, but DLSS ironically backfired to the point where developers optimize their minimum spec requirements for DLSS performance mode, which is just not worth using at 1080p.
As it is now, the 3060 serves as the actual practical baseline for RTX.
I have a 4070, even on that ray tracing is a no go
@@stephenallen4635 That's... what? I have a 3080, and whenever I get the option I turn RT on. How come RT is too heavy for your GPU?
@@stephenallen4635 I have a 4080 Super and a regular 4070; ray tracing is perfectly usable on both at the right resolution. The 4070/3080 I'd say are the minimum for decent RT performance: they can run it relatively well at 1080p, and with the help of DLSS at 1440p.
Performance mode at 1080p = I wanna play the game, but my GPU is not able to run it at playable fps in any other way.
As for the baseline - it's the 4060 with frame gen, and soon the 5060. Don't gaslight yourself that the 3060 was good in the first place, launching at $329; it was on par with the 2060 Super. Same goes for the 4060 - it's barely "ok" at 1080p60 without RT enabled.
If the 5060 requires RT with frame gen just as the 4060 did, then we will know exactly how the marketing of RT worked all the way back in 2018, when we had just 4 games "having RT" and the 2080 Ti not being able to handle CP2077 at 60fps at max graphics.
3060 had 12 GB of VRAM. I actually bought one instead of a 4060 because the 4060 took such a VRAM hit.
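(For context on the "DLSS Performance at 1080p" complaints in this thread: each DLSS mode renders internally at a fraction of the output resolution before upscaling. A small Python sketch using the commonly cited per-axis scale factors - treat the exact numbers as approximations rather than official values.)

```python
# Rough sketch of DLSS internal render resolutions by mode. The per-axis scale
# factors used here are the commonly cited ones (Quality ~0.667, Balanced ~0.58,
# Performance 0.5, Ultra Performance ~0.333) and are approximations.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 0.333}

def internal_res(width: int, height: int) -> dict:
    """Return the approximate internal render resolution for each DLSS mode."""
    return {mode: (round(width * s), round(height * s)) for mode, s in MODES.items()}

for output in ((1920, 1080), (2560, 1440), (3840, 2160)):
    print(output, "->", internal_res(*output))

# At 1080p output, Performance mode renders around 960x540 internally, which is
# why comments above call it "not worth using at 1080p".
```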
This is exactly why for example GN never reviews based on promises.
LTT recently released a video on a parallax 3D monitor and mentioned that about 150 games support the effect. I find it hilarious that a similar number of games support RTX lighting and shadows as support an obscure 3D monitor.
Lmao, most of those "150" games don't have any real bespoke support for the 3D tech. They just tested them. The monitor can inject the stereoscopic effect to some extent but your mileage can vary drastically from game to game.
Like the Razer haptic feedback thing Dawid just tested. Some of the games that "support" it, like Hogwarts, just give you a buzz when you cast a spell. I'd hardly call that "supported".
Developers don’t need to add ray tracing if they can achieve their desired visual quality without it. Ray tracing is just a tool.
That Ltt video was such a joke
Imagine if SLi was still alive and you could slot in a lesser card that would handle the ray tracing load.
I always wondered about the other PCIe slots and if you could put in a discrete unit with more tensor and shader cores for example but then people wouldn’t buy the higher end cards lmao
What you're describing is not SLI.
You mean dedicated render workloads, like Nvidia PhysX often does.
Only 0.0000001 of users use that kind of dedicated feature.
An RT implementation like that would be impossible: the GPUs would run asynchronously and every effect like reflections and shadows would get a terrible delay.
The CPU would need to handle more draw calls, which reduces performance.
If you did this with expensive path tracing, you would enter a bright room without lighting that only turns dark after a few seconds; that's not worth it.
It could solve the insane FPS drop, but with a big image quality drop.
@@slickzMdzn Funny thing with that. Even when SLI was a thing, high end cards were still purchased, and some people even purchased multiple cards for 3 or 4 way SLI.
Pog
The 1080 Ti will always be the GOAT. Unfortunately, we will never see a card like the 1080 Ti ever again that was all horsepower, innovative, and affordable. I am glad that my 1080 Ti is still part of my GPU repertoire, and with its upscaling update not long ago the 1080 Ti still keeps up quite well in 2024. But, as a PC owner myself, I have to be honest and simply say that we have no one but ourselves to blame for the crappy, arm-and-a-leg GPUs after Nvidia's 10 series, because people fell for the hype and bought without questioning, giving Nvidia the green light to overprice crappy products. Today's cards, in my eyes, are simply a V4 with a twin turbo (DLSS & FG), no longer emphasizing affordability and the power of the engine; the twin turbos should be just a plus, not a necessity.
I never turn RT on in any games, simply because at this point in time it is pointless. I mean, the 4090 can barely do 60 fps at native on high or ultra, and that is very, very disappointing, especially priced at 2 plus grand. If PC owners were simply more stern, today's GPUs from Nvidia would be far more powerful at native settings on ultra; yes, that would have taken longer to create, but with more bang for the buck - or simply priced properly, since there seems to be a new card every year or two now. I never buy the newest tech, always stick to one generation behind, always wait 2 years before I buy if the upgrade is needed so all the updates are completed on that tech, and always wait for half of its price once the new tech comes out. So, when the 50 series comes through and people again buy blindly, giving both arms and legs this time around, the 40 series should all be priced properly, as they should have been from the start, at half of their price. And then, sure, it is worth upgrading.
The 1080 Ti had incomplete Async Compute support. That's why every Turing card save for the 2060 beats it nowadays. In some games even the 2060 beats the 1080 Ti at lower res.
@@avatarion LOL show me one game where a 2060 beats a 1080Ti. I've never seen it even in the 1080Ti revisit videos late this year. The 1080Ti even crushes the RTX 2080 in a lot of games and the RTX 2070 Super in pretty much every game.
@@ArmadaOne Off the top of my head, in Far Cry 6 at 1080p the 2060 is faster. Like I said, async compute is incomplete in Pascal. Turing benefits greatly from games that are optimized from newer hardware, Pascal not so much. You won't find a single DX12 game where the 1080 Ti beats the 2080. It's not uncommon to find the 2060 Super beating the 1080 Ti, and in 99% of cases the 2070 Super does it as well.
@@avatarion Funny how both this channel and its benchmarks and the GamersNexus benchmarks prove you wrong.
I even rewatched both videos I mentioned to be sure and just like I said, the 1080Ti is way faster than the 2060, beats the 2070 with ease as well and is faster even than the 2080 Super in quite a few modern titles.
So, I have proof, you have nothing.
Thanks for playing, you lose again.
Also, I don't even see how a 6 GB graphics card can even play Far Cry 6 unless you really dial down the settings.
I had a RTX 3080 Ti 12 GB card and when I installed Far Cry 6, the game warned me that I needed 16 GB of VRAM to use the HD texture pack.
So the only way a 2060 would even work on Far Cry 6 is if you didn't install the HD textures and then run it at low or maybe medium settings at best.
@@ArmadaOne The 1080 Ti beats the 2060 in most cases, but in a select few cases like Far Cry 6 it actually loses. That's just a fact. Case closed. This is all due to Async Compute being incomplete in Pascal and Maxwell. Turing benchmarks made in 2018 are not comparable to benchmarks made in 2020+, when games shifted to the new console gen and on PC started being optimized for newer architectures. Turing cards have gained on Pascal since 2020. The 2060 used to be equal to a 1070 Ti, but now it beats the 1080 handily. The 2080 started equal to the 1080 Ti, but now it easily beats it in every new game.
I think Jim from AdoredTV, when he was still active, warned maybe 4 years ago that ray tracing performance does not scale linearly with hardware. That is, the RT cores and the die space they occupy need to grow ever larger for smaller and smaller gains in rendering power. So we will probably not see a breakthrough in it for a while, until GPU engineers figure out a way around that.
Maybe chiplets is the solution. You could have an overall larger die area without the additional cost and lower yields of a monolithic die.
@@chrys9256 Yes, maybe. What I meant though is that for a meaningful increase in RT performance we might need ridiculously large dies, which at the current chip market and production costs (even assuming high yield rates), will never be cheap and/or efficient power wise. Let's say a 700 mm² total die area, even comprised of several smaller chiplets is probably not the best way forward.
I've never turned ray tracing on for a single game, as I value performance too much vs the small visual improvement from RT; however, I do play at all ultra settings.
@Sashaw.-.999 Weak bait, get better.
@Sashaw.-.999 Not even a 4090 can do native 4K, full raytracing and give you a high FPS in any decent looking game.
I do the same. I'm on a 3060 laptop, so it's not worth all the other compromises to turn on RT. But even in the games where I can get it to run and hold 60fps, I tend to turn it off and get 100+fps instead. A subtle boost in visuals isn't worth it. I can see how RT is useful in the long run, especially for developers, and it will make games look better, but I still think it's a few years away. Mainly because affordable GPUs can't do it justice with their gimped VRAM.
6 years and still a meme. We already had a much more efficient, higher quality and temporally accurate way to render a game. It's called rasterization.
Lmao no, not at all.
"higher quality" the cope is unreal
@@ume-f5j I don't think he's wrong, especially not about temporal accuracy. Modern games are trading too much fidelity/visual clarity for overly computationally intensive shader effects and RT, which means they effectively have 10% better lighting at the expense of -50% or lower render resolution. Studios are not finding the right compromises between effects and fidelity for the available hardware like they did before upscaling/TAA became prevalent. If the game isn't plagued by aliasing, chances are it's full of temporal ghosting, denoising artifacts and performance issues once you start increasing the resolution.
While there are a few exceptions, I almost always prefer the performance and visuals of 5-10 year old games downsampled from 5K or 8K versus using upscaling in a modern title. Forza Motorsport vs. Assetto Corsa with mods is a great example. Forza has all the modern RT tech and upscaling to make it run well, but the 10 year old Assetto Corsa with CSP/Pure shaders and a high quality track running at a native 5K easily goes toe to toe with Forza in visuals while often doubling the framerate.
Graphics tech is in a transition phase right now, and it's really not in a good state. RT is the next step for sure, but we jumped on it about 10 years too early, and the compromises are not good.
I had to buy a 2060 12GB in 2022 when they re-released it; it was the only GPU available for over a year at my local Micro Center. My older GPU failed and I had to regress backwards through my inventory during the pandemic. It hurt to pay so much money for a 2060 12GB, and now I feel almost a little stuck with it. Although at least the modern cards aren't exponentially better.
Who is surprised? RT is a gimmick and will be until greedy companies give us sufficient hardware. Even a 4090 has problems at 4K. RT is a constant money grab.
it's not
Nah, it completely transforms some games. Some games have average RT, or settings that aren't worthwhile turning on, but you could say that about a lot of things.
@@d_shi "some" is like 4 games out of 200 games with rt out of 50000 without rt
RT is not a gimmick. It's just not a feature end-users touch, but experience. It's a tool for developers to make the visuals they create have more realistic lighting based on actual physics and math. It's pure eye candy. It can be much better than rasterized lighting effects that are artistically drawn to be realistic. But as with all things, budgets are budgets... for developers and end users.
Once developers actually develop with RT in mind, there are significant time savings since they don't have to guess where shadows should go, how items would be lit by a lamp at a table, etc. If developers stop doing lighting through traditional means, RT is the only game in town.
Well, it's no longer a gimmick: developers "sponsored by Nvidia" will now make games with sub-par raster and hide everything behind RT, and players will then be forced to play with RT on. That's why RT-capable cards have lower VRAM, so that you will buy the highest end ones just for RT.
The 3090 was the first card that came out and actually pulled it off in a meaningful way. IMO a 3080 Ti or 4070 Super is the bare minimum to get into serious RT; maybe a 5060 Ti will get there as the first 60-class card... but I doubt it.
Serious 1440p RT indeed can only be achieved on a pretty powerful card. 4070/3080 is the minimum level here.
But there are a few RT games, like Metro Exodus EE or Spider Man, which even 3060 can run at 1440p and decent quality settings.
I agree. As a 4070 super user, I can use all the RT features on all the games I've tested so far with DLSS. I think with a weaker card you might have to disable some stuff or to run them at lower quality for a great experience.
@@stangamer1151 Look at PT 1080p RTX 4090 results in Cyberpunk. Pathetic.
RT requirements are only going to increase, every time new 60 card gets better performance, optimization gets worse.
The difference wasn't that big between a 3080 Ti and a 3090. It was one of those times when a 90 class wasn't needed for high end gaming performance, due to the small increase relative to the next card down.
I just need a card with 4060 wattage, 4070 Super performance and 4080 VRAM... under $500/€500. Can we have that?!
Future 8700 XT will probably be close enough to what you want.
Do you know how to solder?
Not on team Green... but on team Red? It could happen.
@@mryellow6918 we still need someone to make a modded bios and driver after soldering
Team Red and maybe even Team Blue might have you covered. Team Green might as well not exist if you want to spend less than $500 on a graphics card.
Nvidia laughing at gamers. Just remember. AMD was giving gamers 16GB when nvidia was giving 8GB on the 3070.... Nvidia just wants to screw and squeeze people for everything they got . 5070 will have 12GB and 7070 will probably STILL have 12GB.
Aha... and where are those 16GB cards today? No one bought them. I wonder why that is... I can tell you: they were unreliable, they had MASSIVE driver problems, they even had to recall some and in the end no one bought them.
I'm happy with my 6GB of VRAM because I'm not a graphics cultist
AMD is just unreliable, bro. In my case it's not even compatible with the Autodesk/Adobe sets that I work on. I have a friend where, whenever Apex Legends updates, something stops working, Overwatch won't work properly till it restarts like 3 times, Ark won't even launch... and he has a 7800 XT. AMD cards will only provide good experiences with partner games; the last one of those was Starfield, so you can do 2+2, right?...
@@TheNerd No matter how much you want to trash AMD, the fact remains: they gave you 16GB back then, while Nvidia gave you a measly 8GB.
Now Nvidia is giving you 12GB while AMD gives you 16GB, and Nvidia 16GB while AMD gives you 20GB.
So, the trashing argument is dead. Because the current AMD equivalent are great cards.
@@miguelzl5228 Whether you like them or not, the point still stands.
RT took years to make a practically night and day difference in visuals; Metro Exodus EE was probably one of the first alongside CP2077, so by the time the 3xxx series came out, the early RT GPUs were mostly obsolete at the low-mid range (even with DLSS). The 2060 was DOA for RT from the start because it was just a marketing gimmick. Only now with the 4xxx and soon-to-be 5xxx cards is it really something worth considering despite the hit on framerate, both because of the maturing of the technology and because games are now being designed with it enabled to begin with.
Any idea how to remove the excessive white/grey wash from Metro Exodus EE?
Tried playing it with RT on a 3060ti and distinctly remember it being too bright, to a point that the original dark grim atmosphere was lost.
@@main_stream_media_is_a_joke
There is a launch command for it, -deependark if I remember correctly. Google around to make sure.
CP2077 was the game that enticed me to grab one 😂
@@main_stream_media_is_a_joke Use the -deependark launch option.
And now we have path tracing to cripple frame rates
It would be interesting to see this with the 2080 Ti and 3070 as well. Nobody can say that those were bottom-of-the-stack products that stood no chance at actual ray tracing.
No offence, but 4060 should be called gtx4060...
GTX4050 actually.
@nipa5961 i mean, sure why not.
Gt4040
fOurTy-tEn!
@@TheMamaluigi300 nah, that is too far...)
RT cores are the biggest waste of silicon and money on anything other than the highest tier GPUs.
Honest question here: How come Metro Exodus Enhanced LOOKS and RUNS so beautifully on my 7900xt with RT Ultra and games like Cyberpunk and Alan Wake 2 are so heavy? Especially Alan Wake runs and looks awful if I enable RT on full.
Because contrary to popular belief, Metro Exodus is not using RT for most of the new updated effects, like the god rays, the better fog and other stuff; that's just misleading marketing and hype from certain gamers. The fact is that game is still mostly rasterization.
On the other end, Alan Wake 2 and Cyberpunk make heavy use of ray tracing and, at ultra, of path tracing, which is a more faithful but heavier kind of ray tracing even for Nvidia's high end GPUs, though it affects AMD GPUs more.
@Argoon1981 I see what you're saying and you probably know more than me, but Metro Exodus Enhanced really looks better than the normal version. I can see it myself in the lights, the soft shadows, the GI. There is a difference from the normal version. Even Tim says in his last video that Metro Exodus EE is one of the very few games that changes significantly when using RT. And it has a smooth 60 fps, which for me is enough. On the other side, Alan Wake is struggling with light RT, and if I enable full RT it is almost unplayable and it also looks bad; it needs the Nvidia denoiser or something, I don't know, which is suspicious by itself.
I hope AMD hears this. I expect Nvidia to keep selling too little VRAM to push planned obsolescence.
I mean they kinda already have. Nvidia's step down from the 4090 is the 4080 Super, which still costs $1k with 16GB of VRAM, compared to AMD's offerings where the 7900 XT and XTX have 20 and 24GB respectively; the XTX is the only card reaching 4080 prices, and the 7900 XT can easily be found for $700, sometimes less.
@@shadowcastyt AMD is still not providing enough ray tracing performance. I'm uncertain if they will be cost-equivalent for ray tracing next generation, and having enough of it to make it worth having in the 8600 XT seems unlikely.
@@anthonylipke7754 If you're eyeballing a 5060 or 8600 XT class card for this round... forget about RT. I was baffled that in the "4090 vs XTX" thing Tim included settings that resulted in 35FPS 1% lows as "good RT configurations" for the 4090. That's 4K with DLSS Quality (so 1440p internal rendering). If we take upscaling out of the equation and look at what resolution a 4090 can deliver a never-below-60FPS experience at, the answer is... 1080p (or 1440p DLSS Quality). So the TLDR is that if you have a brain you'll be going for raster-per-dollar and VRAM. Setting texture quality to Ultra and everything else to Medium, when the card ages, will look a lot better than setting everything to Low and enabling RT for two effects.
Everyone pushes planned obsolescence. Everyone. It's not even a question.
@@jsullivan2112 In the semiconductor space you don't have to, until lithography hits the quantum wall. People always want more performance and the next node is 2-3 years out. Once that arrives you take advantage of the increased density, and presto, last gen looks bad.
RT is a scam, honestly. It was never even made to look good. It was made to save development time on setting up lighting, and offloading it to ray tracing. Then, it was **marketed** as a better graphics preset. At the cost of half of your framerate.
People believe shadows can't look good without RT, because NVIDIA killed PCSS.
People believe real-time reflections are not possible without RT, when we had functioning mirrors in DOOM 3 in 2004.
Indeed. The industry came up with new "features" to lower GPU performance, because GPUs were getting fast enough that people wouldn't need to buy new hardware. Of course it's a scam. Other tricks they pull: gimp older GPUs in the drivers and get devs to not optimize games. Also convince people they need 4K resolution for gaming.
@@p4radigm989 Not only does it benefit NVIDIA and make people buy into new hardware (selling 4K and, it's even ridiculous to remember it now, 8K was never going to work, and never did). It also allows developers to offload dev time and cost onto YOU, while they "forget" how they made decent shadows and reflections from 2012-2018 and make both worse on purpose.
Exactly. Just like DLSS, it is a scam designed to force gamers into subpar cards to maximise profits
The only good thing about RT is rendering images in 3D software. Also, those old planar reflections used to work because models did not have the same level of quality as now, and developers used them in tight spaces - spaces with a close and limited number of objects. You can't just mirror hundreds of buildings in a puddle; SSR barely has any performance impact right now.
6:15 hey uh, where did you find that suit outfit for Rocket?
Could you switch to dark mode on the graphs? Or make it less white?
This x3. Do stop flashbanging those of us with these color blindness issues.
Thanks.
Have you tried sitting in a well lit room?
@@yaldabaoth2 I am in one. Two 100-watt LED lights, a 120-watt LED and one 60-watt LED about 10 feet away. The others are very close and well placed. Bet it's the pre-diabetes thing; needing a black background and white or orange or red text is a quickly growing demographic. GamersNexus knows this and goes well out of their way to help, as do TechYesCity and others.
have you tried not keeping screen brightness at full?😂😂😂
@@Tonyx.yt. Ohh thank you sooo muuuch!!! Never thought about before! You are a life saveeer! Smh
I have a 2060 hooked up to a 1080p60 TV, and I just treat it as if it doesn't have ray tracing. For rasterized gameplay, it still feels sufficient.
I probably wouldn't have gone with the 2060, except that it was the most capable card I could get at the time that would fit in the small ITX case I used.
They sold gamers a rat's asshole for a wedding ring.
For me 40-50 fps is fine for most singleplayer titles. So for an entry level GPU that launched 6 years ago it's actually not too bad that it hits that target about half the time. Most of the performance issues seem to stem from vram which is a shame. I expect a 2070 or a 3060 to fare much better.
Fun fact: the RTX 2060 does an excellent job with one of the best RT implementations in my (and HUB's) opinion: Metro Exodus Enhanced Edition. So maybe it's still a matter of poor optimization and implementation of ray tracing in most titles?
You mean one of the earliest RT implementations, rather. And frankly, it looks quite poor. Control, on the other hand, still looks quite impressive even against new titles.
@@aladdin8623 Metro EE has great RTGI and works well on weak hardware, while the typical modern game has a few puddles with reflections that halve your FPS. Feel the difference.
@@aladdin8623 Quite poor? What are you smoking? Unlike most RT games, Metro Exodus EE completely reworked its illumination for the better, while most games with RT enabled just turn the floor wet and make the walls glossy like glass 😂😂😂
I stand by what I said. Metro EE has a lot of light leaking despite using RT, and that is a primary indicator of a poor-quality RT implementation.
@@aladdin8623 Meanwhile, RT in most titles is like: three puddles with reflections that halve your FPS.
Interesting idea. I'd like to see the same for RTX 4060
My laptop 4060 handles RT just fine when used with DLSS 3.5. Path-traced Cyberpunk and Alan Wake 2 run at mostly 60 in 1080p with frame gen.
@@BlindTrustProject The real visual difference vs raster comes when path tracing is used. Sometimes.
@@BlindTrustProject No, it is not running at 60; that is what your fake frame-rate numbers want you to think. It is not even running at true 1080p...
Look at what PC gaming has become, when many gamers think fake frame rates and fake resolutions are "just fine". I bet GPU makers are really happy with this new breed of PC gamer.
@Argoon1981 Every frame is fake; only the way it's created differs. Going forward, rendering at native 8K or more is a waste of GPU power.
@@BlindTrustProject If you are running frame gen and getting 60fps, it's not running fine. Frame gen is a smoothing technology meant to be enabled at a minimum of 60fps and above; only then does it produce okay-quality frames. Input lag will be worse than the native 60, or 30, or whatever fps you were getting before frame gen, and it looks like shit when frame-genning under 60fps.
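(To put rough numbers on why frame gen below a 60fps base feels bad: a simplified sketch that ignores frame gen's own queuing overhead, which makes latency a bit worse than shown in practice.)

```python
# Simplified model: 2x frame generation doubles displayed FPS, but input latency
# still tracks the base (pre-frame-gen) frame time, and is slightly worse in practice.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (30, 40, 60):
    displayed_fps = base_fps * 2  # with 2x frame generation
    print(f"{base_fps} fps base -> {displayed_fps} fps displayed, "
          f"input latency still ~{frame_time_ms(base_fps):.0f} ms+ per real frame")
```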
4:25 vid starts here
You're never gonna see this type of video from Digital Foundry; kudos for your honesty.
2:30 That graph is incorrect, btw. Steve made a video correcting that mistake: he used the 1080p data for all the GPUs from the 1050 Ti to the 570, but 1440p data for the 2060.
Do you have a link to that correction?
@@The11devans YouTube doesn't like links.
The fact that I’m a happy gamer without ray tracing using an RTX 2060 speaks volumes. I appreciate what the card has given me and I can play RE4 Remake and Doom Eternal without any issues.
Had a 3090 for 3 years, have never turned RT on.
That sounds like you think there's something cool about it lol.
Had a 3090 for 1 year, played 40 games, and Dragon Age: The Veilguard is the only game I've used ray tracing with.
My only use for RTX has been in Blender.
@@bubsy3861 not sure why you think that lol
@@dazzlerweb Glad to be wrong.
I got this card less than 6 months after release, which is huge where I live (Argentina), because hardware here goes for literally twice MSRP due to taxes, shipping, and so on. I ran face first into the subject of this video. It was the first and last time I adopted a new technology without waiting a generation or two of iteration for it to be properly defined. I just got a 4080 Super, which is a card that'll actually let me use these Nvidia features properly. I also upgraded my CPU from a Ryzen 5 2600X to a Ryzen 7 5700X3D. I was just about to buy an AM5 motherboard, one of the new AMD CPUs, and DDR5 RAM when I remembered this hard-earned lesson about the newest technologies. Seems to have been the right move, as the new AM5 CPUs are not yet performing well enough to be cost-efficient.
You live and you learn!
I bought the 2060 for 1440p when it first came out. Ray tracing was not a factor. I was using it up until summer of 2024 to play Elden Ring, and while it was not great, it was playable, the worst problem being frequent chunky hiccups loading new areas due to the low VRAM. I'm using a 4070 and I still don't enable ray tracing.
I had a 2070 super at the time, while using a 1080p monitor. Even with this GPU, I always turned off RT.
The huge performance impact never justified the image quality improvement.
Even when I was buying my 2070 Super I knew turning on RT was out of the question because of the performance hit for almost nothing in return. It's just crazy that people expected good RT performance on a 2060 :D
Thank you for saying what I've been thinking all this time: RT continues to be pushed as the future of gaming, and it is, but it was pushed so soon, in an effort by Nvidia to get a competitive advantage over AMD, that it required (and still does) crutches like DLSS and frame gen to be remotely viable... and before it gets there, they are pushing path tracing that struggles to run at native 1080p even at the medium to high end! And while I won't question the usefulness of DLSS (less so frame gen), it's really been a bummer to see it showcased as a mainline feature instead of asking, like you do here, for proper improvements in raw performance that would allow ray tracing to become mainstream.
Speaking of which, when testing with DLSS/FSR I would love to see the native resolution it's being upscaled from included as well, just to get the full picture. Great work with the videos otherwise!
Speaking of the RTX 20 series: 2 years ago I bought a used 2080 Ti for only $300, before the VRAM drama, and it is still very good for gaming at 1440p. DLSS and 11GB of VRAM have extended the life of this GPU significantly.
You will get a better experience than the PS5 Pro.
Raster performance is similar, but RT performance and upscaling are better.
Undervolted to 1800MHz at 0.825V, it runs silent and consumes 180-200W.
nice deal!
I guess GPUs like yours are very happy about FSR 3 and its mods.
Great card. Has DLSS which means you don't have to deal with the struggles that TAA has in newer releases. If you do want to do some RT you have better RT performance than a 7800xt in games with hardware RT. And since DLSS quality looks better than "native" TAA you can also get a decent frame rate.
Of all the games that TechPowerUp has done performance reviews on in the last year, none have needed 11+GB of VRAM for 1440p ultra. Only one actually needed over 11GB at 4K ultra. Almost half had RT always on. And their testing, along with Digital Foundry's, had DLSS looking better than "native" TAA in all tested games. New engines are more efficient with resources: asset streaming and decompression have made SSDs required and made loading 4 copies of every asset for LODs a thing of the past.
"You will get a better experience than the PS5 Pro." Citation needed.
You have low standards😂
Things become truly dire for the 2060 when you combine this performance data with Tim's previous look into the visual fidelity of ray tracing. You can count the number of ray tracing implementations that run well AND look good on the 2060 on one hand.
Nvidia GTX 1060 6GB in 2016 -> 192-bit memory interface
Nvidia RTX 4060 8GB in 2023 -> 128-bit memory interface
Yet the 4060 has nearly 50% higher memory bandwidth because of the much faster memory.
@@mojojojo6292 It came out 7 years later... that's the bare minimum the 4060 can have without being completely VRAM limited. HUB has already shown multiple times that almost the entire 40 series (except maybe the 4090 and the 4060 Ti 16GB) is VRAM limited; they're treating VRAM as planned obsolescence.
@@mojojojo6292 Yeah, I don't think the memory bus width is the problem with the 4060, just the performance it lacks. The 4060 Ti, though, does lack memory bandwidth; it should have been a minimum of 160-bit, or gone up to 192-bit like the 4070.
@@lucazani2730 Complete nonsense. I have a 4070 and it's never VRAM limited, and I game at 4K. DLSS Quality or Balanced in the more demanding games reduces VRAM requirements.
@@dominicshortbow1828 They would need to have put extra memory on it then. Bus width is 100% dependent on how many memory chips are on the card: 32 bits per chip. Which, to be fair, the series should have had; the 4060 should have been 12GB, the 4070 should have had 16GB, and the 4080 should have had 20GB. Hopefully the 50 series fixes this, but with 3GB chips being available for GDDR7 the bus widths will probably still be very narrow, though overall bandwidth will still be fine because the memory is much faster again.
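(For reference, a rough back-of-envelope version of the bandwidth comparison above, assuming the commonly quoted effective memory speeds of about 8 Gbps GDDR5 on the GTX 1060 and 17 Gbps GDDR6 on the RTX 4060.)

```python
# Memory bandwidth (GB/s) is roughly: bus width in bits / 8 * effective memory speed in Gbps.
def bandwidth_gbs(bus_width_bits: int, effective_gbps: float) -> float:
    return bus_width_bits / 8 * effective_gbps

gtx_1060 = bandwidth_gbs(192, 8.0)   # ~192 GB/s (192-bit GDDR5, assumed 8 Gbps effective)
rtx_4060 = bandwidth_gbs(128, 17.0)  # ~272 GB/s (128-bit GDDR6, assumed 17 Gbps effective)
print(gtx_1060, rtx_4060, f"{rtx_4060 / gtx_1060 - 1:.0%}")  # ~42% higher despite the narrower bus
```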
Thank you for this video. It proves a point I was making back when these launched: there was no point in getting a 20 series card for RT apart from the 2080 Ti, and maybe the 2080 Super, and even then you had to be an enthusiast. Even the regular 2080 was soon too slow to really handle RT, plus it too was crippled in terms of VRAM.
These first- and second-gen RT cards sold by the millions, and all people did was try RT for a few minutes before going back to double the FPS at almost no graphical cost. The same is true on AMD's side, of course.
Would you guys do a 2080 one as well? So we can get an idea of both the 60 and 80 tiers of the first RTX generation.
03:00 Did they increase prices because gamers were buying expensive GPUs, or was it more because crypto miners bought up all the cards (and later resold them in crappy condition for big bucks) while manufacturers couldn't keep up with demand?
Hey! It would be helpful if you guys added the year each game launched next to its name; it would let us viewers better understand how well the card has aged over time and against newer games. Loving the content!
Google exists
I had this card for a while before switching to the 6700 XT, which is sooo much better! I bought it because it was the only GPU I could afford, I never bought it for the ray tracing, and I don't regret buying it.
1:44 That is the exact model I am using now! I still use my old OG 2060 6GB from 2019. Games like Starfield make it cry and join SpaceX for takeoff. Games like RDR2 make it crash with "Insufficient memory" errors or something like that. On my next tech refresh I'm gonna aim for at least an 8-12GB GPU.
I never had any errors with RDR2 on an RTX 2060 at mostly high settings.
Heck, I recently played Jedi Survivor and even my 3090 didn't have enough for ray tracing at acceptable framerates in that game, lol.
So yeah, I'd say it's Nvidia's fault for mainstreaming this superfluous tech, especially way before GPUs could actually run it. But we also gotta admit, ray tracing or no ray tracing, AAA games have become dangerously unoptimized, even when they don't release completely broken.
Everyone loves to forget the CPU hit RT has.
Made even worse by Nvidia's driver overhead.
Because nobody expects the cpu to understand what ray tracing is, let alone how to support any part of it.
@@ZeroHourProductions407 That doesn't even make any sense?
Ray tracing has a big impact on CPU usage. It doesn't matter if the CPU doesn't "understand it"
@@ZeroHourProductions407 Ray tracing and path tracing, before real-time ray tracing on GPUs, were a CPU-only thing for decades, so why wouldn't CPUs "understand" ray tracing?
@Argoon1981 because it would take an entire week per frame for a CPU. Pixar needed Crays with thousands of processor cores to even render stuff like _Toy Story_ in the drug-addled attention span of a Hollywood movie producer.
Unpopular opinion: RTX is for DLSS only. Ray tracing not only tanks every game's performance, but in many cases also produces worse outcomes (subjective: it might look closer to reality, but the question you have to ask is whether it's fun to play?!).
The RTX 2060 is now a 6-year-old midrange card. For example, Pixel Shaders were introduced in 2001 with the GeForce 3. Who would have expected to achieve 60fps with max settings in Full HD by 2007? Probably no one. The only real shortcoming of the RTX 2060, in my eyes, is the measly 6GB frame buffer.
2060 was top of the bottom range. The 2070 was the true mid-range card and 2080 high end.
It's more of an issue that we got terrible generation-over-generation improvement this time, because Nvidia artificially raised their product tiers as a cash grab, where the only maybe-worthwhile card gen over gen was the 4090, which was insane. If the 4070 (which is really a 4060 or 4060 Ti) were in line with its historical price point, no one would be worried about upgrading.
The problem, IMHO, isn't that the card is old and no longer capable of running modern titles with RT, but that it never was. If it had had a good run, but after 6 years were simply too old to keep up, fine; not everyone can be a 1080 Ti. But for the first couple of years, Nvidia's marketing's and their fanboys' line was that buying RT cards is a form of future proofing, for when we finally get to the point where basically every game has, or even requires, RT. And that was actually quite a successful argument; everyone brings it up whenever Nvidia cards are much worse value than the competition in pure raster. Trouble is, by the time a decent selection of games that use RT finally arrived, and now even a handful that use it in a way that gives a meaningful visual upgrade, the card is obviously too weak for it. There was never a time when it wasn't, just no games with meaningful RT around to demonstrate it.
Why go back and test the lowest-end RT card ever, 6 years later?!?!? Why? I owned a 2080 Ti, which to me is a 2K 120 or 4K 30 card. I've had a 4090 for the past 2 years; why go back 6 years later to test an entry-level RT card? Why? I love this channel, it's my favourite by far and the only one I trust along with GN, but why 6 years later… nobody's buying a 2060. Why, Tim?
@@davidmorgan05 Because it's Tim, and he's probably being slowly phased out of the channel.
I sold my 2060 KO last year and sidegraded to an RX 6600. I actually made a little money on the swap and gained some performance, 2GB of VRAM, lower power consumption, and quieter fans. Very happy with that decision; the RTX features were useless on the 2060.
Just a reminder: the R9 390 came out with 8GB of VRAM in 2015, for the price and performance of a 970, and that card had 3.5GB + 512MB of slow VRAM. Intentionally gimped; this is what monopolies do. Nvidia never changed, yet I never get people who buy their low- or mid-range cards. If you're going to go Nvidia, only get the top-line card, otherwise you're intentionally screwing yourself over. And before someone calls me or assumes I'm an AMD fanboy, I'm sitting with 2x Titan XP in my rig. I don't give a f, I'm just pointing out the obvious.
Here's my problem:
Here, AMD is only about 10% cheaper than Nvidia.
BUT old Nvidia cards (from the fanboys) flood the second-hand market.
It's literally cheaper to buy a 3060 12GB than a 7600 XT 16GB.
And yes, I thought Intel should also be cheaper... nope, the A770 16GB costs the same as an AMD 6800 16GB.
@@weltsiebenhundert Yeah, the GPU market's been screwed for a while: Bitcoin, COVID, and now AI.
I admit to being ray tracing curious when the 2060 was released, but following your advice I skipped it and went with the 2070 Super, then realized I don't really care about ray tracing.
Paying the early adopter tax on a mid-priced product never bodes well. Especially when the previous generation jump (9xx to 10xx) saw the 1060 delivering 980 performance, while this thing struggled to put any distance between itself and the 1070 (while losing 2GB of VRAM).
It was 20% faster than the 1070 using pretty much the same node. Tf are you talking about??
My 3600X really hampered my 1080p RT experience. I'm using a 3070, and upgrading the CPU to a 5800X3D increased RT performance.
Yeah, looking back at it I'm happy I got a 5700 XT back then; it was a great GPU.
The drivers were horrible.
Returned my 5700xt for a 2070 super. Best decision. Still use it.
Best GPU at the time!
The 2060 is what you get when you let one hardware company define what the new hot features are. It used to be the games and apps that drove that; then Nvidia came along and decided they'd do it, and set the price for it too. People played along, and now we have an unusable low end, a mediocre mid range, an upper mid range that's decent but invariably overpriced, and a high end designed to separate people from their money with performance they can rarely actually take advantage of.
Facts
I'm shocked. Shocked!
well really not that shocked
With ray tracing, you are going for excellent visuals. It doesn't make sense to use low graphics settings that degrade visuals in order to turn on a feature meant to improve visuals. Ray tracing only makes sense on graphics cards powerful enough to run it at high graphics settings, and at high RT settings as well, since in the half-measure RT games the RT effects are barely noticeable, if at all. The overlap area of the Venn diagram between cards that can do it and games that offer it is still super small. Probably just Cyberpunk 2077 and Alan Wake 2, and only on a 4070 Ti Super, 4080, 4080 Cheaper, or 4090. I don't anticipate running any games with RT until maybe 2028.
The perfect duo of lying, coping fanboys: "My i7 4790K still runs everything" and "My RTX 2060 gets 60FPS with RT in [insert game that doesn't have RT]".
1. "My i7 4790K still runs everything" - show me a game that doesn't run if it's a lie. Low framerate is till technically "running" ;D
2. Today I bumped a scene in UE 5.5 from 40 FPS to 60 FPS by switching to hardware RT with MegaLights on an RTX 2060. To be fair, I was purposely using a lot of shadow-casting lights, the noise was quite bad, and upscaling was enabled, but it's real, so it counts ;P
I had an i5-4690K; it bottlenecked my 2080 Ti, so I moved up to a Ryzen 7 2700X. Still wasn't worth running RT on anything.
The cope by Nvidia fanboys is strong but the fact is that RT has been and continues to be a meme for most users.
@@kazioo2 Did the OP just call out your specific build?! 😂
@@kazioo2 we're hitting levels of pedantry we never thought possible
I always find it funny when they say "MY xyz has this capability" as though they manufactured it 😂. It's all the same; yours is not special.
And this is exactly why ray tracing didn't factor into my buying it back in 2019. Everything else lined up for it as a mid-range GPU to get me playing Destiny 2 on PC.
I've since moved on to a Radeon RX 7800XT, play more at 1440p than 1080p, and upgraded the CPU to the 5700X two years ago. Ray-tracing is interesting, but still not important to me right now.
Ray tracing doesn't even make that big of a difference in visual quality. Most gamers will always favor higher resolutions and frames over RT.
Hard disagree. In titles where it makes a big difference, as soon as RT is on it looks far better than ultra rasterized settings. Control with a few settings to medium and with RT looks much better than Ultra and no RT. Metro Exodus Enhanced Edition at high-medium looks far better than the regular one at max settings.
@@tensorup6595 Not really, it depends entirely on the game and the skill of the devs. I find that when developers are good at traditional rasterized lighting, the hand-tuned non-ray-traced lighting still looks better than ray tracing, and generally less dark, even if it's technically less realistic.
@@geerstyresoil3136 Lighting and shadows are mostly on par between raster and RT, but every game with a ton of reflective surfaces looks waaaaaaay better with RT enabled. CP77, Spider-Man, or Watch Dogs, for example, are totally different games with RT on. With RT enabled the games feel way more lifelike, but I also tend to disable it because I can't play with the terrible framerate a 3080 gives me with RT on. I also tried FSR 3 mods, but they look and feel terrible.
@@tensorup6595 100 percent. We have games like BMW (Black Myth: Wukong) where the High preset with full RT looks and runs better than Cinematic without hardware RT. The difference it makes in that game is night and day; it lets you actually see the detail in the textures. Games typically take around 5 years to make, so it's no surprise that we're now seeing games developed with RT in mind, not just tacked on top of the baked lighting. Half the games TPU did performance reviews on this year only have RT lighting, with no way of using raster lighting.
Just for fun, let's not forget RT Minecraft. It looks great; sadly it's on the bad version of Minecraft.
Great material, Tim. Congrats. To others: a lot of performance can be gained with OC/UV (around 15%) and with settings in between the presets shown. In Cyberpunk, GI set to Medium looks like Ultra but nearly doubles performance, though you need to turn off some raster settings to stay within 6GB of VRAM. And starting from 40fps you can enable FG and push the card to almost 75fps, which usually looks close to native. This is still an amazing card; it just needs 1080p and some tweaking to fit in VRAM.
Remember fellas: heavy rt is only present in a handful of games, some of which you'll probably never even play
this
2:31 It's weird to see a mid-range Nvidia GPU at $500; that was just 5 years ago, and today you have to pay $100 more just to get an xx70. 😟