Contrary to the table at 0:13, the RTX 4060Ti is PCIe x8, not x16. We also said the X370 chipset does not support ReBAR but that was added in a later update. We apologize for the confusion!
@@karolwesoek5350 I think most board that is still rocking PCIe 3.0, will be bottlenecked on the CPU before you buy this GPU.... The argument for x8 bus is more relevant to 50-series GPU, which either just run on PCIe, or just sip powers from one 6-pin/8-pin so it does not overwhelm the poor used-office PC PSU. The memory bus already knee-cap this GPU anyway.
Couldn't it be Nvidia chose to 'only' go with a 128bit bus because whenever they used a 256bit bus in combination with the new larger L2 cache the performance of the 4060ti would be to close to the 4070? Would be interesting to find that out.
And the 3060ti might actually also be more powerful than the 3060ti for people with pcie 3.0. NOBODY NEEDS PCIE 4.0 IN THIS LOW END, EXCEPT WHEN GPU MAKERS ARE SO CHEAPO WHILE MAKING CARDS THAT USES ONLY HALF OF THE PCIE LANES. This is a pcie 4.0x8 card as far as i know.
the 3060ti was slept on so bad, I gave my 3060ti to my brother for his pc when I upgraded. He's able to run games so smoothly at 1440p 240hz with no issues. Forza runs like a dream, he plays GTA V mostly with redux and it runs like a dream. Hands down the best budget card ever imo, 3060ti was unreal
Here at Nvidia we made a mistake with a few cards, and realised they were a bit too good and had far too much longevity for our tastes. So we went to work and made sure that won't happen again: we are proud to announce, the 4060Ti, a card that will be almost useless for new games in 6 months!
Boys, it's time we rise up. Lets..... go outside, I know its hard, but you have to be a soldier and do it. Lets... play physical sports like soccer, volleyball etc. Lets get dirty. We dont need ngreedia.
I literally just got the 4060ti without heavily researching it first (oops). I paired it with a I7-7700T , 32gb and 1tb nvme (gen3). All these components were sitting around collecting dust, so I wanted to make a low-wattage gaming rig. I'm also gaming on a 1080p 180mhz monitor and so far, it performs extremely well, no complaints whatsoever. It obviously crushes 1080p gaming so I'm perfectly happy with it.
I have a Ryzen 3200g, 16 gb ram, 500 GB SSD and a used 1660 super (which I got for 110$ a year back). I am still crushing 1080p 60fps gaming. That's why the whole 40 series looks like a comical scam to me. 😂😂 So for the first time, when I do upgrade I think I will look at AMD or Intel this time. Nvidia is as hopeless as nokia at this point.
I'm having no issues with it, I don't need 4k, and in most cases, I'm at 1440 easily at 60fps. I'll just have to upgrade a little earlier than I want. No biggie.
I wonder if they'll go google and call it quits after just one full card. What I don't get is how they can do so badly. They company has made integrated graphics since forever, and CPUs since time.
@@gorkskoal9315 Intel has atleast 2 more generations from what they said a couple months ago. I just hope Battlemage is a good price to performance like ATI/AMD used to be because many people would jump to it.
@@gorkskoal9315 Do so badly? Linus just showed were the A770 is beating the 4070ti in some task and not far behind in most. Drivers are constantly improving each month and the raw power within the card can surpass the 3070 in time, all for less $$$. I'd say they are doing amazing for a first time being serious in the GPU market. They are defiantly committed for two more gens being Battlemage and Celestial, Battlemage aiming for 4070 to 4080 performance from what I hear and Celestial probably on par with whatever 50 series mid range will offer (5060ti/5070) And will more then likely not increase much in price, maybe $400 to $500 for the top end model. I think once more people feel safe with the drivers, which are already pretty stable, they will sell good AND iNTEL will continue to keep new Arc cards rolling out.
@@gorkskoal9315 What meme echo chambers have you been staying in? The fact that the company only did IGPUs for so long is not only taken into consideration by the tech commentary space, it was a concern that Intel would have trouble tackling a discrete GPU after only having that as experience since iGPU =/= discrete GPU, especially a discrete GPU with all the features Intel was promising, and they'd have to compete with a duopoly with literal decades of experience with discrete GPU drivers. And they did have trouble, so it very much was _not_ a surprise and people were shitting on them for the totally expected outcome. Intel actually did better than expected with the hardware being decent, since people were expecting they'd botch that too, and most of the problem being the expected driver inexperience and bullshit interface software. Shitting on them that became less loud with the driver improvements over the last several months.
Don’t forget the 64Bit buswidth and pcie 4x4 connection and decreased shadercore count But it will feature dlss 4.0 Get 4 inaccurate fake frames - unusable in any non slow paced offline point and click adventure game - for every upscale frame produced Oh did we forget to mention To keep manufacturing costs down upscaling will be on by default Now to the price It’s just 600$ for the 1 Gig and 750$ for the 2gig variant
Looking around the tech channels, this looks like the most positive review of the 4060 TI today, and the conclusion was "Buy Arc A770 instead". Amazing.
Wouldn't make a difference. People keep buying. They can slap a 5x increase of price at this point and it wouldn't hurt them at all, people have disposable income, or they save up for the latest and greatest GPU. The money isn't even in the consumer market anymore.
Well, the competition is busy sniffing glue and pricing their less appealing products at similarly high, rather than trying to gain marketshare with sensible prices
As a 3070 user, it's also aged very well, if you can even call 2 years "aging". So not only is Nvidia being beaten by amd, its being beaten by itself from just recently lol. Edit: not saying this to be an nvidia fanboy, i only got it because i had a great opportunity by getting a prebuilt with it, and I actually use DLSS 2.0
@@calmkat9032 "So not only is Nvidia being beaten by amd, its being beaten by itself from just recently lol." That's due to NVidia's new 'Self-Flagellating' architecture.
@@paulbantick8266nvidia's architecture was soo good, but their scalped price margins were too high to ignore, so they shifted their dies up a tier to upsell us consumers :(
@@PannyLanny My undervolted 3060 draws about 85W-115W tops (mostly around 70W), while delivering all of its performance. So 4060ti can probably be undervolted to about same level maybe even more.
@@groenevinger3893 bro Nvidia fucked up so bad, that it made AMD's last gen 6000 series GPU look like even better option, and with the huge price cut you can get a 4070 type GPU for less than a 3060ti price.
This ^ I bought a 3060 12gb to keep my old gaming PC going. Then built a new gaming pc with a 3070ti a year later. That 8gb 3070ti isn't looking too good now, but not like there's much alternative in the 40 series anyway 😂
Yes it was. A lot of people wanting to justify the 3070 and 3080 say the 3060 should've been an 8GB VRAM card, not realizing its all the Ampere stack that needed way more VRAM and the 3060 was the only gem of the crop
I got a 3060 for my first "workstation" PC and it feels pretty good to be able to run UE5, Photoshop, and ten different browser tabs without it breaking a sweat. You really feel all that space in VRAM.
I think you mean the 6700XT. Really, pretty much the entire RDNA2 lineup at current prices looks absolutely amazing compared to the current generation from both companies right now.
@@Blackfatrat I thought of getting one for my new build as well, but it still isn't available in my country. Hope in 5 years we'll do our mini scrapyard war with an A770 and a 13700K
We might actually get to see a shake up in the graphics industry next generation thanks to Intel. A770 Made me really excited for battle mage as well, Intel is crushing it!
16gb arc enjoyer here, if you value peace of mind don't do it. If you like at least a tiny amount of tinkering you can do it. If you want sff build with arc like me- fuck no
If I was NVIDIA I would be pretty damn embarrassed to see the 6700XT go toe to toe with the 4060TI. Lets hope AMD doesn't screw up the 7600 and 7700 pricing so the whole 4060 series can be totally irrelevant.
It would be an easy win for AMD, launching their mid tier cards at a competitive price. 6700XT and 6800XT are already really good deals, make the 7000 series a little bit better without being greedy and Nvidia will have to rethink their strategy.
@@oxfordsparky In ray tracing the 6700xt has the 12 gigs, so when even more and more Ray tracing workloads use more then 8gb, the 4060ti will be completely unusable, maybe the 16gb card is more useable but it really depends on the ray tracing load on if it is heavy on the bus or not, while the 6700xt gets some framerate in ray tracing compared to the 4060ti 8gb. In Content creation, a 4060ti (16gb variant) probably compares to or beats 3080 noting that a 4070 beats a 3090ti in content creation and probably beats a upcoming 7950xtx & 7900xtx in blender, So that point still stands.
AMD needs to price the 7600 below the price of the OG 4060 with performance in between it and the 4060ti Ie: Peformance around a 6700xt. I think a good price for a 4060/4060ti killer would be between $200-$280.
I know some people with the card, it is still a heavily buggied mess, but it has improved a lot. I absolutely do not recommend the card for non-tech savvy people or people without patience for crashes and bugs, since you will experience a lot of those, and it will take years for them to solve the majority of them, if Intel doesn't drops the venture first, but when you hit a game that is not bugged (usually DX12) they are a great bang for the buck. For the less adventurous the RX 6500 XT, RX 6600 and RTX 3060 12GB stands on the same price ranges with comparable performance and way better energy efficiency, though, but the Intel offerings might potentially improve past their performance level with the time, or not, time will tell. The A350 is also nice if you need cheap AV1 encoding for less than $150 bucks, but the card is kinda doo doo for gaming.
notels snarc gpdont?It's emulating direct x9 and is, frankly, a bug ridden pile of shit compared to the other options. at an effective 400 dollars, before tax and shipping you can get a 3060ti, 3060, 1060ti and 1060, and well you get the idea a few cards that do just about as well without half the suck.
As someone daily driving the A750 it's perfectly fine. I've had no issues playing anything (Ark, Warcraft 3, Rocket League, Tomb Raider, CyberPunk and GTA5 mainly) - I never had any issues from day one though about a year ago when I bought it.
Happy to see the Arc A770 punching here and there, even if it's a mixed bag of results all the time. I expected it to catch up at 1440p and over. Oh well. EDIT: In GN's review it matched the 4060Ti on Cyberpunk at 4K. Not bad!
It is sad that all these influencers have decided they can get more response and comments by jumping on the 4060TI negatively. I just installed mine and love it to death. It is massive, impressive and fast. It has the newest 12 pin high power connector and with inflation added, I think it is a great value. These people have a sytem they use when making videos. UA-cam tells them what they will pay more money for, and they respond in kind. Hate videos do better than informational videos many times.
"We realized that we were giving consumers cards that are too future proof... so we decided to make each new generation become obsolete in a years time!"
Thanks for the great review of the RTX 4050ti. This gpu seems very reasonable at $250 and I’m sure the budget market will be happy to finally have a marginal upgrade over the 2060 Super.
Jokes aside... even the 2060 super has 8gb of vram, lol. This card isn't an upgrade for anyone unless they're on a 1060 or older imo, and even then, still better options for the price.
The Arc A770 has been getting huge performance upgrades since release and it would be an interesting video to see how far its come since. Probably not a "30-Day Challenge" video, but a Value Proposition Card compared to others around it's class currently.
But isnt Intel scaling down their GPU division operations? How long can they continue this run of performance increases? Intel is a bit of a dark horse in this GPU space, but we all should absolutely be cheering them on.
@@scroopynooperz9051 As far as I know, those were rumors as word got out that Intel was closing or downscaling their GPU Division. BUT Intel has since dispelled the rumors and released a diagram saying that they will work on GPUs from Battlemage (B-Series) up to their Druid (D-Series) Architecture. It's at least guaranteed that Battlemage will come out by the end of 2024.
I feel like it should've been mentioned that the 4060 TI has a x8 PCIE 4 connection. Would've been great to see you testing the 4060 TI on a PCIE 3 system to see how much performance might be lost.
I guess that’s part of the agreement Nvidia would never allow that Like they wouldn’t allow similar priced higher tier amd cards shown in the same chart
@@CrocoDylianVT You do, actually. These days, texture streaming is a thing, thus making the bus the bottleneck. 128 bit is a joke. No matter how you look at it.
Love my 6750 XT and probably soon 6950 XT. Imo no one should be buying nvidia rn cause of the insane greed and afwul price to performance compared to team red. Dont know why so many people are still buying cards like the 3060 which is awful value
This generation of cards (both companies) is a tough sell... the more (generations) we progress the more caveats we hear to justify the price... The 1080ti was a miracle. Best purchase ever.
@@mrcaboosevg6089 You see no reason to upgrade from a 2080? Do you not play newer games? I am on a 2080ti and absolutely ready to be done with it. Ordered a 4090.
Just a friendly reminder that the rx 6950XT is going for about $600 still. It has gaming performance on-par with a 3090, only suffering in ray-tracing and productivity tasks. If you're primarily a gamer, it's really worth taking a look at this card, it's an unbeatable value.
@@remeuptmain true, I consider anything below a 4090 a waste of sand but I don't want to pay 2000+ euros for it even if I could buy literally 100 cards because that would be burning money away. These cards need to be half of the price to be worth it, all segments, from both nvidia and amd.
dude this graphics for the graphs are absolutely SPOT ON. LTT have clearly listened to their community about feedback on how clear the graphics could be, and have made massive improvements. thank you so much
@@muyoso It's different approaches, and that's OK. GN includes a large variety of cards, but LTT reruns benchmarks for every card in every review. One isn't better than the other, but it shouldn't be surprising that GN's breadth and depth of data is bigger.
@@muyoso those are different styles. GN shows a lot but i have to pause every time he shows a graph otherwise i will miss something important. LTT shows a lot less, but what they show i can see at a glance.
I do wish they would skip the weird squares in the background, makes it harder to look at the content. edit: it's the LTT logo... makes it looks like crap in 480p.
I actually got a killer deal when I got my 6700XT (was over $100 cheaper than the 3060 non-Ti at the time) - and can 100% agree, as long as you don't need ray tracing, it holds its weight insanely well without breaking the bank.
Ok as an Intel Arc A770 LE owner it is very much deserved. This card doesn't do well enough in modern games vs an RX 6600 to justify your entire library of older games being a buggy mess that doesn't work at all 20% of the time due to game breaking bugs. I've had the card 1 month because I listened to everyone complain about VRAM, and who cares about VRAM when the experience is literally garbage. Intel has 3 different programs to control the cards software and the card can't even boot correctly from sleep mode 8 months after launch. The performance is only sometimes good in optimized Direct X12 brand new sponsored games, and it is a joke for everything else. I replaced an RX 480 8gb and I feel like it did a better job than the ARC card. While I may only be getting 40 ish FPS, its better than playing a glitchy mess with high FPS because of emulation. The AV1 encoder also doesn't work half the time in OBS for no apparent reason either. Theres entire forum posts filled with bugs that are known for months and they will never get to them because the ROI is not just worth it for them. Just buy the RX 6600 for 200$ on a budget and upgrade whenever you have an issue. Because think of this. Spend 200$ now keep the 80-150$ you'd spend to get to the RX 6700/6750XT series, throw it in a stock market portfolio and use it to buy a brand new RX 8600XT realistically or something like what that might be. It'll be the latest product with the latest feature set as well.
@@poochyenarulez meh I'd give it another year to smooth out. Point being that the Arc cards are going to age much better than this piece of shit card. There is not improvement to come for the 4060ti, this is it. It's done, it's garbage forever, but Arc still has room to stretch, and provided Intel stays true to supporting their GPU division it will get better.
@@RyTrapp0 it's not going to take nearly as long as you'd expect. This may be hard to believe, but Nvidia basically had to nuke all driver optimizations after Turing. This is also why post Turing a significant number of older titles encounter strange artifacting or similarly terrible DX8/9 performance when compared against older generations of cards. The only exception to this being the rare few titles which are still popular and on DX9, which amounts to essentially none now that CS has moved to source 2.
@@poochyenarulez As someone else who uses Arc... they're better... 😐 They're not "dumpster fire" tier like before January, but seeing as how they now just got to the point of having Arc Control launch at start without requiring UAC and now enabling fan control, they have quite a ways to go. Granted, Intel has put in a lot of leg work to fix them, but starting from so far behind means that DX11/10 performance is a bit janky. DX9/VK isn't great either, though that might just be because I play janky map games and I have the thing clocked as high as it will go while not BSOD'ing every five seconds. It's considerably better than at launch. Also, DX12 performance is insane. I personally don't stream, so I don't know how nice it plays with OBS, but in Handbrake the thing is a beast.
The 4060 Ti made me realise how insane the value and the price to performance ratio of the 3060 Ti back in 2020. I was lucky to get one at MSRP when it was launched.
HEll, I'm still enjoying my 1080ti purchased on its release day, which can still render most new games at 60fps at max settings as long as you're happy wit 1080p resolution, which is the best my TV/Monitor is capable of anyway.
Power consumption, heat and noise are big for me. I don't know if I speak for many PC gamers, but I tend to run discarded professional workstations as a platform for my gaming PC's. I can usually get them for free after they are a few generations old. These PC's while powerful and able to support discrete GPU's, they do not have tons of airflow and also their power supplies don't have the same wattage or PCIe cables as aftermarket gaming PCs do. For me, staying well below 200W for the GPU is a hard requirement. I am finally able to replace my 1070ti with something that is nearly twice as powerful while staying in (actually below) my existing card's 180W thermal envelope. The same cannot be said for the AMD cards or even the 4070 series.
@@Ja_Schadenfreude Not good value but it's a good card in itself. That's what they said. Get something else, but if price was better this would be good.
@ww7285 do you get off on hating. This wasn't a positive review the conclusion is literally that the pricing is designed to get you to spend more money than you should but that luckily the competition at this price is pretty good even including intel. How can this be anything but negative when one of the conclusions is maybe try ark instead.
It's on pat with the 11th gen Core CPUs! Actual true waste of sand, since 10th gen was generally better in every way... But this GD Furry60 shit is coming real close. Lol! I know the 8x vs 16x thing is probably not a huge deal for most... But fuck me! Who thought that would have been a good decision? Simply from an online trolling standpoint! I intend to make the fuckin fun outta that... And the decrease bus... And... And. . ..... WTF?
it might be only 10% faster, but the most important part. And this is the part that really matters. Its profit margins are 100% more than the previous card. Thanks for the leather.
The moral of the story is that if you're able to find RTX on a deal, then you can be 100% sure that there is a RX card somewhere that performs same or even better with lower price. With is only NVIDIA's fault and nobody's else. Great work!
"Yesterday's performance for yesterday's price!", I feel like the prices of used 3060ti's and lower won't go down much with the release of this one, just no insentive. :(
@@aleksandarlazarov9182 I think it goes "Yesterday's performance for tomorrow's price" but yeah... The whole value of 4000 series is so bad that prices of last-gen barely moved (at least where I live). Also, there are not that many people upgrading so it's also harder to find something used from 3000 series.
@@AdamKuzniar Yeah, the CUDA stuff from NVIDIA has great support from any software in existence and there is no denying it, and only then there is some kind of AMD acceleration (but also probably inferior anyway) But that's kinda the issue as NVIDIA has so much profit from the enterprise that needs CUDA acceleration as well as AI stuff that I don't see just why they're treating the general customer in such a bad way for giving us such bad value products just to make it feel more premium :( I feel like only self-employed people would want to seriously use CUDA or AI acceleration on something like a 70 series card or lower just because they already had that card anyway. Any serious business would go straight for 4080 or 4090 or some Quadro thing as time equals money. Those same businesses can also justify spending such a ridiculous amount of money for a 4090's xd
NVidia's engineering effort seems to be all-in on product segmentation rather than product quality. The cache+reduced memory bus allows decent game performance while hamstringing the professional/AI workloads and so allows for better upselling to the higher-tier product.
Hey, it's a loss for the consumer, but they're not cannibalizing their products either - something AMD has already stumbled on a couple times with their Zen chips.
@@squidwardo7074 Notwithstanding the gimped memory capacity, I/O, and PCIe connectivity, the core has about the same performance as the RTX A4000 professional card, which is currently retailing for about $930. So, yeah, it could easily bite into professional sales if it wasn't so neutered.
@@squidwardo7074 My university loved to buy Nvidia's gaming cards for machine learning research back when I was there 6 years ago. Quadro and Fire Pro cards are way too expensive for some organizations.
Just wanted to say BIG thank you for highlighting the graphics card in question in the graphs of this video! I don't often comment on UA-cam videos, but this improvement is so worth it! Big praise to the people who create the graphs ❤
All the videos I've seen testing the 4060ti show it performing really well in games like cyberpunk (with DLSS) and Red Dead Redemption 2, I currently have a 1070 and Im intending on upgrading to a new build with the 4060ti 8gb which I could just swap out for the 16gb version or a 4070 sometime in the future if I really needed to if there really is a VRAM issue which from what I've seen there actually isn't. I haven't even run into any VRAM issues with my 8gb 1070 at 1440p, it's just that its just not powerful enough anymore and I want access to DLSS and RTX.
The VRAM concerns are valid because the current gen consoles have 16gb shared between textures and everything else. When they port these games to PC it translates to needing more than 8gb VRAM, especially if you turn texture details up. Native PC games might be a bit more optimized but simultaneously tech marches on and those games are going to need more VRAM anyway.
Yes, and it would be pretty pathetic if a graphics card that cost as much as a current gen console on it's own was forced to use settings that make it look worse than said console because of VRAM limitations.
seriously, it almost feels like nvidia lately is secretely spearheading sony's ps5 marketing campaign. Why play on PC at all when the GPU alone costs as much as an entire PS5 digital?
The competition is more fierce than ever before so much so, if all you care about is gaming, this card is already 200 overpriced, if this thing sells then, it's because it offers good enough value or because the consumer isn't very smart.
If Nvidia had put a decent width memory bus in this thing, it would probably be a bit more competitive. As it stands, they are not going to sell many of these things.
I don't think ngreedia gives a shit. Amd might be spanking them in the 1020-1440 raster user area, but that's what? 10 maybe 12% of nvidias target? Testla only needs the Cuda cores for their "ai" to work. I have no idea if they need bandwidth. render farms might need bus and clock speeds but those guys have more than enough cash just from 2hours of interest alone to buy up a fuckton of creem of the crop cards.
This is effectively e-waste at its price point. The only thing I can see this being alright in is prebuilt PCs from the usual guys (HP, Acer and so on) or people who don't want to break the bank but need/want to upgrade. The issue I see is people deciding to jack up the prices on 3060ti and so forth like the 2xxx series were.
I bought a 6750XT in January for about $430 including tax. I was worried when the budget RTX cards released I was going to be upset I didn't wait. I want to take this time to thank Nvidia for not making me feel bad about my purchase, in fact, I'm happier with my purchase now than I was a week ago! 😂
I paid $1,400 for my 6900xt during the shortages. I didn't feel too bad at the time as it was only $400 over MSRP, but I felt worse later on when everything came back in stock. Now with these absolute lackluster releases I don't feel so bad. Also since 6900xt's have gotten so cheap I'm not afraid to OC the piss out of my card now.
@@YourLordMobius I overclocked the snot out of my 6750XT basically as soon as I got it since I got it for pretty much a steal, especially since it was the Red Devil variant.
I just recently oc'ed my 6900xt with the morepowertool. Peak it's pulling 395w and won't pull anymore even though it's set to limit at 450w. It's on water so no worries there. Stable in most applications at 2800 mhz. So yeah, I got a little extra performance out of mine LOL.
@@YourLordMobius I'd say! My buddy has a 6950XT and he's a little bit computer illiterate. I tried to talk him into letting me OC it, but he's super worried I'm going to screw it up lol
@@tbthunderer ah yes, the old "you could blow it up if you OC" fear. Only way you're going to blow one up is by overvolting. Even with the massive amp draw on mine the voltage is still relatively tame, and it just won't pull any more power even though it's completely unrestrained. Edit: Radeon cards have always been notorious for allowing crazy high amp draws. Even without an insane OC and on-air my Vega 56 used to pull 550w with a mild OC.
What makes this even better is that due to walking back the now 4070ti from it's original 4080 moniker, it's possible this was intended to be the 4070.
@@ilnumero1234 HAHAHAHA! Fanboy fantasy speculation! The chips were specced over a year ago in R&D. There's no chance the moniker change also changed CUDA core counts, and Nvidia has had x70ti tiers for about a decade now. So there was going to be one no matter what. At best - at absolute best - they removed an x70 Super variant for the change. Can't prove it though so it's speculation vs speculation. Mine's funnier though, so :P
To be honest I think that the new cards are great considering how much less power they use. I hope they will continue that trent with future releases. Only downside is the price at the moment, but it will come down around black friday and xmass maybe. I hope so at least. Still... I don't get people that would consider a 6650xt or 3060ti over this one with the reason being that it is much cheaper. Well, yes, maybe if you buy a card for one or two years. If you game on the 4060ti you'll get a bit more performance while costing you less then the cheaper models in the long run due to the energy savings.
That is so true. The price is garbage but the higher 40xx atleast are fine. I run a 4090 in cb2077 psycho Pathtracing with framegen on 90fps while drawing 340-370 Watts. Those cards are sooo efficient it's amazing
If you live with your parents, you go for the best AMD card you can buy because you don't care about electric bills and air conditioning costs! And when bitcoin goes back up...the 4060 Ti and 5060 Ti and 6060 Ti will cost $2,000+. So think of your 4060 Ti as a speculative investment! rofl!
An additional issue here that wasn't covered was textures completely failing to load on this card in newer games. I think overall these videos are good for a numbers to numbers comparison, but often times there's more to the story that other reviews are hammering on the gpu for. Textures shouldn't be failing to load on new titles for a $400 product.
Exactly, for a 400$ product which could be under 300$ (one tiny die!), I would be expecting atleast 12GB, a bit over 3070 Ti performance for a nice 1440p experience.
@@squidwardo7074 I guess. The 4060ti 16GiB version is going to be at least $550. If it even gets released. For me, the $100 for better performance and more RAM is worth it. I'm going to pull the buy it trigger to replace my GeForce 1080. What's amazing to me is that this 2 year old card bests nVidia's card for $100 difference. I understand they're not supposed to be comparable cards. I looked up some used 6700XT cards, and they're $100 or more cheaper than the 4060 and 4060Ti. And these two cards are supposed to be comparable.
The 7090 ti can give you 120 fps at 8k with pathtracing on, with a headroom to spare! It’s a marvel of engineering The 7060 can do 1080p 60 in some games, but not even 30 fps in others. Good deal at just $799!
My 6700 XT looks better and better each time. I just wish I had waited a bit longer as I paid about 100 bucks more than I should have for it. Even still, Team Red is looking to be a winner here.
I can relate man I bought my 6700xt for $400, but think about it this way: I bought the GPU in fall '22, and GPU prices are ALWAYS going to drop as time goes on, so no point about dwelling about it. You had the opportunity to use the GPU much earlier than a decent amount of people
Considering the 4070 costs at least around $745 and above here in India, instead of $600. I am so glad I got the 6700XT at the equivalent of 360 dollars last month. The 4060ti launch price will easily cross $500 here.
The graphs are a far more significant upgrade since the last benchmarks video compared to this card VS its previous generation. Props to the Labs and Editing team for listening to community feedback!
Would've been neat to see the RTX3070 in the list also. It's also a 8GB card, has a pricetag around the same as the 4060Ti and has bigger numbers overall.
They still need it to look somewhat decent 👀they also didn't really harp on the fact it loses to the 3060TI in some 4k games and is only like 5% faster at 1440p, but they need free gpus not paid ones. Still better review then Jay2Cents
the tragedy of the RTX 3070 is that it still could have been very relevant today. it has plenty of horsepower but totally kneecapped by that 8GB. even just a 12GB RTX 3070 would probably badly embarrassing Nvidia's current offerings
@@sudbtd That's a ridiculous statement. The fact it could be better if it had even 4gb more vram doesn't automatically make it bad. I've a 1080 that I still play 1440p on and it still performs fairly well on most modern games even though it's weaker than a 3070. And by fairly well i mean High 60+ FPS on most games, very rarely do I have to turn it down to medium. I definitely don't recommend running out and grabbing one just because it still performs well as another 2-3 years that's probably not going to be the case anymore, but the point that just because it doesn't have 12 or 16gb of vram and loses a little performance because of that doesn't make it a bad card automatically still stands. The 3070 has similar performance to the 4060ti so if 1440p ultra 100+ fps on most games is bad.. yeah..
See, this is precisely why I’m glad intel got in the game, because being able to punch up considering their pricepoint is exactly the reality check you need to know Nvidia’s pricing is completely out to lunch.
Nvidia literally doesn't care at this point. They are only in consumer cards for name recognition. They honestly might just stop making budget cards all together given how the data center market is going for them. Budget cards from now on will just be last gen stock.
I just built my sister her first budget gaming rig. I went with a 4060Ti. Personally I use AMD (and I love to trash talk Nvidia), but admittedly I do fight with drivers occasionally and that isn't something I want her to struggle with. She's really only gaming at 1080, and at $379 on sale it was hard to beat for the price-to-performance, DLSS support, and knowing it's just going to work. It's probably not a card for anyone who watches LTT, but I do think it has a place in budget builds for casual gamers, especially if you can find it on sale.
@@RayanMADAO 980 to 1060, 1080 to 2060 (though it had a vram downgrade), 2080 super to 3060 Ti; Theoretically you could also count 680 to 960, since 600 and 700 were the same architecture. Oh, also 580 to 660 ti
No point in including another 8GB card you shouldn’t buy. Besides, helping them sell 3070s would have been doing Nvidia a favour and they don’t deserve any.
Its actual peak comedy that there's one guy still riding on the big old "LTT the Nvidia child company" bullshit and another explaining that LTT wants to do anything but help nvidia, right next to eachother Made my day
As now a dedicated Arc user (my a380 is treating me well, thank you very much) I am so happy to see Arc putting up a fight in price tiers nobody seems to consider as valid.
@@Innosos You're talking as if Intel can't make decent cards. They can, people just need to be patient and wait for better drivers because the hardware is solid, only software is their weak point. But in the long run, Intel is going to force both Amd and Nvidia (Nvidia mostly, Amd is still somewhat consumer friendly) to get off their high horse prices because aint nobody buying overpriced gpus anymore when you can get an A770 16gb for 340$ that offers similar, if not better performance than the Rtx 3060.
Thanks for including the RTX 2060 in the results. It's good to have cards that are only a couple of years old that many of us still use shown to give us a good idea if an upgrade is worth it. Slightly older cards are always left out so we end up with apples to apples comparisons, when many of us need apples to oranges. Never forget that about 98% of us that game aren't rocking the latest and greatest or have the cash for top tier hardware.
I'm convinced that Nvidia has gone into a tick tock cycle. 10 series - good performance 20 series - ray tracing / dlss 30 series - good performance 40 series - frame gen 50 series - good performance?
The first GPU that I ever bought in my life was in 2012 - a GTX 660. I was 13 years old back then and built my first Gaming PC ever. I cannot believe that this old card that I bought for 200€ and that is still somewhere in my garage today, has a bigger interface then Nvidias latest mid range cards in 2023 that costs 500€ and more. :D
The GTX 660 was a nice card, I bought one (GTX660-DC2O-2GD5) second hand in January 2014 for just 140€. I miss the times when graphics cards weren't so expensive
150€ for a GTX 550Ti back in, idk, 2010 or so? After that a GTX 670 in 2012 for 350€, which was a huge jumpnfor a poor student back then. Now I'm looking at something that's supposed to be just one tier below the xx70 card (while looking like it belongs in the xx50Ti tier) and it costs more than my first PC as a whole...
@@choppings54 There is a GTX 1080 in my PC currently, however I barely play on my PC anymore. I game on my Steam Deck or on my Xbox Series X most of the time.
I said it when ARC was released - I'm glad we're finally at a point again where there's a viable 3rd option. Just, so long as you're not concerned about being made fun of when your buddies ask what card you've got :)
Nvidia keeps going the way it is Intel will be nothing to laugh about much faster, they've only released thier 1st gen and in 2yrs it went from 🤣 to "well maybe?" 🤔 if they release battlemage and keep at it with driver improvements we'll have a 3rd contender by next gen
@@scarletspidernz Battlemage does seem like it's coming out, the real question is whether their 3rd will come. Part of the problem is that to my understanding, some of the hardware issues with Alchemist were found after Battlemage already entered production. So the real test will be when Cleric or Crusader or whatever they call it comes out.
Exactly why I upgraded to 6800xt from 1660ti. Team red actually gives what consumers need. I have no worries with my 16gb of vram at 1440p. Seems like nvidia is becoming another asus🤨
Quick question as that’s what I’m considering to upgrade to as well; currently running a 2060 OC, and it gives me 65-85 fps at 1440p but the 6800XT looks amazing for the price & performance (PLUS ALL THAT VRAM)… would you say your AAA gaming experience is pretty amazing in terms of fps & settings? Really want a bump with games like RDR2 & GOW 2018 … and the 6800XT looks more than capable for 1440p. Genuine question, greatly appreciate any responses back!
@@mass_stay_tapped_in528 consider buying a 6950xt instead of the 6800xt. The prices have dropped quite close enough that it makes it worth it. Like $60 price difference.
@@mass_stay_tapped_in528 6800xt is an awesome card for 1440p! I currently own 6700xt and the thing meets all my expectations at 1440p. 60 fps in the latest AAA single player titles with high settings, 90+ fps in beautiful online games and older AAA titles. 6800xt is like 50% more performance than 6700xt so it'll translate to 90 fps in the latest AAA single player games and 140+ in online ones.. I don't think that I NEED that upgrade, but I know damn well that I want it 😭
Only price is the issue, the card is slightly flawed but still fantastic as a full package. If it gets cheap enough it would be a very good option even with the lower RAM. If it doesn't, then it should be avoided.
I was told by friends I was nuts for doing a 3090 for 4K gaming when it came out. I knew VRAM was going to be an issue. I mod games a lot, and that eats VRAM. It was only a matter of time. Now, I can still sit comfortably not worrying about it for a few more years.
Unfortunately fanboys are always serial copers and Nvidia is no different. I know Some guys who swear up and down that VRAM is irrelevant and nobody needs it and you're just an AMD shill for saying VRAM is vital to modern gaming.
Love my 2nd hand 3090. 900€ and playing on 1440P everything by maxed out. ACC F1 Flight Simulator. Of which i play ACC and Flight sim in VR sometimes. I don’t even get why people would go for a 4080 for 1500+ while paying 900 gets you a nice 3090 with 24gb vram.
I don't know who had the idea to put a green and a red bar in the charts, but that is extremely useful! Thank you!, finally I can enjoy this type of content without pausing it every single time a new chart appears. Yes, sometimes is useful to see where all the other products are located in comparison, but sometimes I just want to hear the speaker talking.
I just bought a 3060TI for $320 brand new last week , a day later they announce the 4060 family and I regretted that I didn't wait. Little did I know, there was nothing to regret
Same deal for me with the Arc A770. I wanted to buy a 3060 but I needed the extra VRAM for my job and AMD cards unfortunately don't perform well for my work.
@@21preend42 the 1070 was better than the 980, the 2070 was better than the 1080, the 3070 was on the toes of the 2080ti..... the 4070 is also horrible value and you guys need to grow some basic standards
Guessing Nvidia said reviews couldn't compare it to the 3070? 4060ti has basically same performance as the 3070 for up to half the watts. Would have been a good take. Now that the prices are below msrp, not a bad buy.
@@rustler08 Something that people dont mention is Power Efficiency. I considered a 6700/6700xt but both draw more power than the 3060, i dont need my electric bill going up even more when i have other things i have to worry about.
@@rustler08 You're right and this is why the VRAM complaints are kinda dumb. In all honesty almost 0 games will use all 8gb, and if it does you can just turn down the graphics from high to medium and you probably won't even be able to tell the difference. If you're trying to play 4k on a 60 series card you're doing it wrong, and that's about the only time you will use all 8gb. The AMD cards absolutely destroy it in some games, but in others the 4060ti is 10-20% better, so I guess it depends which games you play. For me personally I stick with nvidia for now mainly because of shadowplay, and most of the games I play are better with nvidia.
Concerning the 8 GB of vram, it really makes me miss the HBCC high-bandwidth Cache Controller from the Radeon Vega graphics cards. That was an insanely good feature especially when coupled with the HBM2 video memory. Good Times.
As an owner of a Radeon VII, I completely agree! I just wish Vega had been better for gaming rather than compute. I honestly think that a 6900xt with HBM would have given the 3090ti a run for it's money, as it always seemed like it was the memory bandwidth keeping RDNA2 back. Not a gpu engineer, though, so could definitely be wrong
@@TheHavocInferno you are right. HBM didnt do Jack shit for those GPUs except make them expensive garbage. Those GPUs had nowhere near the power needed for HBM to make any difference. It was a marketing scheme, same way nvidia is adding bigger l2 cache while cutting bandwith, instead of actually adding more vram capacity. Its all just marketing schemes. The fact that these fanboys above think HBM was really good, really shows you how dumb people are to logic.
I was so excited about the 16GB variant when I first heard about it. I've been wanting to replace my 1060 for years with something not overly power hungry, but I also have plans for a CUDA application next year that ties me to Nvidia. The stupid 128-bit bus is a double-whammy for me (gaming and computing) that's going to force me to buy something higher tier and run it in a reduced power mode
Same here. VRAM bandwidth is piss poor on RTX 40 series, and a lot of compute application scale in performance directly with raw VRAM bandwidth, not "effective" bandwidth, as the cache is useless here. So you get half the performance of the RTX 30 predecessor for the same or higher price. DOA.
@@MindBlowerWTF I've been waiting for AV1 since before most people have heard of it. Strongly considered getting a low-end Arc card specifically for it, but I don't want to deal with managing 2 gpus with different sets of drivers across both Linux and Windows
@@ever611 I write computing software that works with matrices that have effectively no upper limit on their size. As the matrices become larger, CPUs just can't keep up in terms of speed. It's the type of computing that makes sense for GPUs, and I want to target people with access to HPC facilities/supercomputers, which tend to predominantly support Nvidia (including the one I have access to). Since it's so heavily used, there's also just a ton of resources on CUDA programming. OpenCL or Vulkan compute could be viable, and I would love to eventually support non-proprietary alternatives to CUDA, but CUDA is going to be the lowest resistance route to reaching the most people I'm trying to target
NVIDIAcould have released the 16gb 4060Ti for $430, making a mid-range card that is future-proof, and not a total rip-off. But if they cared about their consumers, then they wouldn't be NVIDIA
@@hampuswirsen7793 it would work just fine, but don't expect to play most games anywhere near the 240fps mark. Esports games with lower settings and/or fsr you may get close but most newer games at 1440p high I get between 100-170
Game designer (in progress) here, on the note of the VRAM thing. Pretty much it's just industry growing pains, developers are pushing the traditional geometry shading method to it's limit in regards to complexity and it's overwhelming VRAM busses (and CPUs) all around. Tack on UE4 not being built for that level of complexity like in Jedi Survivor or Hogwarts Legacy, you get a lot of problems. We know this is a growing pain thing because Fortnite with UE5 with Nanite enabled, the mode that HAS EVERY SINGLE BLADE OF GRASS BEING A MODEL runs actually pretty fine on VRAM despite the current size and complexity of the game map despite that vegetation change when Nanite is on resulting in a geometric complexity at least at the level of Jedi Survivor. Virtualized Geometry/Mesh Shading is the solution to the VRAM problem as you only need to make your full-res model and a PLOD (Preferably), versus the Full-Res Model, 4-6 LOD tiers, then the Persistent LOD of standard shading/geometry. When models are AS Complex as they are nowadays, those extra LOD tiers are not "Simple" at all unless a developer bruteforce optimizes it which you only see on console Exclusives because PC players like extra scaling so you have to forego that to a degree or other consoles may not take kindly to the LOD Setup you have for that specific console.
@thomasboob559 I'm just saying that optimization takes longer/is harder becuase of how they are pushing against complexity limits for the current Paradigm.
Yeah but you dont get DLSS and RT. And I turn those on in literally every game on my 3070ti. It doesnt hurt performance enough to notice. If you have a Radeon you're basically playing on low settings for every game. The difference between RT on and off is way bigger than low to ultra.
@@scubasausage RT is sweet, or can look great, but it's not at all like Low to Ultra, guess you've never played at lower settings - it's a cherry on top for more realistic lighting and it's not even always worth it. Some games certainly look great and can benefit quite a bit from it though. Also, the 6800XT can handle some RT, even though surely not at RTX 3080 levels.
@@dennisjungbauer4467 You are incorrect, RTX on is a way bigger difference than from low to ultra. You must never have used ray tracing before. Also a 6800 can do ray tracing but not at playable frame rates. It gets regularly beaten by a 2070 when you turn RT on.
While I can cross-reference the 4k data, I would love to see a VR performance rundown. VR is still a smaller market, but maybe a once a year rundown of current cards to current heasets and games would be great
@@remeuptmain No need to be rude. VR is a very real niche, and the performance needs are different than 1080p, 1440p, and 4k. This is great feedback for the team to have.
I just realised the 4060ti has fewer cores than the 3060ti, and the two things it does better are L2 cache and watts, but decreased power for the same performance is expected when decreasing the size of the transistors on the die but they also took away a lot of the other things that would take more power anyway.
Contrary to the table at 0:13, the RTX 4060Ti is PCIe x8, not x16. We also said the X370 chipset does not support ReBAR but that was added in a later update. We apologize for the confusion!
Excellent job, this review was much better than previous ones.
Is it enough for PCIe 3.0?
@@karolwesoek5350 I think most board that is still rocking PCIe 3.0, will be bottlenecked on the CPU before you buy this GPU....
The argument for x8 bus is more relevant to 50-series GPU, which either just run on PCIe, or just sip powers from one 6-pin/8-pin so it does not overwhelm the poor used-office PC PSU. The memory bus already knee-cap this GPU anyway.
Geez, thats skeezy
Couldn't it be Nvidia chose to 'only' go with a 128bit bus because whenever they used a 256bit bus in combination with the new larger L2 cache the performance of the 4060ti would be to close to the 4070? Would be interesting to find that out.
Congrats Nvidia !
After the successful un-launch of the 4080 12Gb you more than made up for it by re-releasing the 3060Ti !
Just remember, it's pronounced "tie"
- Thanks Steve!
@@JXSnWp Underrated comment.
MSRP adjusted for inflation too
@@onicEQ Inflation is BS at this point, all they want is to keep the same profit levels as they had during the mining craze. 🤡
Unlaunched their own 4060 Ti to replace it with the 3060 Super lol
The fact that the RTX 3060ti is so close in all charts and can even beat the 4060 ti at higher res (sometimes) is just hilarious
And the 3060ti might actually also be more powerful than the 3060ti for people with pcie 3.0. NOBODY NEEDS PCIE 4.0 IN THIS LOW END, EXCEPT WHEN GPU MAKERS ARE SO CHEAPO WHILE MAKING CARDS THAT USES ONLY HALF OF THE PCIE LANES. This is a pcie 4.0x8 card as far as i know.
Makes me even happier knowing I went with a 6700xt instead of waiting on new gpus.
Yes dude, because of the memory. 4060ti need 12gb with wider bus, like 192bits, to proper beat the 3060ti in 1440p.
brb, on my way to buy a pre owned 3060 ti and tell everyone I bought a new 4060 ti, nvidia has lost its mind and my money
the 3060ti was slept on so bad, I gave my 3060ti to my brother for his pc when I upgraded. He's able to run games so smoothly at 1440p 240hz with no issues. Forza runs like a dream, he plays GTA V mostly with redux and it runs like a dream. Hands down the best budget card ever imo, 3060ti was unreal
Here at Nvidia we made a mistake with a few cards, and realised they were a bit too good and had far too much longevity for our tastes. So we went to work and made sure that won't happen again: we are proud to announce, the 4060Ti, a card that will be almost useless for new games in 6 months!
Ah yes the good old 1080ti i bet that mistake won't happen again 😅
That waste of sand card is useless right now you don't need even need to wait 6 months.
Boys, it's time we rise up.
Lets..... go outside, I know its hard, but you have to be a soldier and do it.
Lets... play physical sports like soccer, volleyball etc.
Lets get dirty. We dont need ngreedia.
@@wanderer7779 wait for FSR 3.0 to get the 1080 ti final form.
That means that my 1080Ti 11GB will have to survive (at least) the next generation. I can wait untill The elder Scrolls 6 is out in 2026 to 2029
I literally just got the 4060ti without heavily researching it first (oops). I paired it with a I7-7700T , 32gb and 1tb nvme (gen3). All these components were sitting around collecting dust, so I wanted to make a low-wattage gaming rig. I'm also gaming on a 1080p 180mhz monitor and so far, it performs extremely well, no complaints whatsoever. It obviously crushes 1080p gaming so I'm perfectly happy with it.
I got it for $240 so I wasn't malding about its mediocrity
I have a Ryzen 3200g, 16 gb ram, 500 GB SSD and a used 1660 super (which I got for 110$ a year back). I am still crushing 1080p 60fps gaming. That's why the whole 40 series looks like a comical scam to me. 😂😂 So for the first time, when I do upgrade I think I will look at AMD or Intel this time. Nvidia is as hopeless as nokia at this point.
Haha me too. Thinking to return it and spend the extra $100 to get the 4070
@@thedeathstar420 I'd save those 100$ for future upgrade. And, when the right time comes I'll sell 4060 ti + use these 100$ to get new GPU.
I'm having no issues with it, I don't need 4k, and in most cases, I'm at 1440 easily at 60fps. I'll just have to upgrade a little earlier than I want. No biggie.
God i hope Intel continue their efforts in the GPU market.....
A new player really is needed...
I wonder if they'll go google and call it quits after just one full card. What I don't get is how they can do so badly. They company has made integrated graphics since forever, and CPUs since time.
@@gorkskoal9315 they are not quitting, as an arc a770 16gb LE owner they are making strides with driver improvements, battlemage will be legendary
@@gorkskoal9315 Intel has atleast 2 more generations from what they said a couple months ago. I just hope Battlemage is a good price to performance like ATI/AMD used to be because many people would jump to it.
@@gorkskoal9315 Do so badly? Linus just showed were the A770 is beating the 4070ti in some task and not far behind in most. Drivers are constantly improving each month and the raw power within the card can surpass the 3070 in time, all for less $$$. I'd say they are doing amazing for a first time being serious in the GPU market. They are defiantly committed for two more gens being Battlemage and Celestial, Battlemage aiming for 4070 to 4080 performance from what I hear and Celestial probably on par with whatever 50 series mid range will offer (5060ti/5070) And will more then likely not increase much in price, maybe $400 to $500 for the top end model. I think once more people feel safe with the drivers, which are already pretty stable, they will sell good AND iNTEL will continue to keep new Arc cards rolling out.
@@gorkskoal9315 What meme echo chambers have you been staying in? The fact that the company only did IGPUs for so long is not only taken into consideration by the tech commentary space, it was a concern that Intel would have trouble tackling a discrete GPU after only having that as experience since iGPU =/= discrete GPU, especially a discrete GPU with all the features Intel was promising, and they'd have to compete with a duopoly with literal decades of experience with discrete GPU drivers.
And they did have trouble, so it very much was _not_ a surprise and people were shitting on them for the totally expected outcome. Intel actually did better than expected with the hardware being decent, since people were expecting they'd botch that too, and most of the problem being the expected driver inexperience and bullshit interface software. Shitting on them that became less loud with the driver improvements over the last several months.
Can’t wait for the 5060Ti with 2GB of GDDR5. That’s when Nvidias line of retro gaming cards will really hit its stride.
Don’t forget the 64Bit buswidth and pcie 4x4 connection and decreased shadercore count
But it will feature dlss 4.0
Get 4 inaccurate fake frames - unusable in any non slow paced offline point and click adventure game - for every upscale frame produced
Oh did we forget to mention
To keep manufacturing costs down upscaling will be on by default
Now to the price
It’s just 600$ for the 1 Gig and 750$ for the 2gig variant
"retro gaming cards" 🤣🤣
can't wait sell my 1060 6 GB for $800
$700 cards baby, can't wait to play Unreal Tournament 2004 on that beauty
I can't wait to spend $500 on a brand new CGA card
Looking around the tech channels, this looks like the most positive review of the 4060 TI today, and the conclusion was "Buy Arc A770 instead". Amazing.
yeah this review is almost shilly in comparison to how bad the price perf is
Jayz2Cents said it was a good card but took the review down. He had an apology video up now lul
TPU gave it a highly recommended badge 😂😂
@@LaskaiTamas23 people donate to hot tub streamers so people will still buy this.. gamers are .....idk what to even call them these days
LTT needs to make the sarcasm more obvious.
You know, if they'd shipped it as the 4050 Ti for $250, they'd have a winner on their hands.
Heck if they shipped it as 4060 at 300, it would still be a winner...
250? Keep dreaming
@@chilldoc9638 where
@@hirakuotaku5386 in your head, because you're not getting it anywhere else for that price
@@chilldoc9638 no shit that's why they said "if" vlearly Nvidia is likely never selling GPUs at that price point ever again
At this point Nvidia really thinks we are stupid lol
or we are thinking that nvidia's the stupid one
Who's we?
@@chic-fil-ashouldopenonsund3623lol
at this point?
I think Nvidia is stupid.
This is why competition is important, because without it we have companies like Nvidia gouging us for GPU prices.
Wouldn't make a difference. People keep buying. They can slap a 5x increase of price at this point and it wouldn't hurt them at all, people have disposable income, or they save up for the latest and greatest GPU.
The money isn't even in the consumer market anymore.
They would be even worse!
We have competition. It's called AMD
Your wallet is king.
Keep hold of your money because someone will want to sell you something at some point. The decision is ultimately ours.
Well, the competition is busy sniffing glue and pricing their less appealing products at similarly high, rather than trying to gain marketshare with sensible prices
My RX 6700xt aged very well. At the time of purchasing, I was disappointed that the 3070 was out of stock. But in hindsight, it was a great purchase
As a 3070 user, it's also aged very well, if you can even call 2 years "aging". So not only is Nvidia being beaten by amd, its being beaten by itself from just recently lol.
Edit: not saying this to be an nvidia fanboy, i only got it because i had a great opportunity by getting a prebuilt with it, and I actually use DLSS 2.0
@@calmkat9032You know they messed up if not only their competitor is held their own against you but your oldee product lineup beats your newer ones 😂
@@calmkat9032 "So not only is Nvidia being beaten by amd, its being beaten by itself from just recently lol."
That's due to NVidia's new 'Self-Flagellating' architecture.
@@paulbantick8266nvidia's architecture was soo good, but their scalped price margins were too high to ignore, so they shifted their dies up a tier to upsell us consumers :(
Hold on, gotta upgrade from the 3060 Ti to the 4060 Ti real quick.
(opens MSI Afterburner)
💀
imagine undervolting 4060ti tho
@@PannyLanny My undervolted 3060 draws about 85W-115W tops (mostly around 70W), while delivering all of its performance. So 4060ti can probably be undervolted to about same level maybe even more.
@radomiami.....Best comment here. I actually lol'd.
This card is just straight up e-waste. Good job, Nvidia!
true
more like paperweights that think of circles and then commit suicide right after
not e-waste 4060 Ti 16GB would be a good buy at $299.
@@ambhaiji but it’s not 299 so it’s still garbage.
@@ambhaiji 4090ti would be a good buy at $699. what is your point?
This gen of cards is the best advertisement for RX6000 cards AMD could've gotten.
why?
I would have gotten 4070 if it weren't for the 12gb. So i went with a 7900xt
fr
@@groenevinger3893 bro Nvidia fucked up so bad, that it made AMD's last gen 6000 series GPU look like even better option, and with the huge price cut you can get a 4070 type GPU for less than a 3060ti price.
@@Justachamp772 So actually you are saying you can better buy the 4070?? So why even start talking about the 6000 generation? Amd still lacks RT.
As someone who bought a 3060 OC during the pandemic, this makes me happy that i not missing out on much. It even has 12Gb vram 🥴
Same tbh
Scamvidia. How long do I wait for msi 4070 slim $700 to drop to $500. 2 years?
Honestly looking back, the humble RTX 3060 with 12GB of VRAM was an absolute miracle that did not get appreciated nearly enough at the time.
This ^ I bought a 3060 12gb to keep my old gaming PC going. Then built a new gaming pc with a 3070ti a year later. That 8gb 3070ti isn't looking too good now, but not like there's much alternative in the 40 series anyway 😂
I have a 3060TI and it's been really good so far but I'd sure love some more memory
Yes it was. A lot of people wanting to justify the 3070 and 3080 say the 3060 should've been an 8GB VRAM card, not realizing its all the Ampere stack that needed way more VRAM and the 3060 was the only gem of the crop
I got a 3060 for my first "workstation" PC and it feels pretty good to be able to run UE5, Photoshop, and ten different browser tabs without it breaking a sweat. You really feel all that space in VRAM.
I think you mean the 6700XT. Really, pretty much the entire RDNA2 lineup at current prices looks absolutely amazing compared to the current generation from both companies right now.
feels good to see Intel making its way up into the "worthy of being compared with" list
really excited to see Battlemage
@@Blackfatrat I thought of getting one for my new build as well, but it still isn't available in my country. Hope in 5 years we'll do our mini scrapyard war with an A770 and a 13700K
We might actually get to see a shake up in the graphics industry next generation thanks to Intel. A770 Made me really excited for battle mage as well, Intel is crushing it!
I would buy if they had more powerful gpus, but right now I can't justify going from a 3060 to a A770
@@MrMaddog2004subscribe Then you must be excited for Intel battlemage?
@@chronometer9931 what's that?
Nvidia did a great job with making me consider going with ARC. Which I never thought I’d consider. Thanks Nvidia!
16gb arc enjoyer here, if you value peace of mind don't do it. If you like at least a tiny amount of tinkering you can do it. If you want sff build with arc like me- fuck no
@@remeuptmain This isn’t twitter ya degen
@@Gajaczek93 "enjoyer" > proceeds to display discontent. I 100% believe you are indeed an arc owner.
Been using an ARC 770 for a while now and it's been great!
Always Blue
So what is the problem with having 128 bus with 32MB L2 cache than 256 bus with 4MB L2 cache?
If I was NVIDIA I would be pretty damn embarrassed to see the 6700XT go toe to toe with the 4060TI. Lets hope AMD doesn't screw up the 7600 and 7700 pricing so the whole 4060 series can be totally irrelevant.
It would be an easy win for AMD, launching their mid tier cards at a competitive price. 6700XT and 6800XT are already really good deals, make the 7000 series a little bit better without being greedy and Nvidia will have to rethink their strategy.
it goes toe to toe in raster only, add in literally any other factor and the 6700xt loses big time.
@@oxfordsparky your point?
@@oxfordsparky In ray tracing the 6700xt has the 12 gigs, so when even more and more Ray tracing workloads use more then 8gb, the 4060ti will be completely unusable, maybe the 16gb card is more useable but it really depends on the ray tracing load on if it is heavy on the bus or not, while the 6700xt gets some framerate in ray tracing compared to the 4060ti 8gb. In Content creation, a 4060ti (16gb variant) probably compares to or beats 3080 noting that a 4070 beats a 3090ti in content creation and probably beats a upcoming 7950xtx & 7900xtx in blender, So that point still stands.
AMD needs to price the 7600 below the price of the OG 4060 with performance in between it and the 4060ti Ie: Peformance around a 6700xt. I think a good price for a 4060/4060ti killer would be between $200-$280.
Any chance of an ARC revisit video soon? Would really love to hear your team's thoughts on how much the drivers have improved.
I know some people with the card, it is still a heavily buggied mess, but it has improved a lot. I absolutely do not recommend the card for non-tech savvy people or people without patience for crashes and bugs, since you will experience a lot of those, and it will take years for them to solve the majority of them, if Intel doesn't drops the venture first, but when you hit a game that is not bugged (usually DX12) they are a great bang for the buck.
For the less adventurous the RX 6500 XT, RX 6600 and RTX 3060 12GB stands on the same price ranges with comparable performance and way better energy efficiency, though, but the Intel offerings might potentially improve past their performance level with the time, or not, time will tell. The A350 is also nice if you need cheap AV1 encoding for less than $150 bucks, but the card is kinda doo doo for gaming.
I think they may wanna wait until Intel releases either another huge driver update or another card altogether.
notels snarc gpdont?It's emulating direct x9 and is, frankly, a bug ridden pile of shit compared to the other options. at an effective 400 dollars, before tax and shipping you can get a 3060ti, 3060, 1060ti and 1060, and well you get the idea a few cards that do just about as well without half the suck.
As someone daily driving the A750 it's perfectly fine. I've had no issues playing anything (Ark, Warcraft 3, Rocket League, Tomb Raider, CyberPunk and GTA5 mainly) - I never had any issues from day one though about a year ago when I bought it.
@@gorkskoal9315 Don't they now use dxvk in their drivers to translate D3D9 to Vulkan?
Happy to see the Arc A770 punching here and there, even if it's a mixed bag of results all the time. I expected it to catch up at 1440p and over. Oh well. EDIT: In GN's review it matched the 4060Ti on Cyberpunk at 4K. Not bad!
with intel's engineering team I am kinda looking forward to what they do with their battlemage architecture.
For card priced at 250 USD it's a steal
If I switch from NV, I'm hoping it will be Arc... But I like top end cards, so we will see.
@@remeuptmainyou responding to everyone's comment with the same shitty ratio comment get a grip kiddo 😂😂
@@KarrasBastomiIf AMD didn't exist. Unfortunately, amd exists and their cards are way faster at $250
It is sad that all these influencers have decided they can get more response and comments by jumping on the 4060TI negatively. I just installed mine and love it to death. It is massive, impressive and fast. It has the newest 12 pin high power connector and with inflation added, I think it is a great value. These people have a sytem they use when making videos. UA-cam tells them what they will pay more money for, and they respond in kind. Hate videos do better than informational videos many times.
"We realized that we were giving consumers cards that are too future proof... so we decided to make each new generation become obsolete in a years time!"
It costs more than I want to spend and it won't even play today's games the way I am interested in playing them. Nevermind future proof is failing now
Thanks for the great review of the RTX 4050ti. This gpu seems very reasonable at $250 and I’m sure the budget market will be happy to finally have a marginal upgrade over the 2060 Super.
This needs to be said outloud.
Jokes aside... even the 2060 super has 8gb of vram, lol. This card isn't an upgrade for anyone unless they're on a 1060 or older imo, and even then, still better options for the price.
2023 GPU market (Good Ending)
The Arc A770 has been getting huge performance upgrades since release and it would be an interesting video to see how far its come since. Probably not a "30-Day Challenge" video, but a Value Proposition Card compared to others around it's class currently.
Is ARC a seperate company or part of AMD??
@@awesomestuff2496 It's intel's graphics card division
But isnt Intel scaling down their GPU division operations? How long can they continue this run of performance increases?
Intel is a bit of a dark horse in this GPU space, but we all should absolutely be cheering them on.
@@scroopynooperz9051 As far as I know, those were rumors as word got out that Intel was closing or downscaling their GPU Division.
BUT Intel has since dispelled the rumors and released a diagram saying that they will work on GPUs from Battlemage (B-Series) up to their Druid (D-Series) Architecture. It's at least guaranteed that Battlemage will come out by the end of 2024.
This.
Jedi Survivor crashed on my 4090 with 24 GB Ram, with the notification "not enough VRAM available.." with max settings and RT on. What a joke!
I feel like it should've been mentioned that the 4060 TI has a x8 PCIE 4 connection.
Would've been great to see you testing the 4060 TI on a PCIE 3 system to see how much performance might be lost.
I guess that’s part of the agreement
Nvidia would never allow that
Like they wouldn’t allow similar priced higher tier amd cards shown in the same chart
The card will not bottleneck the PCIe lanes or bandwidth on PCIe 3 so the difference is minimal, almost within margin of error.
x8 is whatever, x4 is where the fun (read: not fun) begins.
You don't lose any on 3.0, just a bit lower 1% lows
@@CrocoDylianVT You do, actually. These days, texture streaming is a thing, thus making the bus the bottleneck. 128 bit is a joke. No matter how you look at it.
It's safe to say that AMD really did well in coming back with a bang in the market
Yep
Love my 6750 XT and probably soon 6950 XT. Imo no one should be buying nvidia rn cause of the insane greed and afwul price to performance compared to team red. Dont know why so many people are still buying cards like the 3060 which is awful value
how lol? they got destroyed in the top end , they have yet to launch a lower end card than the 7900xt and xtx
7900 xtx is better than 4080 and it's cheaper
Well for the last gen cards at least. 6xxx are kinda getting the last laugh here.
This generation of cards (both companies) is a tough sell... the more (generations) we progress the more caveats we hear to justify the price...
The 1080ti was a miracle. Best purchase ever.
They'll be the best card to buy at 2025
I think AMD are better but they've both lost the plot with their pricing, i've got a 2080 and see no reason to upgrade
@@mrcaboosevg6089 You see no reason to upgrade from a 2080? Do you not play newer games? I am on a 2080ti and absolutely ready to be done with it. Ordered a 4090.
@@mrcaboosevg6089 2080 is just bad for modern gaming. 20 series is a flop
still rocking my 1080ti as well!
who ever fucked up the table at 0:12 is so getting fired 💀
This card is the 4050 and should be MSRP at $249.99. $279.99 for the 16 GB version. AND discounted a few months in by $50.
not in my wildest dream 😂
I totally agree with you. They're slowly trying to move the line so that you'll think this is 4060 but it is more like a 4040
Or should come as a bonus when buying a pack of Doritos
Impossible!!! If they do that, how could Jensen afford his new leather jacket???
Nvidia is pulling the 4080 12GB/16GB model scheme again with the 4060 Ti 8GB/16GB models.
Just a friendly reminder that the rx 6950XT is going for about $600 still. It has gaming performance on-par with a 3090, only suffering in ray-tracing and productivity tasks.
If you're primarily a gamer, it's really worth taking a look at this card, it's an unbeatable value.
Very high energy consumption though.
@@pauloa.7609 better than 3090 however
@@aravindpallippara1577 true but that doesn't lower it's outrageous power consumption, right?
@@remeuptmain true, I consider anything below a 4090 a waste of sand but I don't want to pay 2000+ euros for it even if I could buy literally 100 cards because that would be burning money away. These cards need to be half of the price to be worth it, all segments, from both nvidia and amd.
@@pauloa.7609 So the only card you think is good you are too poor for. Nice man
dude this graphics for the graphs are absolutely SPOT ON. LTT have clearly listened to their community about feedback on how clear the graphics could be, and have made massive improvements. thank you so much
@@muyoso that sounds like a personal problem you have 😂. Are you unhappy in life
@@muyoso This is a video, not a presentation. You know you can pause right?
@@muyoso It's different approaches, and that's OK. GN includes a large variety of cards, but LTT reruns benchmarks for every card in every review. One isn't better than the other, but it shouldn't be surprising that GN's breadth and depth of data is bigger.
@@muyoso those are different styles. GN shows a lot but i have to pause every time he shows a graph otherwise i will miss something important. LTT shows a lot less, but what they show i can see at a glance.
I do wish they would skip the weird squares in the background, makes it harder to look at the content.
edit: it's the LTT logo... makes it looks like crap in 480p.
Not me and my 100 dollar laptop knowing damn well we can never afford that💀💀💀
I actually got a killer deal when I got my 6700XT (was over $100 cheaper than the 3060 non-Ti at the time) - and can 100% agree, as long as you don't need ray tracing, it holds its weight insanely well without breaking the bank.
I've turned on Ray tracing on my 6700xt and it works fine. Totally usable. Nvidia does not have the monopoly on Ray tracing
@@SinisterSlay1 oh, it's definitely usable with it, it just takes more of a performance hit from it than nVidia last I tried it.
@@xenstence AMD shill
Bugs, glitches, crashing, half your library barely works.. Buy a 3060, you will be amazed how optimized Nvidia is
@@xenstence OK shill. Have you played any real games? Games not made specifically for AMD powered consoles
I'm so happy to see Arc finally getting some recognition. It's just been hated on all across UA-cam.
Ok as an Intel Arc A770 LE owner it is very much deserved. This card doesn't do well enough in modern games vs an RX 6600 to justify your entire library of older games being a buggy mess that doesn't work at all 20% of the time due to game breaking bugs. I've had the card 1 month because I listened to everyone complain about VRAM, and who cares about VRAM when the experience is literally garbage. Intel has 3 different programs to control the cards software and the card can't even boot correctly from sleep mode 8 months after launch. The performance is only sometimes good in optimized Direct X12 brand new sponsored games, and it is a joke for everything else. I replaced an RX 480 8gb and I feel like it did a better job than the ARC card. While I may only be getting 40 ish FPS, its better than playing a glitchy mess with high FPS because of emulation. The AV1 encoder also doesn't work half the time in OBS for no apparent reason either. Theres entire forum posts filled with bugs that are known for months and they will never get to them because the ROI is not just worth it for them. Just buy the RX 6600 for 200$ on a budget and upgrade whenever you have an issue. Because think of this. Spend 200$ now keep the 80-150$ you'd spend to get to the RX 6700/6750XT series, throw it in a stock market portfolio and use it to buy a brand new RX 8600XT realistically or something like what that might be. It'll be the latest product with the latest feature set as well.
@@CleonofAthens wow its still bad? I thought the drivers were good at this point.
@@poochyenarulez meh I'd give it another year to smooth out. Point being that the Arc cards are going to age much better than this piece of shit card. There is not improvement to come for the 4060ti, this is it. It's done, it's garbage forever, but Arc still has room to stretch, and provided Intel stays true to supporting their GPU division it will get better.
@@RyTrapp0 it's not going to take nearly as long as you'd expect. This may be hard to believe, but Nvidia basically had to nuke all driver optimizations after Turing. This is also why post Turing a significant number of older titles encounter strange artifacting or similarly terrible DX8/9 performance when compared against older generations of cards. The only exception to this being the rare few titles which are still popular and on DX9, which amounts to essentially none now that CS has moved to source 2.
@@poochyenarulez As someone else who uses Arc... they're better... 😐 They're not "dumpster fire" tier like before January, but seeing as how they now just got to the point of having Arc Control launch at start without requiring UAC and now enabling fan control, they have quite a ways to go. Granted, Intel has put in a lot of leg work to fix them, but starting from so far behind means that DX11/10 performance is a bit janky. DX9/VK isn't great either, though that might just be because I play janky map games and I have the thing clocked as high as it will go while not BSOD'ing every five seconds. It's considerably better than at launch. Also, DX12 performance is insane.
I personally don't stream, so I don't know how nice it plays with OBS, but in Handbrake the thing is a beast.
The 4060 Ti made me realise how insane the value and the price to performance ratio of the 3060 Ti back in 2020. I was lucky to get one at MSRP when it was launched.
same straight from EVGA before they stopped making nvidia GPUs for good (LMAO)
But the card was crippled like 3070 and ti with mere 8 gigs knowing you’ll be forced to upgrade much quicker
@@pfizerpricehike9747 3060 xc has +4GB vram with +36% bandwith compared to the 4060 Ti. but lower L2
@@pfizerpricehike9747 Yeah 8 GB of VRAM was a concern even back then in 2020 but it wasn't as big of a deal unlike today.
HEll, I'm still enjoying my 1080ti purchased on its release day, which can still render most new games at 60fps at max settings as long as you're happy wit 1080p resolution, which is the best my TV/Monitor is capable of anyway.
Power consumption, heat and noise are big for me. I don't know if I speak for many PC gamers, but I tend to run discarded professional workstations as a platform for my gaming PC's. I can usually get them for free after they are a few generations old. These PC's while powerful and able to support discrete GPU's, they do not have tons of airflow and also their power supplies don't have the same wattage or PCIe cables as aftermarket gaming PCs do. For me, staying well below 200W for the GPU is a hard requirement. I am finally able to replace my 1070ti with something that is nearly twice as powerful while staying in (actually below) my existing card's 180W thermal envelope. The same cannot be said for the AMD cards or even the 4070 series.
Steve at GN said this card is "bordering on a waste of sand," and honestly, I think that's more accurate than you're willing to get on this review.
Yeah, definitely disappointed in LTT on this one. This card simply is not good value.
Gotta keep Nvidia happy so they can keep getting review samples lmao
@@Ja_Schadenfreude Not good value but it's a good card in itself. That's what they said. Get something else, but if price was better this would be good.
@ww7285 do you get off on hating. This wasn't a positive review the conclusion is literally that the pricing is designed to get you to spend more money than you should but that luckily the competition at this price is pretty good even including intel. How can this be anything but negative when one of the conclusions is maybe try ark instead.
It's on pat with the 11th gen Core CPUs! Actual true waste of sand, since 10th gen was generally better in every way... But this GD Furry60 shit is coming real close. Lol! I know the 8x vs 16x thing is probably not a huge deal for most... But fuck me! Who thought that would have been a good decision? Simply from an online trolling standpoint! I intend to make the fuckin fun outta that... And the decrease bus... And... And. . ..... WTF?
it might be only 10% faster, but the most important part. And this is the part that really matters. Its profit margins are 100% more than the previous card.
Thanks for the leather.
Can you give me a 4090? I'll give you a pretty leather jacket and I'll lie about the 4060Ti being good.
Nice!
The moral of the story is that if you're able to find RTX on a deal, then you can be 100% sure that there is a RX card somewhere that performs same or even better with lower price. With is only NVIDIA's fault and nobody's else. Great work!
"Yesterday's performance for yesterday's price!", I feel like the prices of used 3060ti's and lower won't go down much with the release of this one, just no insentive. :(
unless you do any pro work and need those CUDA and AI cores
Nvidia's cards always commanded a premium because features and fanbois
@@aleksandarlazarov9182 I think it goes "Yesterday's performance for tomorrow's price" but yeah... The whole value of 4000 series is so bad that prices of last-gen barely moved (at least where I live). Also, there are not that many people upgrading so it's also harder to find something used from 3000 series.
@@AdamKuzniar Yeah, the CUDA stuff from NVIDIA has great support from any software in existence and there is no denying it, and only then there is some kind of AMD acceleration (but also probably inferior anyway)
But that's kinda the issue as NVIDIA has so much profit from the enterprise that needs CUDA acceleration as well as AI stuff that I don't see just why they're treating the general customer in such a bad way for giving us such bad value products just to make it feel more premium :(
I feel like only self-employed people would want to seriously use CUDA or AI acceleration on something like a 70 series card or lower just because they already had that card anyway. Any serious business would go straight for 4080 or 4090 or some Quadro thing as time equals money. Those same businesses can also justify spending such a ridiculous amount of money for a 4090's xd
How many sponsors do you want in your video?
Uploader: Yes.
Jesus, can't get rid of ads even when video browsing.
NVidia's engineering effort seems to be all-in on product segmentation rather than product quality. The cache+reduced memory bus allows decent game performance while hamstringing the professional/AI workloads and so allows for better upselling to the higher-tier product.
Hey, it's a loss for the consumer, but they're not cannibalizing their products either - something AMD has already stumbled on a couple times with their Zen chips.
Almost nobody is buying this to use with AI or "professional" workloads.
@@squidwardo7074 Notwithstanding the gimped memory capacity, I/O, and PCIe connectivity, the core has about the same performance as the RTX A4000 professional card, which is currently retailing for about $930. So, yeah, it could easily bite into professional sales if it wasn't so neutered.
@@squidwardo7074 My university loved to buy Nvidia's gaming cards for machine learning research back when I was there 6 years ago. Quadro and Fire Pro cards are way too expensive for some organizations.
@@jamesalexander5559 I doubt that's more than 5% of sales
6700XT is the one to go for at this price range.
Just wanted to say BIG thank you for highlighting the graphics card in question in the graphs of this video! I don't often comment on UA-cam videos, but this improvement is so worth it! Big praise to the people who create the graphs ❤
All the videos I've seen testing the 4060ti show it performing really well in games like cyberpunk (with DLSS) and Red Dead Redemption 2, I currently have a 1070 and Im intending on upgrading to a new build with the 4060ti 8gb which I could just swap out for the 16gb version or a 4070 sometime in the future if I really needed to if there really is a VRAM issue which from what I've seen there actually isn't. I haven't even run into any VRAM issues with my 8gb 1070 at 1440p, it's just that its just not powerful enough anymore and I want access to DLSS and RTX.
Did you get the 4060ti? How is the performance? I am on a 1060 6gb and 4060ti would be a 3 gen leap for me.
@@smeet3010 no. I got a 4070 and it was well worth it
I appreciate the older cards in the graph. A 1060 is probably where a lot of people buying these will be upgrading from
I'm still running a GTX 1660 Super ... and I intend to continue doing so for a while.
@@Grumpy_old_Boot My GTX780 was still running things fine until 2 weeks ago it died and my friend gave a his old 1660 as he felt bad 😅
@@cncgeneral
Yup, as long as you don't have a high fidelity display with g-sync or something like that, the middle class cards is plenty.
As things are going it doesn't seem that people will upgrade from the 1060 for a few years more.
@@cncgeneral how it died and how old it was? I'm curious about the life span of the PC components.
The VRAM concerns are valid because the current gen consoles have 16gb shared between textures and everything else. When they port these games to PC it translates to needing more than 8gb VRAM, especially if you turn texture details up. Native PC games might be a bit more optimized but simultaneously tech marches on and those games are going to need more VRAM anyway.
Yes, and it would be pretty pathetic if a graphics card that cost as much as a current gen console on it's own was forced to use settings that make it look worse than said console because of VRAM limitations.
seriously, it almost feels like nvidia lately is secretely spearheading sony's ps5 marketing campaign. Why play on PC at all when the GPU alone costs as much as an entire PS5 digital?
In this 4k world we're living in now the textures alone will eat up memory, without those 4k textures there's little point in even playing in 4k
@@mrcaboosevg6089 noibody is buying a 60 series gpu and playing at 4k...
@@squidwardo7074 They can handle 4k in a lot of games, 4k isn't that special any more
The only way to force Nvidia's prices down is if competition gets really fierce
Just don't buy NVIDIA
AMD cant even do that
its been pretty fierce if you ignore ray tracing performance. amd card hold their own now and much cheaper prices than nvidia equivalents.
@lonzo for mvp people like you are the peobl
The competition is more fierce than ever before so much so, if all you care about is gaming, this card is already 200 overpriced, if this thing sells then, it's because it offers good enough value or because the consumer isn't very smart.
The Dnf at 3:09 is actually crazy
If Nvidia had put a decent width memory bus in this thing, it would probably be a bit more competitive. As it stands, they are not going to sell many of these things.
I don't think ngreedia gives a shit. Amd might be spanking them in the 1020-1440 raster user area, but that's what? 10 maybe 12% of nvidias target? Testla only needs the Cuda cores for their "ai" to work. I have no idea if they need bandwidth. render farms might need bus and clock speeds but those guys have more than enough cash just from 2hours of interest alone to buy up a fuckton of creem of the crop cards.
This is effectively e-waste at its price point.
The only thing I can see this being alright in is prebuilt PCs from the usual guys (HP, Acer and so on) or people who don't want to break the bank but need/want to upgrade.
The issue I see is people deciding to jack up the prices on 3060ti and so forth like the 2xxx series were.
@@gorkskoal9315 Steam survey shows that something like 77% of users are playing on 1080p still.
I bought a 6750XT in January for about $430 including tax. I was worried when the budget RTX cards released I was going to be upset I didn't wait. I want to take this time to thank Nvidia for not making me feel bad about my purchase, in fact, I'm happier with my purchase now than I was a week ago! 😂
I paid $1,400 for my 6900xt during the shortages. I didn't feel too bad at the time as it was only $400 over MSRP, but I felt worse later on when everything came back in stock. Now with these absolute lackluster releases I don't feel so bad. Also since 6900xt's have gotten so cheap I'm not afraid to OC the piss out of my card now.
@@YourLordMobius I overclocked the snot out of my 6750XT basically as soon as I got it since I got it for pretty much a steal, especially since it was the Red Devil variant.
I just recently oc'ed my 6900xt with the morepowertool. Peak it's pulling 395w and won't pull anymore even though it's set to limit at 450w. It's on water so no worries there. Stable in most applications at 2800 mhz. So yeah, I got a little extra performance out of mine LOL.
@@YourLordMobius I'd say! My buddy has a 6950XT and he's a little bit computer illiterate. I tried to talk him into letting me OC it, but he's super worried I'm going to screw it up lol
@@tbthunderer ah yes, the old "you could blow it up if you OC" fear. Only way you're going to blow one up is by overvolting. Even with the massive amp draw on mine the voltage is still relatively tame, and it just won't pull any more power even though it's completely unrestrained.
Edit: Radeon cards have always been notorious for allowing crazy high amp draws. Even without an insane OC and on-air my Vega 56 used to pull 550w with a mild OC.
What makes this even better is that due to walking back the now 4070ti from it's original 4080 moniker, it's possible this was intended to be the 4070.
I haven't really been keeping up with GPU performance relative to each other but wouldn't the 3070 cream this card if it was called a 4070?
No,there wouldn’t be any 4070ti and the 4070 would have higher cuda corr
@@ilnumero1234 HAHAHAHA! Fanboy fantasy speculation! The chips were specced over a year ago in R&D. There's no chance the moniker change also changed CUDA core counts, and Nvidia has had x70ti tiers for about a decade now. So there was going to be one no matter what. At best - at absolute best - they removed an x70 Super variant for the change. Can't prove it though so it's speculation vs speculation. Mine's funnier though, so :P
@@ilnumero1234 copium
I wonder what the 4050 will be like
To be honest I think that the new cards are great considering how much less power they use. I hope they will continue that trent with future releases. Only downside is the price at the moment, but it will come down around black friday and xmass maybe. I hope so at least. Still... I don't get people that would consider a 6650xt or 3060ti over this one with the reason being that it is much cheaper. Well, yes, maybe if you buy a card for one or two years. If you game on the 4060ti you'll get a bit more performance while costing you less then the cheaper models in the long run due to the energy savings.
That is so true. The price is garbage but the higher 40xx atleast are fine. I run a 4090 in cb2077 psycho Pathtracing with framegen on 90fps while drawing 340-370 Watts.
Those cards are sooo efficient it's amazing
If you live with your parents, you go for the best AMD card you can buy because you don't care about electric bills and air conditioning costs! And when bitcoin goes back up...the 4060 Ti and 5060 Ti and 6060 Ti will cost $2,000+. So think of your 4060 Ti as a speculative investment! rofl!
Looking forward to seeing RTX 5060-Super-Ti with 12 Gb and 32-bit bus aimed at 720p gamers at 60 fps ++++.
Technically, it should be 24 or 48-bit
12GB??? this is Ngreedia, it's gonna be 6GB GDDR4, at $600
Why can I only get 30 with the RTX 5060-Super-Ti? What gives?
@@thomaswinwood true, I didn't notice the lowercase b
An additional issue here that wasn't covered was textures completely failing to load on this card in newer games. I think overall these videos are good for a numbers to numbers comparison, but often times there's more to the story that other reviews are hammering on the gpu for. Textures shouldn't be failing to load on new titles for a $400 product.
Exactly, for a 400$ product which could be under 300$ (one tiny die!), I would be expecting atleast 12GB, a bit over 3070 Ti performance for a nice 1440p experience.
Yup, I watch LTT videos but I wouldn’t use them as a guide to buy a GPU ever. I’m glad for creators like HWUB and GN.
How is that related to the GPU
How is that the GPU's fault?
That's optimization dude
I was hoping you would include the 6800XT. It's a 16GiB card that will be in the same price range as the 4060 Ti, but with more memory.
Doesn't it also have faster rawspeed to? That might help a little for performance.
Wouldn't call $100 more in the same price range
@@squidwardo7074 I think most would though. Computer hardware pricing generally already varies that much for the same SKU just between retailers
@@kusucks991 Most of that is a waste of money for the lower end cards
@@squidwardo7074 I guess. The 4060ti 16GiB version is going to be at least $550. If it even gets released.
For me, the $100 for better performance and more RAM is worth it. I'm going to pull the buy it trigger to replace my GeForce 1080. What's amazing to me is that this 2 year old card bests nVidia's card for $100 difference. I understand they're not supposed to be comparable cards.
I looked up some used 6700XT cards, and they're $100 or more cheaper than the 4060 and 4060Ti. And these two cards are supposed to be comparable.
Would love if you specified whether the 1060 is 3 or 6 gb. Great review though, fair and informative
Thank you NVDIA for making the 60s cards 10% better every year so the 7060 ti can maybe get the same results as the 4090
4070Ti
The 7090 ti can give you 120 fps at 8k with pathtracing on, with a headroom to spare! It’s a marvel of engineering
The 7060 can do 1080p 60 in some games, but not even 30 fps in others. Good deal at just $799!
@@itsalwaysdarkestbeforethes1198 How much power will the 7090 ti use? Maybe 1200 watts?
@@GamerErman2001 1.21 gigawatts
4060 ti i like it
My 6700 XT looks better and better each time. I just wish I had waited a bit longer as I paid about 100 bucks more than I should have for it. Even still, Team Red is looking to be a winner here.
It's only money, no point dwelling on it as you'll have already made that $100 back.
I can relate man I bought my 6700xt for $400, but think about it this way: I bought the GPU in fall '22, and GPU prices are ALWAYS going to drop as time goes on, so no point about dwelling about it. You had the opportunity to use the GPU much earlier than a decent amount of people
Dude I got my 6700xt in Oct of 2021 for like 800 something. I look at it now and it has paid for it's self. I have no need to upgrade to a new GPU.
Considering the 4070 costs at least around $745 and above here in India, instead of $600. I am so glad I got the 6700XT at the equivalent of 360 dollars last month. The 4060ti launch price will easily cross $500 here.
This! The silicon tax is way too real. For anyone thinking of buying a RTX 4000 card at msrp, no, they're not.
RX 6700XT is a great card, I've been using one for 6 months now with no problems. 3dsmax/CAD and plays new AAA unoptimized games pretty well.
Flipkart ❤
For the first time ever we got PC components for the same price as USA
@@KaranYadav-gr5xj I mean, on offer yes. We still pay a 16% tax
I'm sorry, did I miss the part where you critisize nVidia for almost 0% generational upgrade and no pcie 3.0 for a couple hundred dollars more??
The graphs are a far more significant upgrade since the last benchmarks video compared to this card VS its previous generation.
Props to the Labs and Editing team for listening to community feedback!
Would've been neat to see the RTX3070 in the list also. It's also a 8GB card, has a pricetag around the same as the 4060Ti and has bigger numbers overall.
They still need it to look somewhat decent 👀they also didn't really harp on the fact it loses to the 3060TI in some 4k games and is only like 5% faster at 1440p, but they need free gpus not paid ones. Still better review then Jay2Cents
the tragedy of the RTX 3070 is that it still could have been very relevant today. it has plenty of horsepower but totally kneecapped by that 8GB.
even just a 12GB RTX 3070 would probably badly embarrassing Nvidia's current offerings
@@silentwonder-chilled It is already bad for 1440p. 8gb isn't enough.
@@sudbtd That's a ridiculous statement. The fact it could be better if it had even 4gb more vram doesn't automatically make it bad. I've a 1080 that I still play 1440p on and it still performs fairly well on most modern games even though it's weaker than a 3070. And by fairly well i mean High 60+ FPS on most games, very rarely do I have to turn it down to medium.
I definitely don't recommend running out and grabbing one just because it still performs well as another 2-3 years that's probably not going to be the case anymore, but the point that just because it doesn't have 12 or 16gb of vram and loses a little performance because of that doesn't make it a bad card automatically still stands. The 3070 has similar performance to the 4060ti so if 1440p ultra 100+ fps on most games is bad.. yeah..
@@TheZanzou even though, on games like Jedi, the 3070 just cannot run? Ah yes. 8gb vram is "enough"
See, this is precisely why I’m glad intel got in the game, because being able to punch up considering their pricepoint is exactly the reality check you need to know Nvidia’s pricing is completely out to lunch.
Nvidia literally doesn't care at this point. They are only in consumer cards for name recognition. They honestly might just stop making budget cards all together given how the data center market is going for them. Budget cards from now on will just be last gen stock.
@@TheEclecticDyslexic Which is fine by me, I just won’t ever buy them. They don’t have a value-add for me.
I just built my sister her first budget gaming rig. I went with a 4060Ti. Personally I use AMD (and I love to trash talk Nvidia), but admittedly I do fight with drivers occasionally and that isn't something I want her to struggle with. She's really only gaming at 1080, and at $379 on sale it was hard to beat for the price-to-performance, DLSS support, and knowing it's just going to work. It's probably not a card for anyone who watches LTT, but I do think it has a place in budget builds for casual gamers, especially if you can find it on sale.
Why not 3060?
Thank you for including intel arc in the benchmarks!
I really like the little red bubble things to help distinguish where the information that’s being talked about is. Props LMG editors
I miss the times, where the 60 series was about the performance of last gen's 80 series
When?
@@RayanMADAO gtx 1060 vs gtx 980 ti and rtx 3060/ti vs rtx 2080 (and even older examples if you keep digging down)
@@RayanMADAO 980 to 1060, 1080 to 2060 (though it had a vram downgrade), 2080 super to 3060 Ti; Theoretically you could also count 680 to 960, since 600 and 700 were the same architecture. Oh, also 580 to 660 ti
@BjornsTIR 2060 was a 1070ti not a 1080
Don't ignore Bidenflation ... All 40 series cards are at the same price, or cheaper than their predecessor if u recognize the Bidenflation
Feeling really good about my 3060Ti I got for $430 a year ago right now.
Would’ve like to see the 3070 included in the comparisons
It's faster.
But nvidia would not like that. So they did not include it.
No point in including another 8GB card you shouldn’t buy. Besides, helping them sell 3070s would have been doing Nvidia a favour and they don’t deserve any.
Its actual peak comedy that there's one guy still riding on the big old "LTT the Nvidia child company" bullshit and another explaining that LTT wants to do anything but help nvidia, right next to eachother
Made my day
It’s the same card.
As now a dedicated Arc user (my a380 is treating me well, thank you very much) I am so happy to see Arc putting up a fight in price tiers nobody seems to consider as valid.
Thank you for your service! We should support Intel GPU launch to get in the main ring with AMD and Nvidia.
You poor soul. Thank you for your sacrifice!
@@Innosos "poor soul" is a bit much. 😂😂
@@Innosos You're talking as if Intel can't make decent cards. They can, people just need to be patient and wait for better drivers because the hardware is solid, only software is their weak point.
But in the long run, Intel is going to force both Amd and Nvidia (Nvidia mostly, Amd is still somewhat consumer friendly) to get off their high horse prices because aint nobody buying overpriced gpus anymore when you can get an A770 16gb for 340$ that offers similar, if not better performance than the Rtx 3060.
Airbus gpu
Thanks for including the RTX 2060 in the results. It's good to have cards that are only a couple of years old that many of us still use shown to give us a good idea if an upgrade is worth it. Slightly older cards are always left out so we end up with apples to apples comparisons, when many of us need apples to oranges.
Never forget that about 98% of us that game aren't rocking the latest and greatest or have the cash for top tier hardware.
I'm convinced that Nvidia has gone into a tick tock cycle.
10 series - good performance
20 series - ray tracing / dlss
30 series - good performance
40 series - frame gen
50 series - good performance?
The first GPU that I ever bought in my life was in 2012 - a GTX 660. I was 13 years old back then and built my first Gaming PC ever. I cannot believe that this old card that I bought for 200€ and that is still somewhere in my garage today, has a bigger interface then Nvidias latest mid range cards in 2023 that costs 500€ and more. :D
The GTX 660 was a nice card, I bought one (GTX660-DC2O-2GD5) second hand in January 2014 for just 140€. I miss the times when graphics cards weren't so expensive
@@choppings54 Laughs in 3.5 Gigs :D
150€ for a GTX 550Ti back in, idk, 2010 or so? After that a GTX 670 in 2012 for 350€, which was a huge jumpnfor a poor student back then. Now I'm looking at something that's supposed to be just one tier below the xx70 card (while looking like it belongs in the xx50Ti tier) and it costs more than my first PC as a whole...
@@choppings54 There is a GTX 1080 in my PC currently, however I barely play on my PC anymore. I game on my Steam Deck or on my Xbox Series X most of the time.
The 60 series card is going for what I bought my 980 STRIX for back in the day, absolute insanity
Not to mention the used market for Radeon cards if you are willing to take the chance, you can score a 6800 XT for cheaper sometimes!
@@rustler08 thanks, yikes, guess I checked a while back, they used to be cheaper
@@harwil32 nvidia cards are cheaper on the second market, maybe because there are more of them.
except you can do the same thing with nvidia gpus...
0:18 - My god, NVIDIA is just pulling an Apple with this gen.
I said it when ARC was released - I'm glad we're finally at a point again where there's a viable 3rd option. Just, so long as you're not concerned about being made fun of when your buddies ask what card you've got :)
Nvidia keeps going the way it is Intel will be nothing to laugh about much faster, they've only released thier 1st gen and in 2yrs it went from 🤣 to "well maybe?" 🤔 if they release battlemage and keep at it with driver improvements we'll have a 3rd contender by next gen
@@scarletspidernz Battlemage does seem like it's coming out, the real question is whether their 3rd will come. Part of the problem is that to my understanding, some of the hardware issues with Alchemist were found after Battlemage already entered production. So the real test will be when Cleric or Crusader or whatever they call it comes out.
@@Junebug89 hopefully they keep at it this is a golden opportunity for them atm
lol😂
As someone who bought the rx 6700 xt I'm relieved that it doesn't make that much of a difference even if it's less efficient
Exactly why I upgraded to 6800xt from 1660ti. Team red actually gives what consumers need. I have no worries with my 16gb of vram at 1440p. Seems like nvidia is becoming another asus🤨
Quick question as that’s what I’m considering to upgrade to as well; currently running a 2060 OC, and it gives me 65-85 fps at 1440p but the 6800XT looks amazing for the price & performance (PLUS ALL THAT VRAM)… would you say your AAA gaming experience is pretty amazing in terms of fps & settings? Really want a bump with games like RDR2 & GOW 2018 … and the 6800XT looks more than capable for 1440p. Genuine question, greatly appreciate any responses back!
My 5700xt has.been holding off well also $300 back in Feb 2020
I'm waiting for the 2nd gen intel arc cards to come out before I finally upgrade my 1070
@@mass_stay_tapped_in528 consider buying a 6950xt instead of the 6800xt. The prices have dropped quite close enough that it makes it worth it. Like $60 price difference.
@@mass_stay_tapped_in528 6800xt is an awesome card for 1440p! I currently own 6700xt and the thing meets all my expectations at 1440p. 60 fps in the latest AAA single player titles with high settings, 90+ fps in beautiful online games and older AAA titles. 6800xt is like 50% more performance than 6700xt so it'll translate to 90 fps in the latest AAA single player games and 140+ in online ones.. I don't think that I NEED that upgrade, but I know damn well that I want it 😭
I also believe we should be thankful for the 4060ti…the entertainment value on watching it burn in reviews is priceless
Only price is the issue, the card is slightly flawed but still fantastic as a full package. If it gets cheap enough it would be a very good option even with the lower RAM.
If it doesn't, then it should be avoided.
I was told by friends I was nuts for doing a 3090 for 4K gaming when it came out. I knew VRAM was going to be an issue. I mod games a lot, and that eats VRAM. It was only a matter of time. Now, I can still sit comfortably not worrying about it for a few more years.
Unfortunately fanboys are always serial copers and Nvidia is no different. I know Some guys who swear up and down that VRAM is irrelevant and nobody needs it and you're just an AMD shill for saying VRAM is vital to modern gaming.
“few more years” you’re set for a very long time.
Duh you bought the extremely over qualified and overpriced top of the line graphics card of last gen, it better be decent for a lot more years.
24 gigs of VRAM is going to suffice for at least 8 years at 4K
Love my 2nd hand 3090. 900€ and playing on 1440P everything by maxed out.
ACC
F1
Flight Simulator.
Of which i play ACC and Flight sim in VR sometimes.
I don’t even get why people would go for a 4080 for 1500+ while paying 900 gets you a nice 3090 with 24gb vram.
I don't know who had the idea to put a green and a red bar in the charts, but that is extremely useful! Thank you!, finally I can enjoy this type of content without pausing it every single time a new chart appears. Yes, sometimes is useful to see where all the other products are located in comparison, but sometimes I just want to hear the speaker talking.
I just bought a 3060TI for $320 brand new last week , a day later they announce the 4060 family and I regretted that I didn't wait.
Little did I know, there was nothing to regret
*laughs in 256bit bus*
the 4070 is not bad, it's on par with the 3080 and it's cheaper, people lost their minds when the 3080 was out.
Same deal for me with the Arc A770. I wanted to buy a 3060 but I needed the extra VRAM for my job and AMD cards unfortunately don't perform well for my work.
@@21preend42Lost their minds in what way? The 3060ti & the 3080 were the best 3000 series cards imo.
@@21preend42 the 1070 was better than the 980, the 2070 was better than the 1080, the 3070 was on the toes of the 2080ti..... the 4070 is also horrible value and you guys need to grow some basic standards
Guessing Nvidia said reviews couldn't compare it to the 3070? 4060ti has basically same performance as the 3070 for up to half the watts. Would have been a good take. Now that the prices are below msrp, not a bad buy.
So glad I picked up the 12GB 3060, really coming in handy lately
Same here that 3060 aorous elite 12gb card comes in so handy now
@@rustler08 You may be right but it's pretty fast compared to my old 980
@@rustler08 Something that people dont mention is Power Efficiency. I considered a 6700/6700xt but both draw more power than the 3060, i dont need my electric bill going up even more when i have other things i have to worry about.
Should have gone for the 6700xt if you wanted the 12gb vram
@@rustler08 You're right and this is why the VRAM complaints are kinda dumb. In all honesty almost 0 games will use all 8gb, and if it does you can just turn down the graphics from high to medium and you probably won't even be able to tell the difference. If you're trying to play 4k on a 60 series card you're doing it wrong, and that's about the only time you will use all 8gb. The AMD cards absolutely destroy it in some games, but in others the 4060ti is 10-20% better, so I guess it depends which games you play. For me personally I stick with nvidia for now mainly because of shadowplay, and most of the games I play are better with nvidia.
Concerning the 8 GB of vram, it really makes me miss the HBCC high-bandwidth Cache Controller from the Radeon Vega graphics cards. That was an insanely good feature especially when coupled with the HBM2 video memory. Good Times.
HBM will be fondly remembered 🤧
As an owner of a Radeon VII, I completely agree! I just wish Vega had been better for gaming rather than compute.
I honestly think that a 6900xt with HBM would have given the 3090ti a run for it's money, as it always seemed like it was the memory bandwidth keeping RDNA2 back. Not a gpu engineer, though, so could definitely be wrong
Didn't HBCC do basically nothing tangible for performance?
@@TheHavocInferno you are right. HBM didnt do Jack shit for those GPUs except make them expensive garbage. Those GPUs had nowhere near the power needed for HBM to make any difference. It was a marketing scheme, same way nvidia is adding bigger l2 cache while cutting bandwith, instead of actually adding more vram capacity. Its all just marketing schemes. The fact that these fanboys above think HBM was really good, really shows you how dumb people are to logic.
I was so excited about the 16GB variant when I first heard about it. I've been wanting to replace my 1060 for years with something not overly power hungry, but I also have plans for a CUDA application next year that ties me to Nvidia. The stupid 128-bit bus is a double-whammy for me (gaming and computing) that's going to force me to buy something higher tier and run it in a reduced power mode
Why you need CUDA?
Same here. VRAM bandwidth is piss poor on RTX 40 series, and a lot of compute application scale in performance directly with raw VRAM bandwidth, not "effective" bandwidth, as the cache is useless here. So you get half the performance of the RTX 30 predecessor for the same or higher price. DOA.
buy last gen used then? 3080, the one with more VRAM?
@@MindBlowerWTF I've been waiting for AV1 since before most people have heard of it. Strongly considered getting a low-end Arc card specifically for it, but I don't want to deal with managing 2 gpus with different sets of drivers across both Linux and Windows
@@ever611 I write computing software that works with matrices that have effectively no upper limit on their size. As the matrices become larger, CPUs just can't keep up in terms of speed. It's the type of computing that makes sense for GPUs, and I want to target people with access to HPC facilities/supercomputers, which tend to predominantly support Nvidia (including the one I have access to). Since it's so heavily used, there's also just a ton of resources on CUDA programming. OpenCL or Vulkan compute could be viable, and I would love to eventually support non-proprietary alternatives to CUDA, but CUDA is going to be the lowest resistance route to reaching the most people I'm trying to target
NVIDIAcould have released the 16gb 4060Ti for $430, making a mid-range card that is future-proof, and not a total rip-off. But if they cared about their consumers, then they wouldn't be NVIDIA
Can confirm the 6700xt for 1440p is a great card, best bang for your buck in the price range for someone who doesn't care much about ray tracing.
6750XT owner, can defenetly recommend that one too.
@@brutallica2944 Same, managed to pick one up on Newegg for the same price as a 6700XT and couldn't be happier with how it's performing at 1440p
I really hope 7700xt is sub 400, that would be such a smack for the 4060 ti
How would it work with a 5120X1440 240 hz like the samsung oddysey g9?
@@hampuswirsen7793 it would work just fine, but don't expect to play most games anywhere near the 240fps mark. Esports games with lower settings and/or fsr you may get close but most newer games at 1440p high I get between 100-170
This RTX 4060 Ti seems like good news for AMD's older gen cards.
Game designer (in progress) here, on the note of the VRAM thing.
Pretty much it's just industry growing pains, developers are pushing the traditional geometry shading method to it's limit in regards to complexity and it's overwhelming VRAM busses (and CPUs) all around.
Tack on UE4 not being built for that level of complexity like in Jedi Survivor or Hogwarts Legacy, you get a lot of problems.
We know this is a growing pain thing because Fortnite with UE5 with Nanite enabled, the mode that HAS EVERY SINGLE BLADE OF GRASS BEING A MODEL runs actually pretty fine on VRAM despite the current size and complexity of the game map despite that vegetation change when Nanite is on resulting in a geometric complexity at least at the level of Jedi Survivor.
Virtualized Geometry/Mesh Shading is the solution to the VRAM problem as you only need to make your full-res model and a PLOD (Preferably), versus the Full-Res Model, 4-6 LOD tiers, then the Persistent LOD of standard shading/geometry.
When models are AS Complex as they are nowadays, those extra LOD tiers are not "Simple" at all unless a developer bruteforce optimizes it which you only see on console Exclusives because PC players like extra scaling so you have to forego that to a degree or other consoles may not take kindly to the LOD Setup you have for that specific console.
How insecure do you have to be to say something so unimportant to anything?
@thomasboob559 I'm just saying that optimization takes longer/is harder becuase of how they are pushing against complexity limits for the current Paradigm.
@thomasboob559 It barely runs on the last generation consoles because of CPU problems. Heck even had problems on current gen last I recall
Can I just say how glad I am that Intel finally has a viable GPU product? The more competition, the better imo, AMD can't take on Nvidia alone.
4060 ti i like it
Still don't regret getting a 6800 in Feb. 16GB is nice
My decision on buying a 6800XT at $540 is getting better and better. Thanks NVIDIA.
You have 6900xt for that price actualy.
Yeah same. Buying a Nvidia card is just a waste of money if you care about price.
Yeah but you dont get DLSS and RT. And I turn those on in literally every game on my 3070ti. It doesnt hurt performance enough to notice. If you have a Radeon you're basically playing on low settings for every game. The difference between RT on and off is way bigger than low to ultra.
@@scubasausage RT is sweet, or can look great, but it's not at all like Low to Ultra, guess you've never played at lower settings - it's a cherry on top for more realistic lighting and it's not even always worth it. Some games certainly look great and can benefit quite a bit from it though.
Also, the 6800XT can handle some RT, even though surely not at RTX 3080 levels.
@@dennisjungbauer4467 You are incorrect, RTX on is a way bigger difference than from low to ultra. You must never have used ray tracing before.
Also a 6800 can do ray tracing but not at playable frame rates. It gets regularly beaten by a 2070 when you turn RT on.
While I can cross-reference the 4k data, I would love to see a VR performance rundown. VR is still a smaller market, but maybe a once a year rundown of current cards to current heasets and games would be great
@@remeuptmain No need to be rude. VR is a very real niche, and the performance needs are different than 1080p, 1440p, and 4k. This is great feedback for the team to have.
@@remeuptmain Are you really going into comments sections and getting mad about people having opinions?
@@remeuptmain u post minecraft videos and get cooked in your own comments....
@@remeuptmainis that cause yours walked out?
@@remeuptmain you must be too poor to afford VR, I can see why you’re so angry all the time lmao
I just realised the 4060ti has fewer cores than the 3060ti, and the two things it does better are L2 cache and watts, but decreased power for the same performance is expected when decreasing the size of the transistors on the die but they also took away a lot of the other things that would take more power anyway.