We do our best to catch mistakes and fact check, but sometimes things slip through into production. We're sorry about that.
Corrections:
5:19 The Xbox Series X actually has 16GB of shared memory: 10GB on a 320-bit bus and 6GB of slower, 192-bit memory. The PS5 has 16GB at 256-bit. Makes the VRAM situation seem even more dire.
6:06 The 3060 Ti has just 8GB of VRAM. The 3060 12GB should be highlighted instead.
6:19 Typo in the graph: all 1440p results were tested at 2560 x 1440.
I wish everyone at LTT made fewer videos and valued quality over quantity. Please, Linus: I prefer LTT over GN. Your video style is better for a general audience. However, this problem is becoming more apparent and annoying to an avid watcher.
They won't until they fix DX11 and the CPU overhead at 1080p. My R5 5600 is bottlenecking my A750 in a decent number of games. Throwing it in my R7 7800X3D build fixed that, and in some DX11 games it massively improved performance.
An Intel GPU caused me a nightmare today. I went to a new customer that has 10 PCs with Arc A380s installed. Every single PC booted up today and showed the desktop, including icons, but the mouse and keyboard were unresponsive. I was able to repair it by booting into safe mode, downloading the latest driver, then pulling out the network cable and removing Arc from Device Manager. I installed the new software, booted back into Windows, and thought, great, that's fixed it. Two hours after I left the client, after the PCs had gone to sleep and woken back up, the same issue reoccurred, so they are dumping them for Nvidia 1660 Supers.
I'd want them to succeed if their cards were actually any good. Competition for the sake of competition won't be great for customers if they're buying overpriced trash in a choice of three colours.
I mean, it's equally valid at this point to look at higher-end cards within those generations. I'm seeing plenty of 2080s under $250 (within the US at least), which, depending on resolution and game, can give you a substantial uplift, more into 6700 XT territory according to some sources. Of course it wouldn't be new in box with a warranty. While I'm comfortable buying used (the 3080 I have now was bought "for parts or repair"), no hate to anyone who isn't.
@@AmaraTheBarbarian Old top-tier cards were always the way to go, but those cards use too much power. So in the long run you will spend more, not to mention a possible PSU limitation that in some cases makes it even more pointless.
fr. i have a 1650 atm, which i think is in the middle of those 2, and now i know how insane of a boost i'm gonna get if i buy one of these (which i will). like... that's a 2-3 TIMES performance increase. in minecraft with my shaders i get 40-60 fps (100 when not at my survival base and about 300 without shaders), and now i might be able to get above the refresh rate of my monitor
Love the animations that draw attention to a specific data point you're talking about in the graphs. Makes it much easier to digest at a glance (especially at 2-3x)
Hmm, I don't agree completely. I think the animation could be better; I forget which card we're talking about if the red dot is all the way at the end of the graph, and before I can trace the card name back vertically they're already talking about something else.
@@zetsubou3704 GN gives you data to analyse yourself and cross-reference. It's often plainly more informative than LTT, but it's not meant to be a little candy to draw your eyes, which is what LTT aims for. Different approaches entirely. LTT is for when you want to hear opinions and a review of a new product and don't want to think much. GN is for when you want details and want to dig into the topic.
@@zetsubou3704 Tbh, for me it's easier to see which card they're referring to over at GN than these red dots suddenly appearing somewhere on the screen.
@@NickyNicest It's also because it was/is (IIRC) one of the most popular GPUs for a long time. So for a lot of people, like me, it was their first GPU, so it's just a frame of reference for a lot of us.
@@neuro3423 Yeah, same here. I haven't kept track of video card performance for years, so I don't have a whole lot of reference nowadays for the numbers, lol. Then I see the 1060 listed and I'm like, "oh yeah, that's how it performs, ok, pretty decent."
Exactly! They really need to keep doing this! So many people still have a 1060 (just check the Steam hardware survey) AND the GTX 1060 6GB is so legendary by now that pretty much every gamer knows what gaming feels like on it!
I just want to say a big thank you, Linus and team, for highlighting the graphs and for adding the big red dot by the GPU being compared. Thanks for listening to us; it's been a huge improvement and makes the graphs make more sense now too, love it 😭👌
You say that, but many people aren't willing to make incremental upgrades for just a little more when building. It's a mystery. They will spend >$100 more on a bling mobo and CPU combo and then pull a 3060-with-13600K pairing kind of BS.
Once these mid-tier cards calm down a little, I think it would be worth having an "Intel... how are they doing now?" revisit. This gen, it definitely seems they are the ones putting in some serious work, and maybe the unsung heroes? The A770 numbers in this video were surprising.
Yeah, the A770 seems like a great buy, but there's still a bunch of issues, like the lack of support from Oculus (afaik) and some games not launching. Once these issues get fixed, it'll be a great mid-tier VR GPU (because stuff like VRChat needs a lot of VRAM).
@@woodtvnetwork The fact that the 4060 Ti can perform worse than the 3060 is quite the wake-up call we needed to ditch Nvidia until they get their shit together.
Yep, and both AMD and Nvidia left Intel Arc a big reserved parking spot smack bang in the sweet spot of the midrange. Drivers and game devs will make Arc a LOT better over time too. Years ago, when Vulkan/DX12 games were first releasing, a lot of the talk was that low-level APIs put much more of the onus of hardware performance on game devs, not just driver teams as with DX11. CDPR has already demonstrated this: they released a patch for XeSS, and Arc's general performance even without XeSS also significantly improved in their two big games.
Even if Battlemage only gives us a generic generational leap in performance, that would be enough to beat both Nvidia and AMD, since there's pretty much no improvement in regard to price. And let's not forget that, benchmarks aside, on paper the A770's die is bigger than the 3070's, and it came out with 16GB, AV1, and better ray tracing than AMD, yet Intel still managed to price it at $350.
I just got a 750 for this very reason. While it's not going to be the most stable to begin with, I think Intel has the potential to oust both Nvidia and AMD in a big way with their future-gen cards. I think a lot of people also forget that this is a first-gen card with brand-new drivers that is still a very viable competitor to cards released by companies with two decades of experience and drivers behind them. Lots of potential that hopefully we'll see realized with driver updates and the release of the Battlemage GPUs sometime next year!
@@notfunny3397 Iunno, the 750 is close enough to a 770, which was close enough to a 3060 Ti. So getting a thing for $200 that's similar to a $400+ thing sounds fine. Unless you're just gonna say that since they're not Nvidia they suck and the cards should basically be free.
@@joemarais7683 Ehh, tbh, as much as I hate Nvidia, they have a lot of benefits I can't ignore. Even if everything were matched at rasterization performance, Nvidia GPUs GENERALLY have better RT, better upscaling (DLSS 2/3), better efficiency, longer driver support, better screen capture, etc. I prefer AMD's driver application, though.
@@notfunny3397 All of which are very minor advantages; CUDA is the major one. Don't get me started on RT: the vast majority of people don't play games that have it, and of those that do, most don't even turn it on even when the hardware can run it. It's not there yet. Two or three more GPU generations and it should be important; till then it's a gimmick for when you get too many FPS.
I picked up a 6600 in October of last year for ~$200, and have been loving every second of it. I've only got 1080p panels, and I have no real qualms with it. Massive upgrade from the 1050ti it replaced!
Upgraded from a 1050 Ti to the 6600 too. I was about to pull the trigger on an A750, but my 450-watt PSU said no. No regrets with this card. Better than Novideo's 3050.
Even a modest VRAM increase to 10GB, like the 6700 (non-XT), could have gone a long way toward making this a fantastic 1080p option, as well as handling some recent games and most older games at 1440p. Then giving it a price comparable to the 6650 XT would make it sell like gangbusters.
I've got the 6700 and I game at 1440p native in all my games. Mostly maxed-out graphics settings; occasionally a game might need a little tweaking, like dropping from Ultra to High, but on the whole 10GB is enough for 1440p for the time being in new games. The 7600 is $260, but for $280 you can get a 6700; that's the no-brainer. I agree it needs to be cheaper to appeal to the widest audience of budget-conscious gamers.
The slightly wider memory bus for 10GB would also make it faster. And there'd still be the possibility to offer a cheaper version with 8GB e.g. XT and non-XT.
@@jochenkraus7016 If they don't want to double memory sizes, it really is time for odd memory bus widths: 160-bit with 10GB, a few more shaders, and the original price point could be a winner at the same margin. But both AMD and Nvidia went full greed.
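The appeal of an odd bus width like 160-bit is easy to put in rough numbers. A minimal sketch, assuming GDDR6 at an illustrative 18 Gbps per pin (an assumption for the example, not the spec of any particular card):

```python
# Back-of-envelope GDDR6 bandwidth: (bus width in bits / 8) * per-pin data rate.
# The 18 Gbps per-pin rate is an assumed, illustrative figure.
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float = 18.0) -> float:
    """Peak memory bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gbs(128))  # 288.0 GB/s -- a typical 8GB-class bus
print(bandwidth_gbs(160))  # 360.0 GB/s -- a hypothetical 160-bit/10GB config
```

At the same per-pin speed, going from 128-bit to 160-bit buys 25% more bandwidth along with the fifth memory chip that would make 10GB possible.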
@@MeakerSE Yep, neither of these companies is ever going to give consumers what they need. The cost to add more VRAM is not that much more. It's just about slow-rolling the specs and getting as much money as they can.
It's simply clever marketing. There's a blatant hole in the new budget GPU market right now, they would have been stupid not to take advantage of it ^^
@@donkeymoo1581 Sounds like you completely bought into the marketing. Who cares about ray tracing? FSR is a good competitor to DLSS. You wouldn't even need it if you got AMD, because you'd get an actually decent card with good price/performance.
@@donkeymoo1581 I have an EV and an AC that use a lot of power. Insane power consumption is all relative, and the difference between the 6700XT and 7600 is minimal.
@@nickwilson3499 Well... marketing, because the efficiency of AMD's 6000 series is actually quite OK (on par with or better than RTX 3000?). But the 6700/6750 sits at too high a TDP (for me) and is the first >8GB model AMD offers. Nvidia at least has the 3060 12GB (also why I'm pissed off at the markup on the 4060 Ti 16GB). It's true the 6650 XT offers more FPS for less money, but it's 8GB only, and I have a bit of FOMO about getting an 8GB card at the moment. The 3060 12GB is slower, but there isn't a game where its minimum FPS is below 60 while the 6650 XT's would be above 60, so the experience (to me at least) is the same; the game is locked at 60fps anyway. And DLSS is (again, for me) a feature worth paying extra for: more supported games, and the quality is superior to FSR 2.1 and below. FSR 2.2 looks nice but is only available in one or two games right now, and FSR isn't really upgradable by replacing a DLL file like DLSS is.
The 6700 XT is a really strong price-to-performance champ, coming in about $200 cheaper than the next card up, the 6800, and about $100 more than its lower-tier rival, the 6650 XT. It gives really high 1080p frames in anything, above-average FPS at 1440p, and half-decent FPS at 4K with lower settings or FSR. Excellent value card; I'm so glad I got it. It will last me until a great sale on the 7000 series or until the 8000 series comes out.
my 6700XT hasn't gone over 14% usage yet and I'm playing emulated games with 4k graphics mods. Safe to say, it's powerful enough for me and probably for most of us.
@@Michal_Bauer I smuggled a used RX 6600 XT for $190 + $20 from eBay plus a local delivery service a few months ago. The ASRock Challenger is perfect for this, since it weighs 1kg in the box.
So glad I got my 6700 XT back when it launched. I was going to switch over to the 40 series and sell my 6700 XT, but based on these charts and my own personal experience, hands down team red wins.
Thanks for highlighting the card in question in graphs now! It’s a small change but easier to follow along while also listening without having to pause, much appreciated!
I'm still holding out on replacing my 1060 until the next gen Arc drops. If Intel can stay at the same price point, I have a feeling it will give red and green a very competitive run for their money.
My question is: will you upgrade to a new 1080p GPU, or will you go for something above that resolution, 1440p maybe? If so, there are great GPUs already out there at very acceptable value (especially the 6750 XT and 3070). If your plan is to get something for 1080p, don't wait that long. But if you're looking for something for 2K-4K, on the other hand, get something amazing!
@@AndreyStevn I've decided to go with team red. I'll be taking a trip to my local Micro Center next week (it's only 20 mins away). Time for some upgrades, including a 5900X to replace my 1700X. Still on the fence about getting a 7600 or a 6700 XT. 1080p is fine for me, as I only have room for two 24-inch monitors on my desk.
@Randolph Ortega I've been on an RTX 2060 for 4 years now and it's not bad, but I switched to 1440p 2 years ago and it's starting to show its age. Probably my last upgrade before a new rig entirely.
don't forget to do a clean uninstall of your old drivers, so that there's no potential conflict (which could lead to instability or performance issues)
THANK YOU for actually highlighting the portion of the graph the script is referring to. So many times I've had to pause, jump back, and scan through the list to find what they're referring to; that little change makes such a difference for hearing, seeing, and observing, all within the window of just watching the content.
Been very pleased with my 6700XT. Very consistent 4k* gaming experience with FSR even on some good modern titles. Only got it 2 months ago so I missed out on the price drop but it was worth it to pay $70 to have that level of improvement for the last 2 months.
@@Shpeedle It's definitely better than some cards, but just today I watched a video where it was compared with some other GPUs around $300-$350, and it had literally half (or less) the performance of the others.
@@shabanino I got the RTX 3050 during the mining shortage knowing it was a bad gaming card, specifically for editing in Adobe Premiere Pro and for up-converting video with Topaz AI. For that purpose, I don't regret it, and I would have bought a different card if it were for gaming. That said, I wouldn't buy it now, but when other much older cards that also shouldn't be bought now are on the list, I think showing the 3050 makes sense.
They won't really have one unless AMD and Intel start to push upscaling tech that's equivalent in quality and game adoption (plus a good level of RT performance). The problem is that Nvidia is literally fighting in another league with its technology, and that's why it has most of the market share. Like it or not, with AMD you're paying less because it offers less; they would charge the same as Nvidia if they could really compete on the same level.
@@Frigobar_Ranamelonico I haven't kept up with things lately, but isn't DLSS still game specific? FSR works with literally anything. I like that approach much more
I like that you're trying to show which GPU you're talking about with a red dot (like at 2:21), but I think a more subtle effect over a larger area would be better. For example: making the background area of the card's bar pulse white, like the static green you use for the RX 7600. Or putting the red dot by the name of the card, or a combination.
You guys really should have included the Arc A750, maybe even over the A770. It's been going for $200 lately, and I find it's the most interesting competitor to the 7600.
As someone who bought an RX 6650 XT a couple weeks ago, I'm happy to hear that it's still competitive, even if i could have done a little better on price or performance by waiting longer.
It really is sad just how terrible this generation of cards is. It's very similar to Nvidia's 20 series, but this time for both AMD and Nvidia. Just absolutely piss-poor performance from both companies.
I had a dream that I was brought on to be part of an LTT video. It felt so real and the time crunch and pressure to perform was also very real. I feel weird about watching videos now. Just wanted to share this.
Man, when I built my PC in 2017 I was able to play every game at 1080p ultra no problem with my $200 GTX 1060 3GB. The 10-series era really was the pinnacle of the PC gaming experience, and it has been going downhill ever since.
Cyberpunk is what made me upgrade from the 1060 3GB card. I barely got 30fps running around and don't even talk about the sub 10fps driving. Upgraded to a 6700XT and haven't looked back.
@@argonzeit The 6GB version wasn't faring that much better either; while the frame rates were better here and there, the frame times were almost always horrendous.
One year later, driver updates have removed the early stability issues, and it's still running great. It outpaces the 2060 and 3060, and I love great-looking games running at 1440p 75Hz.
Good on AMD for listening to feedback and adjusting the price at the last minute in a favorable direction. It would have been ideal to hit the $250 price point, but it's a step in the right direction. Undoubtedly, in the coming months it will see sales that bring it down a bit, making it even more appealing and affordable for a mainstream budget card. The 8GB of VRAM isn't such a big deal at this price point either (as opposed to the abominable 4060 Ti 8GB at $400, which is inexcusable).
Actually, it didn't even take a month. Amazon is selling it, along with RE4, for $260, with a coupon, at least in the US. I don't know whether your "future transmission" was "serious" or not about it being a no brainer, but it's at that price, now.
Gotta love the fact that the Intel cards just aren't finishing benchmarks; they're like Nicholas Latifi in F1 last year. I hope Intel intensifies development on Arc; extra competition in the GPU market could make a huge difference.
I'm glad I went with AMD this year for my GPU. Missing a few Nvidia exclusive features but I'm still pretty happy with it. I expected to lose more features but AMD has a lot of their equivalent features.
Well done! Yes, it's difficult to break away from what you're used to (probably one of the reasons people keep going with Nvidia even when they're bending you over publicly, lol), but you will feel great knowing you got a much better deal! And there's not much you can't do on AMD that you can on Nvidia now, unlike a few years ago. They've come a long way :)
Could anyone check whether AMD's replay feature still produces a separate audio file when splitting audio tracks? Back when I used an AMD GPU it did that, and it was annoying.
Got that marked in my calendar. Hopefully this means AMD saw the writing on the wall and is gonna make some waves with their pricing, and maybe, just maybe, it will force Nvidia's hand. I'm not gonna hold my breath, but one can hope.
It'll force Intel, but I doubt it for Nvidia. They have a huge following in the GPU market. I'm just glad AMD caught up with performance a few years ago.
2016 levels of performance...!!! The GTX 1080 with 8GB of VRAM from 2016 at $499 has the same VRAM and is about as fast as a 3060 8GB, which was also selling for $499 at the beginning of 2023.
@@cosmic_gate476 Yeah, 2016 is when things started to stagnate on VRAM. The 1080 Ti had the same VRAM as the 2080 Ti, and the rest of the lineup was in the same boat, with the only cards with any decent amount being the Titans. The 30 series did it AGAIN with the 3090, with only tiny boosts on the 3080 12GB and 3080 Ti over the previous gen. Even 12GB is already looking like the new 6GB, with a handful of games using well over 12GB at 1440p if you give it to them. 16GB should be the new mid-to-upper-range standard going forward, with 32GB for the high end. I'm honestly surprised we haven't seen more VRAM on more cards by now, but then we all know Nvidia has to keep those margins up at the expense of gamers, right? It's honestly pathetic how stupid Nvidia thinks we are, treating us like this; that extra VRAM would still leave them with impressive margins, and we'd all be way better off for it.
@@TheOriginalFaxon 6700XT has 12GB, to me the obvious mid-range choice right now at $360. That's the bare minimum for me - my 1080ti 11GB is almost always maxed out. Looks like I'm keeping it around for another couple years at this rate, cause I need CUDA so I can't switch to AMD.
@@cosmic_gate476 They're referring to the low vram amount issue that the 700 series had which caused them to age poorly similar to this generation. He obviously doesn't mean that the rtx 4070 literally has 2gb vram. 2gb of vram in 2013 is like 8gb today.
me too. It's the perfect card to hold me over until a great sale on this generations higher end cards. For the majority of games I play it's more than enough. It's hilarious taking all the old games I used to struggle to run and maxing settings in 4k with no problem
@@jp4361 even used I doubt it. What's the cheapest 2080ti or 3090 out there? I think even in a few years the 4090 will still be an excellent 4k card and massive overkill for 1440p.
Damn, these AMD and Nvidia GPU reviews lately are actually making me consider grabbing an Arc GPU even more. I'm surprised to see how the A770 outperformed the RX 7600 and was within striking distance of the 4060 Ti. Considering this is Intel's first GPU and was released last year, it really puts team red and green to shame. If they keep up those driver improvements, it's gonna cause some major disruptions in the GPU space. I can't wait to see the 4050, I mean 4060 non-Ti, benchmarks, lol.
I was set on finally upgrading my 1060 to a 4070 this gen, but am so pissed off that I’ve gone and bought an A770 just to support Intel so they can come out with a great card next gen.
@@dearlove88 Bruh, I feel you. Ideally I would get a 4070 as well, but that $600 price tag is insulting. As a creator and a gamer, the A770 is looking like the next real option, as its performance is on par with anything else in its price range, plus the 16GB of VRAM and AV1 encoding are a real bonus for me. That said, stability with apps like Resolve and Blender is a big concern for me.
For a similar price, the RX 6700 XT also looks great. It can run most games at ultra 1440p, has almost certainly enough VRAM for future 1080p gaming, and is similar enough to a PS5/Xbox Series X in performance.
The 6750xt from MSI is on sale right now for $330 on Newegg. The Intel A750 is only $200. They may be going down even further now that this card is released.
You should do a video about the best bang-for-the-buck GPUs at the moment! Maybe covering all the different categories, from low end (1080p) to semi-high and high end (4K) gaming. Not only the current generation but also past ones.
It seems that GPUs as a whole got too powerful in the last generation, to the point that even the lowest-end chip could put up a fight at 1440p and 4K, so now they're limiting VRAM instead to force each tier of GPU into its respective resolution.
It's weird; the 3070, being an 8GB card, is running the way it's supposed to, and there are hardly any VRAM problems for my use cases. In certain demanding tasks, yeah, the lack of VRAM is a slight issue, but not that noticeable, since I went from a 1050 Ti to a 3070 after the crypto crash.
The 3070 has been shown to have VRAM issues as well; you're just not running the new AAA games at the settings where they appear. Most AAA games coming out in 2023 (that aren't Dead Island 2) are having some sort of VRAM issues at launch, especially if you run your 3070 at the resolution you presumably purchased it for, 1440p or 4K, or maybe even in some titles at 1080p maxed with ray tracing enabled. A review site did a test with a specially produced (or Quadro/A-series) 16GB 3070 versus the regular 8GB 3070-class cards, and the 16GB version was miles better in most cases.
@@AdaaDK Also, sometimes the game automatically downgrades the textures if it has insufficient VRAM, so it might seem fine to some people who don't realize their textures are effectively 720p.
It really depends on which games you play. For example, Hogwarts Legacy: the RTX 3060 12GB is certainly slower on paper than the 3060 Ti 8GB, which I have, but the 3060 non-Ti still outperformed the 3060 Ti in that game purely due to having more VRAM. In every other game I've played so far, the 3060 Ti is way better.
There was a test between the 3070 and the 6700 XT, similarly priced cards, but the 6700 XT has 12GB, and it hands-down beat the 3070 in ray tracing in new titles.
You can tell Linus isn't much of a car guy when he's using the civic for his analogy. I may have grown up with V8s fueled by hate, but even I'm cautious of a civic. A junkyard turbo makes those things come to life.
I've seen a lot of varying performance recently depending on the game and use cases, so I just gave in and bought a 6700xt, and I'm pretty sure it'll be enough to last me quite a while.
Looking at the specs, it seems the 7600 is identical to the 6650 XT aside from a small difference in clock speed in favor of the 6650 XT and a very small difference in memory speed in favor of the 7600. That means the positive difference we're seeing with the 7600 is purely architectural tweaks.
Micro Center is currently having a 6650 XT sale: just $219.99, in-store only. One heck of a deal, though. Not to mention the 6600 XT, which I am seeing as low as $199 online. If anything, the launch of these new cards made it the PERFECT time to get a truly budget card, especially if you have an older system like me (2020). I just replaced my 5600 XT with a 6650 XT and the leap in performance was astounding. I was shocked to see clock speeds near 3000MHz, where my old 5600 XT barely hit 1800. Not to mention I do lots of VRAM-intensive work, like VFX. It's nice to get a couple extra gigs of VRAM after dealing with 6GB for several years. (Yes, I know cards like the 6650 aren't the best for productivity, but for the work I do, it's fine. Not to mention I'm a poor college student, LOL.)
Hey, thanks for putting that red dot in there. In the past, oftentimes I wasn't fast enough to find the right bar when you were talking about the performance difference.
Whoever edited this video, or whoever came up with the idea for the red dots on those graphs: insane major kudos. I always end up looking at those graphs, start thinking, and tune out the audio (low comprehension skill) because I can't take in the info within the 5-6 second slide. That red dot helped me identify what you guys were talking about right away instead of going all googly-eyed at the numbers. May not seem like a lot, but hey, free upgrade. And I'm gonna be looking for those comments in November 😂
It is nice to see that Linus has a renewed energy on camera, since he announced he is stepping down from the role of CEO. I love it. Glad to see our NCIX kid back. 😁
I know it's older (and was not the most available card at the time), but I wish the Radeon VII made it into these discussions about needing 16GB of VRAM. I use a 1440p monitor and just don't know if we've hit the point where I really need to bother upgrading from my VII or 5700 XT cards.
Me: *gets 2 notifications and checks my phone* The notifications: - LTT: “AMD, you need to hire me!” - JayzTwoCents: “AMD’s NVIDIA Smack Down!” Me: “Oh, this is gonna be great”
As a data nerd, I would hope additional information gets included in the FPS charts. Say, when you're discussing the importance of VRAM, stating the amount of VRAM in the FPS chart would help us quickly understand your point.
At some point the price will be slashed further but even at its current pricing... it just doesn't have enough VRAM to justify the cost. Best to get a 12GB VRAM card for a bit more like the 6700 XT.
It's fine if you don't stream or have a 1440p-4K display with other stuff running alongside the game window. But let's face it: this crowd has 1440p monitors, or at least a dual 1080p setup by now, since the screen real estate alone is a game-changer. Hardly anyone I know games on a single 1080p display. Which means, yep, you need 1-4GB more to handle the side stuff / second display / extra desktop. Now, Windows is nice in that it was long ago decided that you will never truly run out of memory, but that means shuttling any excess off to your SSD. Ouch. Same as running Windows 10/11 on 8GB of RAM: you can do it, but 12GB+ is where they are happy.
@@josephoberlander I'm pretty sure I replied to you in another comment, lol. You need to learn the difference between VRAM and RAM; it doesn't take 1-4GB to do the side stuff. I do agree, however, that 8GB is not nearly enough, especially when you consider the next-gen games coming over the next year and beyond.
@@josephoberlander Thing is, it most likely won't have any resale value. Some games already need more than 8GB, so by the time you come to sell your RX 7600, it will be obsolete even for 1080p.
@@thepcenthusiastchannel2300 True. This is a stopgap measure. But when faced with $400+ "street prices" for anything good, this at least gets you up and playing games.
Glad I bought the 6650 XT back on Black Friday for $200. It has yet to get back to that price point, and it even beats the 7600 in some benchmarks. I'm kind of surprised that the 7600 is really no performance boost whatsoever over the previous gen. It could be that the drivers still need tweaking for the new cards, but neck and neck with old tech is not good.
the "incoming transmission from the future" bit made me chuckle so hard! and these repositioned sponsor spot segues are keeping me on my toes when i watch you guys. keep it coming! 😍
I have a 5700 XT as well and it still runs strong, even in the latest games at 1440p. I just look at the 6600 XT, since it performs about the same as the 5700 XT.
@@HeadphoneMAN0017 I have one of those fancy 49" wide monitors and it's a bit of a struggle to get 120fps in apex legends. I'm just trying to figure out how much of an improvement I'd get, but it looks like I'll have to look elsewhere.
@@HeadphoneMAN0017 5120x1440. Samsung Odyssey Neo g9. On low settings I hit maybe 130fps, but it will drop down to like 94 sometimes. I use the overclock tool and that helps keep things consistent. I'd love a video card that could run it at 240hz, but I don't want to spend that much money lol. My wife won the pc and monitor, so I'm trying to keep the price as close to zero as possible lol.
I thought about upgrading to a 6650 or 7600 from my RX 570 8GB but I am really concerned about the VRAM issues which kinda makes me not want to upgrade at all.
@@rideroftheapocalypse9953 4 + 4 is possible; 8 + 4 isn't, due to memory bus limitations on that card. The 7600 is 128-bit, so you can only add 8GB at a time. Adding a 4GB module just wouldn't work; it's how the card works.
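The bus constraint described above can be put in rough terms: standard GDDR6 chips use a 32-bit interface, so a 128-bit bus takes four chips, and total capacity only comes in multiples of four times the per-chip density. A minimal sketch, assuming the common 1/2/4 GB densities:

```python
# Rough sketch: why a 128-bit card comes in 8GB / 16GB steps with no 12GB option.
# Assumes standard GDDR6 with a 32-bit interface per memory chip
# and common per-chip densities of 1, 2, and 4 GB.
CHIP_BUS_BITS = 32

def vram_options(bus_width_bits, chip_densities_gb=(1, 2, 4)):
    """Possible total VRAM sizes in GB: one chip per 32-bit channel."""
    chips = bus_width_bits // CHIP_BUS_BITS
    return sorted({chips * d for d in chip_densities_gb})

print(vram_options(128))  # [4, 8, 16] -- the RX 7600's bus width
print(vram_options(192))  # [6, 12, 24] -- a 192-bit bus allows 12GB
```

Since every channel needs the same density chip, capacity can only jump in whole multiples; you can't bolt on one extra 4GB module.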
Yoo, I'm new to PCs and have been doing some research! I'm leaning toward getting a 6700 XT, as in all the benchmark and FPS comparisons I've seen, it's beating newer cards even though it's 2 years old! Do you think a Ryzen 7700 or an i7-13700K would be a better pairing?
@@adamsewell7878 In regards to whether it's better to get AMD or Intel with an AMD gpu, it doesn't really make a difference, they should both work fine and have resizeable bar/smart access memory (which is a feature you should enable in your BIOS). Performance-wise, judging by reviews they seem pretty similar, with Intel edging out in productivity, while AMD has less power consumption, and them pretty much neck-and-neck for gaming. So in the end it comes down to price, it seems the AMD CPU is cheaper, but needs a more expensive motherboard (because it's a new platform) and ddr5 ram, while Intel has some motherboards that can use ddr4. On the flip side, AMD should have longer motherboard support, making it easy to upgrade down the line. If you're in the US, I'd recommend checking out Microcenter, they have great cpu and motherboard combo deals, and I'd go with whichever you can find for cheaper. If they come out to the same price, I'd personally go with AMD just because of the longer motherboard support and lower power usage, but if you do a lot of productivity work Intel would be a better fit. Hope this helps!
@@Manu__R cheers bro, I'm pretty sure I'm set. I've gone with the Ryzen 7700X, the Red Dragon 6800 XT, 2x16GB DDR5 Corsair Vengeance, a Deepcool 360 AIO, 750W power, the B650E Aorus Elite AX, and a Kingston KC3000 2TB. I think maybe the GPU could be a bit higher spec for the 7700, but I'm happy with this. Any thoughts/criticisms on the build would be helpful and appreciated
We do our best to catch mistakes and fact check, but sometimes things slip through into production. We're sorry about that.
Corrections:
5:19 The Xbox Series X actually has 16GB of shared memory. 10GB of 320-bit memory and 6 GB of slower, 192-bit memory. The PS5 has 16GB at 256-bit. Makes the VRAM situation seem even more dire.
6:06 The 3060 Ti has just 8GB of VRAM. The 3060 12GB should be highlighted instead.
6:19 Typo in the graph, all 1440p results were tested at 2560 x 1440
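For context on why that bus-width split matters, peak memory bandwidth scales with bus width times the per-pin data rate. A quick sketch (my own illustration, assuming 14 Gbps GDDR6 on all three pools, which matches the published console specs as far as I know):

```python
# Peak bandwidth (GB/s) = (bus width in bits / 8 bits per byte) * data rate (Gbps per pin).
def bandwidth_gb_s(bus_bits, gbps_per_pin=14):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(320))  # Series X fast 10GB pool: 560.0 GB/s
print(bandwidth_gb_s(192))  # Series X slow 6GB pool:  336.0 GB/s
print(bandwidth_gb_s(256))  # PS5's unified 16GB pool: 448.0 GB/s
```

Either console pool comfortably out-runs a 128-bit PC card at the same memory speed, which is part of why the 8GB VRAM situation looks so dire.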
Please proofread your stuff
Thanks for telling us, really appreciate ❤
When Asus had a typo, GN made a video about it. We will let you go this time. 😂
@@YounesLayachi ...
I wish everyone at LTT made fewer videos and valued quality over quantity. Please Linus, I prefer LTT over GN; your video style is better for a general audience. However, this problem is becoming more apparent and annoying as an avid watcher.
"And then there's Nvidia who has transitioned from making gpus for gamers to fans of Nvidia" So true.
Physical fans or human fans?
lol
@@Juice1984 Both, considering the power consumption...
@@Jimmogon_ What? If there's anything that the 4000 series does right it's efficiency/power consumption.
@@xiuxiu1108 To be fair, you're not wrong with that. The 4090 though...
I love the fact that the charts finally point to which device they are currently talking about when doing comparisons. Highly appreciated
I REALLY want Intel's GPUs to succeed. MORE competition is a great thing- and BOTH AMD and nVidia NEED the kick in the ass.
Nvidia needs a punch in the face, AMD needs a spank, and Intel needs a gentle round of applause for competing.
They won't until they fix DX11 and the CPU overhead at 1080p. My R5 5600 is bottlenecking my A750 in a decent number of games. Throwing it into my R7 7800X3D build fixed that, and in some DX11 games it massively improved performance.
Intel GPUs caused me a nightmare today. I went to a new customer that has 10 PCs with Arc A380s installed. Every single PC booted up today and showed the desktop, including icons, but the mouse and keyboard were unresponsive. I was able to repair it by booting into safe mode, downloading the latest driver, then pulling out the network cable and removing Arc from Device Manager. I installed the new software, booted back into Windows, and thought, great, that's fixed it. Two hours after I left the client, after the PCs had gone to sleep and woken back up, the same issue recurred, so they are dumping them for Nvidia 1660 Supers.
@@aarontaylor7750 mouse and keyboard being unresponsive has nothing to do with the gpu driver.
I'd want them to succeed if their cards were actually any good. Competition for the sake of competition won't be great for customers if they're just buying overpriced trash in a choice of 3 colours
Thank you for including benchmarks for the gtx 1060/2060. There are still a lot of us looking to upgrade from low/mid tier cards of those generations.
I mean, it's equally valid at this point to look at higher-end cards within those generations. I'm seeing plenty of 2080s under $250 (within the US at least), which, depending on resolution and game, can give you a substantial uplift, more into 6700 XT neighborhood according to some sources. Of course it wouldn't be new in box with warranty; while I may be comfortable buying used (the 3080 I have now was bought "for parts or repair"), no hate to anyone that isn't as comfortable.
@@AmaraTheBarbarian old top tiers were always the way to go, but those cards use too much power. So in the long run you'll spend more, not to mention possible PSU limitations that in some cases make it even more pointless.
for reals had a 1060 3gb for 8 years and just recently got a 4090. Always felt good to not be forgotten 🥲
@@MegaSoulReaper666 From a 1060 to a 4090? Dang, that's a massive jump!
fr. i have a 1650 atm which i think is in the middle of those 2 and now i know how fucking insane of a boost im gonna get if i buy one of these (which i will). like... thats a 2-3 TIMES performance increase. with minecraft with my shaders n shit i get 40-60 fps (100 when not at my survival base and about 300 without shaders) and now i might be able to get above the refresh rate of my monitor
Love the animations that draw attention to a specific data point you're talking about in the graphs. Makes it much easier to digest at a glance (especially at 2-3x)
Hmm, I don't agree completely. I think the animation could be better; I forget which card we're talking about if the red dot is all the way at the end of the graph, and before I can trace the card name back vertically they're already talking about something else
@@bblz9171 Miles better than GN tho tbh
@@zetsubou3704 GN gives you data to analyse yourself and cross reference. It is often plain more informative than LTT but it's not meant to be a little candy to draw your eyes which is what LTT aims for. Different approaches entirely. LTT is when you want to hear opinions and review on a new product and don't want to think much. GN is when you want details and want to dig into the topic
It's like a little laser pointer
@@zetsubou3704 Tbh, for me it's easier to see which card they're referring to over at GN than those red dots suddenly appearing somewhere on the screen here..
Thanks for including the 1060 in your testings. Makes it a lot easier for me to actually see the performance leap of the newer generations
Seems like many people make this comment mainly because they want their Nvidia gpu prices to come down so they can still buy those
@@NickyNicest It's also because it was/is (IIRC) one of the most popular GPUs for a long time. So for a lot of people, like me, it was their first GPU, so it's just a frame of reference for a lot of us.
@@neuro3423 yeah, same here. i haven't kept track of video card performance for years, so i don't have a whole lot of reference nowadays for the numbers lol.
then i see the 1060 listed and i'm like "oh yeah, that's how it performs, ok, pretty decent"
Exactly! They really need to keep doing this! So many people still have a 1060 (just check the Steam hardware survey) AND the GTX 1060 6GB is so legendary by now that pretty much every gamer knows what gaming feels like on it!
@ RX 580s!!!! :P
I am actually impressed how the arc a770 managed to beat the RX 7600 in so many benchmarks!
The VRAM advantage sure is important in cases like these, but the driver updates give this card the life it was meant to have.
It's almost 100 dollars more though for the 16 gb version and 70 more for the 8 gb version
@@danieloberhofer9035 almost 100 dollars more too
But it's hard to sell second-hand; better to invest in known cards for a while.
Clearly The Last of Us marked as a DNF is not an Intel win.
I just want to say a big thank you, Linus and team, for highlighting the graphs and for adding the big red dot by the GPU being compared. Thanks for listening to us; it's been a huge improvement and makes the graphs make more sense now too, love it 😭👌
All this review did was convince me fully that the 6700xt has become probably the best price to performance on the market
Or 6950xt for higher end.
Made me feel like my 6700xt will stay in my machine for another year or two judging by these results
Rx 6800 too
you say that, but many people aren't willing to make incremental upgrades for just a little more when building. it's a mystery. They will spend >$100 more for a bling mobo and cpu combo and then pull a 3060 with 13600k pairing kinda bs
Tell that to my 6 year old 1080Ti :D
Once these mid-tier cards calm down a little, I think it would be worth having an "Intel... how are they doing now?" revisit. This gen it definitely seems they're the ones putting in some serious work, and maybe the unsung heroes? The A770 numbers in this video were surprising.
yeah the A770 seems like a great buy, but there's still a bunch of issues like the lack of support from oculus (afaik.) and some games not launching.
once these issues get fixed it'll be a great mid tier VR GPU (because stuff like vrchat needs a lot of vram).
"surprising", but again, that performance is at least somewhere it's supposed to be given the specs and price
I dunno. I think everyone wants to wait for Battlemage right now.
@@arnox4554 I’ve bought an A770 to support Intel so when the battlemage drops, it’ll be a great card
is their no-go on so many older games fixed now? That would be a massive showstopper.
Linus hires a new boss then immediately starts asking other companies to hire him
All I can say is that the 6700xt still looks like a pretty good deal
it is. im so glad i got one below $400 last year on ebay
And so with RX 5700 XT
Just bought one a few months ago for under 400, love it
Got one for $250 used hahaha it’s great
I built for my brother with it, amazing build honestly totally worth it
Last time I clicked on anything this early my 4090 cables fried.
Relatable
Ok this needs to be the top comment
"You're connecting them wrong"
I love seeing Intel GPUs in more benchmarks now. I can't wait for the 2nd-gen ones in a few years.
Agreed, the entry-level to mid GPU market is very exciting right now. Glad two companies can compete against Nvidia's BS.
@@woodtvnetwork The fact that the 4060 Ti can perform worse than the 3060 is quite the wake-up call we needed to ditch Nvidia until they get their shit together
yea bud
But Intel will suck in dx9 and dx11 for sure so I won't buy them
@@A-BYTE94 good point bud
Considering how well the A770 is doing, I can't wait for next-gen Intel. So long as they continue giving ample VRAM for their (actual) midrange cards
Yep, and both AMD and Nvidia left Intel Arc a big reserved parking spot smack bang in the sweet spot of the midrange. Drivers and game devs will make Arc a LOT better over time too. Years ago, when Vulkan/DX12 games were first releasing, a lot of the talk was that low-level APIs put much more of the onus for hardware performance on game devs, not just driver teams like DX11. CDPR has already demonstrated this: they released a patch for XeSS, and Arc's general performance even without XeSS also significantly improved in their two big games.
Even if Battlemage only gives us a generic generational leap in performance, that would be enough to beat both Nvidia and AMD, since there's pretty much no improvement from either in regard to price.
And let's not forget that benchmarks aside, on paper the A770's die is bigger than the 3070, came out with 16gb, AV1, better ray tracing than AMD, and yet Intel still managed to make it $350.
I just got a 750 for this very reason. While it's not going to be the most stable to begin with, I think Intel has the potential to oust both Nvidia and AMD in a big way with their future gen cards. I think a lot of people also forget that this is a 1st-gen card with brand new drivers that is still a very viable competitor to cards being released by companies with 2 decades of experience and drivers behind them. Lots of potential that hopefully we'll see with driver updates and the release of the Battlemage GPUs sometime next year!
@@lordhunter1202 I've been on the a770 16GB for several months and it's already getting really stable
@@chronometer9931 That's great to hear!
Intel also dropped the price of the 750 with 8GB to 199$
We are now back to square one
We have 2 companies making barely competitive cards
@@notfunny3397 I dunno, the 750 is close enough to a 770, which was close enough to a 3060 Ti. So getting a thing for $200 that's similar to a $400+ thing sounds fine.
Unless you’re just gonna say since they’re not Nvidia they suck and the cards should basically be free.
@@joemarais7683 ehh tbh, as much as I hate Nvidia they have a lot of benefits that I can't ignore, even if everything was matched at rasterization performance
Nvidia GPUs GENERALLY have better RT, better upscaling (DLSS 2/3), better efficiency, longer driver support, better screen capture, etc
I prefer AMD driver application though
@@notfunny3397 All of which are very minor advantages, CUDA is major. Don't get me started on RT, the vast majority of people don't play games that have it, and those that do, most of them, don't even turn it on even when the Hardware can run it. It's not there yet. 2x - 3x more gpu releases and it should be important, till then it's a gimmick for when you get too many FPS.
@@kiloneie "its a gimmick for when you get too many FPS"
WELL SAID!
I picked up a 6600 in October of last year for ~$200, and have been loving every second of it. I've only got 1080p panels, and I have no real qualms with it. Massive upgrade from the 1050ti it replaced!
Same here, bought mine last october too. I replaced my R9 380, similar performance to your 1050 Ti. Big jump for $200!
I got one right at the beginning of the year for only $460 🥲
@@Rokomarn how?
@@poor_youtuber1390 that was the going price before the crash (beginning of the year of 2022)
Upgraded from 1050 ti to the 6600 too.
Was about to pull the trigger on an a750 but my 450 watt psu is saying no.
No regrets with this card.
Better than novideos 3050
My conclusion for this was intel gpu’s are actually getting pretty good
I love how they kept popping up in the charts either right below or above. You can't miss how good Intel is doing lol
So true
I got one and am happy but I appreciate the validation. Even better I got it on open box for less than 300
DNF for TLOU
Still can't get an intel gpu. I play way too many classic pc games and intel still has performance hits with games older than dx11
Even a modest VRAM increase to 10GB, like the 6700 (non-XT), could have gone a long way to making this a fantastic 1080p option, as well as handling some recent games and most older games at 1440p. Then giving it a price comparable to the 6650 XT would make it sell like gangbusters.
I've got the 6700 and I game at 1440p native in all my games. Mostly maxed-out graphics settings; occasionally a game might need a little tweaking, like dropping from Ultra to High, but on the whole 10GB is enough for 1440p for the time being in new games. The 7600 is $260, but for $280 you can get a 6700; that's the no-brainer. I agree it needs to be cheaper to appeal to the widest audience of budget-conscious gamers.
The slightly wider memory bus for 10GB would also make it faster. And there'd still be the possibility to offer a cheaper version with 8GB e.g. XT and non-XT.
@@jochenkraus7016 If they don't want to double memory sizes it really is the time for odd memory bus widths, 160 bit 10GB, a few more shaders and the original price point could be a winner at the same margin. But both AMD and Nvidia planned full greed.
@@MeakerSE Yep, neither of these companies is ever going to give consumers what they need. The cost to add more VRAM is not that much. It's just about slow-rolling the specs and getting as much money as you can.
@@bigjoeangel I've seen multiple games use more than 10GB of VRAM at 1440p on my 7900 XTX. Tarkov uses 21GB of VRAM at 1440p 😂
Both these videos honestly just made me think "wow, the A770 is really coming into its own"
I bet a770 16gb version will surpass RTX 3070 by the end of this year
intel cpu : 👎
intel gpu : 👍
@@rixyrohaizieMy 13400 has been putting in work. I can't complain.
@@rixyrohaizie meanwhile:
13600k vs 7600x
13600K WINS
13700K VS 5800X
13700K SOLOS
13900K VS 7900X
13900K WINS
13900K VS 7950X3D
13900K CRUSHES
say hi to motherboard prices
So AMD realised what budget means and Nvidia forgot 😂
Forgot? Nvidia threw it out of the window during the meeting
@@845madhatterni LMAO 🤣🤣🤣
It's simply clever marketing. There's a blatant hole in the new budget GPU market right now, they would have been stupid not to take advantage of it ^^
@@845madhatterni why do you think some people call them Ngreedia?
@@peeferin9476 They are releasing 4060 for $299 though?
All I know is the 6700 xt is honestly the card I would recommend for anyone wanting a mid-range beast.
@@donkeymoo1581 sounds like you completely bought into the marketing. Who cares about raytracing? FSR is a good competitor to DLSS. You wouldn't even need it if you got amd because you'd get an actually decent card and with good price/performance
@@donkeymoo1581 I have an EV and an AC that use a lot of power. Insane power consumption is all relative, and the difference between the 6700XT and 7600 is minimal.
@@donkeymoo1581 I can barely see a differene in rt most of the time lmao.
@@donkeymoo1581 also your power usage claims are dumb; it's about the same power usage as equivalent Nvidia cards?
@@nickwilson3499 well... marketing, because the efficiency of AMD's 6000 series is actually quite OK (on par with or better than RTX 3000?). But the 6700/6750 sits at too high a TDP value (for me) and is the first >8GB model AMD offers. Nvidia at least has the 3060 12GB. (Also why I'm pissed off at the markup on the 4060 Ti 16GB model.) It's true the 6650 XT offers more FPS for less money, but it's 8GB only and I have a bit of FOMO about getting an 8GB card at the moment.
The 3060 12GB is slower, but there isn't a game where its minimum FPS is below 60 where the 6650 XT would be above 60. So the experience (to me at least) is the same; the game is locked at 60fps anyway. And DLSS is (again, for me) a feature worth paying extra for. More supported games, and the quality is superior to FSR 2.1 and below. FSR 2.2 looks nice but is only available in one or two games right now, and FSR isn't really upgradable by replacing a DLL file like DLSS is.
"8GB of VRAM is not enough for 1080p gaming"
Meanwhile me with 4GB on my RTX 3050Ti:
"Yeah, yeah, of course"
I love how the RX 6700 XT is the universal middle ground for pricing and performance this gen.
For like 300$ new cards are a joke
5700xt and 1080ti are the value kings
@@silenthill4 Completely agree.
You can get a 5700XT for around 155€, a 1080 for less than 150€ and I just bought a Vega 64 for 145€.
The 6700 XT is a really strong price-to-performance champ, coming in about 200 cheaper than the next card up, the 6800, and about 100 more than its lower-tier rival, the 6650 XT. It gives really high 1080p frames in anything, above-average FPS at 1440p, and half-decent FPS at 4K with lower settings or FSR. Excellent value card, I'm so glad I got it. It will last me until a great sale on the 7000 series or until the 8000 series comes out
my 6700XT hasn't gone over 14% usage yet and I'm playing emulated games with 4k graphics mods.
Safe to say, it's powerful enough for me and probably for most of us.
And the story holds firm:
The 6700xt is the best card on the market currently in terms of price-to-performance
Not everywhere
I'm happy with my XFX Radeon RX 6700XT Speedster Swift309 12GB card. And mine only cost me $379 three and a half months ago.
@@rajesh1ization410 my thoughts exactly. Still no 6700 XT below $480 in my country.
@@FunBunChuck Twins
@@Michal_Bauer I smuggled a used RX 6600 XT for 190+20 USD from eBay plus a local delivery service a few months ago. The ASRock Challenger is a perfect card for this since it weighs 1kg in the box.
so glad I got my 6700 XT back when it launched. I was going to switch over to the 40 series and sell my 6700 XT, but based on these charts, and my own personal experience, hands down team red wins
Its nice to see the new CVO getting some camera time, the CEO is a nice guy.
Technically he isn't that yet. It's effective, I think, in July.
But you're also totally right nevertheless
Looking forward to the RX 7700, judging by the trends of AMD's x7xx series
Should be no more than 400
The 7700 will perhaps be a 6800 XT with less TDP, etc. The 7800 XT will be a 6950 XT with less TDP, both with 16GB of VRAM
Thanks for highlighting the card in question in graphs now! It’s a small change but easier to follow along while also listening without having to pause, much appreciated!
I'm still holding out on replacing my 1060 until the next gen Arc drops. If Intel can stay at the same price point, I have a feeling it will give red and green a very competitive run for their money.
my question is, will you update to a new 1080p GPU or will you go for something above that resolution? 1440p maybe?
If that's the case, there are great GPUs already out there at a very acceptable value (especially 6750XT - 3070 GPUs).
If your plan is to get something at 1080p, don't wait that long. But if you're looking for something at 2K-4K on the other hand, get something amazing!
I wouldn't hold my breath about Intel making a new Arc graphics card.
I hope a third major competitor will force them both to lower their prices
1060 gigachad.
@@AndreyStevn I've decided to go with team Red. I'll be taking a trip to my local Microcenter next week (it's only 20mins away). Time for some upgrades, including a 5900x to replace my 1700x. Still on the fence about getting a 7600 or a 6700xt. 1080 is fine for me, as I only have room for two 24in monitors on my desk.
I bought a 6750 XT yesterday after noticing the price drop. Couldn't have been better timing I see. Excited to try out an amd gpu
6750xt is a beast on 1440p, runs up there with my buddy’s 3080 10gb on Warzone 2.0
@Randolph Ortega I've been on an rtx 2060 for 4 years now and it's not bad, but I switched to 1440 2 years ago and it's starting to show its age. Probably my last upgrade before a new rig entirely
Matching it with the AMD Ryzen 9 5950X makes it even better ! ;-)
@@HeroicVigilantu selling ur rtx2060 😅?
don't forget to do a clean uninstall of your old drivers, so that there's no potential conflict (which could lead to instability or performance issues)
THANK YOU for actually highlighting the portion of the graph the script is referring to. So many times I've had to pause, jump back, and scan through the list to find what they're referring to; that little change makes such a difference for hearing, seeing, and observing, all within the window of just watching the content.
Been very pleased with my 6700XT. Very consistent 4k* gaming experience with FSR even on some good modern titles. Only got it 2 months ago so I missed out on the price drop but it was worth it to pay $70 to have that level of improvement for the last 2 months.
I have mine running at 1440p and it works well!
Same got mine 3 months ago. Very good 1440p card
Mine is rock solid at 3440x1440 for 2 years now
Same here, got a reference 6750XT a couple months ago and I really can't ask for more performance @1080p usage.
Thanks for slotting the last minute updates in so effectively. Would love to see the RTX 3050 in the charts for future videos in this price segment.
if you are thinking of getting a gpu, please dont buy the 3050. almost every gpu in that price range beats it by a mile, it was a total flop.
@@shabanino 3050s aren't thaaaat bad
@@Shpeedle its definitely better than some cards, but just today i watched a video where it was compared with some other gpus that are around $300-$350 and it was literally half (or less) of the performance of the other ones.
@@shabanino I got the RTX 3050 during the mining shortage knowing that it was bad gaming card, specifically for editing in Adobe Premiere Pro and for up-converting video with Topaz AI. For that purpose, I don’t regret it and I would have bought a different card if it were for gaming. That said, I wouldn’t buy it now but when other much older cards that also shouldn’t be bought now are on the list, I think showing the 3050 makes sense.
I love the fact that my reason for buying the 6700 XT, that its 12 GB would be needed in the future, is already coming true.
NVIDIA really needs to take a better look at their competition.
NVIDIA is like Apple: it doesn't matter what the competition is, people will still buy it
When they really start to have competition, that is. Unless AMD and Intel really start to push upscaling techniques equivalent in quality and game adoption (plus a good level of RT performance).
The problem is that Nvidia is literally fighting in another league with its technology, and that's why it has most of the market share. Whether you like it or not, with AMD you're paying less because it offers less (they would charge the same as NVIDIA if they could really compete on the same level).
Nvidia is on the AI hype. PC gamers aren't really a priority for them at this point.
@@Frigobar_Ranamelonico I haven't kept up with things lately, but isn't DLSS still game specific? FSR works with literally anything. I like that approach much more
Idk, personally I'll be stuck with Nvidia until AMD isn't trash in VR. Regardless of price
I like that you're trying to show which GPU you're talking about with a red dot (like at 2:21), but I think a more subtle effect over a larger area would be better. For example: making the background area of the card breathe a white color, like you do with static green for the rx 7600. Or the red dot at the name of the card, or a combination
that would be logical, and work better, but isn't as cool as the red dot which is the notification thing :D
Once again impressed by the improvement in charts and graphs. Big props to whoever prepared them!
Also loved the acting outro. Love when the team goes in "low budget movie mode"
You guys really should have included the Arc A750, maybe even over the A770. It's been going for $200 lately and I find it's the most interesting competitor to the 7600
As someone who bought an RX 6650 XT a couple weeks ago, I'm happy to hear that it's still competitive, even if i could have done a little better on price or performance by waiting longer.
The 6000 series is still worth it. I just bought a 6750 XT; it's been on my radar for the last 5 months. The latest cards aren't worth much.
The 6650 is still very solid. I just got one on sale for $220 at Micro Center. No complaints whatsoever.
It really is sad just how terrible this generation of cards is. It's very similar to Nvidia's 20 series, but this time it's both AMD and Nvidia. Just absolutely piss-poor performance from both companies.
In the same spot as you, mate. I managed to pick up a 6750XT cheap on Newegg, and it's great to see just how relevant the 6000 series still is
Again, solid Arc fan and owner here, so happy to see them making sense on the charts.
Same here, would love to know why it DNF'd on The Last of Us
Good graphs with the GTX 1060! So good to see them compared to some of the old stuff that many would be still using today!
The 1060 is still the 2nd most popular card used by gamers
I had a dream that I was brought on to be part of an LTT video. It felt so real and the time crunch and pressure to perform was also very real. I feel weird about watching videos now. Just wanted to share this.
The RX 6700 series is, at this moment, a very good cost-benefit option, if not one of the best. It can run almost anything pretty well at 1440p.
The Sapphire RX 6700 10GB sold out at $279 on Newegg yesterday; only XFX is left, for $279
@@jaggsta so only the best manufacturer of amd cards then
@@RayWelpott sapphire is better or xfx? I never had xfx but my sapphire pulse 5700 was a tank. OCd to 2150mhz stable and didn't die on me
@@cosmic_gate476 I have an XFX RX 6700, the Swift 309; it handles temperatures very well, no noise at all.
Man, when I built my PC in 2017 I was able to play every game at 1080p ultra no problem with my $200 GTX 1060 3GB. The 10-series era really was the pinnacle of the PC gaming experience, and it has been going downhill ever since.
Cyberpunk is what made me upgrade from the 1060 3GB card. I barely got 30fps running around and don't even talk about the sub 10fps driving. Upgraded to a 6700XT and haven't looked back.
@@argonzeit The 6GB version wasn't faring much better either; while the frame rates were better here and there, the frame times were almost always horrendous.
Had a 1070, have a 7900xtx now
Today's GTX 1060 is the RX 6600
1 year later, driver updates removed early driver stability issues, and still running great. Outpaces the 2060 and 3060, I love the great looking games running at 1440 75Hz.
i got a rx 6600 for 200 on amazon a month or 2 ago, pretty happy. upgraded from rx 570
Good on AMD for listening to feedback and adjusting the price in a favorable direction at the last minute. It would have been ideal to hit the $250 price point, but it's a step in the right direction. Undoubtedly in the coming months it will see sales that bring it down a bit, making it even more appealing and affordable for a mainstream budget card. The 8GB of VRAM isn't such a big deal at this price point either (as opposed to the abominable 4060 Ti 8GB at $400, which is inexcusable).
I'm expecting Black Friday sales.
@@JoeStuffzAlt Even before then, but yeh also definitely by Holiday Season.
Actually, it didn't even take a month. Amazon is selling it, along with RE4, for $260, with a coupon, at least in the US. I don't know whether your "future transmission" was "serious" or not about it being a no brainer, but it's at that price, now.
It's kinda sad to see that we still are getting 8GB in 2023 even on a somewhat budget card.
Gotta love the fact that Intel cards just don't finish benchmarks; they're like Nicholas Latifi in F1 last year.
I hope Intel intensifies development for Arc, extra competition in the GPU market could make a huge difference.
Why the dig on the 🐐
@@glk0728 I still lmao on the wrong turn at the Japanese GP and "the car, real strange"
Great idea with the red dot on the graphs; otherwise it's sometimes hard to find what the host is talking about at the moment without pausing the video.
AMD just became the first GPU manufacturer to produce a card that promotes the price/performance of its own last gen. What a time to be a PC gamer
Nvidia did the same. The 4060ti performs basically identical to the 3060ti if you remove the 'fake' frames
I'm glad I went with AMD this year for my GPU. Missing a few Nvidia exclusive features but I'm still pretty happy with it. I expected to lose more features but AMD has a lot of their equivalent features.
Well done! Yes, it's difficult to break away from what you're used to (probably one of the reasons people keep going with Nvidia even when they're bending you over publicly lol), but you'll feel great knowing you got a much better deal! And there's not much you can't do on AMD now that you can on Nvidia, unlike a few years ago. They've come a long way :)
AMD's "features" are not equivalent and they fucking suck. FSR is just terrible.
Could anyone check if the replay feature from AMD still gives a separate audio file when splitting audio tracks? Back when I used an AMD GPU it did that and it was annoying
@@BUCCIMAIN you can set whether you want them separate or not in ReLive.
what in particular do you miss? Asking for Intel *g*
Need more future Linus informing us to tune into the WAN show, same bit time, same bit channel.
Got that marked in my calendar. Hopefully this means AMD saw the writing on the wall and is gonna make some waves with their pricing, and maybe, just maybe, it will force Nvidia's hand. I'm not gonna hold my breath, but one can hope.
It'll force Intel, but I doubt it for Nvidia. They have a huge following in the GPU market. I'm just glad AMD caught up with performance a few years ago.
I got my 6650XT for $180 after MIR. Unbeatable card at that price
Wish we had those prices. Never seen the 6600 under 230€.
@@richardyao9012 a750 is garbage
@@iAMaReaperGotprobZ A750 is an excellent card, quit smoking crack lol
I have one of the XFX RX 6600XTs. Got it used and it still rips at 1440p, but very excited to see what happens in the Radeon space
AMD & Nvidia in 2023:
2018 levels of performance
2013 levels of VRAM
Lol funny comment but that's not true. In 2013 the GTX 770 had 2gb of vram and the competition R9 280x had 3gb
2016 levels of performance!!! The GTX 1080 with 8GB of VRAM from 2016, at $499, has the same VRAM and is about as fast as the 3060 8GB, which was also selling for $499 at the beginning of 2023
@@cosmic_gate476 Yeah, 2016 is when things started to stagnate on VRAM. The 1080 Ti had the same VRAM as the 2080 Ti, and the rest of that lineup was in the same boat, with the only cards that had any decent amount being the Titans. The 30 series did it AGAIN with the 3090, with only tiny boosts on the 3080 12GB and 3080 Ti over the previous gen. Even 12GB is already looking like it's the new 6GB, with a handful of games using well over 12GB at 1440p if you give it to them. 16GB should be the new mid-to-upper-range standard going forward, with 32GB for the high end. I'm honestly surprised we haven't seen more VRAM on more cards by now, but then we all know Nvidia has to keep those margins up at the expense of gamers, right? It's honestly fucking pathetic how stupid Nvidia thinks we are, treating us like this; that extra VRAM would still leave them with impressive margins and we'd all be way better off for it
@@TheOriginalFaxon 6700XT has 12GB, to me the obvious mid-range choice right now at $360. That's the bare minimum for me - my 1080ti 11GB is almost always maxed out. Looks like I'm keeping it around for another couple years at this rate, cause I need CUDA so I can't switch to AMD.
@@cosmic_gate476 They're referring to the low vram amount issue that the 700 series had which caused them to age poorly similar to this generation. He obviously doesn't mean that the rtx 4070 literally has 2gb vram. 2gb of vram in 2013 is like 8gb today.
Love your videos Linus!!!!!
Keep on going!😁
Linus still dropping things in 2023.
I bought a 6700 XT for $350 a few months ago and all these GPU reviews are making me so happy with my purchase.
Me too. It's the perfect card to hold me over until a great sale on this generation's higher-end cards. For the majority of games I play, it's more than enough. It's hilarious taking all the old games I used to struggle to run and maxing settings at 4K with no problem
The 6700 XT is a weak card for today's games
Yep. 6700xt is very solid 1440p card.
@@mjkittredge Maybe in a couple of years we can buy that 4090 for $200-300 and play 1440p games forever.
@@jp4361 even used I doubt it. What's the cheapest 2080ti or 3090 out there? I think even in a few years the 4090 will still be an excellent 4k card and massive overkill for 1440p.
The minor change of highlighting the graph points you're speaking about make them a lot easier to follow. Nice touch, LTT!
Damn, these AMD and Nvidia GPU reviews lately are actually making me consider grabbing an Arc GPU even more. I'm surprised to see the A770 outperform the RX 7600 and come within striking distance of the 4060 Ti. Considering this is Intel's first gaming GPU and it was released just last year, it really puts team red and green to shame. If they keep up those driver improvements, it's gonna cause some major disruption in the GPU space. I can't wait to see the 4050, I mean 4060 non-Ti, benchmarks lol.
I was set on finally upgrading my 1060 to a 4070 this gen, but am so pissed off that I’ve gone and bought an A770 just to support Intel so they can come out with a great card next gen.
@@dearlove88 Bruh, I feel you. Ideally I would get a 4070 as well, but that $600 price tag is insulting. As a creator and a gamer, the A770 is looking like the next real option, as its performance is on par with anything else in its price range, plus the 16GB of VRAM and AV1 encoding are a real bonus for me. That said, stability with apps like Resolve and Blender is a big concern for me.
@@Xero_Wolf I thought it was doing really well in blender?
@@dearlove88 great choice brother
For a similar price, the RX 6700 XT also looks great. It can run most games at ultra 1440p, almost certainly has enough VRAM for future 1080p gaming, and is similar enough to a PS5/Xbox Series X in performance.
The 6750xt from MSI is on sale right now for $330 on Newegg. The Intel A750 is only $200. They may be going down even further now that this card is released.
You should do a video about the best bang-for-the-buck GPUs at the moment! Maybe cover all the different categories, from low end (1080p) through semi-high to high end (4K) gaming. Not only the current generation but past ones too.
5:00 Me from my 4 GB card: yeah yeah whatever *keeps playing 1080p just fine*
With either older games or lower detail settings. Nobody is saying you can't reduce quality to reduce VRAM usage.
Would be interesting to see how it holds up against the RX 6700 10GB non-XT, as that was marketed as a 1440p GPU.
Hope that I will still live till the point when grandpa Linus will sit in the rocking chair, near the fireplace and tell stories from the past
Decided to go with a 6700 XT because of my budget, and I've been blown away by how good it's been for most of my gaming
I hope the 7700XT is coming soon, because I want a 1440p card and also AV1 for Linux.
The red dots on the graph highlighting what your talking about is a great improvement, keep doing that!
It seems that GPUs as a whole got too powerful in the last generation, to the point that even the lowest-end chip could put up a fight at 1440p and 4K, so now they're limiting VRAM instead to force each GPU tier into its respective resolution
Nobody should be gaming at 1080p. It's not 2012
@@covvy tell that to everyone who’s games either on steam deck or on the old family tv
@@covvy people aren’t that rich bro
It's weird that the 3070, being an 8GB card, is running the way it's supposed to, and there are few if any VRAM problems for my use cases. In certain demanding tasks, yeah, the lack of VRAM is a slight issue, but not that noticeable since I went from a 1050 Ti to a 3070 after the crypto crash.
The 3070 has been shown to have VRAM issues as well. You're just not running the new AAA games at the settings where they appear. Most AAA games coming out in 2023 (aside from Dead Island 2) are having some sort of VRAM issues at launch, especially if you run your 3070 at the resolution you should have purchased it for, 1440p or 4K, or maybe even in some titles at 1080p maxed with ray tracing enabled. There was a review site that tested a specially produced 16GB 3070 (or a Quadro/A-series equivalent) against the regular 8GB 3070, and the 16GB card was miles better in most cases.
@@AdaaDK Also, sometimes the game automatically downgrades textures when VRAM runs short, so it might look fine to some people who don't realize their textures are effectively 720p.
I think your use case is the important thing here.
It really depends on which games you play. Take Hogwarts Legacy: the RTX 3060 12GB is certainly slower than the 3060 Ti 8GB I have, but the non-Ti 3060 still outperformed the 3060 Ti in that game purely due to having more VRAM. In every other game I've played so far, the 3060 Ti is way better.
There was a test between the similarly priced 3070 and 6700 XT, but the 6700 XT has 12GB, and it hands-down beat the 3070 in ray tracing in new titles
You can tell Linus isn't much of a car guy when he's using the civic for his analogy. I may have grown up with V8s fueled by hate, but even I'm cautious of a civic. A junkyard turbo makes those things come to life.
I've seen a lot of varying performance recently depending on the game and use cases, so I just gave in and bought a 6700xt, and I'm pretty sure it'll be enough to last me quite a while.
Looking at the specs, the 7600 seems identical to the 6650 XT aside from a small clock speed difference in favor of the 6650 XT and a very small memory speed difference in favor of the 7600.
That means the improvement we're seeing with the 7600 comes purely from architecture tweaks.
Wow! The graphs are amazingly easy to understand now. Kudos ltt labs
Micro Center is currently having a 6650 XT sale. Just $219.99, in-store only. One heck of a deal though. Not to mention the 6600 XT, which I am seeing as low as $199 online.
If anything, the launch of these new cards made it the PERFECT time to get a truly budget card, especially if you have an older system like me (2020). I just replaced my 5600 XT with a 6650 XT and the leap in performance was astounding. I was shocked to see clock speeds near 3000 MHz, where my old 5600 XT barely hit 1800. Not to mention I do lots of VRAM-intensive work, like VFX. It's nice to get a couple extra gigs of VRAM after dealing with 6GB for several years.
(Yes, I know cards like the 6650 aren't the best for productivity. But for the work I do, it is fine. Not to mention I am a poor college student LOL).
It was only 3 hours ago that I was thinking about a new GPU and scouring the internet for AMD's next move with the RX 7 series
Hey thanks for putting that red dot in there. In the past often times i wasn't fast enough to find the right bar when you were talking about the performance difference.
Whoever edited this video or whoever came up with the idea for red dots on those graphs- insane major kudos.
I always end up looking at those graphs, start thinking, and tune out the audio (low comprehension skill) because I can't take in the info within the 5-6 second slide. That red dot helped me identify what you guys were talking about right away instead of going googly-eyed looking at numbers
May not seem like a lot, but hey, free upgrade
And im gonna be looking for those comments in november 😂
It is nice to see that Linus has a renewed energy on camera, since he announced he is stepping down from the role of CEO. I love it. Glad to see our NCIX kid back. 😁
The 3070 Ti draws a terrifying amount of power; if you're upgrading to one, you might even have to buy another power supply on top of it
Not sure how I feel about the ads being further in the video rather than right after the intro... Now I don't know when to skip them.
I know it is older (and was not the most available card to get at the time), but I wish the Radeon VII made it into these discussions about needing 16GB of VRAM.
I use a 1440p monitor, and just don't know if we have hit the point where I really need to bother upgrading from my VII or 5700XT cards.
Me: *gets 2 notifications and checks my phone*
The notifications:
- LTT: “AMD, you need to hire me!”
- JayzTwoCents: “AMD’s NVIDIA Smack Down!”
Me: “Oh, this is gonna be great”
GN: “Everything is bad and the world is on fire!!! 😂
@@MonolithStudiosMelbourne fire? It's bbq time!! -Guga foods
Same here lol
Hub: everything is DOA 😅
We are confused 🔫⚰️
@@MonolithStudiosMelbourne Yeah but they're not wrong 😂
As a data nerd, I'd hope additional information gets included in the FPS charts. Say, when you're discussing the importance of VRAM, stating each card's VRAM amount right in the FPS chart would help us quickly grasp your point
At some point the price will be slashed further but even at its current pricing... it just doesn't have enough VRAM to justify the cost. Best to get a 12GB VRAM card for a bit more like the 6700 XT.
It's fine if you don't stream or have a 2-4K display with other stuff running alongside the game window. But let's face it, this crowd has 1440p monitors or at least a dual 1080p setup by now, since the screen real estate alone is a game-changer. Hardly anyone I know games on a single 1080p display. Which means, yep, you need 1-4GB more to handle the side stuff/2nd display/extra desktop. Now, Windows long ago decided that you will never "run out" of memory, but that means shuttling the extra off to your SSD. Ouch. Same as running Windows 10/11 on 8GB of RAM: you can do it, but 12GB+ is where they're happy.
@@josephoberlander I'm pretty sure I replied to you in another comment lol. You need to learn the difference between VRAM and RAM. It doesn't take 1-4 GB to do the side stuff. I do agree however the 8 GB is not near enough, especially when you consider the next gen games coming over the next year and beyond
@@josephoberlander Thing is, it most likely won't have any resale value. Some games already need more than 8GB, so by the time you come to sell your RX 7600, it will be obsolete even for 1080p.
@@thepcenthusiastchannel2300 True. This is a stopgap measure. But when faced with $400+ "street prices" for anything good, this at least gets you up and playing games.
Looks like RX 6700 XT is still the best overall mid-to-high end choice, considering current price, performance, and VRAM.
@@insertnamehere4419 6700xt - midrange, 4060ti - entry level (building a computer for the first time).
As a an American welder I highly appreciated the ending
Glad I bought the 6650 XT back on Black Friday for $200. It has yet to get back to that price point and it even beats the 7600 in some benchmarks. I am kind of surprised that the 7600 is really not a performance boost whatsoever compared to the previous gen. It could be the drivers need to be tweaked for the new cards still, but neck and neck with old tech is not good.
So it's a 3060, two years after the 3060 was released... but "future proofed" with RDNA3 and AI tech that doesn't work yet.
Yeah this card is trash lol
At no point could you get a 3060 for that price, so no.
@@TrueThanny Only because of the crypto boom so get real lol
@@chronometer9931 No. Ever. Even now. Just go look at current prices.
the "incoming transmission from the future" bit made me chuckle so hard! and these repositioned sponsor spot segues are keeping me on my toes when i watch you guys. keep it coming! 😍
Would have loved to see older cards like the 5700xt vs. this one.
I have a 5700 XT as well and still runs strong even on latest games at 1440p, I just look at the 6600 XT since it performs about the same as the 5700 XT.
@@HeadphoneMAN0017 I have one of those fancy 49" wide monitors and it's a bit of a struggle to get 120fps in apex legends. I'm just trying to figure out how much of an improvement I'd get, but it looks like I'll have to look elsewhere.
@@STEVEH0LT What resolution is it? Apex Legends is quite heavy honestly, especially on high settings.
@@HeadphoneMAN0017 5120x1440. Samsung Odyssey Neo g9. On low settings I hit maybe 130fps, but it will drop down to like 94 sometimes. I use the overclock tool and that helps keep things consistent. I'd love a video card that could run it at 240hz, but I don't want to spend that much money lol. My wife won the pc and monitor, so I'm trying to keep the price as close to zero as possible lol.
I thought about upgrading to a 6650 or 7600 from my RX 570 8GB but I am really concerned about the VRAM issues which kinda makes me not want to upgrade at all.
me too. think im just gonna keep my rx580 for now
A 6700 non-XT or 6700 XT, or better yet a 6800 for 16GB, could be sufficient. Look up the prices on them; they've fallen quite a bit
Grab a 6750 XT or 6700 XT
You could try going for a second hand 1080ti
The 6700 is the same price as this card with similar performance, but has 10 gigs
That original outro was legit. Keep up the good work
hopefully a 12 or 16gb variant launches at some point
A 12GB version is impossible because of the bus width, unless they make the bus smaller.
OK, I am not an expert in that regard, but just saying. Saphire also made a 8gb rx 6500 xt even tho the original one had only 4gb.
@@rideroftheapocalypse9953 4GB + 4GB is possible, but 8GB + 4GB isn't, due to the memory bus limitations on that card. The 7600's bus is 128-bit, so you can only add capacity in matched 8GB steps; tacking on a lone 4GB module simply wouldn't work. It's how the card is wired.
OK, I did not know that
@@rideroftheapocalypse9953 At least now you do...
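To make the bus-width point in this thread concrete, here's a rough sketch (my own illustration, not from the video; it assumes standard GDDR6 chips with a 32-bit interface in common 1GB/2GB densities, and "clamshell" mounting of two chips per channel) of which capacities a given bus width allows:

```python
# Rough sketch: VRAM capacity options implied by a GPU's memory bus width.
# Assumes one 32-bit GDDR6 chip per memory channel, chip densities of
# 1 GB or 2 GB, and optional "clamshell" mode (two chips per channel).

def vram_options(bus_width_bits):
    channels = bus_width_bits // 32            # 32-bit interface per chip
    options = set()
    for density_gb in (1, 2):                  # common GDDR6 densities
        options.add(channels * density_gb)     # normal layout
        options.add(channels * density_gb * 2) # clamshell layout
    return sorted(options)

print(vram_options(128))  # 128-bit (RX 7600): [4, 8, 16] -> no 12 GB option
print(vram_options(192))  # 192-bit: [6, 12, 24] -> this is where 12 GB lives
```

Which is why the thread's conclusion holds: on a 128-bit card, 12GB can't be reached with uniform modules, only 8GB or a clamshell 16GB.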
With every new GPU review that comes out, I get more and more happy that I bought a 6700XT at a nice price.
Same here, but with a 12gb RTX 2060! XD
Yoo, I'm new to PCs and have been doing some research! I'm leaning more toward getting a 6700 XT, as all the benchmark and FPS comparisons I've seen have it beating newer cards even though it's 2 years old! Do you think a Ryzen 7700 or an i7 13700K would be a better pairing?
@@adamsewell7878 In regards to whether it's better to get AMD or Intel with an AMD gpu, it doesn't really make a difference, they should both work fine and have resizeable bar/smart access memory (which is a feature you should enable in your BIOS).
Performance-wise, judging by reviews they seem pretty similar, with Intel edging out in productivity, while AMD has less power consumption, and them pretty much neck-and-neck for gaming.
So in the end it comes down to price, it seems the AMD CPU is cheaper, but needs a more expensive motherboard (because it's a new platform) and ddr5 ram, while Intel has some motherboards that can use ddr4. On the flip side, AMD should have longer motherboard support, making it easy to upgrade down the line.
If you're in the US, I'd recommend checking out Microcenter, they have great cpu and motherboard combo deals, and I'd go with whichever you can find for cheaper. If they come out to the same price, I'd personally go with AMD just because of the longer motherboard support and lower power usage, but if you do a lot of productivity work Intel would be a better fit.
Hope this helps!
@@Manu__R cheers bro, I’m pretty sure I’m set. I’ve gone with ryzen 7700 x, the red dragon 6800 XT, 2x16gb ddr5 Corsair vengeance, deepcool 360 aio, 750w power, b650e aorus elite ax, Kingston KC3000 2tb. I think maybe the gpu could be a bit higher spec for the 7700 but I’m happy with this. Any thoughts/criticisms on the build would be helpful and be appreciated
@@adamsewell7878 So you went with the 6800xt, instead of the 6700xt? Seems like a solid build, enjoy it!