Whoever had the idea with the blinking dots next to the GPU names is a genius. This helps so much with following and understanding what is being said by Linus.
@@chrisbullock6477 This has nothing to do with being smart, but with presenting data in a thought-out, visually understandable way. Good presentations don't vomit data on you, but piecemeal it out with visual aids to make following the flow of the meeting easier. Because that makes a meeting with data quicker and more efficient. Which saves money. And one should always be able to see the value in that. Granted, if they can use basic logic. Emphasis on can. 🙃
I appreciate the red dot that was used for the spec comparisons. I don’t know if this was something that was changed due to production break but it makes these comparisons substantially easier to follow along with at home.
Now I wish they colored the names of the graphics cards, or perhaps the graph bars, by team red/green/blue so I don't have to pause every time and read everything. They could also probably do something to make the generation of the card intuitive without having to read the names.
Found the graphs in this review significantly better than previous ones. The rounded bars with more space between, along with the highlighting and indicator dots were excellent. Love to see it, and actually able to see it. A couple graphs were on screen for very short timeframes, but I could have paused it if I really cared about that data. All in all, an excellent review in my opinion.
Probably the new hire! As a scientist, I agree that I would minimize/reduce the amount of rounding at the corners, since graphs should never compromise information for appearance (having a flat section with just the corners rounded would be fine, imo).
I'm 80% sure they marketed the 7800 XT as a 6800 (non xt) replacement but unfortunately they shot themselves in the foot this generation with the naming scheme
Yeah, it seems like most of the cards had their naming bumped up a tier, leading to misleading results. The 7900 XTX should have just been the 7900 XT, the 7900 XT should have been the 7800 XT, and the 7800 XT should have just been called the 7800. Nvidia isn't doing much better. The 4070 Ti is only named such because of consumer backlash to its originally planned "4080 12GB" naming. This likely pushed the performance of the 40-series mid-range cards down further. The 4060/Ti in particular are really bad compared to their predecessors. They seem more like xx50-series replacements to me.
@@mitchellhalter4 Also funny how the 4060 uses the xx7 SKU which was afaik always the SKU for the 50 cards (except the 950 which used the GM-206). So I guess the 4060 Ti should have been the 4060 and the 4060 should have been the 4050/Ti.
I agree, this is also a case of "not being named right," like the RTX 4080 12GB was. It's the same chip as the RX 7700 XT, so why not call them RX 7700 and RX 7700 XT? Or at least 7700 XT and RX 7800. Or 7800 and 7800 XT, since they share the same chip.
Nvidia is doing better, because they did it totally on purpose; no room for "oh, it's just a mishap". The 4090 is a great card, but expensive. Then there's the 4080, close to 2/3 of the 4090's performance for less money, but still expensive for what it gives. And it has only 16 GB VRAM. Then they tried to push the bar even lower with the 4080 12 GB, and only when faced with backlash from the community did they decide to rebrand it. Everything else, though, follows the down-trend, basically being off by one to half a tier. That's why the only case where I may consider Nvidia right now is power efficiency; the 4000 series excels in it. Otherwise, big middle finger from me. AMD doesn't seem to be as scumbaggy, but they aren't pure white for sure. And their cards are mediocre in performance, which the launch of these two just confirms. Fewer cores gen to gen is a "fuck you" move to customers. If you cannot keep up, don't call your products the same way. It's like being a diamond manufacturer, releasing generation after generation, but at one point you release coal instead. Still carbon, sure, but it ain't the same thing. Change the scheme or rebrand fully. Otherwise you are misleading your customers. Price helps to mitigate this, but it's almost like painting grass green. So yes, most cards from both sides should be tiered lower. Or just be a decent, real uplift gen to gen. @@mitchellhalter4
Super super impressed at 4:17 when the audio changes for three seconds. Clearly there was a re-recording or correction here. This is awesome! It means the new processes are working.
Have to say, I'm impressed with the new graphs, much better than before; the dots that follow the script really help to see the data. Before, it was hard for me to follow both what was said and the graphs. Thumbs up on actually making things better.👍
Definitely a nice added touch. Before I would have to rewind or pause if I wanted to place what was said on the graphs. If this is what we can expect in the future, I'm here for it.
It would be nice to see idle power draw figures in the reviews. For someone who uses their PC a lot and only occasionally games it's really important in figuring total system power consumption since the GPU is idle more often than not.
My electric bill was 67 dollars in a southern July summer with a window AC on at 68, 24/7, for a month. I fail to see how expensive this could be when 1 kWh is like 14 cents.
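For anyone curious, the arithmetic is easy to sketch. Here's a quick back-of-the-envelope calculation in Python (the 10 W idle figure and the 14 cents/kWh rate are just example numbers, not measurements from the video):

```python
def annual_cost_usd(watts, cents_per_kwh):
    """Yearly electricity cost of a constant load running 24/7."""
    kwh_per_year = watts / 1000 * 24 * 365
    return kwh_per_year * cents_per_kwh / 100

# ~10 W of GPU idle draw at 14 cents/kWh works out to roughly $12 a year.
idle_cost = annual_cost_usd(10, 14)
```

Even a 40 W idle delta between two cards would only be a few dollars a month at that rate, though it adds up fast in regions where electricity costs several times more.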
I hope I'm correct in assuming that my 6900 XT is less efficient than the new-gen GPUs, but that card idles around 8W and sometimes shoots to 20W max when just using my PC without GPU-dependent stuff like games. The 20W spike only sometimes happens when watching 4K YT. Otherwise it fluctuates between 8 and 12W.
Gotta love AMD's deals with included games. Built a PC in 2019, with a Ryzen 5 2600 and a Vega 56, got The Division 2, REmake 2 and DMC 5 for free with the hardware.
It confused me too, especially since the cards are ranked based on their average fps and not 1% low. If 1% low is so important, then why not use that to rank the cards?
I appreciated the corrections you guys definitely added in post. You can hear it occasionally, where the microphone has clearly changed, yet the dialogue seems to be fluid. I look forward to when you guys are able to get this down to a science and make them less apparent, but I will take this over ignored mistakes any day.
@@ben_cav Could also be due to a completely unnoticed ambient noise causing the mic or software to record it differently, or editing in post that makes it _just_ different enough to notice after the fact.
@@ben_cav I'd say it's mostly down to the distance between the mic and the speaker (or the speaker's mouth). In this case, the inserted recording didn't have much bass, so it's either recorded with a phone mic or from a distance with a professional mic. Or maybe it's AI, who knows?
@@danielkelsosmith They deleted videos, didn't they? That tanks Social Blade numbers. Then there's also the fact they didn't upload; every channel's stats are driven by the huge initial numbers on new videos. If that stops, that also tanks the comparative numbers, which is what Social Blade shows.
The new graphs are such a big improvement! I like that you included fewer gpus in this video. It makes the graphs easier to process quickly. You can really see the extra attention to quality control, and it's greatly appreciated.
THANK YOU for correcting information with audio too, not only visuals. I heard the edits and I appreciate it - as I was only listening to your presentation the first time. The second time I watched the visuals too. The improved graphics are much appreciated too.
The dot appearing near the GPU models in the charts when they're mentioned by voice is a very good guide to find them immediately and compare the values at the same time. Whoever came up with that should be congratulated; this kind of tiny detail reduces fatigue for viewers when watching multiple videos one after the other. 🎉
This video clearly shows the importance of the production break you took, and is significantly better than your earlier reviews. I am very happy to see how effective this change has been for you guys, and if the recent uploads are anything to go by, it seems like you are having more fun with the videos too! Great work guys, I really enjoyed this video.
IMO the fact that this review came out many hours after the embargo lifted, shows me that they wanted to make sure they got every detail right, especially with that weird 8GB VRAM anomaly that they retested 10x for.
I think Jay brought it up, but AMD may be doing it because they intend to replace the "non-XT and XT" monikers with the "XT and XTX." Maybe they've found it's causing confusion to be able to have both a Ryzen 7 7800 and an RX 7800 being released right around the same time.
@@CanIHasThisName Happened to me 3 weeks ago. I was preparing a sheet of which cards to buy and putting all the specs into it (50 learning stations for video editing). It had the price of the RTX 4070 but the values of the RTX 4070 Ti in it; now it's corrected and the RTX 4070 Ti is inserted as well. But the price of the 7700 XT and the 7800 XT is compelling, or the A770. With the RTX 4060 you don't get enough value for the money, especially not enough bandwidth and VRAM.
If that's what AMD has planned, then this is a HUGE fail on their end from a marketing perspective. Again, if that's the actual situation, this should be one of the first things documented in the information sent to reviewers.
I like the innovation in the videos after the introspection period, it's refreshing to see that you guys actually care about this community and want what's best for it.
@@Sam-fq5qu And yet you're still effectively helping them by watching. Idgaf if they're "doing it for the money". Are you that childish to think they're the only ones? Or that Gamers Nexus or any of the others aren't?
@@ActuallyAwesomeName You can still make money and take pride in your work, LTT didn't and only changed after they got called out, still didn't take the hint, then finally the penny dropped. Not to mention lying about Billet labs.
Love the improvements all around (tracking red dot being the favorite). I wish you included at least one GPU that performed better than the one you are testing so we knew how far behind it was from the next highest option, without that it’s a downward view and very difficult to get the full picture of performance.
6950xt is the next one. It's around 14% faster than the 7800xt in 15 games average. Check your local pricing, new and used and see if it's a good deal.
Good idea! And yeah love their overall cleanup! It looks very professional. Glad to see Linus is still in the picture and hope they will also have some "goofiness" to some of their episodes, like they used to. ^^
They did mention this is their first post-hiatus review video, and they did have a deadline. I expect that these kinks will be ironed out in the near future!
Guys, that was a fantastic first hardware review after the incident. Like others, I heard the corrections and I really appreciated it. Also, I didn't care that you had to reduce the number of cards tested to ensure quality was there, it was just important you included a good selection and I think you did that with the mandatory previous gen cards and then adding in the 2070 for an older gen comparison. This is a great first review video back. Oh and those indicators you added during graph discussions was a fantastic idea to ensure the viewer is following along with the presenter. Good content, well presented.
While the change in audio for the corrections was obvious, I appreciate that they were done in audio and not just on screen, so I don't keep hearing the incorrect info while I'm working away on another screen. I like the improvement.
450€ for a Sapphire RX 6900 XT Nitro+ SE (Toxic/6950 XT board) in Germany, bought about 5 months ago :D Used but looks mint; a quick repaste+repad and it performs like a brand new card. The undervolting potential is insane on this particular card.
if the 4070 had 16 gigs of vram it would be an easy pick for me but with just 12 I can't shake the feeling the 7800XT would be the better buy in the long term
I just bought a 7800 XT for my first PC build. I've been having doubts though, because I've seen reviews that say, when running a dual monitor setup, one of the two screens green-screens. From what I've seen via forums, people are saying it's not a hardware issue but a software issue; something to do with the Adrenalin software running the GPU at 2565MHz with two monitors plugged in... when the card is rated for a max of 2430MHz. Someone I saw said lowering it down to the rated 2430MHz fixed the issue. Someone else said they switched from the Adrenalin software to Afterburner, uninstalled the GPU drivers using DDU, and reinstalled in "driver only" mode, which also fixed the crashes. I just wanted to bring attention to this issue people have been having with the 7800 XT so that anyone who wants to run a dual monitor setup is aware. I personally am still deciding on a dual monitor setup... leaning more toward an ultrawide 1440p monitor (because of my intended usage of the PC for gaming and university). I also want to see tech youtubers talk about the issue so I can hear their views on it, possible fixes, and so on. PLEEEAAASSEEE LINUS MAKE A VIDEO ON IT 🥹
I'm among the users who are facing issues with a 7800 XT :) I will RMA the card next week. Frankly, no idea if it's a hardware or software problem, but I've definitely tried 10000 different solutions and am still having the green-screen problems like in the video. Mine always crashes after a cold boot, with every possible driver and Adrenalin software combination :D FYI: whenever the green screen happens the system crashes too, the PC reboots, and then I'm able to play for hours with no problems... Really a shite.
A massive step in the right direction with the new graphs, much better than before; the blinking dot really makes it easy to follow. I like this review. Good job Linus :D
I think the biggest take away is how well AMD RDNA 2 aged. If RDNA 3 has the same generational advancements they'll be significantly ahead of the 6800xt/6700xt
I really appreciate the new color scheme in the graphs. It clearly draws your attention to the 1% lows (because that's really all that should be important to you), and the average is there too in case you need it. Cool, thumbs up to that.
@@RatRugRug 1% lows are basically a frame-rate stability indicator; average FPS doesn't really tell you much. If, over a duration of, say, 5 seconds, your FPS fluctuates between 40 and 80, your average is 60 fps, but the actual experience of jumping between 40 and 80 would obviously be absolutely awful. That's what the 1% low statistic would show you: you can see that average FPS is 60, but the 1% lows would show something like 45 FPS. I hope I didn't explain it too terribly.
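If it helps, the idea can be sketched in a few lines of Python. This assumes the common convention of averaging the slowest 1% of FPS samples; benchmarking tools differ in exactly how they compute it, and the numbers below are made up:

```python
def one_percent_low(fps_samples):
    # Average the worst 1% of samples (at least one sample).
    count = max(1, len(fps_samples) // 100)
    worst = sorted(fps_samples)[:count]
    return sum(worst) / count

# A run bouncing between 40 and 80 fps has a misleading average of 60,
# but the 1% low exposes the stutter.
samples = [40, 80] * 100
average = sum(samples) / len(samples)   # 60.0
low = one_percent_low(samples)          # 40.0
```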
The voice over segments used a noticeably lower-quality mic but I still really appreciate the fact that you're taking the time to go back and make corrections/additions
Really appreciate the 2070 Super being included in the charts. It's the card I'm rocking, but I never see it in benchmarks for new GPU reviews, so I usually have to do a lot of cross comparisons to see the relative performance.
Have a normal 2070, but yeah, it's been able to keep playing 1080p on high settings to this day. Maybe I'll think of upgrading to 4k gaming in a generation or two.
I had EVGA's 2070 Black OC, and yeah it was a workhorse for a number of years. I recently gave it to my friend (along with an 8th gen i7) so that he could play Street Fighter 6, and it hit the recommended specs perfectly for 1080p, so despite all the crap it got back in the day, that card really shows its worth, especially with DLSS. That said, I recently upgraded to a 4070ti (also a card that's gotten a lot of crap) and oh boy, what a difference it made. I can count on a bare minimum of max settings 1440p with ray tracing on every new release, but in most cases (like with Armored Core 6) I can usually get 4K max everything at an unbreakable 60+. I haven't even had to bother with DLSS yet. You're definitely in a perfect place to feel a massive improvement once you upgrade.@@calebsmith7179
@@asheeshkumar3946 Card and CPU should go fine together, especially at 1440p. You may get some CPU bottleneck in specific titles at 1080p, but it won't be a significant problem for you.
I thought about this, as well. I'd like to see how day 1 launch drivers on the 6800XT would stack up to current drivers. When it launched, did the 6800XT... suck more? AMD products usually age like a fine wine so I'd be interested in seeing that.
QC prioritization shows, and it shows big time in this video. I am happy to see the change, because the attention to details and the delivery is excellent! As far as the cards go, nah, my 3090ti still chugs along more than fine, but if I was shopping for something, that 6800 is hard to ignore. Thanks for a great vid!
With a 3090 Ti, man, you've got a long road ahead. I have to wait to get a new GPU. I'm really thinking about getting the 7800 XT, but still not sure. I've been on team green for so long.
Imagine even thinking about replacing a 3090, that's mental LOL. It's understandable if people still use a 2070/2080 or 3060/3070 or even a 4060. Fair to say you're still good to go for some years, mate. For me it's about time now with my 2070 and i7-8700K (I built the system in 2018). Sadly this socket doesn't have any new upgrades to offer on my motherboard, so I need a full rebuild, but I think I'll start with just replacing the 2070 with the 7800 XT plus a new PSU, and then some months later, once I've bought RAM, board, and CPU, I'll do a full new build.
@@mavrickabb I was in the same boat and I pulled the trigger on a 7800 XT about a month ago. I'm disabled, so my PC gets used way more heavily than the average gaming PC. I probably have close to 200 hours on it so far and I've had zero issues of any kind. It's been every bit as reliable as the 2080 I had in my system before. I game, live stream, and occasionally video and photo edit using Shotcut and darktable. No issues to speak of. I've had a game crash one time, but I think it's because Hogwarts Legacy is still a tad unstable.
Seems like I'll stick with the 6800 XT. Still curious how it handles VR games, though. The graph updates are awesome, with the dots being very helpful for drawing your attention to what's being talked about. The coloring is also nice: more defined, and the bars stand out more from each other. Keep up the good work, and I hope you guys add VR games soon.
@unholydonuts It makes sense if you're trying to play the latest and greatest games at the highest resolution and frame rate possible, but seeing as there are very few people doing that, I do agree with your statement for the majority of people.
@@the_undead But it doesn't, because the highest frame rate only matters for esports titles lmao. You will never notice the fps in newer games when they already look/feel great. (Unless you got an unfinished game.)
@@vaderglenn I never said people trying to play the latest and greatest games at the highest quality and frame rate are logical, I simply stated there are people who do this. And I do agree that realistically speaking, buying the best hardware every generation is a waste of money
It actually bothers me when people act like this is a dramatic improvement over their previous reviews. I would agree the presentation is better, but quite frankly that's the biggest improvement. There is no easy way to confirm what the data accuracy was before, but if you want me to be completely honest, I don't think the accuracy was bad enough to justify having less than half as many GPUs in these reviews. But that's what they basically have to do because of people in the community who quite likely over exaggerated a data accuracy problem, which as far as I can tell at least was nowhere near as bad as it was made out to be.
@@bolbi914 The only thing in that first video that I found truly egregious was the fact that it wasn't monetized. That kind of move is what I would expect of a politician who is convinced they're going to lose the next election unless they basically destroy the career of their opponents, not what I would expect of a tech YouTube channel when another YouTube channel has made some mistakes. Something else that's interesting: at some point, Gamers Nexus said once LTT grows beyond a certain size (I forget how Steve worded it) he will start treating LTT like any other company. The fact that he was so aggressive and pulling out the moves of politicians tells me one thing: Steve is only favorable to companies that pay him to be favorable. And seeing as LTT has not paid him anything and is a threat to his channel, he took his chance to attack LTT in the hopes that it would destroy the company. TLDR: Steve is just as corrupt as basically any Russian politician and should not be trusted for anything.
Video suggestion: make full videos about playing games at the highest settings with the best computer components. Btw, that could be a series or whatever YouTube calls it!
I never leave comments on videos, but beyond the new graphics and the dot on the cards (which helped a lot), I feel like this video wasn't rushed and Linus had time to talk and explain things. I was able to read the slides, and it wasn't just a bunch of information thrown out at a really fast pace where it seemed the priority was to get things out of the way rather than actually inform. Really happy with the quality of this video, and I really appreciate the tone of seriousness instead of a lot of jokes and little content about the product.
Nice work on improving graph readability and improving things inside LMG! Maybe one small remark: around the 12-minute mark you compare two power consumption graphs, the first with Nvidia vs AMD cards, then with AMD vs older-gen AMD cards, and the 4 colors used are the same, while two of those colors actually refer to different cards in graphs that are shown one right after the other! Maybe keeping one color per card throughout a video would be an even further improvement :).
Really surprised with productivity results considering the 7800 has nearly double the FP32 performance of the 6800. Hopefully, there will be more testing in this area.
How am I just hearing about the resizable bar?!?!?! I have a 3080 (for years!) and this never popped up on my radar. Now I'm back to finish this video as I went down that rabbit hole to update everything and get it enabled.
First video I've watched since the hiatus - maybe it's just placebo, but it seems like things were less rushed. Hopefully the LTT changes keep these quality improvements going forward!
@@sirseriously "Way above"? Lol, no, they are at the same level. And the 3080 is 10GB only, so it's pretty bad. The 7800 XT is on par with the 6900 XT, at least the factory-overclocked models.
As someone who bought the 6950XT XFX 319 a few months ago, I would have liked to see where it comes into play with these newer AMD cards. I personally am over the moon with it, I only use it for gaming and RT doesn’t bother me. Couldn’t be happier
I'm still happy I bought the 6800 XT earlier this year. Must have been a good time to get one. The price was as low as it still is (and it was available!), the performance is still strong and no new GPU in this price point actually made me feel bad for making the decision to get one. Not even the 7800 XT even though it might have a little benefit in both RT and power efficiency.
@@lordcommander3224 It was, but for a $20-30 saving it's not such a great option today. Yes, performance is about the same, but newer gen is newer gen: better driver optimization, some new tech, and probably two more years of support once you put it on the second-hand market or give it to someone. Not worth saving a few dollars. I'm surprised so many reviewers skip over these facts.
New graphs look amazing! Also the highlighting and flashing dots to go with what Linus is talking about is fantastic - makes these much easier to follow.
Needed the upgrade, and 3070s and 3080s are still 500 plus in my area. So I got the 7800 XT Red Devil. Other than some driver issues, it has been good so far.
Note: Not "all" AMD GPUs come with Starfield. Some retailers are not offering this deal. I purchased two Gigabyte 7800 XTs from Best Buy at 12:15am on launch day. Starfield was not offered as a bundle with any of the new 7800/7700 SKUs.
I think I would actually really like to see these graphs sorted by 1% lows. Then you are showing not the peak performance, but putting the emphasis on the smoothness and playability.
The issue is that in edge cases, it could shuffle up the order of GPUs far too much. It could become very confusing to follow, especially in video format. Most people want to compare GPUs in the same performance class.
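To illustrate the shuffling concern with made-up numbers: sorting the same cards by average FPS versus by 1% lows can produce different orders, since a card can win on average while stuttering harder.

```python
# Hypothetical rows: (card, average fps, 1% low fps). Not real benchmark data.
rows = [
    ("Card A", 97, 74),
    ("Card B", 94, 78),
    ("Card C", 92, 70),
]

# Rank descending by average, then descending by 1% low.
by_average = [name for name, avg, low in sorted(rows, key=lambda r: -r[1])]
by_low     = [name for name, avg, low in sorted(rows, key=lambda r: -r[2])]
# Card A leads on average, but Card B moves to the top when ranked by 1% lows.
```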
This really felt slowed down and thought out of a review, and I greatly appreciate it. Made me feel like you really wanted me to comprehend what was going on, not just the sentiment of the product like previous reviews. If this is the newer processes at work please keep it up!
In all honesty the reason the 6800XT looks this good is mature drivers. Give the 7700XT and 7800XT about 6-8 months to get themselves together through driver revisions and it will be an excellent choice for a future upgrade. It may also be cheaper by then.
Well, to be honest, the naming is off for this card. In CUs it matches the 6800 non-XT, and the starting MSRPs were near each other. And it's also a Navi 32 chip instead of a cut-down Navi 21, so in that regard the 6700 XT is a closer match. They should have called this the 7700 XT and named the 7700 XT the 7700, but marketing...
That's an issue caused by marketing, not the products themselves. AMD decided to add a new highest tier to bring back the XTX naming, and it consequently pushed everything up a tier:
7900 XT -> 7900 XTX
7800 XT -> 7900 XT
7800 -> 7900 GRE
7700 XT -> 7800 XT
7700 -> 7700 XT
The generational increase in performance is there, but unfortunately it's in line with the price increase. I suppose messing up the naming was the way to raise the actual gen-over-gen MSRP. Reviewers tend to focus a lot on MSRP, as it's a nice number marketed everywhere, but it's the real-world retail pricing that truly matters. It is highly variable based on region and availability, so it's pretty much impossible for reviewers to evaluate. For example, I got my 7900 XT in March far cheaper than my friends got their 6800 XT/6900 XT/3080 a year prior during the shortage. Every reviewer under the sun was saying how much better value the 7900 XTX is, but for me the 7900 XT was 900€ and the XTX was 1200€.
I love seeing the proper voice-overs for mess-ups, the redone graphs, and the little details; this video is considerably higher quality than what you guys have done previously. Really nice to see, hope you guys can keep it up!
You can check whether Resizable BAR is active for your GPU by going to the NVIDIA Control Panel, clicking System Information in the bottom left corner, and seeing if Resizable BAR has a YES next to it.
Is it just me or does the writing for this video feel a lot better than previous day 1 reviews? There seems to be a bit more humor, or human flare to it rather than sounding like a spec sheet read or an AI analysis of graph data. Good job LTT, looks like that hiatus to self reflect has had a positive impact. Keep going!
Awesome review. It followed a logical path, provided clear data with clear discussion points and arguments, and no errors I could see other than the weird audio dub switching. Definitely a clear step up in quality video wise.
Good on you for noting (and none of the other review sites mentioned this either) that the 4070 did what it did with about 100 watts less power. I replaced my 1080 with one this summer and haven’t had to replace anything else (notably power supply)
Exactly. People are so wrapped up in the price-to-performance ratio that they forget about reliability. In this video alone, Linus brought up repeated computer crashes while playing games. Nvidia, for all of its exorbitant pricing, makes much more stable hardware. If you can afford it, go for an Nvidia card once you're past the $500 GPU budget.
@@trippinhard250 Well, it hurts to shatter your illusion, but Nvidia's hardware is much more unstable/hazardous than ANY other hardware on the market XD What YOU may be talking about is software, which (fyi) can be easily replaced at any time.
@@trippinhard250 My 3070 has reliably failed to manage a consistent 60 fps in modern titles because it's being throttled by a measly 8 gigs of VRAM :)
I don't know that it's a new LTT. It's still the exact same LTT but now they have more unity and better processes and it absolutely shows in this video.
@@jdude2005 Yeah, I was talking about how most of their videos were about hooking up industrial car-sized fans to cool a PC, or something pretty dumb that none of us would ever do. Now I hope they get back to more real, everyday things that we the viewers will encounter lol
Those focus dots are a welcome addition - I pretty much always listened instead of look at the graphs, so this type of guidance what's talked about actually makes me read the graphs.
@8:59 That base clock for the RX 7800 XT doesn't feel right. How's the new QC going? The stated base clock is even lower than the RX 6800's. Mr. Google tells me it's 1,800 MHz, not 1,295 MHz.
We've confirmed the 1295MHz base clock through our reference card, and through searching third-party specs. You can find that info here: linustechtips.com/topic/1530236-amd-radeon-rx-7700-xt-7800-xt-review/?do=findComment&comment=16129576
For the 1% low graphs: if possible, it might make them a little easier to understand if the max gradient (i.e. red) was pinned to the right side of the slide. So the top card with the highest 1% lows would get full red, but the card with the lowest 1% lows might only reach yellow or orange. Otherwise, thanks!
I'm really happy they went for a longer-form review here. The last few product reviews have been pretty short. Like others have noted, that was probably due to LMG trying to push out as many videos as possible. It's actually nice getting a little more in-depth with products like this.
remember when we were shipped properly optimized products and if someone wanted to get 5% more performance for 50% more power draw they could do that themselves? nowadays we're getting severely overclocked nonsense with still underperforming cooling and no way to run without these weird external power cables... that's a mess
@@Z4KIUS very poor logic, sorry. If you want to limit yourself to a quarter of today's performance you have sub 75w options, leave the rest of us alone 👍
Wow! I can definitely see improvements in your GPU reviews. I can say that it's not just entertainment but also very important info for buying decisions. Keep it up mate 👍👍👍
The crashing part is interesting to me, since my 7900 XTX also crashes in what I will call a 'consistently random' way in DX12 games. Perfectly fine in DX11 tho. I'm *really* hoping it's a driver bug and not a faulty card, since it's already been replaced once and it's watercooled so a real pain to take out and replace.
A lot of people have been having issues with crashing lately on the XTX, mine included, I chased ghosts trying to fix it and no dice. Only clue is the clock speeds spike to 3000+ and it crashes the driver
Hope you get it sorted out :( Got an XTX myself and 2 of my friends have it. Haven't had a single crash since I got it (1.5 months), and I haven't heard of any issues from my friends either.
It took AMD almost two years to gain enough experience to deliver what they had already delivered. That shows how far ahead of its time the RX 6800 XT was in its field.
AMD just misnamed this product. It seems to be more like a 6800 (non-XT) on a new process node (by TSMC). So the improvement by AMD... hmm, this looks bad. This is not a new gen, this is just a new process node.
@@innocentiuslacrim2290 And the fact that it's a whole new chip architecture maybe ? Edit : And with chiplets, a first in a commercial GPU i am pretty sure
@@attilavs2 who cares about "new chip architecture" if it produces almost ZERO benefits for the user. This really is a rather embarrassing GPU gen for AMD.
@@innocentiuslacrim2290 It's at the same price, for slighlty/decently better performance, and that's with poor launch drivers. And the chiplets mean they will be able to make large gpus for cheaper, and it limits the cost of lowering node size. Next gen you will see the advantages.
@@attilavs2 ah promises promises. Unfortunately as a user of GPUs I have to say that this "gen" was not really a gen at all. Just rebranding of old products with maybe a minor facelift. What was it this time? 10% improvement after 2-3 years of R&D? Basically all of that probably coming from the improvements that TSMC did?
My opinion on raytracing is that I'd rather have higher FPS with smoother gameplay and less input latency than it on. Except for scenic, slower paced games, but then most Radeon cards handle that 'well enough' since the fps matters less. AMD still have to catch up but luckily for them, raytracing is still somewhat of a gimmick and still absent in a ton of games coming out today... Definitely recommend picking up a Radeon card nowadays, especially with Nvidia's insulting pricing and leaving their gaming market to collect dust while they rake in cash from AI
I don't know, it's going to depend on pricing really. I know that for US$499 it's a really good deal compared to a 4070 if you don't care about RT, but here in Australia there's no price difference between the 7800 XT and the 4070, so I'd rather get the card with DLSS and better RT capabilities for the same price. The performance difference in raster isn't significant enough in enough games for me to really care when you consider that Nvidia also gives you access to tools like RTX Remix, and better drivers (though AMD is better than they used to be, Nvidia is still ahead). As for RT being a gimmick: in modern games, mostly yes. But older games? There I can see a massive market. RTX Remix has given modders the tools to give old games a massive facelift with RT, not to mention the built-in texture upscaling. So here in Australia at least (and I'm sure in plenty of other countries with varied pricing) the 7800 XT feels like a bad choice across the board.
@@Kyle-yn5hy I get what you're saying, and for certain niche markets it makes sense. Though I've heard a lot of bad things about RTX Remix, stories about how it removes the charm from older games or makes them actually look worse. And texture upscaling is not a new thing; modders have been doing AI texture upscaling and modding it into games for a while now. A lot of the reason people buy Nvidia cards is the reliability, the drivers and the tech. The issue I see in the future is that Nvidia is rumoured to be spending less and less on their gaming division, not giving them enough resources to, for example, get decent drivers out for Starfield. This could be a huge upcoming issue for anyone wanting to stick with 40-series cards.
@@TheRealAstro_ What I meant by the upscaling is that it's all done from within the Remix editor; it can do entire scenes at once, without requiring the modder to extract the texture, use another piece of software to upscale it, then inject it back in. I am aware that modders have been using AI upscaling for a while now, but it's not integrated like it is in RTX Remix. As far as it removing charm from older games, I think the same can be said for any mod that alters the graphics of a game. I find some Skyrim texture mods to be worse than the base textures of the game because they feel out of place, and I found Deus Ex: Revision added too much clutter to its scenes and made it harder to navigate without a map. That kind of thing comes down to how a modder chooses to use their time and the technology available to them. Will some RTX Remix mods look bad? Yes, but so do some regular mods. It's all about how RT is implemented, and it can look amazing; just look at their Morrowind showcase or Portal RTX, both of which were made with RTX Remix, albeit by professionals. I think for the same price, the 4070 is a better buy, even just for DLSS, which is still the more popular option for developers despite FSR being available on more cards. On a 4070 you can use both FSR and DLSS (best of both worlds), and while it might be unfair to Team Red after they generously open-sourced FSR and made it available on any GPU, at the end of the day we're consumers and we have to pick the option that provides the best value. Based on the majority of the reviews I've read, the 4070 and 7800 XT trade blows in raster, and the games that AMD does win big in I personally don't play much (like F1, for example). Otherwise it's fairly narrow margins in raster, and in RT the 7800 XT is still blown away.
Definitely my take on it. Currently playing the Witcher 3 and my 7900xt can just about manage 60-80fps at decent settings with raytracing but it's a deal breaker when I've got a 240hz monitor and I get a smooth 150+ without raytracing.
@@Kyle-yn5hy I agree with your takes for sure, at the same price (if you want a new card) you should probably go for a 4070 unless the games you enjoy most play better on the 7800XT.
I actually was looking to upgrade from a 1660s and happened to look into the Radeon RX 6000-7000 series over Nvidia options like the 3070 and 4070, solely based off of price and performance. This review would help greatly in finalizing my decision, or maybe patiently wait for a deal when market prices drop lower.
Hey guys, at the 9:09 mark there's a typo stating the RX 6800 has the same gen 2 ray tracing cores as the RX 7800 XT. This should be gen 1 for the RX 6800, as that series was the first gen of cards to have ray tracing tech.
This video has inspired me to grab a 7800xt. Just placed the order and eagerly awaiting a finally well priced and well spec’d card. The 16gb of vram was the knockout punch for me!
@@yunacuddles not much. It really doesn't like Wallpaper Engine; it crashes every time the computer goes to sleep. Other than that, it crashed on me once early on and I had to reinstall my drivers. The card was newer and that was to be expected. All in all it's still not painless, but it's definitely a helluva lot more stable than the last AMD card I used.
Still nice to see the 6700 XT holding up well. Got mine new a few months back. An excellent card for the money. Was a Nvidia user (GTX 1070) but really enjoying using a full AMD system.
I think this was the best written review I've seen LTT produce. Loving the post-break changes! Lots of obvious stuff with the graphics, but some subtle changes with the writing and supplementary storytelling around the testing experience was really good and important.
Keep the colors consistent between comparable graphs! Between the Kombustor power consumption graph at 11:47 and the F1 23 graph at 12:07, the 4060 Ti and 4070 switched colors (blue and orange), and it makes it look like the 4070 draws less power playing F1 23 than in Kombustor.
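The fix this comment asks for amounts to one fixed series-to-color mapping reused by every chart. A minimal sketch (the names and hex values here are placeholders for illustration, not LTT's real palette):

```python
# One authoritative palette, defined once and shared by every chart,
# so a given GPU never swaps colors between graphs.
GPU_COLORS = {
    "RTX 4070": "#1f77b4",
    "RTX 4060 Ti": "#ff7f0e",
    "RX 7800 XT": "#d62728",
}

def bar_color(gpu_name, fallback="#999999"):
    # Unknown cards get a neutral fallback instead of reusing a taken color
    return GPU_COLORS.get(gpu_name, fallback)
```

Every chart then calls `bar_color(name)` instead of assigning colors per chart, which is what prevents the blue/orange swap described above.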
I really appreciate the pick-up VO audio to have correct information, but it would be great if the quality/tone of the VO audio matched what is recorded on-set. It is a bit jarring to hear tinny Linus pop in and out. I love the new graphics. They make it much easier to follow what the presenter is saying, especially when discussing multiple cards and comparing them. The bit of extra editing time really makes this content shine and helps add to the evergreen quality of the videos.
Thanks to an older LTT video I've bought a Sapphire Nitro+ 7900 XTX, and honestly, for 4K gaming (without RT, which I really don't care about) performance is pretty, pretty good. I've always been a team green user, but this time around, thanks to the cursed prices of Nvidia's GPUs here in the EU, I've turned to team red. At the moment I'm playing Starfield at 4K at a rock solid 60 FPS and I couldn't be happier. So thanks Linus for making me switch sides; my gaming experience is really astounding for a decent price paid (1049€ for a top custom GPU).
The 7900 XTX is the third fastest card in RT. I don't understand people saying RT isn't usable on AMD cards. False information persists in people's memory these days. Of course, I also think the loss of performance isn't worth it.
Whoever had the idea with the blinking dots next to the GPU names is a genius. This helps so much with following and understanding what is being said by Linus.
yep, absolutely thought so as well
wow, you must of been a real special kid in school.
@@chrisbullock6477 Must of?
@@chrisbullock6477 This has nothing to do with being smart, but presenting data in a thought out, visually understandable way. Good presentations don't vomit data on you, but piece meal them with visual aids to make following the flow of the meeting easier. Because that makes a meeting with data quicker and more efficient. Which saves money.
And one should always be able to see the value in that. Granted, if they can use basic logic. Emphasis on can. 🙃
@@anton2497 he meant he smelled musty, like beans
must of bean
/s
I appreciate the red dot that was used for the spec comparisons. I don’t know if this was something that was changed due to production break but it makes these comparisons substantially easier to follow along with at home.
Now I wished they colored the names of the graphics cards or perhaps the graph bars by team red/green/blue so I dont have to pause every time and read everything. also could probably do something to make the generation of the card intuitive without having to read the names.
they used to have a yellow box for highlighting
it's a welcome change, but I hope those comparison charts get a bit more refined, they still look a bit overwhelming to me
They have been a thing for a little before the break, so I don't think it's a change that resulted from it :/
No more pausing on the graphs. It's such a simple but effective idea!
Found the graphs in this review significantly better than previous ones. The rounded bars with more space between, along with the highlighting and indicator dots were excellent. Love to see it, and actually able to see it. A couple graphs were on screen for very short timeframes, but I could have paused it if I really cared about that data. All in all, an excellent review in my opinion.
Spacing and labeling are good, but rounding the ends obscures the length of the bar. Is there a benefit?
yeah, they were visually appealing to look at, unlike in the last couple of reviews
A little extra care with the review process is noticed. Glad to see you are making the changes you promised and it wasn't just talk.
@@stalbrecht visually appealing, and the end of the bar tells the number anyway
Probably the new hire! As a scientist, I agree that I would minimize/reduce the amount of rounding at the corners, since graphs should never compromise information for appearance (having a flat section with the corners rounded would be fine, imo).
It's a little more than a year later and i believe that the 7800 xt and 7700xt absolutely have become the value monsters everyone wished for
I just picked up a 7800xt off newegg for 459 after the $10 promo code and I'm stoked. Going from a 4060 to that will be a big jump
I'm 80% sure they marketed the 7800 XT as a 6800 (non xt) replacement but unfortunately they shot themselves in the foot this generation with the naming scheme
Yeah, it seems like most of the cards had their names bumped up a tier, leading to misleading results. The 7900 XTX should have just been the 7900 XT, the 7900 XT should have been the 7800 XT, and the 7800 XT should have just been called the 7800.
Nvidia isn't doing much better. The 4070 Ti is only named that because of consumer backlash against its originally planned "4080 12GB" naming. This likely pushed the performance of the 40-series midrange cards down further. The 4060/Ti in particular are really bad compared to their predecessors. They seem more like xx50-series replacements to me IMO
Pricing match up might work except in the UK the 6800 is still about £430 compared to £480 for the cheapest 7800xt
@@mitchellhalter4 Also funny how the 4060 uses the xx7 SKU which was afaik always the SKU for the 50 cards (except the 950 which used the GM-206). So I guess the 4060 Ti should have been the 4060 and the 4060 should have been the 4050/Ti.
I agree, this is also a case of "not being named right", like the RTX 4080 was. The 7800 XT uses the same chip as the RX 7700 XT, so why not call them RX 7700 and RX 7700 XT? Or 7800 and 7800 XT, since they share the same chip.
Nvidia is doing better, because they did it totally on purpose; no room for "oh, it's just a mishap". The 4090 is a great card, but expensive. Then there's the 4080, which gives close to 2/3 of the 4090's performance for less money, but is still expensive for what it offers. And it has only 16 GB of VRAM. Then there was the attempt to push the bar even lower with the 4080 12 GB, and only when faced with backlash from the community did they decide to rebrand it. Everything else follows the down-trend, basically being off by half a tier to a full tier. That's why the only case where I might consider Nvidia right now is power efficiency; the 4000 series excels at it. Otherwise, a big middle finger from me.
AMD doesn't seem to be as scumbaggy, but they aren't pure white for sure. And their cards are mediocre in performance, which the launch of these two just confirms. Fewer cores gen to gen is a "fuck you" move to customers. If you cannot keep up, don't name your products the same way. It's like being a diamond manufacturer, releasing generation after generation, but at one point you release coal instead. Still carbon, sure, but it ain't the same thing. Change the scheme or rebrand fully. Otherwise you are misleading your customers. Price helps to mitigate this, but it's almost like painting grass green.
So yes, most cards from both sides should be tiered lower. Or just be a decent, real uplift gen to gen.
Super super impressed at 4:17 when the audio changes for three seconds. Clearly there was a re-recording or correction here. This is awesome! It means the new processes are working.
Yes mate, I noticed that too! I felt a disturbance in the force and rewound. Seems pretty good!
@@rigf1997 To be honest, I'd rather a tiny bit of audio weirdness yet have CORRECT information!
The new graphics are amazing and the blinking dot really makes it easy to follow the story of the review instead of a screen filled with bars
Have to say, I'm impressed with the new graphs, much better than before. The dots that follow the script really help to see the data; before, it was hard for me to follow both what was said and the graphs. Thumbs up on actually making things better.👍
Definitely a nice added touch. Before I would have to rewind or pause if I wanted to place what was said on the graphs. If this is what we can expect in the future, I'm here for it.
@@kombat200 same
Gamers Nexus fan boys will be studying them closely for even the slightest of errors to go running back to Steve.
The LABS watermark is cringe
@@Heinz-bx8sd why's it cringe? It shows that it's Labs-certified data
It would be nice to see idle power draw figures in the reviews. For someone who uses their PC a lot and only occasionally games it's really important in figuring total system power consumption since the GPU is idle more often than not.
My electric bill was 67 dollars in a southern July summer, with a window AC set to 68 running 24/7 for the month. I fail to see how expensive this could be when 1 kWh is like 14 cents
@@djnone8137 you might not believe it, but some people don't live where you live
@@djnone8137 Not everyone has 14c/kWh electricity, you know.
@@djnone8137 35 cents here bro. Stop expecting everyone to be as lucky as you are right now. That's called entitlement
I hope I'm correct in assuming that my 6900 XT is less efficient than the new gen GPUs, but that card idles around 8W and sometimes shoots to 20W max when just using my PC without GPU-dependent stuff like games etc. The 20W spike only sometimes happens when watching 4K YT. Otherwise it fluctuates between 8 and 12W.
Gotta love AMD's deals with included games. Built a PC in 2019, with a Ryzen 5 2600 and a Vega 56, got The Division 2, REmake 2 and DMC 5 for free with the hardware.
I liked that in the graphs the 1% lows are coloured and on top, indicating it's a more important stat than average fps
Lol funny, i have the opposite response. Both metrics are important to make a good decision. Greyed-out average wasn't clear enough.
@@bartbroekhuizen5617 agreed. Avg is more important to me
@@bartbroekhuizen5617 yeah, a less vibrant colour other than grey would have been nice I guess
It confused me too, especially since the cards are ranked based on their average fps and not 1% low. If 1% low is so important, then why not use that to rank the cards?
@@ms-dosman7722 1% lows are important, just not as important. The reason they matter is to see consistency
I appreciated the corrections you guys definitely added in post. You can hear it occasionally, where the microphone has clearly changed, yet the dialogue seems to be fluid.
I look forward to when you guys are able to get this down to a science and make them less apparent, but I will take this over ignored mistakes any day.
I had to rewind to make sure I wasn't losing it with the audio changes , this is much better than the text on screen method
It sounds a lot like an ElevenLabs AI voice. Which would mean corrections could be made without even needing Linus
@@ben_cav Could also be due to a completely unnoticed ambient noise causing the mic or software to record it differently, or editing that in post makes _just_ different enough to notice after the fact.
Why does this comment thread sound like AI....
@@ben_cav I'd say it's mostly down to the distance between the mic and the speaker's mouth. In this case, the inserted recording didn't have much bass, so it was either recorded with a phone mic or from a distance with a professional mic.
Or maybe it's AI, who knows?
Thank you folks for all the hard work you've put into improving. The new style of graphs were great. I appreciate it!
Lmao wdym Thanks? That's the least they should be doing after all the shit they have done lol
Meat rider
It’s a shame their numbers are lower than they’ve been in years. Taking a look at their social blade and it’s a YIKES!
@@danielkelsosmith That was to be expected tho, it will recover but it will take time. In the end this is a good thing for them.
@@_aullik True. Time will tell
@@danielkelsosmith They deleted videos didn't they? That tanks socialblade numbers.
Then there's also the fact they didn't upload; every channel's stats are driven by the huge initial numbers on new videos. If that stops, it also tanks the comparative numbers, which is what Social Blade shows.
The new graphs are such a big improvement! I like that you included fewer gpus in this video. It makes the graphs easier to process quickly. You can really see the extra attention to quality control, and it's greatly appreciated.
THANK YOU for correcting information with audio too, not only visuals. I heard the edits and I appreciate it - as I was only listening to your presentation the first time. The second time I watched the visuals too. The improved graphics are much appreciated too.
The dot appearing near the GPU models in the charts when they're mentioned in the voiceover is a very good guide to finding them immediately and comparing the values at the same time. Whoever came up with that should be congratulated; this kind of tiny detail reduces fatigue for viewers when watching multiple videos one after the other. 🎉
yeaaa that is super nice for me since I get easily lost with all the numbers
This video clearly shows the importance of the production break you took, and is significantly better than your earlier reviews. I am very happy to see how effective this change has been for you guys, and if the recent uploads are anything to go by, it seems like you are having more fun with the videos too! Great work guys, I really enjoyed this video.
That shook them alright. I'm glad they are doctoring their own content more.
And it just looks like their old reviews when they were doing them from the house.
IMO the fact that this review came out many hours after the embargo lifted, shows me that they wanted to make sure they got every detail right, especially with that weird 8GB VRAM anomaly that they retested 10x for.
I think Jay brought it up, but AMD may be doing it because they intend to replace the "non-XT and XT" monikers with the "XT and XTX." Maybe they've found it's causing confusion to be able to have both a Ryzen 7 7800 and an RX 7800 being released right around the same time.
This is very plausible. I've encountered people mixing up the XT and non-XT cards and even search engines can get confused about it.
@@CanIHasThisName Happened to me 3 weeks ago. I was preparing a chart of which cards to buy and putting all the specs into it (50 learning stations for video editing). It had the price of the RTX 4070 but the values of the RTX 4070 Ti; now it's corrected and the RTX 4070 Ti is inserted as well. But the price of the 7700 XT and the 7800 XT is compelling, or the A770. On the RTX 4060 you don't get enough value for the money, especially not enough bandwidth and VRAM.
@@CanIHasThisName the 7600 is a nightmare for this
I planned on getting an RX 7600 and I have a Ryzen 7600 😂😂😂
If that's what AMD has planned, then this is a HUGE fail on their end from a marketing perspective. Again, if that's the actual situation, it should be one of the first things documented in the information sent to reviewers.
I like the innovation in the videos after the introspection period, it's refreshing to see that you guys actually care about this community and want what's best for it.
Stop..
They only care about their own image and revenue.
@@Sam-fq5qu and yet you're still effectively helping them by watching. Idgaf if they're "doing it for the money". Are you so childish as to think they're the only ones? Or that Gamers Nexus or any of the others aren't?
@@ActuallyAwesomeName You can still make money and take pride in your work, LTT didn't and only changed after they got called out, still didn't take the hint, then finally the penny dropped. Not to mention lying about Billet labs.
The red dots on charts to follow what Linus is saying is a fantastic addition, please keep doing this in future reviews!
Love the improvements all around (tracking red dot being the favorite). I wish you included at least one GPU that performed better than the one you are testing so we knew how far behind it was from the next highest option, without that it’s a downward view and very difficult to get the full picture of performance.
I was thinking the same thing. 7900xt would’ve been nice
6950xt is the next one. It's around 14% faster than the 7800xt in 15 games average. Check your local pricing, new and used and see if it's a good deal.
Good idea! And yeah love their overall cleanup! It looks very professional. Glad to see Linus is still in the picture and hope they will also have some "goofiness" to some of their episodes, like they used to. ^^
They did mention this is their first post-hiatus review video, and they did have a deadline. I expect that these kinks will be ironed out in the near future!
Linus is built like a rare coin collector
I'm in tears holy shit 😂
Investing In this comment
Dbrands new skin
Bro what does that even meannnn😭😭😭😭😭😭
r/rareinsults
Guys, that was a fantastic first hardware review after the incident. Like others, I heard the corrections and I really appreciated it. Also, I didn't care that you had to reduce the number of cards tested to ensure quality was there, it was just important you included a good selection and I think you did that with the mandatory previous gen cards and then adding in the 2070 for an older gen comparison.
This is a great first review video back.
Oh and those indicators you added during graph discussions was a fantastic idea to ensure the viewer is following along with the presenter.
Good content, well presented.
While the change in audio for the corrections was obvious, I appreciate that they were audio and not on screen while I hear the incorrect info while I'm working away on another screen. I like the improvement.
I got my 7800xt today and it's perfect, an unbelievable upgrade from my previous 6700xt. So far I'm really happy with it
I waited 4 years to finally put a more powerful GPU in my PC and bought an rx 6800 below $500. Looking for deals nearly the entire time. Glad I did.
you should have got the 6800XT
yeah, I just bought a used rx 6800 xt for just 298 dollars
Why didn't you spend $100 more and buy the much more powerful 6950 XT?
6800 is such a great card. Even the power consumption is pretty decent.
450€ for a Sapphire RX 6900 XT Nitro+ SE (Toxic/6950 XT board) in Germany, bought about 5 months ago :D Used but looks mint; a quick repaste and repad and it performs like a brand new card. The undervolting potential is insane on this particular card.
I love that you guys added visual cues to follow along on the charts as Linus talked about each spec.
if the 4070 had 16 gigs of vram it would be an easy pick for me but with just 12 I can't shake the feeling the 7800XT would be the better buy in the long term
I also thought about getting the 4070 but AMD is so tempting now
Power consumption isn't great. And I also don't like how they simply priced the 7700xt just to force people into buying the 7800xt.
@@username8644 wait until you find out about global marketing
So glad the 4070 is not using the 12VHWPR connector
Tbh I think by the time you need all 16gb you would have already upgraded to the next best thing lol
“This inconsistency (in names) makes intergenerational comparisons extremely difficult for consumers”
This is on purpose
I just bought a 7800xt for my first pc build. I've been having doubts though, because I've seen reviews saying that when running a dual monitor setup, one of the two screens green-screens. From what I've seen on forums, people are saying it's not a hardware issue but a software issue: something to do with the Adrenalin software running the GPU at 2565MHz with two monitors plugged in, when the card is rated for a max of 2430MHz.
Someone I saw said lowering it to the rated 2430MHz fixed the issue. Someone else said they switched from the Adrenalin software to Afterburner, uninstalled the GPU drivers using DDU and reinstalled in "driver only" mode, which also fixed the crashes.
I just wanted to bring attention to this issue people have been having with the 7800xt, so that anyone who wants to run a dual monitor setup is aware. I personally am still deciding on a dual monitor setup... leaning more toward an ultrawide 1440p monitor (because of my intended usage of the pc for gaming and university).
I also want to see tech youtubers talk about the issue, to get their views on it, possible fixes, and so on.
PLEEEAAASSEEE LINUS MAKE A VIDEO ON IT 🥹
I'm among the users facing issues with a 7800xt :) I will RMA the card next week.
Frankly no idea if it's a hardware or software problem, but I've definitely tried 10000 different solutions and I'm still having the green screen problems like in the video.
Mine crashes always after a cold boot, with every possible driver and adrenaline software combination :D
FYI: whenever the green screen happens the system crashes too, the pc reboots, and then I'm able to play for hours with no problems...
Really a shite.
A massive step in the right direction with the new graphs, much better than before; the blinking dot really makes it easy to follow. I like this review. Good job Linus :D
I think the biggest takeaway is how well AMD's RDNA 2 aged. If RDNA 3 sees the same improvements over its life, it'll end up significantly ahead of the 6800 XT/6700 XT
Me still rocking my 1080 Ti; I feel like nothing is worth the upgrade
There are many things worth the upgrade, entirely dependent on the games you play.
If you're still using a 1080 Ti then there are PLENTY of options that would be worth the upgrade, IMO
I finally had to cave this year. I was using a 1070. Shows how good the 10 series cards were!
@@xpodx If it's actually a 2025 release, you'll have enough time anyway
I really appreciate the new color scheme in the graphs. It clearly draws your attention to the 1% lows (because that's really what should matter most to you), and the average is there too in case you need it. Cool, thumbs up to that
Why, may I ask, are the 1% lows the most important? For me it seems logical that the average would be most important?
@@RatRugRug 1% lows are basically the frame rate stability indicator; average FPS doesn't really tell you much. If, over a duration of say 5 seconds, your FPS fluctuates between 40 and 80, your average is 60fps, but the actual experience of jumping between 40 and 80 would obviously be absolutely awful. That's what the 1% low statistic shows you: you'd see an average of 60fps, but the 1% lows would show something like 45 FPS. I hope I didn't explain it too terribly
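For the curious, here is one common way to compute both numbers from a capture of per-frame render times. Tools differ in their exact definition; this sketch (the function name is made up) averages the slowest 1% of frames:

```python
def fps_stats(frame_times_ms):
    # Instantaneous FPS for each rendered frame
    fps = [1000.0 / t for t in frame_times_ms]
    avg = sum(fps) / len(fps)
    # 1% low: average FPS over the slowest 1% of frames
    n = max(1, len(fps) // 100)
    low = sum(sorted(fps)[:n]) / n
    return avg, low
```

For example, 99 smooth 10ms frames plus a single 25ms stutter barely moves the average (99.4 fps), while the 1% low drops to 40 fps, which is exactly the stutter the comment above describes.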
The voice over segments used a noticeably lower-quality mic but I still really appreciate the fact that you're taking the time to go back and make corrections/additions
Really appreciate the 2070 Super being included in the charts. It's the card I'm rocking, but I never see it in benchmarks for new GPU reviews, so I usually have to do a lot of cross comparisons to see the relative performance.
Have a normal 2070, but yeah, it's been able to keep playing 1080p on high settings to this day. Maybe I'll think of upgrading to 4k gaming in a generation or two.
I also have 2070 super. Thinking of buying 7700xt but not sure if my ryzen 7 3700x and b550m asrock can support this card or not?
I had EVGA's 2070 Black OC, and yeah it was a workhorse for a number of years. I recently gave it to my friend (along with an 8th gen i7) so that he could play Street Fighter 6, and it hit the recommended specs perfectly for 1080p, so despite all the crap it got back in the day, that card really shows its worth, especially with DLSS.
@@calebsmith7179 That said, I recently upgraded to a 4070 Ti (also a card that's gotten a lot of crap) and oh boy, what a difference it made. I can count on a bare minimum of max settings 1440p with ray tracing on every new release, but in most cases (like with Armored Core 6) I can usually get 4K max everything at an unbreakable 60+. I haven't even had to bother with DLSS yet. You're definitely in a perfect place to feel a massive improvement once you upgrade.
@@asheeshkumar3946 Card and CPU should go fine together, especially at 1440p. You may get some CPU bottleneck in specific titles at 1080p, but it won't be a significant problem for you.
@@asheeshkumar3946 pay the tiny bit extra for the 7800 XT, then you can keep it when you upgrade your CPU
I'm curious how they perform a few months from now. AMD cards seem to get better with age as drivers mature.
I thought about this, as well. I'd like to see how day 1 launch drivers on the 6800XT would stack up to current drivers. When it launched, did the 6800XT... suck more? AMD products usually age like a fine wine so I'd be interested in seeing that.
imagine paying that much for a gpu and you have to hope they make drivers that work properly
@@Bunstercouldn't be Intel ...
@@Bunsteror not buy right now and wait for a few months and then decide.
imagine thinking you made the smart choice after buying a current gen GPU@@Bunster
the best card review from LTT in a long time, good job guys!
Error in the table at 8:58 (RX 6800 vs 7800):
Ray tracing cores: "RX 6800: 60 gen 2" is incorrect; the correct value is 60 gen 1 👍
QC prioritization shows, and it shows big time in this video. I am happy to see the change, because the attention to details and the delivery is excellent! As far as the cards go, nah, my 3090ti still chugs along more than fine, but if I was shopping for something, that 6800 is hard to ignore. Thanks for a great vid!
With a 3090 Ti, man, you've got a long road ahead. I had to wait to get a new GPU. I'm really thinking about getting the 7800 XT, but still not sure. I've been on team green for so long.
@@mavrickabb For $500 it's pretty good: new gen and RDNA 3. I already bought one, since I was planning on getting the 6800 XT but waited a little.
Imagine even thinking about replacing a 3090, that's mental LOL. It's understandable if people still use a 2070/2080 or 3060/3070 or even a 4060. Fair to say you're still good to go for some years, mate. For me it's about time now with my 2070 and i7 8700K (I built the system in 2018). Sadly this socket doesn't have any new upgrades to offer on my motherboard, so I need a full rebuild, but I think I'll start by just replacing the 2070 with the 7800 XT and a new PSU, and then some months later, once I've bought RAM, board, and CPU, I'll do a full new build.
@@mavrickabb I was in the same boat and I pulled the trigger on a 7800 XT about a month ago. I'm disabled, so my PC gets used way more heavily than the average gaming PC. I probably have close to 200 hours on it so far and I've had zero issues of any kind. It's been every bit as reliable as the 2080 I had in my system before. I game, live stream, and occasionally video and photo edit using Shotcut and Darktable. No issues to speak of. I've had a game crash one time, but I think it's because Hogwarts Legacy is still a tad unstable.
Definitely liking the new graphs, the lack of annotations, and the explanations. Feels like a really quality review from the LTT team.
Seems like I'll stick with the 6800 XT. Still curious how it handles VR games though.
The graph updates are awesome, with the dots being very helpful for drawing your attention to what's being talked about. The coloring is also nice: more defined, and the colors stand out more from each other.
Keep up the good work, and I hope you guys add vr games soon.
@unholydonuts It makes sense if you're trying to play the latest and greatest games at the highest resolution and frame rate possible, but seeing as there are very few people doing that, I do agree with your statement for the majority of people.
@@the_undead But it doesn't, because the highest frame rate only matters for esports titles lmao. You will never notice the fps in newer games when they already look/feel great. (Unless you got an unfinished game.)
@@vaderglenn I never said people trying to play the latest and greatest games at the highest quality and frame rate are logical, I simply stated there are people who do this. And I do agree that realistically speaking, buying the best hardware every generation is a waste of money
The Rx 7800xt is going to get significantly faster with driver updates xd
@@kaizel2264 May not be the case, since RDNA 3 cards have been out for a year.
The improvement is amazing: the data is comprehensive, and anomalies are reported too.
It actually bothers me when people act like this is a dramatic improvement over their previous reviews. I would agree the presentation is better, but quite frankly that's the biggest improvement. There is no easy way to confirm what the data accuracy was before, but if you want me to be completely honest, I don't think the accuracy was bad enough to justify having less than half as many GPUs in these reviews. But that's what they basically have to do because of people in the community who quite likely over exaggerated a data accuracy problem, which as far as I can tell at least was nowhere near as bad as it was made out to be.
@@bolbi914 The only thing in that first video that I found truly egregious was the fact that it wasn't monetized. That kind of move is what I would expect of a politician who is convinced they are going to lose the next election unless they destroy the career of their opponents, not what I would expect a tech YouTube channel to do when another YouTube channel has made some mistakes. Something else that's interesting: at some point, Gamers Nexus said once LTT grows beyond a certain size (I forget how Steve worded it) he will start treating LTT like any other company. The fact that he was so aggressive and pulled out the moves of politicians tells me one thing: Steve is only favorable to companies that pay him to be favorable. And seeing as LTT has not paid him anything and is a threat to his channel, he took his chance to attack LTT in the hopes that it would destroy the company.
TLDR, Steve is just as corrupt as basically any Russian politician and should not be trusted for anything
Video suggestion: Make full videos about playing games at the highest settings with the best computer components. Btw, that could be a series or whatever YouTube calls it!
I never leave comments on videos, but despite the new graphics and the dot on the cards that helped a lot, I feel like this video wasn't rushed and Linus had time to talk and explain things. I was able to read the slides and wasn't just a bunch of information thrown in at a really fast pace where it seemed that the priority was to get things out of the way rather than actually informing. Really happy with the quality of this video and I really appreciate the tone of seriousness instead of making a lot of jokes and little content about the product.
Nice work on improving graph readability and improving things inside LMG!
Maybe one small remark: around 12 minutes you compare two power consumption graphs, the first with Nvidia vs AMD cards, then with AMD vs older gen AMD cards, and the 4 colors used are the same, while two of those colors actually refer to different cards in graphs that are shown one just after the other!
Maybe using one set of colors per video/per card would be an even further improvement :).
"im gonna thank AMD for not treating me like a highschool bully" Linus' smile is so ominous no one would dare to bully him
Why are you everywhere
Touch grass
ok
Really surprised with productivity results considering the 7800 has nearly double the FP32 performance of the 6800. Hopefully, there will be more testing in this area.
How am I just hearing about the resizable bar?!?!?! I have a 3080 (for years!) and this never popped up on my radar. Now I'm back to finish this video as I went down that rabbit hole to update everything and get it enabled.
First video I've watched since the hiatus - maybe it's just placebo, but it seems like things were less rushed. Hopefully the LTT changes keep these quality improvements going forward!
LTT clearly putting more time and effort and thought into their GPU reviews. Great to see early change - need to see it keep up!
The most thoroughly checked graphs in LTT history
Would have liked to see the 3080 and 6900 XT/6950 XT, which are good options on the second-hand market.
They're both way above this card which is why they weren't included.
@@sirseriously 3080 has same perf as 4070, 6900xt same perf as a custom 7800 xt
@@sirseriously The 3080 is very close, and even worse in some cases than the 4070 and 7800 XT, based on others' benchmarks.
@@sirseriously"way above" lol no, they are at the same level. and the 3080 is 10gb only so is pretty bad. the 7800 xt is on par with the 6900xt at least the overclocked from factory models.
As someone who bought the 6950XT XFX 319 a few months ago, I would have liked to see where it comes into play with these newer AMD cards. I personally am over the moon with it, I only use it for gaming and RT doesn’t bother me. Couldn’t be happier
I was bummed that I bought a 6800xt and then the 7800xt came out soon after for the same price I got mine at. But now I'm glad I did.
Here in my country, the 4070 is currently priced 20% higher than the SRP of the 7800 XT, so the 7800 XT is the better option.
I'm still happy I bought the 6800 XT earlier this year. Must have been a good time to get one. The price was as low as it still is (and it was available!), the performance is still strong and no new GPU in this price point actually made me feel bad for making the decision to get one. Not even the 7800 XT even though it might have a little benefit in both RT and power efficiency.
6800XT is incredible value
@@lordcommander3224 It was, but for a $20-30 saving it's not such a great option today. Yes, performance is about the same, but newer gen is newer gen: better driver optimization, some new tech, and probably two years of additional support once you put it on the second-hand market or give it to someone. That's not worth saving a few dollars. I'm surprised so many reviewers skip over these facts.
Fellow 6800xt user here, this card is so underrated it's criminal. Will probably use it for the next four-five years, it slays at 1440p
New graphs look amazing! Also the highlighting and flashing dots to go with what Linus is talking about is fantastic - makes these much easier to follow.
needed the upgrade and 3070 and 3080s are still 500 plus in my area. So I got the 7800xt red devil. other than some driver issues it has been good so far.
Note: Not "all" AMD GPU's come with starfield. Some retailers are not offering this deal. I purchased two Gigabyte 7800XT's from best buy at 12:15am on launch day. Starfield was not offered as a bundle with any of the new 7800/7700 SKU's
QC coming in hot for the quality of this video and I'm not even halfway! Nice.
I think I would actually really like to see these graphs sorted by 1% lows. Then you are showing not the peak performance, but putting the emphasis on the smoothness and playability.
The issue is that in edge cases, it could shuffle up the order of GPUs far too much. It could become very confusing to follow, especially in video format. Most people want to compare GPUs in the same performance class.
This really felt slowed down and thought out of a review, and I greatly appreciate it. Made me feel like you really wanted me to comprehend what was going on, not just the sentiment of the product like previous reviews. If this is the newer processes at work please keep it up!
Love the change to higher quality videos.
Whoever had the idea with the blinking dots next to the GPU names is a genius. This helps so much with following and understanding what is being said by Linus.
In all honesty the reason the 6800XT looks this good is mature drivers. Give the 7700XT and 7800XT about 6-8 months to get themselves together through driver revisions and it will be an excellent choice for a future upgrade. It may also be cheaper by then.
Looking good guys. Round of applause for the review and visuals team on this one. This hiatus was annoying, but it’s only going to make you stronger!
Pretty sad how the 6800 XT is doing better than, or very similar to, the 7800 XT, which shouldn't even be happening.
Yeah, how is it not spitting in his face when it's just the same card +5 fps? The only thing this card will do is allow the 6800 XT to go down in price.
Well, to be honest, the naming is off for this card. In CU count it matches the 6800 non-XT, and their starting MSRPs are near each other. It's also a Navi 32 chip instead of a cut-down Navi 21, so in that regard the 6700 XT is a closer match. They should have called this the 7700 XT and named the 7700 XT the 7700, but marketing...
If they had just named this card the 7800 (non-XT), everyone would be happy. That's where they went wrong.
That's an issue caused by marketing, not products themselves. AMD decided to add a new highest tier to bring back XTX naming, and it consequently pushed everything up a tier.
7900XT -> 7900XTX
7800XT -> 7900XT
7800 -> 7900GRE
7700XT -> 7800XT
7700 -> 7700XT
Generational increase in performance is there, but unfortunately it's in line with the price increase.
I suppose messing up the naming was the way to increase actual gen-over-gen MSRP.
Reviewers tend to focus a lot on MSRP as it's a nice number marketed everywhere, but it's the real world retail pricing that truly matters. It is highly variable based on region and availability, so it's pretty much impossible to evaluate for reviewers. For example, I got my 7900XT in March far cheaper than my friends got their 6800XT/6900XT/3080 a year prior during the shortage. Every reviewer under the sun was saying how much better value the 7900XTX is, but for me 7900XT was 900€, XTX was 1200€.
It's because they have the 7900 XTX and 7900 XT.
Why isn't the naming consistent?
The 5000 and 6000 series don't have XTX and GRE.
I love seeing the proper voice-overs for mess-ups, the redone graphs, and the little details. This video is of considerably higher quality than what you guys have done previously. Really nice to see, hope you can keep it up!
You can check if Resizable BAR is activated in your GPU BIOS by going to the NVIDIA Control Panel, clicking System Information in the bottom left corner, and seeing if Resizable BAR has a YES next to it.
Thank you.
Is it just me, or does the writing for this video feel a lot better than previous day 1 reviews? There seems to be a bit more humor, or human flair, to it, rather than sounding like a spec sheet read or an AI analysis of graph data. Good job LTT, looks like that hiatus to self-reflect has had a positive impact. Keep going!
Yup, talking at a little slower pace, focusing on versions, longer sentences to put the point across. They have improved the process.
@@TCharlieA I've noticed that too. I even jumped back to that line to be sure of what I had heard.
Mmm, noticed that too!
Still sitting on my 1660super for the foreseeable future, can't wait for these companies to get over their greed pricing
Awesome review. It followed a logical path, provided clear data with clear discussion points and arguments, and no errors I could see other than the weird audio dub switching. Definitely a clear step up in quality video wise.
Good on you for noting (and none of the other review sites mentioned this either) that the 4070 did what it did with about 100 watts less power. I replaced my 1080 with one this summer and haven’t had to replace anything else (notably power supply)
Exactly. People are so wrapped up in the price-to-performance ratio they forget about reliability. In this video alone, Linus brought up repeated computer crashes while playing games. Nvidia, for all of its exorbitant pricing, makes much more stable hardware.
If you can afford it, go for an Nvidia card past the $500 GPU budget.
@@trippinhard250 Well... it hurts to shatter your illusion, but Nvidia's hardware is much more unstable/hazardous than ANY other hardware on the market XD. What YOU may be talking about is software, which (FYI) can easily be replaced at any time.
@@trippinhard250 My 3070 has reliably failed to manage a consistent 60 fps in modern titles because it's being throttled by a measly 8 gigs of VRAM :)
The dots on the pics showing what you are talking about are an amazing new feature ;) Cooool, thaaanks
I absolutely love that the RX 5700 XT is in the graphs as well, so much easier to see where the performance stands right now.
It's good to see you getting back to the basics of reviewing tech and more meaningful content, I am liking the new LTT 👍
I don't know that it's a new LTT. It's still the exact same LTT but now they have more unity and better processes and it absolutely shows in this video.
@@jdude2005 Yeah, I was talking about how most of their videos were about hooking up industrial car-sized fans to cool a PC, or something pretty dumb that none of us would ever do. Now I hope they get back to more real, everyday things that we the viewers will encounter lol
An obvious improvement in charts and discussing pitfalls while testing. Hopefully, this continues into future videos.
Those focus dots are a welcome addition - I pretty much always listened instead of look at the graphs, so this type of guidance what's talked about actually makes me read the graphs.
@8:59 That base clock for the RX 7800 XT doesn't feel right.
How's the new QC going? The stated base clock is even lower than the RX 6800's.
Mr. Google tells me it's 1,800 MHz, not 1,295 MHz.
So you changed the RT cores from "gen2" to "gen1" on the RX 6800... but didn't notice the base clock in the same table?? Solid QA...
We've confirmed the 1295MHz base clock through our reference card, and through searching third-party specs. You can find that info here: linustechtips.com/topic/1530236-amd-radeon-rx-7700-xt-7800-xt-review/?do=findComment&comment=16129576
For the 1% low graphs: if possible, it might make them a little easier to understand if the max gradient (i.e. red) were pinned to the right side of the slide. So the top card with the highest 1% lows would get full red, but the card with the lowest 1% lows might only get a max of yellow or orange. Otherwise, thanks!
I'm really happy they went for a longer-form review here. The last few product reviews have been pretty short. Like others have noted, that was probably due to LMG trying to push out as many videos as possible. It's actually nice getting a little more in-depth with products like this.
remember when we were shipped properly optimized products and if someone wanted to get 5% more performance for 50% more power draw they could do that themselves?
nowadays we're getting severely overclocked nonsense with still underperforming cooling and no way to run without these weird external power cables... that's a mess
Since when are external power cables weird? They have been around since like 2002...
@@riba2233 Probably means the 12-pin connector from Nvidia? Idk what that has to do with AMD tho
@@riba2233 since they appeared they make no sense
@@Z4KIUS very poor logic, sorry. If you want to limit yourself to a quarter of today's performance you have sub 75w options, leave the rest of us alone 👍
@@Z4KIUS There will be a new standard anyway, which puts the power connector on the motherboard.
Love the red dots next to the data points you are talking about. Great change. makes it easier to follow along.
I really like those red dots telling us where you are looking, so much easier to follow along
wow! i can definitely see improvements on your gpu reviews, i can say that it is not just entertainment but also very important infos for buying decisions. keep it up mate 👍👍👍
The crashing part is interesting to me, since my 7900 XTX also crashes in what I will call a 'consistently random' way in DX12 games. Perfectly fine in DX11 tho. I'm *really* hoping it's a driver bug and not a faulty card, since it's already been replaced once and it's watercooled so a real pain to take out and replace.
3060TI, random crashes in DX12 as well. It's not only an AMD problem.
A lot of people have been having issues with crashing lately on the XTX, mine included, I chased ghosts trying to fix it and no dice. Only clue is the clock speeds spike to 3000+ and it crashes the driver
@@sedixmrboss5625 Oh. That's… good to know, I guess? :P
@@mixni That sounds like a driver thing... Ouch. In my case, it usually runs out of vram, and instead of lagging, it just... dies.
Hope you get it sorted out :( Got an XTX myself and 2 of my friends have it. Haven't had a single crash since I got it (1.5 months), and I haven't heard of any issues from my friends either.
It took AMD almost two years to gain enough experience to deliver what they had already delivered. That shows how far ahead of its time the RX 6800 XT was in its field.
AMD just misnamed this product. It seems to be more like 6800 (non-XT) on new process node (by TSMC). So the improvement by AMD.... hmm this looks bad. This is not a new gen, this is just a new process node.
@@innocentiuslacrim2290 And the fact that it's a whole new chip architecture, maybe?
Edit: And with chiplets, a first in a commercial GPU, I am pretty sure.
@@attilavs2 who cares about "new chip architecture" if it produces almost ZERO benefits for the user. This really is a rather embarrassing GPU gen for AMD.
@@innocentiuslacrim2290 It's at the same price, for slightly/decently better performance, and that's with poor launch drivers. And the chiplets mean they will be able to make large GPUs for cheaper, and it limits the cost of shrinking the node. Next gen you will see the advantages.
@@attilavs2 ah promises promises. Unfortunately as a user of GPUs I have to say that this "gen" was not really a gen at all. Just rebranding of old products with maybe a minor facelift. What was it this time? 10% improvement after 2-3 years of R&D? Basically all of that probably coming from the improvements that TSMC did?
I've watched Gamers nexus / Daniel Owen / Vex / reviews but this one was the easiest to follow .. well done on the graphs
I know it’s more work but oh my God the little dot highlighting the spec being talked about is SUCH a welcome addition
Comments section just proves that no matter what you do right, someone’s always gonna complain 😂
My opinion on ray tracing is that I'd rather have higher FPS with smoother gameplay and less input latency than have it on.
Except for scenic, slower paced games, but then most Radeon cards handle that 'well enough' since the fps matters less.
AMD still have to catch up but luckily for them, raytracing is still somewhat of a gimmick and still absent in a ton of games coming out today...
Definitely recommend picking up a Radeon card nowadays, especially with Nvidia's insulting pricing and leaving their gaming market to collect dust while they rake in cash from AI
I don't know, it's going to depend on pricing really. I know that at US$499 it's a really good deal compared to a 4070 if you don't care about RT, but here in Australia there's no price difference between the 7800 XT and the 4070, so I'd rather get the card with DLSS and better RT capabilities for the same price. The performance difference in raster isn't significant enough in enough games for me to really care when you consider that Nvidia also gives you access to tools like RTX Remix, and better drivers (though AMD is better than they used to be, Nvidia is still ahead).
As for RT being a gimmick, in modern games, mostly yes. But older games? Now that I can see a massive market for. RTX Remix has given modders the tools to give old games a massive facelift with RT and not to mention the in-built texture upscaling.
So here in Australia at least (and I'm sure in plenty of other countries with varied pricing) the 7800XT feels like a bad choice across the board.
@@Kyle-yn5hy I get what you're saying and for certain niche markets it makes sense.
Though I've heard a lot of bad things about RTX Remix, stories about how it removes the charm from older games or makes them actually look worse. And texture upscaling is not a new thing; modders have been doing AI texture upscaling and modding it into games for a while now.
A lot of the reason people buy Nvidia cards is for the reliability, the drivers and the tech. The issue I see with this in the future is that Nvidia is rumoured to be spending less and less on their gaming division, not giving them enough resources to, for example, get decent drivers out for Starfield. This could be a huge upcoming issue for anyone wanting to stick with 40-series cards.
@@TheRealAstro_ What I meant by the upscaling is that it's all done from within the Remix editor, it can do entire scenes at once, without requiring the modder to extract the texture, use another piece of software to upscale it, then inject it back in. I am aware that modders have been using AI upscaling for a while now, but it's not integrated like it is in RTX Remix.
As far as it removing charm from older games, I think the same can be said for any mod that alters the graphics of a game. I find some Skyrim texture mods to be worse than the base textures of the game because they feel out of place, and I found Deus Ex: Revision added too much clutter to its scenes and made it harder to navigate without a map. That kind of thing comes down to how a modder chooses to use their time and the technology available to them. Will some RTX Remix mods look bad? Yes, but so do some regular mods. It's all about how RTX is implemented, and it can look amazing. Just look at their Morrowind showcase or Portal RTX, both of which were made with RTX Remix, albeit by professionals.
I think for the same price, the 4070 is a better buy, even just for DLSS which is still the more popular option for developers despite FSR being available on more cards. On a 4070, you can use both FSR and DLSS (best of both worlds), and while it might be unfair to Team Red after they generously open sourced FSR and made it available on any GPU, at the end of the day we're consumers and we have to pick the option that provides the best value. Based on the majority of the reviews I've read, the 4070 and 7800 XT trade blows in raster, and the games that AMD does win big in I personally don't play that much of (like F1 for example). Otherwise it's fairly narrow margins in raster, and in RT the 7800 XT is still blown away.
Definitely my take on it. Currently playing the Witcher 3 and my 7900xt can just about manage 60-80fps at decent settings with raytracing but it's a deal breaker when I've got a 240hz monitor and I get a smooth 150+ without raytracing.
@@Kyle-yn5hy I agree with your takes for sure, at the same price (if you want a new card) you should probably go for a 4070 unless the games you enjoy most play better on the 7800XT.
I actually was looking to upgrade from a 1660s and happened to look into the Radeon RX 6000-7000 series over Nvidia options like the 3070 and 4070, solely based off of price and performance. This review would help greatly in finalizing my decision, or maybe patiently wait for a deal when market prices drop lower.
What did you go with?
Hey guys, at the 9:09 mark there's a typo stating the RX 6800 has the same gen 2 ray tracing cores as the RX 7800 XT. This should be gen 1 for the RX 6800, as that series was the first gen of cards to have ray tracing tech.
This video has inspired me to grab a 7800xt. Just placed the order and eagerly awaiting a finally well priced and well spec’d card. The 16gb of vram was the knockout punch for me!
Does it crash ,is it a good card?
@@yunacuddles Not much. It really doesn't like Wallpaper Engine; it crashes every time the computer goes to sleep. Other than that, it crashed on me once early on and I had to reinstall my drivers. The card was newer and that was to be expected. All in all it's still not painless, but it's definitely a helluva lot more stable than the last AMD card I used.
Still nice to see the 6700 XT holding up well. Got mine new a few months back. An excellent card for the money. Was a Nvidia user (GTX 1070) but really enjoying using a full AMD system.
Had mine for a year now, glad to see I don't really need to be thinking about upgrading for a good while
I think this was the best written review I've seen LTT produce. Loving the post-break changes! Lots of obvious stuff with the graphics, but some subtle changes with the writing and supplementary storytelling around the testing experience was really good and important.
Loving the new graphs, especially the pulsing dot. Very excited to see more improvements like this
Keep the colors consistent between comparable graphs! Between the Kombustor power consumption graph at 11:47 and the F1 23 graph at 12:07, the 4060 Ti and 4070 switched colors (blue and orange), and it looks like the 4070 draws less power playing F1 23 than in Kombustor.
I really appreciate the pick-up VO audio to have correct information, but it would be great if the quality/tone of the VO audio matched what is recorded on-set. It is a bit jarring to hear tinny Linus pop in and out.
I love the new graphics. They make it much easier to follow what the presenter is saying, especially when discussing multiple cards and comparing them. The bit of extra editing time really makes this content shine and helps add to the evergreen quality of the videos.
Thanks to an older LTT video I bought a Sapphire Nitro+ 7900 XTX, and honestly, for 4K gaming (without RT, which I really don't care about), performance is pretty, pretty good. I've always been a team green user, but this time around, thanks to the cursed prices of Nvidia's GPUs here in the EU, I've turned to team red. At the moment I'm playing Starfield at 4K at a rock-solid 60 FPS and I couldn't be happier. So thanks, Linus, for making me switch sides; my gaming experience is really astounding for a decent price paid (1049€ for a top custom GPU).
Lucky
The 7900 XTX is the third fastest card in RT. I don't understand people saying RT is not doable on AMD cards. False information sticks in people's memory so persistently these days. Of course, I also think the loss of performance isn't worth it.