I decided to buy a 3080 instead of a 6800 XT a few days ago to upgrade from my 6600 XT, the card just shipped too, and now you post this video 💀. The only reason I'm still happy with my decision though is that it was a 12GB card and I got it for $390. Update: I got it and I'm happy with it
There have been videos for years showing the 6800xt as the better card. It also overclocks far better. Mine is as fast as a 3090, using air cooling and amd's stock software, which has improved a lot in the past 3 years. No driver issues either.
CUDA workloads aren't a thing you can utilize well with a 3080 - they need a lot of RAM, that's why the professional cards have 24GB. 10GB makes that feature irrelevant - the low memory is a way to force prosumer buyers into paying the Quadro tax. The 3080 is useless in pro apps; you get better performance with an A4000 - a 16GB Quadro card that uses the 3070 core.
16GB vs 10GB is no contest.
Both of these cards are the lowest viable 4k options imo, and at 4k 10GB is going to age like milk (even with FSR2/DLSS Quality lowering the vram usage - which you'll probably be using, as 4k native requires serious horsepower).
In blind tests comparing FSR2 and DLSS at 4k Quality, 99% of people won't be able to spot the slightly better image quality DLSS offers. The differences become more exaggerated using more aggressive upscaling at lower resolutions.
If you don't stream or do any productivity and have a 4k monitor, then the 6800 xt just seems like the better option. £400 used and lower power, which is a factor in the UK where energy is expensive.
As for RT- I've always been about optimising graphics settings. Even before RT, I'd be turning shadows down a few notches from ultra for more performance for very little visual difference. Even on a 3080, RT is expensive and I'd much rather target 120FPS than have some eye candy.
@@Ober1kenobi but that's the issue- I don't want to have to think about keeping my vram in check- 4K textures are one of the best ways to improve the image- what's the point of playing at a high resolution using low resolution textures? PC gaming and the pursuit of smooth frametimes is finicky enough without having to worry about vram as well
Well, I can easily see the difference between DLSS Quality and FSR Quality at 4K rendered resolution. Even on my old 1080p screen, let alone 4K screen! DLSS provides much better AA quality and less ghosting. The overall image looks way smoother with DLSS. Plus, RTX cards also offer DLAA - the best currently available AA solution. And also DLDSR, which is an AI driven downscaler.
I just bought a used rx 6800 non xt for 350 usd. I came from a rx 6600xt and I cannot imagine using anything faster. I'll definitely keep it for as long as I can. And to add, I never see my card go above 200w, and that is with a slight tuning which makes it just that much faster. I love being able to max out settings and not run out of vram.
what processor do you have and what psu? I have an rtx 2070 super and want to upgrade to an rx 6800 or 6800xt with my r7 5700x, and I only have a 650W psu 80+ bronze. I think the 2070 super draws about the same watts as a 6800
I was in this boat and grabbed a used 6800XT over a used 3080. There are two main reasons for this:
1. 16 GB VRAM, self explanatory. Yes I bought into the fearmongering, boohoo.
2. GDDR6X is an actual risk when buying ex-miner cards, GDDR6 not so much.
This is really all there is to it.
I've already had a 6700XT before so that kinda made it easy to know what to expect. Might be a little harder to risk going a different brand for others, especially in the used market. Though I don't think the 3080 is cheaper because it's valued less, I think it has more to do with just how many 3080s Nvidia produced and sold to miners. I just don't think there are as many 6800XTs out there total.
G6 might not be as big a risk as G6X, but I think it really depends on the AIB rather than the card itself. Also, fast G6 can produce just as much heat as slower G6X, like on the 7900 XT.
The reference 6700 XT for example has G6 but is known to run its memory close to 100 degrees. It's all about how good the cooler is.
@@person1745 AMD OC those GDDR6 cards from the factory, which is why they produce far more heat than they should. 20gbps seems out of spec to me for GDDR6 non-X. I think they did that out of desperation for more performance when they realized their RX 7900 series was not hitting target numbers at 16gbps or 18gbps (GDDR6 at stock). Plus, if I let my 7900xtx clock its vram up to the full 20gbps on the desktop, it consumes like 80w at idle instead of 10w with the vram clocked down.... yeah, they pushed and overvolted these ram chips.
@@forog1 for what it's worth, you can undervolt the SoC with MPT to reduce power draw a little bit. It works on RDNA2 but I can't guarantee anything for RDNA3.
I grabbed a 6800 xt for $150 less than the cheapest 3080 on the used market, and 4070s are still over $300 more. idk why anyone would spend more instead of just going AMD, but ultimately, besides the great price, the 16gb of vram was really what made me buy it.
A couple weeks ago I swapped my 6800XT for a 3080 (10gb), and the only reason I swapped with my nephew was for AI. While many say the AMD gpu can run Stable Diffusion, I could never get it to work. So I swapped with my nephew, and SD works flawlessly now. Plus the games I play run about the same on both gpus, so nothing lost there
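(A minimal sketch of why it "just works" on the Nvidia side, assuming the usual PyTorch + diffusers route and using runwayml/stable-diffusion-v1-5 purely as an example checkpoint. The stock CUDA build of PyTorch picks up a GeForce card out of the box; an AMD card needs the ROCm build of PyTorch instead, which still exposes the GPU under the same "cuda" device name - that extra setup step is where most people give up.)

import torch
from diffusers import StableDiffusionPipeline

# Works on Nvidia with the CUDA build of PyTorch, and on AMD only with the ROCm build.
# Both builds expose the GPU as the "cuda" device.
device = "cuda" if torch.cuda.is_available() else "cpu"

# Example checkpoint; any Stable Diffusion model id would do here.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
)
pipe = pipe.to(device)

image = pipe("a red graphics card on a workbench").images[0]
image.save("out.png")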
Good choice my friend, the 6800xt will age like fine wine and outperform the 3080 as time goes on, and it even matches or beats the 4070; with more vram it'll age very well. Nvidia cards get weaker as they get older while amd GPUs just get better, as videos have proven. Nvidia have been trash since the 2000 series. I'm hoping Amd can knock Nvidia off their high horse a bit since they're just taking the complete mick out of their consumers. As I've said in another comment, look how smooth the frame graph is on the 6800xt compared to the 3080 - amd is defo the smoother experience. Even the 5000 series had smoother frames compared to the 2000 series. Nvidia are more interested in marketing BS RT which tanks FPS on all gpus, their dlss, ai. It's all "buy our product for these features" BS while their 3070/3080 and 4060ti suffer because of vram limitations, absolute joke. If the 3070, 3080 and 4060ti had 12gb plus of vram they would be amazing GPUs, but nvidia are more interested in ripping customers off and forcing them to upgrade. Horrible greed of a company.
@@xwar_88x30 I would agree normally, but the 3080 vs 6800xt seems to buck the trend. If you look around you can see a bunch of people and places doing the comparison again, and contrary to the norm, overall the 3080 has gotten stronger against the 6800 xt. I came here from a video comparing them that tested 50 games, and the 3080 was on avg faster than the 6800xt by a higher percent than a couple of years ago. Now, it wasn't much of a change (like 9% on avg at 4k vs the 7% it was a couple of years ago, and 1440p was up the same) but it bucked the normal trend of AMD cards getting stronger vs Nvidia's. I thought it was funny. If I had to pick though, I would still go with the 6800 xt, because if you look and watch you can find them new for around 500 or a little under sometimes. You are not going to find a 3080 new in that range or even in the same zip code.
That's cool that you're using the RX 6800xt for editing; that used to be Nvidia Pro territory. The fact is AMD have improved the software a lot, and CUDA support via ROCm is coming out too. The thing is, Vex has taken the plunge late in the game; there were deals on the AMD cards long ago, so you'd have had the usage out of it instead of waiting for 3080 prices to settle in the used market. Starfield is nice but not relevant to everyone. Unfortunately a lot of people ignored MSRP reductions and lower prices, scared by all the FUD put out about drivers etc. That doesn't send the market leader the signals it requires to reduce its margins. Gamers bitching about prices isn't enough; AMD have to be able to justify investment into features. That requires sales when they are close, because development costs are a real thing.
@@molochi well if Navi31 had met its expected performance targets, the 4080/4070Ti would have looked very stupid, weak and totally over-priced. Scott Herkelman intended to "kick Nvidia's ass". The 7900xt was supposed to be crushing the 4080 12GB, and the xtx the 16GB, while being much cheaper. Only, fixes to the driver caused a severe performance drop and they haven't found a general mitigation. Radical changes in architecture are inherently risky, and RDNA3 has disappointed. Had the targets been met, Nvidia adding features like fake frames & DLSS would not have been enough; they'd be forced to cut prices or cede market share. Nvidia were maintaining high prices because of their vast RTX 30 GPU stockpile. Now they're cutting production to the contractual minimum rather than sell them cheaper; they figure gamers will relent eventually and cough up. The big money is in AI at the moment, so they're hoping to max out Hopper.
The emulated CUDA is not nearly as good as actual CUDA, and I hear AMD is going to do the same with Tensor cores as well. So not as good, but way better than nothing. If you own AMD already it is a huge win; if you are buying a new GPU and will be utilizing CUDA/Tensor then you are better off with Nvidia. Video editing and other productivity software results are going to depend mainly on the software you are using - Premiere Pro favors Nvidia much more, while some free video editing software may or may not be more of a contender.
@@ronaldhunt7617 CUDA is a way to program a GPU; it takes source code and compiles it for an application. Calling it emulation just shows your game is spreading FUD and confusing people. The AI boom is for large models and so-called deep learning, with huge demand for data center products. What you need for running neural nets versus training them is very different; laptops are coming out with AI acceleration in cooperation with MS, like Apple has had, without any Nvidia hardware - just an accelerator on the CPU die.
@@RobBCactive Not sure what you are trying to get at here. Nvidia has dedicated CUDA cores that process independently, making certain computations a lot faster. AMD does not have CUDA cores; instead they (just like in everything else) saw what Nvidia had done and added stream processors, which are not the same and not as good. Just like raytracing cores, and now it seems like Tensor cores (for AI) but only for the 6000 and newer GPUs, not to mention upscaling and frame generation (which AMD does not have as of now). Monkey see, monkey do... but not at the same level.
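(Side note on the "emulated CUDA" back-and-forth above: from the software side it isn't emulation so much as a separate compile target. A quick, purely illustrative way to check which backend a given PyTorch build was made for - this is an assumed tooling example, not anything from the video:)

import torch

# Exactly one of these is set: an official CUDA wheel reports a CUDA version,
# a ROCm wheel reports a HIP version. Both expose the GPU as device "cuda".
print("CUDA runtime:", torch.version.cuda)
print("ROCm/HIP runtime:", torch.version.hip)
print("GPU visible:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device 0:", torch.cuda.get_device_name(0))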
After seeing how ridiculous the prices were getting on the new generations of cards at the end of last year and start of 2023, I decided to grab one of these 10GB 3080's used for a deal. The crop of new games which exhaust its VRAM started showing up almost immediately after this, so I probably should have waited a few months :D Oh well.
@@michaelilie1629 Yeah, I guess he bought a 400-600 €/$ card to play on medium settings. Nice advice^^ The question is whether you need more than ~80 fps in games that require so much VRAM. I think the 3080 should do the trick for the next 2 years. But since I don't care for RT and the other stuff NVidia offers, I got myself the 6800XT anyway.
Make sure your 5900X is running a negative all-core offset in Curve Optimizer (start at negative 15), and if you're not running Samsung B-die, get a set of 3600 CL14 and really tighten down the timings. This will give you a noticeable uplift on your 5900X.
hey quick question, my ram (GSkill Sniper X 3600 CL19 @1.35v) is detected as Samsung B-die in Thaiphoon Burner, but when I try to tighten the timings even by just 1 it refuses to boot, even at 1.45v. The weird thing is I can lower the voltage to 1.25v when running XMP. Is it possible that I got trash B-die or am I doing something wrong? my cpu is an R5 5600
@@therecoverer2481 It's the B-die. I had the same issue with my 5600X using Silicon Power ram from Amazon. Ended up swapping to some Team Group T-Create 3600mhz CL18 and was able to undervolt and adjust timings.
@@therecoverer2481 3600 MT/s with C19? That's definitely not B-Die. Guaranteed B-Die bins would be the likes of 3200 C14, 3600 C14, 4000 C15, etc. C19 is incredibly loose even at 3600. You shouldn't trust Thaiphoon Burner as it is known to have false positives, and in this case, it is glaringly obvious this is one of those false positives. I almost would say those ICs could actually be Samsung C-Die as they are known to become unstable over 1.35v, no matter the timings. It would also explain the very loose primary timing(s). I'd get ahold of a 3600 C14 kit as this is the best sweet spot for performance on Ryzen. Getting kits over 3600 MT/s isn't beneficial as they aren't necessarily better bins, but pricier; and, you almost can never go over 1900 FCLK which is 3800 MT/s when synced 1:1. Some Zen 3 samples may not even be able to do 1900 FCLK and need to step down to 1866 or 1800 (3733 MT/s and 3600 MT/s respectively). A B-Die 3600 C14 kit should easily do 3800 at the same timings on stock voltage most of the time.
This video aged like milk. The 6800XT now out performs a 3080, because ray tracing tanks your frame rate in any modern game, DLSS looks like someone smeared vaseline on your screen and driver support has pushed the 6800XT way past a 3080.
I would argue that VRAM matters for PC gamers that are not into competitive gaming but rather into the modding scene. NVIDIA is doubling down on this with RTX Remix on the RTX 40 series because they know demand from the modding scene in the PC landscape remains high. Skyrim is almost 12 years old, but to this day it has not slowed down; in fact its modding scene just keeps hitting new peaks (check the view counts on YouTube or the player counts on SteamDB - it's still relevant). High VRAM capacity and memory bandwidth are important in Skyrim. I have 1750 mods installed on my 5900X + RX 6700 XT, and even with my 12GB of VRAM there's a lot of planning I do about which mods to keep or not because of VRAM demand.
Very true, Skyrim is one of those games that demands a lot not just from the GPU but also the CPU. Not to mention at least a Gen3 NVMe is required to even make massive LODs work - system RAM too. Modded Skyrim is probably the only game out there that's like a symphony: it uses everything in the PC, especially if the person knows how to mod.
@@evergaolbird Indeed. My laptop has 32gb of ddr4 3200mhz ram, an i7 12650H, a 150w 3070ti and a 1tb gen 4 nvme ssd. Skyrim SE still sees my CPU being the limitation, and that's despite me giving my CPU all the power it needs.
@@TheAscendedHuman The mobile 3070ti has the same chip and specs as the desktop 3070 except for tdp and clocks; the 150w variant is within 10% of the desktop 3070. The i7 12650H is more akin to an i5 12600 non-K (if that exists). I'd say that's a solid setup, even by desktop standards, because I know not that many people are buying even a 3060ti or 6700xt on desktop - most have a 3060 or 2060. My point is, modded Skyrim SE just needs way too fast of a CPU. I literally cannot use my GPU to its fullest, even if I let the CPU run wild. I hit the drawcall limit real fast and exceed it too, causing FPS drops.
Nice breakdown. I've had the 3080 10GB since December 2021 and it's been amazing. Now I tend to play games that are a bit older because I have a rule where I wait at least a year after release for games to get patched and go on sale, so I have nothing from 2023, but I've played dozens of games from 2015-2022 and the 3080 ran them beautifully at 1440P.
I have had that same 10GB card for almost 2 years, and use it for 4K gaming on high settings for most couple year old games. Granted, the latest AAA games can't be maxed out but still look pretty awesome on high. I run an I9-13900K with 32GB RAM.
@@Cybersawz nice. I have 32gb of ram too but have a 3700X which I may upgrade to a 5800X3D. But at 4K, I'd be GPU bound so probably would be the same results as your 13900K there.
The problem with the encoding is that I don't use that feature, nor do I have a use case for CUDA, and I'm probably going to stick to pure rasterisation rather than turning on RT, because upscaling or not, the performance hit is too much. So for me the 6800XT is a better option.
@@evrythingis1 I mean, that's literally what most companies do to push their tech. You gotta pay to play. Why would a company offer to put your tech in their game if they have nothing to gain from it? AMD does it too. While I don't think Nvidia GPUs are particularly good value right now, since we're at the beginning of this era of AI being used to improve graphics computing, I don't think the technology is bad or a gimmick like most people are trying to say
Why do you say one thing, then contradict what you just said? "I don't think the extra 6 gigs of vram is that big of a deal." Then less than a minute later: "but the 16 gigs is awesome, to allow you to enable higher settings."
I got the 6700xt Spectral White edition last week for £300, and I'm so impressed with its performance. 1440p ultra, no upscaling 🎉 glad I didn't get the 4060!
I have to disagree on the feature set. And before people call me a shill, I have been using nvidia all my life; the 6950xt is the first amd gpu I have owned and used. The dlss vs fsr argument is in most cases highly blown up. Yes, dlss provides a better picture, BUT you will almost never notice this during gaming, only if you pause them and watch them next to each other. If you have to go to such lengths to spot something, then yeah, that's just not an upside in my opinion. And raytracing is better on nvidia, although not bad on amd either. But the cards we have right now, especially the 3080/3090/6800/6900, are just not raytracing cards. Neither is the 4070ti or 4080 or 7900xt or xtx. The only card capable of raytracing at minimum is the 4090, and even that card sucks at it in many ways. So if you plan to raytrace you really shouldn't be looking at these cards. The only upside is cuda, but if you are just a gamer you wouldn't care about it. And the vram is just so important. There are so many games I see with my 6950xt these days that shoot past 10gb. And I wouldn't choose the 4070 over the 6800xt, in my opinion. 12gb is alright, but as I said, I've seen games shoot past 10gb, heck even 13gb, so the 4070 would already be having issues. And that at 600+ bucks new in Europe. Just not worth it in my opinion. At that point you may as well buy a 6950xt if they are still available. Most people indeed buy these cards for raster, and in that case most amd cards just beat nvidia's.
For me at the moment it's between the 4070 and the 7900gre but I'll have to see where that price lands in Aus. The 40 series power consumption is pretty compelling.
Could you include a software showcase comparing the NV control panel and AMD Adrenalin? Because I've used both in recent times and AMD's is way better, especially since it comes with an OC/UV suite that's far easier to use than having to resort to Afterburner. These things should be considered when doing a comparison.
See I’d love to see this too, haven’t had an AMD card in years but hated the software back then and have heard it’s improved a lot. Never really had an issue with nvidia
@@Raums Tried both in the last 2 years. As I always undervolt my gpu, I must say that amd is way easier and works better. Needing 3rd party software made by one outsourced person to undervolt a gpu from the market leader is embarrassing imo. Amd software is great these days, and the drivers, while they sometimes have issues like nvidia's, get way more performance improvements over the months compared to nvidia.
The fact that AMD has a built-in OC/UV utility is actually a bad thing. They have already used it to artificially limit how far you can tune your GPU. You want overclocking software to be third party, because a third party will provide the most impartial and least limited features.
DLSS doesn't eliminate the value of Vram, but it can let you get away with using less with a small reduction to image quality at a given resolution. Also, while DLSS can help compensate for a lack of Vram, FSR can as well, it's just not as good at it at 1440p, but it gets quite hard to tell the difference between the two when upscaling to 4k, at least with the highest quality settings. I did own a 3080 for a while, and have played around a lot with DLSS and ray-tracing in Cyberpunk, and also in Control. Running without any upscaling at all is still better than running with DLSS, as long as you can get a high enough frame rate, and I find it very hard to recommend a card which costs 80-100 more, but has significantly less Vram, if it has about the same amount of standard raster performance. Ray tracing in Cyberpunk required using DLSS, and still ran significantly slower than running at 1440p native without ray tracing. I just didn't think that it was really worth using ray tracing. It never seemed obvious that it was worth running with it turned on, though it certainly was usable, and did look good, running at 1440p with DLSS. With Control, I found that the performance was good enough to run with ray tracing and without any upscaling but the game did still run noticeably smoother with ray tracing off, and the ray tracing itself was still not all that compelling to me. I found that the lighting in both games, even without ray tracing, was amazing. A lot of the lighting effects even with ray tracing turned on, are the same either way, unless you go with full path tracing, but I obviously had nowhere near enough performance to run with full path tracing. The 4070 isn't terrible, and it's not going to suddenly become obsolete because it doesn't have enough Vram at any time in the next 4-5 years, but it would have been a LOT better if it had had 16GB of Vram. It's not like 16GB would have been overkill on it because it has DLSS. That would have made the card that much better, and it would have also had more memory bandwidth as well, which would also be nice. A 16GB 4070 at 660 would have been a significantly better value than the 12GB version is at 600.
The problem with using a 3080 for productivity apps is that 10GB is REALLY not enough for doing actual work with. PLUS - nVidia's drivers limit many of the professional features you need to Quadro drivers. The 3080's Quadro equivalent is the A5000/5500 with 24GB vram, priced around 1k-2k. You will get better performance than a 3080 in most CUDA workloads with an RTX A4000, a 16GB 3070 equivalent, because 10GB for any significant work is waaayyy too low - assuming the drivers even allow the full feature set in the application on a non-Quadro card. As far as AI workloads go, which is much more my wheelhouse: ROCm is feature complete, and CUDA isn't as relevant in 2023 as it was in 2021 for the AI space. 10GB - again - cripples the card for anything more than fiddling around as a hobby. Try to render a 1440p scene with a 10GB card vs a 16GB one; it's not even funny how memory crippled you will be. You will get equivalent performance to a 6700XT with 12GB - which you can get for much cheaper. Additionally, we tend to put GPU render farms on a linux distro, where AMD has much more mature drivers and ROCm support. Specialised AI accelerators are a whole different kettle of fish; in that space you will be writing your own custom libraries, tuned to whichever vendor allocated some boards for you - nobody is going to be picky which one, the lead times are insane as everything is pre-sold before being fabbed. You take what you can get, pay what is asked and count yourselves lucky.
Dude is capping by saying 16gb VRAM doesn't matter. Extra vram allowed cards like the RX 580 8gb and even R9 390 8gb to extend their lifespan way beyond what was initially expected out of them. The 6800 XT is a great long-term purchase and will be viable for 1440p for at least 3-4 more years.
Vram becomes a huge problem when you don't have enough, I'd like to see some benchmark examples with Starfield 8k or maybe even 16k texture mods. Watch that 10GB absolutely tank in performance. With that said, right now 10GB is good enough, it's just not very future proof. When new games are using more and more vram every year, I'm sure that's a concern for many people. Despite all of that, I'm sure the used market is just flooded with more 3080s because of the end of the crypto craze.
Only a concern if you max them out. Gaming at 1080p, I've never had to drop more than 3 settings from max for a stable 60 on 8GB. Haven't played Hogwarts Legacy, TLOU etc. but they don't interest me anyway, and I bet you can play them perfectly fine with a few nonsense settings turned down. It's the same deal at higher resolutions: turn down a setting or 2 and watch the game become playable, usually with little visual difference. Ultra settings are beyond the curve of diminishing returns for performance hit vs visuals; when an amount of VRAM isn't enough to run the game at medium-high, then it's too little. 10GB of VRAM is gonna be fine - the Xbox Series S has 10GB total that's shared between the GPU and CPU. 10GB may be laughable on a 3080, but in and of itself that amount isn't an issue
When was the card designed for 8k? Do you have an 8k panel? You have a $3000 monitor? Lol. Same as testing a 4060 in 2023 at 4K - it doesn't make sense. It 'can', up to a certain texture level. Would I want to? No
I agree with you both, like I said, 10GB is good enough, especially if you're doing 1080p. It's more of a worry with future, poorly optimized games from AAA devs using more vram than necessary. It's like that saying, I'd rather have it and not need it, than need it and not have it.
Nabbed my Red Dragon 6800xt this Prime Day for $479. Pretty happy with it, although I do sometimes find myself wishing I still had the CUDA support my 1070 Ti had. Looking forward to ROCm.
Nice video man. I jumped ship from my 3070 to a 7900XT because it was 200AUD cheaper than the 4070ti, and besides not having DLSS as an option I am super happy with my AMD card. In pure raster performance I get on average 15-20% more fps in most games (compared to the 4070ti). The only issue I have had with my 7900XT TUF is bad coil whine, which I am hoping I can send back soon to see if I get better luck of the draw. Keep up the videos, you are a great new entry source into the PC DIY learning youtubes.
You're not missing out on dlss since amd has FSR built into their drivers, which you can use in any game and it still looks amazing. Should defo try it out. It's under Super Resolution in the driver menu. I tend to have the sharpness at either 40 or 60, looks good imo.
I'll just put my two cents here about what is keeping me, and a few other folks, on the AMD side when buying a new GPU. Support for open standards, like FreeSync, FSR and OpenCL/ROCm: I don't like vendor lock-in, so I support agnostic technologies. I'm not the guy who cracks professional software just to tell my friends "I have Photoshop 2026, idk how to use it but I have it", so I usually go for open software and I've never had a regret, in both my private labs and professionally. But the main plus above all is the unix support. At home I can play windows games on linux flawlessly without having to tinker with monthly driver updates, it just works... and 2005-class hardware is still supported. At work I greatly extend hardware lifespan for the same reason, and this philosophy allows us to offer fast and reliable Citrix-like remote desktops with GPU passthrough of graphics cards that would now be e-waste if made by nvidia. Intel is now in the middle between the AMD and nVidia philosophies, and I hope it will land on the AMD view of the HW/SW stack.
Personally, I see DLSS as a crutch technology; it should be advertised as a way to keep your e.g. x080 viable for another generation, or three, not as something to make games playable when the card's still pretty new. Edit: fixing spelling mistakes and such like.
Like AA or anything else that improves quality over rendered resolution does? Crutch. Do you play games, or freeze-frame and pixel peep? DLSS gives you a choice, and when I choose Quality I get more FPS at the same visual quality on the same card. It's like a 5th gear that gives me better mileage. If it was not valuable, AMD would not be bribing bundle partners like Starfield to not support it.
it's not a 'crutch' technology. There's more to making a game run nice than just raw hp. DLSS is an excellent feature and a great example of hardware and software harmony
AMD users will have to adjust settings in Radeon software 'to get a stable experience' whereas Nvidia users will just play the game without doing any adjustments. And when RT is turned on, AMD goes insane hahaha @@xwar_88x30
In the "new" market in europe (mostly france) : - For 599€ you got a brand new 3080, a 4070, or a 6900XT (the 6950XT is 629€) - The 6800XT is 60€ less I'm still waiting for the 7xxx from AMD. I don't get their strategy. Still, there isn't 7700 or 7800, XT or not, and they released the 7600.
Recently upgraded to an rtx 2070 super for 150€ (card needed to be repasted, not that hard tbh). I'll be good for a few years, or I can snag something newer on the cheap. Edit: I bought the gpu with a mobo combo (i7 9700k, 16gb ram and an aio cooler) for 200+150=350€ and sold my old setup for 400€ with screen and other peripherals
@@RFKG you mean 1440p? The current rig pushes 100fps in almost all games at high/max @ 1440p. I don't use dlss or other resolution scaling. I went from an i7 4770k w/ gtx 1070 that also pushed 1440p at medium settings, albeit stuttery in demanding games. Example: cyberpunk currently averages 80fps @ high to max @ 1440p
The main reason 3080's are cheaper is that they were probably the most produced card of all time. I don't know how to find stats on that, but so many miners like myself bought every one they could for a couple of years, and they have been re-selling them on ebay since then.
it's true. GA102 in general was so abundant that they made the following GPUs with them:
3070 Ti
3080 10 GB
3080 12 GB
3080 Ti
3090
3090 Ti
and those are just the GeForce ones. Their server-side cards also used the cores.
The gpu market has been confusing lately here in my country, Indonesia. There was a moment when you couldn't find an RX 6950 XT at $630 USD - it was sold at around $700 - but ironically you also couldn't find a brand new RTX 3080 at $600 USD, because it was also sold at 680-700 USD. So people who actually bought a brand new RTX 3080 were considered "stupid", since they could get a way better GPU, the RX 6950XT. And now a brand new RX 6800XT is being sold at $560 USD, just $60 cheaper than the RTX 4070, and you can't find the RX 6950 XT anymore, not even second hand
I bought a 6800xt 1 year ago. Where I'm from in Spain, for some reason AMD costs more than Nvidia. All my life I was Nvidia, except for an old ATI x800 I had; the rest of the graphics cards I have owned were all Nvidia: a 9800gx2, gtx 280, gtx 285, gtx 480, and gtx 970. Even with the 6800xt being more expensive here than the 3080, I opted for it. It was not for the performance, nor for the price, it was because personally I am up to ...... with Nvidia. First it was a board with the Nforce 790i Ultra chipset, an expensive board that did not stop giving problems; then the 9800gx2, the worst graphics card I've had by far; then the PhysX marketing crap - add a second card to get physics and blah blah blah, it never ended up working well, pure marketing; then the 3.5 GB of the 970. I ended up fed up, and for my part they can put their marketing up their ass. It is clear that streaming and RT on Nvidia are superior, but for me that was not enough to opt for Nvidia again. RT is superior but it clearly falls short on the 3000 series, so for me it is still marketing. DLSS is above FSR, there is no doubt, but who's to say that when the 5000 series comes out they won't leave me stranded like they did with the shitty PhysX, 3D Vision and other technologies? Also Intel's XeSS is not so bad and can be used on AMD. This is my humble opinion from years of owning Nvidia cards. I'm not saying that AMD hasn't had problems with their cards, but I personally haven't suffered from it. Sorry for the length and my English, I wrote it with a translator. Good video and good channel
I do not stream to twitch, I do not upscale and I seldom use raytracing (not that I could complain about the raytracing performance of my RX 7900 XTX - it is plenty sufficient). But I have games that use more than 10GB of VRAM, and raytracing increases the VRAM usage, so the 3080 running out of VRAM is kind of funny. So I have to disagree; I would choose the 6800XT anytime over a 3080.
Can you tell me what the temperatures of the card are, especially if you use a 1440p ultra wide monitor? For me, with the highest details, the temperature at the connector was 100 degrees Celsius
I got that one on Prime Day for $480 with Starfield. But good points made. The only feature I really would nod my head to is dlss. I'm all for ray tracing but it's just not that big of a difference in most games for the performance cost. I wish amd would focus on fsr 3.0.
People are overblowing the whole VRAM issue.... yeah, bad PS ports exist, badly optimized games exist, especially recently..., capping FPS is a thing, imagine that (why would you make it run more than needed unless it's a competitive game and you need that 120FPS), and a lot of the options in the Ultra preset make little to no visual difference while costing a big performance hit. .... yeah. Great video
You'll feel them in a year or so. 16GB is the new VRAM target for game devs, hence the sudden explosion in VRAM use in 2023. We went from "8GB is enough forevahhh" to some games using up to 14GB in a matter of months. When I buy a card I intend to keep it for 4 years, and for that I realized 10 or 12GB just wasn't enough, so I bought a 6800XT. I used to own a GTX1080 and saw that it was already being filled up in 2021 - I saw the writing on the wall
@@Altrop Realistically the only things using that much vram are unfinished garbage piles, or raytracing if you're set on it, but I don't think a 4070 is really the card for raytracing anyway. Until consoles move away from their 12-ish GB of usable memory, that will be the "benchmark"
1080p 240Hz player for all games and I never use RT with my 3080 or 6800. It's not worth trading speed for visuals, which I couldn't care less about. I get awesome visuals at 1080p and I'm not going down that rabbit hole to compare 1440p or 4K; I'll leave that up to you guys to complain about. Speed & efficiency for the games I play is absolute! Enjoy 🙂
I wanted to buy a 3080 2.5 years ago; ultimately it became a 6900xt. At that time I had a FHD 144Hz monitor, then switched to WQHD 165Hz, and meanwhile I have a 4k OLED for gaming. Currently 12GB would also be too little for me, if only as a mental thing: with every stutter I would ask myself whether it is perhaps the VRAM. That's why I'm quite happy not to have gotten a 3080; back then I would not have thought how quickly the memory would become a problem.
I disagree with the VRAM statement. The stutters and graphical artifacts are a dealbreaker for me. I had a 3070 that ran out of VRAM and the drawbacks were unbearable; the additional 2 GB of VRAM on a 3080 would not have resolved the issue. Moreover, the raytracing on the 3080 is quite overblown with how good it supposedly is… and it requires more VRAM. So what would you choose, better textures overall or better lighting?
It is possible that something else in the system besides the GPU draws more power, but the selling point about power consumption certainly doesn't seem like something that would matter in this case.
The 3080 shot itself in the leg with 10gb of Vram. I would buy a 3080 12gb, or a 3080 ti which also has 12gb, if it's not significantly more expensive. Was Smart Access Memory turned on when you tested both cards? In general both cards are the same until you turn on SAM, then the 6800 xt starts pulling ahead in nearly all games except RT.
I got the 6800xt in November, and I love it. First, I had the good luck of getting a card without the annoying coil whine, so that made me happy (I bought the Gigabyte OC version). Second, I truly love Adrenalin, because there's no need for Afterburner to adjust voltages, fan speeds or whatever, and it's very easy to measure thermals without using third party software. To be honest, DLSS is a major feature I would like to have on my card, but considering that Nvidia didn't support the 30 series cards for frame generation while AMD promised FSR 3 will be supported on the 6000 cards, that seemed like the deal. If AMD's promise is delivered, the 6800xt would destroy even the 3090ti at half the price. We'll see if it's true... but in the end, the 6800xt seems like a good deal if ray tracing is not what you want. I'm not a content creator or editor, not a streamer; I just use the card to play, and even without FSR 3 I love my card. No driver issues, more VRAM, more rasterization performance, and it includes Adrenalin. All for less money. Shut up and take my money!!
The reason Nvidia skimps on Vram is so you have to buy the next gen, and they give you dlss to get by. Amd give you the VRAM and get blamed when there's not much improvement next gen, and offer a serviceable FSR.
Got a 6800xt paired with a 43 inch 4k screen and I can tell you that using Quality fsr at 4k I cannot tell the difference, and at 1440p with these cards you don't even have to use upscaling.
If the extra 6GB of VRAM gives you 2 extra years of gaming (let's say an additional generation of GPUs) before forking out new money, it's worth $112 a year, without taking into account the resale value of both GPUs after respectively 2 years (for the 3080) and 4 years (for the 6800). And that's not considering if you do more than gaming on the GPU.
great points. personally I like to use a driver-only install with AMD GPUs for lower overhead, plus Afterburner for undervolting is just simpler. Though I do lose the ability to overclock my monitor, as well as run - not sure what the setting is called, but it allows you to uncap the FPS without tearing - which on this rig can be pushed to 78Hz vs 75Hz. Not a big difference, and you lose FreeSync with the overclock, so it's not worth the overhead of the Radeon software; I just cap the frames @ 74FPS, which is all this RX570 can handle with flat-line frame times anyway.
I just purchased a Red Devil RX6800XT used on eBay for $395, $445 after tax and shipping, and I'm waiting for delivery to upgrade my RTX 2080 Super. I plan to undervolt, and I don't use ray tracing or DLSS. I mostly play Fortnite with TSR enabled at high-epic details; it's all about latency for me @ 1080p 144Hz. I plan to test it with my 4.75GHz R5 5600X, then test it with my R5 5600X3D, which I don't use at the moment because it runs hot on my 240 AIO. I may have to pick up a 360 AIO if the 6800XT performs better with it vs its non-X3D sibling, which is fine - I would like to use this 240 AIO in my Rampage III Gene X5690 Xeon machine paired with my ASUS Strix RTX 2070 Super, which currently has an XFX RX570 that I customised with an XFX RX580 cooler, 2 more heat pipes and a direct-contact ram plate, and which I also undervolt and overclock. I got the untested RX 580 for parts on eBay for $20 - I'm pretty sure it was just dirty and needed a repad/repaste - but my RX 570 is basically new and I didn't want to risk installing a dead GPU in any of my current hardware
I have a 3080 10gb. What's your opinion on what to upgrade to? The 40 series is so pricey, but the amd cards are not that much higher performance. Edit: Cheers guys! I will stick with the card and hope it doesn't conk out soon :-P
Keep the card for as long as you can and pray that next gen doesn't suck as hard as this gen, and that AI doesn't become the next crypto boom. The only issue you could have is VRAM limitations, which is a good old Nvidia planned obsolescence strategy; other than that you should be good for a while.
What really fucking sucks is that it's been two fucking years since this video was made and the RTX 3080 still costs the same on the used market. I honestly hope it goes down after the 50 series launch; literally my entire PC, including a Ryzen 5 5600, is cheaper than what an RTX 3080 goes for right now 😢
Literally just bought a Red Dragon 6800xt today, an open box special at $430. Couldn't postpone it anymore; coming from a 5700, I hope to keep this around for a few years. Maybe next time around I'll go green, when RT is more optimized and adopted by more games
(Used) 3080s are cheap because there's an extremely high probability they were mined on. As crypto drops, there's little incentive left to mine, so the smaller farms get disassembled and sold. We will not discuss the probable condition of those GPUs (which may vary greatly depending on the maintenance of the farm). The fresh 3080s are indeed priced cheaper, because there are only so many advantages they have over the AMD competitor (6800XT): raytracing, DLSS (no frame gen on the 3000 series btw) and CUDA, of which only CUDA is not optional (if you actually need it as a content creator, that is). On the other hand, the 3080 has only 10GB of VRAM, which is enough for now, and may (keyword may; it gets filled up in some games on ultra, but those are just singular examples... for now) last you until the 5000 series; the 6800XT does not have that "may". Overall, I'd say the 6800XT is a more balanced long-term solution, "the 1080Ti of the current era", while the 3080 is an "I want the eye candy now and then I'll switch to a 5070 as soon as it launches".
I am absolutely thrilled to share that I recently snagged an incredible deal on a 6800 for only $353! Needless to say, my excitement is through the roof!🥰
tbh some of us, well, most of us grew up without ray tracing. why need ray tracing when you can get higher fps at a higher resolution? honestly, ray tracing is a useless invention when we can use its much easier counterpart, also known as pixels.
I had a Red Devil 6800 XT and took it back when the 6750 XT refresh came out. I should have kept it. Incidentally, 6950 XT Reference cards are only $550 at Microcenter right now. Pretty comparable to the 7800 XT, or better in several games.
It's just a shame that Nvidia doesn't want to give enough VRAM to use their GPUs to the fullest. I can forgive the 3000 series, 8GB was plenty at the time, but it was becoming clear early enough in the development of the 4000 series that they needed to increase the VRAM on every SKU except the 90s. If AI wasn't their focus right now, they probably would have, but because of AI, Nvidia just doesn't care.
When it comes to VRAM: as it seems right now (if I'm not mistaken), there are about 4-6 games that (for whatever reason) exceed 8, 10 and even 12 GB at a resolution of 1080p. To me, that is entirely different from when Ray-Tracing started getting games, as it was obvious it'd take years before buying a graphics card just for Ray-Tracing was a viable option (or at least a wise one - besides, how good was the RTX 2000-series with Ray-Tracing, really?). *This*, on the other hand, looks like the start of a trend, with VRAM usage spilling over the 8GB limit (and 10 and 12 following shortly, would be my guess). And I'm guessing it'll happen at a rate at least twice the rate of the adoption of Ray-Tracing (at least after 2024). To me, Ray-Tracing is an add-on, a luxury item. But being able to play games at 1080p, on a card (that was released in the roaring 20's, mind you) that had an MSRP of $700, is (with exceptions, of course) the least one should expect.
10GB is veeeery borderline for 1440p. I very often see over 9GB; over 10 is less often, but we are getting there. I don't know how long the 12GB on my 4070 will last. Ideally I wanted more VRAM, but there wasn't any alternative in this price range. I didn't want to go with the 6800XT and lose DLSS 2/3 and the better RT, since I only play single player games. Fingers crossed for the 12GB then :P
That's all Nvidia is good for - borderline VRAM. It's a scam on their GPUs, making people upgrade when their 3070/3080/4060/4060 ti or whatever other BS Nvidia GPU starts to stutter in VRAM-heavy games. 16gb is the safe spot at the minute; anything over 16gb is even more future proof.
The prices are WAY different in my country. A used 6800 xt has the same price as a 3060 or a 3060 ti. Which makes it a hugely better deal than nVIDIA ones. 3080 is much more expensive than 6800 xt in here.
I've purchased the 6800xt. I am having nothing but issues with it. Stuttering games, flashing desktop, so on and so forth. I started to look into it (I haven't owned anything amd for a while), and I noticed that people have been complaining about the stuttering for literal years!? There are workarounds everywhere! This is crazy to me. I am returning the card tomorrow and getting the 3080 instead.
For the last 3 months I've been testing used 3080 and 6800xt cards, all gotten for under $350 in great condition. I decided to keep the NVIDIA card in the end (a Gigabyte GAMING OC) because of the better performance with my QUEST2 for PCVR. I really fell in love with VR since I bought the Q2 (mainly used for simracing). In the titles I play in desktop mode (4k 120Hz TV), I never really saw a VRAM bottleneck. Maybe having a 5800X3D helps as well, but that's my user experience
Have the 3080ti and a 6800xt and usually can't tell them apart. But I think they are both CPU limited, one on a 9900KF at 5.0 and the other on an 8700k at 5.0. I will say the textures in TLoU look muddy, as it's likely loading lower texture quality. Hardware Unboxed does some really good VRAM testing you should check out.
"FSR doesn't look as good as DLSS" that's funny you say that right after showing a great comparison where both XESS and DLSS failed miserably at properly rendering a reflected light source, while FSR was the closest to the actual render. So that is anything but clear-cut. TBH all have problems, and I don't like the idea of using any of them unless I'm forced to. But if you really enjoy looking at an AI fever dream and pretending it's an actual render, you do you.
My goal was to get something that allowed me to max out details and textures in open world environments without worrying about stuttering or skipping, and upscaling isn't that important to me. At a reasonable price of $400, the 6800xt wasn't a hard decision.
Bro, increase your parameters from the defaults through Adrenalin. The fan speed and power limit at Powercolor's defaults are kinda low. Run that bitch at 2000rpm with the power slider maxed and an undervolt & OC and it will be even faster.
Where I am 3080s are going for about the same price as RX 6800s on the used market. RX 6800s are typically going for around RM1400, and the 6800XT a cool RM1800, RM400 more than the 3080. For context, RM400 is roughly $88. I paid RM1600 for my 6800 (had to pay extra 200 for the reference card tax, since this card is rare af) just last month and have not used RT in any of the games that I've played. I personally don't feel the necessity to turn RT on, games look more than good enough as it is and I prefer the higher framerate. I only play at 1080p so I don't see the point of using either DLSS or FSR; the image quality degradation is too great at this resolution. So I don't think I'm missing out too much by not getting a used 3080.
I bought mine as part of a Cyberpower PC build where I picked my parts.. didn't feel like dealing with it this time. But I have an XFX 6800XT, and it costs more because it is faster in most games and it has 6 more gigs of VRAM!! It's hands down the better card and anyone who plays Hogwarts knows it.. The 10 gigs is barely enough for brand new games and I think that's why it flipped around as far as pricing goes. Also the 6800XT is even slightly faster than the 3080TI in Hardware Unboxed's 50 game benchmarks too..
Also depends on the card manufacturer brand. Do you have any idea how much EVGA 30 series cards are going up in value? I was just barely able to buy a 3090 before they started going past 1,200, and I've seen a couple of the Kingpin series 3090s go for over 2k
Gotta love it at 3:54: "the nvidia card is pulling ahead pretty significantly" (+14%), then 10 seconds later, on the same game for amd, it's just "the amd is actually pulling ahead" (+14%). I feel like for the rest of this video nvidia is gonna get way better choice of words. Smh🤥
@@vextakes that doesn't change the choice of words that was used though. I see what you're getting at, but I'm saying using words like "significant" just sounds like way more than what actually took place within those 12 seconds at that time stamp.
Three years ago I really wanted the 6800 xt, however the mining craze started and there were no amd cards in my market, so the only option was the 3080 10gb :( it ain't a bad card but the 10gb buffer is definitely holding it back
Now for a card that competes with the xx80, you have to pay $1000, just because Nvidia raised the price of the xx80 from $700 to $1200. This is duopoly cooperation, not competition. The funny thing is that the 7900xtx is relatively slower than the 6800xt was, but costs $1000, pretending to be like the 6900xt when in reality the relative performance is not even close.
6900xt = 3090 -7% -> $1000
6800xt = 3090 -15% -> $650
7900xtx = 4090 -20% -> $1000
6800 non-xt = 3090 -24% -> $580
I got my Asus TUF RTX3080 (OG model, 10GB) at launch during C19 at basically retail price (MSRP) and have been happily using it as my main gpu for my 1440p gaming needs (I recently upgraded to an RTX4090 & 4k HDR monitor), with DLSS 2 upscaling in some newer, harder to run titles for moderately higher fps and a much better experience, rather than sitting around 60-70fps. I love a high refresh rate experience. I do also have an Rx6700XT in a spare PC which I use at my partner's place. Both gpus mentioned here have their place in the market and feature set depending on your individual needs.
When next generation games come out, 10gb is going to be a problem. Also, so many were used for mining - those two elements. There are so many more of them about. The 3080 used is a bit pricey. Still a good card.
I want to buy a new GPU. *BUT* I have yet to see a review/video for gamers that clearly identifies a video card without too damn much mumbo jumbo. I am NOT non-technical. I have been around since BEFORE anti-aliasing. I remember when nVidia introduced anti-aliasing and reviewers were nah-this and nah-that, f-ing short sighted. Even so, I realized that while it was a step in the right direction, it was "not there yet". I bought their cards but didn't turn anti-aliasing on for performance's sake. Then nVidia introduced cards with ray tracing and reviewers were nah-this and nah-that. But look at what is happening in the industry, cards and games! Now PLEASE, just tell us what card is the next damn 1080TI that will last for 4 (four) more generations. I'm sick of the BS and price gouging. Anyway, whatever (potato), now I'm waiting for the nVidia 5000 series and the AMD/Intel competing brands... Nothing in this generation has been identified as a MUST buy. 👎 BTW, I'm still rockin' a 1060 and I was ready to buy a 3080, etc. if it weren't for the BS that happened in 2020.
That is definitely a hot take on the 16GB not being a big deal. There's a LOT of other youtube tech channels that would vehemently disagree with that statement.
Hi, I was thinking of upgrading my GPU. At the moment I'm using an ASUS ROG Strix GeForce RTX 2060 OC edition 6GB GDDR6 and my CPU is an Intel Core i7-9700K, 3.6GHz, 12MB cache. Can you please recommend what GPU to buy for better performance!!! I'm playing Destiny 2, Overwatch 2, CS, Valorant. Thank you
I still had a 1070 until a month ago and only upgraded to get a card for more modern titles. I undervolted my 6800XT, losing like 1% performance (the way I did it, at least, only has that little impact), and I have about the same power draw as the 4070. Raytracing does not exist for me. Whenever people show their benchmarks with RT enabled, I can never tell the difference anyway; maybe because I don't look at the game but at skill bars and crosshairs. Being able to run Elden Ring at 1440p with just a 90-120 watt power draw beats my 1070 (175+ watts) by a long shot. I don't regret the purchase.
FSR 2.2/2.2.1 is amazing actually; you won't notice the difference with dlss unless you really look for it. But 2.2 was made with racing games in mind; modded in, 2.2.1 looks great in games.
because the RTX 3080 was used a lot in mining rigs (better hashrate), and those cheap listings are probably in terrible condition (rusty, dirty and in need of a repaste)
This is a hugely wrong view. I've been mining with cards for years and they still perform exactly the same as when they were brand new. Most miners learn how to reduce heat and power for max efficiency, which is far easier on the cards than gaming.
@@professorchaos5620 running 24/7 on an open bench, inside a humid room, never getting cleaned, yeah sure 🤣 most used GPUs were in terrible condition. Dust combined with humid air is the worst; it becomes black gunk that's hard to clean, the fins & IO shield get rusty, and some of the fans start to fail. I saw it myself, most miners don't give a shit.
@@notchipotle honestly, quality issues are a huge one with used cards, regardless of whether they were mined or not. There are some mining operations out there that do keep their rooms dry, clean their cards, and try their best to take care of their hardware. The best way to grab a used GPU is to get one locally, arranging a meet-up instead of delivery. You can even negotiate pricing while you're at it, and at the same time you can inspect the quality of the card carefully. Oh, and by the way, if you have any safety concerns, just know that you can do this right in front of your local police department.
@@Very_Questionable I've been selling used PC parts since 2011. Even the most careless gamer can keep their GPU in good condition for at least a year, but mining cards already look like crap after a couple of months. You can say whatever you want, I was just describing the reality, not a personal opinion 🤣
Those extra features need to actually be used to the max if the admitted planned obsolescence of less VRAM is going to make you spend more money sooner. DLSS needs to be the difference between playable and unplayable in more than just RT use. In something where it would be foolish to even bother with RT, like multiplayer, DLSS can have odd results (more than just IQ) compared to native. Coming from a 1080ti, the 3080 10GB never made sense to me. Beyond just on paper, I've tested it vs a 6800XT and a 3090. The 6800XT was at least on par or near it for less money. Worse was needing a 3090 in places where the crutch of DLSS wasn't an option. The only place I absolutely needed more performance was to play DCS in VR (G2), and the 3080 was subpar due to VRAM, and the 6800XT due to bandwidth, but basically the same. OpenXR could use open source FSR, but a 3090 wouldn't need it. A squad mate had similar complaints about the 3080 stuttering in the same scenario, which further confirmed my suspicions about the 3080. The 3080 is not worth buying because that frame buffer is obviously going to be an issue sooner in other places too. CUDA and H.264 need to be equally if not more important (actually-making-money important) than gaming for it to make sense, IMO.
🚨🚨 NVIDIA SHILL ALERT 🚨🚨
real
I love Nvidia! Daddy Jensen make me into one of your jackets
I'm coming back here in 3 days with snacks to read the comments.
lmfao 😂
"muh dLsS and Rtxsss" vs "yeah but that 16 gigawatts of Vruuuuum!!!!"
Bruh please ignite some fire and go
Coconuts are good for snacks.
Ayy
Texture sliders.
So long as you keep the VRAM in check, it'll be faster, no?
@@Ober1kenobi but that's the issue- I don't want to have to think about keeping my vram in check- 4K textures are one of the best ways to improve the image- whats the point of playing at high resolution using low resolution textures? PC gaming and the pursuit of smooth frametimes is finicky enough without having to worry about vram as well
Well, I can easily see the difference between DLSS Quality and FSR Quality at 4K rendered resolution. Even on my old 1080p screen, let alone 4K screen! DLSS provides much better AA quality and less ghosting. The overall image looks way smoother with DLSS.
Plus, RTX cards also offer DLAA - the best currently available AA solution. And also DLDSR, which is an AI driven downscaler.
@@stangamer1151 Cope. I will go nom nom on my vram.
I just bought a used rx 6800 non xt for 350 usd. I came from a rx 6600xt and I cannot imagine using anything faster. I'll definitely keep it for as long as I can. And to add I never see my card go above 200w, and that is with a slight tuning which makes its just that much faster. I love being able to max out settings and not run out of vram.
Bought a 6800 XT Sapphire Nitro+ for 355 usd....
@@hiriotapa1983 Awesome! What a killer deal!
what processor do you have and what psu? I have rtx 2070super and want to upgrade to rx 6800 or 6800xt with mine r7 5700x and I only have 650W psu 80+ bronze, i think 2070super runs with same watts like a 6800
@@Adrian-tl5ue it's good enough
@@Adrian-tl5ue It works, yes - I also use a 650 W 80+ Bronze!
I was in this boat and grabbed a used 6800XT over a used 3080. There are two main reasons for this:
1. 16 GB VRAM, self explanatory. Yes I bought into the fearmongering, boohoo.
2. GDDR6X is an actual risk when buying ex-miner cards, GDDR6 not so much.
This is really all there is to it.
I've already had a 6700XT before so that kinda made it easy to know what to expect. Might be a little harder to risk going a different brand for others especially in used market. Though I don't think 3080 is cheaper because its valued less, I think it has more to do with just how many 3080's Nvidia produced and sold to miners. I just don't think there are as many 6800XT's out there total.
G6 might not be as big a risk as G6X, but I think it really depends on the AIB rather than the card itself. Also, fast G6 can produce just as much heat as slower G6X - look at the 7900 XT.
The reference 6700 XT, for example, has G6 but is known to run its memory close to 100 degrees. It's all about how good the cooler is.
@@person1745 well if you're buying an ex mining card known to run memory at 100c you're kinda asking for problems.
I think the fearmongering about 8GB is valid. 12GB on a 192-bit bus is probably enough, but I wanted 256-bit, and that's 16GB.
@@person1745 AMD overclocks those GDDR6 cards from the factory, which is why they produce far more heat than they should. 20 Gbps seems out of spec to me for non-X GDDR6; I think they did it out of desperation for more performance when they realized the RX 7900 series wasn't hitting its target numbers at 16 or 18 Gbps (stock GDDR6 speeds). Plus, on my 7900 XTX, if the VRAM clocks itself up to its full 20 Gbps on the desktop, the card consumes something like 80 W idle, versus about 10 W idle with the VRAM clocked down... yeah, they pushed and overvolted these RAM chips.
@@forog1 For what it's worth, you can undervolt the SoC with MPT to reduce power draw a little bit; it works on RDNA2, but I can't guarantee anything for RDNA3.
I grabbed a 6800 XT for $150 less than the cheapest 3080 on the used market, and 4070s are still over $300 more. I don't know why anyone would spend more when AMD is an option, but ultimately, besides the great price, the 16GB of VRAM was really what made me buy it.
A couple of weeks ago I swapped my 6800 XT for a 3080 (10GB), and the only reason I traded with my nephew was for AI. Many say the AMD GPU can run Stable Diffusion, but I could never get it to work. So I swapped with my nephew, and SD works flawlessly now. Plus the games I play run about the same on both GPUs, so nothing lost there.
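On the Stable Diffusion point above: most front-ends simply look for a CUDA device through PyTorch, which is why an NVIDIA card tends to "just work", while AMD needs the ROCm build of PyTorch on Linux (the ROCm wheels reuse the torch.cuda API) or a DirectML fork on Windows. A minimal sketch of that device check, assuming the Hugging Face diffusers package is installed and using runwayml/stable-diffusion-v1-5 purely as an example checkpoint:

```python
# Minimal sketch: pick a device and run one Stable Diffusion generation.
# Assumes `pip install torch diffusers transformers accelerate`; on AMD this
# needs the ROCm build of PyTorch (Linux), which exposes the same torch.cuda API.
import torch
from diffusers import StableDiffusionPipeline

MODEL_ID = "runwayml/stable-diffusion-v1-5"  # example checkpoint - swap in your own

if torch.cuda.is_available():                # True on NVIDIA (CUDA) and on ROCm builds (HIP)
    device, dtype = "cuda", torch.float16
else:
    device, dtype = "cpu", torch.float32     # CPU fallback works but is extremely slow

name = torch.cuda.get_device_name(0) if device == "cuda" else "CPU"
print(f"Running on {device} ({name})")

pipe = StableDiffusionPipeline.from_pretrained(MODEL_ID, torch_dtype=dtype).to(device)
image = pipe("a red dragon graphics card, product photo", num_inference_steps=25).images[0]
image.save("out.png")
```

If torch.cuda.is_available() comes back False on a Radeon card, the install is almost always a plain CUDA wheel rather than the ROCm one, which would match the experience described above.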
Good choice my friend - the 6800 XT will age like fine wine and outperform the 3080 as time goes on; with more VRAM it even matches or beats the 4070, so it'll age very well. Nvidia cards get weaker as they get older while AMD GPUs just get better, as plenty of videos have shown. Nvidia has been disappointing since the 2000 series. I'm hoping AMD can knock Nvidia off their high horse a bit, since they're taking the complete mick out of their consumers. As I've said in another comment, look how much smoother the frame-time graph is on the 6800 XT compared to the 3080 - AMD is definitely the smoother experience. Even the 5000 series had smoother frame times than the 2000 series. Nvidia is more interested in marketing RT, which tanks FPS on every GPU, plus DLSS and AI - it's all "buy our product for these features" while their 3070/3080/4060 Ti suffer from VRAM limitations. Absolute joke. If the 3070, 3080 and 4060 Ti had 12GB or more of VRAM they would be amazing GPUs, but Nvidia is more interested in ripping customers off and forcing them to upgrade. Horrible greed of a company.
@@xwar_88x30 I would agree normally, but the 3080 vs 6800 XT seems to buck the trend. If you look around you can see a bunch of people and places redoing the comparison, and contrary to the norm, the 3080 has overall gotten a little stronger against the 6800 XT. I came here from a video comparing them across 50 games, and the 3080 was on average faster than the 6800 XT by a higher percentage than a couple of years ago. It wasn't much of a change (around 9% on average at 4K versus the 7% it was a couple of years ago, and 1440p was up by about the same), but it bucked the usual trend of AMD cards gaining on Nvidia. I thought it was funny.
If I had to pick, though, I would still go with the 6800 XT, because if you keep watching you can find them new for around $500 or a little under sometimes. You are not going to find a 3080 new in that range, or even in the same zip code.
@@xwar_88x30 lol no
@@jamesdoe7605 settle down there james, dont be silly now.
That's cool that you're using the RX 6800 XT for editing - that used to be firmly Nvidia Pro territory.
The fact is AMD has improved its software a lot, and CUDA-style compute support via ROCm is coming along too.
The thing is, Vex has taken the plunge late in the game. There were deals on the AMD cards long ago, so you'd have had the use out of one instead of waiting for 3080 prices to settle on the used market. Starfield is nice, but not relevant to everyone.
Unfortunately a lot of people ignored the MSRP reductions and lower prices, scared off by all the FUD put out about drivers etc. That doesn't send the market leader the signal it needs to reduce its margins.
Gamers complaining about prices isn't enough; AMD has to be able to justify investment into features.
That requires sales when the products are close, because development costs are a real thing.
I thought it was considerate of NV to price their 4080 so high. It gives the 7900 XTX a chance to make money for AMD as well. They really are gentlemen.
@@molochi well if Navi31 had met its expected performance targets the 4080/4070Ti would have looked very stupid, weak and totally over-priced.
Scott Herkelman intended to "kick Nvidia's ass".
So the 7900 XT was supposed to be crushing the 4080 12GB, and the XTX the 16GB version, while being much cheaper. Only, fixes to the driver caused a severe performance drop, and they haven't found a general mitigation. Radical changes in architecture are inherently risky; RDNA3 has disappointed.
Had the targets been met, Nvidia adding features like fake frames and DLSS would not have been enough; they'd have been forced to cut prices or cede market share. Nvidia were maintaining high prices because of their vast RTX 30 GPU stockpile. Now they're cutting production to the contractual minimum rather than sell them cheaper; they figure gamers will relent eventually and cough up.
The big money is in AI at the moment, so they're hoping to max out Hopper.
The "emulated" CUDA is not nearly as good as actual CUDA, and I hear AMD is going to do the same with tensor cores as well. So not as good, but way better than nothing. If you own AMD already it is a huge win; if you are buying a new GPU and will be using CUDA/tensor workloads, you are better off with Nvidia. Video editing and other productivity results are going to depend mainly on the software you are using - Premiere Pro favors Nvidia much more, while some free video editing software may or may not be more of a contender.
@@ronaldhunt7617 CUDA is a way to program a GPU: it takes source code and compiles it for an application. Calling it emulation just shows your game is spreading FUD and confusing people.
The AI boom is about large models and so-called deep learning, with huge demand for data center products.
What you need for running neural nets versus training them is very different; laptops are coming out with AI acceleration in cooperation with MS.
Like Apple has had, without any Nvidia hardware - an accelerator on the CPU die.
@@RobBCactive Not sure what you are trying to get at here. Nvidia has dedicated CUDA cores that process independently, making certain computations a lot faster. AMD does not have CUDA cores; instead they (just like in everything else) saw what Nvidia had done and added stream processors, which are not the same and not as good. Just like ray-tracing cores, and now it seems tensor cores (for AI), but only for the 6000 series and newer - not to mention upscaling and frame generation (which AMD does not have as of now). Monkey see, monkey do... but not at the same level.
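To make the terminology in the exchange above concrete: "CUDA cores" is NVIDIA's name for its shader ALUs, while CUDA itself is the programming model and compiler stack that targets them; AMD's stream processors are the equivalent ALUs, and HIP/ROCm is the equivalent stack. A minimal sketch of what "programming the GPU" actually looks like from Python, assuming Numba's CUDA backend on an NVIDIA card (the same idea on AMD would go through a HIP/ROCm toolchain instead):

```python
# Minimal sketch: a tiny GPU kernel written against Numba's CUDA backend.
# Requires an NVIDIA GPU + CUDA driver; the CUDA name really refers to this
# programming model, not just to the ALUs.
import numpy as np
from numba import cuda

@cuda.jit
def saxpy(a, x, y, out):
    i = cuda.grid(1)                 # global thread index
    if i < out.size:                 # guard the last, partially filled block
        out[i] = a * x[i] + y[i]

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
out = np.zeros_like(x)

threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
saxpy[blocks, threads_per_block](np.float32(2.0), x, y, out)  # Numba copies the arrays to/from the GPU

assert np.allclose(out, 2.0 * x + y)
```

Whatever the marketing names, the hard part is this software stack and the libraries built on top of it, which is where the practical CUDA advantage being argued about actually lives.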
After seeing how ridiculous the prices were getting on the new generations of cards at the end of last year and start of 2023, I decided to grab one of these 10GB 3080's used for a deal. The crop of new games which exhaust its VRAM started showing up almost immediately after this, so I probably should have waited a few months :D Oh well.
what about medium settings and dlss/fsr?
Imagine thinking about using medium settings with a last generation high end card. Pathetic @@michaelilie1629
@@michaelilie1629 I'm mostly focused on 1440 since that's my native resolution, so I'll just have to turn off eye candy until it hits the gsync window
Same but i got a 3070 so im so screwed 💀
@@michaelilie1629 Yeah i guess he bought a 400-600 € / $ card to play on medium settings. Nice advice^^
The question is whether you need more than ~80 fps in games that require so much VRAM. I think the 3080 should do the trick for the next 2 years.
But since I don't care for RT and the other stuff Nvidia offers, I got myself the 6800 XT anyway.
Make sure your 5900X is running a negative all-core offset in Curve Optimizer (start at -15), and if you're not running Samsung B-die, get a set of 3600 CL14 and really tighten down the timings. This will give you a noticeable uplift on your 5900X.
Hey, quick question: my RAM (G.Skill Sniper X 3600 CL19 @ 1.35V) is detected as Samsung B-die in Thaiphoon Burner, but when I try to tighten the timings by even 1 it refuses to boot, even at 1.45V. The weird thing is I can lower the voltage to 1.25V when running XMP. Is it possible I got trash B-die, or am I doing something wrong? My CPU is an R5 5600.
@@therecoverer2481 It's the B-die. I had the same issue with my 5600X using Silicon Power RAM from Amazon. I ended up swapping to some Team Group T-Create 3600MHz CL18 and was able to undervolt and adjust the timings.
@@therecoverer2481 3600 MT/s with C19? That's definitely not B-Die. Guaranteed B-Die bins would be the likes of 3200 C14, 3600 C14, 4000 C15, etc. C19 is incredibly loose even at 3600. You shouldn't trust Thaiphoon Burner as it is known to have false positives, and in this case, it is glaringly obvious this is one of those false positives. I almost would say those ICs could actually be Samsung C-Die as they are known to become unstable over 1.35v, no matter the timings. It would also explain the very loose primary timing(s). I'd get ahold of a 3600 C14 kit as this is the best sweet spot for performance on Ryzen. Getting kits over 3600 MT/s isn't beneficial as they aren't necessarily better bins, but pricier; and, you almost can never go over 1900 FCLK which is 3800 MT/s when synced 1:1. Some Zen 3 samples may not even be able to do 1900 FCLK and need to step down to 1866 or 1800 (3733 MT/s and 3600 MT/s respectively). A B-Die 3600 C14 kit should easily do 3800 at the same timings on stock voltage most of the time.
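A quick way to sanity-check the "tighter timings" advice in the replies above: first-word latency in nanoseconds is roughly the CAS latency divided by half the data rate, so 3600 C14 responds meaningfully faster than 3600 C19 even though the bandwidth is identical. A small sketch of that arithmetic, using the kits mentioned in this thread as the inputs:

```python
# First-word latency in ns ~= CAS latency / (data rate / 2) * 1000
# (DDR transfers twice per clock, so the I/O clock is half the MT/s figure.)
def first_word_latency_ns(mt_per_s: int, cas: int) -> float:
    io_clock_mhz = mt_per_s / 2
    return cas / io_clock_mhz * 1000

kits = [("3600 C19", 3600, 19), ("3600 C14", 3600, 14),
        ("3200 C14", 3200, 14), ("4000 C15", 4000, 15)]

for name, rate, cl in kits:
    print(f"{name}: {first_word_latency_ns(rate, cl):.2f} ns")
# 3600 C19 ~ 10.56 ns, 3600 C14 ~ 7.78 ns, 3200 C14 ~ 8.75 ns, 4000 C15 ~ 7.50 ns
```

It's a rough figure (secondary and tertiary timings matter too), but it shows why a 3600 C14 B-die kit is the usual Ryzen recommendation.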
This video aged like milk. The 6800 XT now outperforms a 3080: ray tracing tanks your frame rate in any modern game, DLSS looks like someone smeared Vaseline on your screen, and driver support has pushed the 6800 XT way past the 3080.
I would argue that VRAM matters for PC gamers who are not into competitive gaming but rather into the modding scene. NVIDIA is doubling down on this with RTX Remix on the RTX 40 series, because they know demand from the modding scene in the PC landscape remains high. Skyrim is almost 12 years old, but to this day it has not slowed down - if anything, its modding scene just keeps hitting new peaks (check the view counts on YouTube or the player counts on SteamDB; it's still relevant).
High VRAM capacity and memory bandwidth are important in Skyrim. I have 1,750 mods installed on my 5900X + RX 6700 XT, and even with my 12GB of VRAM there's a lot of planning about which mods to keep because of the VRAM demand.
I agree. Skyrim modding also requires the fastest CPU to play smoothly, so an X3D CPU is almost a must.
Very true - Skyrim is one of those games that demands a lot not just from the GPU but also from the CPU. Not to mention that at least a Gen3 NVMe drive is required to even make massive LODs work - system RAM too.
Skyrim's modding scene is probably the only one out there that's like a symphony: it uses everything in the PC, especially if the person knows how to mod.
@@evergaolbird Indeed. My laptop has 32GB of DDR4-3200, an i7-12650H, a 150W 3070 Ti and a 1TB Gen4 NVMe SSD. Skyrim SE still sees my CPU as the limitation, and that's despite giving the CPU all the power it needs.
@@TheAscendedHuman The mobile 3070 Ti has the same chip and specs as the desktop 3070 except for TDP and clocks; the 150W variant is within 10% of the desktop 3070.
The i7-12650H is more akin to an i5-12600 non-K (if that exists).
I'd say that's a solid setup, even by desktop standards, because honestly not that many people are buying even a 3060 Ti or 6700 XT on desktop - most have a 3060 or 2060.
My point is, modded Skyrim SE just needs way too fast a CPU. I literally cannot use my GPU to its fullest even if I let the CPU run wild; I hit the draw-call limit real fast, and exceeding it causes FPS drops.
0:05 I'm like 95% sure the mining began right when that generation launched. I remember it because it was impossible to buy any GPU at launch.
That was actually covid shortages. 2020 and the first lockdowns caused supply chain issues on a global scale
Nice breakdown. I've had the 3080 10GB since December 2021 and it's been amazing. Now i tend to play games that are a bit older because I have a rule where I wait at least a year after release for games to get patched and go on sale so I have nothing from 2023 but I've played dozens of games from 2015-2022 and the 3080 ran them beautifully at 1440P.
I don't have problems at 4K with my i7-9700... 60 fps is stable... Now I can never go back to 1080p
Right, the 16GB of VRAM isn't all that - 10GB is all you need, and Nvidia still gives you advantages.
I have had that same 10GB card for almost 2 years, and use it for 4K gaming on high settings for most couple year old games. Granted, the latest AAA games can't be maxed out but still look pretty awesome on high. I run an I9-13900K with 32GB RAM.
@@Cybersawz nice. I have 32gb of ram too but have a 3700X which I may upgrade to a 5800X3D. But at 4K, I'd be GPU bound so probably would be the same results as your 13900K there.
The problem with the encoding argument is that I don't use that feature, nor do I have a use case for CUDA, and I'm probably going to stick to pure rasterisation rather than turning on RT, because upscaling or not, the performance hit is too much. So for me the 6800 XT is the better option.
That’s the thing. Everybody talks about the Nvidia feature advantage. But beside DLSS, most features are used rather scarcely
@@chetu6792 DLSS is just another scarcely used feature. They have to pay companies to implement it in their game, how ridiculous is that?
@@evrythingis1 I mean, that's literally what most companies do to push their tech. You gotta pay to play. Why would a company offer to put your tech in their game if they have nothing to gain from it? AMD does it too. While I don't think Nvidia GPUs are particularly good value right now, since we're at the beginning of this era of AI being used to improve graphics computing, I don't think the technology is bad or a gimmick like most people are trying to say.
@@gotworc go ahead and tell me what proprietary features AMD has had to pay game devs to implement, I'll wait.
Why do you say one thing, then contradict what you just said? "I don't think the extra 6 gigs of VRAM is that big of a deal", then less than a minute later, "but the 16 gigs is awesome, to allow you to enable higher settings".
I got the 6700 XT Spectral White edition last week for £300, and I'm so impressed with its performance. 1440p ultra, no upscaling 🎉 Glad I didn't get the 4060!
I have to disagree on the feature set. And before people call me a shill: I have been using Nvidia all my life; the 6950 XT is the first AMD GPU I have owned and used.
The DLSS vs FSR argument is in most cases highly overblown. Yes, DLSS provides a better picture, BUT you will almost never notice it during gaming - only if you pause and compare them side by side. If you have to go to such lengths to spot something, then that's just not a real upside in my opinion.
And ray tracing is better on Nvidia, although not bad on AMD either. But the cards we have right now, especially the 3080/3090/6800/6900, are just not ray-tracing cards. Neither is the 4070 Ti, 4080, 7900 XT or XTX. The only card even minimally capable of ray tracing is the 4090, and even that card struggles in many ways. So if you plan to ray trace, you really shouldn't be looking at these cards.
The only upside is CUDA, but if you are just a gamer you wouldn't care about it.
And the VRAM is just so important. There are so many games I see on my 6950 XT that these days shoot past 10GB.
And I wouldn't choose the 4070 over the 6800 XT. 12GB is alright, but as I said, I've seen games shoot past 10GB - heck, even 13GB - so the 4070 would already be having issues. And that at 600+ bucks new in Europe. Just not worth it in my opinion; at that point you may as well buy a 6950 XT if they are still available.
Most people do indeed buy for raster, and in that case most AMD cards just beat Nvidia's.
For me at the moment it's between the 4070 and the 7900gre but I'll have to see where that price lands in Aus. The 40 series power consumption is pretty compelling.
Not sure if this also matters to you, but the 4070 has a nicely small form factor. Can fit into a pretty wide range of case sizes
The 7900GRE will be limited to pre-built pcs. You might as well get the 4070.
Get the 4070, AMD is garbage beneath the 7900 xt and xtx
@@Dereageerdercap The 6950 XT definitely gaps or beats the 4070 at times.
@@chriswright8074 I think he meant anything in the 7000 lineup, but I get your point.
Could you include a software showcase comparing the NV Control Panel and AMD Adrenalin? I've used both in recent times and AMD's is way better, especially since it comes with an OC/UV suite that's far easier to use than having to resort to Afterburner. These things should be considered when doing a comparison.
See I’d love to see this too, haven’t had an AMD card in years but hated the software back then and have heard it’s improved a lot. Never really had an issue with nvidia
@@Raums I've tried both in the last 2 years. As I always undervolt my GPU, I must say AMD's is way easier and works better. Needing third-party software maintained by basically one outside person to undervolt a GPU from the market leader is embarrassing, imo. AMD's software is great these days, and the drivers, while they sometimes have issues like Nvidia's do, get way more performance improvements over the months compared to Nvidia.
They always leave this important fact out.
But is CUDA really that important for a gamer?
We are not all content creators! I don't give a shit.
The fact that AMD has a built-in OC/UV utility is actually a bad thing. They have already used it to artificially limit the speeds in which you can tune your GPU. You want overclocking software to be a third party because they will provide the most impartial and least limited features.
@@jjlw2378🤡
DLSS doesn't eliminate the value of Vram, but it can let you get away with using less with a small reduction to image quality at a given resolution. Also, while DLSS can help compensate for a lack of Vram, FSR can as well, it's just not as good at it at 1440p, but it gets quite hard to tell the difference between the two when upscaling to 4k, at least with the highest quality settings.
I did own a 3080 for a while, and have played around a lot with DLSS and ray-tracing in Cyberpunk, and also in Control. Running without any upscaling at all is still better than running with DLSS, as long as you can get a high enough frame rate, and I find it very hard to recommend a card which costs 80-100 more, but has significantly less Vram, if it has about the same amount of standard raster performance. Ray tracing in Cyberpunk required using DLSS, and still ran significantly slower than running at 1440p native without ray tracing. I just didn't think that it was really worth using ray tracing. It never seemed obvious that it was worth running with it turned on, though it certainly was usable, and did look good, running at 1440p with DLSS. With Control, I found that the performance was good enough to run with ray tracing and without any upscaling but the game did still run noticeably smoother with ray tracing off, and the ray tracing itself was still not all that compelling to me.
I found that the lighting in both games, even without ray tracing, was amazing. A lot of the lighting effects even with ray tracing turned on, are the same either way, unless you go with full path tracing, but I obviously had nowhere near enough performance to run with full path tracing.
The 4070 isn't terrible, and it's not going to suddenly become obsolete because it doesn't have enough Vram at any time in the next 4-5 years, but it would have been a LOT better if it had had 16GB of Vram. It's not like 16GB would have been overkill on it because it has DLSS. That would have made the card that much better, and it would have also had more memory bandwidth as well, which would also be nice. A 16GB 4070 at 660 would have been a significantly better value than the 12GB version is at 600.
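On the point a few lines up about upscaling easing VRAM pressure: the commonly quoted DLSS 2 / FSR 2 scale factors are roughly 67% per axis for Quality, 58% for Balanced and 50% for Performance, so the internal render target shrinks much faster than the output resolution suggests. A small sketch of that arithmetic (render-target pixels only - real VRAM savings vary per game, since textures and other buffers don't scale with it):

```python
# Internal render resolution per upscaler mode (approximate DLSS 2 / FSR 2 factors).
# Frame-buffer pixel count shrinks with the square of the per-axis scale factor.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

for name, scale in MODES.items():
    w, h = internal_res(3840, 2160, scale)          # 4K output
    fewer = 1 - scale ** 2
    print(f"4K {name}: renders at {w}x{h} (~{fewer:.0%} fewer pixels per frame)")
# 4K Quality     -> 2560x1440 (~56% fewer pixels)
# 4K Balanced    -> 2227x1253 (~66% fewer pixels)
# 4K Performance -> 1920x1080 (75% fewer pixels)
```

That is also part of why the image-quality gap between FSR and DLSS is smallest at 4K Quality: both upscalers start from a fairly dense 1440p-class input.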
The problem with using a 3080 for productivity apps is that 10GB is REALLY not enough for doing actual work. PLUS, Nvidia's drivers lock many of the professional features you need behind the Quadro drivers. The 3080's Quadro equivalent is the A5000/A5500, with 24GB of VRAM and a price around 1-2k. You will get better performance than a 3080 in most CUDA workloads with an RTX A4000 - a 16GB card that's roughly a 3070 equivalent - because 10GB for any significant work is way too low, assuming the drivers even expose the full feature set in the application on a non-Quadro card.
As far as AI workloads go, which are much more my wheelhouse: ROCm is feature-complete, and CUDA isn't as uniquely relevant in 2023 as it was in 2021 for the AI space. 10GB, again, cripples the card for anything more than fiddling around as a hobby. Try to render a 1440p scene on a 10GB card vs a 16GB one - it's not even funny how memory-starved you will be. You will get performance equivalent to a 6700 XT with 12GB, which you can get for much cheaper. Additionally, we tend to put GPU render farms on a Linux distro, where AMD has much more mature drivers and ROCm support. Specialised AI accelerators are a whole different kettle of fish: in that space you will be writing your own custom libraries, tuned to whichever vendor allocated some boards for you - nobody is going to be picky about which one; the lead times are insane and everything is pre-sold before being fabbed. You take what you can get, pay what is asked and count yourself lucky.
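One practical check for the kind of work described above: before committing to a render or training job, ask the framework how much VRAM it can actually see. A hedged sketch using PyTorch - recent builds expose torch.cuda.mem_get_info, and the ROCm builds reuse the same torch.cuda namespace, so the same call should work on both vendors:

```python
# Minimal sketch: report visible VRAM before launching a heavy render/training job.
import torch

def vram_report(device: int = 0) -> None:
    if not torch.cuda.is_available():
        print("No GPU visible to PyTorch - a CPU fallback would be painfully slow.")
        return
    free_b, total_b = torch.cuda.mem_get_info(device)    # (free, total) in bytes
    gib = 1024 ** 3
    print(f"{torch.cuda.get_device_name(device)}: "
          f"{free_b / gib:.1f} GiB free of {total_b / gib:.1f} GiB")
    if total_b / gib < 12:   # illustrative threshold for the workloads discussed above
        print("Warning: under 12 GiB total - large scenes or models may fail to allocate.")

vram_report()
```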
Dude is capping by saying 16gb VRAM doesn't matter.
Extra vram allowed cards like the RX 580 8gb and even R9 390 8gb to extend their lifespan way beyond what was initially expected out of them.
The 6800 XT is a great long-term purchase and will be viable for 1440p for at least 3-4 more years.
Vram becomes a huge problem when you don't have enough, I'd like to see some benchmark examples with Starfield 8k or maybe even 16k texture mods. Watch that 10GB absolutely tank in performance. With that said, right now 10GB is good enough, it's just not very future proof. When new games are using more and more vram every year, I'm sure that's a concern for many people. Despite all of that, I'm sure the used market is just flooded with more 3080s because of the end of the crypto craze.
8k and 16k textures are more for the 90 class cards. 10gb is plenty for starfield
wtf u need 8k texture for ? you crazy and delusional
Only a concern if you max them out. Gaming at 1080p, never had to drop more than 3 settings from max for stable 60 at 8GB. Haven't played Hogwarts Legacy, TLOU etc. but they don't interest me anyways and I bet you can play them perfectly fine with a few nonsense settings turned down. It's the same deal for higher resolutions, turn down a setting or 2 and watch the game become playable, usually with little visual difference. Ultra settings are beyond the curve for diminishing returns for performance hit vs visuals, when an amount of VRAM isn't enough to run the game at medium-high then it's too little
10GB VRAM is gonna be fine, XBox S has 10GB total that's shared between the GPU and CPU. 10GB may be laughable on a 3080 but in and of itself that amount isn't an issue
When was the card designed for 8K?
Do you have an 8K panel?
You have a $3000 monitor?
Lol
Same as testing a 4060 in 2023 at 4K - it doesn't make sense. It 'can', up to a certain texture level.
Would I want to? No.
I agree with you both, like I said, 10GB is good enough, especially if you're doing 1080p. It's more of a worry with future, poorly optimized games from AAA devs using more vram than necessary. It's like that saying, I'd rather have it and not need it, than need it and not have it.
Nabbed my Red Dragon 6800 XT this Prime Day for $479. Pretty happy with it, although I do sometimes find myself wishing I still had the CUDA support my 1070 Ti had. Looking forward to ROCm.
I got brand new 6800 asrock challenger pro (non-XT) for 360usd in May 2023 and I absolutely love the card
Nice video man, I love the detail you get into. I jumped ship from my 3070 to a 7900 XT because it was 200 AUD cheaper than the 4070 Ti, and besides not having DLSS as an option I am super happy with my AMD card - in pure raster I get 15-20% more fps on average in most games (compared to the 4070 Ti). The only issue I've had with my 7900 XT TUF is bad coil whine, which I'm hoping I can send it back for soon and see if I get better luck of the draw. Keep up the videos - you are a great new entry point into the PC DIY learning side of YouTube.
You're not missing out on DLSS, since AMD has FSR built into their drivers (Radeon Super Resolution) which you can use in any game, and it still looks amazing. You should defo try it out - it's under Super Resolution in the driver menu. I tend to have the sharpness at either 40 or 60; looks good imo.
I'll just put in my two cents about what keeps me, and a few other folks, on the AMD side when buying a new GPU.
Support for open standards, like FreeSync, FSR and OpenCL/ROCm: I don't like vendor lock-in, so I support vendor-agnostic technologies.
I'm not the guy who cracks professional software just to tell my friends "I have Photoshop 2026 - I don't know how to use it, but I have it."
So I usually go for open software, and I've never regretted it, in both my private labs and professionally.
But the main plus above all is the Unix support.
At home I can play Windows games on Linux flawlessly without having to tinker with monthly driver updates - it just works... and 2005-class hardware is still supported.
At work I greatly extend hardware lifespans for the same reason, and this philosophy lets us offer fast and reliable Citrix-like remote desktops with GPU passthrough of graphics cards that would be e-waste by now if they were made by Nvidia.
Intel is now in the middle between the AMD and Nvidia philosophies, and I hope it lands on the AMD view of the HW/SW stack.
Personally, I see DLSS as a crutch technology that should be advertised as a way to keep your (for example) x080 viable for another generation or three, not as something needed to make games playable when the card's still pretty new.
Edit: fixing spelling mistakes and such like.
Like AA or anything else that improves quality over the rendered resolution does? Crutch. Do you play games, or freeze-frame and pixel peep? DLSS gives you a choice, and when I choose Quality I get more FPS at the same visual quality on the same card. It's like a 5th gear that gives me better mileage. If it was not valuable, AMD would not be bribing bundle partners like Starfield to not support it.
it's not a 'crutch' technology. There's more to making a game run nice than just raw hp. DLSS is an excellent feature and a great example of hardware and software harmony
Depends on how you see it. If DLSS is "crutch" then rasterization is also a crutch.
@@_n8thagr8_63 it's only a crutch for devs who don't know what optimization means imo
@@arenzricodexd4409 You don't know how DLSS works do you?
Upgraded from 6700K, GTX 1080 and 1080p gaming to RTX 4070, 7600x, 6000 MHz 32 GB DDR5 and Dell G3223D monitor. It was a nice upgrade.
i upgraded from 4690k, gtx 970 to 7800x3d and 4080, and monitor G2724D
I upgraded from i3 19100f, original gtx 1650 to ryzen 9 7900x3d and rx 7900 xtx
@@Ghostlynotme445 good choice, better upgrade than the other two noobs 😂😂
@@xwar_88x30 ATI fanboy here
AMD users have to adjust settings in Radeon software "to get a stable experience", whereas Nvidia users just play the game without doing any adjustments. And when RT is turned on, AMD goes insane hahaha
@@xwar_88x30
In the "new" market in Europe (mostly France):
- For €599 you can get a brand new 3080, a 4070, or a 6900 XT (the 6950 XT is €629)
- The 6800 XT is €60 less
I'm still waiting for the 7xxx cards from AMD. I don't get their strategy: there's still no 7700 or 7800, XT or not, yet they released the 7600.
Recently upgraded to an RTX 2070 Super for €150 (the card needed to be repasted, not that hard tbh). I'll be good for a few years, or I can snag something newer on the cheap.
Edit: I bought the GPU with a motherboard combo - i7-9700K, 16GB RAM and an AIO cooler - for 200 + 150 = €350, and sold my old setup for €400 with screen and other peripherals.
Grats on your buy, mate. The 2070 super is still a very competent 1080p card.
2070 Super was a very good card for it's era in bang per buck. More a 1080 card now but if that's your end use it has a lot of life left yet.
Great buy for $150. It has slightly better performance than my RX 5700 XT and I play every game at 1440p at 60+ FPS with FSR enabled no problem.
@@RFKG You mean 1440p? The current rig pushes 100fps in almost all games at high/max at 1440p, and I don't use DLSS or other resolution scaling. I came from an i7-4770K with a GTX 1070 that also pushed 1440p at medium settings, albeit stuttery in demanding games.
Example: Cyberpunk currently averages 80fps at high-to-max settings at 1440p.
@@moes95 That's amazing, i honestly had no idea that the 2070super could still do 1440p gaming
The main reason 3080s are cheaper is that they were probably the most-produced card of all time. I don't know how to find stats on that, but so many miners like myself bought every one they could for a couple of years, and they have been reselling them on eBay ever since.
it's true. GA102 in general was so abundant that they made the following GPUs with it:
3080 10 GB
3080 12 GB
3080 Ti
3090
3090 Ti
and those are just the GeForce ones. their serverside cards also used the cores.
F*** you miners, f*** you miners
when shitcoins and cryptocurrencies collapse i will rejoice
@@CPSPD Thats good central bank slave. Keep rejoicing in your life long slavery
The GPU market has been confusing lately.
Here in my country, Indonesia, there was a moment when you couldn't find an RX 6950 XT at $630 USD - it was selling for around $700.
But ironically, you also couldn't find a brand new RTX 3080 at $600 USD, because it was also selling for $680-700.
So people who actually bought a brand new RTX 3080 were considered "stupid", since they could get a much better GPU, the RX 6950 XT, for the same money.
And now a brand new RX 6800 XT is being sold at $560 USD, just $60 cheaper than an RTX 4070, and you can't find the RX 6950 XT anymore, not even second-hand.
I bought a 6800 XT a year ago. Where I'm from in Spain, for some reason AMD costs more than Nvidia. All my life I had Nvidia, except for an old ATI X800; the rest of my graphics cards were all Nvidia - a 9800 GX2, GTX 280, GTX 285, GTX 480 and GTX 970. Even though the 6800 XT was more expensive here than the 3080, I opted for it - not for the performance, nor the price, but because personally I am fed up with Nvidia. First it was a board with the Nforce 790i Ultra chipset, an expensive board that never stopped giving problems; then the 9800 GX2, the worst graphics card I've owned by far; then the PhysX marketing nonsense - add a second card for physics and blah blah blah, it never ended up working well, pure marketing; then the 3.5GB of the 970. I ended up fed up, and as far as I'm concerned they can shove their marketing. It's clear that streaming and RT are better on Nvidia, but for me that wasn't enough to go Nvidia again. RT is better, but it's also clear that the 3000 series falls short there, so to me it's still marketing. DLSS is above FSR, no doubt, but who's to say that when the 5000 series comes out they won't leave me stranded like they did with PhysX, 3D Vision and other technologies? Also, Intel's XeSS isn't that bad and can be used on AMD. This is my humble opinion after years of owning Nvidia cards. I'm not saying AMD hasn't had problems with their cards, but I personally haven't suffered from them. Sorry for the length and my English - I wrote it with a translator. Good video and good channel.
I don't stream to Twitch, I don't upscale, and I seldom use ray tracing (not that I could complain about the ray-tracing performance of my RX 7900 XTX - it is plenty sufficient). But I have games that use more than 10GB of VRAM, and ray tracing increases VRAM usage, so the 3080 running out of VRAM is kind of funny. So I have to disagree: I would choose the 6800 XT over a 3080 any time.
Can you tell me what the temperatures of the card are, especially if you use a 1440p ultra wide monitor? For me, with the highest details, the temperature at the connector was 100 degrees Celsius
Um what holy shite bro that's probably not normal or is it
@@Anto-xh5vn As far as I checked it's normal - about 10 degrees before thermal throttling; with an undervolt it's around 90.
I got that one on prime day for $480 with starfield. But good points made. The only feature i really would nod the head to is dlss. I’m all for Ray tracing but it’s just not that big of a difference in most games for the performance. I wish amd would focus on fsr 3.0.
Imagine fsr 3.0 miraculously comes with starfield lol
You would pick a 12GB card over a 16GB one? Do you play at 1080p or something?
This VRAM talk needs to stop. Most of the tests where 12GB is barely enough are with 4K textures, and you can't even notice the difference below 4K.
People are overblowing the whole VRAM issue... yeah, bad PS ports exist, badly optimized games exist (especially recently), capping FPS is a thing - imagine that (why would you make it render more frames than needed unless it's a competitive game and you need that 120FPS) - and a lot of the options in the Ultra preset make little to no visual difference while costing a big performance hit. ...Yeah. Great video.
I just recently upgraded to a 4070, a few weeks ago, I gotta say, I haven't felt the vram issues yet. every game I play, plays flawlessly.
what resolution do you play on?
@@antonnrequerme6256 1440, what about you?
You'll feel them in a year or so. 16GB is the new VRAM target for game devs - hence the sudden explosion in VRAM use in 2023. We went from "8GB is enough forever" to some games using up to 14GB within months.
When I buy a card I intend to keep it for 4 years. And for that I realized 10 or 12GB just wasn't enough, so I bought a 6800XT. Used to own a GTX1080 and saw that it was already being filled up in 2021, I saw the writing on the wall
Should be good on 12gb for most stuff at 1440 where the 4070 is well matched. I'm on a 6700xt atm at 1440 and don't have issues with vram.
@@Altrop Realistically the only things using that much vram are unfinished garbage piles or if you're set on raytracing but I don't think a 4070 is really the card for raytracing. Until consoles move away from their 12ish usable that will be the "benchmark"
1080p 240Hz player for all games here, and I never use RT with my 3080 or 6800. It's not worth trading speed for visuals I couldn't care less about. I get awesome visuals at 1080p and I'm not going down the rabbit hole of comparing 1440p or 4K - I'll leave that up to you guys to complain about. Speed and efficiency in the games I play are absolute! Enjoy 🙂
I wanted to buy a 3080 2.5 years ago; ultimately it became a 6900 XT.
At the time I had an FHD 144Hz monitor, then switched to WQHD 165Hz, and meanwhile I have a 4K OLED for gaming.
Currently 12GB would also be too little for me, if only for peace of mind.
With every stutter I would ask myself whether it was perhaps the VRAM.
That's why I'm quite happy not to have gotten a 3080; back then I would not have thought how quickly the memory would become a problem.
I disagree with the VRAM statement. The stutters and graphical artifacts are a dealbreaker for me. I had a 3070 that ran out of VRAM and the drawbacks were unbearable. The additional 2 GB of VRAM for a 3080 would have not resolved the issue. Moreover, the raytracing on the 3080 is quite overblown with how good it is…. Also, it requires more VRAM. So what would you choose, better textures overall or better lighting?
It's possible something in the system other than the GPU draws more power, but the selling point about power consumption certainly doesn't seem like something that would matter in this case.
The 3080 shot itself in the foot with 10GB of VRAM. I would buy the 3080 12GB, or the 3080 Ti which also has 12GB, if it's not significantly more expensive.
Was Smart Access Memory turned on when you tested both cards? In general both cards are about the same until you turn on SAM; then the 6800 XT starts pulling ahead in nearly all games except RT.
I got the 6800 XT in November and I love it. First, I had the good luck of getting a card without the annoying coil whine, which made me happy (I bought the Gigabyte OC version). Second, I truly love Adrenalin, because there's no need for Afterburner to adjust voltages, fan speeds or whatever, and it's very easy to monitor thermals without third-party software. To be honest, DLSS is a major feature I would like to have on my card, but considering Nvidia didn't support the 30 series for frame generation while AMD promised FSR 3 will be supported on the 6000 cards, it seems like a fair deal. If AMD delivers on that promise, the 6800 XT would destroy even the 3090 Ti at half the price. We'll see if it's true... but in the end the 6800 XT seems like a good deal if ray tracing is not what you want. I'm not a content creator, editor or streamer; I just use the card to play, and even without FSR 3 I love it. No driver issues, more VRAM, more rasterization performance, and it includes Adrenalin. All for less money. Shut up and take my money!!
Upgraded from a Rx570 from 2017 to a Rx6800 (non XT) for 300€ on the used market. Currently wasting its potential on Dota and Wow :)
The reason Nvidia skimps on VRAM is so you have to buy the next gen, and they give you DLSS to get by. AMD gives you the VRAM, gets blamed when there's not much improvement next gen, and offers a serviceable FSR.
I've got a 6800 XT paired with a 43-inch 4K screen, and I can tell you that using Quality FSR at 4K I cannot tell the difference - and at 1440p with these cards you don't even have to use upscaling.
If the extra 6GB of VRAM gives you 2 extra years of gaming (say, an additional generation of GPUs) before forking out new money, it's worth $112 a year - without taking into account the resale value of both GPUs after 2 years (for the 3080) and 4 years (for the 6800) respectively, and without considering whether you do more than gaming on the GPU.
AMD had a much more favorable time competing with the RTX 3000 series because of the huge node advantage it had over NVIDIA, which was stuck on Samsung's crappy 8nm node.
Great points. Personally I like to do a driver-only install with AMD GPUs - lower overhead - and Afterburner is just simpler for undervolting. I do lose the ability to overclock my monitor, as well as the setting (I forget what it's called) that uncaps FPS without tearing, which on this rig can push 78Hz vs 75Hz - not a big difference, and you lose FreeSync with the overclock anyway, so it's not worth the overhead of the Radeon software. I just cap frames at 74 FPS, which is all this RX 570 can handle with flat frame times anyway. I just purchased a Red Devil RX 6800 XT used on eBay for $395 ($445 after tax and shipping) and am waiting for delivery; it will replace my RTX 2080 Super. I plan to undervolt, and I don't use ray tracing or DLSS - I mostly play Fortnite with TSR enabled at high-to-epic details, and it's all about latency for me at 1080p 144Hz. I plan to test it with my 4.75GHz R5 5600X, then with my R5 5600X3D, which I don't use at the moment because it runs hot on my 240mm AIO. I may have to pick up a 360mm AIO if the 6800 XT performs better with the X3D than with its non-X3D sibling, which is fine, because I'd like to use this 240mm AIO in my Rampage III Gene X5690 Xeon machine to pair with my ASUS Strix RTX 2070 Super. That machine currently has an XFX RX 570 that I customised with an XFX RX 580 cooler, two extra heat pipes and a direct-contact RAM plate, which I also undervolt and overclock. I got the untested RX 580 for parts on eBay for $20 - I'm pretty sure it was just dirty and needed a repad/repaste - but my RX 570 is basically new and I didn't want to risk installing a dead GPU in any of my current hardware.
I have a 3080 10GB. What's your opinion on what to upgrade to? The 40 series is so pricey, but the AMD cards aren't that much higher in performance.
Edit. Cheers guys! I will stick with the card and hope it doesn't conk out soon :-P
There is no need for you to upgrade unless you get a 4090
Keep the card for as long as you can, and pray that next gen doesn't suck as hard as this gen and that AI doesn't become the next crypto boom.
The only issue you could have is VRAM limitations, which is a good old Nvidia planned obsolescence strategy, other than that you should be good for a while.
3080 is still a good card but if u want more vram for 4k etc.. go with a 7900 XT or XTX. If u want Nvidia, minimum a 4070ti
4090 or just don't. You are set for now.
wait for next gen
What really fucking sucks is that it's been two fucking years since this video was made and the RTX 3080 still costs the same on the used market. I honestly hope it goes down after the 50 series launch - literally my entire PC, including a Ryzen 5 5600, is cheaper than what an RTX 3080 goes for right now 😢
Literally just bought a red dragon 6800xt today, open box special at $430. Couldnt postpone anymore, coming from a 5700, i hope to keep this around for a few years. Maybe next time around ill go greeen when RT is more optimized and adopted by more games
Used 3080s are cheap because there's an extremely high probability they were mined on. As crypto drops, there's little incentive left to mine, so the smaller farms get disassembled and sold off. We won't discuss the probable condition of those GPUs (which can vary greatly depending on how the farm was maintained).
New 3080s are indeed priced cheaper too, because there are only so many advantages they have over the AMD competitor (6800 XT): ray tracing, DLSS (no frame gen on the 3000 series, by the way) and CUDA, of which only CUDA is non-optional (if you actually need it as a content creator, that is). On the other hand, the 3080 has only 10GB of VRAM, which is enough for now and may (keyword: may - it gets filled up in some games on ultra, but those are just isolated examples... for now) last you until the 5000 series; the 6800 XT doesn't have that "may".
Overall, I'd say the 6800 XT is the more balanced long-term solution, "the 1080 Ti of the current era", while the 3080 is "I want the eye candy now, and then I'll switch to a 5070 as soon as it launches".
Awesome video man, I love the detail you get into. That Red Dragon looks like it's performing great!
I am absolutely thrilled to share that I recently snagged an incredible deal on a 6800 for only $353! Needless to say, my excitement is through the roof!🥰
I got a used 6800 non-XT for 260 USD aiming for 1440p, and I'm so happy - it even came with a 1-year warranty (in Argentina that's a good deal).
tbh some of us - well, most of us - grew up without ray tracing. Why do you need ray tracing when you can push higher fps at a higher resolution? Honestly, ray tracing feels like a useless invention when we can use its much easier counterpart, also known as plain old pixels.
I had a Red Devil 6800 XT and took it back when the 6750 XT refresh came out. I should have kept it. Incidentally, 6950 XT Reference cards are only $550 at Microcenter right now. Pretty comparable to the 7800 XT, or better in several games.
3080 10GB doesn't have enough VRAM to handle Cyberpunk in 1440p with Ultra-RT. Scaling up from a lower resolution alleviates that memory pressure.
Are you using FSR in quality mode? Because it looks fantastic.
It's just a shame that Nvidia doesn't want to give enough VRAM to use their GPUs to the fullest. I can forgive the 3000 series - 8GB was plenty at the time - but it was becoming clear early enough in the development of the 4000 series that they needed to increase the VRAM at every SKU except the 90s. If AI wasn't their focus right now, they probably would have; but because of AI, Nvidia just doesn't care.
When it comes to VRAM:
As it stands right now (if I'm not mistaken), there are about 4-6 games that (for whatever reason) exceed 8, 10 and even 12GB at 1080p.
To me, that is entirely different from when ray tracing started getting games, as it was obvious it'd take years before buying a graphics card just for ray tracing was a viable option (or at least a wise one - besides, how good was the RTX 2000 series at ray tracing, really?).
*This*, on the other hand, looks like the start of a trend: VRAM usage spilling over the 8GB limit (with 10 and 12 following shortly after, would be my guess). And I'm guessing it'll happen at least twice as fast as the adoption of ray tracing (at least after 2024).
To me, ray tracing is an add-on, a luxury item. But being able to play games at 1080p on a card that was released in the roaring 20s, mind you, with an MSRP of $700, is (with exceptions, of course) the least one should expect.
lazy game devs
Hey, if you see this I have a question: right now, if there's no price difference, which would you take? (Basically RT or VRAM, for all-around use.)
VRAM all day. You can't use RT if you don't have enough VRAM - something he overlooked pretty foolishly in this video.
10GB is veeeery borderline for 1440p. I very often see over 9GB used; over 10 is less often, but we are getting there. I don't know how long the 12GB on my 4070 will last. Ideally I wanted more VRAM, but there wasn't any alternative in this price range, and I didn't want to go with the 6800 XT and lose DLSS 2/3 and the better RT, since I only play single-player games. Fingers crossed for the 12GB then :P
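If you'd rather measure where your own games sit than guess, the simplest check on the NVIDIA side is to log memory.used from nvidia-smi while you play (on Radeon, the Adrenalin overlay's dedicated-VRAM counter serves the same purpose). A small sketch of that logging loop; note it reports allocated VRAM, which can be higher than what a game strictly needs:

```python
# Minimal sketch: print VRAM allocation once per second while a game is running.
# nvidia-smi reports *allocated* memory, an upper bound on what the game actually needs.
import subprocess
import time

QUERY = ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"]

def sample() -> tuple[int, int]:
    first_gpu = subprocess.check_output(QUERY, text=True).strip().splitlines()[0]
    used, total = first_gpu.split(", ")
    return int(used), int(total)

while True:
    used_mib, total_mib = sample()
    print(f"{used_mib} MiB / {total_mib} MiB ({used_mib / total_mib:.0%})")
    time.sleep(1)
```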
That's all Nvidia is good for: borderline VRAM. The VRAM scam on their GPUs makes people upgrade when their 3070/3080/4060/4060 Ti or whatever other Nvidia GPU starts to stutter in VRAM-heavy games. 16GB is the safe spot at the minute; anything over 16GB is even more future-proof.
I got a 6900 xt for 400 euros used and I couldn't be any happier with it, crazy good GPU for it's price
The prices are WAY different in my country. A used 6800 XT has the same price as a 3060 or a 3060 Ti, which makes it a hugely better deal than the Nvidia options. The 3080 is much more expensive than the 6800 XT here.
I've purchased the 6800 XT and I am having nothing but issues with it - stuttering games, a flashing desktop, so on and so forth. I started to look into it (I haven't owned anything AMD for a while) and noticed that people have been complaining about the stuttering for literal years!? There are workarounds everywhere! This is crazy to me. I am returning the card tomorrow and getting the 3080 instead.
For the last 3 months I've been testing used 3080 and 6800 XT cards, all bought for under $350 in great condition.
I decided to keep the NVIDIA card in the end (a Gigabyte Gaming OC) because of the better performance with my Quest 2 for PCVR. I've really fallen in love with VR since I bought the Q2 (mainly used for sim racing).
In the titles I play in desktop mode (4K 120Hz TV), I never really saw a VRAM bottleneck. Maybe having a 5800X3D helps as well, but that's my experience.
I also use a Quest 2 for PCVR. I was considering getting an RX 6800xt. What made the 3080 better? And how much better?
Can I ask what your ambient temp and CPU temps are? Your CPU seems to run cooler while gaming.
I have a 3080 Ti and a 6800 XT and usually can't tell them apart, but I think they are both CPU-limited - one on a 9900KF at 5.0GHz and the other on an 8700K at 5.0GHz. I will say the textures in TLOU look muddy, as it's likely loading lower texture quality. Hardware Unboxed does some really good VRAM testing you should check out.
"FSR doesn't look as good as DLSS" that's funny you say that right after showing a great comparison where both XESS and DLSS failed miserably at properly rendering a reflected light source, while FSR was the closest to the actual render. So that is anything but clear-cut. TBH all have problems, and I don't like the idea of using any of them unless I'm forced to. But if you really enjoy looking at an AI fever dream and pretending it's an actual render, you do you.
Just a quick FYI, your model is actually called a Red "Dragon" and not "Devil". Easy to confuse, just thought I'd point it out.
I love AMD, but I'd never say no to a competitively priced 3080. Did the power connector issue get sorted out, though?
Not really, but all boards that aren't Founders Edition cards (for the 30 series) use the standard 8-pin PCIe connectors.
My goal was to get something that allowed me to max out details and textures in open world environments without worrying about stuttering or skipping, and upscaling isn't that important to me. At a reasonable price of $400, the 6800xt wasn't a hard decision.
Terrible comparison - a high-end 3080 vs a lower-end 6800 XT... the power consumption and the frequencies are too low for a 6800 XT.
He is a Nvidia fanboy
Bro, increase your parameters from the defaults through Adrenalin. PowerColor's default fan speed and power limit are kinda low.
Run that bitch at 2000rpm with the power slider maxed and an undervolt + OC and it will be even faster.
Where I am 3080s are going for about the same price as RX 6800s on the used market. RX 6800s are typically going for around RM1400, and the 6800XT a cool RM1800, RM400 more than the 3080. For context, RM400 is roughly $88.
I paid RM1600 for my 6800 (had to pay extra 200 for the reference card tax, since this card is rare af) just last month and have not used RT in any of the games that I've played. I personally don't feel the necessity to turn RT on, games look more than good enough as it is and I prefer the higher framerate. I only play at 1080p so I don't see the point of using either DLSS or FSR; the image quality degradation is too great at this resolution. So I don't think I'm missing out too much by not getting a used 3080.
I bought mine as part of a CyberPowerPC build where I picked my parts - didn't feel like dealing with it this time. But I have an XFX 6800 XT, and it costs more because it is fast in most games and has 6 more gigs of VRAM!! It's hands down the better card, and anyone who plays Hogwarts knows it. The 10 gigs is barely enough for brand-new games, and I think that's why the pricing flipped around. Also, the 6800 XT is even slightly faster than the 3080 Ti in Hardware Unboxed's 50-game benchmarks.
It also depends on the board partner. Do you have any idea how much EVGA 30 series cards are going up in value? I was just barely able to buy a 3090 before they started going past $1,200, and I've seen a couple of Kingpin 3090s go for over 2k.
Very nice comparison - honest and straightforward, giving everyone the freedom to choose.
Gotta love at 3:54: "the Nvidia card is pulling ahead pretty significantly" (+14%), then 10 seconds later, on the same game, "the AMD is actually pulling ahead" (+14%). I feel like for the rest of this video Nvidia is gonna get way better choice of words. Smh 🤥
Literally said AMD is very favorable with Lumen there, and that's a huge advantage in the future
@@vextakes That doesn't change the choice of words that was used, though. I see what you're getting at, but I'm saying that using words like "significant" just sounds like way more than what actually took place within those 12 seconds of the timestamp.
Three years ago I really wanted a 6800 XT, but with the mining craze starting there were no AMD cards in my market, so the only option was the 3080 10GB :( It ain't a bad card, but the 10GB buffer is definitely holding it back.
Especially since the 2080ti has 11gb. My poor EVGA FTW 2080ti. She sploded.
Now for a card that competes with xx80, you have to pay $1000 just because Nvidia raised the price of the xx80 from $700 to $1200. This is duopoly cooperation, not competition.
The funny thing is that the 7900 XTX is relatively slower (against its competition) than the 6800 XT was, yet it costs $1000, pretending to sit where the 6900 XT did, when in reality the relative performance is not even close:
6900xt = 3090 -7% -> $1000
6800xt = 3090 -15% -> $650
7900xtx = 4090 -20% -> $1000
6800 non-xt = 3090 -24% -> $580
I got my Asus TUF RTX 3080 (the OG 10GB model) at launch during COVID at basically retail price (MSRP) and have been happily using it as my main GPU for my 1440p gaming needs (I recently upgraded to an RTX 4090 and a 4K HDR monitor), with DLSS 2 upscaling in some titles for moderately higher fps and a much better experience, rather than the 60-70fps I'd get in some newer, harder-to-run titles. I love a high-refresh-rate experience. I also have an RX 6700 XT in a spare PC that I use at my partner's place. Both GPUs, as mentioned, have their place in the market and their feature set, depending on your individual needs.
When next-generation games come out, 10GB is going to be a problem.
Also, so many were used for mining - those two factors together mean there are so many more about.
The 3080 is a bit pricey used. Still a good card.
I want to buy a new GPU. *BUT* I have yet to see a review/video for gamers that clearly identifies a video card without too damn much mumbo jumbo.
I am NOT non-technical. I have been around since BEFORE anti-aliasing. I remember when nVidia introduced anti-aliasing and reviewers were nah-this and nah-that, f-ing short sighted. Even so, I realized, while it was a step in the right direction it was "not there yet". I bought their cards but didn't turn anti-aliasing on for performance sake.
Then nVidia introduced cards with ray tracing and reviewers were nah-this and nah-that. But look at what is happening in the industry, cards and games!
Now PLEASE, just tell us what card is the next damn 1080TI that will last for 4 (four) more generations. I'm sick of the BS and price gouging.
Anyway, whatever (potato here) - now I'm waiting for the nVidia 5000 series and the AMD/Intel competing brands... Nothing in this generation has been identified as a MUST buy. 👎
BTW, I'm still rockin' a 1060 and I was ready to buy a 3080, etc. if it weren't for the BS that happened in 2020.
The fact that AMD makes FSR 3.0 available to NVIDIA users is enough for me to go with AMD. Forget NGREEDIA...
That is definitely a hot take on the 16GB not being a big deal. There are a LOT of other YouTube tech channels that would vehemently disagree with that statement.
Hi, I was thinking of upgrading my GPU. At the moment I'm using an ASUS ROG Strix GeForce RTX 2060 OC Edition 6GB GDDR6, and my CPU is an Intel Core i7-9700K (3.6 GHz, 12 MB cache). Can you please recommend which GPU to buy for better performance? I play Destiny 2, Overwatch 2, CS, and Valorant. Thank you!
That was NOT before the mining craze. I built my last PC in 2017, and the craze really took off in 2018.
6700xt
I still had a 1070 until a month ago and only upgraded to get a card for more modern titles. I undervolted my 6800 XT, lose maybe 1% performance (at least the way I did it, the impact is that small), and I have about the same power draw as a 4070.
Ray tracing does not exist for me. Whenever people show benchmarks with RT enabled, I can never tell the difference anyway, maybe because I'm not looking at the scenery but at skill bars and crosshairs.
Being able to run Elden Ring at 1440p with a power draw of just 90-120 watts beats my 1070 (175+ watts) by a long shot. I don't regret the purchase.
Just checked eBay: the non-auction listings cost around 600 to 750 CAD, not really a good deal considering they were previously used for mining.
Newegg has the ASRock RX 6800 XT for 750 CAD, new, with a premium Starfield code. If the used cards cost that much, it's just as well to buy new.
In Valhalla, CODW2, Forza 5, Dying Light 2, and For Honor, the RX 6800 XT is 30+ fps ahead of the 3080.
I enjoy your videos; I imagine that's because, so far, I share your rationale on 100% of things.
FSR 2.2/2.2.1 is actually amazing; you won't notice the difference from DLSS unless you really look for it. 2.2 was made with racing games in mind, but modded 2.2.1 looks great in other games too.
Because the RTX 3080 was used a lot in mining rigs (better hashrate), and those cheap listings are probably in terrible condition (rusty, dirty, and in need of a repaste).
This is a hugely wrong view. I've been mining with cards for years and they still perform exactly the same as when they were brand new. Most miners learn how to reduce heat and power for max efficiency, which is far easier on the cards than gaming.
@@professorchaos5620 Running 24/7 on an open bench inside a humid room and never getting cleaned, yeah sure 🤣 Most used GPUs I've seen were in terrible condition. Dust combined with humid air is the worst; it turns into black gunk that's hard to clean, the fins and I/O shield get rusty, and some of the fans start to fail. I saw it myself, most miners don't give a shit.
@@notchipotle Honestly, quality issues are a huge concern with used cards, regardless of whether they were mined or not.
There are some mining operations out there that do keep their rooms dry, clean their cards, and try their best to take care of their hardware.
The best way to grab a used GPU is to get one locally, like arranging a meet-up instead of delivery. You can even negotiate pricing while you're at it, and at the same time you can also inspect the quality of the card carefully.
Oh, and by the way, if you have any safety concerns, just know that you can do this right in front of your local police department.
@@Very_Questionable I've been selling used PC parts since 2011. Even the most careless gamer can keep their GPU in good condition for at least a year, but mining cards already look like crap after a couple of months. You can say whatever you want; I'm just describing reality, not giving a personal opinion 🤣
Nah, I got mine for 400 and it was from a gaming rig, in mint condition.
Those extra features need to actually be used to the max if the admitted planned obsolescence of less VRAM is going to make you spend more money sooner. DLSS needs to be the difference between playable and unplayable in more than just RT workloads. In something like multiplayer, where it would be foolish to even bother with RT, DLSS can have odd results (beyond just image quality) compared to native.

Coming from a 1080 Ti, the 3080 10GB never made sense to me, and not just on paper: I've tested it against a 6800 XT and a 3090. The 6800 XT was at least on par or close for less money. Worse was needing a 3090 in places where the crutch of DLSS wasn't an option. The only place I absolutely needed more performance was DCS in VR (G2), where the 3080 was subpar due to VRAM and the 6800 XT due to bandwidth, but they were basically the same. OpenXR could use open-source FSR, but a 3090 wouldn't need it. A squad mate had similar complaints about the 3080 stuttering in the same scenario, which further confirmed my suspicions.

The 3080 is not worth buying because that frame buffer is obviously going to become an issue sooner in other places too. CUDA and H.264 encoding need to be equally if not more important than gaming (actually-making-money important) for it to make sense, IMO.