@@xTurtleOW The 6700 XT, given time, will probably age much better than all the other modern cards considering its price. The 1080 Ti is legendary, but the 6700 XT will probably take that spot in the future. If the 3060 Ti had 12GB of VRAM like the regular 3060 I'd give the title to the 3060 Ti, but sadly we don't live in that timeline.
I'm not gonna be upgrading from my 3080 for a long time. I don't play any UE5-era titles that need more, and every GPU above a 3080 feels absurdly overpriced on both the new and used markets.
Prebaked lighting still cooks ray tracing, to me. It's a very cool technology, but jesus, the framerates... Why do all reflections have to be 1:1 resolution anyway?
Really glad to see that buying a used 3080 was a good decision. I was really worried that at 4K I'd need a 4080 on low/medium settings with DLSS to get 120fps, but luckily I was wrong.
lol what are you talking about? It's very popular, just not super viral. Both Avatar and Star Wars are technical masterpieces despite lacking sensible NPC AI.
@@ICeyCeR3Al "Ubisoft has officially confirmed that Star Wars Outlaws underperformed internal expectations. The revelation came via a financial targets update from the company on Wedenesday." Were we thinking of the same game?
@@ShadyButFresh Because they expected it to go super viral, since it's a Star Wars open-world game. If you look at recent sales data, such as the European sales chart for August, Outlaws was one of the top-selling games. You read a headline, congratulations, but you don't read the articles or look at the data lol.
@@ICeyCeR3Al Have you seen Ubisoft's market value? I wouldn't place too much emphasis on those European sales. The best I could find on short notice was an estimate from the analytics firm Ampere Analysis that around 800,000 copies had been sold as of August 31, 2024. That doesn't square with concurrent player counts though (which we unfortunately don't know), or with how poorly it was generally received by players. For reference - is this your technical masterpiece? ua-cam.com/video/JuJV0flzPEA/v-deo.html "Sick ray reconstruction textures! Hopefully people look at that and not the glaring issues with our game." - Ubisoft C-Suite Employees, probably. Just remember: you can polish a turd to your heart's content, but at the end of the day it's still a turd.
When do we stop putting the blame on the cards and start blaming trash devs? Star Wars Outlaws is not a graphically impressive game in any way; as a matter of fact, it's a hideous game. There's no reason a 3080 should be getting 30fps.
I'll give you an example: at 7:56 he runs it at 80fps, but it's extremely easy to run this game on my 3080 at 180fps at 1440p. Turn down volumetrics (you can't tell a visual difference) and shadows from very high to high (you honestly can't see a difference either), add DLSS Quality, and you're at 150fps. Enable FSR frame gen and you're at 180fps with the GPU chilling (I have a 180Hz monitor). At that framerate the latency is a non-issue, and it's fast enough that you'll never notice any visual artifacts from the generated frames. DLSS Quality at 1440p is also almost the same as native... you don't lose any quality nowadays; you'll never be able to tell while playing. Or you can always buy a $1600 GPU and "crank it to ultra"... it's your choice. I'd rather save $1200, but that's just me.
@@organicinsanity2534 No worries mate, this card rocks hard; VRAM is a non-issue. The only better cards cost $1000 or $1600, take your pick... Instead of throwing money at the wall, learn how to deep-optimize your Windows 10 (tutorials, manual settings, then a utility like Hellzerg's Optimizer 16.5) and your drivers (use a utility like NVCleanstall 1.16.0 to strip the drivers to bare bones, then learn about all the other settings on YouTube)... Optimize the hell out of your PC; it's fun and it will make a big difference in every task you throw at it.
@@organicinsanity2534 Also, learn about apps like Lossless Scaling, and about modding frame gen into games. It can be helpful in some cases. Get the latest DLSS DLLs off the internet and swap the outdated DLLs in your games for the most recent ones. Same if the game uses DirectStorage: get the latest DLLs and swap them in... And set the paging file to an SSD with the same min/max values (32GB and forget it, if you have the storage)...
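(A minimal sketch of that DLL-swap step in Python, purely as an illustration: the paths are hypothetical placeholders, and tools like DLSS Swapper automate the same idea.)

import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")           # hypothetical game install path
NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")  # a newer DLSS library you downloaded

# Games bundle nvngx_dlss.dll next to the executable or in a plugins
# folder, so search the whole install directory recursively.
matches = list(GAME_DIR.rglob("nvngx_dlss.dll"))
if not matches:
    raise SystemExit(f"No nvngx_dlss.dll found under {GAME_DIR}")

for old_dll in matches:
    backup = old_dll.with_name("nvngx_dlss.dll.bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)  # keep the original so you can roll back
    shutil.copy2(NEW_DLL, old_dll)     # drop the newer DLL in its place
    print(f"Replaced {old_dll} (backup at {backup})")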
@martinrakovicky1189 I used the voltage curve editor in MSI Afterburner; I can maintain about 1830-2025MHz without exceeding 1.0V, and it stays around 375-385W.
Using mine with DLDSR to 1920p plus DLSS Quality in every game I can. Still a beast that can do maxed or almost-maxed settings. The only time I drop to 1440p with DLSS is when I enable RT.
Not gonna lie, I think it would have been much better if you paired the 3080 with something like a Ryzen 7 7800X, an i7-12700K, or an i9-12900K so there isn't much of a bottleneck, because no one is going to pair it with a Ryzen 5 7500F. I've seen the 3080 do much better with a better CPU, and nobody with a 3080 is going to pair it with a 7500F; it just doesn't make sense to me, and it will bottleneck the 3080 in most games, though not all. So I'd take this benchmark with a grain of salt. On top of that, Starfield is known for running better on Intel, and Cyberpunk runs better with more CPU cores. Other than that, this benchmark is good, but only for those who have this exact build; it isn't fair to the 3080. You have to test it in its best-case scenario, and like I said, no one with the money will pair it with a 7500F. Makes no sense.
I just upgraded to the same model tested in the video earlier this month. It replaced my GTX 970 after 8 years of use :) I had a budget of around 430 USD, and this was the best offer in my country. Living in Europe, especially in a country with one of the highest VAT rates, the 3080 Ti/4070 Ti were my next options, but all of those were starting for the equivalent of 675/880 USD, which didn’t seem like a good value proposition to me. I guess it’s a different situation in other countries, but I'm happy with my purchase, especially since I’m already used to not playing games on ultra settings (considering I’ve only just replaced my 970 :'D). I'm more than satisfied with it.
I paid $1300 for my RTX 3080 on Sep 27th, 2020. It's the Gigabyte Aorus Waterforce Xtreme waterblock edition, so it came stock with a water block, not an AIO or air cooler, which made its MSRP a little higher than an air-cooled Founders Edition. Still paid way too much, and that's the main reason I don't have a 40 series yet. My only complaint about the card is that it came with a locked BIOS. Yes, you heard me correctly: Gigabyte released a custom water-cooled GPU that cannot be voltage-overclocked on the stock BIOS... I did put a modded BIOS on mine eventually, but I don't bother overclocking the core for basic gaming. I really only did it trying to hit a higher 3DMark score.
My 3080 can still run games at 1440p ultra 60fps (usually), so I'm very happy with it. If it doesn't, I just turn it down to high, maybe put on DLSS Quality, and I'll get another 2-4 years out of it.
I actually know someone with a 3080 10GB who goes over the VRAM limit every now and then. I know there are always options you can turn down, but in my opinion you shouldn't have to. Plus, higher texture quality can be a game changer in my eyes.
I got my (used) 3080 last year because I needed an HDMI 2.1 GPU for my new TV. It does well with DLSS, but Alan Wake 2 specifically (medium settings, without RT) at 4K DLSS Performance or higher just doesn't run well. So it seems as if newer games really pull the card down to being a 1440p card.
I have the 12GB model and it's still working just fine at 4K. All ultra settings on Red Dead Redemption 2, 90fps average with a few dips into the lower 80s in Saint Denis. The 10GB 3080 has fewer cores, too.
I got one of these this year, an FTW3 Ultra model from EVGA. For the used price I got it at, I absolutely love it, and I won't mind turning down settings in the future, as even low presets are starting to look solid.
I have this exact 3080, got it used a year ago for just about $450. I replaced it in short order for specific build reasons (VRAM, mainly), but my EVGA FTW3 3080 10GB will always have a place on my shelf. I just love the thing (RIP EVGA). Still a solid card if you measure your expectations.
You can also use Lossless Scaling to double or triple your framerate, especially in these single-player games where a little added input latency doesn't matter.
I have the Dell version and so far it's not bad, though in newer games I'm starting to get dips and fewer frames at 1440p. I want to upgrade, but I can't justify $900 for a used 4080.
The thing about the used 3080 is that it became obsolete once the 4070 (Super) dropped, yet used prices for the 3080 still remain ridiculously high compared to a new 4070(S).
I plan on upgrading to the 5080 by next year at the latest, as I haven't upgraded in a while, but the 3080 is still a great card for 1440p gaming in any new game I've played this year. Ultra (or highest possible) settings have always been a pointless endeavour in a lot of games, adding very little in terms of visuals. Turn your settings down a couple of notches and you'll see you're still hitting high framerates without the game looking bad.
I got one just to collect. Last generation of EVGA cards. It was on my mind to get a 3090 instead, but it seemed a bit much since I have zero plan to even plug it in…
ERRATA
08:34 I meant to say "with lows just short of 80", not 70 🤦♂
In the Ray Tracing chapter, I think I forgot to mention that I turned down the overall quality preset in Black Myth Wukong. The initial non-RT testing was done at both the Very High and High presets, but in order to keep VRAM utilisation in check, I only tested RT with the High quality preset.
These are unacceptable mistakes. Unfortunately you have to delete your channel now sorry.
My 1080 Ti plays all these games comfortably as hell, so no idea what you're on about.
my 3080 is sweating while i'm watching this video
me using gt 720 rn 😅
Same lol
@@newbiegunsmith :D
Yeah mine too! Mind you, I haven't been disappointed with it yet
@@grumpywurzel1973 it's still an awesome card for me. even for work!
Only seems like yesterday when Iceberg was saying he got his 3070 because his dreams of owning a 3080 were gone. Now he’s reviewed both the standard and Titanium models 🎉
I still dream of owning an EVGA rtx 3090ti Kingpin edition or the 3080ti "hydro-copper" (or whatever it was called) Kingpin edition.
I might even choose one of those over a new 40 series GPU, just for the bragging rights and nostalgia lol
@@T.Lspitz As someone that went from a 3090 to a 4090, I can't recommend it, as they run very hot.
The fact he got his hands on a Titan RTX is somehow more impressive to me
@madcrasher1419 oh trust me, I know how hot the 3090's can get.
I have a Dell RTX 3090 that I've had to upgrade with better thermal solutions due to the VRAM temps climbing to an uncomfortable level.
Haven't had a water cooled GPU yet though.
Nasty boy! ;-)
Top Tip: If your card is struggling, turn your settings down.
I've just saved you £1000. You're welcome.
You cannot see detailed reflections in the mud, but you can buy something cool for you and your family :)
In Cyberpunk 2077 I cannot tell the difference between high and ultra, but I can sure tell the difference in FPS. 1080p high it is for me and my 3060 laptop.
A lot of these tests had lower fps than they should've because they were at ultra instead of optimized settings
No, I want that eye candy
Congrats, you've just destroyed the entire high-end GPU industry.
I got mine for 300$ off eBay, and it works perfectly. My first truly high end card, and I'll never stop loving it.
Me too. Just got one for $340. Considering I was initially going for a 4060, it's a huge jump in performance.
it's not high end tho... that's why you were able to buy it for $300
@@papabepreachin8664 Going from a 3050 to this is a pretty big jump, and considering the MSRP this card launched at, I consider it a high-end card.
@@nithinravi4401 The RTX 3080 is about 50% faster than the RTX 4060. It's a shame that the difference isn't smaller because the GTX 1060 6GB was on par with the GTX 980 back in the day, and the RTX 2060 was very close to the GTX 1080.
@@notapplicable8231 Formerly high-end-ish. Got a 3090, which was high end back in 2020; nowadays it's midrange, and with the new generation around the corner it's about to drop once more. Anyway, for $300, great performance.
3080 12gb should've been the default
The 3080 Ti is the default for Ampere, especially the Asus TUF version, which is faster than a 3090 FE.
@@saricubra2867 The 3080 Ti was way too expensive for what it was. A 3090 with less VRAM but still $1400+ back then... wasn't worth the $400 uplift.
@@saricubra2867 mmmkay
The 10GB 3080 still edges out the 16GB 6800 XT at 4K and below, 4 years on; it hasn't lost any ground. That said, I agree 12GB would have felt more sensible.
@@Battleneter With performance barely achieving 60fps at 1440p, you actually think 4K gaming is doable with it? Maybe at the low preset lol. The 6800 XT trades blows with it, if not beats it, unless RT is used.
3080fe still insanely good. Only games that don't run well with 10gb of vram at 1440p are normally ones that are terribly optimized. I'm running cyberpunk with optimized settings + RT Overdrive + fsr3 comfortably
Not gonna lie tho, FSR 3 on 2077 looks like absolute trash. They gotta implement FSR 3.1 so we can use DLSS 2 upscaling.
Should I upgrade my 3060 Ti to a 3080? Would it be worth it today?
@@matrxzeno4761 3080 is not really worth it rn. Go for 4070 Ti or Super version
This! Lowkey, this is the thing most tech reviewers are completely glazing over/forgetting: the massive number of poorly optimized games being released nowadays due to corporate corner cutting, and the console games poorly ported over to PC. Just look at how Alan Wake performed, and explain to me why you can't get that same performance out of other games with similar quality.
the transition at 11:30 was so smooth
what transition?
@@hyperixz 11:28
That was cool
Iceberg has too much time, but well worth it for some of us.
Keep an eye out as he's always doing something. Feels like a game of spot the transition or something along the lines 😊
I have an RTX 3060 12GB and it's a great 1440p card. Too many PC gamers, including this channel, seem to be falling back into the mindset of "if it isn't at high then it's not worth bothering with."
My previous card, a GTX 970, lasted me almost a decade and I expect the RTX 3060 to do the same because, and keep this quiet in case it damages my PCMR credentials, I don't mind turning the settings down.
Like with the Ratchet and Clank part. "If you're looking for 4K60, this ain't it...". Turn down a few settings or use high? Not being able to run at very high doesn't mean the card is worthless for 4K...
@@13bmitchell But he's still saying high; why not drop to medium, or hell, even low if you absolutely must have 4K 60fps?
1080p is still the most used resolution according to the Steam hardware survey, with 4k actually dropping (by a very small amount.)
I'm just frustrated that so many PC tech channels that start out oriented toward what most gamers are actually using end up slipping into the high end.
I understand your point, but a 60-tier card for a decade, even higher tiers, is quite unreasonable for most people. Not everyone is willing to turn graphics to literal mud at 20fps. A few settings down, sure, but a 760, the 60-tier card from 10 years ago, is not usable today.
I understand the bus width versus memory size constraint, but 8GB for the 3070 and 10GB for the 3080 was still a bad decision. I'll be using my 3070 anyway until at least Half-Life 4 comes out.
So forever? 😂
@@DavideDavini yes 🌚 bro I was a gt 610 gamer for years so 3070 is a nasa gpu for me.
@@itson230 I get it mate I just found the way you used to put it extremely amusing. I have a 2060 Super right now, BTW. 😉
Cheers.
Bro boutta beat time itself with the 3070
@@DavideDavini cheers bro 🌚 I sometimes remember running games at 800x600 low 20fps and smile a lot.. 2060S is and will be a decent card for years
Not a single game exists that's actually worth upgrading for. I'll keep my 3080 a while longer, thank you.
Dude, don't listen to this... with FSR 3.1 we have frame gen too, and with that our 30 series has gotten new life! I mean, what's the major difference between the 30 series and the 40? It's just FG. But now we have that too, so what's the point of getting a 40 series if I have a 3080 Ti?! There is none!
@@Kenny-yl9pc Exactly
GTA 6 pc port in 3 years time will drive a significant amount of upgrades. Until then tho can’t think of much tbh lol.
@@ramborambokitchenkitchen6357 You're probably right, though not about me personally. There is no way in hell I'll play any GTA game after 5 (including 5 itself, I highly disliked it even back then).
@@ramborambokitchenkitchen6357 GTA 6 is gonna run on current gen consoles lol. It's not gonna have high system requirements. It's just a meme
I bought a 3080 FE 10GB at MSRP 4 years ago and it's still strong. The best price/performance ratio.
Luckyyyy
The 3080 and 3070 were both short on VRAM at release, so of course they both suffer from VRAM shortage much more acutely now.
No, they were not lmao. I did not see anyone complain about the VRAM shortage when both of these cards released and 3060's 12 GB was considered "excessive", it was only after games like TLOU2 released, that people started suddenly panicking about VRAM.
@@X_irtz If a top-tier GPU turns out to be short on VRAM just three years into its life, it was short at release. A top-tier GPU should last longer than three years.
@@X_irtz There were absolutely well documented concerns about the VRAM when these cards released. Anyone in the know saw this coming a mile away. They just weren't as loud as the hype around "2080ti performance for $500 woah this is so crazy"
If a 3070 can be bought cheap enough, upgrade the VRAM to 16GB; it would be about as fast as a 4060 Ti. Or might as well look for a deal on a 3090.
Hardware Unboxed retested the 3080 roughly a year ago and actually came to a pretty positive conclusion: 10GB was not a major issue at that time. Moreover, they did not find any real-life gaming scenarios where the 3080 could run out of VRAM. By that time the 3080 was a 3-year-old GPU, so 3080 owners had 3 years of so-called uncompromised gaming experience.
Meanwhile, the 3070 already had issues in certain games just a year after its release.
Now imagine all this with 1440p DLSS Quality, which has 90% of the image sharpness of DLAA/native and is sometimes better than it, and you'll get a clearer picture of how powerful the 3080 still is after 4 years ❤
I'm loving my 6800 XT, especially with the 16GB of VRAM, and it's pretty much equivalent.
Try turning on rt
@@shlapmaster How about you try increasing the textures to high 🤭🤭
I skipped the 6700xt for 3070 after watching HUB wukong benchmarks
@@mahardhipeddu2496 I have a 3090, I have more Vram than you peasant.
@@shlapmaster The RT is not bad on it too lol you can check comparison videos
I have a 6800xt, and while it may not always beat a 3080 in performance, I know the 16gb of vram will last me many more years than the 10gb would. Hell, playing now at 1440p I frequently use upwards of 14gb of VRAM.
But the 6800 XT is faster overall. Even my RX 6800 non-XT is nearly on par with a 3080. I can do a little OC on my GPU and get 3080 performance, but it will use 200W instead of 380W like the 3080.
A 3080 is more than just good; my 2080 Ti is still amazing at 1440p lol
I got an RTX 3070 and it still works well for everything I play at 1440p
It only becomes an issue if you need to play Sweet Baby-injected AAA slop that's powered by unoptimized code and bloat.
I have the 12GB version of this card, and it’s still going pretty strong at 1440P ultrawide. I have gotten to the point where I need DLSS or lower the settings a bit to keep framerates high. I’ve been considering a 4080S or 7900XTX, but I think I’ll wait for next gen to see what they have to offer.
Wanted one badly when they released, but I had a 2080; it served me until I got a crazy deal on a 4070 Ti.
2080 is still solid midrange
@@aaz1992 definitely is. It got hate on release, but it's been a solid worker for me for 6yrs now.
@@J.Wick. The hate was ridiculous.
Life as a 1080p gamer is so much simpler
My 3070ti still feels so big and strong - and what would I even do with so much VRAM?? 😲
Saying "but only just" about a three-year old GPU that's still more powerful than a console about to be released...
After waiting 4 months post-release for a 3080, my local pc shop offered me an XFX 6800 xt with a $100 discount.
Still glad I went with the 6800 XT. I can't help but wonder how different my gaming experience would have been had I waited for the RTX card, but it's cool to see that the 3080 is still hanging in there.
If you waited for a 3080 you'd have probably spent more, and waited longer for hardly any benefit.
At release, these half measures kicked AMD's tail.
rememba walt.. no more half measuras...
Love the "allegedly" when showing the 900 series cards lol
3.5gb of VRAM, that was such a scam on Nvidia's part.
We need reparations for the 970. Nvidia needs to ship every gamer in the world a 512MB GDDR5 Vram Chip.
@@BonusCrook I'm pretty sure there was already a class-action lawsuit over it, everyone who qualified (Original buyer with receipt) got a whole $20.
@@KiraSlith its not enough, we need that damn GDDR5 MODULE!
I've had my 3080 since launch and it's never let me down. The only problems I run into are when the game is clearly at fault and runs like crap on everything.
same experience.
If you look up Monster Hunter Wilds' recommended specs, frame gen is now required for 1080p 60fps. That doesn't have much to do with the cards on the list; more and more, AI upscaling is making devs lazy.
Man, we need developers to use mGPU more than ever!
I'd be pissed if I bought such a high end expensive card only to be running out of vram a few years later.
But he said VRAM wasn't an issue in any of the games he tested. The card's 4 years old, almost 2 generations. You can't expect it to run all these shitty games on ultra forever. And I never understood people acting like VRAM is the only metric of performance. A 12GB 3060 running a game at half the fps of a 3080 doesn't make the experience better just because it has more VRAM lol. Bizarre.
@@snoweh1 Some games automatically turn down the graphics if there isn't enough VRAM, so FPS doesn't tell everything.
@@MusaM8 Some people with Nvidia hardware really like the aesthetics of textures popping in and out, I guess. As long as the VRAM number is lower, it's great.
Shitty games? Based
Nvidia's "8K gaming" or whatever cards can't even run 1440p 😂😂😂
IDK who tf saw a high end card with 10 GB VRAM with like twice the supposed performance of an ancient card WHICH HAD MORE RAM THAN THAT and thought "this is great, I want this". Y'all got what was coming. My Arc A750 gets comparable performance, the 3080 is that bad.
In this time frame, the RX 6800 XT should have been the most wanted, most popular, and best selling GPU.
Except AMD decided making like 3 of them would be enough to satisfy the whole market.
That power usage is almost at 400 watts, hot stuff… I forgot how bad Samsung 8nm is!!!
How quickly the formerly mighty fall. I remember when I first got a 3080 a few years back, and how powerful it felt at that time.
It still is powerful lol. It’s legit the same speed as a 4070 and it’s currently 4 years old. Sometimes it even beats the 4070. The only cards that beat the 3080 are currently 800 dollars or more. For some reason nowadays people think “it can’t run 1440p ultra at 144 fps so it’s obsolete” and that’s a goofy sentiment. I promise you in any vram limited titles you can just lower the settings slightly and barely notice any loss in quality and enjoy the game perfectly fine at 1440p. People who act like this performance isn’t acceptable are the ones paying thousands to Nvidia for the few cards that out perform it therefore enabling Nvidia to continue selling at these crazy prices. Anyone with a 3080 should keep it and enjoy it at high settings 1440p. Don’t need ultra, it’s 4 years old.
It still runs good lol. If you really want to play on ultra settings then just get a 4090...
@@zeronin4696 That's actually what I ended up doing. I invested in a high refresh 4K monitor, and honestly even the 4090 struggles with that.
I upgraded to a 4090 system from my last PC that had a 3080 late last year, and I definitely felt like the 3080 was very good apart from its 10GB of VRAM. I mostly played at 1440p, and now at ultrawide 3440x1440, and the VRAM definitely caused a few issues for me, especially when RTX was on. If I'd had the 12GB model, or the Ti variant that had 12GB of VRAM to begin with, I'd probably have used the card for another year or two.
11:29 that was a very smooth transition ❤
Ugh, paid $900 for my 3080 right after the crypto boom stopped. I could have waited, but I wanted to upgrade from my i7-4790 and 1060 so badly during that whole time.
Still choosing to use my RTX 3080 FE paired with a 5800X3D rather than my other rig which rocks a 6800XT and 12600K. It just seems smoother somehow. I wonder if the 6800XT would pair better with the AMD processor but so far have been too lazy to swap it all round.
The 5800X3D is a generational leap ahead of the 12600K lol. It's no wonder that system feels better.
@@LeftJoystick I'd be curious to see that 12600k with E cores off and 4.9ghz-5ghz all core. I was able to cool that on air, I think it was 183w. There is no getting around the X3D cache being good for gaming though.
@@milescarter7803 The i7-12700K is faster than a 5800X3D, and the 12600K would be more like a Ryzen 5 7500F alternative.
@@saricubra2867 That is only true with something like 8000MT/s RAM, which you won't have on a 12700K. With hardware from the time, the 5800X3D beats even the 12900K.
@@GraveUypo No, with DDR5-5200 the i7-12700K gets 80fps 1% lows in Hogsmeade in Hogwarts Legacy. The 5800X3D gets just 60fps.
Zen 3 is slow by current standards
Exactly 20 minutes of calming talk, let's goo. Keep it up, I enjoy this channel very much (sorry for my English)
11:28 nice transition :D !
Btw I have that card (Asus TUF) from launch with an i9-9900KF and 16GB of RAM. Sometimes I notice the VRAM is the main bottleneck, but so is the 16GB of system RAM in some games. What I'm going to upgrade next is the CPU and RAM (for better 1% lows and fewer stutters; that's what I prioritize most), then the GPU. At the moment I play at good quality settings in 1440p, and in some games I use the DLSS-FG-to-AMD-FG mod. For example, in Wukong I use two mods, one for shadow rendering and one for DLSS FG to AMD FG, so I can use DLSS with AMD frame gen and play at 80-100fps, with 1% lows above 65fps, with my settings mixed between cinematic and very high at 1440p.
I have the same build as you; I feel like the i9 bottlenecks the 3080 sometimes.
Interesting build; I'd definitely recommend upgrading the CPU and RAM by now. 9th gen was pretty comparable to the Ryzen 3000 series, and not many would use a Ryzen 3000 chip with a 3080. A newer architecture and DDR5 would go a long way toward getting all the performance out of that card.
@@thomaslayman9487 It's a lot better than Ryzen 3000, pretty much the same as a 5000 series or faster, especially overclocked.
I bought the same card from an ex-mining rig. It was $400 and definitely a risk, but after a repaste it has been an awesome card for the past 2.5 years.
The circumstances surrounding the 3080 10GB were such a disappointment. I didn't try to get one until prices started shooting up when the mining boom started, so I wound up with a 3060 Ti because it was all I could find for less than a million dollars. When I finally got a 3080 10GB a year and a half later, I quickly realized that 10GB wasn't going to be enough graphics card RAM, so I decided to sell it and get a used 4080 if I could find one for $800 (which never happened). Then I was offered a used 4090 for $1000 plus my old 3080, and I jumped on it. The whole saga of waiting so long for a 3080 only to be disappointed was such that I'd rather just forget it. I'm not going to buy another GPU for a long time, because now I can use my 4090 with a 1440p monitor and that should last many more years. Then when that gets too slow I'll switch back to my old 1080p monitor. Then when that gets too slow I'll switch to an old 900p monitor I have. Then when that gets too slow I'll get a 720p monitor. Then when that gets too slow I'll find a 480p monitor. Then when that gets too slow...
I have the same feelings as you surrounding RTX 3000.
I ended up paying $1000 for a 3070ti. Every time I look at it in my PC, I don't feel good about it.
Also, no, you know full well you aren't going to go back to your decade-old 1080p monitor once the 4090 feels "slow".
144p monitor? 🤣
What a good troll comment lol. I wonder if there really are people out here getting 4090s to play at 1440p. Well, Cyberpunk does have its Psycho settings for a reason lol.
@@Buddhakingpen I run a 4090 at 1440p
@@Buddhakingpen Ah, it's another person who thinks "a 4090 is overkill for 1440p." No, it's not. The issue is that there are no 4k cards unless you're one of those people that are fine with 60fps. I am not. I like 120fps and I'm getting that. In less gpu-heavy games I use dsr at 1920p and 4k but here's the thing: when needed I can drop it down to native 1440p where it still looks sharp and with a 4k monitor you can't do that. I could go on and on about this explaining every detail of the advantages of 1440p monitors over 4k monitors but I think you get the idea, or maybe you don't.
I use a 3080 10GB in my editing/gaming PC and I've got no complaints. I think if you stick to games that have even slight optimisation, you're good at 1440p. Even then, in the poorly optimised UE5 games, you can do some settings tweaks to improve the perf.
We are in one of the worst eras for gaming ever. Way too much emphasis on ray tracing. With games now not even giving you the ability to turn it off. The performance hit from ray tracing and UE5 games is not worth the minor upgrade in fidelity. Games have gone from using 8-12gb vram at 4k to 16-20gb in the matter of a couple years. My 3080 is capable of playing most games from a couple years ago at 1440p-4k. On these new games it’s not even capable of hitting 1080p 60? These new engines and ray tracing are not well enough optimised, and they’ve just shit on anyone who invested in a top gpu 3 years ago. Forcing people to feel the need to upgrade early. The fact a 24gb 3090 will most likely not hit above 50 fps at 1440p says enough. Ray tracing even on a 4090 is not worth the performance hit that comes with it.
I feel like keeping Helldivers 2 would have been fine. That one is a multiplayer horde game, whereas now all the PlayStation games are the same single-player over-the-shoulder action genre.
The battle between the 6800 XT and the 10GB 3080 is legendary. Hope Hardware Unboxed tests the matchup again; I'll be watching their comment section while watching the vid if that happens 😂.
Haha true, you just need to grab some popcorn when that happens
I will be rocking my 3080 until the 60 series is out
I snagged a 3080 last week for just about 370 euros. For that price it's an excellent deal.
Commonly available for $350, even $285 for 'datacenter' units (missing the fan/shroud) if you plan to watercool or have access to a compatible heatsink.
@@milescarter7803 In my local market that is tough to find. I don't know if you used USD for convenience and you're European, but it's definitely not easy to find a 3080 that cheap here.
Had a 3080 in 2022, swapped to a 7900 GRE earlier this year. I was consistently having issues with VRAM, but that was the only issue the card had. Loaned it out to a friend for a bit when his card died; now it's the crown jewel of my EVGA card collection, a reminder of when they still made cards.
My guy, you can't start off a re-review of a GPU that's a few years old by testing it against the most dogshit unoptimized games that even a 4080 struggles with and going "idk guys, it's not really reaching that 60fps." A 4090 is what it takes to get 60fps at 4K in Black Myth; the game doesn't know optimization at all lol. Star Wars Outlaws... it's Ubisoft, enough said. God of War: have you played FF16? Sony can't create a PC port to save their life.
You're doing the 3080 dirty putting it up against games that are flat-out broken and bad. You can see how, in older games that have actually received some optimization patches, it still does amazing. This is the reality for all GPUs, even new ones that people call bad for no reason. They aren't bad; game developers are bad. They refuse to optimize a game and instead rely on upscaling and FG to do it for them.
Hot take - VRAM issues are inflated by poor optimization/PC ports and current-gen ray tracing and shader techniques (me coping with my 3080).
Coping aside, there have already been promising developments on the software side to optimize shaders and VRAM usage through AI (AMD's AI texture compression and Unreal Engine 5.5's Lumen improvements).
3080 is a great 1080p card.
🤣🤣... wait, that's true. 😖🤢
Nah, it's still a high-refresh-rate 1440p card for 99% of games. For those where it's not, you need a $1600 GPU. Take your pick.
@@HybOj Yes, but it is still a great 1080P card 🤣🤣
I've got the 12gb water-cooled 3080! It's been running well for years!
My 3080 was mostly fine for me until I upgraded to 4K. Now I have VRAM issues in many games, especially BeamNG. It's a shame, because the card has enough power even for 4K and I don't mind turning some settings down, but Nvidia gimped it with only 10GB of VRAM.
Looks like I’ll have to upgrade to RTX 50, but we could be looking at the exact same problem in a couple years or sooner if 5080 only has 16gb. Ugh.
Hey, let me ease your anxiety: VRAM requirements aren't going to keep going up. We are at the worst part of the cycle: multiple major game engines got a version update (like UE5), and companies looking to icebreak and cash-grab on fancy graphics ahead of the curve are slopping out unoptimized crap. You upgraded to 4K, which needs more VRAM because of math, but 16GB is fine. It will very easily play anything you want at 4K. You may have to turn off ray tracing, but otherwise it will clobber it.
Running at 1080/1440 is not going to take more memory right now. The number of pixels per frame in 1080 and 1440 resolutions will never change. The pixel density of a 1440p 27" monitor will always be good for gaming. VRAM usage will stay where it is right now or go down at 1440/1080, outside of RT being turned on. That's literally the only thing that will swing the needle beyond where it's been for 10 years.
In 2016 I bought a 1070. The games I played still didn't run out of VRAM at 1080p for the 6 years I played on it, and my son plays on it now. Stuff used 6-7.5GB of VRAM on the 'ultra fancy' end of settings and less at high/medium. In 2024 it's the same story: the same 6-7.5GB used at 1080p. The ONLY exceptions are shit console ports with redundant GPU and system memory use because they were only coded for consoles. If you don't care about Star Wars story games running at max settings, or Forspoken, you're literally still fine with 8GB at 1080p.
If you have 10GB you're set for 1440p the same way, and will be likely until the card is irrelevant because of the GPU grunt being too little for the era's games.
Having too much VRAM is like having a car with an 80 gallon gas tank. What is it actually enabling you to do? Yeah, a 4 gallon tank is going to be annoying and cause every short road trip to include a gas stop, sure. The thing is, there's no difference in driving 250 miles with a 20 gallon tank and with an 80 gallon tank. You get there in an identical amount of time, all things equal. All you need is a big enough tank to make the trip and then beyond that nothing changes.
I'm super fatigued by all of the weird AMD-backers saying that 16GB isn't enough RAM right now. As if buying a card with 16 damn GB of memory will somehow limit you while playing Baldur's Gate or Warzone. Those games aren't even hitting 10GB at 1440 my guys.
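To put rough numbers on the resolution point, here's a quick back-of-envelope sketch (a minimal illustration only; the 4 bytes per pixel and the render-target count are my assumptions, real engines vary wildly) showing how little of a 10GB card the resolution-scaled buffers actually occupy:

```python
# Back-of-envelope render-target memory by resolution.
# Assumptions: 4 bytes/pixel (8-bit RGBA) and 8 full-size
# render targets (G-buffer, depth, post-processing chain).
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}
BYTES_PER_PIXEL = 4
RENDER_TARGETS = 8

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL * RENDER_TARGETS / 1024**2
    print(f"{name}: {w * h / 1e6:.1f} MP, ~{mib:.0f} MiB of render targets")
```

That prints roughly 63 MiB for 1080p, 112 MiB for 1440p and 253 MiB for 4K: even a generous render-target budget is a few hundred megabytes, so the gigabytes on a 10GB card go to textures and geometry, which scale with the game's assets rather than the output resolution.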
There is supposedly a 24GB 5080 coming out later. They're going to use non-binary 3GB modules across the same eight memory channels, so 8 × 3GB = 24GB instead of 8 × 2GB = 16GB.
@@CyberneticArgumentCreator having more VRAM is ALWAYS a good thing, especially for AI
I have the 3080 10GB and game on a 3440×1440 monitor. I have around 130 games and I can play all of them at max settings.
Yes, for some heavy games I use DLSS for a boost of around 20 fps or more.
But at the moment I can play all my games at max settings.
Maybe I'll buy the 5080, or the 5090 if I have enough money.
Consoles do NOT have access to 12GB of shared memory as VRAM. It's usually just under 10GB once the OS, subsystems, and the game's own data in memory are accounted for.
I'm running a 3080 FTW3 in my backup rig, and even in games that push it, a few minor tweaks in the settings and it's right back to handling 1440p ultrawide. It can't always manage the ultra preset like it used to, but overall it's still a pretty good card.
Vid was definitely made before Helldivers 2's revival, because it's very awkward to say the game has already "had its time in the sun".
The game's still extremely healthy on PC, and it's kind of embarrassing to skip it but include God of War: Ragnarok, which is failing so badly on PC.
hooray, a video where I can go "hey thats my card!"
Hoping to get another solid few years out of my Suprim X model; that 10GB frame buffer is just barely enough for me.
Ampere cards aged like milk. They all have stupidly high TDP, too.
Except the 3060. It's been aging like fine wine next to the rotten milk known as the """4060""".
Not really; I have the same model shown in the video, and with a little undervolt it only uses 280W at full load.
Can't help but feel slightly superior for waiting a while and getting the 3080 12GB with my grandpa. We keep coming back to talking about those two extra gigabytes.
I'm still running an EVGA 1080ti FTW3. Next, a 5090 in a new rig.
You have a casual 2000-2500 usd lying around?
About £5000 for now and I intend to spend about £6000 when the cards are out and reviewed.
Still using a launch day 10gb XC3 3080 and don't plan on upgrading until GTA6 PC release.
A 3080 having just 10GB of GDDR6X was rough for a -full-price 80-class card- $2500 card during the mining boom. (Though I was set on a 6800 XT since I value VRAM more than RT performance.)
Ouch. I was able to get my 3080 12GB a couple of days after release (damned bots) for 1700 bucks.
@@Glubbdubdrib ouch, at least you got a 10/10 card for the pandemic
The 3080 objectively beats the 6800 XT more often than not, regardless of VRAM.
🙏
@@ArchieBunker11 without RT, the 6800 XT is on par with or outperforms the 3080 in many games. The VRAM will matter more in the future: 10GB isn't enough for 1440p/4K anymore, resolutions the card is otherwise still decent at. The 16GB on the 6800 XT versus the 10GB on the 3080 will matter big time within even the next two or three years. The 3080 will be running out of VRAM by then, with much worse 1% lows and stuttering, while the 6800 XT will still be just fine.
I got a 7800 XT when building my first PC earlier this year because Nvidia prices are still absurd in Asia.
Love it and will probably stick with team red.
The only GPU I've bought is the 6700xt, that thing is a goat of a GPU
yeah it's so good as long as you don't use ray tracing!
@@FlorentChardevel path tracing is where the fun begins lol. I tried it in cp2077 and it was hilarious
The 1080 Ti is the GOAT GPU; your card is not even close.
@@xTurtleOW the 6700 XT, given time, will probably age much better than all the other modern cards, considering its price. The 1080 Ti is legendary, but the 6700 XT will probably take that spot in the future. If the 3060 Ti had 12GB of VRAM like the regular 3060, I'd give the title to the 3060 Ti, but we don't live in that timeline, sadly.
@@xTurtleOW while chugging all that power, not efficient at all
I'm not gonna be upgrading from my 3080 for a long time. I don't play any UE5-era titles that need more, and it feels like every GPU above a 3080 carries an absurdly inflated price on both the new and used market.
I upgraded from this card to a 4090; Horizon Forbidden West killed the card at 4K.
Facts, Resident Evil 4 Remake is what killed my 3080.
For 4K, this card is dying fast.
@@CRF250R1521 Yeah, I play RE4R frequently and it uses 14+ GB of VRAM at 1440p on my 6800 XT, so glad I have that instead of a 3080.
Nothing better than opening youtube and being greeted with one of your videos!
Prebaked lighting still cooks ray tracing for me.
It's a very cool technology, but jesus, the framerates...
why do all reflections have to be 1:1 resolution anyway?
Really glad to see that buying a used 3080 was a good decision. I was really worried that at 4K I would need a 4080 for low/medium settings with DLSS to get 120 fps, but luckily I was wrong.
I don't quite understand why Star Wars: Outlaws is included. Why include a game no one is going to play?
Impressive and intensive graphics. No one bought Avatar, and Alan Wake 2 sadly didn't sell super well, but both of those are still worth benchmarking.
Lol, what are you talking about? It's very popular, just not super viral. Both Avatar and Star Wars are technical masterpieces despite lacking sensible NPC AI.
@@ICeyCeR3Al "Ubisoft has officially confirmed that Star Wars Outlaws underperformed internal expectations. The revelation came via a financial targets update from the company on Wednesday."
Were we thinking of the same game?
@@ShadyButFresh Because they expected it to go super viral, as it is a Star Wars open-world game etc. If you look at some recent sales data, such as the European sales chart for August, Outlaws was one of the top-selling games. You read a headline, congratulations, but you don't read the articles or look at the data lol.
@@ICeyCeR3Al Have you seen Ubisoft's market value? I wouldn't place too much emphasis on those European sales. Best I could find on short notice was an estimation from analytical firm Ampere Analysis that around 800,000 copies were sold as of August 31, 2024. That doesn't coincide with concurrent player count though (which we do not know unfortunately) and how it was generally poorly received by players.
For reference - is this your technical masterpiece? ua-cam.com/video/JuJV0flzPEA/v-deo.html
"Sick ray reconstruction textures! Hopefully people look at that and not the glaring issues with our game." - Ubisoft C-Suite Employees, probably
Just remember: you can polish a turd to your heart's content, but at the end of the day it's still a turd.
Yes, the card is still good in 2024... developer optimisation is not.
When do we stop putting the blame on the cards and start blaming trash devs? Star Wars Outlaws is not a graphically impressive game in any way; as a matter of fact, it's a hideous game. There's no reason a 3080 should be getting 30 fps.
I'll give you an example: at 7:56 he runs it at 80 fps, but it's extremely easy to run this game on my 3080 at 180 fps at 1440p. You turn down volumetrics (you can't tell a visual difference) and shadows from very high to high (you HONESTLY can't see a difference either), add DLSS Quality, and you're at 150 fps. Enable FSR frame gen and you're at 180 fps with the GPU chilling (I have a 180Hz monitor).
At this framerate the latency is a non-issue, and it's so fast you'll never notice any visual artifacts from the generated frames. DLSS Quality at 1440p is also almost the same as native; you don't lose any quality nowadays, and you'll never be able to tell while playing the game... Or you can always buy a 1600 USD GPU and "crank it to ultra"... it's your choice. I'd rather save 1200 USD, but that's just me.
Yeah, and on top of that he doesn't test with DLSS, which was one of the main selling points of the RTX cards.
The point of a benchmark is to showcase performance. DLSS is a selling point on top of that.
Yea, this video really gave me a hiccup in my throat about the MSI Trio Z I have coming in the mail. Thank you for reassuring me.
@@organicinsanity2534 no worries mate, this card rocks hard and VRAM is a non-issue. The only better cards cost 1000 USD or 1600 USD, take your pick... Instead of throwing money at the wall, learn how to deep-optimize your Windows 10 (tutorials, manual settings, then some utility like Hellzerg's Optimizer 16.5) and your drivers (use a utility like NVCleanstall_1.16.0 to strip the drivers to the bare bones, then learn about all the other settings on YouTube)... Optimize the hell out of your PC; it's fun and it will make a big difference in every task you throw at it.
@@organicinsanity2534 also, learn about apps like Lossless Scaling, and about modding frame gen into games using mods. It can be helpful in some cases. Get the latest DLSS DLLs from the internet and swap the outdated DLLs in your games for the most recent ones. Same if the game uses DirectStorage: get the latest DLLs off the net and swap them into your games... Set the page file to an SSD and give it the same min/max values (32GB, set and forget, if you have the storage)...
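For what it's worth, the DLL swap described above can be scripted. A minimal sketch (the NEW_DLL and GAMES_ROOT paths are hypothetical placeholders for your own setup; nvngx_dlss.dll is the standard DLSS library filename, and note that some launchers verify game files and will restore the old DLL):

```python
# Minimal sketch: replace every nvngx_dlss.dll under a game library
# with a newer copy, keeping a .bak backup of each original.
from pathlib import Path
import shutil

NEW_DLL = Path(r"C:\Downloads\nvngx_dlss.dll")  # assumed: the newer DLSS DLL
GAMES_ROOT = Path(r"C:\Games")                  # assumed: your game library root

for old in GAMES_ROOT.rglob("nvngx_dlss.dll"):
    shutil.copy2(old, old.with_name(old.name + ".bak"))  # back up the original
    shutil.copy2(NEW_DLL, old)                           # drop in the new one
    print(f"swapped {old}")
```

Run it once per library folder; restoring a game is just renaming the .bak file back.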
RT still being a joke more than 5 years on is such a nvidia thing
Like SLI, it too ran a long time and was only moderately adopted before it was dropped.
Thanks great video!
I'm pretty happy with my 3080 12GB.
what model and did you try to undervolt it by any chance?
@martinrakovicky1189 ROG Strix OC model, and yes, I had to undervolt it a little bit because 400W power draw was too much.
@@Gimpy17 I own the ASUS TUF 12GB version but didn't try to undervolt yet. What clocks at what voltage did you manage to get from the card?
@martinrakovicky1189 I used the voltage curve editor in MSI Afterburner; I can maintain about 1830-2025MHz without exceeding 1.0V, and it stays around 375-385W.
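If anyone wants to sanity-check an undervolt like that: the curve itself lives in Afterburner's GUI, but you can watch the result from a terminal. A rough sketch using nvidia-smi, which ships with the Nvidia driver (the printed line is just an illustration of the CSV it returns):

```python
# Log power draw, graphics clock and temperature once per second
# via nvidia-smi, to confirm an undervolt is holding under load.
# Stop with Ctrl+C.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,clocks.gr,temperature.gpu",
         "--format=csv,noheader"]

while True:
    result = subprocess.run(QUERY, capture_output=True, text=True)
    print(result.stdout.strip())  # e.g. "378.21 W, 1995 MHz, 64"
    time.sleep(1)
```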
Using mine at DLDSR 1920p with DLSS Quality in every game I can. Still a beast that can do maxed or almost-maxed settings. The only time I drop to 1440p with DLSS is when I enable RT.
I'm not going to lie, I think it would have been much better if you paired the 3080 with something like a Ryzen 7 7800X, an i7-12700K or an i9-12900K to avoid a bottleneck, because no one is going to pair it with a Ryzen 5 7500F. I've seen the 3080 do much better with a better CPU, and nobody with a 3080 is going to pair it with a 7500F; it just doesn't make sense to me, and it will bottleneck the 3080 in most games (though not all), so I'd take this benchmark with a grain of salt. On top of that, Starfield is known for running better on Intel, and Cyberpunk runs better with more CPU cores. Other than that, the benchmark is good, but only for those who have this exact build; it's not fair to the 3080. You have to test it in its best-case scenario, and like I said, no one with the money will pair it with a 7500F.
I just upgraded to the same model tested in the video earlier this month. It replaced my GTX 970 after 8 years of use :) I had a budget of around 430 USD, and this was the best offer in my country. Living in Europe, especially in a country with one of the highest VAT rates, the 3080 Ti/4070 Ti were my next options, but all of those were starting for the equivalent of 675/880 USD, which didn’t seem like a good value proposition to me. I guess it’s a different situation in other countries, but I'm happy with my purchase, especially since I’m already used to not playing games on ultra settings (considering I’ve only just replaced my 970 :'D). I'm more than satisfied with it.
I love how DLSS/FSR is used as a crutch instead of optimizing games nowadays
I'll take the Ryzen 4070 instead
cringe
I paid $1300 for my RTX 3080 on Sep 27th, 2020. It's the Gigabyte Aorus WaterForce Xtreme waterblock edition, so it came stock with a waterblock, not an AIO or air cooler, which made its MSRP a little higher than an air-cooled Founders Edition. Still paid way too much, and that's the main reason I don't have a 40 series yet. My only complaint about the card is that it came with a locked BIOS. Yes, you heard me correctly: Gigabyte released a custom water-cooled GPU that cannot be voltage-overclocked on the stock BIOS... I did put a modded BIOS on mine eventually, but I don't bother overclocking the core for basic gaming. I really only did it trying to hit a higher 3DMark score.
Me watching this from AIO water cooled undervolted 2080 super
Aging like milk
why?
Just keep on mate! ;-)
I swear your takes are beyond bad.
My 3080 can still run games at 1440p ultra 60 fps (usually) so I'm very happy with it
If it doesn't, I just turn it down to high, maybe put on DLSS Quality, and get another 2-4 years out of it.
6800XT > 3080
turn on that ray tracing😈
@@megacoolartoturn on them max textures 😊
@@c1nqbl7 Never had to turn them off. Turn on that DLSS and take advantage of those CUDA cores in productivity workloads 🤭
@@c1nqbl7 very few games max out the memory on a 3080 pal.
🔥
I actually know someone that has a 3080 10GB that gets over the VRAM limit every now and then. I know there are always options you can turn down, but that shouldn't be a problem in my opinion. Plus a higher texture quality can be a game changer in my eyes
I got my (used) 3080 last year, because I needed an HDMI 2.1 GPU for my new TV.
It does well with DLSS, but Alan Wake 2 specifically (medium settings without RT) just doesn't run well at 4K DLSS Performance or above. So it seems newer games really pull the card down to being a 1440p card.
You can undervolt it a little; it lowers power and heat and improves performance. See Optimum Tech.
I have the 12GB model and it's still working just fine at 4K. All ultra settings on Red Dead Redemption 2 and a 90 fps average, with a few dips into the lower 80s in St. Denis. The 10GB 3080 has fewer cores, too.
I got one of these this year, an FTW3 Ultra model from EVGA.
For the used price I got it I absolutely love it, and I won't mind turning down settings in the future, as even low presets are starting to look solid.
I have this exact 3080, got it used a year ago for just about $450. I replaced it in short order for specific build reasons (VRAM, mainly), but my EVGA FTW3 3080 10GB will always have a place on my shelf. I just love the thing (RIP EVGA). Still a solid card if you measure your expectations.
You can also use Lossless Scaling to double or triple your framerate, especially in these single-player games where a little added input latency doesn't matter.
2:02 Not true. There was a massive mining craze around 2017 as well.
I have a Dell version and so far it's not bad, though with newer games I'm starting to get dips and lower framerates at 1440p. I want to upgrade but can't justify a $900 cost for a used 4080.
Dips on 1440p? with a 3080??? My 3070 ti runs anything and everything at 1440p with no issues. Something wrong there dude.
@@HardWhereHero That's with everything maxed out in newer games. There's a possibility it's just a crap card. More of a reason to upgrade, I guess.
Don't buy a used 4080 right now; it will probably drop to $600 on the used market soon.
@@dvlx8453 I was thinking wait and let it drop or just save up and buy a 5000 series card instead.
The thing about the used 3080 is that it became obsolete once the 4070 (Super) dropped. Yet used prices for the 3080 still remain ridiculously high compared to a new 4070 (Super).
I plan on upgrading to at least the 5080 by next year, as I haven't upgraded in a while, but the 3080 is still a great card for 1440p gaming in every new game I've played this year. Ultra (or the highest possible) settings have always been a pointless endeavour in a lot of games, adding very little in terms of visuals. Turn your settings down a couple of notches and you'll see you're still hitting high framerates without the game looking bad.
I got one just to collect. Last generation of EVGA cards. It was on my mind to get a 3090 instead, but it seemed a bit much since I have zero plan to even plug it in…
Is it better to get an RX6750?