Same here. The 4070 Ti I recently bought buys me two years of future 1440p gaming, plus anything released in the past, which is a LOT of great games! After 6-10 years I'll probably do the same.
I must have 100+ games from the last 10 years that I haven't even started playing. I own about 300 games across PC, console, and tablet. Zero interest in 4K gaming, so my 3080 Ti should last a long time. All you need is 1440p and a massive game collection.
I bought a 970 in early 2015. It survived up until this year, when Halo Infinite, God of War, and games like Cyberpunk 2077 started showing the card's age.
I've got a similar card, a GTX 1060 3GB that I bought in 2017. It plays everything I have at 1080p. I'll look at the lower-spec RDNA 3 cards from AMD later this year.
@@RolanTheBrave Smart move. I'm looking to buy a 6700 XT, or waiting to see what the 7700 XT brings to the table. If it's expensive I'll buy a 7600 XT and call it a day, assuming that GPU turns out to be something like a 6600 XT.
That's pretty brave when you consider Nvidia scammed all 970 users: the last 0.5GB of the GPU's RAM runs slower than the rest of the 3.5GB. I bought a 980 instead and it lasted me more than 5 years. G-Sync works wonders too, keeping the framerate feeling steady even in fast-paced FPS games.
Same here. Upgraded from a 970 at 1080p to a 2070 at WQHD. It runs all games fine with some adjustments. For Unreal Engine 5 games and RT at 4K I'll upgrade in 2024 to a used 4080/4090 or a 5070, depending on specs.
A GPU typically lasts as long as you want it to. For me, I make it last 5-6 years. I have a 3080 now, but my old rig had two 980 Tis in SLI. I buy the best at the time to make it last, and I save $80 a month to accomplish this goal: $80 * 72 months = $5,760. I plan to build a new rig in 2026. Sure, you lose relative performance over the years, but for many years the fps is usually fine if you target 60. With DLSS this goal is even easier to hit.
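A minimal sketch of that monthly-savings arithmetic (the $80/month and 72-month figures come from the comment above; everything else is just illustration):

```python
# Minimal sketch of the "save a fixed amount each month for the next build" idea.
# The $80/month and 72-month figures are taken from the comment above.

def build_fund(monthly_savings: float, months: int) -> float:
    """Total saved by putting aside a fixed amount every month."""
    return monthly_savings * months

print(f"Saved after 72 months: ${build_fund(80, 72):,.0f}")  # Saved after 72 months: $5,760
```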
I don't know about that. My GPUs have typically died after around 8-10 years; I've never had one last longer than 10 years (assuming consistent usage). The most recent one to die on me was a GTX 760 that I got in 2013 and that died in 2021.
@@Tony_Calvert Yeah, I like this idea too. I've had to drastically compress my saving to get a new rig: I've been putting back £500 a month for 6 months now, and I'm doing £700 for the next 2 months (hardly living lol) to get my watercooled 4090 rig by June hopefully. But I'm not doing this again lol. It's hellish 🤣
@@Kreege I only switched PCs last month, after running my current PC with a GTX 660 for almost 11 years. The card is actually still fine, it just can't really run anything new anymore, so the PC will still get some use as a basic non-gaming desktop. If you take good care of your card's temperatures, it can last a very long time.
I'm still using a GTX 1080 while looking to upgrade this year. Rather than playing at lower settings, I'm effectively extending my GPU's lifetime by shifting what I play: I've been holding off on games that seem like they'd be a significantly better experience once I upgrade. There are still plenty of good, less demanding games to play instead, so I feel OK doing this; otherwise I would have upgraded already.
Good point! I play both old and new games, so if my GPU's running out of steam for new games, I play more old games with custom content, such as Doom and Quake. Their modding/mapping communities are still going strong, so I'm not running out of stuff to play even without upgrading to a new graphics card.
Lucky you, my 1080 is starting to reach the end of its lifespan. I originally upgraded to it from a 1060 in 2019. It gets pretty loud now as it goes over 80 degrees. I play mostly CPU-bound games, but a few games like OW really ramp up the fans. No complaints on performance, though.
I was using a GTX 1080 at 40-60fps and it wasn't enough anymore; 1440p was shaky. I upgraded to a 4070 Ti with a 1440p/165Hz monitor and it works great. The 1080 is a roughly 6-year-old GPU, so the upgrade was definitely needed; even at 1080p it can get a bit tricky in some games. DLSS is a game changer as well for games like Call of Duty. Bread and butter.
I could play WoW at low-to-medium settings on an HD 6870 last year and this year at 50-60fps 😂... I think it all depends on the age of the games you play... older games may not look as pretty, but the playability is still there 😊
@@slob12 Yeah, if I'm getting 60fps in certain games I'm fine. If I'm getting over 120fps in games like CoD I'm solid, as I play on a high-refresh-rate monitor.
Monitor resolution and refresh rate play a major part in how necessary people feel a GPU upgrade is. There's no point in having a 4090 and playing on a 1080p/60Hz monitor. If you're looking at upgrading to something faster, you also have to ask whether it's worth it with the monitor you intend to play on; the two really should be considered together. If people are considering VR, they should also take account of VR support and their headset's resolution and refresh rate. My recently replaced 1080 Ti was well balanced in my previous setup, offering good frame rates (80+ fps) for many years, but since upgrading to ultrawide 1440p at 144Hz it couldn't quite reach the smooth frame rates I like, so it was time for the upgrade.
@@xblfanatic360 I've always chosen my GPU with regard to my monitor's capabilities. High-spec monitors are reasonably priced now compared to GPUs, though.
And obviously the games that you play on them. I went from a 660 Ti to a 1060 at 1080p and noticed the difference even at that res back in 2017. Since it's easier to just slot in a new GPU and they get replaced more frequently, I tend to go a bit overboard on the CPU side, knowing that I'll use the platform for a while.
@@HappyBeezerStudios There is a lot more adaptability in the games, though. Going from ultra settings to high can gain a lot of fps with a loss in fidelity you don't really miss anyway. Dropping to medium isn't that major either, but it does tend to leave you feeling you could gain that extra fidelity if only your graphics card were a bit better. Great games mainly come from the story they tell and how they play, more than from amazing graphics with no depth.
Only because they've limited it on purpose as an upselling strategy. It's the Apple approach: here's a base MacBook Air with 8GB of RAM and 256GB of storage for 900. Want 16GB of RAM and 1TB of storage? 1600, please.
@@aightm8 Ngreedia doesn't really care about selling GPUs at all anymore; all they care about is selling their laggy GaaS subscriptions, and that explains everything they do. The servers they build can be used for AI/GaaS and bring in money constantly, and they sell server time to multiple users simultaneously, so the same silicon is basically leased (not even sold, Ngreedia keeps it) 10, 20, 50 times over. GPUs, in contrast, are sold once and that's it: no more profit, and they have to share profits with AIB manufacturers. So Ngreedia kills PC gaming by not making faster GPUs with more VRAM at affordable prices, and sane gamers are stuck playing games on obsolete engines or obsolete low-res screens because the affordable GPUs are way too slow. 1440p is currently the norm in PC hardware, and Ngreedia's GPUs are too slow for 1440p gaming at these prices, even the 4090. They're killing PC gaming on purpose to sell GaaS, which doesn't make sense to PC gamers because of latency. And AMD is happy to support the GPU overpricing so it can sell more console chips to Sony/MS (plus continue selling its own overpriced GPUs, of course). The only thing that looks weird is that AMD apparently doesn't understand that without affordable, much faster GPUs nobody really needs new CPUs; they shot themselves in the balls by supporting Ngreedia's overpricing, and now their CPU sales are way down. But consoles are just THAT profitable for AMD, so they don't care. Until there are WAY better value GPUs, people should just play great old games on USED GPUs and not give any money to greedy corporations: don't buy consoles made by AMD, don't buy ANY GPUs from Ngreedia, AMD, or Intel (which price/perf-matched buggy Arc to Ngreedia's overpriced GPUs), don't pay for GaaS, and don't defend overpricing from ANY corporation or any overpriced hardware with too little VRAM. It IS already out of date. This is the way.
ESPECIALLY for Unreal Engine 5 games with ultra photo-realistic textures. Onward on Quest 2 easily uses 16GB of VRAM with my settings; even a 4080, for more money, wouldn't handle that. GOODBYE NVIDIA. I'm glad I went AMD for both CPU and GPU this build.
This is basically the reason I used to target the $350 price range (970, 5700 XT). You got incredible value and could get consistently good performance by upgrading every 3-4 years. Unfortunately, that midrange value is dead now.
That's my problem with this generation's mid-tier cards. They give you more VRAM but then choke it with the memory bandwidth, so you'll still be turning down settings sooner as the card ages. I just don't understand Nvidia.
@YourFavoriteLawnGuy You're right. As much as I want to like Nvidia, I can't. What they're doing with their mid-tier and low-tier products, for what they're asking in price, is purely out of touch with reality for those of us who understand the actual specs and how they affect performance long term.
@@bigdaddyt150 If you start calling them Ngreedia and learn what planned obsolescence is, it will all make sense to you. They can no longer make more money by providing better products, so they push worse products for more money in pursuit of endlessly increasing profits.
Hardware stays as it was (aside from malfunctions) one day after release. There are also graphics drivers and software (game) optimizations, and those can even improve over time: Quake 2 on a GeForce 4 runs as well as it did 20 years ago. It's a matter of new software, and no one knows how software will develop... I roughly hope that my RX 6700 XT will serve me for everything released so far, and for future titles for at least another 2-3 years.
I passed down my 970 to someone else and we were surprised at how well it handled some new games like New World (low-medium settings) for a 9-year-old card! We were expecting it to catch on fire as soon as the game loaded lol.
The fact that my 1070 has 8GB of VRAM and entry-tier cards in 2023 only have the same or less gives my wallet a happy face. Before that, I had a 2GB card from the very first Radeon GCN generation. There are too many games to play that don't need a new, three-slot card that weighs and costs more than an entire laptop for me to upgrade.
@@MAKIUSO Um, the 3060 didn't have 12GB for the customer's benefit, just to jack up the price even more during the crypto boom and shortage. The 3060 Ti had 8GB, and look how Nvidia is introducing a new 3060 with 8GB (or was it 6GB?) of VRAM.
@@MAKIUSO A game developer isn't running GPUs at 100% load 24/7. The GPU is idle while writing and debugging code between test runs, same for graphics design where software is waiting for user input and the GPU will be sitting at 15-30W refreshing a static screen most of the time.
Honestly, the problem with Nvidia isn't the RAM, but the price. A 4070 Ti 12GB for 900€ is bad, but for 500€ it would be good. 12GB is enough for 1440p and will be enough for a few years. Yes, it's going to struggle at 4K in some games, but it's not advertised as a 4K card, though it's priced like one. It's a matter of price in my opinion, and it's the same for the 4060: 300€ for a 4060 8GB to play at 1080p is good enough in my opinion.
VRAM was a big part of why I went with AMD instead of Nvidia when I was looking to replace my 1070. Some games were close to maxing out 8GB; the one I had "issues" with was Monster Hunter World, whose hi-res textures used just over 7GB of VRAM. I knew it wouldn't be long before games started to bump up against that limit, and even exceed it, so I wanted more VRAM on my next card, as I typically wait 2-3 generations between upgrades. My previous graphics card was a 680, and the main reason I replaced it was its VRAM limitations. When I saw the 3070 only had 8GB of VRAM, I decided against it purely for that reason, especially since I had gotten a 1440p monitor. AMD had 16GB on their 6000 series at the time, which was a more attractive purchase. Frankly, until Nvidia stops putting artificially low VRAM on their cards, I don't see myself considering them for my next purchase. In the end I got a 6900 XT as my 1070 was dying (originally I was going to wait for the 7000 series). While looking for a card I was contemplating getting something like a 6600/6700 until the new cards launched and then selling it off, but I found a good deal on this XFX Merc 319 6900 XT. It cost something like 10% more than the 6700 XT and less than all 6800 XT models in my country, so it wasn't a hard sell. So far it's chugging along nicely in all the games I play, with relatively little usage and low temps (it typically sits at ~50C in games with a VERY lax fan curve; even playing Final Fantasy 14 at a locked 60fps it's 50C at some 25-30% fan speed). The only downside to my card is some transistor noise, but I don't think it's bad, though your mileage may vary.
My GTX 970 is 8 years old and still going strong. I use it for a 4K television box/email/some Photoshop. I don't play modern games on it (I got a new system with a 5900X and a 3070 2.5 years ago), but that said, I got 5 years out of it with some compromises I was fine with (often dropping down to 1440p for some games before I got the new system). These days I mostly play racing games in VR and will probably wait until the 50 series (I use Nvidia over AMD for production software reasons) for the next upgrade.
I'm a 980 Ti user. I bought it to use with my Oculus CV1. It still runs great, but it's certainly feeling long in the tooth. I'm looking to upgrade this year, but it's crazy to think I bought my 980 Ti for $650 and thought, "this is way too expensive." Now that price barely buys you into entry level.
Dude, it's the goddamn inflation too. In 2015, $650 was like $900-1000 today. Prices are still high, but only by about $150-200 😂 Seems like we have to pay for inflation affecting the mistresses and boats owned by those rich fks too! SMH
It's worth mentioning that the majority of the large jumps in performance requirements come from the transition from 1080p to 1440p to 4K, so unless people are jumping to 8K I don't think VRAM requirements will increase at the same speed as before.
Not really; VRAM requirements have increased independently of resolution. Games have become bigger and more complex, and if you add ray tracing and poor optimization to the mix you end up with very high VRAM requirements in today's games.
Honestly, when I built my newest PC back in 2021 (6800xt/5800x) I gave my brother my old PC (2080/i7 10th gen), and he gave his wife his old PC (1070/i7 7700k) and we all are doing fine. I play games at 1440p so I'm happy, my brother plays at 1080p and can now play at 120hz no problem, and his wife tends to play games like The Sims so 1080p/60fps is perfect for her.
@@janbenes3165 Yeah, I have a 1080 Ti, so while a 3070/3070 Ti may be a core improvement, anything less than a 3080 is a backwards step for me, as I play adventure games at the highest possible detail.
Limiting VRAM to 8GB on the 3060 Ti, or below 16GB on the 3080, is already a limiting factor for a high-tier product. But I also hope many of these RTX 3000 customers don't buy a 40-series card soon, otherwise Nvidia will carry on with overpriced cards with too little VRAM on them. We'll see how much VRAM the 4060 is going to have (keeping in mind that 12GB on the 3080 and 4070 Ti already hits a limit).
@Jan Beneš Meh, the 10GB on my 3080 works great on my ultrawide. I do think the 10GB version shouldn't have ever been released, and the 12GB version should have been the $700 3080. Nvidia are cunts though.
It's quite simple: as long as it gets the job done for your needs, keep using it for as long as possible. It saves money on unnecessary purchases and reduces our overconsumption of electronics.
My GTX 960 4GB that I got in 2015 for about $200 lasted up to late 2021. I mostly played at 1366x768 before I upgraded my entire system in 2021, where I ran 1080p for a while before getting my 3070. That 4GB of VRAM, which many called overkill in 2015, helped my 960 last so long. It's a shame we aren't getting that kind of value these days.
Had my old 980 Strix for about 5 years before I swapped to a 2070S, which I'm still running. I'd like to upgrade now, but prices are idiotic and I don't mind waiting out another GPU generation to leverage the performance difference. The 2070S is perfectly capable at 3K ultrawide; no, I don't max out ray tracing, but I'm also satisfied with the overall visual quality in most modern AAA games, so what's the point. Most AAA publishers have doubled down on live-service games that rarely get major overhauls and thus don't push visual fidelity and GPU demand like they did 4-5 years ago. And with no major new titles on the horizon expected to push GPU performance demands significantly, there isn't a big reason not to wait for the Nvidia 5000 series (or a mid-gen 4000 refresh).
I bought a GTX 660 in 2013 and I'm STILL using it. Many people aren't fans of Nvidia, but they've been supporting Kepler for 12.5 years, and they have GeForce Now to help you run certain demanding games. I even did a 1080p stream with it; with my CPU, that's only possible because of NVENC. I'll upgrade in about 1.5 years at the latest, but I certainly got my money's worth out of the card. I'm happy.
If you are buying a 4090, you are probably buying it for the VRAM anyway and/or productivity, as opposed to the expectation it is going to remain a high-end product for almost a decade. And even then, most people who bought the 1080 Ti are still getting use out of that and okay with it being a great option for 1080p - 1440p high settings.
I’m on a 3090 which I bought about 6 months ago used. I think the 3090 is probably good for another 1-2 years as a high end product just because of its VRAM and still decent RT and 4K performance. 6900XT and 6800XT will probably also be fairly solid cards even at 4K (no rt) just because AMD had the decency to not choke their cards with 8-10gb like Nvidia did.
Yeah, I think the last time I bought a brand-new card from a store was back in 2014-15, a GTX 980; store prices are just ridiculous. You can get a used 3090 for 2/3 of the price of a 4070 Ti, and it will be on par with it at 1080p and 1440p and destroys it at 4K due to the 4070 Ti's crap 192-bit bus. The only selling point of the 40 series is frame generation; no wonder Nvidia made it exclusive to the 40 series, otherwise those cards would be pointless...
This is the reason I'm heavily considering an AMD card over a 4070 Ti. The 6800 XT is already proving to age better than a 3070, and I think the 7900 XT will age better than a 4070 Ti because of the VRAM. By age I mean only a couple of years: for an $850-tier card, the 4070 Ti is already running into a couple of games that max out that 12GB of VRAM.
You've got way more than a couple of years before 12GB of VRAM is an issue at 1440p, considering it's way more dedicated VRAM than the consoles have. Neither card is a 4K card, so having the extra VRAM is pointless.
@Blue I don't know, I just watched Hardware Unboxed's updated video where the 3070 was 19% faster than the 6700 XT in a 50-game benchmark, at 4K mind you; it was about 14% at 1440p.
My GTX 1070 has been going strong for about 6-7 years now, at 1080p and 1440p. It still gets 60+ fps in most titles except unoptimized AAA trash, which I rarely play anyway. I was going to finally replace it with an RTX 4060 Ti 16GB, but that card turned out to be *censored*, so I guess it's time to start using Nvidia Image Scaling in more demanding titles until a better card comes along. :D
I waited 6 years with my GTX 1080 Ti on my 34" 1440p ultrawide. It still plays most games well on low quality, with a bit of lagginess in newer games. I had my heart set on an RTX 4080 but settled for an RTX 4070 Ti to save $600.
I'm in the exact same situation, but I really don't want to pay like 900-1000€. A 4070 Ti would double my performance, but I don't know, man. I guess I'll wait a few months; maybe prices will drop.
@@blue-lu3iz That's a good price. But I have a G-Sync monitor, relatively old, I think from 2016, a PG348Q. I don't know if it's FreeSync compatible. For testing I once turned G-Sync off because I was running 3DMark, forgot to turn it back on afterwards, and even at 60-70 fps you notice a stark difference when G-Sync isn't active; it feels like the game has constant micro-stutters. Ray tracing I honestly couldn't care less about, and there the raw performance of the RX 7900 XT is already better, but it also sucks down 50-70 more watts. I only have a 650W PSU, so with the 7900 XT I think I'd have to upgrade that too. I'm not an Nvidia fanboy, but I've always been very satisfied. My 1080 Ti is slowly on its way out though 😂. I'll keep watching for a while; I really don't want to spend more than 900.
I've had a 1070 since 2016 and I'm finally starting to look into upgrading my rig, but current pricing makes me gag. I'll probably just upgrade my platform and keep the graphics card a while longer.
You mention how developers target midrange systems. So by buying a top-end system you'll stay within the range targeted by developers for much longer. I went from a 1080 Ti to a 7900 XTX. I never felt the need to upgrade my GPU until this year, as I could always play my games at high-to-max settings and get 60-120+ fps. This year is the first time I had to turn anything down to low to enjoy a game, and that's what pushed me to upgrade. I also play at 1440p.
This is the main reason anyone should consider upgrading their GPU: if you have to drop settings from at least medium to low, that's your red flag to start searching for a new toy. You can squeeze more out of your GPU by going from high to medium, but I feel games look shitty at low settings. Lower your resolution or upgrade your GPU. That's it.
@@nike2706 Really? I feel like resolution should be the last thing you change. Unless we are talking dlss/fsr. Anything non native looks... not great imo.
@@jessefisher1809 For the sake of example, let's assume we're talking about a system built for 4K gaming. If my video card isn't up to snuff, I'd rather play at 1080p than at low settings, because resolution primarily hurts fonts, while low-quality textures and shadows hurt everything.
If you're willing to drop settings from ultra to high to medium, a card will definitely last longer before sub-45 fps happens. If you always want near-ultra settings just for the heck of it, that's another story of course; then it gets expensive buying new top-end cards as often as you would midrange cards. I'm quite used to tweaking settings, so I could probably make a high-VRAM top-end card last a long while (half render scale at 1440p/720p might still show 100%-size HUD elements in some games, etc.), but I could also make my current RX 480 last quite long on medium/low settings for much less money up front. It just depends on how low you're willing to tweak graphics settings and whether the CPU is bottlenecking.
Just upgraded to a 3440x1440 display and pleasantly surprised to discover my 5 year old 1080ti plays everything buttery smooth at max or near-max quality settings
What PC components do you have to go with the 1080 Ti? I want to upgrade the rest of my PC (4790K and 16GB 1600MHz RAM) to a current CPU and motherboard, plus buy a gaming monitor (no OLED, as I keep a lot of static objects on screen) instead of using my 12-year-old 60Hz LCD TV.
@@jodiepalmer2404 I have an i7-5820K with 16GB 3200. What I also have, and I believe this is the component that really makes a big difference to the speed of the machine's operation and my perception of its "smoothness" and responsiveness, is a Samsung 960 Pro NVMe drive.
@@jodiepalmer2404 The new Samsung 34” OLED G8. It’s beautiful for gaming and photo editing/video, but there is a colour fringing issue with text due to the shape of the pixels. You get used to it after a while but initially it’s quite jarring. My previous monitor was an Asus MX299Q, which I definitely recommend as a budget option - nice wide desktop and gaming field of view, good for productivity apps and great for gaming. It even has a good contrast ratio for a monitor of its age, but the blacks aren’t quite as inky, and of course the response time is slow by today’s standards. Tbh though the only real quality-of-life difference for me with the OLED G8 vs the MX299 is the enlarged desktop space / gaming FOV, deeper blacks and a minor sense of improved smoothness and responsiveness probably due to the 175hz vs 60hz increase. It’s not that big a deal though. Depending on your financial situation, I might be tempted to wait for the next gen to get rid of the colour fringing issue. Without that, my new G8 really would be the ideal monitor.
Honestly, it completely depends on whether high-resolution monitors drop in price enough to become the norm over the 1080p monitors we currently have. Beyond that, it's up to you what you demand from your games. I told myself that the online games I want to play should run at 120-144fps, while heavy single-player games should hit a minimum of 60fps at a medium/high mix of settings. My 3070 does exactly that, even though I run it with a somewhat outdated CPU (3600) which I plan to replace with a 7800X3D. I doubt the GTX cards can do that, except maybe the 1080 Ti, while the 2000 cards don't hold a candle to the 3000 series, which can be found extremely cheap, especially if you don't mind used.
A 4090 can't even run all games on a midrange 1440p 144Hz monitor with all the eye candy turned on and without upscaling trickery. And yes, 1440p 144Hz+ monitors ARE midrange, if not budget already; look at their prices. There are great MODERN NEW 1440p 144Hz+ monitors for 250-300 bucks. Try to find any great 1440p-class GPU that can run MODERN games at 144+ fps for 250-300 bucks. 1080p is extreme budget by now, obsolete entry-level territory; nobody should make or buy GPUs for more than 100 bucks for 1080p. 1440p is the starting point in 2023, just like 12GB of VRAM should be. A 500-buck GPU with 8GB of VRAM is laughable and SHOULD BE RIDICULED as an overpriced, dead-on-arrival piece of crap.
I wonder if this clip was published right now because you guys had just set up a poll asking at what percentage performance increase we'd be willing to pull the trigger on a GPU upgrade.
HD 5870 -> the 1GB VRAM buffer was the main problem. GTX 680 -> the 2GB VRAM buffer was the main problem. GTX 1080 -> the 8GB VRAM buffer is fine for me... but mostly because DIMINISHING RETURNS mean that most games I already own run at 1440p/60FPS on high-ish settings (previously, just cranking the resolution alone from, say, 1600x900 to 2560x1440 would make a huge difference). I wouldn't buy a NEW card with less than 16GB of VRAM, but I also feel no urge to pay a LOT for a new card just so I can buy new games and/or enable ray tracing. If I buy a NEW game, I'll get one that runs great on my GTX 1080; meanwhile I've got a back catalogue of great games that are fun, run well, and look great!
I'm perfectly happy at 40 to 45 fps on 1080p with HUB optimized settings. I'm hoping this will help me keep my RX 6700 for at least 5 more years. I prefer lower power consumption over fps or ray tracing.
I was using an RX 580 from its release up until recently (no money, then no cards, then waiting for a new gen). That 8GB of VRAM helped a lot. It's still usable now; most online FPS games work fine and games like Atomic Heart run quite nicely too. My friend now uses that old PC of mine. Hell, that card still holds its overclock just like a new one. I believe the VRAM amount on mid-tier cards today is pathetic. 8GB was overkill for the RX 580 at release but helped a lot in the long run; 8GB on a 3070 is just a joke. Darktide already uses 9.5GB at 1080p! So I went with an RX 7900 XTX because I have no idea when, or if, I'll be able to buy a new card, so I need staying power. RT features are secondary and the first to go if performance drops. Nice textures contribute much more to picture quality, and 24GB should be fine for a few generations.
I have a 5600x and a 6800 xt I’ll end up upgrading my CPU before my GPU. 6800 xt drivers have been friendly to me ever since I got it about a year ago. I see zero point in upgrading just to deal with newer driver issues.
@@Shatterfury1871 I play at 4K on a 55in LG C1, so technically I can justify an upgrade. But I get over 100 fps in everything I play using FSR and gameplay still feels nice and smooth, so it's still a waste of money. Even if I had a better GPU I'd still be using FSR or DLSS.
@AnimeGeek611 You certainly can justify it; at the end of the day you decide what you do with your money. But at 4K the bottleneck is 100% the GPU, not the CPU. At 4K the 5600X can handle even an RTX 3090 Ti, so going for a 5800X3D (I'm guessing you're on AM4), you'll be lucky to see a 5 FPS increase at 4K.
It depends largely on two factors. The main one is the games you play or are likely to play: if you're always playing the latest games on release, you'll more than likely need a new GPU more often than someone who plays the same games regularly (i.e. mostly the big online multiplayer games). The other factor is your screen size and refresh rate. Oddly enough, as my eyes got worse I thought "screw it" and got a 48-inch 1080p TV for cheap. Unfortunately that also means I constantly have to force an oversized resolution (supersampling) or jagged edges are EVERYWHERE, and because of the nature of the issue AA kind of makes it worse, so much so that I can't play some games because my 1070 can't really go higher than 1080p and the jagged edges are headache-inducing. It's worth noting that in fairly lightweight games like Counter-Strike, LoL, or Valorant I can set it to 1440p, which gets rid of most if not all of the jaggies and actually gives a bit of an advantage in shooters because the target is physically bigger. So if you go that route, you'll need a beefier card to deal with supersampling, even more so at high refresh rates, since your acceptable framerate floor is much higher. Again though, if you're generally into the most popular online games on a 28-inch or smaller monitor, you're probably fine with any 1070-2070 card.
When I build a system for myself, I select parts that I think will get me through 5-7 years of doing what I want without having to upgrade. It's nice to be able to play at the highest level now, but what I'm after is longevity and stretching my dollar. I think it all comes down to being able to play at the level you want, and there's no set amount of time.
You would get better value AND, on average, better performance if you upgraded every 2-3 years but only bought cards with optimal performance per dollar. When you consider how much you can recoup by selling the older card each time, the net upgrade cost ends up being very low, and you can always have very good performance if you never wait longer than 3 years to upgrade from a good mid-range card. You certainly can get more life out of a mid-range card, but you'll have to lower your expectations as time goes on. Seven years seems really ridiculous to expect a card to last, though, especially a high-end one: why on earth would you pay a premium for cutting-edge performance if you're planning to keep the card for 5-7 years? You're not going to have cutting-edge performance for most of that period. On average, you get better performance if you upgrade more often but only buy cards with good performance per dollar. That being said, periods of GPU shortage and/or very high prices are a great time to put off upgrades and just make do with whatever you happen to have.
@@syncmonism If you were to buy a 3070 Ti for $600 and then upgrade to a 4070 Ti for $800, it would have been $100 cheaper than getting the 3090 for $1500, but you would have gone 2-3 years without the 3090/4070 Ti level of performance. So in that instance, I'm not sure I would have gone the midrange route.
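For what it's worth, here's a rough sketch of that comparison, with the resale recoup the earlier comment mentions folded in; the prices are the ones quoted above, and the resale value is an assumed placeholder, not real market data:

```python
# Rough cost comparison: one high-end card kept across two generations vs. two
# mid-range cards bought back to back, with the first one resold.
# The $1500/$600/$800 prices are the ones quoted above; the resale value is assumed.

HIGH_END_ONCE = 1500   # e.g. a 3090 kept for both generations
MID_GEN1 = 600         # e.g. a 3070 Ti
MID_GEN2 = 800         # e.g. a 4070 Ti
RESALE_GEN1 = 300      # assumed resale value of the first mid-range card

two_midrange_net = MID_GEN1 + MID_GEN2 - RESALE_GEN1
print(f"High-end once:       ${HIGH_END_ONCE}")
print(f"Two mid-range (net): ${two_midrange_net}")                  # $1100 with the assumed resale
print(f"Difference:          ${HIGH_END_ONCE - two_midrange_net}")  # $400 in favour of mid-range
```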
I currently have a Radeon R9 290 that's 9 years old (and since two of the three fans have flown off, it's ready for replacement). Before that, I had a Radeon X1950 XTX that I kept for around 7 years. Soon I'm going to build a new rig, and I hope my new graphics card will also last around 7-9 years.
My 1070 is still kicking and I got it day one. I'm honestly surprised it still performs well, but I am looking to upgrade. I'll probably get an RX 6800 or whatever else gives me the most performance for about $400.
My GPU did last 6 years; the thermal compound needs replacement and the cooler needs more than compressed air to get clean. The GPU isn't broken YET, and I keep core temps under 67C; anything above 72C and my VRAM starts to produce artifacts during stress testing. Thermal paste can lose its function a lot faster in systems that are powered off for extended periods.
I think a 50% increase is not enough for a GPU upgrade. I've now retired my 1080 Ti and got myself a 7900 XT. Actually, I first wanted to switch to a 3080 after 4 years, but at the prices at that time, and with a reduction in VRAM to boot, that looked like a bad decision. Of course 144 FPS is awesome and 90 FPS is still great, but I can also live with 60 FPS.
I sold my 1080 Ti, which lasted 6 years, for £200 late last year and picked up a 3090 for £700. I think buying used from trusted sellers and sites is the way to go in terms of value. Stop rewarding these greedy GPU companies.
I mean, I would, but I also can't justify spending 700 bucks on a used 3090 when even the awfully priced 4070 Ti offers better performance and much better efficiency for pretty much the same money new. The only major difference is the VRAM, of course, but 3090 prices have always been ridiculous and so are the used prices. If it's used, then it's got to be a truly great deal, or the card has to match some very specific spec requirements for how you're going to use it. My take is to simply not buy any GPU at all until prices improve overall, and if that means I skip another GPU gen, then so be it.
My RX 580 worked well enough: Cyberpunk only ran at 40fps and Elden Ring at an unstable 60. This year I finally swapped it for a second-hand 6600 XT. Oh boy, very happy with the change; I wish I had done it earlier.
I'm in the same boat, RX 590 here, but I was looking to upgrade this gen. After seeing prices and performance, I'm currently considering an RX 6950 XT at, say, $600.
2:00 I completely disagree with your sentiment here. It very much depends on what games one plays. Generally speaking, I buy top tier and game at (now) 3440x1440 at high (not ultra) settings. That means at worst I go from 3440x1440 at high fps to 3440x1440 at 60. Medium-high at 70fps is around the mark where I start wanting to upgrade badly. Graphics settings don't really scale anymore: ultra is a waste, and medium most of the time looks very similar to high. Top-tier cards usually easily last 4-5 years unless you *demand* ultra settings, which we already established is a waste. Plus, there are a lot of fantastic older and less demanding games out there, which makes putting off playing top-tier games for 1-2 years very easy.
I don't use my GPU solely for gaming. I upgraded from an AMD R9 290 to an Nvidia GTX 1080 because the AMD GPU doesn't have a hardware decoder for HEVC. I might need to upgrade to something newer that supports AV1 once it's more widely adopted.
I only upgrade based on the game or program I want to use it for, which means if I choose to play games such as StarCraft 1 or 2 for the vast majority of gaming, then the current video card I have will last me more than half a decade before I consider a new card. The primary reason I will upgrade is the utility of work a new video card will give me, which for me will likely be a GPU that can process multiple streams of tasks such as encoding, decoding, and video output to more than one monitor.
My rule of thumb is pretty much two generations, which is around once every 4 years. I bought a GTX 1070 6 years ago and replaced it only last year with an RTX 3070 to keep playing at 1440p high-quality settings and try ray tracing and DLSS. I'm certainly going to skip the current generation, but I'll check whether the one after is worth it for that 1440p sweet spot, depending on how demanding games get.
*GPU 3x rule* My general rule USED to be to look at Techpowerup charts and pick whatever GPU got roughly 3x the FPS. I wouldn't buy a card that only got 50% more, because usually with a little tweaking my current GPU could look roughly the same at the same performance. Or if a game NEEDED a new GPU due to more VRAM or whatever, I would just not buy that game. Plus, CPUs were changing a lot years ago, so I figured I'd just build a new PC every five or so years and donate the old one. That worked well, but modern gaming is more complicated due to ray tracing, cost, and other reasons. Still, I wouldn't upgrade unless two things are true: a) the performance difference on paper is at minimum 2x, and b) I think it would make a noticeable difference in my gaming experience (if the game you play already runs great at 1440p/60FPS or whatever, do you REALLY need a better graphics card?)
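A tiny sketch of that rule of thumb, assuming you already have relative-performance numbers from a review chart; the function name, scores, and thresholds here are hypothetical, just to illustrate the decision:

```python
# Sketch of the "only upgrade at roughly 2-3x the performance" rule described above.
# The scores stand in for average FPS (or a chart's relative-performance figures);
# all numbers are hypothetical.

def worth_upgrading(current_score: float, candidate_score: float,
                    threshold: float = 2.0) -> bool:
    """True if the candidate card is at least `threshold` times faster on paper."""
    return candidate_score >= threshold * current_score

print(worth_upgrading(60, 100))        # False: only ~1.7x faster, keep the old card
print(worth_upgrading(60, 190, 3.0))   # True:  ~3.2x faster, meets the stricter 3x rule
```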
I've owned 6 GPUs in the past decade. My first was the GTX 670, which I upgraded two years later to the GTX 980 because I was convinced not to buy the 980 Ti... which I then ended up buying months later anyway, so as of right now the GTX 980 is my most short-lived GPU. I ran with that 980 Ti for the next 5 years before upgrading to an RTX 2070 Super, which I only did because my computer was starting to die after 7 years and I was forced to build a new one earlier than I had planned. One year later I upgraded to the GPU I always intended to get, the RTX 3090, which I bought used from someone who didn't want it anymore. And now, on the cusp of building another new rig because I made some mistakes on the last one, I managed to get an RTX 4090 for retail price. The only thing that really convinced me a 40-series GPU was a worthwhile upgrade from a 3090 is DLSS 3.0. It's locked to 40-series cards and it makes a noticeable impact on perceived framerates, albeit with mixed results sometimes, but it's obviously being actively worked on. The real question for a graphics junkie like me is whether they'll have anything exclusive to the 5090 next year. Otherwise I'll probably stick with the 4090 this time around.
I was literally going to sell my 3090 for 800 and put that towards a 4090, only spending 1k out of pocket (keep in mind I got the 3090 for free in 2022). I ended up not selling the 3090 and decided to keep it for now, because: 1. While the 4090 is sometimes double the performance of a 3090, I still can't justify spending $1700 for a card that doesn't max out every game at native 4K with all the bells and whistles. 2. In The Witcher 3 with all the next-gen upgrades, the 4090 averages 49-50 frames, even with patches 6 months later. Same thing with Cyberpunk 2077 with all the settings turned up: you still can't brute-force your way to high frame rates, you have to use upscaling and/or DLSS 3. So what's the point in going all out and spending $1600-1800 on a 4090 when, in several games old and new, even the 4090 struggles to keep 60 or above without the help of AI? 3. If that's the case, I might as well keep my 3090 and use DLSS 2 in the games that need it. It seems that whether you're on the 30 series or the 40 series, you'll still need to use AI to get playable frame rates, and it's only going to get worse once more real Unreal Engine 5 games start to roll out next year. So I'd rather wait, since I got my 3090 for free, and see what the 50 or 60 series brings in price to performance. DLSS 3 is a selling point/gimmick Nvidia made to justify the super overpriced 40 series. And here's the silver lining: 30-series owners won't be locked out of frame generation, because AMD is releasing FSR 3 soon, which will work with any GPU just like FSR 2. So there's no need to run out and get a 40 series because you feel like you're being locked out by Nvidia; they just want your money.
Have had the 5700 XT for over 3 years now and it hasn't missed a beat: either 60 fps, or 120 fps with upscaling, in anything. Only Forspoken might have given it some trouble based on the demo, but otherwise rock solid. I'm skipping the 7900 cards and I'll get an 8900 or 9900 flagship, probably the 8900. What's another 2-4 years.
1080 Ti 11GB still going strong; I use a 100Hz 3440x1440 monitor. Considering an upgrade, but honestly it's not worth it right now. Waiting for the 200% uplift at around £400-500.
I think something that also needs to be mentioned is the kinds of games you play and the res that you play at. 1080p competitive shooters at 240-360hz can still easily be handled by a midrange card from 2020-2021 and probably still can be handled for the foreseeable future!
Got the RX 580 when it was already years old in the summer of 2020, and I'm still using it about 3 years later at 1080p, 60-75fps. Now I want to upgrade to a 4090, so 4K 120+ preferably.
Resolution isn't the only thing increasing. Game developers are adding more polygons, more complex shaders, more situations that require heavy computation, and larger textures. So even 1080p 60fps can be harder to achieve on an older card in some new AAA titles. AAA games will continue to require beefier graphics cards, but some titles will run fine on older cards too. I play flatscreen 1080p. I had a 1060 6GB, then upgraded to an RTX 2060 SUPER to prep for PCVR. Then I got a 3080 to make that even better (but the improvement didn't really blow me away). I'll likely get a 50 series when they come out, assuming a 5080 will render over twice as fast as a 3080 and I don't need to commission a new power grid to do so.
I have a GTX 1080 inside a huge Thermaltake Core V71 ATX case that I bought for my first build back in 2015, and I'm still using that same case for every upgrade lmao. It's like the perfect house for new components due to its insane airflow: it has 2 front fans, 1 top fan, and I use a strong 3000RPM Noctua exhaust fan.
You can expect about a 30% uplift with each generation (about every two years), and games won't outpace the hardware. Doing the math, 2*log(2)/log(1.3) ≈ 5.3 years before your card is at half the relative speed, which is a reasonable general replacement time.
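That back-of-the-envelope number checks out. Here's a minimal sketch of the same calculation, with the 30% per-generation uplift and two-year cadence taken as the assumptions stated above:

```python
import math

# Years until a card falls to half the relative speed of the newest generation,
# assuming a fixed per-generation uplift (30%) and cadence (2 years), as above.

def years_to_half_speed(uplift_per_gen: float = 0.30, years_per_gen: float = 2.0) -> float:
    generations = math.log(2) / math.log(1 + uplift_per_gen)  # solve (1.3)^n = 2
    return generations * years_per_gen

print(f"{years_to_half_speed():.1f} years")  # ~5.3 years
```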
My friend used a 1080 Ti for video rendering 6 hours daily, in an air-cooled system with three intake and three exhaust fans. His GPU started to show artifacts and died after 2.9 years. The replacement he received under warranty a few days later lasted only 1.5 years. Now he has a 4080, and it runs cooler, at 60°C at 100% load.
The beauty of PC gaming is that there's usually a way to play games even on dated hardware. I'm still running a 1660 super with a R5 3600 and the total watts I consume is less than some GPUs on the market! I can play the games I want at 1080p and have yet to run into those fringe cases where I need to set graphics all the way down to low
I have a love-hate relationship with this topic. Part of me thinks it's a good question to ask, because hardware does get outdated. But another part of me thinks it's pointless, because the vast majority of users are going to wait as long as they can, or until they can afford to upgrade. The longest use I got out of any GPU was: 1) the ATI Radeon 9700 Pro (5 years), and 2) the GeForce 1070 (about 5 years as well, though I think it could have made it one more if it had to). Most video cards I've owned I replaced within 2-3 years, and a few I replaced inside of a year (things moved along real quick back in the late 90s and early 2000s).
My first PC that I ever built was a 2700X and a Vega 56. Then my friend upgraded his PC and gave me his old 1080 Ti. I gave the old computer to my brother and bought some dude's used rig with an 8700K and a 2080 Ti. Then I bought another used rig and swapped over to a 5600X with the 2080 Ti, which is what I'm using right now. I've been thinking of buying a used 3080/3080 Ti/3090; they seem to go anywhere from 500-700 used. I've also been thinking of buying a used 5900X to make an all-in-one streaming PC.
@Slumy My thoughts exactly. Why go for a 3080 Ti, which is essentially just a cut-down 3090 with only 12GB of VRAM? Also, I have a 750W PSU, and a 3090 is the max I can go before needing a different one.
I'm playing at 3440x1440; before that I was gaming at standard 1440p. I bought a GeForce 970 in 2014, and it lasted me until 2019, when I upgraded to a 2070 SUPER, and just now I bought a used 3090 (which premiered in 2020). I'm happy with the purchase and expect this 2+ year old card to last me 3+ more years.
I'm looking to upgrade my GPU now; I have an R9 285 which is finally giving up (although it's still within the minimum specs for Diablo 4!). When people like me ask how long a part lasts, I'm not buying a GPU to play at ultra. I'm buying it to physically be able to play the games I want to play. Turning settings down over time is, to me, essentially saving money, and I'm looking for expert advice to build something that lasts as long as possible. So I look to reviews for recommendations on what hardware is the best value for money over a long time frame. The mindset of upgrading so often is wasteful to me. It's the same with recommending CPUs based on upgrade potential on the same platform: it's fine if that's what you enjoy and you can sell your used parts, but surely there are more people who think like me than not, and it always surprises me how many people on YouTube/forums advocate parts with the mindset of upgrading in 2 years.
I just upgraded from a 1080ti (bought nov 2017) to a 4090. It was still usable, even at 4k, so I didn't sacrifice too much by waiting. I even played Cyberpunk on it. I would have upgraded earlier to a 3080 but with the price gouging it didn't seem like it was worth the money. Now with the 4090, the improvements are enough to justify the massive increase in price, while waiting longer would have resulted in too much degradation, and the lower cards would have been underwhelming and provided less in new capabilities. Would I have been happier spending time at 1440p and/or lower details with a lower-end card initially, in order to commit to upgrading regularly and have higher framerates in 4k and higher details sooner? Maybe, but I kind of doubt it. I got 4k from day 1 of that PC build and I enjoyed it. Worth considering that upgrading regularly assumes that both your income and your interest and taste in gaming are going to remain constant and high enough to justify it. I bought a card because I had the money at this time, and I'll enjoy gaming to the max right now and worry about the future later. Who knows whether I'll have money to upgrade in the future or whether I'll care enough about gaming to do so. I’m very unsure about my financial future, so it doesn't make sense to commit to future spending. My interest in gaming is up and down, sometimes I hardly play at all or I play old or non-demanding games, and then I come back. I would like to upgrade in two generations, but if I can't justify an upgrade, the 4090 can keep me going for a while. Especially since I'm still running a 4k 60hz monitor. Not much interest in the high frame rates - I care about pretty, cinematic, immersive graphics, not pretending to be an esports competitor.
My 1080 is still going strong after 7 years and I plan to give it to a friend next summer. I'm still happy with its performance, and I'm mainly building a new system because my CPU isn't coping as well. I really don't get the idea that buying a way-overkill GPU and then adjusting expectations over time is a bad idea. As long as it's hitting 1440p 60fps and doesn't look horrendous, I'm happy; I'm fully convinced low settings in 2030 will still look very good. A £2k spend on a PC build that lasts 8 years is nothing.
My GTX 1660 Super 6GB is still going strong for 1080p gaming almost 4 years later. I could probably get a couple more years out of it, but I'm upgrading to an RTX 3060 12GB pretty soon. (Budget PC gamer here.)
2:00 I don't think that'll necessarily be true. When I bought my Vega 64, everything was 1k at full max (except AA & motion blur, of course). Now I'm lowering settings, of course; Darktide especially is full low with FSR. However, Darktide at full low kind of looks like older games at full max, so... My main gripe is actually VRAM: I can't watch 1080p videos on my second monitor while gaming without something lagging its life out.
I used my GTX 980 Ti from 2016 to 2023 and saved tons of money not having to upgrade year after year. I enjoyed that experience, so I bought a 4090 FE, and it's actually going to save me money by not having to buy a new graphics card every year for the next 7 years.
I still have a 1080ti w/11gbVRAM...it's in a backup system and no longer my primary gaming system. However, it still works great for 1080 gaming. I have moved to higher resolution and refresh rate gaming....but that doesn't mean I wouldn't be able to play my games on a 1080 monitor still. I purchased that 1080ti back in 2017, so it's around 6 years old. I purchased an EVGA watercooled RTX3090 in the 4th quarter of 2021 for retail price with the expectation that it will also last me at least 6 years.
I bought my first GPU in 2003. Since then I've changed GPUs yearly, just adding some money and buying a better used card. I've spent no more than $2500 on all upgrades over 20 years. Now I'm rocking an RTX 3080, we'll see what I buy next year.
Funnily enough, last year was when I upgraded from my RX 580, which I've given to my younger brother and he's enjoying it quite a bit. For 1080p low, maybe with some FSR, it's still kinda awesome. That was a really good buy IMO.
You either have a high fps expectation (90+) or idk how a RX580 can't play higher than Low at HD🤔 It should be easily Med-High on all games except if you want crazy fps
@@upfront2375 "Next gen" games are coming out and they are getting kinda taxing on the old 580. Plague Tale, Spider-Man, The Last of Us. But again, I think it's an absolutely great GPU and one that will, with the help of community drivers, last for many more years of gaming.
@@korysovec it's poor optimization, man. I wish I could buy one for my old build in the other house, but it can't be found new, and it's super difficult to find one that wasn't mined on.. a literal "work horse", and that's coming from someone who's been an Nvidia user since forever!
I update roughly every 3rd year, i like playing everything on ultra at high framerates, so as soon as i see performance starting to lack, i buy the current gen.
Playable really depends on the gamer. Throughout my gaming life I've used an APU, then a GTX 750 Ti, and now a 3070. I still recall playing games like AC Black Flag on my APU at 16-18fps; those were playable for me. Then I moved on to the 750 Ti, which I used from 2016 until November 2022, so a good 6-7 years, although I was a 720p gamer. The only reason it's no longer usable is of course performance, but also because it doesn't have DX12 support, so some games won't even start. Now my 3070 has spoiled me with max settings at 80+ fps, so I doubt I'll ever be able to play below 60fps again. If I had just gotten a 2060-tier card as the upgrade, I would have stayed used to my old low-settings gaming for the next 3-4 years, unless another DirectX version appears that would limit the GPU.
the fact that my 1080 can still do 1440p mix of high med settings is enough for me, most times I launch the game and just play it unless shadows or lighting look a little soft and the hardware's still got wiggle room for playable FPS.
Bought a 3070 FE on release and enjoyed it for 30 months playing at 1440p high to ultra graphics in games. I recently purchased an MSI RTX 4090 Suprim Liquid X and it was a hefty purchase at $1750 ($1860 including taxes). I weighed heavily on whether to purchase it or not because of the cost. But with its massive CUDA core count and 24GB of VRAM I personally believe it's good for at least 5 years or more before I need another upgrade. The real question is: will the power connector melt before then? 🤔 That's my biggest concern.
Gave my brother an XFX RX 580 8GB 3 years ago and he is still using it. FSR makes it capable enough for 1080p mixed settings. I have a (2021) 2060 12GB that he is getting this year, so hopefully that will allow him to bump settings up a bit for another year.
Still using three Nvidia Titan X cards in SLI since 2015 at 1440p, 60 Hz, with no issues on high settings. I'll adjust some items like shadows, water, and sky to lower settings and can play everything. I was tempted to update to a new system with a 4090 GPU and an Intel i9 13th gen, but I'm still able to play everything I want without issue, so I held off.
I think it also depends a lot on the console generation. Once a new console generation is established and you buy a decent GPU it will last for the rest of this console generation because the consoles will limit what new games can do.
I did get my 9900K + 2080ti system in 2019 with a 5-year-plus timespan in mind, but then I already had an ultrawide 75Hz 1080p monitor and wasn't satisfied with the 1440p and 4K options on the market, which meant I was bottlenecked by my monitor even with everything maxed out until the end of last year, when I finally managed to get a 4K 144Hz monitor. By then the 4090 and 4080 had already been launched, and now my 2080ti (which has 3070-level performance) can only keep 4K over 60 fps in maxed-out current AAA titles such as God of War with DLSS (quality mode) and a mild GPU overclock. I hope I can keep this up for at least another year (thus making the planned 5-year mark), because even a 4080 (which would give me about a 50% performance gain over the 2080ti, I reckon) is still too expensive compared to what I paid for my 2080ti.
You are forgetting something: count the time for the card to be high end starting at the launch date of the card, not starting at the time you bought it (as new). A fresh-out-of-the-box 6900xt is still a 2 year old card.
Depends on the games and monitor. I'd say upgrade if the fps is low and a BIOS update can't fix it, or if you see weird graphical glitches.
Maxwell and Pascal were true performance champions and are still very serviceable to this day. Actually, just a few minutes ago I upgraded a Kepler GTX 660 2 gig to a GTX 1650 Super. They last as long as you have use for them?
I bought my 1060 6GB in December 2016 for ~250 USD. It's still running multiplayer games at mid settings, 60+ FPS. Singleplayer games, like Hogwarts Legacy, run at ~50 FPS on low/mid settings. It's time to upgrade to a 6650XT or 6700XT for the next ~5 years.
I still have about 2 months of warranty left on the used EVGA RTX 2080 Ti XC Ultra I bought on eBay for a high bid of $796 on 11-21-2020, and the high settings on COD Warzone seem to average around 130 FPS, so I see no reason to upgrade yet. Of course I'm on an older PCIe 3.0 Z490 system (i9-10900K), so that may be why the PCIe 4.0 MSI RTX 3060 Ti Gaming X Trio (12/2/20, $489.99 MicroCenter) had a little less FPS here but seemed far better in my son's MSI MAG X570 Tomahawk WIFI (Ryzen 7 5800x). So much so that I thought the Gigabyte RTX 4090 Gaming OC was a waste of $1699.99 (3-9-2023 BestBuy); I may return it.
Up until now, I was upgrading GPUs for "future releases". I eventually realized I was wrong. If your current GPU delivers great performance and you have a great time playing the games you want to play, there is no need to upgrade. Of course people playing at higher resolutions (especially 4K) have it harder, as they have to keep upgrading just to be able to play new games at 4K. Kinda reminds me of the 2000s, when a new mid-range GPU was already weak after a year. But overall, gameplay should matter, not visuals (graphics). I play most games on High even when my GPU is fully capable of Ultra, simply because in most cases there is no visual difference between those two settings.
It could last forever considering the hefty backlog most of us have of games that will run on what we have already
Agreed. GTX 1080 Ti still usable even these days, especially with the help of FSR.
Same here, the 4070ti I recently bought buys me 2 years of 1440p future gaming and anything that was released in the past, which is a LOT of great games! After 6-10 years I'll probably do the same.
This is such a 💯
Must have 100+ games not even started playing yet over last 10 years.
Own about 300 games PC / console and tablet games.
Zero interest in 4K gaming so my 3080Ti should last a long time.
All you need is 1440p and a massive game collection.
Even a 1050 Ti works just fine these days for most older games still.
I bought a 970 early 2015, it survived up until this year when halo infinite, god of war, and games like cyberpunk 2077 started showing the cards age (
I've got a similar card, a GTX 1060 3gb that I bought in 2017. It plays everything I have at 1080p. I'll look at the lower spec rdna3 cards from AMD later this year.
@@RolanTheBrave smart move am looking to buy a 6700xt or looking what 7700xt bring to table if it's expensive i buy a 7600xt and call it a day if that gpu be something like 6600xt
That's pretty brave when you consider Nvidia scammed out all 970 Users that the last 500 MB Ram of the GPU runs slower than the rest of the 3,5GB RAM. I bought a 980 instead and ist lasted me more than 5 years. G-Sync does its wonders too to have a steady fps even on fast FPS games
DLSS and frame gen on the latest 40 series cards is going to help extend their lifetime as well.
Same here. Upgraded 970 1080p to 2070 wqhd. Runs all games fine with some adjustments. For unreal V games and RT in 4k i will upgrade 2024 to a used 4080/90 or a 5070, depends on specs.
A GPU typically last as long as you want it too. For me I make it last 5-6 years. I have a 3080 now, but my old Rig had 2 980 ti's in SLI. I buy the best at the time to make it last. I save $80 a month to accomplish this goal. $80 * 72 months = $5760. I plan to build a new Rig in 2026. Yeah sure you lose performance over the years, but for many years fps is usually fine if you target 60. With DLSS this goal is even easier to obtain.
This is a good way to look at it. I may adopt this model.
idk about that, My GPUs have typically died after around 8-10 years. I've never had a GPU last longer than 10 years(assuming consistent usage). The most recent one to die on me was a 760 gtx that I got in 2013 and died in 2021.
@@Tony_Calvert yeah I like this idea too. I’ve had to drastically compress my saving to get a new rig. Been throwing back 500£ for 6 months now & im doing 700£ for the next 2 months ( hardly living lol ) get my 4090 watercooled rig by June hopefully.
But I’m not doing this again lol. It’s hellish 🤣
@@Kreege I only switched pcs last month. After running my current pc with a GTX 660 for almost 11 years.
The card is actually still fine, just can't really run anything new anymore.
So the pc will still get some use as a basic non gaming desktop after this.
If you take good care of your card's temp during the 10 years, it can last very long.
This is how we do it 💕
I'm still using a GTX 1080, while looking to upgrade this year. Rather than playing at lower settings, effectively I'm extending the lifetime for my GPU by having shifting habits regarding what to play, as I've been holding off on playing games which seem like they'd be a significantly better experience once I upgrade. There's still plenty of good, less demanding games to play instead, so I feel OK doing this, otherwise I would have upgraded already.
Good point! I play both old and new games, so if my GPU's running out of steam for new games, I play more old games with custom content, such as Doom and Quake. Their modding/mapping communities are still going strong, so I'm not running out of stuff to play even without upgrading to a new graphics card.
Lucky you, my 1080 is starting to reach the end of its lifespan. Originally upgraded to it from a 1060 in 2019. It gets pretty loud now as it goes over 80 degrees. I play mostly CPU bound games, but a few games like OW really ramp up the fans on my 1080. No complaints on performance though, but it does get louder now.
I was using a GTX 1080 at 40/60fps and it wasn't enough any more; 1440p was shaky. Upgraded to a 4070 Ti on a 1440p/165hz monitor and it works great. The 1080 is a 6 year old or so GPU, so the upgrade was definitely needed; even at 1080p it can get a bit tricky in some games. DLSS is a game changer as well for games like Call of Duty. Bread and butter.
Could play WoW using low to medium settings on an HD 6870 last year and this year with 50-60 fps😂....i think that it all depends on the age of the games you play....older games may not look as pretty but the playability is still there😊
@@slob12 Yea, Like if im getting 60fps on certain games im fine. If im getting over 120fps in games like cod im solid as i play on a high refresh rate monitor
Monitor resolution and refresh rate plays a major part in the necessity people feel in upgrading their gpu. No point in having a 4090 and playing on a 1080p/60hz monitor. If you're looking at upgrading a gpu to something faster you also have to look at if it's worth it with the monitor you intend to be playing on. The two really should be considered together. If people are also considering VR they should also take account of VR support and their VRs resolution and refresh rates. My recently replaced 1080ti was well balanced in my previous setup offering good frame rates (80+ fps) for many years but since upgrading to ultrawide 1440 and 144hz it couldn't quite get the smooth frame rates I liked so it was time for the upgrade.
Yeah, I think if you're satisfied with just 1080p 60-120hz gaming then something like a GTX 1080 still has quite a few years left in the tank.
@@xblfanatic360 I've always chosen my gpu with regards to my monitors capabilities. High spec monitors are reasonably priced now compared to gpus though.
And obviously the games that you play on them. I went from a 660 Ti to a 1060 on 1080p and noticed the difference even at that res back in 2017
Since it's easier to just slot in a new GPU and they get replaced more frequently, I tend to go a bit overboard on the CPU side, knowing that I will use the platform for a while.
@@HappyBeezerStudios There is a lot more adaptability in the games though. Going from Ultra settings to High can gain a lot of fps without much loss in graphical fidelity that you don't really miss anyway. Dropping to medium isn't that major either but does tend to leave you feeling you could gain that extra fidelity if only your gfx card was that bit better. Great games mainly come from the story they tell and how they play more than amazing graphics with no depth.
If you just want to stretch the usable time a bit, VRAM size might become the dominant limiting factor, especially for newer NVIDIA GPUs.
Only because they've limited it on purpose as an upselling strategy. It's the apple approach. Here's a base MacBook air with 8GB ram and 256gb of storage for 900. Want 16GB of ram and 1tb of storage? 1600 please
@@aightm8 ngreedia doesn't really care about selling GPUs at all anymore. All they care about is selling their laggy GaaS subscriptions, and that explains everything they do. The servers they build can be used for AI/GaaS and bring in money constantly; they also sell server time to multiple users simultaneously, so the same silicon is basically leased (not even sold, ngreedia keeps it) 10, 20, 50 times over. In contrast, a GPU is sold once and that's it: no more profit, and they have to share it with the AIB manufacturers.
So ngreedia kills PC gaming by not making faster GPUs with more VRAM at affordable prices, and sane gamers are stuck playing games on morally obsolete engines or on morally obsolete low-res screens, because affordable GPUs are way too slow.
1440p is currently the norm in PC hardware and ngreedia GPUs are way too slow for 1440p gaming, even the 4090. ngreedia is killing PC gaming on purpose to sell GaaS that doesn't make sense to PC gamers because of latency.
And AMD is happy to go along with GPU overpricing so it can sell more console chips to sony/ms (and keep selling its own overpriced AF GPUs, of course).
The only thing that looks weird is that AMD apparently doesn't understand that without affordable new GPUs that are way faster, nobody really needs new CPUs. They shot themselves in the balls by going along with ngreedia's GPU overpricing, and now their CPU sales are way down.
But consoles are just THAT profitable for AMD, so they just don't care.
Until there are WAY BETTER value GPUs, people should just play great old games on USED GPUs and not give any of their money to greedy corporations. Don't buy consoles made by amd, don't buy ANY GPUs from ngreedia, amd or Intel (which price/perf matched buggy Arc to ngreedia's overpriced AF GPUs).
Don't pay for GaaS and don't defend overpricing from ANY corporation, and don't defend ANY morally obsolete or new overpriced AF hardware with low VRAM. It IS ALREADY OUT OF DATE.
this is the way
@Blue well, at least the base Mac m1 air is ok with 8
ESPECIALLY for Unreal 5 games with ultra photorealistic textures. Onward on Quest 2 easily uses 16 GB of VRAM with my settings. Even a 4080 for more money wouldn't handle that. GOODBYE NVIDIA. I am glad I went to AMD this build for CPU and GPU.
@Blue unified cpu and ram reduces the requirement
This basically is the reason I used to target the $350 price range (970, 5700xt). You got incredible value and could consistently get good performance by upgrading every 3-4 years. Unfortunately, that midrange value is dead now.
@man with a username 6800xt/6950xt ... or waiting is also an option
@@navid617 doesn't excuse stupid pricing on both brands at the moment.
is upgrading every 3-4 years recommended? meaning I should buy a used one of 2 years max currently if i want to use it for 1+ years…
My 1080 TI aged really well with the 11 GB of VRAM for 1440p. The 4070 TI is twice as fast, but I'll have to upgrade it sooner for sure.
That's my problem with this generation's mid tier cards. They give you more VRAM but then choke it with the memory bandwidth. So you will still be turning down settings sooner as it ages. I just don't understand Nvidia.
@@bigdaddyt150 they like money, and don't care what we think/want. its simple.
@YourFavoriteLawnGuy you're right, as much as I want to like Nvidia I can't. What they're doing with their products in the mid tier and low tier, for what they're asking in price, is purely out of touch with reality for those of us that understand the actual specs and how they affect performance long term.
@@bigdaddyt150 mining is more or less dead so they need to choke their retail customers harder than ever before lmao
@@bigdaddyt150 If you start calling them Ngreedia and learn about what planned obsolescence is, then it will all make sense to you. They can no longer make more money by providing better products, so they start to try to push worse products for more money in the pursuit of endless increased profits.
All hardware remains as capable as it was one day after release (aside from malfunctions). There are also graphics drivers and optimization of programs (games), and those even get better over time. Quake 2 on a Geforce 4 runs as well as it did 20 years ago. It's a matter of new software, and no one knows how the software will develop... I roughly hope that my RX 6700 XT will serve me for everything that has come out so far, and into the future for at least another 2-3 years.
I passed down my 970 to someone else and we were surprised at how well it handled some new games like New World (low-medium settings) for a 9-year-old card! We were expecting it to catch on fire as soon as the game loaded lol.
The fact that my 1070 has 8GB of RAM and entry tier cards in 2023 only have the same or less gives my wallet a happy face. Before that, I had a 2gb card from the very first Radeon GCN generation. There are too many games to play that don't need a new, three-slot card that weighs and costs more than an entire laptop for me to upgrade.
@@MAKIUSO uhm, the 3060 didn't have 12GB for the clients' benefit, just to jack the price up even more during the crypto boom and shortage. The 3060 Ti had 8GB, and look how Nvidia is introducing a new 3060 with 8, or was it 6GB of RAM.
@@MAKIUSO A game developer isn't running GPUs at 100% load 24/7. The GPU is idle while writing and debugging code between test runs, same for graphics design where software is waiting for user input and the GPU will be sitting at 15-30W refreshing a static screen most of the time.
Honestly the problem with Nvidia isn't the RAM, but the price. A 4070 Ti 12GB for 900€ is bad, but for 500€ it would be good. 12GB is enough for 1440p, and will be enough for a few years. Yes it's gonna struggle in 4K in some games, but it's not advertised as a 4K card, though it's priced like one. It's a matter of price in my opinion, and it's the same for the 4060. 300€ for a 4060 8GB to play at 1080p is good enough in my opinion.
VRAM was a big part of why I went with AMD instead of Nvidia when I was looking for a new card to replace my 1070. Some games were close to maxing out 8GB, though the one I had "issues" with was Monster Hunter World with hi-res textures that used just over 7GB of VRAM.
I knew it wouldn't be long before games would start to bump up against that limit, and even exceed it, so I wanted more VRAM on my next card, as I typically wait 2-3 generations between upgrades. My previous graphics card was a 680; the main reason I got a new one was the VRAM limitations.
When I saw the 3070 only had 8GB of VRAM, I decided against getting it purely on that reason, especially since I had gotten a 1440p monitor. AMD had 16GB on their 6000-series at the time and was a more attractive purchase. Frankly, until Nvidia stops with artificially low VRAM on their cards, I don't see myself considering them for my next purchase.
In the end I got a 6900XT as my 1070 was dying (originally I was going to wait for the 7000-series). While looking for a card to buy I was contemplating getting something like a 6600/6700 until the new cards launched and then selling that card off, but I found a good deal on this XFX Merc 319 6900XT. It cost something like 10% more than the 6700xt and less than all 6800xt models in my country, so it wasn't a hard sell. So far it's chugging along quite nicely in all the games I play, with relatively little usage and low temps (it typically sits at ~50C in games with a VERY lax fan curve; even playing Final Fantasy 14 at a locked 60fps it's 50C at some 25-30% fan speed). The only downside to my card is some transistor noise, but I don't think it's bad, though your mileage may vary.
My GTX970 is 8 years old and still going strong. I use it for a 4K television box/email/some Photoshop. Don't play modern games on it (got a new system 5900X 2.5 years ago with a 3070) but that said I got 5 years out of it with some compromises which I was fine with (though often going down to 1440 for some games before I got the new system with the 3070). Mostly play racing games in VR these days and will probably wait until 50 series (use Nvidia over AMD for production software reasons) for the next upgrade.
I am a 980TI user. I bought it to use with my Oculus CV1. Still runs great, but it certainly is feeling long in the tooth. I'm looking to upgrade this year but it's been crazy to think I bought my 980TI for $650 and thought, this is way too expensive. Now, that price barely buys you into entry level
Dude, it's the goddamn inflation too.. In 2015, $650 was like $900-1000 today. Prices are still high, but only by about $150-200 beyond that 😂 Seems like we have to pay for inflation affecting the mistresses and boats owned by those rich fks too! SMH
It's worth mentioning that the majority of the large jumps in performance needs come from the transition from 1080p to 1440p to 4K - so unless people are jumping to 8K, I don't think VRAM needs will increase at the same speed as before.
Not really, VRAM requirements have increased outside of resolution increments. Games became bigger and more complex; add ray tracing and poor optimization to the mix and you get high VRAM requirements in today's games.
@@ArthurM1863 I *did* say the *majority* and *large* jumps 🙂
Very few people are at 4k right now.
@@aapje Very few people also have the highest models of GPU, though
Yes. I currently have a 4K display. No way I'm going to upgrade to an 8K display anytime soon. It's just too much of a muchness.
Honestly, when I built my newest PC back in 2021 (6800xt/5800x) I gave my brother my old PC (2080/i7 10th gen), and he gave his wife his old PC (1070/i7 7700k) and we all are doing fine. I play games at 1440p so I'm happy, my brother plays at 1080p and can now play at 120hz no problem, and his wife tends to play games like The Sims so 1080p/60fps is perfect for her.
I'd hope the people that bought into RTX 3000 get to use their GPUs for a good few years before Nvidia makes them feel obsolete.
yep my 3080 ti is still going strong in 2023.
Nvidia already made them obsolete because of VRAM. They are still usable, sure, but so is RX 6500 XT if you keep convincing yourself long enough.
@@janbenes3165 Yeah, I have a 1080Ti, so while a 3070/3070Ti may be a core improvement, anything less than a 3080 is a backwards step for me, as I play adventure games on the highest possible details.
Limiting VRAM to 8 gigs on the 3060 Ti, or to below 16 gigs on the 3080, is already a limiting factor for a high tier product.
But I also hope that many of these RTX 3000 customers don't buy a 40 Series soon, otherwise Nvidia will carry on with overpriced cards with too little VRAM on them. We'll see how much VRAM the 4060 is going to have (keeping in mind that 12 gigs on the 3080 and 4070 Ti already hits a limit).
@Jan Beneš meh, the 10gb on my 3080 works great on my ultrawide.
I do think the 10GB version shouldn't ever have been released, and the 12GB version should have been the $700 3080.
Nvidia are cunts though.
Indie Games, Hidden Gems of the Last Few Years, and -DLSS- /FSR make my 1070 Ti shine like a little Gem itself.
It’s quite simple:
As long as it gets the job done for your needs, keep using it for as long as possible.
It saves money on unnecessary purchases and reduces our overconsumption of electronics.
My GTX 960 4GB that I got in 2015 for like $200 lasted up to late 2021. I mostly played at 1366x768 before I upgraded my entire system in 2021, where I ran 1080p for a while before getting my 3070. That 4GB of VRAM, which many called overkill for 2015, helped my 960 last so long. It's a shame we aren't getting that much value these days.
Last month i upgraded from gtx 960 to 6800xt. Planning to use it for 5+ years
I had my MSI GTX 980 for 7 years, from 2014 to 2021. It came with a free voucher for The Witcher III. I upgraded to an ASUS RTX 3060 Ti OC (non-LHR).
Had my old 980 Strix for about 5 years, before I swapped to a 2070S which I am still running. I would want to upgrade now, but prices are idiotic and I don't mind waiting out another GPU gen to leverage the performance diff. The 2070S is perfectly capable with 3K UW, no I don't max out raytracing but I am also satisfied with the visual quality overall in most modern AAA games so what's the point. Most AAA publishers have doubled down on live-service games that rarely get major overhauls and thus don't push visual fidelity and gpu demand like they used to 4-5 years ago. And with no major new titles on the horizon that are expected to push gpu performance demands significantly, there isn't a big reason to not wait for Nvidia 5000 series (or a mid-gen 4000 refresh).
I bought a gtx 660 in 2013 and I'm STILL using it. Many people aren't fans of nvidia, but they're supporting Kepler for 12.5 years, and they have geforce now to help you run certain demanding games. I even did a 1080p stream with it... with my CPU, this is only possible cause of nvenc. I will upgrade in like 1.5 years or so at the latest but I certainly got my money's worth out of the card, I'm happy.
If you are buying a 4090, you are probably buying it for the VRAM anyway and/or productivity, as opposed to the expectation it is going to remain a high-end product for almost a decade. And even then, most people who bought the 1080 Ti are still getting use out of that and okay with it being a great option for 1080p - 1440p high settings.
for me personally if i cannot lock 60 fps anymore regardless of graphic detail thats when i start to look for a new gpu/platform
I’m on a 3090 which I bought about 6 months ago used. I think the 3090 is probably good for another 1-2 years as a high end product just because of its VRAM and still decent RT and 4K performance. 6900XT and 6800XT will probably also be fairly solid cards even at 4K (no rt) just because AMD had the decency to not choke their cards with 8-10gb like Nvidia did.
Yeah, I think the last time I bought a brand new card from a store was back in 2014-15? A GTX 980; store prices are just ridiculous. You can get a used 3090 for 2/3 of the price of a 4070 Ti and it will be on par with it in 1080p and 1440p and destroy it in 4K due to the crap 192-bit bus of the 4070 Ti. The only selling point of the 40 series is frame generation; no wonder Nvidia made it exclusive to the 40 series, otherwise they would be pointless...
Wtf do you mean by 1-2 years lmao. It will last at least the next 5 years
@@filthyhanzomain7917 i was confused as well
@@filthyhanzomain7917 1-2 years as a high end product
sure, it can most likely last for 5 years but at that point they wouldn't be high end anymore
This is the reason why I’m heavily considering an AMD card over a 4070ti. The 6800XT is already proving to age better than a 3070, and I think the 7900XT will age better than a 4070ti because of the VRAM. By age I mean only a couple years.
For an $850 tier card the 4070ti is already running into a couple games that max out that 12GB of VRAM.
You've got way more than a couple of years before 12 gigs of VRAM is an issue at 1440p, considering it's way more dedicated VRAM than the consoles have. Neither card is a 4K card, so having the extra VRAM is pointless.
I thought the 6700xt was the 3070 competitor, and from everything I've seen the 3070 has aged better than the 6700xt …
@Blue yeah, you're in Germany though, that's like the US vassal state where everything good costs more.
@Blue I don’t know I just watched hardware unboxed updated video where the 3070 was 19% faster than 6700xt in a 50 game benchmark ??at 4k mind you it was like 14% at 1440p
My GTX 1070 has been going strong for about 6-7 years now. 1080p & 1440p. Still gets 60+ fps on most titles except on unoptimized AAA trash which I rarely play anyway. Was gonna finally replace it with a RTX 4060ti 16GB, but that card turned out to be *censored* so I guess it's time to start using Nvidia Image Scaling in more demanding titles until a better card comes along. :D
I waited 6 years with my GTX 1080 Ti on my 34" 1440p ultrawide. It still plays most games well on low quality, with a bit of lagginess in newer games. I had my heart set on an RTX 4080 but settled for an RTX 4070 Ti to save $600.
I'm in the exact same situation. But I really don't wanna pay like 900-1000€. The 4070ti would double the performance, but idk man. I guess I'll wait a few months for prices to drop, maybe.
@@blue-lu3iz That's a good price. But I have a relatively old G-Sync monitor, I think from 2016, a PG348Q. Not sure whether it's FreeSync compatible. For testing purposes I once turned G-Sync off because I was running 3DMark, forgot to turn it back on afterwards, and even at 60-70 fps you notice a huge difference when G-Sync isn't active. It feels like the game constantly has micro stutters. Ray tracing I honestly couldn't care less about. The raw performance of the RX 7900 XT is better there, but it also sucks down 50-70 more watts, and I only have a 650W power supply, so with the 7900 XT I'd probably have to upgrade that too. I'm no Nvidia fanboy, but I've always been very satisfied. Still, my 1080 Ti has slowly had its day 😂. I'll keep watching for a little while; I really don't want to spend more than 900.
I've had a 1070 since 2016. Finally starting to look into upgrading my rig, but current pricing makes me gag. I'll probably just upgrade my platform and keep the graphics card a while longer.
You mention how developers target midrange systems. So by buying a top end system you will be within the range targeted by developers for much longer. I went from a 1080ti to a 7900xtx. I never felt the need to upgrade my gpu until this year as I could always play my games at high - max settings and get 60-120+fps. This year is the first time I had to turn anything to low to enjoy a game. That is what pushed me to upgrade. I also play at 1440p.
This is the only reason everyone should consider upgrading their GPU: if you feel the need to change the settings from at least mid to low, that is your red flag to start searching for a new toy. You can squeeze your GPU by changing the settings from high to mid, but I feel that games look shitty on low settings. Lower your resolution or upgrade your GPU. That's it.
Damn I had to endure a gt 730 till 3 weeks ago when I got my XTX 💀
@@nike2706 Really? I feel like resolution should be the last thing you change. Unless we are talking dlss/fsr. Anything non native looks... not great imo.
@@jessefisher1809 For the sake of example, let's assume we're talking about a system built for 4K gaming. If my video card isn't up to snuff, I'd rather play at 1080p than at low settings, because resolution primarily hurts fonts, while low quality textures and shadows hurt everything.
If you're willing to drop settings from ultra to high to medium, then the card will definitely last longer before sub-45 FPS happens. If you always want near-ultra settings just for the heck of it, that's another story of course; then it gets expensive buying new top end cards just as often as midrange cards. I am quite used to tweaking settings, so I could probably make a high-VRAM top end card last a long while (half render scale 1440p/720p might still show 100% size HUD elements in some games, etc), but I could also make my current RX480 card last quite long on medium/low settings for much less initial money invested. It just depends on how low you're willing to tweak graphics settings and whether the CPU is bottlenecking.
Just upgraded to a 3440x1440 display and pleasantly surprised to discover my 5 year old 1080ti plays everything buttery smooth at max or near-max quality settings
My 1070 is still able to push high settings on ultrawide monitors, but I still want to upgrade soon to get new features like DLSS.
What PC Components do you have to go with the 1080ti? As I want to upgrade the rest of my PC (4790k and 16gb 1600 mhz RAM) to a current CPU and Motherboard. Plus, buy a gaming monitor (no OLED - I use a lot of static objects on screen) instead of using my 12 year old LCD 60hz TV.
@@jodiepalmer2404 I have an i7-5820 with 16GB 3200. What I also have - and I believe this is the component that really makes a big difference to the speed of the machine’s operation and my perception of its “smoothness” and responsiveness - is a Samsung 960 Pro NVME chip.
@@ifanmorgan8070 What display do you use that you mentioned?
@@jodiepalmer2404 The new Samsung 34” OLED G8. It’s beautiful for gaming and photo editing/video, but there is a colour fringing issue with text due to the shape of the pixels. You get used to it after a while but initially it’s quite jarring.
My previous monitor was an Asus MX299Q, which I definitely recommend as a budget option - nice wide desktop and gaming field of view, good for productivity apps and great for gaming. It even has a good contrast ratio for a monitor of its age, but the blacks aren’t quite as inky, and of course the response time is slow by today’s standards. Tbh though the only real quality-of-life difference for me with the OLED G8 vs the MX299 is the enlarged desktop space / gaming FOV, deeper blacks and a minor sense of improved smoothness and responsiveness probably due to the 175hz vs 60hz increase. It’s not that big a deal though. Depending on your financial situation, I might be tempted to wait for the next gen to get rid of the colour fringing issue. Without that, my new G8 really would be the ideal monitor.
Honestly, it completely matters whether or not high resolution monitors will drop in price to be more affordable, so they become the norm over what we currently have, which is 1080p monitors. Besides that, it's up to you what you demand to play. I told myself that the online games I want to play should run at 120-144fps, while the heavy single player games should hit a minimum of 60fps at medium/high mixed settings.
My 3070 does exactly that, even though I run it with a somewhat outdated CPU (3600) which I plan to replace with a 7800X3D. I doubt the GTX cards can do that except the 1080ti, while the 2000 cards don't hold a candle to the 3000 series, which can be found extremely cheap, especially if you don't mind used.
Even a 4090 can't run all games on midrange 1440p 144Hz monitors with all the eye candy turned on and without upscaling trickery.
And yes, 1440p 144Hz+ monitors ARE MIDRANGE if not budget already; look at their prices. There are great MODERN NEW 1440p 144Hz+ monitors for 250-300 bucks. Try to find any great 1440p-class GPU that can run MODERN games at 144+ FPS for 250-300 bucks. 1080p is extreme budget by now.
1080p is obsolete, extreme-budget-level monitor territory; nobody should make or buy a GPU for more than 100 bucks for 1080p.
1440p is the starting point in 2023, just like 12GB of VRAM should be.
1080p GPUs should literally be VERY BUDGET GPUs priced around 100 bucks. A 500-buck GPU with 8GB of VRAM is laughable and SHOULD BE RIDICULED, as it is a morally obsolete, dead-on-arrival, overpriced AF piece of crap.
I wonder if this clip was published right now because you guys had set up a poll asking us at what percentage increase we'd be willing to pull the trigger on a GPU upgrade.
GPU’s are like cars. They lose value as soon as you buy them.
HD5870-> 1GB VRAM buffer was the main problem
GTX680-> 2GB VRAM buffer was the main problem
GTX1080-> 8GB VRAM buffer is fine for me... but mostly because DIMINISHING RETURNS have meant that most games I already own are running at 1440p/60FPS on HIGH (ish) settings (previously, just cranking up the RESOLUTION alone from, say, 1600x900 to 2560x1440 would make a huge difference). I wouldn't buy a NEW card with less than 16GB of VRAM, but I also feel no urge to pay a LOT for a new card just so I can buy new games and/or enable ray-tracing. If I buy a NEW game I'll get one that runs great on my GTX1080... meanwhile I've got a back catalogue of great games that are fun, run well and look great!
I'm perfectly happy at 40 to 45 fps on 1080p with HUB optimized settings. I'm hoping this will help me keep my RX 6700 for at least 5 more years. I prefer lower power consumption over fps or ray tracing.
I was using an RX 580 from its release up until recently (no money, then no cards, then waiting for a new gen). That 8 GB of VRAM helped a lot. It's still usable now; most online FPS games work ok and games like Atomic Heart run quite nicely too. My friend now uses that old PC of mine. Hell, that card still holds the same OC levels as a new one.
I believe the VRAM amount on mid-tier cards today is pathetic. 8 gigs was overkill for the RX 580 at release but helped a lot in the long run. 8 gigs on a 3070 is just a joke. Darktide uses 9.5 gigs at 1080p already!
So I have gone with the RX 7900 XTX because I have no idea when, or if, I'd be able to buy a new card, so I need staying power. RT features are secondary and the first to go if performance drops. Nice, good textures contribute much more to picture quality, and 24 gigs should be fine for a few generations.
I have a 5600x and a 6800 xt I’ll end up upgrading my CPU before my GPU. 6800 xt drivers have been friendly to me ever since I got it about a year ago. I see zero point in upgrading just to deal with newer driver issues.
If you use that 6800 XT at 1440p, the 5600x is more than enough.
At 1080p the CPU might bottleneck the GPU but you will have well over 100 FPS.😊
@@Shatterfury1871 I play in 4k on a 55in LG C1 so technically I can justify an upgrade. But I get over 100 fps in everything I play using FSR and game play still feel nice and smooth. So it’s still a waste of money. Even if I had a better GPU I’d still be using FSR or dlss.
@AnimeGeek611
You certainly can justify it, at the end of the day you decide what you do with your money.
But at 4k, the bottleneck is 100% the GPU, not the CPU.
At 4k, the 5600x can handle even the RTX 3090 TI.
So if you go for a 5800X3D (I guess you are on AM4), you will be lucky to see a 5 FPS increase at 4K.
It depends largely on 2 factors. The main one is the games you play/are likely to play: if you're always playing the latest games on release, you will more than likely need a new GPU more frequently than someone who plays the same games regularly (i.e. mostly the big online multiplayer games).
The other factor, weirdly enough, is your screen size/refresh rate. As my eyes are getting worse I thought screw it and got a 48-inch 1080p TV for cheap. Unfortunately that also means I constantly have to force an oversized resolution (supersampling) or jagged edges are EVERYWHERE, and because of the nature of the issue AA kinda makes it worse, so much so that I can't play some games because my 1070 can't really do any higher than 1080p and the jagged edges are headache-inducing. Though it's worth noting that in lightweight games like Counter-Strike, LoL, Valorant etc I can set it to 1440p, which gets rid of most if not all the jaggies and actually makes for a bit of an advantage in shooters because the target is physically bigger. So if you do go that route, you will need a beefier card to deal with the supersampling, even more so at a high refresh rate, since your acceptable framerate floor is much higher.
Again though, if you're generally into the most popular online games on a 28-inch or smaller monitor, you're probably fine with any 1070-2070 card.
when i build a system for myself, i select parts that i think will get me through 5-7 years without having to upgrade in order to do what i want to do. its nice to be able to play at the highest level now, but what im after is longevity and stretching my dollar. i think it all comes down to being able to play at the level you want it to, and there is no set amount of time.
You would get better value AND, on average, better performance as well, if you were to upgrade every 2-3 years, but always only get cards which give optimal performance per dollar. When you consider how much you can recoup by selling the older card each time, the net upgrade cost ends up being very low, but you can also always have very good performance if you never wait longer than 3 years to upgrade from a good mid-range card. You certainly can get more life out of a mid-range card, but you're going to have to lower your expectations as time goes on. 7 years seems really ridiculous to expect a card to last though, especially a high end one, because why on earth would you pay a premium for cutting edge performance if you're planning to keep the card for 5-7 years?! You're not going to be having cutting edge performance for most of that period. On average, you would get better performance if you upgrade more often but only buy cards which have good performance per dollar. That being said, periods of GPU shortage and/or very high prices are a great time to put-off doing any upgrades, and to just make do with whatever you happen to have.
@@syncmonism if you were to buy a 3070ti for $600 and then upgrade to a 4070ti for $800, it would have been $100 cheaper than getting the 3090 for $1500, but you would have gone 2-3 years without the 3090/4070ti level of performance. So in that instance, I'm not sure I would have gone the midrange route.
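For anyone who wants to play with that kind of math themselves, here's a rough Python sketch of the comparison; the prices and resale values below are made-up assumptions just to illustrate the idea, not real quotes:

# Compare the net cost of two upgrade paths over the same period (all numbers hypothetical).
def net_cost(purchases, resales):
    # purchases: what you paid for each card; resales: what you recouped selling old cards on
    return sum(purchases) - sum(resales)

path_a = net_cost(purchases=[1500], resales=[])         # one high-end card, kept the whole time
path_b = net_cost(purchases=[600, 800], resales=[300])  # two mid-range cards, the first one sold on

print(f"High-end once:  ${path_a}")   # $1500
print(f"Upgrade midway: ${path_b}")   # $1100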
I currently have a Radeon R9 290 that's 9 years old (two of the three fans have flown off, so it's ready for replacement). Before that, I had a Radeon X1950 XTX that I kept for around 7 years. Soon I'm going to build a new rig, and I hope my new graphics card will also last around 7-9 years.
My 1070 is still kicking and I got it day 1. I'm honestly surprised it still performs well. But I am looking to upgrade I'll probably get a rx 6800 or whatever else gives me the most performance for about $400.
I had my 980 Ti back in 2022 but upgraded to the 3080, glad to see this performance segment still kicking.
My GPU did last 6 years, thermal compounds need replacement and the cooler needs more than compressed air to be cleaned. The GPU is not broken YET and I keep GPU core temps under 67C, anything above 72C and my VRAM starts to produce artifacts during stress testing. Thermal paste can lose function a lot faster in systems that are powered off for extended periods.
I think a 50% increase is not enough for a GPU upgrade. I have now retired my 1080 Ti and got myself a 7900 XT. Actually, I first wanted to switch to a 3080 after 4 years, but at the prices at that time and even a reduction in VRAM, that looked like a bad decision.
Of course, 144 FPS is awesome and 90 FPS is still great, but i can also live with 60 FPS
I sold my 1080ti that lasted 6 years for £200 late last year and picked up a 3090 for £700. I think buying used from trusted sellers and sites is the way to go in terms of value. Stop rewarding these greedy GPU companies.
I mean I would, but I also can't justify spending 700 bucks on a used 3090 when even the awfully priced 4070Ti offers better performance and much better efficiency for pretty much same money at new. The only major difference is the VRAM ofc, but still the 3090 prices have always been ridiculous and so are the used prices too. If it's used, then it's gotta be a truly great deal or the card has to match some very specific spec requirements for how you're gonna use it.
My take is to simply not buy any gpu at all, until prices improve overall. And if that means I skip another gpu gen, then so be it.
@@Real_MisterSir just buy used mid-range amd cards.. they are affordable and have more vram
yess
I narrowly missed out on a used 6800xt that sold for $350😢
Yeah nah, used low end costs as much if not more than new
Nvidia wants you to buy a card every year, with the low amount of VRAM and narrow memory bus they've been putting on their mid/high tier cards.
My RX 580 works well enough. Cyberpunk only runs at 40fps and Elden Ring at an unstable 60.
This year I finally switched it for a second-hand 6600XT. Oh boy, very happy with the change, I wish I had done it earlier.
Same here, with a retired RX 580 swapped for a 6750xt. Playing games with FSR/RSR sure makes the upgrade worth every penny.
I bought an rx 580 near launch day 2017. 6 years later, it's still going strong.
I'm in the same boat, rx 590 here. But I was looking to upgrade this gen. After seeing prices and performance, I'm currently considering a RX 6950xt at say $600.
I have an RTX 3070 and am going to stick with it as long as possible. I play at 1440p and it's doing fine in the games I play.
same here bro
My path is like that
past 1080ti > now 3080ti > future 5080ti (or an AMD card like this)
2:00 I completely disagree with your sentiment here. It very much depends on what game one plays.
Generally speaking I purchase top Tier and game at (now) 3440x1440p at high (not ultra) settings.
That means, at worst I go from 3440x1440p @high fps to 3440x1440 at 60.
Medium - High at 70fps is around the mark where I start wanting to upgrade badly.
Graphics settings don't really scale anymore. Ultra is a waste and Medium most of the time looks very similar to High.
Top Tier cards usually easily last 4-5 years unless you *demand* ultra settings - which we already established is a waste
Plus there are a lot of fantastic older and less demanding games out there, which makes putting off playing top tier games for 1-2 years very easy.
😂 good on Tim around 3:40
I don't solely use my GPU for gaming. I upgraded from an AMD R9 290 to an Nvidia GTX 1080 because the AMD GPU does not have a hardware decoder for HEVC. I might need to upgrade to something newer that supports AV1 when it is more widely adopted.
Intel has the cheapest AV1 option at the moment; however, it's not a big upgrade over your 1080, as they are only a bit faster than the 1080 Ti right now.
Oh, if you want AV1 decode you already have it, as long as it's not HDR I think.
USE GPU RISERS and GPU BRACKETS. They prevent the PCB from bending and keep the solder from cracking due to sag.
I only upgrade based on the game or program I want to use it for, which means if I choose to play games such as StarCraft 1 or 2 for the vast majority of gaming, then the current video card I have will last me more than half a decade before I consider a new card. The primary reason I will upgrade is the utility of work a new video card will give me, which for me will likely be a GPU that can process multiple streams of tasks such as encoding, decoding, and video output to more than one monitor.
My rule of thumb is pretty much 2 generations, which is around once every 4 years. I bought a GTX 1070 6 years ago and replaced it only last year for a RTX 3070 to keep playing at 1440p high quality setting and try Ray Tracing and DLSS. I'm certainly gonna skip the current generation, but will check if the one following will be worth it for that 1440p sweet spot depending on how demanding games will be.
6 years for flagship
4 years for high end (Can extend to 6 years if you compromise)
2 years for low end (Can extend to 4 years if you compromise)
*GPU 3x rule*
My general rule USED to be to look up Techpowerup charts and look at whatever GPU got roughly 3x the FPS. I wouldn't buy a card that only got 50% more, because usually, with a little tweaking, my current GPU would look roughly the same at the same performance. Or if a game NEEDED a new GPU due to more VRAM or whatever, I would just not buy that game. Plus CPUs were changing a lot years ago, so I thought I'd just build a new PC every five or so years and donate the old PC. That worked well, but modern GAMING is so complicated now due to ray-tracing, cost and other reasons. Still, I wouldn't UPGRADE unless two things are true (there's a rough sketch of this check right after these two points):
a) the PERFORMANCE difference on paper is at minimum 2x higher, and
b) I think this would make a noticeable difference in my gaming experience (if the game you play already runs great at 1440p/60FPS or whatever do you REALLY need a better graphics card?)
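To make that rule concrete, here is a minimal sketch of the same check in Python. The FPS numbers and the should_upgrade helper are made up purely for illustration; in practice the figures would come from whatever benchmark charts you trust (e.g. TechPowerUp relative performance).

```python
# Rough sketch of the upgrade rule above -- the numbers and the helper
# are hypothetical, not taken from any real benchmark.

def should_upgrade(current_fps: float, candidate_fps: float,
                   min_ratio: float = 2.0, target_ratio: float = 3.0) -> str:
    """Compare benchmark FPS of the current and candidate GPU."""
    ratio = candidate_fps / current_fps
    if ratio >= target_ratio:
        return f"upgrade ({ratio:.1f}x - clear generational jump)"
    if ratio >= min_ratio:
        return f"maybe ({ratio:.1f}x - only if it changes your actual experience)"
    return f"skip ({ratio:.1f}x - tweak settings on the current card instead)"

# Made-up average FPS at your resolution/settings:
print(should_upgrade(current_fps=55, candidate_fps=95))   # skip (1.7x ...)
print(should_upgrade(current_fps=55, candidate_fps=120))  # maybe (2.2x ...)
print(should_upgrade(current_fps=55, candidate_fps=170))  # upgrade (3.1x ...)
```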
I've owned 6 GPUs in the past decade. My first was the GTX 670, which I upgraded two years later to the GTX 980 because I was convinced not to buy the 980 Ti... which I then ended up buying months later anyway, so as of right now the GTX 980 is my most short-lived GPU. I ran with that 980 Ti for the next 5 years before upgrading to an RTX 2070 Super, which I only did because my computer was starting to die after 7 years and I was forced to build a new one earlier than I had planned. One year later I upgraded to the GPU I always intended to get, the RTX 3090, which I bought used from someone who didn't want it anymore. And now, on the cusp of building another new rig because I made some mistakes on the last one, I managed to get an RTX 4090 for retail price. The only thing that really convinced me a 40 series GPU was a worthwhile upgrade from a 3090 is DLSS 3.0. It's locked to 40 series cards and it does make a noticeable impact on perceived framerates, albeit with mixed results sometimes, but it's obviously being actively worked on. The real question for a graphics junkie like me is whether they'll have anything exclusive to the 5090 next year. Otherwise I'll probably stick with the 4090 this time around.
I was literally going to sell my 3090 for about $800 and put that towards a 4090, spending only $1k out of pocket... keep in mind I got the 3090 for free in 2022.
I ended up not selling the 3090 and decided to keep it for now, because:
1. While the 4090 is sometimes double the performance of a 3090, I still can't justify spending $1700 for a card that does not max out every game at native 4K with all the bells and whistles.
2. In The Witcher 3 with all of the next-gen upgrades, the 4090 averages 49-50 frames, even with patches 6 months later. Same thing with Cyberpunk 2077 with all the settings turned up: you still can't brute force your way to high frame rates, you have to use upscaling and/or DLSS 3. So what's the point in going all out and spending $1600-1800 on a 4090 when, in several games old and new, even the 4090 struggles to stay at 60 or above without the help of AI?
3. If that's the case, I might as well keep my 3090 and use DLSS 2 in the games that need it. It seems that whether you're on the 30 series or 40 series, you will still need to use AI to get playable frame rates, and it's only going to get worse once more real Unreal Engine 5 games start to roll out next year. So I'd rather wait it out, since I got my 3090 for free, and see what the 50 or 60 series brings in price to performance. DLSS 3 is a selling point/gimmick that Nvidia made to justify the overpriced 40 series. And here's the silver lining: 30 series owners will not be locked out of frame generation, because AMD is releasing FSR3 soon, which will work with any GPU just like FSR2.
So there is no need to run out and get a 40 series because you feel like you're being locked out by Nvidia; they just want your money.
Have had the 5700 XT for over 3 years now and it hasn't missed a beat. Either 60 fps, or 120 fps with upscaling, in anything.
Only Forspoken might've given it some trouble based on the demo, but otherwise rock solid.
I'm skipping the 7900 cards and I'll get an 8900 or 9900 flagship. Probably the 8900. What's 2-4 more years?
1080 Ti 11GB still going strong - I use a 100Hz 3440x1440 monitor.
Considering an upgrade, but honestly it's not worth it right now. Waiting for the 200% uplift at around £400-500.
I think something that also needs to be mentioned is the kinds of games you play and the res that you play at. 1080p competitive shooters at 240-360hz can still easily be handled by a midrange card from 2020-2021 and probably still can be handled for the foreseeable future!
Got the RX 580 when it was already years old in summer 2020, and I'm still using it about 3 years later!
1080p at 60-75 fps.
Now I want to upgrade to a 4090, so 4K 120+ preferably.
Resolution isn't the only thing increasing. Game developers are adding more polygons, more complex shaders, more situations that require high computation, and larger textures.
So even 1080p 60fps can be harder to achieve on an older card with some new AAA titles.
AAA games will continue to require beefier graphics cards.
But some titles will run fine on older cards, too.
I play flatscreen 1080. I had a 1060 6GB, then upgraded to an RTX 2060 SUPER to prep for PCVR. Then I got a 3080 to make that even better (but the improvement didn't really blow me away).
I'll likely get a 50 series when they come out, assuming a 5080 will render over twice as fast as a 3080, and I don't need to commission a new power grid to do so.
I have a GTX 1080 inside a huge Thermaltake Core V71 ATX case that I bought for my first build back in 2015, and I'm still using that same case for every upgrade lmao. It's like the perfect house for new components due to its insane airflow: it has 2 front fans and 1 top fan, and I use a strong 3000 RPM Noctua exhaust fan.
You can expect about a 30% uplift with each generation (about every two years). Games will not outpace the hardware.
Doing the math, 2 × log(2)/log(1.3) ≈ 5.3 years before you're at half the speed of the newest cards. That's a general replacement time.
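As a sanity check on that arithmetic, here's a tiny Python sketch that reproduces the 5.3-year figure under the same assumptions the comment uses (roughly 30% uplift per generation, one generation every two years):

```python
import math

# Assumptions taken from the comment above: ~30% uplift per GPU generation,
# one new generation roughly every 2 years.
uplift_per_gen = 1.30
years_per_gen = 2

# Solve 1.3^n = 2 for n: the number of generations until the newest card
# is twice as fast as yours.
gens_to_half_speed = math.log(2) / math.log(uplift_per_gen)

print(f"{gens_to_half_speed:.2f} generations")                 # ~2.64
print(f"{gens_to_half_speed * years_per_gen:.1f} years")       # ~5.3
```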
5 years is a good amount of time to hold on to your card; that's how long I went before upgrading from my last GPU too.
My friend used a 1080 Ti for video rendering 6 hours daily, in an air-cooled system with three intake and three exhaust fans.
His GPU started to show artifacts after 2.9 years and died. He got a replacement under warranty within a few days, but the replacement card only lasted 1.5 years.
Now he has a 4080 and it runs cooler, at 60°C at 100% load.
I'm still... content with the 1660 Ti. However, I am looking to upgrade in order to improve the VR experience on the Quest 2.
The beauty of PC gaming is that there's usually a way to play games even on dated hardware. I'm still running a 1660 super with a R5 3600 and the total watts I consume is less than some GPUs on the market! I can play the games I want at 1080p and have yet to run into those fringe cases where I need to set graphics all the way down to low
I have a love-hate relationship with this topic. Part of me thinks it's a good question to ask, because hardware does get outdated. But another part of me thinks it's pointless, because the vast majority of users are going to wait as long as they can, or until they can afford to upgrade. The longest usage I got out of any GPU was: 1) the ATI Radeon 9700 Pro (5 years), and 2) the GeForce 1070 (about 5 years as well, though I think it could have made it one more if it had to).
Most video cards I've owned I replaced within 2-3 years, and a few I replaced inside of a year (things moved along real quick back in the late '90s and early 2000s).
So for a GPU it's 2-3 years before you upgrade. How about for a CPU?
Just jumped from 1080ti to 4090, fuck me what a jump in performance. haha
The first PC I ever built was a 2700X and a Vega 56. Then my friend upgraded his PC and gave me his old 1080 Ti. I gave the old computer to my brother and bought some dude's used rig with an 8700K and a 2080 Ti. Then I bought another used rig and swapped over to a 5600X and the 2080 Ti, which is what I'm using right now. I've been thinking of buying a used 3080/3080 Ti/3090; they seem to go anywhere from $500-700 used. I've also been thinking of buying a used 5900X to make an all-in-one stream PC.
go with a 3090 for the sake of having 24gb. That will last you a good decade.
@Slumy My thoughts exactly. Why go for a 3080 Ti, which is essentially just a cut-down 3090 with only 12GB of VRAM? Also, I have a 750W PSU, and a 3090 is the max I can go before needing a different one.
I'm playing at 3440x1440; before that I was gaming at standard 1440p.
Well, I bought a GeForce 970 in 2014 and it lasted me until 2019, when I upgraded to a 2070 SUPER, and just now I bought a used 3090 (which launched in 2020). I am happy with the purchase and expect this over-two-year-old card to last me 3+ years.
I'm looking to upgrade my GPU now; I have an R9 285 which is finally giving up (although it's still within the min specs for Diablo 4!).
When people like me ask how long a part lasts, I'm not buying a GPU to play at ultra. I'm buying it to physically be able to play the games I want to play. Turning settings down over time is, to me, essentially saving money. And I'm looking for expert advice to build something that lasts as long as possible.
So I'm looking for recommendations from reviews on what hardware is the best value for money over a long time frame. The mindset of upgrading so often is wasteful to me. It's the same with recommending CPUs based on upgrade potential on the same platform.
It's fine if that's what you enjoy and you can sell your used parts, but surely there are more people who think like me than not, and it always surprises me how many people on YouTube/forums recommend parts with the mindset of upgrading in 2 years.
I just upgraded from a 1080ti (bought nov 2017) to a 4090. It was still usable, even at 4k, so I didn't sacrifice too much by waiting. I even played Cyberpunk on it. I would have upgraded earlier to a 3080 but with the price gouging it didn't seem like it was worth the money. Now with the 4090, the improvements are enough to justify the massive increase in price, while waiting longer would have resulted in too much degradation, and the lower cards would have been underwhelming and provided less in new capabilities.
Would I have been happier spending time at 1440p and/or lower details with a lower-end card initially, in order to commit to upgrading regularly and have higher framerates in 4k and higher details sooner? Maybe, but I kind of doubt it. I got 4k from day 1 of that PC build and I enjoyed it.
Worth considering that upgrading regularly assumes that both your income and your interest and taste in gaming are going to remain constant and high enough to justify it. I bought a card because I had the money at this time, and I'll enjoy gaming to the max right now and worry about the future later. Who knows whether I'll have money to upgrade in the future or whether I'll care enough about gaming to do so. I’m very unsure about my financial future, so it doesn't make sense to commit to future spending. My interest in gaming is up and down, sometimes I hardly play at all or I play old or non-demanding games, and then I come back. I would like to upgrade in two generations, but if I can't justify an upgrade, the 4090 can keep me going for a while. Especially since I'm still running a 4k 60hz monitor. Not much interest in the high frame rates - I care about pretty, cinematic, immersive graphics, not pretending to be an esports competitor.
My 1080 is still going strong after 7 years and I plan to give it to a friend next summer. I am still happy with its performance, and I'm mainly building a new system because my CPU is not coping as well. I really don't get the idea that buying a way-overkill GPU and then adjusting expectations over time is a bad idea. As long as it's hitting 1440p 60fps and doesn't look horrendous, I'm happy; I'm fully convinced low settings in 2030 will still look very good. A £2k spend on a PC build that lasts 8 years is nothing.
My GTX 1660 Super 6GB still going strong in 1080p gaming almost 4 years later, probably would still get a couple more years out of it but upgrading to a RTX 3060 12GB pretty soon. (Budget PC Gamer Here)
2:00 I don't think that'll necessarily be true. When I bought my Vega 64, everything was 1080p full max (except for AA and motion blur, of course). Now I'm lowering settings, especially for Darktide, which is full low with FSR. However, Darktide at full low kinda looks like older games at full max... so...
My main gripe is actually VRAM. I can't watch 1080p videos on my second monitor while gaming without something lagging its life out.
I bought the 6950XT as an upgrade from an RX 580 so I can play at 1080p forever.
I used my GTX 980 Ti from 2016 to 2023 and saved tons of money by not upgrading year on year. I enjoyed that experience, so I bought a 4090 FE, and it's actually going to save me from buying a new graphics card every year for the next 7 years as well.
I still have a 1080 Ti with 11GB of VRAM... it's in a backup system and no longer my primary gaming machine. However, it still works great for 1080p gaming. I have moved to higher resolution and refresh rate gaming, but that doesn't mean I couldn't still play my games on a 1080p monitor. I purchased that 1080 Ti back in 2017, so it's around 6 years old. I purchased an EVGA watercooled RTX 3090 in the 4th quarter of 2021 for retail price with the expectation that it will also last me at least 6 years.
I went with AMD for my latest GPU for the added VRAM. 20gb on the 7900XT is what I wanted.
you are devious
I bought my first GPU in 2003. Since then I've changed GPUs yearly, just adding some money and buying a better used card each time. I've spent no more than $2500 on all upgrades over 20 years. Now I'm rocking an RTX 3080; we'll see what to buy next year.
Funnily enough, last year was when I upgraded from my RX 580, which I've given to my younger brother, and he's enjoying it quite a bit. For 1080p low, maybe with some FSR, it's still kinda awesome. That was a really good buy IMO.
You either have high fps expectations (90+), or idk how an RX 580 can't play higher than low at 1080p 🤔 It should easily do medium-high in all games unless you want crazy fps.
@@upfront2375 "Next gen" games are coming out and they are getting kinda taxing on the old 580: Plague Tale, Spider-Man, The Last of Us. But again, I think it's an absolutely great GPU and one that will, with the help of community drivers, last for many more years of gaming.
@@korysovec it's poor optimization, man.
I wish I could buy one for my old build in my other house, but it can't be found new, and it's super difficult to find one that wasn't mined on... literally a "workhorse", coming from an Nvidia user since forever!
I upgrade roughly every third year. I like playing everything on ultra at high framerates, so as soon as I see performance start to fall short, I buy the current gen.
"Playable" really depends on the gamer.
Throughout my gaming life I have used an APU, then a GTX 750 Ti, and now a 3070. I still recall playing games like AC Black Flag on my APU at 16-18 fps; that was playable for me. Then I moved on to the 750 Ti, which I used from 2016 until November 2022, so a good 6-7 years, although I was a 720p gamer. The only reason it's no longer usable is of course performance, but also because it doesn't have DX12 support, so some games won't even start.
Now my 3070 has spoiled me with max settings at 80+ fps, so I doubt I'll ever be able to play below 60 fps again. If I had just gotten a 2060-tier card as an upgrade, I would have stayed used to my old low-setting gaming for the next 3-4 years, unless another DirectX version appeared that limited the GPU.
My upgrade path went Radeon HD 7950 > 980Ti (died) > 1070 replacement (died) > 6900XT
I see a pattern forming 🤨
The fact that my 1080 can still do 1440p at a mix of high/medium settings is enough for me. Most of the time I launch the game and just play it, unless shadows or lighting look a little soft and the hardware still has wiggle room for playable FPS.
Bought a 3070 FE on release and enjoyed it for 30 months, playing at 1440p high to ultra graphics. I recently purchased an MSI RTX 4090 Suprim Liquid X, and it was a hefty purchase at $1750 ($1860 including taxes); I weighed heavily on whether to purchase it or not because of the cost. But with its massive CUDA core count and 24GB of VRAM, I personally believe it's good for at least 5 years or more before I need another upgrade. The real question is: will the power connector melt before then? 🤔 That's my biggest concern.
Gave my brother an XFX RX 580 8GB 3 years ago and he is still using it. FSR makes it capable enough for 1080p at mixed settings. I have a (2021) 2060 12GB that he is getting this year, so hopefully that will allow him to bump settings up a bit for another year.
Still using three Nvidia Titan X cards in SLI since 2015 at 1440p, 60Hz, with no issues on high settings. I'll adjust some items like shadows, water, and sky to lower settings and can play everything. I was tempted to upgrade to a new system with a 4090 GPU and an Intel i9 13th gen, but I'm still able to play everything I want without issue, so I held off.
I think it also depends a lot on the console generation. Once a new console generation is established and you buy a decent GPU it will last for the rest of this console generation because the consoles will limit what new games can do.
Just do an upgrade a couple of years into the console generation. Your CPU and GPU will outperform the console.
I did get my 9900K + 2080 Ti system in 2019 with a 5-year-plus timespan in mind, but I already had an ultrawide 75Hz 1080p monitor and wasn't satisfied with the 1440p and 4K options on the market, which meant I was bottlenecked by my monitor even with everything maxed out until the end of last year, when I finally managed to get a 4K 144Hz monitor. By then the 4090 and 4080 had already launched, and now my 2080 Ti (which has 3070-level performance) can only keep maxed-out current AAA titles such as God of War above 60 fps at 4K with DLSS (quality mode) and a mild GPU overclock. I hope I can keep this up for at least another year (thus making the planned 5-year mark), because even a 4080 (which would give me about a 50% performance gain over the 2080 Ti, I reckon) is still too expensive compared to what I paid for my 2080 Ti.
You are forgetting something: count the time the card spends as high end starting from its launch date, not from the time you bought it (as new). A fresh-out-of-the-box 6900 XT is still a 2-year-old card.
Depends on the games and monitor. I'd say upgrade if you aren't getting enough performance and a BIOS update can't fix the low fps, or if you see weird graphical glitches.
Maxwell and Pascal were true performance champions and are still very serviceable to this day. Actually, just a few minutes ago I upgraded a Kepler GTX 660 2GB to a GTX 1650 Super. They last as long as you have a use for them?
I bought my 1060 6GB in December 2016 for ~$250. It's still running multiplayer games at 60+ FPS on mid settings.
Single-player games, like Hogwarts Legacy, run at ~50 FPS on low/mid settings.
It's time to upgrade to a 6650 XT or 6700 XT for the next ~5 years.
I still have about 2 months of warranty left on the used EVGA RTX 2080 Ti XC Ultra I bought on eBay with a high bid of $796 on 11-21-2020, and at high settings COD Warzone seems to average around 130 FPS, so I see no reason to upgrade yet. Of course I'm on an older PCIe 3.0 Z490 system (i9-10900K), so that may be why the PCIe 4.0 MSI RTX 3060 Ti Gaming X Trio (12/2/20, $489.99 at MicroCenter) had slightly lower FPS here but seemed far better in my son's MSI MAG X570 Tomahawk WIFI (Ryzen 7 5800X). So much so that I thought the Gigabyte RTX 4090 Gaming OC was a waste of $1699.99 (3-9-2023, Best Buy); I may return it.
I went from an RX 580 8GB to an RX 6700 XT. I used the 580 for 6 years, and it still played all the games I play pretty well.
I just got a 580 for my new PC and it's not bad.
Same here
Up until now, I was upgrading GPUs for "future releases". I eventually realized I was wrong. If your current GPU delivers great performance and you have a great time playing the games you want to play, there is no need to upgrade.
Of course, people playing at higher resolutions (especially 4K) have it harder, as they have to keep upgrading just to be able to play new games at 4K. Kinda reminds me of the 2000s, when a new mid-range GPU was already weak after a year.
But overall, gameplay should matter, not visuals (graphics). I play most games on high, even when my GPU is fully capable of ultra, simply because in most cases there is no visual difference between those two settings.