Okay, so I noticed Arkham City was actually a 2011 game and it’s only the GOTY edition that released in 2012, but it was too late to amend the video. 😞
That makes it ~910 CAD at the time. Accounting for inflation, that would make it ~1,200 CAD in today's money. That is SO expensive for back then. -- Meanwhile, the R9 290X would be ~715 CAD. Accounting for inflation, that would make it ~940 CAD in today's money. That's still a lot of money, but MUCH more reasonable. -- The fact that either card cost that much is pretty crazy, but at least the 290X sounds reasonable. 1,200 CAD *just* to play video games (and maybe do some rendering) is a TON of money for its purpose. Prices like that should never be normalized (aside from accounting for inflation). (Looking at you, especially, GPU prices since 2018 (starting with the RTX 2000 series).)
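[Editor's note] The inflation adjustment above is just a CPI ratio. A minimal sketch, assuming a cumulative inflation factor of ~1.32 since 2013 (a placeholder that reproduces the commenter's figures, not an official statistic):

```python
def adjust_for_inflation(price: float, inflation_factor: float) -> float:
    """Scale a historical price into today's money.

    inflation_factor is cumulative CPI growth, e.g. 1.32 for ~32%
    inflation since 2013 (assumed value for illustration only).
    """
    return price * inflation_factor

# 780 Ti: ~910 CAD in 2013 -> today's money
print(round(adjust_for_inflation(910, 1.32)))  # ≈ 1201
# R9 290X: ~715 CAD in 2013 -> today's money
print(round(adjust_for_inflation(715, 1.32)))  # ≈ 944
```

With that assumed factor, both of the commenter's rounded figures (~1,200 and ~940 CAD) fall out directly.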
@@ShadowMKII You make some good points, but please refrain from starting new paragraphs every sentence. You're not writing a viral marketing video script.
Meh not really, I think the Titan!(!!!!)!!!(tm) came in at a cool $1000, did it not? But then the day the final Titan!!!!!9(!11!!!)!!(tm) sold, Nvidia released the TITAN!!(!!)!!!)!!)!(! matching GTX780 for half the price. The GTX780Ti cost £450(ish) if I recall. And the R9 290X around the same.
@@selohcin Yeah but pretty much the same. Either way, they charged people £1000 for the Titan and then £500 for the 780Ti, right? Ridiculous. Typical scumbag Nvidia. They did the exact same with the Titan X and the 980Ti.
I will be honest, I was extremely happy with my 4GB Gigabyte R9 290 Windforce card. Though, I admit that when I got my used R9 290, it seemed like 4GB was just starting to ALMOST be the minimum needed. I later upgraded my PC again in 2019 and I was glad I did, because the power draw for the performance was not good. I upgraded to a used AMD Ryzen 5 1600X and Nvidia GTX 1070, and it was nice having a much cooler running, lower wattage card that had even better performance. That all said, I miss my R9 290 because I still have a soft spot for AMD graphics cards; I had used nothing but ATi/AMD cards since about 2004-2005. Not because they were always fastest, but because I got great deals on used AMD high-end cards till my last 2 upgrades.
I was still rocking a HD6950 in my pc when these came out. Preordered my PS4 in June of '13 so I ended up taking quite a lengthy break from chasing top tier pc hardware. 10 years later and I've come back home. 6950XT baby!
We are all waiting for 2016's 480 vs 1080, then 2017's 580 (or Vega 64) vs 1080 Ti. Idk why I like seeing older GPUs battle against each other. Thank you for the video, it was great
My current plans for the GOAT Project are just for higher tier GPUs, so I'm not going to be looking at Polaris or the 1060. Depending on how popular the next five videos are, I might do a second series, in which case I'd add midrange cards like the ones you mention. The 1080Ti will go up against the Vega 64, the 2080Ti against the Radeon VII and/or the 5700XT, I haven't decided which one yet.
@@rawhide_kobayashi I'm sure there are some modern games that are not too demanding that this would still do good with. Not everyone wants to play AAA games
@@phil_matic What's the point of even trying if they aren't actually modern games then? Nobody should be surprised if an old GPU can play Dave the Diver cuh... there's a difference between "new" and "modern".
@@rawhide_kobayashi Bad example with Dave the Diver. Don't act like games like that are the only thing a 3GB video card can play. 3GB video cards can still play modern, yet less demanding games.
You made the smart choice then. Me? I was eyeballing a 290 for months before pulling the trigger, only to buy a GTX970 on day one of its release instead (for almost the same price). I was happy for 7 days, then AMD did what AMD do ... and reduced the price of that R9 290 by 33% overnight. I coulda had THAT card, and saved £100. And sure, I'd have a 5-10% slower GPU (but be £100 richer) for a while, but my AMD card would have lasted WAAAAY longer and just got faster and faster. Instead I swapped my KFA2 GTX970 for a friend's R9 390X 12 months later. And then I bought another friend's used GTX980Ti for £250 the following year. Still got that card, in my daughter's PC. I'm using an RX6800 right now.
No they weren't: DirectX 12 hadn't even been announced yet at the time. It would take years before it became relevant. AMD just correctly saw the way APIs were going with Mantle, despite Mantle itself being a failure. Therefore GCN was more geared towards low-overhead APIs from the start, but there was no way of knowing for sure that that would become relevant back in 2013. Back then Kepler and GCN were generally very similar in both performance and efficiency, with the 780 Ti just about outperforming the 290X as well as being more power efficient.
Mate, even Maxwell didn't support DX12, with the GTX970 and GTX980. OR Vulkan. Not really! They claimed to on the box art, but the truth was they emulated those features via drivers. Ergo, zero performance increase from moving from OpenGL to Vulkan on those cards, and sometimes even negative performance "increases" when moving from DX11 to DX12. Same story with Pascal, which was basically Maxwell with higher clocks. Nvidia reverse engineered some shit to incorporate true Vulkan / DX12 support for Turing. And the 1080Ti did actually show increases in Vulkan because it alone was powerful enough, and had resources spare enough, to actually emulate them faster than all the other Maxwell / Pascal cards. Meantime, GCN had been supporting Vulkan and DX12 for eons. I remember the old R9 280X (aka a 7970GHz Edition) seeing massive performance increases in Vulkan on Doom 2016. And by then the card had been out for YEARS! :D
@@SterkeYerke5555 But the 780Ti only barely beat the 290X, and AMD priced their card accordingly (lower) to offer gamers the best deal, as it were. Of course, just a year later when Maxwell dropped, AMD reduced the price of the 290 and 290X by some 33% overnight, absolutely killing the 780 / 970 / 780Ti / 980 in terms of value for money. And we all know what happened after! Each passing year the AMD cards just got better and better, the Nvidias slower and slower, until even the original R9 290 was beating the GTX980 across a 25 game average over at Anandtech. The lack of DX12 and Vulkan support on Maxwell + Pascal was a disgrace cos they ACTUALLY CLAIMED IT on the box art! And on Nvidia's website! It was false advertising, nothing more.
@@TheVanillatech What have you been drinking? AMD was never on top after the 290X. Sure, they could compete on price to performance in quite a few instances, but they never had an architectural advantage since. The 290X couldn't beat the GTX 980 at the time, and it consumed nearly twice the power. And what about the Fury series, which only beat the 980 Ti in 4K in titles that required less than 4GB of VRAM, while requiring a water cooler? Or Vega, which required 1080Ti amounts of power to keep up with the 1080? I like AMD, I appreciate that they supported DX12 and Vulkan very early on, and I really like that they've brought high-fidelity upscaling and frame generation to the masses in recent years, but their GPUs have rarely been superior for gaming in the high end segment. I guess you could say the RX 6000 series is more efficient than the RTX 3000 series from nVidia, but then they lack things like DLSS (honestly, coming from a 6900XT owner, FSR looks like crap in some games), proper ray tracing or CUDA support. If anything nVidia seems more forward looking these days, except they like to nerf their own cards with too little VRAM. And let's not forget: Terascale 2 aged like milk as well. After performing well in both price/performance as well as performance per watt when they were new, they started to suffer from stuttering in more recent games, before being dropped even earlier than nVidia dropped support for their Tesla cards - which were an entire DX generation older. They were dropped so early that they didn't even get proper OpenGL support, despite the hardware probably being capable of supporting the latest version 4.6. Terascale 1 suffered a similar fate. Neither AMD nor nVidia are saints.
The 780 ti was my first big gpu purchase that I ever made (the kinda gpu purchase that requires you to upgrade a PSU). I bought it for AC Black Flag, and it served me well for over a year, until it died while playing Horizon Chase (outside of warranty replacement). It was EVGA (RIP EVGA & my 780 ti).
The 290X was the better option due to three factors: 4GB vs 3GB, DX12 vs DX11 and, of course, continued driver support beyond the 700 series. Even today you can install Nimez drivers and still use the 290X in relatively modern titles, as long as horsepower and VRAM aren't an issue. It's quite interesting how some of the differences between AMD and Nvidia back then still apply today with the RTX 4000 and RX 7000 series.
The GTX 780Ti was the better option at the time. 1. 780Ti owners just wanted the fastest gaming card, so the lower price of the 290X made no difference. 2. The reference cards were also noticeably better than AMD's R9 290/290X, especially as AMD's coolers were really bad that generation. Hell, even board partners struggled to make them quiet and cool.
@@Orcawhale1 Sure, many Nvidia cards back then were the "better option at the time". Only problem was, "at that time" was very short lived! As history proved time and time again. Nobody except princes considers a £400 investment in a GPU a temporary or short term fix. People buy top end to be able to enjoy games maxed out FOR A LONG TIME, perhaps skipping a couple of generations, or at least a single generation. The 290X and 780Ti traded blows, with the 780Ti coming out on top in performance, but not in value (about equal there). But the day Maxwell dropped, AMD reduced the price of the 290 and 290X by 33%, destroying the 780/Ti and 970/980 in terms of VALUE, while still not having the crown in performance. As I said, not everyone is a prince. In fact not many people are at all. Then we know, in hindsight, that the R9 290 (originally priced and fighting vs the 780) not only went on to beat the piss out of the 780, but then also the 970, then the 780Ti and finally even the 980 across 25 games tested at Anandtech. A card that cost HALF THE PRICE of the 980, beating it just due to better API support and driver support. Then there's the Nvidia boys ... "AMD BAD DRIVERS OMG! NVIDIA EVERY TIME! THE BESTSEST!". zzzzz
@@TheVanillatech You mean the truth, that the 780Ti outsold the 290X for the entire year it was out? And that the value proposition is meaningless on high end cards like the 780Ti? Those truths?
I bought an EVGA 780Ti Classified back in 2013 and had it in an open loop system. Man, that thing was an insane overclocker: std core speed was 875MHz IIRC and mine ran around 1340MHz (modded BIOS), totally stable, and it never saw 60C. Loved that thing until I got a 5700XT when they came out. I still have it in an overkill XP build with a 3770K where it doesn't have to work so hard in its old age. Great video 👍
An additional test in BF4 with Mantle would've been good, that was one of the few implementations of Mantle. Great video otherwise! This must have been a lot of work, looking forward to the rest of the series 😊
@@yarost12 hmm never heard of that, it worked great on my R9 290x and Fury. On newer drivers you have to insert files corresponding to mantle into the driver path to get it working.
@@FloppaAppreciationSociet-ds7zf BF4, BF Hardline, Sniper Elite, and you're maybe somewhat right with today's perspective. Back then you wouldn't know that support would be that bad, because the work on Mantle influenced the development of Direct3D 12. So, as I wrote, it WAS a selling point of GCN cards
@@archuser420 Of course. Even ancient GCN cards offered killer performance increases in Vulkan / DX12 etc. I remember the R9 290 destroying the GTX970 in many Vulkan games - and that Nvidia sponsored channel DigitalFoundry deliberately sabotaging the benchmark results on their channel (before getting called out and deleting the video) to make the GTX970 look not so bad! :D There were fewer DX12/Vulkan titles back then, making the gap between Polaris / 1060 pretty tight and also between GTX970/980 and R9 290/290X close. As the years went on, that changed. I had BF4 back when I was still using an HD6950 and had zero issues with missing textures etc. The game ran perfectly. As it did on my HD7970.
When you realize that games from 10 years ago look almost as good as (or sometimes even better than) current games, but require only a fraction of the computing power to run.
I like the new form of chart/graphs and your retrospective reviews are some of my favorite types of video content. Now, I am very curious to see how well this will perform, considering the demographics of your audience and the lack of testing modern titles.
It's crazy how developers could optimize their games so well back then; nowadays upscaling is practically required to get reasonable fps, even with $1,000+ cards
Back then we already had piles of unoptimized games, especially if they were ports from console. It's just that all those titles have been forgotten and you only remember the titles that held up well.
They didn't really. The consoles were already years old by the time these cards came out, and the cards were many times more powerful than the GPUs in the 360 and PS3. It would be like comparing a PS5/Xbox Series X to a 5090 and being shocked it runs better.
Would've been interesting to see a couple games from 2020's, like Cyberpunk and Horizon Forbidden West for example. Just to see if either of the cards would be usable even today.
Each of the GPUs I've bought for the GOAT Project will eventually get their own videos too. The 290X will be a while because I've already tested the 390 recently, and I'll have to come up with something different for the 780 Ti, because it won't even start most of the games on my 2024 benchmark roster!
I understand it's not the point of the video, but I'd be really curious to see how they'd perform across a swathe of popular F2P multiplayer titles, both older and newer. Wonder how viable they still are for the casual gamer.
Cost per frame matters only if you measure it for the year the cards were released, and maybe another year or so. If the performance of one card is high enough, even with a higher initial cost, to let you keep it at acceptable framerates for longer, then it could quickly become the most cost effective solution. Good video and good idea for a video.
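[Editor's note] The longevity argument above can be made concrete with a toy calculation. All prices and fps figures below are made up purely for illustration, not taken from the video:

```python
def cost_per_frame(price: float, avg_fps: float) -> float:
    """Launch price divided by average fps: lower means better value."""
    return price / avg_fps

def amortized_cost(price: float, years_usable: float) -> float:
    """Purchase price spread over the years the card stays viable."""
    return price / years_usable

# Hypothetical launch-year numbers: a $700 card at 65 fps
# vs a $550 card at 60 fps.
print(cost_per_frame(700, 65))  # ~10.77 $/frame
print(cost_per_frame(550, 60))  # ~9.17 $/frame

# If the cheaper card also stays usable longer (say 5 years vs 3),
# the effective yearly cost gap widens further:
print(amortized_cost(700, 3))   # ~233 $/year
print(amortized_cost(550, 5))   # 110 $/year
```

The second metric is the commenter's point: launch-day cost per frame ignores how long a card remains serviceable.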
R9 290X: 250W, $550
GTX 780 Ti: 230W, $700
The actual competitor was the GTX 780. AMD cards were DX12 compatible while NVIDIA was trailing behind at DX11 (huge bummer). The R9 290X is the all-out winner of 2013 in terms of performance, features and price
Same here with my Sapphire Tri-X 290X (also using the reference board). It was my first serious graphics card and I have absolutely loved it, and still do to this day. I've modded the hell out of it, from VBIOS to drivers, and if I remember correctly OCed it to 1.1GHz. Nowadays it rests in peace on my shelf (I don't know for sure the cause of its death) while I'm using its grandchild, the RX 6800XT.
I had the R9 280X 3gb back in 2013 and it was my gpu all the way up to 2018. I used it on a 3 “monitor” setup and I absolutely loved that gpu. CPU wise I had the amd fx-6300. It worked. Honestly it was a great experience for me at the time. Still have that old pc and it holds some very nostalgic memories for me. Currently that pc is still with us but the old thermaltake water 3.0 aio has since died on me and I haven’t booted the pc in roughly 5+ years.
The difference between the two cards is that the R9 290X can still be used for some early 2020s games, while the 780ti can't, since on top of its 3GB of VRAM it also lacks a lot of software support (like DX12).
I had an R9 290x and it was an amazing card; it was when I switched over to 4K gaming. The only problem was it ran hot AF! Mine eventually killed itself; by then, though, the 980TI was out, so I switched back over to Nvidia. Also not sure what you're talking about with DisplayPort - you could run 4K@60Hz on HDMI back then, I did on a TV. You didn't need DisplayPort unless you went above 60Hz, and no video card was doing that back then.
I had two 290x cards from Gigabyte, one was loaded at 1.109v 900Mhz and the other at 1.256v 1000Mhz (both at stock clocks) but they both did 1200Mhz with an Overclock. Had a lot of fun with these, I got one to pull around 300watts at one point chasing 1266Mhz at 1.33v-1.35volts 🔥 the triple fan coolers did their work.
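[Editor's note] The voltage/clock/power figures above roughly follow the usual rule of thumb that dynamic power scales with voltage squared times frequency. A sketch under stated assumptions: the 250W baseline is taken from the 290X's commonly cited TDP, not from the commenter's cards, and the formula ignores static/leakage power, so it only gives a ballpark:

```python
def estimate_power(p0: float, v0: float, f0: float, v: float, f: float) -> float:
    """Rule-of-thumb dynamic power scaling: P ≈ P0 * (V/V0)^2 * (f/f0).

    p0 is power at baseline voltage v0 and clock f0. Leakage and board
    power are ignored, so treat the result as a rough estimate only.
    """
    return p0 * (v / v0) ** 2 * (f / f0)

# Assumed baseline: ~250W at the commenter's 1.256V / 1000MHz stock point.
# Pushed to 1.35V / 1266MHz:
print(round(estimate_power(250, 1.256, 1000, 1.35, 1266)))  # ≈ 366W
```

The estimate lands in the same ballpark as the ~300W the commenter reports pulling, with the gap plausibly down to the assumed baseline and the leakage the formula ignores.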
Ran a GTX 660ti up til 2018, before that I had nothing but a laptop. My work PC had a gt 1030 and eventually I upgraded my PC to a X79 motherboard with a new r9 270x 4gb edition. She served me well through the lockdowns until I went and upgraded my whole PC to an x99 and a GTX titan xp.
My entry to 4K gaming was the GTX 770 Ti 4GB. It was 380€ new. It handled the games I played at the time at 4K60 pretty well. Last year, I spent 600€ on a 4070 Non-Ti. It does well at 4K144Hz with some optimisations in the settings, thankfully still good enough, and most games I play without DLSS. I stopped playing most of the AAA games around 2019.
The 780 Ti was my card when I first decided to get a gaming PC. I eventually swapped it out for a 980 Ti, then built a new system with a 1080Ti; that card was so good it carried me until the 40 series came out. That was the last generation I bought the top card, though; the prices are just insane. Ended up with a 4070 in my 3rd build.
I had a 750ti in 2015, good little 1080p card, certainly cemented my upgrade to 1080p, from the 5:4 days of 1280x1024 (Still a damn good resolution, and if I had my way, I'd have a pair of flatscreen 1600x1200 monitors today). Whether it would've held up in all of these games at higher quality settings, I hesitate to speculate, but I could at least play WoW, and that was really most of what I cared about at the time. Although, WoD did rather run out of content at the end. Tanaan Jungle really didn't satisfy. Now, Pandaria- but if I start reminiscing over the WoW expansions, who knows where it'll end...
Yes, I got a nice boxed XFX model. Unfortunately the material on the card itself has gone a bit tacky, which seems to happen a lot with the Fury & Nano cards.
This is cool, I would call that a tie. One thing you missed was power consumption, not really a concern for me, but a big concern for some people. Expecting the 1080ti to do well in this competition.
I loved my 780ti. Bought it for about $200 CAD in 2016, and it ran every game I wanted at 1080p 60fps until Halo Infinite came out and it wouldn't run. I got a cheap 980 to get me by, for the same price I sold the 780ti for, and it blew up right in the middle of the COVID GPU crisis... That hurt
Damn, PC hardware was way different when my father built his last gaming rig with an i5 4460 and GTX 660. It's hard to watch older Scrapyard Wars without knowing about 10+ y.o. PCs 🐐🐐🐐
My 290X got me through GPU shortage and even played BF 2042 at beta. The card is a monster at 1080p. I beat DE on the thing during the pandemic, honestly one of the best GPUs ever made. I upgraded to a 4090 in FY24.😅
I hope you test the Vega 64 Liquid Cooled vs the 1080 Ti, and the Radeon VII vs the 2080 Ti. They're not particularly good GPUs, but they are very interesting.
I remember when i bought my EVGA 780ti back in the day barely used at 300 euros, the best deal ever. What a gpu it was for the times, i was a very happy teenager. Had to finally replace it last year when it kinda died. RIP
Especially with those newer custom designs there are a lot of nice cards, but for me, at 14, I wanted an ASUS 290X or 780 Ti so bad. I think the design is so nice and clean; also the Matrix looked godly. Wish they would've chosen to remaster the design and put it on newer generations!
You presented Borderlands 2 in the video - the other game that has PhysX and it really adds some nice features in game. For example - cloth sheets can be ripped apart and destroyed and there are more visible particles after damaging surfaces + they become interactive, liquids from barrels are interactive with surfaces and player. Borderlands just looks better with PhysX overall
290X was so good. I eventually sidegraded to a 1060 6gb for the vram and power savings when they dipped cheap. Nice trade in on the 290x then too. Then a vega64, which mined its value back, a 2080Ti that mined its value back, and a 3080 that mined its value back. Can't get more goat than "free" i guess
I'd been saving forever to buy an R9 290 back then. I was still using an HD7870, and it had lacklustre performance in Wolfenstein: The Old Blood and even struggled in Starcraft II in big multiplayer arcade maps. The R9 290 looked like the one to get. Then Nvidia released the GTX970 and everything changed. It was some 10-15% faster than the R9 290, and cost practically the same! For the first time in a LONG time, Nvidia had actually released something competitively priced! And I didn't think twice before spending £330 on a KFA2 GTX970. Then, the following week, AMD did what AMD always do and offered gamers the best choice, the best value, and reduced the price of the R9 290 by 33%. The card I'd been looking at for months went down from £320 to just £220 overnight! And the R9 290X came down from £420 to £340, almost the same damn price I'd just paid for the GTX 970! I cursed myself. If I'd just waited a week, I'd have an R9 290 and a spare £100! And as we all well know, once more Vulkan and DX12 titles started to emerge, and once Nvidia dropped all but critical support for Maxwell the second Pascal dropped, the R9 290 eventually overtook not only the GTX780, but then the GTX970, GTX780Ti and even the GTX980. I haven't bought an Nvidia card since! (Except a used 980Ti from a friend, which still lives in my daughter's PC).
As someone who owns a PS4 Slim, the fact that the R9 290x shares the same architecture as the aforementioned console is great, that probably helped the R9 290x to not only live longer, but also aging better than the GTX 780ti. Will the GTX 980ti suffer the same fate as its predecessor, if that'll happen, I'll feel bad for the Nintendo Switch users.
Oh, I wish I'd had one of those cards in 2013, but I didn't have much $$ at the time, so I had to stay with my old crappy card (can't remember exactly what it was) till I upgraded to a 960. Fortunately the $$ situation is much better now, so I bought both for my collection: got a nice XFX 290x and 2 Kingpin 780tis.
Man, it would be amazing to throw new games at the R9 290X and make a video about it. I kinda hate the finewine buzzword, but AMD had their best generations on the brink of new APIs. This was the first GPU that fully supported DX12 features, the API that is still with us today and probably will be for a long time in the future. In my opinion, it was one of the best GPUs AMD ever made, alongside the likes of the Radeon 9700 Pro (first DX9) or HD 5870 (first DX11). It was the DX10 generation when they were ill prepared - the 2900 XTX vs 8800 GTX debacle.
I have my EVGA GTX 780Ti SC 3GB mounted on the wall behind the PC I'm typing this on. It's the only card I've ever owned that died. While it worked it put up a good fight compared to the GTX 970 that replaced it. I bought it second hand so I'm not surprised it died and it remains the only EVGA product I've ever owned.
I owned an R9 290 card. I had the option to get a GTX 770 or an R9 290, and I went with the R9 290 from Sapphire. Interesting results between the 780Ti and R9 290X. Me, I would have gotten the R9 290X because of the extra 1GB of VRAM, which would come in handy in some titles.
Here's me pulling a 4090's worth of power on cards from 11 years prior: two Palit 780Tis at 310W full bore and a 5GHz 4820K. The steel beast I could never afford is mine, and with the current fan setup these cards get HOT, at least for my tastes anyway - high 70s flatline. Power that isn't needed but is well welcomed (high 50s when capped), considering I can pull my main PC's games over and still kick arse in those DX11 titles. £35/pop for the cards is a fraction of what they asked in their prime. Truly a shame they don't run DX12, and with Maxwell the next to take the axe, Pascal continues to live on in my main system with a 4.7GHz 4930K and 2 1070 Strix cards. SLI lives on, with a majority of my library still being DX11, but in DX12 one card just about scrapes 60 in the lows. They both continue to pack a punch, which is good - means I can put off an upgrade, only making that dreamy 5800X3D and Sapphire 6900 "XTXH" cheaper. I might not have been able to get a Sapphire X1950XTX Toxic in Crossfire... but perhaps this will pay homage to another big dream.
No surprise the R9 290X aged better. I'm fond of the Kepler GPUs though. They were great. The GTX 780 was a big step up from the GTX 680/770. I guess one can't really blame AMD for rebranding it again with the 390x. It performs like the GTX 970 and that card was amazing. That power consumption though... Nowadays they can all be found cheaply on ebay. All good upgrades from old weak display adapters, as long as one has the PSU for it.
Ooh, someone is picking the Nvidia-favoured titles. The Radeon was gimped by paid-off devs/companies and it still performed great. Or is 10 years too long for you to "forget"?
Used a 280x (7970 GHz) till recently; while not great anymore, it was able to play esports games quite fine at 150-200 fps. It only failed to do so after CS2 was released... a card that is 1 year older than these samples...
Love the 290x but throughout the years I’ve seen so many of them and other AMD cards of the era fail. Ironically, I’ve been able to heat gun ALL of them to work for a while which must have been at least a dozen. It must be the same lead free solder issues the consoles had at the time. Didn’t see that so much with Kepler.
Cooking in the oven helped me get by with the 270X. But eventually this failed. Definitely not a true solution. Soured my experience, but I had a 390X with no issues other than the Sapphire Nitro felt really cheap.
I loved my 780 Ti, but I knew that RAM limitation would be a problem in the long run. I got rid of it to help me buy a 1080 Ti. The 1080 Ti is such a GOAT card from NV, and they'll never make another card like it again.
I had a GTX 760 for my first proper gaming PC in 2013 and it served me well until 2019. But starting with Batman: Arkham Knight I had to lower textures, and id Tech games ran terribly on it. It did have better compatibility with emulators vs the equivalent AMD card from back in the day, though. In retrospect I should have gone with an R9 280.
Interesting, but we already have benchmarks from those times with those games. What would be more interesting with old cards is to see how well (or not) they hold up with today's games; otherwise this doesn't really add anything to benchmarks made 10 years ago.
Pain.
@@IcebergTech we forgive you. Still a relevant benchmark
when did arkham origins come out?
Ray tracing video when?
@@ezrapierce1233 Currently planned for the 6th of September.
Ah, yes, 2013: when the fastest graphics card in the world cost $700.
@@TheVanillatech No, the 780 Ti was slightly faster than the Titan.
Oh yes, Nvidia's 700 series...that aged like milk...
@@Flex-cx7uj Not really, no DX 12.0 support
@@Flex-cx7uj That's a difference of 2 generations and specs...
Except gtx 750 ti
@@Flex-cx7uj Yes, it's hardware-related, just like AVX2 or AVX512 for example.
or even DLSS
Except the 750 Ti and 750
Man, you could say the 290x has aged like fine wine!!
Except the bottle was left opened for over a decade
Green one looks relatively better 🙃
The gap widens more and more as the years go on. The dollar per frame value of the 290x is just better.
Same way I "downgraded" from an HD 6950 to a 6900 XT.
Lol 😂
The 1080/ti would crush the 480/580/vega 64
It should be the 590
In terms of cost per frame, I think you'll be surprised how well the Vega 64 holds up.
@@twiistedpanda4781 The 590 is barely better than an RX 5500 XT
@@twiistedpanda4781 The 1080ti is considerably faster than an RX 590
gpu history documentary XD
In my opinion, to promote the relevancy of the older tech, I would recommend showing these cards playing a relatively modern game, and doing well at it.
probably my least favourite iceberg video because of this.
But it's not going to do well. 3GB of vram is hardly enough for 1080p on anything new without dumpster baby afterbirth settings.
@@rawhide_kobayashi Bad example with Dave the Diver. Don't act like games like that are the only thing a 3GB video card can play. 3GB video cards can still play modern, yet less demanding games.
I had an 8GB 290X, stupidly good video card and long-lived. The 780Ti's performance died a few years after it released. Sad times
8gb???
@@kcato5879 The 290X came in 4GB and 8GB; the 390X (same chip) only came in 8GB.
You made the smart choice then. Me? I was eyeballing a 290 for months before pulling the trigger, only to buy a GTX970 on day one of its release instead (for almost the same price). I was happy for 7 days, then AMD did what AMD do ... and reduced the price of that R9 290 by 33% overnight. I coulda had THAT card, and saved £100. And sure, I'd have had a 5-10% slower GPU (but been £100 richer) for a while, but my AMD card would have lasted WAAAAY longer and just got faster and faster.
Instead I swapped my KFA 2 GTX970 for a friends R9 390X 12 months later. And then I bought another friends used GTX980Ti for £250 the following year. Still got that card, in my daughters PC.
I'm using an RX6800 right now.
The 700 series was already outdated when it first came out, while the AMD cards had DX12 support.
No they weren't: DirectX 12 hadn't even been announced yet at the time. It would take years before it became relevant. AMD just correctly saw the way APIs were going with Mantle, despite Mantle itself being a failure. Therefore GCN was more geared towards low-overhead APIs from the start, but there was no way of knowing for sure that that would become relevant back in 2013. Back then Kepler and GCN were generally very similar in both performance and efficiency, with the 780 Ti just about outperforming the 290X as well as being more power efficient.
... which didn't even exist yet
Mate, even Maxwell didn't support DX12, with the GTX970 and GTX980. Or Vulkan. Not really! They claimed to on the box art, but the truth was they emulated those features via drivers. Ergo, zero performance increase from moving from OpenGL to Vulkan on those cards, and sometimes even negative performance "increases" when moving from DX11 to DX12.
Same story with Pascal, which was basically Maxwell with higher clocks.
Nvidia reverse engineered some shit to incorporate true Vulkan / DX12 support for Turing. And the 1080Ti did actually show increases in Vulkan because it alone was powerful enough, and had enough spare resources, to actually emulate them faster than all the other Maxwell / Pascal cards.
Meantime, GCN had been supporting Vulkan and DX12 for eons. I remember the old R9 280X (aka a 7970Ghz Edition) seeing massive performance increases in Vulkan on Doom 2016. And by then the card had been out for YEARS! :D
@@SterkeYerke5555 But the 780Ti only barely beat the 290X, and AMD priced their card accordingly (lower) to offer gamers the best deal, as it were.
Of course, just a year later when Maxwell dropped, AMD reduced the price of the 290 and 290X by some 33% overnight, absolutely killing the 780 / 970 / 780Ti / 980 in terms of value for money.
And we all know what happened after! Each passing year the AMD cards just got better and better, the Nvidias slower and slower, until even the original R9 290 was beating the GTX980 across a 25-game average over at Anandtech.
The lack of DX12 and Vulkan support on Maxwell + Pascal was a disgrace cos they ACTUALLY CLAIMED IT on the box art! And on Nvidia's website! It was false advertising, nothing more.
@@TheVanillatech What have you been drinking? AMD was never on top after the 290X. Sure, they could compete on price to performance in quite a few instances, but they never had an architectural advantage since. The 290X couldn't beat the GTX 980 at the time, but it did consume nearly twice the power. And what about the Fury series, which only beat the 980 Ti in 4K in titles that required less than 4GB of VRAM, while requiring a water cooler? Or Vega, which required 1080Ti amounts of power to keep up with the 1080?
I like AMD, I appreciate that they supported Dx12 and Vulkan very early on and I really like that they've brought high-fidelity upscaling and frame generation to the masses in recent years, but their GPUs have rarely been superior for gaming in the high end segment in recent years. I guess you could say the RX 6000 series is more efficient than the RTX 3000 series from nVidia, but then they lack things like DLSS (honestly, coming from a 6900XT owner, FSR looks like crap in some games), proper ray tracing or CUDA support. If anything nVidia seems more forward looking these days, except they like to nerf their own cards with too little VRAM.
And let's not forget: Terascale 2 aged like milk as well. After performing well in both price/performance as well as performance per watt when they were new, they started to suffer from stuttering in more recent games, before being dropped even earlier than nVidia dropped support for their Tesla cards - which were an entire Dx generation older. They were dropped so early that they didn't even get proper OpenGL support, despite the hardware probably being capable of supporting the latest version 4.6. Terascale 1 suffered a similar fate. Neither AMD nor nVidia are saints.
The 780 ti was my first big gpu purchase that I ever made (the kinda gpu purchase that requires you to upgrade a PSU). I bought it for AC Black Flag, and it served me well for over a year, until it died while playing Horizon Chase (outside of warranty replacement). It was EVGA (RIP EVGA & my 780 ti).
oh yes finally test video not only with current games but with their time games too. Great job.
The 290X was the better option due to three factors: 4GB vs 3GB, DX12 vs DX11, and of course, continued driver support beyond the 700 series. Even today you can install Nimez drivers and still use the 290X in relatively modern titles as long as horsepower and VRAM isn't an issue.
It's quite interesting how some of the differences between AMD and Nvidia back then still apply today with the RTX 4000 and RX7000 series.
The GTX 780Ti was the better option at the time.
1. 780Ti owners just want the fastest gaming card, so the lower price of the 290X made no difference.
2. The reference cards were also noticeably better than AMD's R9 290/290X, especially as AMD's coolers were really bad that generation.
Hell, even board partners struggled to make them low-noise and cool.
@@Orcawhale1 Sure, many Nvidia cards back then were the "better option at the time". Only problem was, "at that time" was very short-lived! As history proved time and time again. Nobody except princes considers a £400 investment in a GPU a temporary or short-term fix. People buy top end to be able to enjoy games maxed out FOR A LONG TIME, perhaps skipping a couple of generations, or at least a single generation.
290X and 780Ti traded blows, with the 780Ti coming out on top in performance, but not in value (about equal there). But the day Maxwell dropped, AMD reduced the price of the 290 and 290X by 33%, destroying the 780/Ti and 970/980 in terms of VALUE, while still not having the performance crown. As I said, not everyone is a prince. In fact, not many people are at all.
Then we know, in hindsight, that the R9 290 (originally priced against and fighting the 780) not only went on to beat the piss out of the 780, but then also the 970, then the 780Ti, and finally even the 980 across 25 games tested at Anandtech. A card that cost HALF THE PRICE of the 980, beating it just due to better API support and driver support.
Then there's the Nvidia boys ... "AMD BAD DRIVERS OMG! NVIDIA EVERY TIME! THE BESTSEST!".
zzzzz
@@TheVanillatech
I don't care.
@@Orcawhale1 Well, you're not important in the grand scheme of things. The truth remains the same whether you care or not.
@@TheVanillatech You mean the truth that the 780Ti outsold the 290X for the entire year it was out?
And that the value proposition is meaningless on high-end cards like the 780Ti.
Those truths?
It's a really neat idea doing benchmarks from the eras of both the previous and next cards
I bought an EVGA 780Ti Classified back in 2013 and had it in an open-loop system. Man, that thing was an insane overclocker: std core speed was 875MHz iirc, and mine ran around 1340MHz (modded BIOS) totally stable and it never saw 60c. Loved that thing until I got a 5700XT when they came out. I still have it in an overkill XP build with a 3770K where it doesn't have to work so hard in its old age. Great video 👍
An additional test in BF4 with Mantle would've been good, that was one of the few implementations of Mantle.
Great video otherwise! This must have been a lot of work, looking forward to the rest of the series 😊
Such a good format for a video! Please keep these coming :)
Did you use Mantle in BF4? Because that was literally the selling point of GCN2 cards; they were bad in DX11...
It was never fixed in BF4, missing textures and shit
@@yarost12 hmm never heard of that, it worked great on my R9 290x and Fury. On newer drivers you have to insert files corresponding to mantle into the driver path to get it working.
@@FloppaAppreciationSociet-ds7zf BF4, BF Hardline, Sniper Elite, and you're maybe somewhat right from today's perspective. Back then you wouldn't have known that support would be that bad, because the work on Mantle influenced the development of Direct3D 12. So, as I wrote, it WAS a selling point of GCN cards
@@archuser420 Of course. Even ancient GCN cards offered killer performance increases in Vulkan / DX12 etc. I remember the R9 290 destroying the GTX970 in many Vulkan games - and that Nvidia sponsored channel DigitalFoundry deliberately sabotaging the benchmark results on their channel (before getting called out and deleting the video) to make the GTX970 look not so bad! :D
There were fewer DX12/Vulkan titles back then, making the gap between Polaris / 1060 pretty tight and also between GTX970/980 and R9 290/290X close. As the years went on, that changed.
I had BF4 back when I was still using an HD6950 and had zero issues with missing textures etc. The game ran perfectly. As it did on my HD7970.
The 290x may be beaten by the 780, but it still absolutely MURDERS IT with 1 small thing.
Dx12 support.
When you realize that games from 10 years ago look almost as good as (or sometimes even better than) current games, but require only a fraction of the computing power to run.
This video released on the EXACT DAY that I bought the DirectCU II! Thx so much!!
I like the new form of chart/graphs and your retrospective reviews are some of my favorite types of video content. Now, I am very curious to see how well this will perform, considering the demographics of your audience and the lack of testing modern titles.
Not well!
It's crazy how developers could optimize their games perfectly back then; unlike games nowadays, where upscaling is very much needed to play at reasonable fps even with $1000+ cards
Back then we already had piles of unoptimized games, especially if they were ports from console. It's just that all those titles have been forgotten and you only remember the titles that held up well.
They didn't really; the consoles were already years old by the time these cards came out, and the cards were many times more powerful than the GPUs in the 360 and PS3.
It would be like comparing a PS5/Xbox Series X to a 5090 and being shocked it runs better.
it's because upscalers exist that they've become lazy and rely on them
@@WayStedYou You know that this GPU (the R9 290, I think it was) is from the same architecture used in the PS4/Xbox One with 8GB of VRAM, right?
@@WayStedYou 2013 was the year they launched the Xbox One and PS4. Either way, many games were not well optimized back then.
great video! i was half expecting the nvidia bars to be green, but oh well, i think i will survive! keep it up mate!
I’m red-green colourblind, so I usually avoid using that combo in graphics. I know it’s off-brand, but it makes me feel better!
@@IcebergTech oh i see ! now i feel bad i even brought that tiny nitpick !
@@venix20 Hehe, don't worry about it
Would've been interesting to see a couple games from 2020's, like Cyberpunk and Horizon Forbidden West for example. Just to see if either of the cards would be usable even today.
Each of the GPUs I've bought for the GOAT Project will eventually get their own videos too. The 290X will be a while because I've already tested the 390 recently, and I'll have to come up with something different for the 780 Ti, because it won't even start most of the games on my 2024 benchmark roster!
I really like seeing these retrospectives from the period I was most into (consumer) computer hardware.
I guess it helps that GCN was in both major consoles.
Interesting! Looking forward to more :)
I understand it's not the point of the video, but I'd be really curious to see how they'd perform across a swathe of popular F2P multiplayer titles, both older and newer. Wonder how viable they still are for the casual gamer.
Cost per frame matters only if you measure it for the year the cards were released, and maybe another year or so. If the performance of one card is high enough, even with a higher initial cost, to let you keep it at acceptable framerates for longer, then it could quickly become the more cost-effective solution.
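The cost-per-frame metric discussed here is just price divided by average framerate. A minimal sketch of the comparison, using the launch MSRPs quoted elsewhere in the thread ($550 for the 290X, $700 for the 780 Ti) but with purely hypothetical fps figures, since the video's actual numbers aren't reproduced here:

```python
# Hypothetical cost-per-frame comparison. Prices are the launch MSRPs
# mentioned in this thread; the avg_fps values are made-up placeholders,
# NOT measurements from the video.

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per frame of average performance."""
    return price_usd / avg_fps

cards = {
    "R9 290X": (550.0, 60.0),     # (launch price, hypothetical avg fps)
    "GTX 780 Ti": (700.0, 65.0),  # (launch price, hypothetical avg fps)
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per frame")
```

As the comment points out, the metric shifts if a card stays usable longer: dividing the price over the years of acceptable performance before comparing would reward the card that ages better.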
Good video and good idea for a video.
R9 290X
250w
$550
GTX 780 Ti
230w
$700
The actual competitor was GTX 780
AMD cards were DX12 compatible while NVIDIA was trailing behind on DX11 (huge bummer).
The R9 290X is the all-out winner of 2013 in terms of performance, features and price
I've still got my old reference R9 290x on display, it's dead but not forgotten.
Same here with my Sapphire Tri-X 290X (also using the reference board). It was my first serious graphics card and I have absolutely loved it and still do to this day. I've modded the hell out of it, from VBIOS to Drivers and if I remember correctly OCed it to 1.1GHz
Nowadays it rests in peace on my shelf (I don't know the cause of its death for sure) while I'm using its grandchild, the RX 6800XT.
I had the R9 280X 3gb back in 2013 and it was my gpu all the way up to 2018. I used it on a 3 “monitor” setup and I absolutely loved that gpu. CPU wise I had the amd fx-6300. It worked. Honestly it was a great experience for me at the time. Still have that old pc and it holds some very nostalgic memories for me. Currently that pc is still with us but the old thermaltake water 3.0 aio has since died on me and I haven’t booted the pc in roughly 5+ years.
The difference between the two cards is that the R9 290X can still be used for some early-2020s games, while the 780ti, which on top of its 3GB of VRAM also lacks a lot of software support (like DX12), can't.
I still use an R9 290 in my first modern PC. In December 2024 it will be 10 years :)
Man, I had a Sapphire Vapor-X R9 290X. Even though I now have an EVGA FTW3 3090, I still think the 290X is the most BEAUTIFUL card I’ve ever owned.
I always wanted the Vapor-X 290/X, such a sexy card. Bet it was stunning.
You mean the yellow card ? Its manly sexy
Love this style of video man! 😊
I love this idea! Please keep testing, this GOAT concept is just what I am looking for.
Classic Iceberg Tech and I'm here for it.
I had a R9 290x and it was an amazing card, and it was when I switched over to 4k gaming. The only problem was it ran hot AF! Mine eventually killed itself, by then though the 980TI was out so I switched back over to nvidia.
Also, not sure what you're talking about with DisplayPort; you could run 4K @60Hz over HDMI back then, I did on a TV. You didn't need DisplayPort unless you went above 60Hz, and no video card was doing that back then.
I have a Titan and a R9 290 - the 290 is way better. The 290 still plays Tarkov at 1080, the Titan stutters at 720.
I had two 290x cards from Gigabyte, one was loaded at 1.109v 900Mhz and the other at 1.256v 1000Mhz (both at stock clocks) but they both did 1200Mhz with an Overclock. Had a lot of fun with these, I got one to pull around 300watts at one point chasing 1266Mhz at 1.33v-1.35volts 🔥 the triple fan coolers did their work.
The GTX Titan Black 6GB VRAM is nVidia's flagship of that era, not the GTX 780 Ti 3GB ...
Ran a GTX 660ti up til 2018; before that I had nothing but a laptop. My work PC had a GT 1030, and eventually I upgraded my PC to an X79 motherboard with a new R9 270X 4GB edition. She served me well through the lockdowns until I went and upgraded my whole PC to an X99 and a GTX Titan Xp.
My entry to 4K gaming was the GTX 770 Ti 4GB.
It was 380€ new. It handled the games I played at the time at 4K60 pretty well.
Last year, I spent 600€ on a 4070 non-Ti. It does well at 4K144Hz with some optimisations in the settings; thankfully still good enough, and most games I play without DLSS. I stopped playing most AAA games around 2019.
Should have started with the 7970 and 680, they were the beginning of a legendary battle.
I was in two minds about including those cards. I might go back and add them to the project at a later date.
I think the beginning of the legendary battle was Radeon 8500 vs GeForce 3 ti 500
The 780 Ti was my card when I first decided to get a gaming PC. I eventually swapped it out for a 980 Ti, then built a new system with a 1080Ti; that card was so good it carried me until the 40 series came out.
That was the last generation I bought the top card though, the prices are just insane; ended up with a 4070 in my 3rd build.
Man you upgrade too much
I had a 750ti in 2015, good little 1080p card; it certainly cemented my upgrade to 1080p, from the 5:4 days of 1280x1024 (still a damn good resolution, and if I had my way, I'd have a pair of flatscreen 1600x1200 monitors today). Whether it would've held up in all of these games at higher quality settings, I hesitate to speculate, but I could at least play WoW, and that was really most of what I cared about at the time. Although, WoD did rather run out of content at the end. Tanaan Jungle really didn't satisfy. Now, Pandaria... but if I start reminiscing over the WoW expansions, who knows where it'll end...
finally you started this series 🤩
Hope you got the water cooled reference model of the R9 Fury X for the review. That thing always looked so cool
Yes, I got a nice boxed XFX model. Unfortunately the material on the card itself has gone a bit tacky, which seems to happen a lot with the Fury & Nano cards.
yr channel is the goat
I would like to see comparisons of older GPUs, specifically from before 2009.
Love this type of content though huge 👍 💯
This is cool, I would call that a tie.
One thing you missed was power consumption, not really a concern for me, but a big concern for some people.
Expecting the 1080ti to do well in this competition.
I loved my 780ti. Bought it for about $200 CAD in 2016, and it ran every game I wanted at 1080p 60fps until Halo Infinite came out and it wouldn't run. I got a cheap 980 to get me by, for the same price I sold the 780ti for, which blew up right in the middle of the COVID GPU crisis. That hurt
my convincing counter-argument is that the founders edition 780ti looked gorgeous
Absolutely. I miss the classy blower cards. Both Geforce and Radeon blower cards looked sick.
1080 blower and 5700XT blower looked the best IMO
@@GewelReal holding out hope for an 8700XT blower-style reference card
@@GewelReal what about those sexy Star Wars Titan blowers
@@GewelReal nah, 20 series dual-fan cards looked the best
good to know for when I finish building my time machine and travel back to 2013
Damn, PC hardware was way different when my father built his last gaming rig with an i5 4460 and GTX 660. It's hard to watch older Scrapyard Wars without knowing about 10+ y.o. PCs 🐐🐐🐐
Nice, now I feel old. Thanks
I love how I didn't even think anything odd about the goat in the thumbnail lmao
My 290X got me through the GPU shortage and even played BF 2042 in the beta. The card is a monster at 1080p. I beat DE on the thing during the pandemic; honestly one of the best GPUs ever made.
I upgraded to a 4090 in FY24.😅
I hope you test the Vega 64 Liquid Cooled vs the 1080 Ti and the Radeon VII vs the 2080 Ti. They're not particularly good GPUs, but they are very interesting.
I remember when I bought my EVGA 780ti back in the day, barely used, for 300 euros: the best deal ever. What a GPU it was for the times; I was a very happy teenager.
Had to finally replace it last year when it kinda died. RIP
Especially with those newer custom designs there are a lot of nice cards, but for me, at 14, I wanted an Asus 290X or 780 Ti so bad. I think the design is so nice and clean, and the Matrix looked godly. I wish they would've chosen to remaster the design and put it on newer generations!
You presented Borderlands 2 in the video - the other game that has PhysX and it really adds some nice features in game.
For example, cloth sheets can be ripped apart and destroyed, there are more visible particles after damaging surfaces (and they become interactive),
and liquids from barrels interact with surfaces and the player.
Borderlands just looks better with PhysX overall
290X was so good. I eventually sidegraded to a 1060 6gb for the vram and power savings when they dipped cheap. Nice trade in on the 290x then too.
Then a vega64, which mined its value back, a 2080Ti that mined its value back, and a 3080 that mined its value back. Can't get more goat than "free" i guess
Back in 2016 I had 2 Strix 390Xs in Crossfire; played VR great
I'd been saving forever to buy an R9 290 back then. I was still using an HD7870, and it had lacklustre performance in Wolfenstein: The Old Blood and even struggled in StarCraft II in big multiplayer arcade maps.
The R9 290 looked like the one to get. Then Nvidia released the GTX970 and everything changed. It was some 10-15% faster than the R9 290, and cost practically the same! For the first time in a LONG time, Nvidia had actually released something competitively priced! And I didn't think twice before spending £330 on a KFA2 GTX970.
Then, the following week, AMD did what AMD always do and offered the gamers the best choice, the best value, and reduced their price of the R9 290 by 33%. The card I'd been looking at for months went down from £320 to just £220 overnight! And the R9 290X came down from £420 to £340, almost the same damn price I'd just paid for the GTX 970!
I cursed myself. If I'd just waited a week, I'd have an R9 290 and a spare £100!
And as we all well know, once more Vulkan and DX12 titles started to emerge, and once Nvidia dropped all but critical support for Maxwell the second Pascal dropped, the R9 290 eventually overtook not only the GTX780, but then the GTX970, GTX780Ti and even the GTX980.
I haven't bought an Nvidia card since! (Except a used 980Ti from a friend, which still lives in my daughter's PC).
As someone who owns a PS4 Slim, the fact that the R9 290X shares the same architecture as the aforementioned console is great; that probably helped the R9 290X not only live longer, but also age better than the GTX 780ti.
Will the GTX 980ti suffer the same fate as its predecessor? If that happens, I'll feel bad for the Nintendo Switch users.
bro said 43 fps isn’t playable
i had an r9 270x in my first rig. that machine was a beast! ( for its time )
I bought my friend's old R9 290 for $2 when he upgraded. I can't really use it for anything, but it's a nice throwback
Finally, some good fucking content on YouTube!
I went from a 1GB HD6870 to an R9 290X Windforce; it was incredible!
Oh, I wish I had one of those cards in 2013, but I didn't have much $$ at the time, so I had to stay with my old crappy card (can't remember what exactly) till I upgraded to a 960.
Fortunately the $$ situation is much better now, so I bought both for my collection: got a nice XFX 290x and 2 Kingpin 780tis.
Man, it would be amazing to throw new games at the R9 290X and make a video about it. I kinda hate the finewine buzzword, but AMD had their best generations on the brink of new APIs. This was the first GPU that fully supported DX12 features, the API that is still with us today and probably will be for a long time in the future. In my opinion, it was one of the best GPUs AMD ever made, up there with the Radeon 9700 Pro (first DX9) or HD 5870 (first DX11). It was the DX10 generation where they were ill-prepared: the 2900 XTX vs 8800 GTX debacle.
I have my EVGA GTX 780Ti SC 3GB mounted on the wall behind the PC I'm typing this on. It's the only card I've ever owned that died. While it worked it put up a good fight compared to the GTX 970 that replaced it. I bought it second hand so I'm not surprised it died and it remains the only EVGA product I've ever owned.
Will you do a dedicated 7900X3D tweaking, undervolting and overclocking video with benchmarks?
Such good memories from this period, Moore's law was fully alive, video games were already great
I owned an R9 290 card. I had the option to get a GTX 770 or an R9 290, and I went with the R9 290 from Sapphire. Interesting results between the 780Ti and R9 290X. Me, I would have gotten the R9 290X because of the extra 1GB of VRAM, which would come in handy in some titles.
ooh, nice, Metro game in benchmarks!
I really want to see (you) reviewing the GTX 750 Ti 🥺
Here's me pulling a 4090's worth of power on cards from 11 years prior: 2 Palit 780Tis at 310W full bore and a 5GHz 4820K. The steel beast I could never afford is mine, and with the current fan setup these cards get HOT, at least for my tastes: high 70s flatline (high 50s when capped). Power that isn't needed but is well welcomed, considering I can pull my main PC's games over and still kick arse in those DX11 titles. At £35/pop the cards cost a fraction of what they asked in their prime. Truly a shame they don't run DX12, and with Maxwell the next to take the axe, Pascal continues to live on in my main system with a 4.7GHz 4930K and 2 1070 Strix cards. SLI lives on, with a majority of my library still being DX11, but in DX12 one card just about scrapes 60 in the lows. They both continue to pack a punch, which is good; it means I can put off an upgrade, only making that dreamy 5800X3D and Sapphire 6900 "XTXH" cheaper. I might not have been able to get a Sapphire X1950XTX Toxic in Crossfire, but perhaps this will pay homage to another big dream.
You should try custom nimez drivers for r9 290x
You're doing great 👍🏻 keep up the great work and be well my friend 🤍
I look forward to your posts!
Ah yes, the GTX 770 and 780 kept me warm through many a cold winter night 🥶
No one cares about win-more scenarios in 10 year old games. Show how they both manage now in 2024.
No surprise the R9 290X aged better.
I'm fond of the Kepler GPUs though. They were great. The GTX 780 was a big step up from the GTX 680/770.
I guess one can't really blame AMD for rebranding it again as the 390X. It performs like the GTX 970, and that card was amazing. That power consumption, though...
Nowadays they can all be found cheaply on ebay. All good upgrades from old weak display adapters, as long as one has the PSU for it.
Ooh, someone is picking the Nvidia-favoured titles. The Radeon was gimped by the paid-off devs/companies and it still performed great. Or is 10 years too long a time for you to "forget"?
Used a 280X (7970 GHz) till recently; while not great anymore, it was able to play esports games quite fine at 150-200 fps. It only failed to do so after CS2 was released... a card that is a year older than these samples...
I have the same 780 non-Ti card lying on a shelf; used it till I bought a 1070 Ti in 2017.
When I built my first PC in 2017, I ran a 780 Ti which I got dirt cheap and I’ll be honest it worked really well… On a 1080P 30hz TV at least.
I think we all know that at the end of this series it's gonna be the 1080ti. That thing still rocks to this day.😅
Love the 290x, but throughout the years I've seen so many of them and other AMD cards of the era fail. Ironically, I've been able to heat-gun ALL of them back to working for a while, which must have been at least a dozen cards. It must be the same lead-free solder issues the consoles had at the time. Didn't see that so much with Kepler.
Cooking it in the oven helped me get by with the 270X, but eventually that failed too. Definitely not a true solution.
It soured my experience, but I had a 390X with no issues other than the Sapphire Nitro feeling really cheap.
I would like to see how they perform in newer games (2020 onwards).
I hope you had a restful night's sleep following this.
I loved my 780 Ti, but I knew that RAM limitation would be a problem in the long run. I got rid of it to help me buy a 1080 Ti. The 1080ti is such a GOAT card from NV, and they'll never make another card like it again.
I still have an r9 290x, thing still kills at games for how old it is
I had a GTX 760 for my first proper gaming PC in 2013, and it served me well until 2019. But starting with Batman: Arkham Knight I had to lower textures, and id Tech games ran terribly on it.
But it had better compatibility with emulators vs the equivalent AMD card from back in the day. In retrospect I should have gone with an R9 280.
Interesting, but we already have benchmarks from those times with those games. What would be more interesting with old cards is seeing how well (or not) they hold up in today's games; otherwise this doesn't really add anything to benchmarks made 10 years ago.