You said some games couldn't be played on max settings when they released, even on the Titan. Wasn't 2013+ still the SLI timeframe? Therefore a single Titan might not have made it, but multiple? My friend only used SLI setups back then and, if I remember correctly, he was using a GTX 690 Quad SLI setup and switched to Titans year after year. Edit: Not a Titan owner myself, so I don't really know which Titan cards could do SLI.
I got 2 of them in SLI, the 12GB ones; they only cost $200 for both. They can play anything on highest settings, better than most single cards like the 1660 Super, and at half the price. And I noticed they could do 1440p at 88 FPS in Elden Ring. Some games seem to really like the cards together, and RDR2 finally doesn't crash like it did on my 1080 and 1660 single cards. I'm happy with everything, and only use MSI user fan control and no overclock. All I ever wanted and need.
I bought two Titans upon release, upgrading from my GTX 690. Then I sold my Titans after maybe a year and got GTX 980 cards. Sold the Titans for almost as much as I bought them new. Oh, the days when my income was growing far faster than my monthly expenses.
I have kept desktops from each era alive and working in our home. I had to wait until last year to pick this gem up, but I'm happy to have it in our home museum next to my former personal 290X build. Hey, I chose better back then; now I simply skip a chipset and only upgrade my GPU every 6 years. Thankfully that skip allowed me to keep my 1080 Ti going until the 6950 XT took its place while I wait for AM6.
Commenting to help with the algorithm. Also, 780m performance from a Titan? Interesting that maybe in 10 years, we’ll have APUs performing like a 4080 or 4090. Crazy.
I remember when the Titan was first released people MOCKING Nvidia and those who paid that much money for one. In 2023, I still mock people for paying over 1000 euros for a GPU. I choose to ignore when people tell me about how I paid a bit over 1000 for a CPU back in 2014-15, simply because I still use it as my main PC, mostly out of need, but it still works fine. Modern gaming is great, just don't pay for gamepass and similar BS. Buy games physical when possible, or DRM free. If gamers avoided the BS, we wouldn't have it. Thanks for the video.
Now with AMD abandoning Polaris and Vega in terms of driver support, it's time to revisit these cards and compare benchmarks with the modded drivers installed vs the last official ones.
I have a knack for picking cards that don't age well, first the GTX Titan and now the Vega 64, but it's sitting in my spare pc so I'm not worried too much
@@DoktorLorenz Vega aged pretty well actually. And they aren't quite abandoned; they are just receiving only critical updates now, as the drivers are relatively mature.
I had a Titan at launch, and it was an amazing upgrade from my GTX 560 Ti, so I did not regret buying it at all. The funniest part to me was when the PS4 launched later that year, my card was 3x faster than it, and all the console peasants who had a PS4 were beaten by my GPU, which was older than their shiny new console. (Actually took someone home to show them what 144 FPS looked like at real 1080p)
There was a key value proposition for the original Titan you didn't mention: 1/3-rate FP64. It had unlocked FP64, operating at the full 1/3 speed of FP32 that the GK110 was capable of, while the consumer Kepler cards ran FP64 at 1/24 speed. The Titan was a fantastic value if you needed that; it was much cheaper than any Tesla option available at the time. Granted, you lost the scalability features of Tesla, but if you could do without those then the Titan was a wonderful value. It definitely wasn't for everyone, but looking at it squarely through the lens of gaming ignores a niche but crucial feature that made it a very easy sell for me back in the day. Alas, they removed this benefit from the Titan line after the Titan Black/Titan Z. Still, here is a comparison point: feed it the right workload and that old GTX Titan I had in 2013 is still several times faster than the RTX 3080 I have in this computer. There was a brief shining period where the Titan was a tremendous value before they neutered the line after the Titan Black, and later Titans were just gaming cards with a fancier name.
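[Editor's note] For scale, here is a back-of-envelope sketch of what those FP64 ratios mean for theoretical peak throughput. The FP32 TFLOPS figures are approximate spec-sheet values from memory, so treat the exact numbers as assumptions:

```python
# Back-of-envelope FP64 throughput from the FP32 rate and the FP64 ratio.
# The FP32 TFLOPS figures are approximate spec-sheet values (assumptions).
def fp64_tflops(fp32_tflops: float, fp64_ratio: float) -> float:
    """Theoretical peak FP64 rate as a fraction of the FP32 rate."""
    return fp32_tflops * fp64_ratio

gtx_titan = fp64_tflops(4.5, 1 / 3)    # GK110 Titan: unlocked 1:3 FP64
gtx_680   = fp64_tflops(3.1, 1 / 24)   # consumer Kepler: 1:24 FP64
rtx_3080  = fp64_tflops(29.8, 1 / 64)  # GeForce Ampere: 1:64 FP64

print(f"Titan {gtx_titan:.2f} / GTX 680 {gtx_680:.2f} / RTX 3080 {rtx_3080:.2f} TFLOPS FP64")
```

On paper that is roughly 1.5 TFLOPS of FP64 for the Titan against under 0.5 TFLOPS for the 3080, which is why a decade-old card winning in double-precision workloads is plausible.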
Idc about the performance, it certainly looked better than all current GPUs. That center cooler and brushed silver/gray finish on the casing with the radiator peeking out... PERFECTION
Here the GTX Titan was in the 1100€ range (less than 1400€ today) while the cheapest RTX 4090 is 1800€, with most of them being over 2000€. The RTX 4080 is around the same price as the GTX Titan was when adjusted for inflation.
Honestly, I think it's a good thing that Nvidia replaced the Titan series with the RTX x090 GPUs. The Titans weren't used by businesses, and having them just be integrated into the standard GeForce RTX lineup was the right decision. However, it would still be cool to see a new card with that Titan feel to it. There were reports of an RTX 4090 Ti and an RTX Titan Ada coming soon, but those were nearly a year ago and nothing has shown up. What a shame...
That is why, when prices are similar, I always prefer lower end but newer gen over higher end old gen; game optimization matters a lot over the years.
Wish one of these companies would try out dual GPU on one card and dual GPU in a chassis. Then I could connect what effectively amounts to four GPUs to run all games for all time... probably why there were so many 'issues' they 'just cannot figure out'. Cause money.
The first gen where Nvidia didn't drop the -100/110 chips as the flagship at launch. I was stoked for a GPU that was gonna double the performance of a GTX 580... instead we started getting the middle chip as the flagship first lol
I wish they’d bring them back; “Titan” was a badass name. A peasant like myself has never had one, but I can sort of relate to the part where you said the people who bought this card brand new felt disappointed that it didn’t hold “top tier” performance very long 😅 that’s basically how I felt this year with my 3060ti. Of course I’m aware it wasn’t (and has never been) a “high end” card…but it does perform marginally faster than the 2080 Super, which was technically a previous gen flagship (aside from the slightly faster 2080ti). So for a brief moment I felt a sense of satisfaction that has since been destroyed by 2023 😭 jokes aside, I’m still perfectly happy with it and enjoy high refresh 1080p native and 1440p using DLSS
You should make a video on how you can use Linux to play DX12 games on cards that don't support DX12 features. This is how I play Elden Ring on my R7 370 4GB (HD 7850 rebrand)
Phenomenal video, I love Kepler. Nvidia disrespected this generation a lot and the drivers were terrible; my 660 wasn't that usable by 2015 in new games, but it established a standard. The Titan is basically a 250W 1060 6GB in 2013. I'd still like to own one; there's probably a driver version out there that offers better performance, but this ain't that bad either.
Good old 2013, when I got my first modern high end pc with an i7 3770 and GTX 660. I've got a Ryzen 5600 and RX 7800 XT now. I do believe Nvidia purposely sandbags some of their old cards. I noticed that when I was using a 780 Ti a while back: I played Hellblade: Senua's Sacrifice and was getting 45-50 fps at 3440x1440, then a couple years later I played it again on the same system and display and it was like 20 fps. *99% sure it was maxed out settings both times. Got my first big Radeon card now and it's fricken awesome, and I highly recommend it over the Nvidia cards unless you're mental and want to burn your money on a 4090, really all Nvidia has right now imo.
I fkg love this review. I recently purchased a GTX 980 Ti (on purpose), as a secondary card simply for the DVI-I analog output to connect to CRT monitors and TVs in pure, end-to-end analog gameplay. Unfortunately, I missed out by a day or two on a Titan X for a reasonable $75 U.S. I still am kicking myself. But I console my injured pride with the knowledge that my new, used 980 Ti has the nice water block, while the Titan X had the obnoxious blower cooler, but I still would have preferred Maxwell when I'm talking in my sleep. Maybe I'll still find one and switch the coolers and sell off the inferior combo. There are STILL people asking for $400 for the Titan X - maybe they've been in a coma? Even so, I have to admit that I'm maybe even more excited to see this old, instantaneously responsive analog config via the warm glow of X-Ray radiation than I am about my newly acquired RTX 3080. And THAT'S coming from someone whose last major GPU investment was a GTX 690. So, stupidly, I'm holding hope that I can still snatch a steal on the Maxwell variant AND side-load into an RTX 3090 with no additional investment after reselling the 3080. A man has to have goals, right? But ahhhh, nostalgia. Dreams of Quake and Doom and Wolfenstein and Sugar Plum Fairies dancing across my Sony Trinitron screen....
Very cool GPU to have, but anything before 2015 isn't great. Kepler GPUs are infamous for their batched rendering, in which they render large batches of geometry. It's not exactly well optimized for the latest titles.
Driver support has now ended on that series. If you already have a 570 you have the option to use NimeZ modded drivers to keep them up to date, but if you're shopping I'd recommend looking at the RX 5500 or 5500 XT instead.
That's good to know. I upgraded my pc recently and I have to build a pc for a family member, so I was thinking about using the old RX 570. Thanks for the info!
I think it would be good if you made a comparison between an RX 580 2048SP and this, to compare an old FLAGSHIP with a somewhat "midrange" GPU. I believe the two have some distinct differences.
Since I started having a job I don't buy hardware by its price but by the performance tier I need. I am on 1440p VRR, so I need hardware that can run the games I want to play at 60-100 FPS at my screen resolution. The issue starts when this tier shifts from mid to high between generations. Years ago a 1070 was just great for that resolution and framerate, and today you can't even run some games with 12 GB of VRAM...
Those Kepler and Maxwell blower coolers were honestly gorgeous
All of Nvidia's FE GPUs look great (except the 3070 Ti)
Totally agree, they have a very modern, minimalist, artistic look to them. STRIX and FTW cards are the only ones that come close. I just took apart a dead 3080 FE and wow, they really stuffed the parts in there and reinvented the blower cooler. I broke one of the fan ribbons accidentally too. I thought that I had the ribbon all the way unlocked, and the PCB slipped off as the old paste let go and it ripped like butter. I have taken apart the 6/7/9 series FEs with no issue though. @@StrixWar
@@Hughesburner I have taken apart FE models from the 700, 900, 1000 and 2000 series. I own a 3070 but haven't taken that apart yet
I've got a 1080 FE and I think it looks great too, but in my 23-year-old mind, those cards really look the most impressive and powerful.
The 1080 is still a good card. Don't let anyone tell you otherwise. Yes, it's older but it still keeps up, and the cost of these, at least in my local market, is around $100. I always stay a generation behind, as the cost is lower. @@TheBcoolGuy
I just freaking love how the titans looked
I repainted the silver shroud on my 980 FE to look like a titan X from the same generation.
Now we wish for the days when the best only cost $1000. :(
But it only lasted as the best for a few months, so those who got it back then really got kicked in the balls, compared to those who got an RTX 4090 at launch; now it will be a year and a half with nothing coming even close to it
@@Ciffer-1998 meh. The GTX 1080 Ti replaced the normal 1080 in less than a year but that "normal" 1080 remained amazing for many years.
@@GTXDash Don't remind me. I bought a GTX 1080 for $700 just 1 week before the 1080 Ti came out at a price of $700, while the 1080 fell to $500. I felt like shit, as I pretty much threw away money on a 30% weaker GPU. Still, it served me well till December last year.
@@Ciffer-1998 Ouch! Yeah, timing is everything.
Wait for the RTX 5070 Ti, it will perform like an RTX 4090 at half the cost xD The RTX 5000s will be out in Q4 2024 or early Q1 2025 @@Ciffer-1998
It's bizarre to me how many Titans Nvidia put out. Kepler alone had 3 (Titan, Titan Black, Titan Z) and Pascal had 2 (Titan X and Titan Xp). Other major architectures just had 1 each.
Don't forget the Titan V from the Volta architecture
@@yeetingmymeap5540 And the Titan X based on Maxwell
@@DasMaurice I would complain about it being overpriced but then again I was one of the suckers who bought the 3090 and 4090 on release
Titan RTX... did anyone even buy one?
@@AliceC993 IIRC Blender Guru owned one before upgrading to a 4090
Really dig that era, when the reference coolers looked seriously industrial yet futuristic at the same time
"Live in the suburbs but drive a Jeep." First time I've watched your channel, and I can already tell I'm a fan from that quote.
I had 2 Titans in SLI. Most of us got Titans for the VRAM. But I also had mine modded and running absurd power, voltage, and clock speed. 1.425v on the core at 1450 MHz, about 450 watts in 2013. It used to trip OPP on my power supply until I sorted that out. I used to run 5760x1080 back then at 120 FPS in 3D. Those were the days.
Chad setup
@@LawrenceTimme almost Gigachad, if only I had one or two more Titans 😆
Your rig back then: Cocaine is a hell of a drug 😵
It is a bit shocking to realize that the RTX series is already 5 years old, half the age of this titan legend.
Time really is cruel.
My GTX Titan gave me a decade of gaming enjoyment and was totally worth every penny.
Great to see how the older cards still fare today! Keep up the great work, good sir!
Until a few years ago, Titans had their own drivers. Based on the Studio drivers rather than the Game Ready line. This is clear evidence that they were marketed at Workstations (specifically Game Devs), and not just aimed at gamers with too much money.
That's a very good point! The Titans may be overpriced compared to their GeForce gaming counterparts, but they're only a fraction of the price of their Quadro workstation counterparts
@@thevideolord Which is why the Titans got replaced by the 90 series once again. Sure, Nvidia could sell these whale cards to gamers for a grand. But why would a corporation buy a Quadro M6000 12GB for 5 grand when you can get almost the same workstation performance from a Titan X Maxwell for 1/5th the price? Every time a corporation bought a Titan instead of a Quadro was thousands of dollars Nvidia could have gotten instead.
So they dropped the workstation features and dropped the Titan name. The whales still buy the 90 series cards for well over a thousand dollars. But now corporations need to buy Quadro cards (or RTX A cards, as they are called now) if they want the workstation features.
For a short while, yes; then Nvidia capped them after. It's a good thing I never collected the original Titan. Pascal, I think, has the best Titan in its lineup, as anything after or before is a shame in the feature set.
Biggest flaw for Titan performance was always its cooler.
Undervolting helps a good bit with keeping it around the 1GHz mark on the stock blower cooler, however you can't do it without vBIOS modding :(
I own a Titan Black, which should be 5-10% faster than the OG one (depending on how much VRAM bandwidth is the limiting factor).
Fun fact: if you enabled full-speed FP64 mode (1:3 vs. 1:24) on it via a driver tick box, your performance would get worse in games :D
I got an OG titan waterblock unused if you need it 😁
@@M-dv1yj I could potentially be interested. I have both the OG and the Black. But of course, if he wants it, let him have it :)
Back in the day people were like "a 999EUR card? oh hell no" and now they're fine with 1200+ EUR higher mid-end cards like 4080.
I know there's inflation etc. but wages haven't risen to match that.
The 80 cards never were „mid end“ cards. They always were high end. NVIDIA just rebranded the Titans as 90s. The 90 cards are still not meant for the average consumer; the 80-tier cards are still the high-end buy for the enthusiast gamer.
If you just consider inflation adjusted MSRP prices in EUR for the best 80 cards today, prices would be like this:
980 Ti: 740€ (2015); 990€ (2024)
1080ti: 820€ (2017); 1.090€ (2024)
2080 Ti: 999€ (2018); 1.650€ (2024)
3080 Ti: 1.199€ (2021); 1.511€ (2024)
4080: 1.330€ (2022); 1.624€ (2024)
NVIDIA lowered the MSRPs of both the 4080 (Super) and the 3080 Ti considerably shortly after release. So if we are talking about high end GPUs being less than a thousand bucks, we have to go aaaalll the way back to 2015, as inflation is (unfortunately) real.
Prices really started to skyrocket with the launch of RTX in 2018. So technically, we are not paying so much more for the 80 GPUs than we did for a 1080 Ti in 2017. Currently I can get a 4080 Super for 1000 to 1100€, which would be around 800€ in 2017. I really doubt I could've got a 1080 Ti before the release of Turing for less than 800.
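[Editor's note] For anyone wanting to sanity-check figures like these, the adjustment is just compound growth. A minimal sketch, where the ~3% average annual rate is an illustrative assumption, not official euro-area inflation data:

```python
# Compound-growth sketch of an inflation adjustment like the one above.
# The ~3% average annual rate is an illustrative assumption, not HICP data.
def adjust_msrp(msrp_eur: float, launch_year: int,
                target_year: int = 2024, annual_rate: float = 0.03) -> float:
    """Carry a launch-year MSRP forward to target-year euros."""
    return msrp_eur * (1 + annual_rate) ** (target_year - launch_year)

# 980 Ti: 740 EUR in 2015 compounds to the mid-900s at 3%/yr, in the
# same ballpark as the 990 EUR quoted above.
print(round(adjust_msrp(740, 2015)))
```

The real calculation would use the year-by-year official inflation series rather than a flat average rate, but the mechanics are the same.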
I'm so glad your channel has grown so much since I found it. The quality of your videos is awesome and I love your commentary.
this ^^^
absolutely love what they do
Damn, the Kepler Titan. I had two in SLI, combined with Core i7 3970X, 64GB RAM and a 2560x1600 Dell U3011 display. My first and last multi-GPU system, but it kind of was needed for 1600p resolution.
I built a 3930K with 32GB RAM and 1x Titan. I upgraded the X79 build a bit and it's now my spare pc
Do you still have that pc?
mGPU is heavily needed these days with RT, but developers are using upscaling tech as a crutch instead of raw power.
That's pretty much what my dream PC would have been back then. But my cheap X58 PC with an i7-920, 12GB RAM and GTX 660 did run games decently at 1080p, I did later upgrade to 3x 1080p (5760x1080) and used NVIDIA Surround which was a bit much for the GTX 660 but worked well in older games like GTA IV, L4D2 and even BioShock Infinite
@@AntiGrieferGames I still have one of those Titans, in a random shoebox. The PC went to my younger brother (who is not really interested in gaming and basically needs a multimedia / office PC) after I upgraded to Ryzen 7 5800X and Radeon 6900XT.
Technically, I didn't go directly from Titans to 6900, it was SLI Titans → GTX1080 → RTX2080Ti → RX 6900XT. Going from Titans to 1080 was a good move, since SLI was on life support by the time. 2080Ti - not so much, I kinda regret the money wasted, but I also moved to 4K at the time and 1080 just couldn't cut it. 6900XT though, that thing *stays* at least until 5090 comes out, or maybe even 6090, depending on the circumstances.
This brings back so many memories. I paid £837 at OCUK for mine in November 2013 to go in my 3930K X79 build. I still have an upgraded form of that build as my spare pc
Nice 😎
Perhaps a month or two ago, I started watching computer hardware videos again after not following it much for years, probably since about 2017 or 2018. After having gone through the madness of learning what new standards are and all that, I've begun to accept it and I occasionally think about upgrading my PC from Devil's Canyon to a new Ryzen. Man, not only is new hardware REALLY expensive, but it also would not benefit me in any way. I don't want to play any games my current PC can't play and I don't have any reason to make the games I create require such expensive hardware. I'm very happy as I am, making games that are actually optimised and that normal people can play.
the window showing the heat sink on older nvidia gtx cards is just perfect
I think even Maxwell's performance is being dialed down gradually as of late.
The reason I'm saying this is because my ancient HD 7790 (R7 260X), based on GCN 2.0, which also happens to be the first DirectX 12 GPU in the world (for some reason), is now absolutely spanking the likes of the 650 Ti Boost (Kepler) and the legendary 750 Ti (Maxwell 1.0).
NVIDIA does this. A few years ago I did some testing against a R9 390 and a 780 ti. The 390 decimated the 780 ti in every test when it really shouldn’t have. NVIDIA has been proven time and time again to be extremely anti consumer.
It would be a nice curiosity if you could test that Titan on a proper Linux distro under Proton and see if the same games encounter the same issues.
It's funny that the decade-old original Titan is faster than the RTX 4090 at FP64.
Please keep making videos, you're the best at it ❤
Gtx 1080ti is still the daddy for its time
great video❤
Finally, somebody that tests contemporary games.
For those curious, Baldur's Gate 3 does run on a Titan Black (and I assume the OG Titan, and 780/Ti), but the map features only show the place markers and no base map using DX11, and Vulkan crashes to desktop upon loading. This is a known bug, but since the card is below recommended spec for the game, it hasn't been a priority. This may have been fixed in the last two patches, I cannot say; I switched out cards in my second system for an Arc A380 experiment in October.
If you want to see the map there is a very annoying workaround: it requires copying DXVK files into the main game folder, a la Linux, so the game's DX11 renderer gets translated to Vulkan (if you want/need it, google 'black map fix for BG3'; I believe I found it in a Reddit post). On other older Nvidia cards this is virtually frame rate suicide, but the Titan, which would otherwise pull respectable 60ish figures, drops into the 30s (which is still devastating).
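[Editor's note] The workaround described above amounts to dropping DXVK's D3D11 DLLs next to the game binary. A hypothetical sketch of that step; the `install_dxvk` helper, the directory layout, and the choice of DLLs are illustrative assumptions, not the exact instructions from the Reddit post:

```python
# Hypothetical sketch of the DXVK drop-in workaround: copying DXVK's
# D3D11 DLLs into the game folder makes the DX11 renderer run through
# Vulkan. Paths and helper are illustrative, not the exact guide steps.
import pathlib
import shutil

def install_dxvk(dxvk_x64_dir: pathlib.Path, game_bin_dir: pathlib.Path,
                 dlls: tuple = ("d3d11.dll", "dxgi.dll")) -> list:
    """Copy DXVK's translation DLLs next to the game executable.

    Returns the list of DLL names actually copied.
    """
    copied = []
    for name in dlls:
        src = dxvk_x64_dir / name
        if src.is_file():
            shutil.copy2(src, game_bin_dir / name)
            copied.append(name)
    return copied
```

Deleting the copied DLLs from the game folder undoes the workaround, so it is easy to revert if performance tanks.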
GTA V is indeed a sleeper hit and an indie title.
Add Metro Last Light for 2013.
always was interested in the titan cards. glad you made a series about them!
Also holy hell you totally called the Forza horizon 4 thing.
Test the $2999 Titan Z: This was a dual GPU beast with 12GB VRAM, which should still have been adequate in 2023 games, but the aging Kepler architecture and fading SLI support kept this from aging well. In games that don't support SLI, this only allocates one GPU and 6GB VRAM.
When you have a Titan, you have to download the drivers from the second row.
It's like going to the second level. VIP level. No shirt, no shoes, no Titan.
I owned an OG Vanilla Titan back then, bought it because of the OC potential. A stock Titan smoked everything, but fitting it out with watercooling and an unlocked BIOS with ramped-up voltages could easily yield another 20%+. Core clocks up to 1250 MHz were possible, and the VRAM matched up to that as well. Also a big thing back then: downsampling. You needed a special BIOS with modified timings and clock states, but then you could downsample every game to your heart's content. I mostly played in 5K back then - for older and modded games, of course.
Proudly using SLI of these 2 beauties in my X58 Win XP setup
I really enjoy, and respect you and your content. Thanks for another banger bruv
I always wanted a Titan because it seemed so ludicrous to have like 6GB of VRAM on a $1k graphics card. I thought games wouldn't go past 4-ish GB of VRAM, like, ever
hehe...he....he... huh.... Well surely games will never need more than 8gb right?
oh.. what that? oh they already do? Well surely games will never need more than 12gb.... right?
@@FacialVomitTurtleFights 4070 waving
@@FacialVomitTurtleFights people were saying games will never use 256MB on my FX 5950 Ultra 😂😂
@@LawrenceTimme No shot 😂 256mb lol
oh man, ain't technology something... I have a GPU box for a "GeForce4 Ti 4200" 128MB DDR... still have the Xtasy demo CD or whatever, the one with like half a dozen cool rendering videos xD to show the POWER of the GPU... ain't technology something...
Compared to RTX 3090 Ti owners who forked out just months before RTX 4090 released, the Titan X held its crown longer and didn't lose the battle as badly. 😆
You said some games couldn't be played on max settings when they released, even on the Titan. Wasn't 2013+ still the SLI timeframe? Therefore a single Titan might not have made it, but multiple? My friend only used SLI setups back then, and if I remember correctly he was using a GTX 690 Quad SLI setup and switched to Titans year after year.
Edit: Not a Titan owner myself, so I don't really know which Titan cards could do SLI.
I wish I could see a video about 2070 super from you. Thanks for the content!
It's sad to me that the Titan brand is kind of dead, replaced by the 90 series. I remember envying builds that included a Titan XP so much
All fishi rishi's working class friends had titans
I bought a Titan Xp for $1200 during the crypto boom; 1080 Tis at the time were going for $1300+.
I got 2 of them in SLI, the 12GB ones; they only cost $200 for both. They can play anything on the highest settings. Better than most single cards like the 1660 Super, and half the price. And I noticed they could do 1440p at 88 FPS in Elden Ring. Some games seem to really like the cards together, and RDR2 finally does not crash like on my 1080 and 1660 single cards. I'm happy with everything, and only use MSI user fan control and no overclock. All I ever wanted and need.
I bought two Titans upon release, upgrading from my GTX 690. Then sold my Titans after maybe a year and got GTX 980 cards. Sold the Titans for almost as much as I bought them new. Oh, the days when my income was growing exponentially relative to my monthly expenses.
7:13 I think you meant 19 months, not 9 months. The E5-1650v3 was released in September 2014
Looking forward to the Maxwell Titan, as that card gave you 12GB of VRAM, still decent today.
what an excellent piece of content!
I have kept desktops from each era alive and working in our home. I had to wait until last year to pick this gem up, but I'm happy to have it in our home museum next to my former personal 290X build. Hey, I chose better back then; now I simply skip a chipset and only upgrade every 6 years on a GPU. Thankfully that skip allowed me to keep my 1080 Ti going until the 6950 XT took its place while I wait for AM6.
Commenting to help with the algorithm. Also, 780m performance from a Titan? Interesting that maybe in 10 years, we’ll have APUs performing like a 4080 or 4090. Crazy.
I remember when the Titan was first released people MOCKING Nvidia and those who paid that much money for one.
In 2023, I still mock people for paying over 1000 euros for a GPU. I choose to ignore when people tell me about how I paid a bit over 1000 for a CPU back in 2014-15, simply because I still use it as my main PC, mostly out of need, but it still works fine.
Modern gaming is great, just don't pay for gamepass and similar BS. Buy games physical when possible, or DRM free. If gamers avoided the BS, we wouldn't have it.
Thanks for the video.
yes, PC gamers & even some reviewers seem to have perpetuated a lot of myths in PC gaming & now they're suffering because of it.
Cope
@@LawrenceTimme *sigh* lol
Now with AMD abandoning Polaris and Vega in terms of driver support, it's time to revisit these cards and compare benchmarks with the modded drivers vs the last official one.
I have a knack for picking cards that don't age well, first the GTX Titan and now the Vega 64, but it's sitting in my spare PC so I'm not too worried
@@DoktorLorenz Vega aged pretty well actually. And they aren't quite abandoned; they are just receiving only critical updates now, as the drivers are relatively mature.
That AKKK-cent is dope AF bro
The Titan is where the marketing team started to go ludicrous...
10 years from now: wish we could go back to the days of only paying $1600 for the best GPU instead of $4000
love your vids
i had a titan at launch, and it was an amazing upgrade from my GTX 560 Ti, so I did not regret buying it at all. The funniest part to me was when the PS4 launched later that year: my GPU was 3x faster than it, and all the console peasants with a shiny new PS4 were beaten by a GPU older than their console. (Actually took someone home to show them what 144 FPS looked like at real 1080p)
There was a key value proposition for the original Titan you didn't mention.
1/3 rate FP64.
It had unlocked FP64, operating at the full 1/3 rate of FP32 that the GK110 was capable of. The consumer Kepler cards ran FP64 at 1/24 speed. The Titan was a fantastic value if you needed that; it was much cheaper than any Tesla option available at the time.
Granted you lost the scalability features of Tesla, but if you could do without that then Titan was a wonderful value.
It definitely wasn't for everyone, but looking at it squarely through the lens of gaming ignores a niche but crucial feature that made it a very easy sell for me back in the day.
Alas, they removed this benefit from the Titan line after the Titan Black/Titan Z.
Still, here is a comparison point. Feed it the right workload and that old GTX Titan I had in 2013 is still several times faster than the RTX 3080 I have in this computer.
There was a brief shining period when the Titan was a tremendous value before they neutered the line after the Titan Black, and later Titans were just gaming cards with a fancier name.
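The 1/3 vs 1/24 (and the modern 1/64) rates this comment mentions can be sanity-checked with back-of-envelope math. A minimal sketch, using approximate public spec-sheet clocks and core counts rather than measurements:

```python
# Rough peak FP64 throughput: 2 FLOPs per FMA per CUDA core per
# clock, scaled by the architecture's FP64 rate. Spec-sheet figures
# are approximate; real workloads will land below these peaks.
def fp64_tflops(cores, boost_ghz, fp64_ratio):
    return 2 * cores * fp64_ratio * boost_ghz / 1000

titan = fp64_tflops(2688, 0.876, 1 / 3)    # GTX Titan (GK110), unlocked FP64
rtx3080 = fp64_tflops(8704, 1.71, 1 / 64)  # RTX 3080 (GA102), consumer rate
print(f"GTX Titan ~{titan:.2f} FP64 TFLOPS, RTX 3080 ~{rtx3080:.2f}")
```

That works out to roughly 1.5 vs 0.47 TFLOPS, i.e. the 2013 card several times ahead of the 3080 at double precision.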
Idc about the performance, it certainly looked better than all current GPUs
That center cooler and brushed silver/gray finish on the casing with the radiator peeking out.. PERFECTION
It wasn't until I bought a Quadro M6000 12GB card in recent times that I found out just how serious the build quality is.
adjusting for inflation, the 4090 is priced just like the old titan class cards were.
Not really, $1000 in 2013 would be around $1350 today, and the 4090 MSRPs at $1600.
Here the GTX Titan was in the 1100€ range (less than 1400€ today), while the cheapest RTX 4090 is 1800€, with most of them over 2000€. The RTX 4080 is around the same price as the GTX Titan when adjusted for inflation.
@@technicallycorrect838 but the official inflation figure is bs. So in reality its same price
@@LawrenceTimme I don't think you understand how inflation works chief.
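For what it's worth, the adjustment being argued about in this sub-thread is a single multiplication. A minimal sketch, assuming the rough ~1.35 factor quoted above (the thread's figure, not an official CPI series):

```python
# 2013 USD -> 2023 USD using the thread's rough inflation factor
# (an assumption for illustration, not official CPI data).
CPI_2013_TO_2023 = 1.35

def adjust(price_2013, factor=CPI_2013_TO_2023):
    return price_2013 * factor

print(adjust(1000))         # OG Titan MSRP in ~2023 dollars: 1350.0
print(adjust(1000) < 1600)  # still under the 4090's launch MSRP: True
```

Whether the official figure understates real inflation is a separate argument; the arithmetic itself is not in dispute.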
And yet Titan RTX was not much different from 2080Ti
Honestly, I think it's a good thing that Nvidia replaced the Titan series with the RTX x090 GPUs. The Titans weren't used by businesses, and integrating them into the standard GeForce RTX lineup was the right decision. However, it would still be cool to see a new card with that Titan feel to it. There were reports of an RTX 4090 Ti and an RTX Titan Ada coming soon, but those were nearly a year ago and nothing has shown up. What a shame...
That is why, when prices are similar, I always prefer lower-end but newer gen over higher-end old gen; game optimization matters a lot over the years
RTX Titan at $3000 LETSGOOOOO
Clash of the Titans soon? Pog
Wish one of these companies would try dual GPUs on one card and dual GPUs in one chassis. Then I could connect what effectively amounts to four GPUs to run all games for all time... probably why there were so many 'issues' they 'just cannot figure out'.
Cause money.
the first gen where Nvidia didn't drop the -100/110 chips as the flagship at launch. I was stoked for a GPU that was gonna double the performance of a GTX 580... instead we started getting the middle chip as the flagship first lol
I wish they’d bring them back; “Titan” was a badass name. A peasant like myself has never had one, but I can sort of relate to the part where you said the people who bought this card brand new felt disappointed that it didn’t hold “top tier” performance very long 😅 that’s basically how I felt this year with my 3060ti. Of course I’m aware it wasn’t (and has never been) a “high end” card…but it does perform marginally faster than the 2080 Super, which was technically a previous gen flagship (aside from the slightly faster 2080ti). So for a brief moment I felt a sense of satisfaction that has since been destroyed by 2023 😭 jokes aside, I’m still perfectly happy with it and enjoy high refresh 1080p native and 1440p using DLSS
That was a really funny intro haha
i have three titans: Kepler, Maxwell and Pascal. I keep them on a shelf to remind me of better days.
I have a titan black on my shelf waiting for a good deal on a second one to do an old school SLI system
rocking a used 12gb model, it still kicks ass
why change the test CPU? also you said you'd post at least a chart of all the CPU/GPU results
0:30 the lack of "tit" in "titan" tho 😵😵😵
You should make a video on how you can use Linux to play DX12 games on cards that don't support DX12 features. This is how I play Elden Ring on my R7 370 4GB (HD 7850 rebrand)
I believe it is actually an r7 265 rebrand, so a slightly new GCN version.
@@jakelowery7398 well yeah, and the R7 265 is a rebrand of the HD 7850, so it's GCN 1, so it doesn't have DX12 features
The 265 is an HD 7850 refresh too
Phenomenal video. I love Kepler; Nvidia disrespected this generation a lot, the drivers were terrible, my 660 wasn't that usable by 2015 in new games, but it established a standard. The Titan is basically a 250W 1060 6GB in 2013. I'd still like to own one; there's probably a driver version out there that offers better performance, but this ain't that bad either.
good old 2013, when I got my first modern high-end PC with an i7 3770 and a GTX 660.
I got a Ryzen 5600 and an RX 7800 XT now. I do believe Nvidia purposely sandbags some of their old cards. I noticed that when I was using a 780 Ti a while back: I played Hellblade: Senua's Sacrifice and was getting 45-50 FPS at 3440x1440, then a couple years later I played it again on the same system and display and it was like 20 FPS. *99% sure it was maxed-out settings both times.
Got my first big Radeon card now and it's fricken awesome, and I highly recommend it over the Nvidia cards unless you're mental and want to burn your money on a 4090, which is really all Nvidia has right now imo.
I fkg love this review. I recently purchased a GTX 980 Ti (on purpose) as a secondary card, simply for the DVI-I analog output to connect to CRT monitors and TVs for pure, end-to-end analog gameplay.
Unfortunately, I missed out by a day or two on a Titan X for a reasonable $75 U.S. I still am kicking myself.
But I console my injured pride with the knowledge that my new, used 980 Ti has the nice water block, while the Titan X had the obnoxious blower cooler, but I still would have preferred Maxwell when I'm talking in my sleep.
Maybe I'll still find one and switch the coolers and sell off the inferior combo. There are STILL people asking for $400 for the Titan X - maybe they've been in a coma?
Even so, I have to admit that I'm maybe even more excited to see this old, instantaneously responsive analog config via the warm glow of X-Ray radiation than I am about my newly acquired RTX 3080.
And THAT'S coming from someone whose last major GPU investment was a GTX 690.
So, stupidly, I'm holding hope that I can still snatch a steal on the Maxwell variant AND side-load into an RTX 3090 with no additional investment after reselling the 3080. A man has to have goals, right?
But ahhhh, nostalgia. Dreams of Quake and Doom and Wolfenstein and Sugar Plum Fairies dancing across my Sony Trinitron screen....
Lol, 4K ran on Dual Link DVI back then. And good luck finding such a monitor (mac pro display?)
Very cool GPU to have, but anything before 2015 isn't great. Kepler GPUs are infamous for their batched rendering, in which they render large batches of geometry. It's not exactly well optimized for the latest titles.
I think my rx 480 was about the same. Man I loved that card.
I have an idea: could you review whether the RX 570 is still viable in 2023/4?
I know that it has 4GB of VRAM but I am interested
Driver support has now ended on that series. If you already have a 570 you have the option to use NimeZ modded drivers to keep them up to date, but if you're shopping I'd recommend looking at the RX 5500 or 5500 XT instead.
That's good to know. I upgraded my PC recently and I have to build a PC for a family member, so I was thinking about using the old RX 570. Thanks for the info!
Can you do amds equivalent
the 3090 and 4090 are basically Titan-level cards, except they dropped the Titan naming scheme.
I actually still use a Titan X paired with a Ryzen 9 3900X and it still plays games
I have a 2080 Ti; I randomly googled the RTX Titan to see if I could find one on Marketplace, cheaper than all the new garbage.
It's the same as a 2080 Ti in gaming
I think it would be good if you made a comparison between an RX 580 2048SP and this, to compare an old FLAGSHIP with a somewhat "midrange" GPU. I believe the two have some distinct differences.
Since I started having a job I don't buy hardware by its price but by the performance tier I need. I am on 1440p VRR, so I need hardware that can run the games I want to play at 60-100 FPS at my screen resolution. The issue is that this tier shifts from mid to high between generations. Years ago a 1070 was just great for that resolution and framerate, and today you can't even run some games with 12 GB of VRAM...
Back in the day when buying the highest end card all but guaranteed almost 100 FPS at max settings in every game.
1:26 ”Art“ lmao
Would like to see the Tesla M40 appear in your videos one day
I wish modern GPU’s weren’t so ugly. Just makes the cost that much tougher to stomach
4090 being the modern day equivalent to a Titan card at this point.
Ooooh i had that one! But it did not have dx12_0 feature level, so I had to replace it….
super curious how it compares to the 290x
Always love these story ones! The 750 Ti was my first, so Maxwell, not Kepler
Good news: the 750 Ti still gets driver updates.
Mum turn on the telly! new iceberg video just dropped
Yes sir!!...
Really interesting video, seeing how old hw performs these days always is.
PS: I believe at 10:37 the tested resolution was 1080p and not 1440p.
That was the beginning of the end of well-priced high-end GPUs..
you used ultra not ultimate settings in tomb raider
I thought these were just reject Quadro cards without ECC lol?