The Slow Death Of The GTX 690

  • Published 30 Sep 2024

COMMENTS • 594

  • @IcebergTech  1 year ago +181

    Okay, some footnotes:
    I lost my voice while recording the VO, but there were a few things I wanted to add or change that I couldn’t record in time.
    *4 & 6 way SLI:* Craft Computing, the example I included in the footage, used three GTX 690s as part of a cloud gaming server. Other valid use cases for 4-way and 6-way SLI include triple monitor 3D Vision Surround setups. When I said that it was “just a flex”, I speak from the perspective of someone who doesn’t need to run large numbers of VMs, and who had also never considered triple monitor surround gaming setups as anything but an ostentatious display of wealth.
    *Single GPU vs. Dual GPU:* When comparing the single GK104 vs dual, I applied a small OC to the single chip setup. This was to more accurately reflect the experience of someone who had “only” bought a single GTX 680. I’d have gone higher, but my card couldn’t handle more than about 75MHz!
    *Drivers:* Although I’m talking about the history of a GPU from 2012 and the games it would have been played on at the time, I am using it in a modern system running Windows 10, with 474.04 drivers (the Nov 2022 security update, the latest available for Kepler). I could have used period-appropriate drivers and hardware to more accurately reflect the conditions of the time these games were released, but my schedule didn’t allow it. Sorry!
    *Skyrim:* I said in the script that I didn’t see any of the physics glitches I’d heard about from running the game above 60 FPS. I *should* have said that the only obvious glitches I could see were the weirdly fast hand movements and the flickering, *and that* this might have been caused by running the game above its intended frame rate, and was probably *not* related to SLI. I kinda said it with the on-screen text, but I wanted to be a bit more clear about it.
    *Doom:* I included some footage of Doom 2016 but ended up deleting the associated VO as it didn’t fit the flow of the narrative, and I feel the footage might need some further explanation. Doom 2016 was remarked on at the time as running worse on high end Kepler than on entry level GCN cards, and that still seems to be true 6 years later. I couldn’t get the game to run in OpenGL at all, but in Vulkan it varied between 20 and 40 FPS at 1080 Low. It also didn’t use the second GPU, even with adjustments made in nVidia Control Panel and nVidia Profile Inspector.
    *Fortnite:* At the last minute, I went back and retested Fortnite in SLI, because I’d seen someone else’s results that were very different from mine. Despite multiple attempts with different AFR modes and profiles, I still saw this massive performance regression in DX11, and DX12 didn’t acknowledge the second GPU at all. Performance Beta, meanwhile, performed very well, but soon found itself running above 200 FPS. At this point, CPU performance would be the limiting factor.
    *Asynchronous compute:* There was something of an uproar some years ago when AMD accused nVidia of “not even having async compute” in Maxwell, a feature which nVidia apparently *did* include a form of as far back as the GTX 780 and Titan. I don’t intend to drag up an old Red vs. Green argument, but in the case of the HD 7970 it is a fact that async compute was something the AMD card had that the GTX 680/690 didn’t.

    • @theburger_king  1 year ago +5

      I can’t read that much lol

    • @thcriticalthinker4025  1 year ago +4

      So, regarding 6-way SLI:
      SLI maxes out at 4 GPUs, as there is no bridge configuration for anything higher.
      This doesn't stop you from slapping 30 cards in a rig and using them for other purposes

    • @walrusman151  1 year ago

      Anybody who knows enough about a topic to leave a whole book of information as a footnote is pretty impressive

    • @nepnep6894  1 year ago +3

      Async compute on any Nvidia card pre-Turing was achieved through rapid context switching, so it didn't really net performance improvements; it only really helped compatibility

    • @CertifiedAsher  1 year ago

      Hello, cool relaxing videos explaining GPUs and CPUs so well. Anyway, can you do a video about the Nvidia Quadro K4200?

  • @mrfahrenheit8138  1 year ago +105

    Officially addicted to this channel

  • @terzaputra3203  1 year ago +348

    I have a friend who was struggling hard to sell his old 690 a few months ago. Besides unsupported drivers and broken SLI support, the main problem was its power consumption. It competes with the likes of the 1050 Ti in the same $80-100 price range, a card which is 3-4 times more power efficient. At that price, people would choose the more efficient card rather than a glorified heater. He ended up making a nice glass case for it and decorating his wall with it.

    • @misiekt.1859  1 year ago +34

      For $100 you can get an RX 5600 XT, which is around a 1070 Ti. Even the 5500 XT is 2x faster than a 1050 Ti and often below $100.

    • @HolographicSkux  1 year ago +12

      The 690 will produce a higher frame rate than the 1050 Ti, but it is essentially a 680 at this stage. Glad I got rid of my 690 years ago, but it was cool while I had it.

    • @jimtekkit  1 year ago +7

      Yeah, the power efficiency of today's cards is a big reason to buy new. Right now I'm on an RX 570 and it consumes twice the power of my RX 6600 while only giving about 60% of the performance. And you can definitely notice the additional heat output while it's loaded up. It's a similar story on the Nvidia side as well.

    • @crazeguy26  1 year ago +2

      I love my GTS450 zoom zoom!

    • @VarietyGamerChannel  1 year ago

      @@misiekt.1859 The Radeon 5000 series are also jet-noise heaters.

  • @chincemagnet  1 year ago +3

    Back in the day, the reason many of us ran multiple high-end cards was because we were running 3D or Eyefinity/Nvidia Surround, and when DSR came out it was an easy way to upscale to 4K. Early on, for me it was an obsession with Crysis.

  • @younglingslayer2896  1 year ago

    hey man, at 2:27 which Tomb Raider game is that? I've always been interested in getting into the series

  • @TillTheLightTakesUs  1 year ago

    I had 4 7970s. FOUR OF THEM. I’m sure they’d still do well today.

  • @LineFire2012  1 year ago

    GTX 700 series are dying too; my EVGA 770 4GB has graphics issues with the shadows in Warzone, and Halo Infinite outright says the GPU isn't compatible :(

  • @ChineseNinjaWarrior  1 year ago

    I use this card. It's such a pain that most newer games don't support AFR. Those that do work perfectly with this card. Skyrim flickers because of the physics; there are mods that fix that. I play Skyrim SE modded with lower-res fixed textures at 1080p, locked to 60 FPS with V-sync. It never drops below 60 and the fan stays quiet. What can I say, the GPU gets over 300 FPS in some games; in others it can barely output 30 FPS. Bad game design, no AFR. In good games it works well.

  • @clashwithkeen  1 year ago +1

    I'm still running a 4GB GTX 680

  • @ALmaN11223344  1 year ago +174

    This was back in the era when I ran GTX 460 SLI and helped a friend build a 9800 GTX SLI setup because they were $40 each on Amazon... oh, how the times have changed

    • @raginginferno868  1 year ago +26

      you can't even get a GT 1030 for $40 nowadays, and that card sucks ass

    • @ALmaN11223344  1 year ago +9

      @@raginginferno868 Right? It's crazy.

    • @prateekpanwar646  1 year ago +25

      @@raginginferno868 There's no sensible GPU a kid could throw in his school gaming PC to play some of the latest games at low settings.
      The market has a huge gap; companies could just sell older models like the GTX 1060, which is made on older wafers and is thus cheaper.

    • @raginginferno868  1 year ago +4

      @@prateekpanwar646 What? Your comment is completely irrelevant to what I said. I was just pointing out that GPU prices are going up every new generation. And there are a few GPUs that are not too expensive, can fit a tight budget, and still offer 60 FPS on medium settings.

    • @BoleDaPole  1 year ago +13

      Nvidia found out what other gaming companies learned long ago:
      Gamers are easy to fleece and they're willing to pay whatever it takes.

  • @scurbdubdub2555  1 year ago +312

    I've owned a 690 before; it wasn't that bad of a card for what I played. I eventually put it in another system, though, and sold it. I liked the card, it was interesting, but it was aging. I'm now on an even more interesting card, the Radeon VII. I really enjoy using it. It has aged surprisingly well considering how it was hammered at the time. At 1440p high-to-max settings it can play just about everything.

    • @EbonySaints  1 year ago +30

      Hold on to that Radeon VII for dear life. They and the Vega series supposedly got massacred in the mining boom (according to Actually Hardcore Overclocking) thanks to the HBM being incredibly valuable for mining. You might be one of the few lucky holders of a working model. Should have got one before the price boom, when they were at $500.
      And as for interesting GPUs, I can understand, though Arc seems way more frustrating than interesting to me right now. 😅

    • @Saved_Sinner0085  1 year ago +4

      Radeon VII is still a good 1440p card, with the bonus of lower heating bills in the winter, lol. It's not nearly as bad as Vega 64 thanks to the die shrink.

    • @theboostedbubba6432  1 year ago +4

      Still rocking a 1080 Ti. Both great cards that still shred 1080p and are great for 1440p. Enjoy it.

    • @Saved_Sinner0085  1 year ago +1

      @@theboostedbubba6432 Yup, I've got a 1070. My display is 1080p 144Hz and the 1070 still runs everything on high at 60+ FPS.

    • @theboostedbubba6432  1 year ago +3

      @@Saved_Sinner0085 The GTX 1070 still provides a great 1080p experience, and you can find them for $115-120, which is great performance for the price. The 10 series was fantastic.

  • @Saved_Sinner0085  1 year ago +11

    Oh wow, only a 300-watt TDP? That's on the low end of the high end now. There are also CPUs pulling close to that now.

    • @kinkykane0607  1 year ago +2

      300 watts was considered ludicrous back in the day, as was the fact that it cost £1000. I remember it being quite controversial at the time to spend that much on a graphics card that consumed that many watts.

    • @Saved_Sinner0085  1 year ago +1

      @@kinkykane0607 I remember full well, I've been a PC gamer since before GPUs needed a 4-pin Molex connector for extra power, way before PCIe was even a thing. I remember people getting worried about the extra power consumption of the GeForce 6800 Ultra back in the day.

  • @watercannonscollaboration2281  1 year ago +80

    I've always found this card to be really cool despite its practicality, or lack thereof. Dual-GPU cards just seem novel from the perspective of someone who didn't understand computers back then.

  • @ThinkAboutVic  1 year ago +21

    i love the sudden transition from "foreshadowing is a narrative device-" to the gunshot intro lmao

  • @PyromancerRift  1 year ago +5

    In 2012, only competitive esports players had high-refresh-rate monitors. But they were a thing. I remember BenQ being in all the esports competitions.

    • @IcebergTech  1 year ago +4

      I’ve been toying with the idea of an “original 144Hz PC” video. For the time being it’s on the back burner until I can work out a bigger budget, but the first consumer 144Hz display I could find referenced was the Asus VG278HE from 2012. Seems like that was a banner year for new display tech!

  • @RodimusPrime29  1 year ago +32

    Seeing these older GPUs "kinda" running current games hopefully gives some hope to those who can't upgrade and are running budget builds now. Keeping up with the Joneses isn't viable for most people. Keep the great vids coming!

    • @markm0000  1 year ago

      I am completely out of all of this and just play emulators and old games. If friends want to play a game together we just visit and play split screen.

  • @Djbz170  1 year ago +39

    I used to have a water-cooled Titan Z, BIOS-hacked and overclocked to 1370MHz. It did very well in 1440p games; I only upgraded to a 3090 when they launched in 2020.
    This video brought back a lot of memories of my Titan Z's constant issues with SLI. Most of the time I just ran one GPU to eliminate the crashing, stuttering and poor frame times.

  • @TheVanillatech  1 year ago +104

    The smarter move was buying a GTX 670 immediately, a fairly great mid-range card that lasted 2 years at high settings, and adding a second for pennies later on. My friend built a dual GTX 670 machine just weeks after the cards came out, and he didn't have to upgrade for close to 4 years. As he usually builds an entirely new machine from scratch every 2.5-3 years, he was very happy. His brother, who always inherited the old machine come upgrade time, was sad.

    • @gg2324  1 year ago +3

      But you would need an SLI-compatible Z-series motherboard and a fairly powerful PSU too

    • @TheVanillatech  1 year ago +4

      @@gg2324 Sure, you would need a fairly decent mid-range motherboard, in the region of £80-£100, but for someone dropping £350 on two GTX 670s or a single GTX 690, I'd expect them to spend AT LEAST £100 on a motherboard, possibly even going for an overspecced £150 board.
      People who spend £350-£400 on a GPU don't spend £50 on a board, £50 on a CPU and £50 on RAM to complete the system, after all! (Unless they're crazy....)

    • @TheVanillatech  1 year ago +4

      @@randomguydoes2901 Mature GCN was a good buy. Most ATI cards are a good buy because of the longevity and prolonged driver support. The 7970, 7970 GHz and 280X were amazing buys. At the end of its shelf life, the 280X cost £160. At that time, Nvidia launched the GTX 960 at £220. Almost double the price. But the R9 280X beat the GTX 960 by 10-30%, depending on the game. Obviously Nvidia had to drop the price of the GTX 960 here in the UK in the first week.
      980 Ti and 1080 Ti - both great cards. Easily 2-3 year cards. Questionable with the 1080 Ti, given it almost doubled in price shortly after launch during the crypto rush. But if you got one early enough, despite the high price at launch, the card definitely had legs!
      The HD 4770, GTX 750 Ti and GeForce 6600 GT are the best cards ever released, however.

    • @TheVanillatech  1 year ago +4

      @Crenshaw Pete No shit Sherlock! Technology has IMPROVED over TIME! What a fucking bombshell!
      Any more pearls of wisdom to impart?

    • @oceanbytez847  1 year ago +1

      You'd be surprised. I've saved at least one awful build like that this year. The client was trying to get next-gen performance on a budget and made cuts across the board except on the GPU. Fortunately, it still ran pretty well, but he cut costs so severely he screwed himself out of an easy upgrade and instead would need a total rebuild, when a slightly better board might have supported the following gen and allowed that upgrade. A foolish mistake, but it happens now. I think part of it is the fact that computers have massively increased in price compared to 10 years ago.

  • @drift-wn2dj  1 year ago +103

    good things sometimes end

    • @trulaila8560  1 year ago +8

      More like "always end"

    • @v5k456jh3  1 year ago +4

      So deep, so insightful

    • @maizomeno  1 year ago +2

      people are mad in this comment section

    • @Prod.Sweezy  1 year ago

      all good things must come to an end

  • @TheVanillatech  1 year ago +12

    We know that Nvidia started ignoring their previous generations in terms of driver updates and performance during Kepler and forever after. My friend bought a GTX 980 Ti for £580 after much deliberation. SIX MONTHS LATER - Nvidia released Pascal and announced that only "critical driver support" would continue for Maxwell. Nvidia abandoned customers who had just spent over half a grand on a top-end GPU. Meanwhile, even the HD 7970 and its reincarnation, the R9 280/X, were getting huge performance increases and added features via drivers YEARS afterward.

    • @liquidhydration  1 year ago +1

      Wait, the 980 Ti gets less support? Ever since I got mine this year, all there has been is constant random stable Nvidia driver updates. Now that I think of it, one happened today while I was asleep

    • @TheVanillatech  1 year ago +2

      @@liquidhydration Nvidia released updated statements on their website, on the actual driver pages, saying that all but critical support was dropped for Maxwell just 6 months after the release of the 980 Ti. Basically, the final card of Maxwell and the arrival of Pascal brought about the end of Maxwell. Dropping support for their PREVIOUS generation of GPU, which had sold tens of millions of units, just like that. While AMD were still providing performance and feature updates for Barts and Cypress even in the Vega and RDNA 1 drivers.
      You might not have understood what I said. Nvidia simply make sure the cards still *work* as they should, and don't crash with newer engines / applications. But that's just a fraction of what drivers are supposed to do. They are also supposed to be refined to offer BETTER performance, in both older and newer applications. That's been the case for decades, and is still the case today. Nvidia said "Fuck that! We want our customers to buy Pascal, not cling on to Maxwell for 2-3 years!". So they officially dropped all but CRITICAL support. Which is basically the bare minimum. A fact that you can clearly see by looking at graphs of Maxwell performance on subsequent drivers after their announcement.
      It's like buying a car and, six months later, the manufacturer / garage says "We will still do your MOT, but we ONLY ensure that the car starts and gets you to your destination! We will no longer change the oil, check the brakes, repair the engine beyond starting and stopping, and we won't adjust the seats or replace the windows! THANKS FOR YOUR £40,000 THOUGH! Why not consider our LATEST car? Only another £40,000!".

    • @S41t4r4  1 year ago

      @@TheVanillatech Before trying to make AMD look great, look at the R9 cards at the same age as the 900 series: yeah, not supported anymore.

  • @FatheredPuma81  1 year ago +3

    Tbh most of the people that bought one of these new fall into 2 camps: people that upgraded at either the 980 Ti or 1080 Ti, and people whose gaming interest and standards fell significantly and who will keep using it until the one game they really want to play doesn't launch.

  • @valentinardemon  1 year ago +12

    I had an SLI configuration of GTX 980s; it was really great with my 1440p monitor. In some games, like The Witcher 3 and Battlefield (until 5), I had a bit more performance than a GTX 1080. V-sync is mandatory with SLI; without it frame times are just bad, and you had to love tweaking games and drivers with Nvidia Profile Inspector.
    BTW I still use my iPad Air 1 and it works great for what I do with it, like YouTube, Twitch, some Google research, etc. I use it like a second screen, no need to switch windows or alt-tab

    • @VarietyGamerChannel  1 year ago

      Considering V-sync usually kills 15-20 FPS, SLI was a bad bet. I used to run two Radeon 280s in Crossfire. Same shit. In 90% of games I would have frame-time stuttering and issues.

    • @valentinardemon  1 year ago

      @@VarietyGamerChannel And with our cards we did not have FreeSync; I think it could have solved the multi-GPU problems. The only thing I regret now is that I don't have a heater beside me this winter

  • @ffwast  1 year ago +19

    It's a real bummer that multi-GPU doesn't really get supported anymore. It could have been amazing on modern cards.

    • @spankeyfish  1 year ago +5

      I had a dual 670 setup and it was incredibly inconsistent. It'd stutter at random moments.

    • @MDxGano  1 year ago +3

      Had 780s in SLI; it was the worst thing ever in terms of consistency and jitter vs a single card. There are many reasons SLI died. If we were talking about non-real-time workloads, it would be great. For gaming, however, it was simply poorly paced in all its iterations.

    • @NeovanGoth  1 year ago +4

      I'm much happier with the manufacturers having switched to extremely large dies (Nvidia) or chiplet designs (AMD), delivering scaled-up single-GPU solutions for the high end instead of relying on multi-GPU. It's just as expensive and power hungry, but works _much_ better in practice.

  • @daLiraX  1 year ago +9

    With SLI you will need heavy user-to-user support, trying custom SLI bits.
    There are still some places out there that are into that stuff (fewer of them now, of course), but they often tend to get results after some time.
    If none of them work, you can always try SFR mode as well, but it might introduce stuttering and/or heavy tearing.

  • @zephyr4494  8 months ago +2

    "Foreshadowing is a narrative device-" had me dying

  • @gamewizard1760  1 year ago +5

    I remember 3 years back (before the pandemic drove GPU prices through the roof), a liquidator had a bunch of these for something like $75 each, and it was a long while before I finally decided against it and ended up buying a GTX 970 for $100 for one of my older machines. SLI was already dead, and the fact that it was DX11 only meant that it would be limited to older games. The power consumption is also too high for the performance you get. If you're buying one for testing purposes, or to put on a shelf as part of a collection, then it's fine, but not as a card that you will use every day. 600 and 700 series cards are also out of driver support now, so there won't be any more optimizations coming. You're better off getting a Maxwell- or Pascal-based card if you're upgrading from something even older than a 690.

  • @d0ubleg78  1 year ago +9

    Is the Proton compatibility layer on Linux / SteamOS a viable workaround to play otherwise incompatible titles? If the Vulkan support on Kepler is recent enough, then translating DX12 into Vulkan and running that on the GTX 690 could make for a very interesting video.

    • @actuallyn  1 year ago +2

      It would have a similar outcome to MoltenVK on Apple's side (but worse, due to outdated... everything).
      You need to sacrifice performance to a compatibility layer.
      Unless you use one GPU to do all the translation magic and the other one for rendering?

    • @nepnep6894  1 year ago +1

      DXVK is no longer supported on Kepler and will not work, and it was pretty slow there to begin with. VKD3D never worked on Kepler in the first place.

  • @jean-micheldupont1150  1 year ago +1

    The sad thing is that "flagship" GPUs, doomed to become paperweights after a few years, used to cost 4 times less than they do now, so you would feel a little less robbed back in the day...

  • @TastyGuava  1 year ago +4

    That building that fell still has a lane closed off, 1-2 years later. Still crazy that happened in my area. It's just an empty pit now and no one knows whether they're gonna give a new building permit or turn it into a memorial.

    • @Iwetbeds  1 year ago

      Was just thinking that was an odd choice of B-roll footage.

  • @hypogogix9125  11 months ago +1

    I ran a 690 in my first ever gaming computer back in 2012. I wouldn't have bought it if I had properly researched what I was doing; I just got excited, had some money on hand, and bought the best available. Like an idiot. Either way, I replaced it with a 1070 in 2016 and I still run this computer today, lol. DDR3 and a 3930K. It's time for a new build!

  • @Obie327  1 year ago +4

    Interesting review, Iceberg Tech. I actually still have my original day-one purchase of a Zotac GTX 680 and the PC I installed it in (i7 2600K). Everything still works great for my older games. Thanks for the video!

  • @NOM4D20  1 year ago +6

    Very good video, really detailed, and enjoyable. Keep up the good work

  • @nlk294  1 year ago +2

    This GPU couldn't be more outdated...

  • @simsdas4  1 year ago +4

    I used a 980 for 6 years; good to know it's still capable of holding its own at the end there.

  • @jeremyocampo1529  1 year ago +5

    I'm surprised how well put together this video is for such a small tech YouTube channel. Well done!

  • @Taorakis  1 year ago +1

    I think 95% of those who can afford halo products like these can do so every generation, or every second one. So those who bought the 690 back in the day probably have a 4090 by now, and the 690 is on a shelf or has been sold to pay for the next big thing. Because if you buy the highest end, you can also sell a highest-end card when the next gen comes out, and people a few tiers down will happily take a card like this off you when they're still running a 570 Ti or something.

  • @JohnSmith-nj9qo  1 year ago +3

    I still remember just starting to get into PC building in 2012 and drooling over how ludicrously overpowered the 690 sounded. I eventually opted for a much more sensible 660 Ti, and I'm glad I did, because the 690 was very much a last gasp of the Crossfire/SLI trend. Now multi-GPU setups are just a weird, largely forgotten footnote in the annals of PC gaming history.

    • @AFnord  1 year ago

      Part of me is surprised that multi-GPU support has pretty much gone the way of the dodo, considering how long it was around in some form or another. Even in the late 90's, with the Voodoo cards, you could do a multi-card setup if you had more money than sense (technically speaking, a card like the Voodoo 2 required a multi-card setup, but not really in the way people tend to mean it when they talk about multi-card setups).

  • @BlackThunderRC  1 year ago +3

    Still have a fully working GTX 295.

    • @theburger_king  1 year ago +1

      For your main???

    • @ImperialDiecast  1 year ago

      the best card from January 2009 :'-) couldn't run DirectX 11, which came out a year later.

    • @BlackThunderRC  1 year ago +1

      @@theburger_king No no lol

    • @BlackThunderRC  1 year ago +1

      @@ImperialDiecast Yep, indeed. Got it a couple of years back just to play about with. It also does not like to work on a 4K screen; 1440p is the max output as well. Don't ask me how long that took to figure out

  • @northernleigonare  1 year ago +1

    Me who spent £1150 on a 3090 Ti 👀
    (new, when Nvidia dropped prices in preparation for the 4090)

  • @glown2533  1 year ago +1

    Bought a GTX Titan X (Maxwell) back in 2015 and I'm still running it today, even though my entire PC has changed since. I kept it and it's still a great GPU. I've even run it overclocked its entire life, and in Apex Legends at 1440p with everything maxed it still hits around 100 FPS; with a lot of fighting it can go down to around 80.

  •  1 year ago +1

    TBH, dropping SLI support sort of makes sense, especially from an economic POV, but it also limits the usability of older cards. If you look at it this way, cards like the RTX 4090 are sort of a replacement for things like 4080 SLI, which would have been a thing in the past. People often fail to notice that just a few years ago the top of the line was not a single top-tier card but two or three of them. That was the biggest beast you could get. Now? You can get a 4090, slam water on it and push it as far as it goes, but that's the limit.
    Anyway, the catch here is that older cards could still be usable in SLI. Even if a single card did not perform that well anymore, using two of them could often be cheaper than a new card, improve availability a little, and, most of all, it would actually be the best reusability scenario, better than recycling. Basic 1070 SLI is on par with a 3070; Strix 1070 OC in SLI is around RTX 3080 performance in most applications. Well, unless you play something that does not support multi-GPU setups. Just imagine: if we were still doing SLI with good support, in the GPU shortage after 2020 we would still have had the option to run 10 and 20 series cards in SLI even though the 30 series was hardly available. Getting dual 3080s would be a good alternative to a 4080/4090, especially considering the prices and availability. This would let us use older generations at least a generation or two longer before they become scrap metal, and when they do, they are a real paperweight. If games can't even be played on 2/3-way SLI, then the GPU is worth scrap.
    So why does dropping it make sense? First and foremost, competition. If SLI were a viable option, then a new generation would have to compete with the used market of the old generation. Getting two slower, used cards might be cheaper and much more available than getting a new one. Secondly, optimizations, more corner cases, the overall complexity of development for SLI. I think that if anything like this should be done, it should be handled by Nvidia/AMD/Intel only, with no special code on the developer's side. Multi-GPU should really work as a single GPU from the game's perspective, and everything else should be handled by the drivers and firmware. Otherwise it doesn't make sense if one game can get an almost 100% boost while another gets nothing.

    • @kyronne1  1 year ago

      My thoughts exactly tbh; sucks that corporations only care about the bottom line

  • @raychat2816  1 year ago +2

    I still have my own 690; however, it's now a collector's item. I replaced it when I got 2 980 cards in SLI … but I'm keeping it

  • @koreannom  1 year ago +2

    Ah yes, the video I was waiting for haha

  • @Smasher-Devourer  1 year ago +2

    The people who bought the GTX 1080/1080 Ti are the smart ones. Those cards are still going strong to this day.

  • @TheTardis157  1 year ago +3

    I went from a GTX 275 to a GTX 690 and it was amazing for the games I played in 2015 or so. It only cost me $160 at the time, so at that price it made sense. I moved on to an RX 580 when GPU RAM was becoming an issue and haven't looked back. I still have the card in a backup PC that I use mostly for home theater and streaming sites.

  • @brentsnocomgaming7813  1 year ago +2

    Honestly, I'd've gotten a 980 Ti instead of 2 980s. My brother had one that ran 4K60+ at High/Ultra like a champ, and it overclocked to 2.1 GHz on air; it was a beast. He had it until 2018, when it burnt out from the extreme OC.

  • @blahblahblahbloohblah  1 year ago +6

    I'm drunk rn. But aware enough that your editing and voiceover work are amazing 6 seconds (literally stopped 6 seconds in to comment this) into the video and I can hear your excitement. You love doing this, and I just subscribed because of that. It's so immediately apparent that you do this out of passion.

  • @DFX4509B  1 year ago +2

    The latest dual-chip cards I'm aware of were custom Radeon Pro MPX cards made for Apple for the last-gen Mac Pro.

  • @alastairpei  1 year ago +2

    I've still got a 690, it's effectively my backup GPU. Wish it still had good driver and SLI support.

  • @terrabyteonetb1628  1 year ago +1

    Games have to be written to use SLI, and they started dropping that around the 700 series. By the time of my 980 Ti: my old 670 x2 SLI got 120 FPS in BF3, and the 980 Ti the same (at 1080p).
    That's why I did not buy a second one for my X99 SLI mATX board (seeing support dying in games).

  • @Vfl666  1 year ago +3

    Strange Brigade has really good SLI and Crossfire support.

    • @Waldherz  1 year ago

      Does anyone actively play that game, or are all of the "players" just PC YouTube channels benchmarking it?

  • @AveragePootis  2 days ago

    Just bought a 690 for an ultimate 2012 rig I'm building. Surprising how expensive they have gotten; judging by the comments, they used to be peanuts just a year or a few ago

  • @dbsuperfanboy1315  1 year ago +1

    Make SLI/Crossfire great again.

  • @jaydenridley4  1 year ago +1

    I’m still out here with an ATI 5770 😂

  • @genrenato  1 year ago +2

    14:58 "SLI is just an SLI-ability" oof

  • @candle86  1 year ago +1

    As someone who bought Kepler back in 2012: I bought 4x GTX 670 cards and did triple SLI + PhysX with them. At the time 2GB seemed fine, and it was. They were also faster than the AMD competitor. The AMD FineWine thing wasn't a known thing in 2012 (the first generation it really applies to was the 7000 series), but in 2012 the 680 beat the 7970 and the 670 beat the 7950. You've also got to remember that during this time period the 7970 and 7950 were plagued by a red-screen bug that would randomly occur. No one made a mistake buying Kepler in 2012.

  • @SaberSlayer88  1 year ago +2

    god what an excellent video, well done. i was shocked to see this was done by such a small channel.

  • @gandyhehe  1 year ago +2

    I wonder how the 690 aged compared to the HD 7990. I had a friend who had one of those; it has long since been consigned to the bin, which is a shame, as I'd have loved to benchmark it.

    • @JohnDoe-ip3oq  1 year ago

      AMD aged better, just not with Crossfire. There is no DX12/Vulkan multi-GPU support in practice: whatever the API allows, it moved multi-GPU from driver-side support to game-dev-specific support, and none of the devs supported it.

  • @JustSomeDinosaurPerson  1 year ago +1

    Can't wait for the day we get to the RTX 6090 where we argue that 2 TB of VRAM isn't enough.

    • @IcebergTech  1 year ago +1

      I love your optimism! My cynical brain suggests that by the time of the 6000 series, nVidia might just see fit to offer 8GB cards to entry level consumers.
      If we're lucky.

    • @JustSomeDinosaurPerson  1 year ago

      @@IcebergTech God, I hope not. I would be okay with that IF memory optimizations were the standard in video game development, but sadly optimization has been on the back burner in favor of increasing management overhead and crunch culture. Now every game is a buggy mess as the corporate sector seeks to transform gaming into another manufacturing-type industry where it produces assets/investments with expected returns rather than lovingly crafted art. So without that, the only way forward is to ramp up the memory available to the die. Hopefully Samsung can get its vertical stacking architecture off the ground.
      This is such a dark timeline :(

  • @supremebohnenstange4102  1 year ago +1

    I bought a 1080ti almost 6 years ago and it still works I guess 😌

    • @DevouringKing  1 year ago

      Good card. It's still fast today.
      I got my Asus HD 4770 (the first 40nm card) in summer 2009 for €120. It was so silent and cool and fast back in the day. It lasted until Polaris, so 7 years.
      Then I got a Polaris card (14nm) in summer 2016 for €199, and it lasted until last month, close to 7 years again.
      So I played for over 13 years on €320.

  • @shinvelcro  1 year ago +3

    I grabbed a 690 on release for my 2700K, after spending a while waiting for them to announce it; otherwise I was going to pull the trigger on two 680s. It was a lovely card for running at 1440p at the time. Its downfall, though, was the 4GB of VRAM. It only took about a year for that to become the real issue, as the stutter from swapping out assets started to get pretty bad. Had it been 4GB for each core, it would have been able to pull its weight for a good while longer for me. Still, it always looked great, even if it was a bit heftier than I would have liked.

    • @TheKazragore  1 year ago

      Like how the Titan Z had 12GB total.

  • @Rabbit_AF  20 days ago

    SMH, Iceberg forgot about the best multi-card implementation... S3 Graphics MultiChrome. To be fair, S3 was pretty much dead by the time of the GTX 690 😅

  • @genethebean7597  1 year ago +1

    Probably the most overlooked downside of this GPU was the triple DVI output with only a single Mini DisplayPort. Heck, one of the three DVI ports was DVI-D, so you were even more down bad.

  • @coldvaper  1 year ago +1

    The GTX 690 was a dream card for me, but I'm glad I never got it; it aged horribly, even within a 3-year span.

  • @CindyHuskyGirl  1 year ago +2

    Damn that amd card REALLY aged like fine wine 🍷

  • @bryndal36  1 year ago +2

    I had the GTX 680, the 4GB version from Leadtek. It was a pretty decent card when I bought it in 2013, but by 2016 it was starting to show its age. I replaced it with the GTX 1060, and wow, what a jump in performance that was.

    • @PokeBurnPlease  1 year ago +2

      I used a 680 for about a year, around 2019. It could run anything I threw at it. One day it sadly broke and I had to buy another GPU; I went with the 770, also 4GB.
      Why was Nvidia thinking 2GB of VRAM was enough in 2012? They knew the PS4 and Xbox One were going to release soon, obviously increasing the demand for VRAM. I can't understand that to this day.
      AMD thinks ahead; even their mid-range cards have like 16GB of VRAM.
      Currently rocking an RX 6600 because of the power draw. 130W is really low nowadays.

    • @randomguydoes2901  1 year ago +2

      @@PokeBurnPlease Nvidia and gimped VRAM is a tale as old as time.

    • @TheKazragore  1 year ago

      @@randomguydoes2901 True as it can be.

  • @DrBreezeAir  1 year ago +1

    Beautifully done; I can only imagine how much work this took to accomplish. I stuck with a 660 Ti until 2015, then bought a 980. I was able to cover a third of the price by selling the trusty 660 Ti. Then I moved to a 1080 during the mining craze for a ridiculous price, and now I'm on a 3070 to see how prices behave in the future. The 4080 seems like a total rip-off, and a 4090 is an unjustifiable purchase for something to play games on. Plus my 3070 is a Noctua edition; I'd love my next card to be of the same type. Maybe I'll give AMD a shot, though I'm still a little sour about the HD 2900 XT.

  • @johnaceto7126  5 months ago

    I bet the 4090, 7900 XT, and 4080 Super will last a decade in terms of performance. The only issue I see is if the drivers are abandoned or graphics features move on.

  • @matthewhanson498  10 months ago

    Two 670s in SLI kept me gaming for a long time; this video is a trip down memory lane for me. I played Crysis 2 and The Witcher 2 at ultra quality and loved it, but by Crysis 3 and The Witcher 3 they were showing their age, and Hellblade was basically unplayable on high settings. Still, they kept me gaming for a long time, and I got a 1060ti after.

  • @bismarck6  6 months ago

    I thought this was a 1M/500K-subscriber YouTube channel until I read the comments, only to realize it's not, but I'm pretty sure it will quickly get to a big number if you keep up the good work

  • @JohnPaulBuce  1 year ago +1

    rip SLI

  • @hartsickdisciple  1 year ago

    I had 4 different SLI configs, and I can say that it was rarely worth it. I had 2x 6600 GT, 2x 7800 GT, 2x 7900 GT, and 2x 8800 GT. There were a few situations where the SLI 6600 GTs could match a 6800 GT, but they still had less VRAM. The 2x 8800 GT was the best, because it delivered more performance than an 8800 Ultra while costing less.

  • @RonGrethel  11 months ago

    I'd never heard of these "built-in SLI" style cards. The scaling is crazy when it's supported. I played around with Crossfiring used R7 260Xs, R9 290s, and Vega 64s. It never really was worth it.

  • @hi_tech_reptiles  1 year ago

    SLI/Crossfire was never really great for gaming. For productivity it was, and technically still is, pretty great now with NVLink, but gaming basically always had some kind of issues; it was really just for enthusiasts willing to drop any amount for every bit of performance, just for the love of building, tweaking, modding, etc. Also, high refresh was definitely a thing back then; tons of people still had CRTs for that reason. Hell, I still do, along with my 1440p 144Hz monitor. Lastly, I'd use more common resolutions for the time, which probably maxed out at 1080p IIRC, but went as low as 480p, 800p, etc. It's been a while; even though I do play old games, I generally use my 480p 100Hz (or higher) CRT or my main display.

  • @tomstech4390  1 year ago

    Mistake: you said it had a pair of top-end GK104 GPUs... GK104 isn't top end.
    The GTX 690 a little bit earlier would have been a GTX 460 x2: it's a pair of mid-range cards slapped together to milk idiots with too much money, which is pretty much what Nvidia does best even today. GK110 was the flagship part with the big die; GK104 in the GTX 680 was "the flagship" just like the GTX 1080 was "a new king".
    The problem was that Kepler was just a crap architecture compared to the likes of GCN, which would last over 10 years. It only looked good because Fermi was so bad and GCN was being held back by long-continued TeraScale support (the 4000 and 5000 series using VLIW4 and VLIW5) for the next 5 years. Meanwhile, Nvidia quickly ditched Fermi and Kepler like a dead dog to move on to Maxwell and Pascal (Maxwell on speed), which shared some design philosophies with GCN, like having 64 shaders in a CU or "SM" for better wave-function efficiency, etc.
    While GCN still supported very compute-intensive options like FP64, with huge memory buses, bandwidth and VRAM, Nvidia castrated their cards in anything not even remotely related to gaming and still relied on software tricks like delta color compression to make up for the lack of memory bus. That's why Maxwell looked so efficient.
    I could go on, but gamers are simple folk and don't care about this stuff anyway. They've had 10 years to find this stuff out by now.

  • @kevinerbs2778  1 year ago

    Most game engines now still have stutter issues even without multiple GPUs.
    Blaming SLI & Crossfire wasn't the answer. Now we've got absurd pricing for single cards because of the myth of "1% & 0.1% lows".
    1% lows could be 1% of frames out of 300,000 total, which is about 3,000 frames overall, and it's even less for 0.1% lows: 300 frames in total out of 300,000 frames shown. For scale, a 3-minute video at 60 FPS has only 21,600 frames (see the sketch below).
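
    For reference, a minimal sketch of that math in Python, assuming the common "average frame rate of the slowest 1% (or 0.1%) of frames" definition of lows (the numbers here are illustrative, not from the video):

        # Average FPS plus 1% / 0.1% lows from a list of frame times in milliseconds.
        def lows_fps(frametimes_ms, fraction):
            worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
            n = max(1, int(len(worst) * fraction))       # e.g. 1% of 300,000 frames = 3,000
            return 1000.0 / (sum(worst[:n]) / n)         # ms per frame -> FPS

        # A 3-minute run at a steady 60 FPS is 21,600 frames; sprinkle in some stutter.
        frametimes = [16.7] * 21_000 + [40.0] * 600
        print(1000.0 / (sum(frametimes) / len(frametimes)))  # average: ~57.7 FPS
        print(lows_fps(frametimes, 0.01))                    # 1% low: 25 FPS (216 slowest frames)
        print(lows_fps(frametimes, 0.001))                   # 0.1% low: 25 FPS (21 slowest frames)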

  • @frosthoe  1 year ago

    I used to run 2 GTX 295s for quad SLI (they were dual-GPU), then added a GTX 275 for PhysX: 5 GPUs. That was 2009-ish.
    Then in 2010 I got an EVGA quad-SLI motherboard with ten PCIe slots and ran 4 GTX 480s and a Xeon server CPU that was 70% overclocked, all chilled liquid cooling.
    I think the PC drew like 1400 watts when loaded, and the chiller drew about 760 watts (a 1 HP chiller).
    Nowadays: one mid-range video card, a mid-range CPU, a low-budget Foxconn motherboard, some RAM and an M.2 SSD. Boom, solid game rig.

  • @frankwren8215  1 year ago

    The 690 was the last time a dual-GPU flagship was cool.
    Shame Nvidia started the Titan line by massively handicapping the x80 & x90. Of course, that decision led to where Nvidia is today: unable to sell 4080s.
    The 700-900 series was definitely the lowest point in Nvidia's history. Performance per dollar was so freaking low I nearly quit gaming altogether. Especially given the 970's VRAM problems.

  • @POKEMANZZ3  1 year ago

    The extra GB of VRAM reminds me how backwards Nvidia has always been with future-proofing. Nvidia tends to have better driver longevity, but at the same time their cards' memory configurations almost never lend themselves to aging well. The prime example for me was my GTX 780: a good card in my experience, but even by like 2017-18 I was already hitting titles that capped the pathetic 3GB VRAM buffer. I know it was the same deal with the ancient GeForce 9000 series; the 9800 GTX, despite being slightly faster, aged worse than the previous year's 8800 GTX simply because the 8800 had 768MB of VRAM while the 9800 GTX had 512MB.
    Another example: the 970's 3.5GB of VRAM hasn't aged nearly as well as the R9 390's 8GB...

  • @Slimek  1 year ago

    I had the Spider-Man stop issue with a 5700X + 3060 Ti. The game is quite buggy with RT on

  • @hrod9393  1 year ago

    I had 2x 670 EVGA FTWs in SLI back in the day (2012). Those cards had extra VRAM; it sucked that you couldn't really use the extra VRAM on the 2nd card.
    My Mitsubishi 2070SB CRT could do 160-127Hz and I still have it for nostalgia. Most LCDs couldn't match its clarity, response time, colors and refresh rate for a very long time. I only got a 240Hz LCD in 2020, and now a 4K 240Hz in 2022.

  • @TheEdgey22  1 year ago

    Started off PC gaming with a GTX 560 Ti, kept it through the 600 series, and eventually swapped to a GTX 770. That lasted until my dream of owning a Titan came about in early 2020. It was a step up, but being a first-gen Titan there was no DX12 support, which meant one thing… another feckin' upgrade, this time to an RTX 2060 12GB and 2 Samsung 4K monitors. If I get 7 years out of this GPU and these monitors, then it will have been as good as the GTX 770.

  • @Gael32  1 year ago

    The 690 was a bad buy, even back then. Everyone knew that the real value card was the GTX 670, and I'm happy I went that route. I was even happier in 2015 when I picked up two more GTX 670s for a total of $300 and got to run tri-SLI for a year.
    Nowadays I'm just depressed that I'm still using a GTX 980 Ti when, if I had waited just a little longer, I could have gotten a GTX 1080, which crushes it in performance and, at the time, price point.

  • @timothypattonjr.4270  1 year ago

    I had SLI 680s. I can't speak for everyone, but I always knew there was poor value and compatibility in SLI, and I got it because it offered performance beyond what I could get from any single card at the time. I did not care about the lower VRAM amount or the lack of the DirectX 11.1 feature set because I had no intention of using it for more than 2 years. I don't know anyone who bought SLI x80 or x90 cards, or AMD equivalents like the 6990, who wasn't a performance enthusiast upgrading every generation. I played around with SLI 980 Tis and SLI Titan Xps, but I knew it was already dead by then. Got a Titan V to play with; that was pretty cool. I also played around with 3-way 580 SLI just for 3DMark before the 680s. I don't know why anyone would have ever gone with an SLI setup if they were worried about their investment lasting more than 2 GPU generations. Honestly, 1 generation.

  • @SterkeYerke5555  1 year ago +2

    Will you be using a 780 Ti or a Titan for the final Kepler review? I'd assume they've got more of a fighting chance, considering they've got more VRAM and a faster GPU than the 680 and 690 have.

    • @IcebergTech  1 year ago +1

      I had planned on going the other direction and using something cheaper, but we'll see how the testing goes.

    • @TheVanillatech  1 year ago

      But also totally abandoned by Nvidia in driver support and performance updates. All Nvidia cards die a death once their new generation is released.

  • @JoshuaNicoll  1 year ago

    Ah yes, the PCIe lane thing wouldn't affect things too much; it's still splitting 16 lanes into "32" lanes for the GPUs through an onboard bridge chip. This 2x16 setup still only had 16 lanes' worth of bandwidth to the CPU, though the chips could communicate with each other directly through the bridge, I believe. There wouldn't have been much of a benefit vs 680 SLI; 680 SLI, with its clock advantages, would have beaten the 690 in games that support SLI well.

  • @rck-lp7389  1 year ago +1

    Nice man, you're madly underrated and I'm looking forward to seeing you grow like I saw randomgamingHD expand! Keep up the nice work

  • @stefensmith9522  1 year ago

    SLI was never about doubling FPS back in the day; you always expected around a 30% boost, and it was a way to get top-of-the-line card performance for less than the top-of-the-line price. Picture getting a 4060 Ti, and then a year and a half from now you could get another one for half price, run it with the one you have now, and get better FPS than a 4090 that still costs more than both cards. That's what it was about back then. Then developers slowly stopped optimizing for SLI, to the point that having two or more cards would hurt performance compared to a single card. SLI was awesome; it was the developers saving a penny in development time that ultimately killed it.

  • @justhitreset858  1 year ago

    SLI for normal GPUs is definitely dead. However, I can see it being used as a way to separate GPU features into two separate devices, such as having a normal rasterization GPU and a dedicated ray-tracing accelerator card. That would also help with yields, as each die could be smaller, and it would allow for greater pipeline optimizations due to not having to share the pipeline with each type of fixed-function hardware. Furthermore, it would lessen the cost for most people, as you could mix and match, say, a gen 1 GPU and a gen 2 ray-tracing accelerator or vice versa. Maybe even a large cache card, as cache no longer scales at all past TSMC's N3 node.
    Some things likely can't be separated, like the optical engine on RTX cards from the "normal" GPU. Still, I think this is a novel approach to counter AMD's current chiplet design. Even AMD's chiplet design will eventually reach the same limitations and a different solution will be needed, so maybe then breaking the GPU up into smaller devices will become viable.

  • @bananabro980  1 year ago

    Blew my mind that there were no HFR displays back in 2012, especially LCDs. I remember only VGA or DVI LCDs and CRTs could go that high. Times change; now a budget 3440x1440 140Hz screen is barely 300 USD.

  • @terrabyteonetb1628  1 year ago

    I've still got my old setup in a box doing nothing: 2700K, 2x 670 OC in SLI... 2GB video cards...
    Should have sold them... (had 2 GPU heart attacks, in a different location from where I am).

  • @jacobwhittaker6241  1 year ago

    As someone who has over 2500 hours in Skyrim on PC, running it on GPUs ranging from a 7970, 780, and 290X to a 980 Ti and 1080: the reason Skyrim has problems is the limitations of the engine. The game can't handle 120 FPS or more. I used to lock it at 90 FPS and leave it. Never had an issue doing that.

  • @steel5897  1 year ago

    I still can't believe Nvidia made an entire thing of it and marketed their GPUs around hair physics back then.
    Nowadays it's all about transformative stuff like path tracing and deep learning... but in the early 2010s? Hair physics. Amazing.

  • @JoeWayne84  1 year ago

    SLI in gaming is something I'm glad was killed off; I can't think of anything stupider than having to keep adding graphics cards to my system.... I remember when it was all the rage and it was the trendy thing where everyone had to have two flagship GPUs in an old case with an acrylic window; this was right before RGB and the glass computer case or side panel became the standard.
    It's funny looking at how it's evolved, with color-matched cables in combs being a focal point of PC builds, haha.
    I imagine that will eventually go away next, with the newer 12VHPWR plugs making builds look a lot cleaner without a 4-inch-wide comb of cables curling up at the front of your build.
    With NVMe becoming so affordable and motherboards offering 4 slots on board, the options for small-form-factor builds with a lot of storage are excellent. The cases really could change the layout design even.

  • @allxtend4005  1 year ago

    Funny how the earlier games show a higher avg FPS than what's shown in this video... the GTX 690 did not deliver the performance you show us. The FPS shown in the video was way lower than the averages: even in GTA 5, which we saw at 80-90 FPS on a single GTX 690, the video was out in the terrain with not many NPCs and cars to render and still fell to about 60 FPS. That is not even close to a 90 FPS experience.
    Getting 300 FPS in a house does not mean you have an 80 FPS average, because 90% of the time you play in the open world and not in the house (see the sketch below).
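
    To put rough numbers on that point, a quick sketch with made-up figures: average FPS over a session is total frames divided by total time, so a short 300 FPS indoor section can inflate the reported average well above what the open-world 90% of the run feels like.

        # Made-up numbers: 54 s outdoors at ~60 FPS, 6 s indoors at 300 FPS.
        segments = [(54.0, 60.0), (6.0, 300.0)]  # (seconds, fps) per segment
        total_time = sum(t for t, _ in segments)
        total_frames = sum(t * fps for t, fps in segments)
        print(total_frames / total_time)  # ~84 FPS average, despite ~60 FPS gameplay 90% of the time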

  • @JoeWayne84  1 year ago

    It's ten years old… computer gaming hardware has changed completely in the last decade.
    I'd be willing to bet the newer GPUs will last more than a decade. Maybe not officially supported by Nvidia and AMD, because they will want to sell consumers a new graphics card at least every 10 years, haha, but I imagine there will be open-source communities for hardware driver updates.
    I have a hard time believing an RTX 4090 won't be able to run any game at 60 FPS at low settings at 1440p, or even 4K, in a decade. And with DLSS only going to get better… unless the manufacturers artificially gimp modern GPUs through software updates, there's no way the amount of linear compute built into them will ever be worthless.

  • @picblick  1 year ago

    Well, even old cards might still find use today inside legacy hardware to play ancient games that no longer work with x64 Windows 10 and up. Also, money spent to have fun for multiple years is no money lost. I still have a 290X I bought a few months after it was released, and I bet the dude who sold it deeply regrets it, because I can still play most games I want to in my living room. It now inhabits my secondary PC in the living room for all kinds of jobs, ranging from local split-screen multiplayer to browsing, media playback, and sometimes just as a backup system.
    I'm not saying to desperately hold on to all hardware ever bought; I'm saying that old hardware still has its place. It's always a good idea to sell it, not necessarily for the money you receive, but because someone might actually still want to use it, and that's a very nice thing.

  • @nuggetpiece  1 year ago

    SLI and Crossfire had terrible frame times. This made games feel like shit (despite running over 120 FPS). The 690 being literally an SLI card meant everything was probably awful. I swore off SLI/Crossfire back in like 2011.

  • @bradley163  1 year ago

    I fondly remember using a pair of 6950 GTX GPUs in SLI back in the day. Being the only person in my high school friend group who could run Oblivion at ultra settings was such a good feeling. Oh, those were the days.
    And Crysis 2 inherently runs poorly on any platform. They added that strange cinematic feel to the game, which made the entire experience frustrating.

  • @ThaexakaMavro  1 year ago

    The GTX 690 was my last Nvidia product; they always make their products with barely enough VRAM to last, with few exceptions. The fact that the GTX 680 had a 4GB version but they still made the GTX 690 with 2x2GB speaks for itself. I now hate Nvidia with a passion.