The Slow Death Of The GTX 690

  • Published 30 Nov 2024

COMMENTS • 594

  • @IcebergTech
    @IcebergTech  2 years ago +181

    Okay, some footnotes:
    I lost my voice while recording the VO, but there were a few things I wanted to add or change that I couldn’t record in time.
    *4 & 6 way SLI:* Craft Computing, the example I included in the footage, used the 3x GTX 690s as part of a cloud gaming server. Other valid use cases for 4-way and 6-way SLI include triple monitor 3D Vision Surround setups. When I said that it was “just a flex”, I speak from the perspective of someone who doesn’t need to run large numbers of VMs, and had also never considered triple monitor surround gaming setups as anything but an ostentatious display of wealth.
    *Single GPU vs. Dual GPU:* When comparing the single GK104 vs dual, I applied a small OC to the single chip setup. This was to more accurately reflect the experience of someone who had “only” bought a single GTX 680. I’d have gone higher, but my card couldn’t handle more than about 75MHz!
    *Drivers:* Although I’m talking about the history of a GPU from 2012 and the games it would have been played on at the time, I am using it in a modern system running Windows 10, and with 474.04 drivers (Nov 2022 security update, the latest available for Kepler). I could have used period-appropriate drivers and hardware to more accurately reflect the conditions of the time these games were released, but my schedule didn’t allow it. Sorry!
    *Skyrim:* I said in the script that I didn’t see any of the physics glitches I’d heard about from using the game > 60FPS. I *should* have said that the only obvious glitches I could see were the weirdly fast hand movements and the flickering, *and that* this might have been caused by running the game above its intended frame rate, and was probably *not* related to SLI. I kinda said it with the on-screen text, but I wanted to be a bit more clear about it.
    *Doom:* I included some footage of Doom 2016 but ended up deleting the associated VO as it didn’t fit the flow of the narrative, and I feel the footage might need some further explanation. Doom 2016 was remarked on at the time as running worse on high end Kepler than on entry level GCN cards, and that still seems to be true 6 years later. I couldn’t get the game to run in OpenGL at all, but in Vulkan it varied between 20 and 40 FPS at 1080 Low. It also didn’t use the second GPU, even with adjustments made in nVidia Control Panel and nVidia Profile Inspector.
    *Fortnite:* At the last minute, I went back and retested Fortnite in SLI, because I’d seen someone else’s results that were very different from mine. Despite multiple attempts with different AFR modes and profiles, I still saw this massive performance regression in DX11, and DX12 didn’t acknowledge the second GPU at all. Performance Beta, meanwhile, performed very well, but soon found itself running above 200 FPS. At this point, CPU performance would be the limiting factor.
    *Asynchronous compute:* There was something of an uproar some years ago when AMD accused nVidia of “not even having async compute” in Maxwell, a feature of which nVidia apparently *did* include a variant as far back as the GTX 780 and Titan. I don’t intend to drag up an old Red vs. Blue argument, but in the case of the HD 7970 it is a fact that async compute was something the AMD card had that the GTX 680/690 didn’t.

    • @theburger_king
      @theburger_king 2 years ago +5

      I can’t read that much lol

    • @thcriticalthinker4025
      @thcriticalthinker4025 2 years ago +4

      So, regarding 6-way SLI:
      SLI maxes out at 4 GPUs, as there is no bridge configuration for any higher configs.
      This doesn't stop you from slapping 30 cards in a rig and using them for other purposes

    • @walrusman151
      @walrusman151 2 years ago

      Anybody who knows enough on a topic to leave a whole book of information as a footnote is pretty impressive

    • @nepnep6894
      @nepnep6894 2 years ago +3

      Async compute on any pre-Turing Nvidia card was achieved through rapid context switching, so it didn't really net performance improvements; it only really helped compatibility

    • @CertifiedAsher
      @CertifiedAsher 2 years ago

      Hello, cool relaxing videos explaining GPUs and CPUs so well. Anyway, can you do a video about the Nvidia Quadro K4200?

  • @terzaputra3203
    @terzaputra3203 2 years ago +353

    I have a friend who was struggling hard to sell his old 690 a few months ago. Besides unsupported drivers and broken SLI support, the main problem was its power consumption. It competes with the likes of the 1050 Ti in the same price range of 80-100 USD, which is 3-4 times more power efficient. At that price, people would choose the more efficient card rather than a glorified heater. He ended up making a nice glass case for it and decorated his wall with it.

    • @misiekt.1859
      @misiekt.1859 2 years ago +34

      For $100 you can get an RX 5600 XT, which is around a 1070 Ti. Even the 5500 XT is 2x faster than a 1050 Ti and often below $100.

    • @HolographicSkux
      @HolographicSkux 2 years ago +11

      The 690 will produce a higher frame rate than the 1050 Ti, but it is essentially a 680 at this stage. Glad I got rid of my 690 years ago, but it was cool while I had it.

    • @jimtekkit
      @jimtekkit 2 years ago +7

      Yeah, the power efficiency of today's cards is a big reason to buy new. Right now I'm on an RX 570, and it consumes twice the power of my RX 6600 while only giving about 60% of the performance. And you can definitely notice the additional heat output while it's loaded up. It's a similar story on the Nvidia side as well.

    • @crazeguy26
      @crazeguy26 1 year ago +2

      I love my GTS450 zoom zoom!

    • @VarietyGamerChannel
      @VarietyGamerChannel 1 year ago

      @@misiekt.1859 The radeon 5000 series are also jet noise heaters.
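The running-cost argument in this thread is easy to put numbers on. A minimal sketch, assuming the cards' official board TDPs (300 W for the GTX 690, 75 W for the 1050 Ti) and purely illustrative values for electricity price and daily gaming hours:

```python
# Rough annual electricity cost of a GTX 690 (300 W TDP) vs a GTX 1050 Ti
# (75 W TDP) at full load. The $0.15/kWh price and 3 hours/day of gaming
# are illustrative assumptions, not measurements.
def annual_cost(tdp_watts, hours_per_day=3, price_per_kwh=0.15):
    kwh_per_year = tdp_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"GTX 690:     ${annual_cost(300):.2f}/year")
print(f"GTX 1050 Ti: ${annual_cost(75):.2f}/year")
print(f"Difference:  ${annual_cost(300) - annual_cost(75):.2f}/year")
```

Under these assumptions the 690 costs roughly four times as much to run, mirroring the "3-4 times more power efficient" figure above.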

  • @mrfahrenheit8138
    @mrfahrenheit8138 2 years ago +111

    Officially addicted to this channel

  • @ALmaN11223344
    @ALmaN11223344 2 years ago +180

    This was back in the era when I had GTX 460s in SLI and helped a friend build a 9800 GTX SLI setup because they were $40 each on Amazon... oh, how the times have changed

    • @raginginferno868
      @raginginferno868 2 years ago +26

      you can't even get a GT 1030 for $40 nowadays, and that card sucks ass

    • @ALmaN11223344
      @ALmaN11223344 2 years ago +8

      @@raginginferno868 Right? It's crazy.

    • @prateekpanwar646
      @prateekpanwar646 2 years ago +25

      @@raginginferno868 There's no sensible GPU a kid could throw in his school gaming PC to play some of the latest games at low settings.
      The market has a huge gap; companies could just sell older models like the GTX 1060, which is made on older wafers and is thus cheaper.

    • @raginginferno868
      @raginginferno868 2 years ago +4

      @@prateekpanwar646 What? Your comment is completely irrelevant to what I said. I just made a comparison that GPU prices are going up every new generation. And, there are a few GPUs that are not too expensive, can fit a tight budget, and still offer 60 FPS on medium settings.

    • @BoleDaPole
      @BoleDaPole 1 year ago +13

      Nvidia found out what other gaming companies learned long ago:
      Gamers are easy to fleece and they're willing to pay whatever it takes.

  • @scurbdubdub2555
    @scurbdubdub2555 2 years ago +316

    I’ve owned a 690 before; it wasn’t that bad of a card for what I played. I eventually put it in another system and sold it. I liked the card, it was interesting, but it was aging. I’m now on an even more interesting card, the Radeon VII. I really enjoy using it. It has aged surprisingly well considering how it was hammered at the time. At 1440p high-max settings it can play just about everything.

    • @EbonySaints
      @EbonySaints 2 years ago +31

      Hold on to that Radeon VII for dear life. They and the Vega series supposedly got massacred in the mining boom (according to Actually Hardcore Overclocking) thanks to the HBM being incredibly valuable for mining. You might be one of the few lucky holders of a working model. Should have gotten one before the price boom, when they were at $500.
      And about interesting GPUs: I can understand, though Arc seems way more frustrating than interesting to me right now. 😅

    • @Saved_Sinner0085
      @Saved_Sinner0085 2 years ago +5

      The Radeon VII is still a good 1440p card, with the bonus of lower heating bills in the winter. Lol. It's not nearly as bad as the Vega 64 due to the die shrink.

    • @theboostedbubba6432
      @theboostedbubba6432 2 years ago +5

      Still rocking a 1080 Ti. Both great cards that still shred 1080p and are great for 1440p. Enjoy it.

    • @Saved_Sinner0085
      @Saved_Sinner0085 2 years ago +1

      @@theboostedbubba6432 yup, I got a 1070. My display is 1080p 144Hz and the 1070 still runs everything on high with 60+ fps.

    • @theboostedbubba6432
      @theboostedbubba6432 2 years ago +3

      @@Saved_Sinner0085 GTX 1070 still provides a great 1080p experience and you can find them for $115-120 which is great performance for the price. The 10 series was fantastic.

  • @ThinkAboutVic
    @ThinkAboutVic 2 years ago +22

    i love the sudden transition from "foreshadowing is a narrative device-" to the gunshot intro lmao

  • @watercannonscollaboration2281
    @watercannonscollaboration2281 2 years ago +80

    I’ve always found this card to be really cool despite the practicality, or lack thereof. Dual GPU cards just seem novel from the perspective of someone who didn’t understand computers at that time.

  • @TheVanillatech
    @TheVanillatech 2 years ago +107

    The smarter move was buying a GTX 670 immediately, a fairly great mid-range card that lasted 2 years at high settings, and adding a 2nd for pennies later on. My friend built a dual GTX 670 machine just weeks after the cards came out, and he didn't have to upgrade for pretty close to 4 years. As he usually builds an entirely new machine from scratch every 2.5-3 years, he was very happy. His brother, who always inherited the old machine come upgrade time, was sad.

    • @gg2324
      @gg2324 2 years ago +3

      But you would need an SLI-compatible Z-series motherboard and a fairly powerful PSU too

    • @TheVanillatech
      @TheVanillatech 2 years ago +6

      @@gg2324 Sure, you would need a fairly decent mid-range motherboard, in the region of £80-£100, but for someone dropping £350 on two GTX 670s or a single GTX 690, I'd expect that they would be spending AT LEAST £100 on a motherboard, possibly even going for an overspecced £150 board.
      People who spend £350-£400 on a GPU don't spend £50 on a board, £50 on a CPU and £50 on RAM to complete the system, after all! (Unless they're crazy....)

    • @TheVanillatech
      @TheVanillatech 2 years ago +5

      @@randomguydoes2901 Mature GCN was a good buy. Most ATI cards are a good buy because of the longevity and prolonged driver support. The 7970, 7970 GHz and 280X were amazing buys. At the end of its shelf life, the 280X cost £160. At that time, Nvidia launched the GTX 960 at £220. Almost double the price. But the R9 280X beat the GTX 960 by 10-30%, depending on the game. Obviously Nvidia had to drop the price of the GTX 960 here in the UK in the first week.
      980 Ti and 1080 Ti - both great cards. Easily 2-3 year cards. Questionable with the 1080 Ti, given it almost doubled in price shortly after launch during the crypto rush. But if you got one early enough, despite the high price at launch, the card definitely had legs!
      The HD 4770, GTX 750 Ti and GeForce 6600 GT are the best cards ever released, however.

    • @TheVanillatech
      @TheVanillatech 2 years ago +5

      @Crenshaw Pete No shit Sherlock! Technology has IMPROVED over TIME! What a fucking bombshell!
      Any more pearls of wisdom to impart?

    • @oceanbytez847
      @oceanbytez847 2 years ago +1

      You'd be surprised. I've saved at least one awful build like that this year. The client was trying to get next-gen performance on a budget and made cuts across the board except on the GPU. Fortunately, it still ran pretty well, but he cut costs so severely he screwed himself out of an easy upgrade and instead would have to do a total rebuild, when a slightly better board might have supported the following gen and allowed that upgrade. A foolish mistake, but it happens. I think part of it is the fact that computers have massively increased in price compared to 10 years ago.

  • @Saved_Sinner0085
    @Saved_Sinner0085 2 years ago +12

    Oh wow, only a 300 watt TDP? That's on the low end of the high end now. There are also CPUs pulling close to that now.

    • @kinkykane0607
      @kinkykane0607 2 years ago +2

      300 watts was considered ludicrous back in the day, especially given the fact that it cost £1000. I remember it being quite controversial at the time to spend that much on a graphics card, and how many watts it consumed.

    • @Saved_Sinner0085
      @Saved_Sinner0085 2 years ago +2

      @@kinkykane0607 I remember full well; I've been a PC gamer since before GPUs needed a 4-pin Molex connector for extra power, way before PCIe was even a thing. I remember people getting worried about the extra power consumption of the GeForce 6800 Ultra back in the day.

  • @FatheredPuma81
    @FatheredPuma81 2 years ago +3

    Tbh most of the people that bought one of these new fall into 2 camps: people who upgraded at either the 980 Ti or 1080 Ti, and people whose gaming interest and standards fell significantly and who will keep using it until the one game they really want to play doesn't launch.

  • @Djbz170
    @Djbz170 2 years ago +39

    I used to have a Titan Z, watercooled and BIOS-hack overclocked to 1370MHz. It did very well in 1440p games; I only recently upgraded to a 3090 when they launched in 2020.
    This video brought back a lot of memories of my Titan Z's constant issues with SLI. Most of the time I just ran 1 GPU to eliminate crashing, stuttering and low frame times.

  • @zephyr4494
    @zephyr4494 10 months ago +2

    "Foreshadowing is a narrative device-" had me dying

  • @PyromancerRift
    @PyromancerRift 1 year ago +5

    In 2012, only competitive esports players had high refresh rate monitors. But they were a thing. I remember BenQ being at all the esports competitions.

    • @IcebergTech
      @IcebergTech  1 year ago +4

      I’ve been toying with the idea of an “original 144Hz PC” video. For the time being it’s on the back burner until I can work out a bigger budget, but the first consumer 144Hz display I could find referenced was the Asus VG278HE from 2012. Seems like that was a banner year for new display tech!

  • @RodimusPrime29
    @RodimusPrime29 2 years ago +32

    Seeing these older GPUs "kinda" running current games hopefully gives some hope to those who can't upgrade and are running budget builds now. Keeping up with the Joneses isn't viable for most people. Keep the great vids coming!

    • @markm0000
      @markm0000 1 year ago

      I am completely out of all of this and just play emulators and old games. If friends want to play a game together we just visit and play split screen.

  • @candle86
    @candle86 2 years ago +1

    As someone who bought Kepler back in 2012: I bought 4x GTX 670 cards and ran triple SLI + PhysX on them. At the time 2GB seemed fine, and it was. They were also faster than the AMD competitor. The AMD FineWine thing wasn't a known phenomenon in 2012; the first generation it really applies to was the 7000 series, but in 2012 the 680 beat the 7970 and the 670 beat the 7950. You've also got to remember that during this time period the 7970 and 7950 were plagued by a red screen bug that would randomly occur. No one made a mistake buying Kepler in 2012.

  • @TastyGuava
    @TastyGuava 2 years ago +4

    That building that fell still has a lane closed off, 1-2 years later. Still crazy that happened in my area. It's just an empty pit now and no one knows whether they're gonna give a new building permit or turn it into a memorial.

    • @Iwetbeds
      @Iwetbeds 2 years ago

      was just thinking that was an odd choice of B-roll footage.

  • @valentinardemon
    @valentinardemon 2 years ago +12

    I had an SLI configuration of GTX 980s; it was really great with my 1440p monitor. In some games like The Witcher 3 and Battlefield (until 5), I had a bit more perf than a GTX 1080. V-sync is mandatory with SLI; without it frame times are just bad, and you had to love tweaking games and drivers with Nvidia Profile Inspector.
    BTW I still use my iPad Air 1 and it works great for what I do with it, like YouTube, Twitch, some Google searches, etc. I use it like a second screen, no need to switch windows or alt-tab

    • @VarietyGamerChannel
      @VarietyGamerChannel 1 year ago

      Considering V-sync usually kills 15-20 fps, SLI was a bad bet. I used to use 2x Radeon 280s in Crossfire. Same shit. In 90% of games I would have frametime stuttering and issues.

    • @valentinardemon
      @valentinardemon 1 year ago

      @@VarietyGamerChannel And with our cards we didn't have FreeSync; I think it could have solved the multi-GPU problems. The only thing I regret now is that I don't have a heater beside me this winter

  • @drift-wn2dj
    @drift-wn2dj 2 years ago +103

    good things sometimes end

    • @trulaila8560
      @trulaila8560 2 years ago +8

      More like "always end"

    • @v5k456jh3
      @v5k456jh3 2 years ago +4

      So deep, so insightful

    • @maizomeno
      @maizomeno 1 year ago +2

      people are mad in this comment section

    • @Prod.Sweezy
      @Prod.Sweezy 1 year ago

      all good things must come to an end

  • @gamewizard1760
    @gamewizard1760 2 years ago +5

    I remember 3 years back (before the pandemic drove GPU prices through the roof), a liquidator had a bunch of these for something like $75 each, and it was a long while before I finally decided against it and ended up buying a GTX 970 for $100 for one of my older machines. SLI was already dead, and the fact that it was DX11 only meant that it would be limited to older games. The power consumption is also too high for the performance you get. If you're buying one for testing purposes, or to put on a shelf as part of a collection, then it's fine, but not as a card that you will use every day. 600 and 700 series cards are also out of driver support now, so there won't be any more optimizations coming. You're better off getting a Maxwell or Pascal based card if you're upgrading from something even older than a 690.

  • @jeremyocampo1529
    @jeremyocampo1529 2 years ago +5

    I'm surprised by how well put together this video is for such a small tech YouTube channel. Well done!

  • @TheVanillatech
    @TheVanillatech 2 years ago +12

    We know that Nvidia started ignoring their previous generations in terms of driver updates and performance during Kepler, and forever after. My friend bought a GTX 980 Ti for £580 after much deliberation. SIX MONTHS LATER, Nvidia released Pascal and announced that only "critical driver support" would continue for Maxwell. Nvidia abandoned customers who had just spent over half a grand on a top-end GPU. Meanwhile, even the HD 7970 and its reincarnation, the R9 280/X, were getting huge performance increases and added features via drivers YEARS afterward.

    • @liquidhydration
      @liquidhydration 2 years ago +1

      Wait, the 980 Ti gets less support? Ever since I got mine this year, all it has been is constant random Nvidia stable driver updates. Now that I think of it, some happened today while I was asleep

    • @TheVanillatech
      @TheVanillatech 2 years ago +2

      @@liquidhydration Nvidia released updated statements on their website, on the actual driver pages, saying that all but critical support was dropped from Maxwell just 6 months after the release of the 980 Ti. Basically, the final card of Maxwell and the arrival of Pascal brought about the end of Maxwell. Dropping support for their PREVIOUS generation of GPUs, which had sold tens of millions of units, just like that. While AMD was still providing performance and feature updates for Barts and Cypress even in the Vega and RDNA 1 drivers.
      You might not have understood what I said. Nvidia simply makes sure the cards still *work* as they should, and don't crash with newer engines / applications. But that's just a fraction of what drivers are supposed to do. They are also supposed to be refined to offer BETTER performance in both older and newer applications. That's been the case for decades, and is still the case today. Nvidia said "Fuck that! We want our customers to buy Pascal, not cling on to Maxwell for 2-3 years!". So they officially dropped all but CRITICAL support, which is basically the bare minimum. A fact that you can clearly see by looking at graphs of Maxwell performance on subsequent drivers after the announcement.
      It's like buying a car and, six months later, the manufacturer / garage says "We will still do your MOT, but we ONLY ensure that the car starts and gets you to your destination! We will no longer change the oil, check the brakes, repair the engine beyond starting and stopping, and we won't adjust the seats or replace the windows! THANKS FOR YOUR £40,000 THOUGH! Why not consider our LATEST car? Only another £40,000!".

    • @S41t4r4
      @S41t4r4 2 years ago

      @@TheVanillatech Before trying to make AMD look great, look at the R9 cards at the same age as the 900 series: yeah, not supported anymore.

  • @chincemagnet
    @chincemagnet 2 years ago +3

    Back in the day, the reason many of us ran multiple high-end cards was that we were running 3D or Eyefinity/Nvidia Surround, and when DSR came out it was an easy way to upscale to 4K. Early on, for me it was an obsession with Crysis.

  • @ffwast
    @ffwast 2 years ago +18

    It's a real bummer that multi gpu doesn't really get supported anymore. It could have been amazing on modern cards.

    • @spankeyfish
      @spankeyfish 2 years ago +5

      I had a dual 670 setup and it was incredibly inconsistent. It'd stutter at random moments.

    • @MDxGano
      @MDxGano 2 years ago +3

      Had 780s in SLI; it was the worst thing ever in terms of consistency and jitter vs a single card. There are many reasons SLI died. If we were talking about non-real-time viewing type workloads, it would be great. For gaming, however, it was simply poorly paced in all its iterations.

    • @NeovanGoth
      @NeovanGoth 1 year ago +4

      I'm much happier with the manufacturers having switched to extremely large dies (Nvidia) or chiplet designs (AMD), delivering scaled-up single GPU solutions for the high end instead of relying on multi GPU. It's just as expensive and power hungry, but works _much_ better in practice.
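The microstutter complaints in this thread come down to frame pacing rather than average FPS. A toy illustration with made-up frame-time traces, showing how an alternate-frame-rendering (AFR) setup can match a single GPU's average frame rate while pacing far worse:

```python
# Two hypothetical frame-time traces (in milliseconds) with identical average
# FPS but very different pacing: a single GPU delivering even frames vs an
# AFR SLI setup alternating fast and slow frames (classic microstutter).
single_gpu = [16.7] * 8        # evenly paced, ~60 FPS
sli_afr = [8.0, 25.4] * 4      # same average frame time, alternating pacing

def avg_fps(frame_times_ms):
    # Average FPS is derived from the mean frame time, so it hides variance.
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

print(f"single GPU: {avg_fps(single_gpu):.1f} FPS avg, worst frame {max(single_gpu)} ms")
print(f"SLI (AFR):  {avg_fps(sli_afr):.1f} FPS avg, worst frame {max(sli_afr)} ms")
```

Both traces report the same average FPS, but the AFR trace's worst frame takes roughly 50% longer, which is what the eye perceives as stutter.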

  • @simsdas4
    @simsdas4 1 year ago +4

    I used a 980 for 6 years; good to know it's still capable of holding its own at the end there.

  • @DFX4509B
    @DFX4509B 2 years ago +2

    The latest dual-chip cards I'm aware of were custom Radeon Pro MPX cards made for Apple for the last-gen Mac Pro.

  • @NOM4D20
    @NOM4D20 2 years ago +6

    Very good video, really detailed, and enjoyable. Keep up the good work

  • @SaberSlayer88
    @SaberSlayer88 2 years ago +2

    god what an excellent video, well done. i was shocked to see this was done by such a small channel.

  • @daLiraX
    @daLiraX 2 years ago +9

    With SLI you will need heavy user-to-user support, trying custom SLI bits.
    There are still some places out there that are into that stuff, but well, there are fewer of course; still, they often tend to get some results after some time.
    If none work, you can always try SFR mode as well, but it might introduce stuttering and/or heavy tearing.

  • @jean-micheldupont1150
    @jean-micheldupont1150 1 year ago +1

    The sad thing is that "flagship" GPUs, doomed to become paperweights after a few years, used to cost 4 times less than they do now, so you would feel a little less robbed back in the day...

  • @JohnSmith-nj9qo
    @JohnSmith-nj9qo 1 year ago +3

    I still remember just starting to get into PC building in 2012, drooling over how ludicrously overpowered the 690 sounded. I eventually opted for a much more sensible 660 Ti, and I'm glad I did, because the 690 was very much a last gasp of the Crossfire/SLI trend. Now multi GPU setups are just a weird, largely forgotten footnote in the annals of PC gaming history.

    • @AFnord
      @AFnord 1 year ago

      Part of me is surprised that multi-GPU support has pretty much gone the way of the dodo, considering how long it was around in some way or another. Even in the late 90's, with the Voodoo cards you could do a multi-card setup, if you had more money than sense (technically speaking a card like the Voodoo 2 required a multi-card setup, but not really in the way people tend to mean it when they talk about multi-card setups).

  • @Obie327
    @Obie327 2 years ago +4

    Interesting review, Iceberg Tech. I actually still have my original day-one purchase of a Zotac GTX 680 and the PC I installed it in (i7 2600K). Everything still works great for my older games. Thanks for the video!

  • @d0ubleg78
    @d0ubleg78 2 years ago +9

    Is the Proton compatibility layer on Linux / Steam OS a viable workaround to play otherwise incompatible titles? If the Vulkan API on Kepler is recent enough, then translating DX12 into Vulkan and running that on the GTX 690 could become a very interesting video.

    • @actuallyn
      @actuallyn 2 years ago +2

      It would have a similar outcome to MoltenVK on Apple's side (but worse, due to outdated... everything).
      You need to sacrifice performance to a compatibility layer.
      Unless you use one GPU to do all the translation magic and the other one for rendering?

    • @nepnep6894
      @nepnep6894 2 years ago +1

      DXVK is no longer supported on Kepler and will not work, and it was pretty slow to begin with. VKD3D never worked on Kepler in the first place.

  • @TheTardis157
    @TheTardis157 2 years ago +3

    I went from a GTX 275 to a GTX 690, and it was amazing for the games I played in 2015 or so. It only cost me $160 at the time, so I was fine with it; at that price it made sense. I moved on to an RX 580 when GPU RAM was becoming an issue and haven't looked back. I still have the card in a backup PC that I use mostly for home theater and streaming sites.

  • @alextremayne4362
    @alextremayne4362 2 years ago

    I love finding small channels the second before they explode, awesome video!

  • @blahblahblahbloohblah
    @blahblahblahbloohblah 1 year ago +6

    I'm drunk rn. But aware enough that your editing and voiceover work are amazing 6 seconds (literally stopped 6 seconds in to comment this) into the video and I can hear your excitement. You love doing this, and I just subscribed because of that. It's so immediately apparent that you do this out of passion.

  • @brentsnocomgaming7813
    @brentsnocomgaming7813 2 years ago +2

    Honestly I'd've gotten a 980 Ti instead of 2 980s. My brother had one that ran 4K60+ High/Ultra like a champ, and it overclocked to 2.1 GHz on air; it was a beast. He had it until 2018, when it burnt out from the extreme OC.

  • @alastairpei
    @alastairpei 2 years ago +2

    I've still got a 690, it's effectively my backup GPU. Wish it still had good driver and SLI support.

  • @raychat2816
    @raychat2816 2 years ago +2

    I still have my own 690, however it's now a collector's item. I replaced it when I got 2 980 cards in SLI... but I'm keeping it

  • @bryndal36
    @bryndal36 2 years ago +2

    I had the GTX 680, the 4GB version from Leadtek. It was a pretty decent card when I bought it in 2013, but by 2016 it was starting to show its age. I replaced it with the GTX 1060, and wow, what a jump in performance that was.

    • @PokeBurnPlease
      @PokeBurnPlease 2 years ago +2

      I used a 680 for about a year in like 2019. It could run anything I threw at it. One day it sadly broke and I had to buy another GPU. I went with the 770, also 4GB.
      Why was Nvidia thinking 2GB of VRAM was enough in 2012? They knew the PS4 and Xbox One were going to release soon, obviously increasing the demand for VRAM. I can't understand it to this day.
      AMD thinks ahead; even their mid-range cards have like 16GB of VRAM.
      Currently rocking an RX 6600 because of the power draw. 130W is really low nowadays.

    • @randomguydoes2901
      @randomguydoes2901 2 years ago +2

      @@PokeBurnPlease Nvidia and gimped VRAM is a tale as old as time.

    • @TheKazragore
      @TheKazragore 1 year ago

      @@randomguydoes2901 True as it can be.

  • @Vfl666
    @Vfl666 2 years ago +3

    Strange Brigade has really good SLI and Crossfire support.

    • @Waldherz
      @Waldherz 2 years ago

      Does anyone actively play that game, or are all of the "players" just PC YT channels benchmarking it?

  • @rck-lp7389
    @rck-lp7389 2 years ago +1

    Nice man, you're madly underrated and I'm looking forward to seeing you grow like I saw randomgamingHD expand! Keep up the nice work

  • @genethebean7597
    @genethebean7597 2 years ago +1

    Probably the most overlooked downside of this GPU was the triple DVI output with only a single mini DisplayPort. Heck, one of the three DVI ports was DVI-D, so you were even more down bad.

  •  2 years ago +1

    TBH, dropping SLI support sort of makes sense, especially from an economic POV, but it also limits the usability of older cards. If you look at it this way, cards like the RTX 4090 are sort of a replacement for things like 4080 SLI, which would have been a thing in the past. People often fail to notice that just a few years ago the top of the line was not a single top-tier card but 2-3 of them. That was the biggest beast you could get. Now? You can get a 4090, slam water on it and push it as far as it goes, but that's the limit.
    Anyway, the catch here is that older cards could still be usable in SLI. Even if a single card did not perform that well anymore, using two of them could often be cheaper than a new card, it would make availability a little higher, and most of all it would actually be the best reusability scenario, better than recycling. Basic 1070 SLI is on par with a 3070; a Strix 1070 OC in SLI is around RTX 3080 performance in most applications. Well, unless you play something that does not support multi-GPU setups. Just imagine: if we were still doing SLI with good support, in the GPU shortage after 2020 we would still have had the option to run 10 and 20 series cards in SLI even though the 30 series was hardly available. Getting dual 3080s would be a good alternative to a 4080/4090, especially considering the prices and availability. This would let us use older generations at least a generation or two longer before they become scrap metal, and when they do, they are a real paperweight. If games can't even be played on 2/3-way SLI, then the GPU is worth scrap.
    So why does dropping it make sense? First and foremost, competition. If SLI were a viable option, then the new generation would have to compete with the used market of the old generation. Getting two slower, used cards might be cheaper and much more available than getting a new one. Secondly, optimizations, more corner cases, the overall complexity of development for SLI. I think that if anything like this should be done, it should be handled by Nvidia/AMD/Intel only, with no special code on the developer's side. Multi-GPU should really work as a single GPU from the game's perspective, and everything else should be handled by the drivers and firmware. Otherwise it does not make sense if one game can get an almost 100% boost and another will get nothing.

    • @kyronne1
      @kyronne1 1 year ago

      My thoughts exactly, tbh. Sucks that corporations only care about the bottom line
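The spread between "almost 100% boost" and "nothing" described above can be put in a toy scaling model; the efficiency figures here are illustrative assumptions, not measured numbers:

```python
# Toy model of two-way SLI throughput at a given scaling efficiency.
# scaling = 1.0 would be a perfect doubling; 0.0 means the game has no
# working SLI profile and the second GPU sits idle.
def sli_effective(single_card_perf, scaling):
    return single_card_perf * (1 + scaling)

print(sli_effective(100, 0.9))   # well-supported AFR title: near-doubling
print(sli_effective(100, 0.0))   # no SLI profile: second card contributes nothing
```

That all-or-nothing spread is the economic problem the comment identifies: a used pair is only a bargain in titles that actually scale.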

  • @betaomega04
    @betaomega04 1 year ago

    I had dual 690s. It was the first NVIDIA card that actually looked like a high-end card, and was the first card to have that metal aesthetic that seems to have carried over the years. For power, I used (and still use) a Corsair AX1200i. Practically speaking, it was one of the only cards that had three DVI ports (in a time when DisplayPort and HDMI were still new on cards), though it did have mini-DP. These cards carried me through the beginning of the pandemic, and by that point I had replaced my 4-monitor setup with a single 55" 4K panel, so I needed an upgrade. Eventually settled on dual Strix GTX 1080 Ti's.

  • @Linda-
    @Linda- Рік тому

    5:20 yup, that's the physics bug you get when trying to run a Bethesda game over 60FPS. The flickering is actually just your character going into swimming mode for 1 or 2 frames and then coming right back out of it, back and forth

  • @laurencem2327
    @laurencem2327 2 роки тому

    12k views in 11hrs, congrats on getting some traction with YT IcebergTech!
    Looking forward to more videos as always, keep up the effort!

  • @gandyhehe
    @gandyhehe 2 роки тому +2

    I wonder how the 690 aged compared to the HD 7990. I had a friend who had one of those, but it's long since been consigned to the bin, which is a shame; I'd have loved to benchmark it.

    • @JohnDoe-ip3oq
      @JohnDoe-ip3oq 2 роки тому

      AMD aged better, just not with CrossFire. There's no DX12/Vulkan multi-GPU support. It doesn't matter what the API offers; the drivers dropped built-in support in favour of game-dev-specific implementations, and none of the devs supported it.

  • @gazzamildog6732
    @gazzamildog6732 2 роки тому +1

    I remember having a GTX 770 as my first ever GPU (the 770 is just a rebranded 680). I remember wanting to get a second one at the time because the 770 couldn't quite hit 60fps ultra on BF4 at 1080p; the thing that put me off was the constant complaints of "micro-stuttering" when using SLI, apparently the frame pacing would just be terrible in many games people tried. Seems like a gimmick looking back, kind of like rollerblades or something, a dated PC gaming concept. Cool video :) brought me back

    • @darthwiizius
      @darthwiizius Рік тому +1

      I used a 770 from 2013-2017 and my gawd was the deterioration painful. From about 2015 I used it as a 720p card to maintain a balance of performance and quality, but its lack of modern features just started making new games look old-gen, and when I say old-gen I mean older than the card. The lack of VRAM combined with the dated feature set meant that card didn't so much fall off the cliff as jump off it clinging to a boulder. Replaced it with a 1060 6GB, and even though I replaced that after 3 years because it was starting to slip over the cliff edge, the 1060 fell off more like it was wearing a parachute, because they're still OK 2.5 years after I replaced it. The 1060 even pulled only half the power of the 770; so much electricity for so little benefit.

    • @gazzamildog6732
      @gazzamildog6732 Рік тому +1

      @@darthwiizius yeah it wasn't a great card looking back, I think they just didn't realise how much VRAM would matter in the future; these days even low-end cards are rocking like 8GB minimum. I remember playing The Witcher 3 with the 770, it forced me to upgrade to a 1070. 20 fps no matter the settings, literally.

    • @darthwiizius
      @darthwiizius Рік тому +1

      @@gazzamildog6732
      I never even bothered trying to run The Witcher 3 on it, I knew me place. You did well getting a 1070 though, they were cracking cards for the time, the 8 chip memory set-up means they're still totally viable now, the 1060 6 chip config was why it tailed off before the RX570s & 580s it was competing with.

    • @gazzamildog6732
      @gazzamildog6732 Рік тому

      @@darthwiizius haha yeah it’s still a solid card today. My friend still has it, swears by it.

  • @glown2533
    @glown2533 2 роки тому +1

    Bought a GTX Titan X Maxwell back in 2015 and I'm still running it today; even though my entire PC has changed since, I kept it and it's still a great GPU. I've even run it overclocked its entire life, and in Apex Legends at 1440p with everything maxed it still hits around 100; with a lot of fighting it can go down to around 80

  • @shinvelcro
    @shinvelcro 2 роки тому +3

    I grabbed a 690 on release for my 2700K, spent a while waiting for them to announce it or I was going to pull the trigger on two 680's. It was a lovely card for running at 1440p at the time. Its downfall, though, was the 4GB of VRAM. It only took about a year for that to become the real issue, as the stutter from swapping out assets started to get pretty bad. Had it been 4GB for each core it would have been able to pull its weight for a good while longer for me. Still, it always looked great, even if it was a bit heftier than I would have liked.

    • @TheKazragore
      @TheKazragore Рік тому

      Like how the Titan Z had 12GB total.

  • @SilentdragonDe
    @SilentdragonDe 2 роки тому

    Unfortunately this isn't the only time this happened with GPUs. Does anybody remember AMDs TeraScale architecture? Because I had (still have technically) not one, but two of those cards. Not only was the support for TeraScale dropped way too early, the drivers were literally left in a broken state where for some features (like CrossFire) to work properly you basically *had to* rely on community fixes. Even years after driver support had ended, you were still able to get decent performance from these cards, provided that the various hacked together community drivers worked for the particular game you were trying to play, but with stock drivers you were just completely out of luck.

  • @terrabyteonetb1628
    @terrabyteonetb1628 2 роки тому +1

    Games have to be written to use SLI, and they started dropping that around the 700 series. By the time of my 980 Ti, my old 670 x2 SLI got 120fps in BF3, and the 980 Ti got the same (at 1080p).
    That's why I did not buy a 2nd one for my X99 SLI mATX board (seeing support dying in games).

  • @camofelix
    @camofelix 2 роки тому

    Correction: 6-way SLI in a single game was never a thing, but what you could do was 4-way SLI + a PhysX accelerator.
    The PhysX accelerator could also be any NV card; it didn't have to be the same as the other 4 GPUs.
    An example would be 4x 980 Ti + a Titan Black.
    I ran triple GTX 670s + a single 560 Ti as a PhysX accelerator back in the day to allow for 4K60 in Borderlands 2 maxed out; it was pretty much the only way to do it. (I'd acquired the 670s piecemeal over time)

  • @StormsparkPegasus
    @StormsparkPegasus Рік тому

    The first big game I remember not supporting SLI (at all) is Arkham Knight from 2015. That was the beginning of the end.

  • @conman3609theoriginal
    @conman3609theoriginal 2 роки тому

    One thing I've noticed with Spider-Man is that if you don't have it on an SSD or something faster than a spinning mechanical drive, even good cards can be handicapped by loading into areas. I swapped it from my hard drive to my SSD and the issues went away

  • @genrenato
    @genrenato 2 роки тому +2

    14:58 "SLI is just an SLI-ability" oof

  • @frosthoe
    @frosthoe Рік тому

    I used to run 2 GTX 295s for quad SLI (they were dual GPU), then added a GTX 275 for PhysX: 5 GPUs. That was 2009-ish.
    Then in 2010 I got an EVGA quad-SLI motherboard with ten PCIe slots and ran 4 GTX 480s, and a Xeon server CPU that was 70% overclocked, all chilled liquid cooled.
    I think the PC drew like 1400 watts when loaded, and the chiller drew about 760 watts (1 HP chiller).
    Nowadays: one midrange video card, a midrange CPU, a low-budget Foxconn motherboard, some RAM and an M.2 SSD. Boom, solid game rig.

  • @SterkeYerke5555
    @SterkeYerke5555 2 роки тому +2

    Will you be using a 780 Ti or a Titan for the final Kepler review? I'd assume they've got more of a fighting chance considering they've got more VRAM and a faster GPU than the 680 and 690 have.

    • @IcebergTech
      @IcebergTech  2 роки тому +1

      I had planned on going the other direction and using something cheaper, but we'll see how the testing goes.

    • @TheVanillatech
      @TheVanillatech 2 роки тому

      But also totally abandoned by Nvidia in driver support and performance updates. All Nvidia cards die a death once their new generation is released.

  • @XaviarCraig
    @XaviarCraig Рік тому

    Slight correction for @6:35: Asus had a 120Hz 1080p 23-inch monitor called the VG236H available as early as 2010, and by 2011 I believe there were multiple options available.
    Obviously future-proofing is always a bit hand-wavy, but generally you can scrape a couple of extra years of good performance by looking at what works well in the current year and going one step beyond it. Like right now the consensus is that 6C/12T CPUs are generally enough for gaming, along with 32GB of RAM and a card like the 3070. However, if you step the aforementioned setup up to an 8C/16T CPU with 64GB of RAM and a 3080, you will be able to run games on it for another year or two before future games become unplayable.

  • @jarekstorm6331
    @jarekstorm6331 Рік тому

    I ran 2 GTX 285s in SLI with an i7 965 and had high frame rates, but always experienced micro-stuttering. When I upgraded to a single GTX 780 years later the micro-stuttering disappeared.

  • @KapiteinKrentebol
    @KapiteinKrentebol 2 роки тому

    From what I understood of the SLI and CrossFire tech, it was a gimped technology anyway.
    It would chop the screen in half and divide the parts between both GPUs,
    instead of 3dfx's SLI, which would interleave the scanlines and give a much better workload split for both GPUs.
    But apparently that isn't possible anymore.

  • @hrod9393
    @hrod9393 2 роки тому

    I had 2x 670 EVGA FTW in SLI back in the day (2012). Those cards had extra VRAM; it sucked that you couldn't really use the extra VRAM on the 2nd card.
    My Mitsubishi 2070SB CRT could do 160-127Hz and I still have it for nostalgia. Most LCDs couldn't match its clarity, response time, colors and refresh for a very long time. I only just got a 240Hz LCD in 2020 and now 4K 240Hz in 2022.

  • @Splomf
    @Splomf Рік тому +1

    Got myself a gigabyte 3080 10gb about 6 months ago to replace my RX580 since it was starting to struggle in newer games.
    Here's hoping it lasts 4 years like my 580 did.

    • @DevouringKing
      @DevouringKing Рік тому

      I got my Asus HD4770 (the first 40nm card) in summer 2009 for 120€. It was so silent and cool and fast back in the day. It lasted until Polaris, so 7 years.
      Then I got a Polaris (14nm) card in summer 2016 for 199€ and it lasted until last month, close to 7 years again.
      So I played for over 13 years on 320€. And both cards had very low power consumption.
      In comparison, your 3080 needs tons more energy. It should last more than 4 years.

  • @m8x425
    @m8x425 Рік тому

    I had a couple of GTX 680's in SLI, but I returned them for a GTX 670. Then I added a second 670 when I found one for $100. After all the upgrades I made to my PC in 2012, I barely paid any attention to computer hardware until Pascal and Polaris came out. I bought a couple of 8GB RX 480's on launch day thinking Crossfire was still a thing, but I was wrong. At least I bought a 1080ti not long after it was released and that card has treated me well.
    A year before that I treated myself to a GTX 590 because I always wanted a dual-GPU card, but after seeing the limitations of dual-GPU cards in some games I wish I had bought a GTX 580 instead. At least EVGA replaced the GTX 590 with a GTX 970 after it stopped working.

  • @Smasher-Devourer
    @Smasher-Devourer 2 роки тому +2

    The people who bought the GTX 1080/1080ti are the smart ones. Those cards still going strong to this day.

  • @imjust_a
    @imjust_a 2 роки тому

    I had purchased a second 660 TI for SLI purposes back in the day. It never really worked the way I had hoped it would, and in fact I felt like my performance suffered in more games than it improved. That being said, I was very hopeful for the future of SLI... a bit too hopeful. To this day I still think that SLI could have been extremely useful for lowering the barrier to entry for VR (being able to use two older GPUs for cheap instead of buying one high-end GPU.) It seems NVidia was doing work to support per-eye rendering with SLI, but it happened at the tail end of SLI's lifespan and was thus canned before it really had a chance.

  • @DrBreezeAir
    @DrBreezeAir Рік тому +1

    Beautifully done, I can only imagine how much work this took to accomplish. I've stuck with a 660Ti until 2015 and bought a 980. I was able to cover a third of the price by selling the trusty 660Ti. Then I moved to a 1080 during the mining craze for a ridiculous price and now I'm on a 3070 to see how the prices will behave in the future. 4080 seems like a total rip-off and a 4090 is an unjustifiable purchase for something to play games on. Plus my 3070 is a Noctua edition, I'd love my next card to be of the same type. Maybe I'll give AMD a shot, I'm still a little sour about the HD2900XT.

  • @704pat
    @704pat 2 роки тому +4

    5:35 The black screening and stuttering isn't an SLI issue, it's the PhysX you mentioned before. I could never find a PhysX config that wouldn't flicker above 100 fps.

    • @clashwithkeen
      @clashwithkeen Рік тому

      Skyrim doesn't even utilize physx. It runs the Havok physics engine.

  • @RazielXSR
    @RazielXSR Рік тому

    I know you said you would have affiliate links to all products used in the video, I seem to be missing the one you put up for the 690. I was hoping to triple SLI them.

  • @bismarck6
    @bismarck6 8 місяців тому

    I thought this was a 1M/500k YouTube channel until I read the comments, only to realize it's not, but I'm pretty sure it will quickly get to a big number if you keep up the good work

  • @soylentgreenb
    @soylentgreenb 2 роки тому

    Skyrim had problems (and probably still does) around the 100 FPS mark or above. The cart may up-end and prevent you from getting through the intro and tutorial. Stuff on shelves in stores may randomly fly at high enough speed to damage you, typically when you first enter the room. The day and night cycle may become out of sync with the ingame clock; which is just bizarre since you'd think those were the same thing; by that I mean the sun may set at 9 AM ingame time. Slightly higher than 60 isn't obviously broken. Also you may have random water-noises whenever you are close to the water elevation and standing on solid ground.

  • @kintustis
    @kintustis 2 роки тому +1

    Having owned a 690, what killed it for me was the pitiful 2GB of VRAM combined with so much raw GPU power.
    So many games were stuck at wonderful fps on the absolute lowest clay-looking textures, and complete unplayability on medium as the VRAM cap was reached. I was desperately closing other programs to free up precious extra MB of VRAM. I remember Siege wouldn't run playably if I didn't close my web browser and Discord first. This was circa 2017-2018 during the mining boom, and anything over 3GB of VRAM was used to mine Ethereum, as the mining apparently took up 3 or 4 GB.
    It had the horsepower for VR, but not the VRAM. Or rather it would have, if Oculus and Valve didn't enforce having wasteful 500+MB "home" worlds that had to be loaded in when pressing the corresponding controller button, and those were just sitting in memory, never being used. I got great FPS when staring ahead, but several seconds of stutter when turning as the assets had to be paged in from system memory or the hard drive. They later added stripped-down home areas, but that was years later, and it still didn't solve the core of the problem.
    The card also ran extremely hot compared to what else I had been using. Temps in the 80s and 90s were common, even with repasting and a more aggressive fan curve.

    • @fiece4767
      @fiece4767 2 роки тому

      So you can't use the 4GB of VRAM that the 690 has, only 2GB? What's the other 2GB doing on the PCB?

    • @S41t4r4
      @S41t4r4 2 роки тому +1

      @@fiece4767 holding a copy of the data for the second chip

  • @iangoldense4730
    @iangoldense4730 Рік тому

    The worst casualty of the death of SLI is the lack of a solid upgrade path. With my old 8800 GTS 512 I was able to run that for a few years and then pick up a used one for a fair price and get a fantastic performance boost. I did the same with my 980 to much less effect. GPU manufacturers realized this was bad for GPU sales and slowly killed off the feature, especially since the benefit was only seen as an ultra-high-end feature for the latest-gen GPUs.

  • @jimcachero
    @jimcachero 2 роки тому

    Want to let you know that when I open the app (YT) to get info on… well, anything… I'll see a new video from you and stop everything till I've watched it. Good stuff man, really good stuff.

  • @aleksandarudovicic4232
    @aleksandarudovicic4232 2 роки тому

    Still using this card and it's almost 2023 now. I am mostly playing older games though.

  • @matthewhanson498
    @matthewhanson498 Рік тому

    Two 670s in SLI kept me gaming for a long time; this video is a trip down memory lane for me. Played on ultra quality in Crysis 2 and The Witcher 2 and loved it, but by Crysis 3 and The Witcher 3 they were showing their age, and Hellblade was basically unplayable on high settings. Still, though, they kept me gaming for a long time and I got a 1060ti after.

  • @bananabro980
    @bananabro980 Рік тому

    Blew my mind that there were no HFR displays back in 2012, especially LCDs; I remember only VGA or DVI LCDs and CRTs could go that high. Times change: now a budget 3440x1440p 140Hz screen is barely 300 USD

  • @miabuelaenpatinete
    @miabuelaenpatinete Рік тому

    Are you colourblind? I did notice something strange about the colours in the games. Great video btw, very entertaining!!

    • @IcebergTech
      @IcebergTech  Рік тому +2

      Yes, I have the deuteranopia filter enabled whenever it’s available

  • @BeefLettuceAndPotato
    @BeefLettuceAndPotato 2 роки тому +2

    The delivery of your lines is getting better and better and the content is good as ever. You just need a cheat code viral video to pop off. Maybe make an Ali Express video? Those always get way more views than they should. ahah

    • @IcebergTech
      @IcebergTech  2 роки тому +1

      Thanks! I'm thinking about some AliExpress-related content for the future, but I need to come up with my own spin on it. Something a bit more interesting than "haha, this stuff is from AliExpress, look how shady it is".

  • @hafentoffen2
    @hafentoffen2 Рік тому

    I used to run a Sapphire 7970 GHz edition vapor-x crossfire rig. Was a beautiful rig and it will be missed ❤️

  • @ahmed-0775
    @ahmed-0775 2 роки тому

    Another nice vid! how do you not have way more subsss?

    • @IcebergTech
      @IcebergTech  2 роки тому

      Gained 5k last month, I'm not complaining! Thanks for the compliment 😊

  • @vmystikilv
    @vmystikilv 2 роки тому

    My 690 is still sitting here on my desk. I sold its mate when I retired my quad setup. Man, I thought I was a god of computers at the time. Even after buying dual Titan X's and owning a 6950 XT today from release date, I have never felt as powerful as I did then.

  • @RuruFIN
    @RuruFIN 2 роки тому

    The last time I had a multi-GPU setup was an R9 290 Crossfire in 2019, and even back then the multi-GPU support for newer titles was pretty meh.

  • @lasarousi
    @lasarousi Рік тому

    This reminds me of the 660M laptop I got for university back in 2013.
    It was already aging and didn't run most things okay. It ended up being the best university laptop for graphic design; it ran Adobe fine but couldn't run Path of Exile at 60fps lol

  • @stefensmith9522
    @stefensmith9522 Рік тому

    SLI was never about doubling fps back in the day; it was always expected to give around a 30% boost, and it was a way to get top-of-the-line card performance for less than top-of-the-line price. Picture getting a 4060 Ti, and then a year and a half from now you could get another one for half price and run it with the one you have now and get better fps than a 4090 that still costs more than both cards. That's what it was about back then. Then developers slowly stopped optimizing for SLI, to the point where having two or more cards would hurt performance compared to a single card. SLI was awesome; it was the developers saving a penny in development time that ultimately killed it.

  • @qasderfful
    @qasderfful Рік тому

    I remember beating all those pre-2016 games on 650 Ti.
    Those were good times, man.

  • @telavus920
    @telavus920 Рік тому

    I had one of these. I remember being so excited when I got it. It was SUCH a beast. This was in 2013-ish. It was noisy, but beasty! Then when Dark Souls 3 came out in March 2016, there were a looooot of problems, mostly due to launch bugs. But even as the patches came, I still had a very bad experience playing the game, with constant stuttering and crashes. I considered myself a huge fan of the series, but all these problems made me ragequit, after dying 80+ times against the Abyss Watchers, as a result of the long walk to the boss, as well as a 40% crash rate, combined with critical stuttering all the time.
    Then during Black Friday in 2016, I had the opportunity to buy a GTX 1070 for 450 dollars. This is hands down the best graphics card I've ever had the joy of owning. It was the most quiet, overclockable and amazing graphics card I've ever had (I bought a 3090 last year and I'm not half as excited nor proud to own it).
    I put the GTX 690 up for sale shortly after. I got a bunch of lowball bids, ranging between 40 dollars and 80 dollars; I was so shocked and disappointed by it (my asking price was 250 dollars). After two months, someone contacted me and we finally managed to agree on 175 dollars, which all things considered I think was a good deal that made both of us happy.
    This card had SO much potential, if only the memory had been shared across both GPUs... It was an early-teenager mistake (and ironically enough, my father pushing me to purchase "the best of the best"). Regardless of the sad ending I had with the card, it will still hold a special place in my heart, as it was the first graphics card in the first non-laptop computer I ever purchased.
    :)

  • @BlackThunderRC
    @BlackThunderRC 2 роки тому +3

    Still have a fully working GTX 295.

    • @theburger_king
      @theburger_king 2 роки тому +1

      For your main???

    • @ImperialDiecast
      @ImperialDiecast 2 роки тому

      the best card from January 2009 :'-) couldn't run DirectX 11, which came out later that year.

    • @BlackThunderRC
      @BlackThunderRC 2 роки тому +1

      @@theburger_king No no lol

    • @BlackThunderRC
      @BlackThunderRC 2 роки тому +1

      @@ImperialDiecast Yep indeed. Got it a couple of years back just to play about with. It also does not like to work on a 4K screen; 1440p is the max output. Don't ask me how long it took to figure that out

  • @EVOTech1
    @EVOTech1 2 роки тому

    Just as I was binge-watching some videos this turned up in my sub box. What a wonderful surprise!

  • @haroldfong8758
    @haroldfong8758 Рік тому

    Very comprehensive review!

  • @megajuanph11
    @megajuanph11 2 роки тому

    All the flickering is caused by Skyrim itself. The engine totally collapsed trying to play it on my RX 5600 XT and I had to cap it

  • @Slimek
    @Slimek Рік тому

    I had the Spiderman stop issue with 5700x+3060Ti. The game is quite buggy with RT on

  • @idklmaostilldontknow3592
    @idklmaostilldontknow3592 2 роки тому

    Great video. Absolutely adore the channel

  • @Jmvars
    @Jmvars Рік тому

    Skyrim flickers on high fps. I also heard water when I tried to run it on higher fps. You need a mod that dynamically changes the physics speed to match your FPS.

  • @grantmackinnon1307
    @grantmackinnon1307 2 роки тому +2

    I remember getting decent performance in a lot of games when I had a 780 Ti, which was still Kepler. I got a fairly solid 60-75 in The Witcher 3, 45ish in Hellblade. If I still had that card it would be interesting to see if the performance is still the same after 3 years of software updates. I did have an Asus ROG card with a modded BIOS that probably added 15% over stock.

    • @АрсенийСтучинский-в1ъ
      @АрсенийСтучинский-в1ъ 2 роки тому

      Same, 780ti was really good

    • @grantmackinnon1307
      @grantmackinnon1307 2 роки тому

      @@АрсенийСтучинский-в1ъ really power hungry, and ran extremely hot. Back then everybody said the R9 290s ran hot, but to get 1200MHz from a 780 Ti, expect it to reach the boiling point of water even with the Asus DirectCU cooler.

    • @АрсенийСтучинский-в1ъ
      @АрсенийСтучинский-в1ъ 2 роки тому

      @@grantmackinnon1307 that's true as well, mine died due to overheating back in 2020. But until then it ran all the games well for, what, 6 years? Although I may be misremembering since I never cared for ultra settings

    • @grantmackinnon1307
      @grantmackinnon1307 2 роки тому +1

      @@АрсенийСтучинский-в1ъ I gave mine away about the same time. As far as I know it still works fine.

  • @urygen
    @urygen 2 роки тому

    I miss my old GTX 650, the first card I ever had back in 2013; I remember getting it for free from my neighbour. Paired it with a whopping 2GB of RAM and an AMD Athlon II that came in a prebuilt I got from a Circuit City in 2007. How time flies

    • @urygen
      @urygen 2 роки тому

      Not to forget Windows Vista as well

  • @jensgerntholtz4041
    @jensgerntholtz4041 2 роки тому

    I never knew that the chips on the 690 were interconnected via SLI.
    I am wondering if the AMD Radeon R9 295X2, as a card with two graphics processors, followed a similar fate.
    I didn't see any mention of CrossFire, so I wonder what the interconnect is and how it held up compared to the SLI on the 690, and if there were also driver consequences along the line.
    Would be interesting to see whether the multi-GPU cards of that time were a cautionary tale, and if manufacturers will have a better go at it in modern times, as we're approaching something similar with AMD's chiplet design on GPUs, which has learnt from its CPU counterparts making use of the "Infinity Fabric".
    Exciting times!

  • @ianlandry72
    @ianlandry72 Рік тому

    I was fortunate enough to buy a 1080ti when they were first released; it's coming up on 6 years of use at this point and still going strong at 1440p high settings. I don't think the performance per dollar will ever be that good again!

  • @nlk294
    @nlk294 2 роки тому +2

    This GPU can't be more outdated...

  • @crainger
    @crainger Рік тому

    I had a 590, but went 680 SLI and then 780 SLI. I think the 1080Ti was the first time I went back to a single GPU.