Why Did SLI Die?

  • Published Oct 22, 2024

COMMENTS • 673

  • @FlamingTX 4 months ago +471

    To sum it all up
    1. AI
    2. The more you buy, the more you save

    • @ultrapredatorr 4 months ago +4

      Preach brotha

    • @Aerobrake 4 months ago +2

      The more you buy, the more you save

    • @jovee6155 4 months ago +1

      3. The market isn't there

    • @PantsCranks 4 months ago +18

      0:09 into the video, let me stop you there, because SLI died because of:

      Money and greed, first and only first; that dictated the turn of events that subdued and pretty much killed "SLI".
      SLI and similar features gave the common man another option when a GPU upgrade would normally be warranted: instead of buying a shiny new mid-tier $800 GPU, what about doubling your VRAM? Doing so cost half to a quarter the price of that shiny new card, provided the card said GAMER already owned cost roughly the equivalent of whatever shiny new thing Nvidia wanted to force, throw, maybe slam down our throats.
      SLI was that option for us mere mortals: the little connector (never used by most) that came with your GPU back in the day let you run SLI on compatible motherboards (most had this feature, or CrossFire for AMD). Nvidia's subsequent move was to ice the feature, first reserving it for higher-end cards only, then replacing it with their "new improved" NVLink, an overpriced monstrosity that in most cases now had to be acquired separately. The plan then became moving the feature to only the single highest-end top-tier consumer card (by that I mean 1 card only) available at the time, the RTX 3090.
      They, Nvidia, were already racing the clock, as AMD and PCIe Gen 5 were about to establish themselves in the mainstream. PCIe Gen 3 and earlier slots (where you slot in the GPU) were, in the past, not fast enough to effectively use SLI on their own, which is why the SLI or NVLink adapter was needed to sync the cards and make the feature possible (basic explanation).
      But with the cat out of the bag and Intel soon to follow, PCIe Gen 5 very easily gives all current GPUs (including the 4090, which needs at most Gen 4 lanes for full functionality) the necessary inter-GPU speed to enable these linked states, and subsequently ALL that POWER for mere mortals to ponder and acquire, while Nvidia would be counting less moolah going into their wallets...
      The narrative driven instead was that games ("not much supports it anymore...") no longer developed for or allowed the SLI feature set (for whose actual benefit, and under whose influence, I leave you to ponder). People? Please.

      So now, 0:09 seconds into this video, I apologize for what I have inflicted on all of you, but with that said, if you got this far into my comment then now, kiddies, you know the truth!
      Shamelessly, after reflection on my own actions: consider going down a rabbit hole with us and check out my new tech channel, linked to my handle here (shamelessly, of course!) @phantomtec. Just another tech phantom myself. Peace.
      P.S. I hope for the love of all things tech that "META PCs" did this truth justice and is not another one of those shills we at phantomtec ironically call "just phantom tech here" (you know, more BS...). No hard feelings...

    • @fatherv6922 3 months ago +3

      The more you buy, the more you save

  • @e2-woah939 4 months ago +317

    Lots of people were buying GTX 970s and getting great performance. Nvidia didn't want people buying used GPUs cheap instead of buying newer cards.

    • @FantomMisfit 4 months ago +18

      Yes I think that's part of it too.

    • @Trikipum 4 months ago +7

      Native multi-GPU support has been a thing since DX12. You can even mix different brands and models there. They simply phased it out when it made no sense anymore.

    • @certifiedhousemoment 3 months ago +2

      hey wait.. i have a gtx 970? could i USE sli still?

    • @MichaelBurnham 3 months ago +2

      I still have a PC, which my wife now uses, with 970s in SLI that I bought new. It still works fine for her usage and the games she plays.

    • @96NARII69 3 months ago +5

      ​@@certifiedhousemoment not really, nobody codes games to support SLI anymore

  • @arenzricodexd4409 4 months ago +338

    Former SLI user here. The thing that killed SLI is definitely the software, more specifically game engines. Nvidia could improve SLI with things like NVLink, but that's useless when game engine developers keep introducing tech that works against the techniques multi-GPU relies on, like AFR. Honestly, I don't think Nvidia removed SLI from consumer cards in favor of things like AI. After a decade of pushing multi-GPU, game developers had no interest in supporting it in their engines (they did the opposite when it came to CPUs). The effort to support SLI in games is not cheap: an ex-Nvidia engineer said that in the past almost 50% of the driver team's work was making sure SLI worked in all existing games, despite SLI users accounting for only a small fraction of Nvidia's user base.

    • @notaras1985 4 months ago +13

      This. But AI loads still need it

    • @adamtajhassam9188 4 months ago +9

      @@notaras1985 100% agreed these companies just dont care

    • @kevinerbs2778 4 months ago +25

      No, Nvidia did take it away on purpose. There's an extreme overclocker (venturi) who found that Nvidia was specifically blocking it in software in games where it should have been working (RTX 2060 in SLI, via mGPU). The game is Deus Ex: Mankind Divided. It has mGPU support, but the way Nvidia wanted mGPU to work with their cards was SLI-specific cards only.
      The absolute worst thing that ever happened to SLI was the complete and utter lie that multiple cards caused "micro-stutter". We've gotten plenty of games that only ever use a single card and still stutter or micro-stutter on DX12. In 2023 there were at least 14 triple-A games that all released with bad stuttering or micro-stuttering. It's time reviewers actually ate crow, because they're the ones who pushed this single-card crap that is now costing consumers $1,000 to $2,500 per card.
      That ex-Nvidia engineer is a liar; all they did was make sure the flags from DX9, DX10 and DX11 through DX12 were in the driver, and it's not even that hard to do. AMD is doing it right now for mGPU across their whole RDNA line, from the RX 5000 series up through the RX 7000 series. You can still use two cards on the entire AMD RDNA lineup; you only lose ReBAR, because it's pointless for dual GPU.
      Complete bull crap about it costing too much: RTX cards all cost twice as much as GTX cards of the same tier that supported SLI used to cost. Don't even say DLSS is superior to SLI either.
      DLSS is not run on the GPU in your computer first, and DLSS isn't free either: it costs money to use the AI machines Nvidia runs to process it correctly. Nvidia runs that first, then a driver team implements it into the drivers, along with implementing DLSS into the game itself. That is far more than what SLI ever needed; at a base level it's roughly the same, while being even more time-consuming than SLI was on the driver side.
      People need to understand that and stop praising DLSS as some kind of savior. It isn't; in fact it's also made developers lazy, to the point where it's become a crutch for poorly written/coded games.

    • @arenzricodexd4409 4 months ago +12

      @@kevinerbs2778 1) DX12 mGPU is implemented by the game developer, not by the graphics driver like CF or SLI. It's up to the developer which cards get mGPU support. Nvidia has nothing to do with it when DX12 mGPU doesn't work in a game, because the support is done completely inside the game. For example, Ashes of the Singularity allows an AMD Fury X to work with an Nvidia 980 Ti in DX12 mGPU because the developer implemented it to allow both cards to work together. Nothing Nvidia can do to stop it. Nvidia can block SLI from working, but not DX12 mGPU, since that implementation is done natively inside the game.
      2) You're talking about a different issue. The stutter we see in the post-multi-GPU era is a game engine issue. In UE's case, for example, it became much more common with UE4; when UE3 was still very popular and used in many games, we didn't see this kind of issue.
      3) Just enabling SLI is easy. In the past, if a game didn't support SLI, or official support was late, people could enable it themselves using tools like Nvidia Inspector. The hard part is making sure it scales as high as possible and stays stable in games. What's the point of going multi-GPU if you only get 40% more performance, or the game keeps crashing, or has visual artifacts? I still remember Dragon Age: Inquisition having artifacts on my 660 SLI; they completely disappeared when I ran the game on a single GPU. As for AMD's so-called mGPU support: did you see any gamer actually buy two Radeon cards and make them work in recent games?
      4) The effort to fix all the issues caused by multi-GPU isn't worth it when the user base is extremely small. The original idea behind multi-GPU was to encourage gamers to buy more than one GPU for a single system; after more than a decade of pushing SLI/CF, nowhere near 50% of gamers worldwide entertained that idea. In the end it comes down to this: multi-GPU needs gamers to spend money on more GPUs, while upscaling like DLSS lets gamers keep their single GPU a while longer.
      5) DLSS might be more expensive cost-wise for Nvidia to do, but it becomes a marketing point for people to buy their GPUs over the competition. With DLSS supported from low end to high end, every gamer who buys an Nvidia GPU can benefit from it. Multi-GPU? Only people with such a setup benefit.
      6) This kind of thing isn't specific to DLSS. When 8th-gen consoles got a massive memory upgrade over 7th-gen, developers also used that as a reason not to optimize their VRAM/RAM usage in games, as John Carmack pointed out a few years ago.
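The scaling figure in point 3) above (roughly "40% more performance" instead of 2x) can be sketched numerically. A toy model of AFR frame delivery; the frame times and per-frame sync overhead are illustrative assumptions, not measurements:

```python
# Toy model of AFR (Alternate Frame Rendering) scaling, the implicit
# multi-GPU mode SLI/CrossFire used. All numbers are made up for
# illustration.

def afr_fps(single_gpu_frame_ms: float, num_gpus: int, sync_overhead_ms: float) -> float:
    """Frames per second when frames are handed out round-robin across GPUs.

    With perfect scaling each GPU only has to deliver every num_gpus-th
    frame; per-frame sync/driver overhead eats into that.
    """
    effective_frame_ms = single_gpu_frame_ms / num_gpus + sync_overhead_ms
    return 1000.0 / effective_frame_ms

single = afr_fps(20.0, 1, 0.0)     # 50 FPS on one GPU
ideal_sli = afr_fps(20.0, 2, 0.0)  # 100 FPS with perfect 2-way AFR
real_sli = afr_fps(20.0, 2, 4.0)   # ~71 FPS once overhead is modelled

print(single, ideal_sli, real_sli)
```

With a hypothetical 20 ms frame and 4 ms of per-frame overhead, two GPUs land around 71 FPS versus 50 FPS for one card: about 43% extra, close to the scaling the thread complains about rather than a clean doubling.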

    • @kevinerbs2778 4 months ago

      @@arenzricodexd4409 Your whole comment is nothing but hot air & misinformation.

  • @The_Twisted_Monk 4 months ago +93

    SLI truly died because of money... You said it yourself: "it was becoming more of a budget option". They wanted people to get off their older GPUs and spend more money on the newer stuff. The longer they kept SLI alive, the longer people would want to stay on their older GPUs.
    I think AMD threw a wrench into their plans when they made FSR and upscaling available to everyone. hahaha

    • @prafullrahangdale666 2 months ago +1

      ❤ true

    • @disclaimer4211 2 months ago

      Yeah, but then AMD also pseudo-killed CrossFire (they just don't put CrossFire on new cards anymore), possibly because they saw that what Nvidia was doing was "smart".

    • @AlexanderVonish 2 months ago +2

      @@disclaimer4211 Pretty sure they also had nightmares trying to support the added configurations, since you could essentially stick two different cards next to each other with the bridge to initiate CrossFire. That means a lot more possibilities... and the development team trying to figure out why a specific GPU configuration is bugging out.

    • @Yinco1 1 month ago

      @@disclaimer4211 It still somewhat exists with AFMF 2.0

  • @stefanwalsh8034 4 months ago +57

    I had two HD 5850s in CrossFire for Battlefield 3 back in the day. What a time to be alive.

  • @ChairmanMeow1 3 months ago +25

    I worked at a GameStop back in the day, and the store manager had a triple-SLI setup with 480s in his home PC. Same guy also owned both a Neo Geo AND a TurboGrafx. Dude was my idol as a teenager lol. Those were the days!

  • @luckythewolf2856 4 months ago +118

    The GTX 1060 never supported SLI. The GTX 1070 and up supported it.

    • @36KACRE 4 months ago +5

      I have 1080 , its goated rn

    • @RmX. 4 months ago +7

      @@36KACRE I threw my 1080 in the trash this week because it runs like crap in 2024

    • @arghyaprotimhalder5592 3 months ago +8

      @@RmX. WTH, you could have sold it bro, every enthusiast would love it.

    • @blaspemy5398 3 months ago +4

      @@36KACRE I love my 1080 ti so much

    • @36KACRE 3 months ago +3

      @@blaspemy5398
      Yes I love it too

  • @nikke2404 4 months ago +132

    Actually, RDR2 was released in the fall of 2018, not 2017...

    • @Pasi123 4 months ago +40

      October 2018 for PS4 and Xbone. It didn't come out for PC until November 2019

    • @squirrelsinjacket1804 3 months ago +1

      Seems like it could have been a PS5 only game. Game is still one of the best looking out there.

    • @killabyte7397 3 months ago

      It was actually 2016

    • @vcbbbbb 3 months ago +1

      @@killabyte7397 RDR2 was not 2016, it released in 2018; it was in development for like 8 years

    • @killabyte7397 3 months ago

      @@vcbbbbb look it up

  • @doriansanchez1961 4 months ago +45

    Cutting the BS: Nvidia wants you to buy a high-end GPU, not an old one they won't profit off of.

    • @arenzricodexd4409 4 months ago +5

      The idea behind multi gpu is to encourage people to buy more gpu.

    • @doriansanchez1961 4 months ago +9

      @@arenzricodexd4409 Nope, that was what they were hoping for, so toward the end only high-end GPUs had SLI. But most people, instead of buying a new GPU, would buy a used one to add to their PC. Therefore no new GPUs were being sold. With the release of the Titan, Nvidia learned that people would pay twice as much for a GPU, so the 90 series was invented and the end of SLI was born.

    • @arenzricodexd4409 4 months ago

      @@doriansanchez1961 When I was still active on hardware forums years ago, the talk of getting a used card cheap later on didn't really fly in reality. In the end most of us concluded that if you want to go multi-GPU, do it from the get-go, or at least no later than mid-cycle of the series. Back then it was well known that people would be hunting for their second card later, so brand-new GPUs ended up retaining their original MSRP, while on the used market sellers knew the value of that second card to someone and priced it higher. In some cases, rather than going SLI/CF and dealing with the quirks of such a system, it was better to get a next-gen GPU.

    • @kevinerbs2778 4 months ago +8

      @@doriansanchez1961 Even worse is that people can't see that DLSS needs more than what SLI needed.
      DLSS has to be implemented directly into the game,
      then it has to be updated from Nvidia's AI machines,
      next the driver implementation has to ship with the new drivers that include the game,
      and last, the card has to support that version of DLSS.
      Meanwhile SLI only needed a driver implementation & two SLI-enabled cards.
      =_=

  • @FantomMisfit 4 months ago +76

    I think SLI hurt their sales, so that's why they ended it. You could buy two 4080 Supers for what a 4090 costs.

    • @ZackSNetwork 4 months ago +14

      Exactly. I still remember back in 2016 I bought a Titan Xp for 4K gaming. A person I knew asked why I didn't just do 1080 SLI: more money, more power, less consistency in performance.

    • @betag24cn 4 months ago +20

      It was killed because the drivers never worked as expected, and two GPUs together only gave you about a 50% extra performance boost, yet you paid for two GPUs, an expensive motherboard, the SLI bridge and an expensive PSU. It was not the best idea.

    • @AndyU96 4 months ago +1

      Agreed. If the product stack had prices scaling linearly with the performance of the cards, then they wouldn't be losing any money through SLI. But since they scale the prices of their cards exponentially with respect to performance, what you just said ends up being the case.

    • @AndyU96 4 months ago +4

      @@betag24cn Surely if enough research was thrown at it, it could have been improved to the point where the performance boost is near 90%

    • @betag24cn 4 months ago

      @@AndyU96 I doubt it; engines don't split loads, programs barely split work between CPUs. It was not meant to be.

  • @Idk-Anymore4 4 months ago +56

    If NVIDIA keeps on pulling these actually good features from their gpus AMD and Intel will be sure to catch up.

    • @jai_the_guy. 4 months ago +18

      AMD already has, but Crypto miners and AI companies have increased Nvidia's market share

    • @arenzricodexd4409 4 months ago +15

      Catch up? AMD killed CF support on their cards much earlier than Nvidia; Nvidia still supported SLI through the 20 series. AMD's last card with CF support was Vega, and even back then AMD no longer prioritized it, to the point that Polaris actually had better CF support than Vega, which was AMD's high-end GPU at the time.

    • @MrBlackdragon1230 4 months ago

      @@jai_the_guy. LMAO!

    • @SlyNine 4 months ago +5

      Nvidia killed SLI for the same reason ATI killed Crossfire. AFR rendering was not compatible with deferred rendering techniques.
      The previous frame's buffer is on the other card. The game uses that frame to help with rendering.
      Devs had the option to use explicit multiple GPU modes for both and didn't do so.

    • @ZackSNetwork 4 months ago

      @@jai_the_guy. AMD has not caught up with anything on the GPU side, they suck.

  • @SlyNine 4 months ago +32

    They do explain why. It's deferred rendering techniques that rely on the previous frame to render the next frame; with SLI, the previous frame's buffer is on the other card.
    That's because the rendering technique used was AFR, which is how implicit multi-GPU modes worked. The devs had the option of using explicit multi-GPU modes, but only one or two games did.
    So Nvidia gave up on the feature.
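The previous-frame dependency described above can be sketched as a tiny scheduling model. The function and its parameters are hypothetical, purely to illustrate the AFR problem, not a real graphics API:

```python
# Sketch of why temporal/deferred rendering broke AFR (the implicit
# mode SLI used). Frame i renders on GPU i % num_gpus; if a pass
# samples the previous frame's buffer (TAA, reprojection, deferred
# history), that buffer lives on the *other* GPU, forcing a copy
# every single frame.

def cross_gpu_copies(frames: int, needs_previous_frame: bool, num_gpus: int = 2) -> int:
    """Count inter-GPU transfers a round-robin AFR schedule would incur."""
    copies = 0
    for i in range(1, frames):
        gpu_now = i % num_gpus
        gpu_prev = (i - 1) % num_gpus
        if needs_previous_frame and gpu_now != gpu_prev:
            copies += 1
    return copies

# Forward renderer with no frame-to-frame data: AFR is free.
print(cross_gpu_copies(100, needs_previous_frame=False))  # 0
# Temporal/deferred pipeline: every frame waits on an inter-GPU copy.
print(cross_gpu_copies(100, needs_previous_frame=True))   # 99
```

Explicit multi-GPU in DX12/Vulkan let the developer schedule around this (e.g. split work within a frame instead of alternating frames), which is exactly the option the comment says almost no one took.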

    • @exoticspeedefy7916 4 months ago

      And so now we will have MCM GPUs, which is sort of the same thing: two GPUs on one die.

    • @MLWJ1993 3 months ago

      @exoticspeedefy7916 A vast difference with MCM GPUs is that they're presented as one single GPU, leaving anything that happens inside up to the hardware manufacturer.
      With SLI it's all in software, where the hardware manufacturer no longer has the required amount of control over what happens, since DX12, Vulkan, etc. hand that control from the GPU driver to the (game) developer to potentially push for better hardware utilisation.

  • @Crazy-Chicken-Media 3 months ago +8

    The fact nobody has figured out how to do SLI again is absolutely insane.

  • @stefensmith9522 4 months ago +88

    I miss SLI. I was running SLI in every setup I've had since the 8800gtx era.

  • @METAPCs 4 months ago +20

      Wow I bet you have a lot of experience to share! Very rad 🫡

    • @regisegek4675 4 months ago

      @@METAPCs PX 8600 GT

    • @billchildress9756 4 months ago +3

      I have 2 8800 GTS cards in SLI on a XFX 680i Board. What you had was the ultimate setup!

    • @kevinerbs2778 4 months ago +3

      I love my SLI system; there are over 1,000 games that support it.

    • @JerryNeddler 4 months ago +2

      I love SLI

  • @LucasDaRonco 3 months ago +5

    I did SLI and CrossFire back in the day. I'll tell you why they really stopped supporting it (at least for gaming; you can still use multiple GPUs for compute). This is what's up:
    1) GPU manufacturers had to release a separate driver profile for each game, on top of the basic profile, so it would run "fine".
    2) Too many games were incompatible; forcing it through custom profiles would almost certainly end in artifacting, visual glitches or just bad performance.
    3) Frame pacing issues. This was an issue for AMD on single GPUs too back in the day, but with SLI or CrossFire, even if you had more FPS, most games definitely did not feel smoother.
    4) It hurt business for GPU manufacturers, since people had less incentive to pay more for a high-end GPU if they could buy two mid-range GPUs and get "similar" (but definitely not better) performance.
    There are definitely more reasons, but that's the gist of it.

  • @WhiskeyInspekta 4 months ago +8

    I had two 780 Tis in SLI for years, I want to say until 2020. But like you said, many new games I tried always had some issue with SLI. I ended up upgrading to a 5700 XT, then a 7900 XT.

    • @Kuli24000 3 months ago +3

      780ti sli was pretty killer powerful. I tried it for a bit too. The 3gb vram was a definite downer though.

  • @ronhaworth5808 4 months ago +6

    The problem with SLI for me was every time I considered adding a second card later to boost performance a new generation GPU came out that could do it with a single card.

  • @stevens1041 4 months ago +8

    I always heard about it, but could never justify the cost to myself. Never got to experience it, sadly. Thanks for covering it.

    • @METAPCs 4 months ago +2

      It was a difficult cost to justify for sure! Thanks for watching. Appreciate it

    • @Brodda-Syd 4 months ago +2

      It doubled the frame-rate in the Crysis series, Battlefield series and Sniper series. I loved it.

  • @TewaAya 4 months ago +11

    Still useful for 3D rendering; the 3090 is the last NVLink/multi-GPU compatible SKU though.

    • @arenzricodexd4409 4 months ago +2

      For non-gaming tasks you can always go multi-GPU, even mixing architectures. The benefit of NVLink on consumer-grade cards was that it allowed the combined-VRAM thing; it's the reason Nvidia took NVLink away from the 4090.

    • @kevinerbs2778 4 months ago

      @@arenzricodexd4409 There were issues with the RTX 4090 in CUDA programs that supported multiple cards; I read a lot of complaints about the issues from independent developers in forums.

    • @bilbobaggins887 2 months ago

      @@arenzricodexd4409 2 4090s would be bad ass.

  • @Eneeki 3 months ago +4

    As a 3D artist, I have a different perspective. This was forced to make 3D render farms and small 3D studios buy the A6000 Adas at ridiculous cost if you needed the extra VRAM. 2017 was right about the time 3D art became mainstream and financially viable for smaller studios and artists. My perspective comes from a conversation with a sales rep when I was looking into buying A6000s for my Blender workstation. I think SLI for games was an afterthought, and the real focus was 3D render farms and special effects studios.

  • @NKO_8188 4 months ago +24

    Nvidia wants you to upgrade to every gen they have, that's why sli is dead

    • @Flyon86 3 months ago +6

      And also why they've made the VRAM lower than it should be on every generation of GPUs since the 20 series.

    • @smackerlacker8708 2 months ago

      If that was true, they would have dropped it a decade ago.

    • @pwcorgi2000 1 month ago

      Who in the hell upgrades their gpu every year?

    • @Flyon86 1 month ago

      @@pwcorgi2000 Some people that just have to have the best rig at any given time will buy the 5090 right when it comes out. People were paying double msrp to scalpers for gpus during the 2020-2021 shortages lol.

  • @Bludobi9001 4 months ago +25

    Could you imagine 2 rtx 4090s working together?!

    • @user-tq4rt2fj9l 4 months ago +7

      💣

    • @arenzricodexd4409 4 months ago +5

      Cpu bottleneck will be an issue even at 4k.

    • @BarnOwl-dx8vg 4 months ago +8

      The plugs would melt

    • @Brodda-Syd 4 months ago +4

      RASSSS-FIRE

    • @Pasi123 4 months ago

      ​@@arenzricodexd4409 Probably not that much at native 4K in graphically intensive games. In TPU review the RTX 4090 + i9-13900K got 45 FPS in Alan Wake 2 at native 4K with RT on, 67 FPS with DLSS Quality, 77 FPS DLSS Balanced, 87 FPS DLSS Performance, 106 FPS DLSS Ultra Performance.

  • @Ivan-pr7ku 4 months ago +4

    Frame generation tech has pretty much supplanted SLI (and CrossFire) functionality: it produces more frames at the same or worse input lag, plus a few image artifacts for a change, but at the low cost of one GPU in the system. Multi-GPU is still an option in DX12 and other APIs, but now it's all up to the game developers to implement and use it.

    • @TheMaztercom 4 months ago +3

      You forget that SLI didn't downgrade your visual fidelity; remember that DLSS is just low-resolution rendering, upscaled.

    • @max.racing 4 months ago

      but what if we combine Frame Generation with the power of a SLI system?
      Unlimited fps hack

    • @kevinerbs2778 4 months ago

      Frame generation can't even touch SLI.
      The Witcher 3 supports both, and I get 170 FPS in DX11 with two RTX 2080 Tis in SLI.
      Meanwhile the RTX 4090 in DX12 can't even get past 110 FPS with DLSS & frame generation enabled.
      The game is actually terrible in DX12 with RT because it lacks a lot of proper code paths; the engine was ported over from their DX11 engine rather than built from the ground up for DX12. A single RTX 2080 Ti at 4K with everything maxed out, including RT, gets 32 FPS.

  • @lelandclayton5462 3 months ago +4

    I had two Voodoo 2 cards back in the day in "SLI". It was awesome.

    • @METAPCs 3 months ago +1

      Right on! That was the ultimate flex 💪

  • @gamerstewart1660 4 months ago +7

    AMD and Nvidia should have just made a special driver for multi-GPU use, turning all the GPUs in the setup into one virtual GPU so games could treat it that way while the driver splits the workload between the GPUs 😮‍💨

    • @arenzricodexd4409 4 months ago +1

      That is what SLI and CrossFire were. Games most often were not aware that there were two GPUs inside the system; most of the "magic" was done by the drivers.

    • @gamerstewart1660 3 months ago

      @@arenzricodexd4409 if only the power of combining Gpus for "maximum power" could be achieved in a single toggle😥

  • @Anima_moxhya 4 months ago +3

    SLI evolved into the linking chip that is now used in cloud GPUs: two 48 GB GPUs become 96 GB of VRAM, with high-speed data transfer between the servers.

  • @MrGivmedew 3 months ago +5

    I had two Voodoo 2 12MB cards and a Fusion AGP. That era was fun!

    • @henrikbrolin5687 3 months ago

      Creator of SLI, nice.. :P
      I will have that too, later this year.. :P

  • @aurinator 26 days ago

    As someone who had SLI since it was introduced on the Voodoo II's (which I still have), I always wondered why it went extinct, along with some more information around AMD's (ATI's) Crossfire competing technology. Thanks for creating this and shedding light on it all!

  • @Alex-il2pq 3 months ago +1

    In 2015, playing GTA 5 on PC with 980 Ti SLI at a smooth 4K 60 FPS was an experience well ahead of its time. Ironically, even two top-tier gaming graphics cards back then cost less than the 4090.

  • @andrewszombie 4 months ago +7

    3:45 imagine being this forum poster & seeing your comment from a decade ago in this video 😂

  • @keithwalsh8436 3 months ago +1

    I find it very interesting that no one has even considered the lack of PCIe lanes as an issue: a 40-series card needs 16 PCIe lanes, while most mainstream gaming desktops only offer 20, barely leaving any for a second video card and storage.
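The lane math above can be checked with back-of-the-envelope arithmetic. The figures (20 CPU lanes, x16 per GPU, x4 per NVMe drive) are illustrative assumptions for a mainstream platform, not a spec for any specific CPU:

```python
# Rough PCIe lane budget on a hypothetical mainstream desktop CPU.

CPU_LANES = 20   # usable PCIe lanes straight off the CPU
GPU_FULL  = 16   # lanes one GPU wants for full bandwidth
NVME      = 4    # one NVMe SSD

# One full-width GPU plus one SSD exactly exhausts the budget:
print(GPU_FULL + NVME)   # 20 of 20 lanes used

# Two full-width GPUs alone would want 32 lanes, which doesn't fit,
# so dual-GPU boards historically bifurcated the slots into x8/x8:
print(2 * GPU_FULL)      # 32 -> over budget
print(2 * 8 + NVME)      # 20 -> x8/x8 plus the SSD just fits
```

This is why SLI boards ran cards at x8/x8 (or leaned on HEDT platforms with 40+ lanes), and why lane-starved mainstream platforms make a second full-width card awkward.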

  • @ATomRileyA 3 months ago +2

    Had a nice water-cooled 8800 GTX SLI setup back in the day; it was awesome for Oblivion on a Barco CRT projector.

  • @Sidicas 2 months ago +2

    Nah, nobody ever used SLI as an upgrade path. We used it to play a small handful of SLI games, where it was cheaper and easier than finding a higher-end card. If you wanted an upgrade, you just waited for the next gen.

  • @sparkeyjames 3 months ago +1

    I ran two Nvidia 660s. Saved me from having to spend big bucks on a new card. At the time most card makers included the SLI bridge with their cards if they were capable. Then one 660 blew up and I finally upgraded to a 1070.

  • @GregOver 4 months ago +5

    Yung Gravy and Shania Twain just dropped a song about White Claw. 202wtf is going on.

  • @zincwing4475 1 month ago

    My first PC had SLI for a while. My idea was to buy a single GPU then and upgrade with a second GTS 450 later; I was a teenager at the time.
    It worked out for a while, but I wanted video encoding to stream Steam games to my laptop. My room got hot in the summer, and using another room helped.

  • @SpaceyMonkey75 4 months ago +6

    How about being able to use a lesser card as a dedicated physx card? That was cool too.

    • @SAVikingSA 3 months ago +3

      I had three 970s in my rig and dedicated one to PhysX wherever I could. A dedicated PhysX card is why I went with three GPUs, but sadly, like everything SLI, it was underutilized. I was over 100 FPS @ 1440p in 2016's Doom; it didn't really optimize for SLI, but that PhysX card was like a supercharger.

  • @rangersmith4652 4 months ago +5

    Years ago I ran a dual 980Ti SLI setup, and before that I ran a dual R9-290 Crossfire setup. Both added a ton of power draw and heat. Neither scaled worth a hoot in FFXIV, which was the only game I cared about. Still is, pretty much. Now that any decent mid-range GPU is more than enough for FFXIV, I have zero interest in dealing with the added complexity of a dual-GPU setup.

    • @kevinerbs2778 4 months ago +1

      That game has a terrible engine built around it.

    • @rangersmith4652
      @rangersmith4652 4 months ago

      @@kevinerbs2778 It's old, yes. Terrible is a very subjective thing.

  • @mtbwithjure7809
    @mtbwithjure7809 4 months ago +121

    AMD for the win!

    • @kaishedan37
      @kaishedan37 4 months ago +14

      crossfire is also dead though?

    • @CallOFDutyMVP666
      @CallOFDutyMVP666 4 months ago +24

      15% of the steam hardware, 90% of youtube comments.

    • @ghostlyinterceptor7756
      @ghostlyinterceptor7756 4 months ago +14

      @@CallOFDutyMVP666 the fact is that AMD's reputation for bad software and drivers pushed people into this state of "avoid AMD at all costs, no matter what, even if they've improved a lot". Right now AMD had a massive flop with the 7000 series, but a huge W with the 6000 series rivaling the best 3090 Ti in raster, making Nvidia actually improve instead of going the Intel route of "MORE POWER".

    • @arenzricodexd4409
      @arenzricodexd4409 4 months ago +6

      @@ghostlyinterceptor7756 RDNA 2 won because it was on TSMC 7nm while Nvidia was on Samsung 8nm. But when they're at node parity, Nvidia comes out on top unchallenged, like with the 4090. A similar thing happened during the Pascal generation as well.
      On the software side, AMD tends to have major issues whenever they move to a new architecture or do something drastic to it. That ended up giving AMD the stigma of "always" having software/driver issues.

    • @WolfChen
      @WolfChen 4 months ago

      for the price*

  • @KTSpeedruns
    @KTSpeedruns 3 months ago +1

    I'm just going to take a guess before actually watching. I'm guessing that early on, in the days of Crysis, developers were making wildly detailed things that graphics cards couldn't render in real time. So the obvious solution was to pool graphics cards, have them share resources, and team up to render what developers were putting out. But it turns out that was about the peak of how detailed things could get, so as graphics cards got better, the need for multiple cards died out. I have to imagine if it came back, it would be purely to speed up ray tracing, to get rid of the complaints that ray tracing cuts your frame rate in half.

  • @Oomlie
    @Oomlie 4 months ago +3

    NVLink wasn't introduced on consumer cards until the 20 series, so I'm not sure why you said the 10 series. And NVLink on the consumer side was basically SLI; it didn't operate like the workstation version and pool resources.

  • @sleepyspartan1367
    @sleepyspartan1367 3 months ago +1

    Another issue, which LTT proved, was that for all the performance benefits you got, it wasn't really necessary compared to single-GPU systems. That's not to say it wouldn't be worth it if it were done today, but at the time it just wasn't really needed.

  • @xBINARYGODx
    @xBINARYGODx 4 months ago +2

    You could set AFR for any "unsupported" game and get AT LEAST a ~50% uplift from having the second card.

  • @andreabriganti1226
    @andreabriganti1226 4 months ago +7

    Now more than ever we need SLI/Crossfire. Or something that could help with ray tracing, especially when it comes to AMD GPU.

    • @DenverStarkey
      @DenverStarkey 4 months ago +1

      yeah imagine using one card to render the polys and textures while the other card handles just the RT. You'd have 4K 60+ FPS with path tracing easy and cheap then, instead of having to buy a $2000 4090.

    • @MelioUmbraBelmont
      @MelioUmbraBelmont 4 months ago +1

      That's my point in many discussions: like PhysX, just buy a card to accelerate it.

    • @DenverStarkey
      @DenverStarkey 4 months ago

      @@MelioUmbraBelmont well that was the original idea. Like PhysX, Nvidia acquired RTX by buying out a company that was developing it for a standalone card, except that company never got its card to market before getting bought, while PhysX did see market first.
      This is one instance where Nvidia really hurt the market with their buyout, as the standalone RT card was aiming for 1440p RT at 60 FPS just in its first gen. Instead the tech got bolted onto Nvidia's video cards, and as we know, the first RTX video cards couldn't achieve 1440p @ 60 FPS with RT on. Hell, most of the second and third gen RTX video cards still can't achieve this. And forget 4K unless you've got a 4080 or 4090 at 1000 bucks and 1700 bucks.

    • @kizunadragon9
      @kizunadragon9 4 months ago +2

      An SLI like setup where one card renders the game and the other handles just ray tracing would be lit

    • @MelioUmbraBelmont
      @MelioUmbraBelmont 4 months ago

      @@kizunadragon9 it's possible to do; a Brazilian channel made a UE4 demo using AMD cards. It was at the release of RT, when Nvidia said it was only possible with their hardware.
      I remember something about a guy who made RTGI in ReShade for the Resident Evil remake and BioShock. It's possible to render different layers/chunks/shaders based on normal maps and Fresnel in parallel, then combine it all in the final frame, like in Zelda BotW.

  • @LBXZero
    @LBXZero 3 months ago +1

    Driver-based multi-GPU is highly possible without the troubles of AFR. Nvidia wanted multi-GPU to die because they reached a point where a pair of RTX xx70 cards would match the RTX xx90 card at 60% of the price. Not anymore, especially with the RTX 40 series pricing.
    All of the glitches and stutter were part of how AFR worked. AFR gave the marketable results, but screen-splitting methods resolved the AFR problems. Screen-splitting methods had trouble with load balancing, but there are ways around that, if people would put in the effort to build them.
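
    For what it's worth, the screen-splitting (split-frame rendering) load balancing described here can be sketched in a few lines. This is a toy Python simulation with made-up GPU speeds and step sizes, nothing like real driver code:

```python
# Toy model of split-frame rendering (SFR) load balancing: two GPUs each
# render a horizontal slice of the frame, and the split line is nudged
# toward whichever GPU finishes first. All numbers are illustrative.

def render_time(rows, speed):
    """Simulated time for a GPU to shade `rows` scanlines."""
    return rows / speed

def balance_split(total_rows, speed_a, speed_b, iters=20):
    """Iteratively move the horizontal split toward equal frame times."""
    split = total_rows // 2  # start with a naive 50/50 split
    for _ in range(iters):
        t_a = render_time(split, speed_a)
        t_b = render_time(total_rows - split, speed_b)
        # Give more rows to whichever GPU is idle-waiting each "frame"
        if t_a > t_b:
            split -= max(1, int(split * 0.05))
        elif t_b > t_a:
            split += max(1, int((total_rows - split) * 0.05))
    return split

rows = 1080
split = balance_split(rows, speed_a=100.0, speed_b=50.0)
print(split, render_time(split, 100.0), render_time(rows - split, 50.0))
```

    With one GPU twice as fast as the other, the split settles near 720 of 1080 rows, where both halves take about the same time; that convergence is the whole idea behind SFR load balancing.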

  • @sargonsblackgrandfather2072
    @sargonsblackgrandfather2072 4 months ago +2

    Shame about SLI. I used to buy a top-tier card like an 80-series, then after a few years when it was showing its age I'd buy another one second hand and be back on top again.

  • @brkbtjunkie
    @brkbtjunkie 3 months ago

    I did Sli GTX 580s, and then 670s, but support just dwindled and I found that it was much easier to get a fluid framerate using one big silicon instead of two smaller ones.

  • @alejandrobadia4835
    @alejandrobadia4835 3 months ago

    I had an APU and the dedicated GPU version in a "Hybrid CrossFire" setup. Loved it. Had to sell the system because it took too many years for DX12 games to arrive and not many games were properly supported. Learned a lot back then about computers and drivers :V
    I've recently seen a Chinese guy who got Nvidia and AMD cards working together in an experiment and it was amazing, a 3070 and a 6600, or something like that.

  • @nikushim6665
    @nikushim6665 3 months ago

    I had a few quad-SLI workstations (mostly used for VMs and running the CUDA SDK in Linux); I think the last was a 780 x4 build. The main issue with SLI was that very few games actually utilized it. No one wanted to waste development runway optimizing a game around it when it was a very, very tiny slice of the market share (except for those directly sponsored by NVDA to use it for marketing, and even then it was usually half-assed).

  • @EveryGameGuru
    @EveryGameGuru 4 months ago +1

    😭 SLi (and crossfire) was amazing. Sadly my 3090s are the last SLi rig, for now 😭 and it's not even compatible with DX11 games. For that, I still have my dual 2080Ti system

  • @leeebbrell9
    @leeebbrell9 4 months ago +1

    I remember the Voodoo cards, and yes, I would dream of SLI even up to the 20 series. Some games ran slower with it enabled.

  • @rossmclaughlin7158
    @rossmclaughlin7158 4 months ago +2

    I had 2 HD 7970s and upgraded to the 980, which was rather lackluster at the time. Given the 4 or 5 years between upgrades, noticing that games largely played the same way they had before was a bit of a disappointment, to say the least. That's like someone today building a PC to the highest spec they can afford, upgrading to the second-highest-end GPU available in 2029, and seeing little to no improvement. But this honestly happened.

  • @ryuoki1
    @ryuoki1 2 months ago

    About 12 or 13 years ago a friend upgraded his video card and gave me his old Nvidia one. I didn't realize it at the time, but my mobo had the same video card integrated into it... when I did, I was a bit disappointed that I couldn't find a way to SLI them together... pretty sure the mobo manufacturer didn't consider this a possibility, otherwise there may have been a special cable that could have allowed it back in the day. Anyway, not that much of a loss; the hand-me-down was adequate for a while, and worked decently for the newer MMO at the time, SWTOR.

  • @dannyyristenpatt614
    @dannyyristenpatt614 4 months ago

    I built my last SLI rig back in 2012 using two GTX 660 Tis, and it was awesome until mid-2014, when titles like Watch Dogs, Wolfenstein: TNO and The Evil Within started releasing with non-existent or broken SLI support. Things didn't improve the following year, with games like Assassin's Creed: Unity and Batman: Arkham Knight both forcing me to physically remove my second card from the system because simply disabling SLI wasn't cutting it. Things didn't get much better until GTA V surprisingly launched with excellent SLI support, and my 660 Tis were useful for another 6 months until I eventually switched to a GTX 980 Ti.

  • @NOLAgenX
    @NOLAgenX 1 month ago

    2 GTX 460’s was peak SLI. When it worked, it was beautiful. The problem was half the games didn’t have any implementation for SLI, or were poorly done. Then as you said, prices of GPU’s rose steeply and never stopped. So studios, hardware manufacturers and Nvidia all stopped planning for it. It died by being ignored instead of outright ended.

  • @Wushu-viking
    @Wushu-viking 3 months ago +1

    A correction here: the GTX 10xx cards did not use NVLink, but High Bandwidth SLI.
    NVLink was an interface developed for Quadro, and was also available on the RTX 2080 and 2080 Ti GPUs.
    It is way better than SLI because it can also link the memory, utilizing all the VRAM from both cards. It's sad that it has been taken away from the mainstream and never got real gaming support. But I'm sure Nvidia did not want 2x 2080 Ti (with 22GB) besting a 3090 Ti, which it could with perfect scaling. :)

  • @NerothLoD
    @NerothLoD 3 months ago

    Used to have a Radeon 4870X2 and later an HD5970. Both worked really well overall. There were some cases of microstuttering in certain games, but often I could just disable one of the GPUs or adjust some settings that would more or less resolve the issue. Overall I'm happy with what I got, although I realise a lot of people had tons of issues with these Crossfire cards.

  • @tek_lynx4225
    @tek_lynx4225 3 months ago +2

    3dfx did not fail from decline. It was doing insanely well when it failed and was at the top end. What happened is they took on too much at once and went bankrupt. They tried to make their own cards from start to finish, cutting out the board partners, and had to deal with the costs that come with doing that, instead of just selling their chips to the board partners and letting them handle PCB costs, packaging and logistics, when they weren't ready for it. Even today Nvidia and AMD only make limited numbers of cards themselves compared to their board partners.

  • @mtcoiner7994
    @mtcoiner7994 1 month ago

    Honestly, SLI or similar software would be a lot of fun to play with. It would be nice if it were at least an option. Someone must be able to throw together some software. How much fun would it be to link any 2 or 3 different cards that all run the same drivers?
    Thinking of my 2070 Super and 2060 Super: together they'd basically be a 2080 Ti with 16GB of VRAM.

  • @NorseGraphic
    @NorseGraphic 2 months ago

    The development is heading towards a dedicated card for AI, so you'll need a graphics card and an AI card. It'll offload NPC behaviors and responses onto the AI card, reducing the load on the CPU overall. And some developers would focus solely on AI behavior and response vs. player interaction, the way some developers today focus solely on graphics fidelity.

  • @Viscupelo
    @Viscupelo 3 months ago +2

    I remember strapping two 570s together, and then to my horror finding out what microstuttering was.

  • @TheOnlyTwitchR6
    @TheOnlyTwitchR6 3 months ago

    They should offer an SLI-like add-on card for just ray tracing,
    to either boost RT performance or add Nvidia RT to AMD systems.
    Throw an NVENC encoder on there also.

  • @BreezzersGaming
    @BreezzersGaming 3 months ago +1

    I had a 3090 SLI setup with 600mm of rad, and it was still more heat than anything I ever imagined: 700W each, 1500W total power draw.

  • @456MrPeople
    @456MrPeople 1 month ago

    Only the Quadro Pascal cards had NVLink. The regular GTX cards still had SLI. NVLink started showing up in consumer GPUs with Turing.

  • @richm77
    @richm77 3 months ago +1

    my first SLI setup was with 2 1080 Tis on an X99 platform. Man, the good ole days

    • @friskas8664
      @friskas8664 2 months ago

      God, the GTX 1080 Ti is still a really good card even now.
      Still have it, still running it with an i7 8700K OC'd to 5GHz.
      They really outdid themselves making this card; ffs, it's already what, 6 or 7 years old, and it still runs with no problems.

  • @mattkrumm8141
    @mattkrumm8141 23 days ago

    I remember using the Pirate bay to find software to run SLI on non-supported cards.

  • @werewolflover8636
    @werewolflover8636 2 months ago

    Looking back it’s amazing how far things have come in just 20 years.

  • @arbiter1
    @arbiter1 3 months ago

    The game I think had the best SLI scaling was one of the Sniper Elite games, might have been 3 or 4. They had nearly perfect scaling with SLI, doubling the fps.

  • @charcoalmoth
    @charcoalmoth 4 months ago +2

    Honestly, this is AMD and Intel's opportunity to implement a new and improved version of this software on their GPUs and actually push its use. Since they aren't going blow for blow with NVIDIA in the single-GPU market, why not try to hit them in the multi-GPU setup market, especially with their more budget-friendly cards?

    • @AsmodeusExx
      @AsmodeusExx 4 months ago

      It would be a good idea for Intel to invest in SLI. It would boost their GPUs at a nice budget price point. I think we'll return to SLI before long, because it's cheaper than investing in ultra-expensive tech like Nvidia's.

    • @jerm70
      @jerm70 3 months ago

      You can't get blood from a stone. If Nvidia can't convince people to support multiple GPUs you aren't seeing that happen from AMD and Intel.

  • @Laserxray
    @Laserxray 3 months ago

    The Talos Principle VR from CROTEAM is a game that still works in SLI today, tested on a 3x Titan X, 3-way SLI setup. I'm pretty sure CROTEAM games still run in SLI if you have it.

  • @darkphotographer
    @darkphotographer 3 months ago +1

    SLI is great. I bought a 970 when the 10 series came out, for 200 euro, and a second one a few years after for 100. I run them in SLI and still have them today; they work just fine.

  • @darkholyPL
    @darkholyPL 2 months ago

    I'm old enough to have had 2x Voodoo 2 cards irl back in the day, and the problems with SLI were always there. First of all, you had to have a game that even supported SLI, because not every game did. You could 'force' it in drivers but it would often be broken then. Plus the scaling of it was all over the place. Sometimes you would get 80% more frames, sometimes... 20%. There were also many bugs and graphical glitches with using SLI. All in all I'm not sad it's gone.
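
    That "sometimes 80% more frames, sometimes 20%" spread falls out of simple Amdahl's-law math: only the portion of frame time that can actually be split across the GPUs speeds up. A quick illustrative Python estimate (the parallel fractions here are made up, not measurements from any game):

```python
# Amdahl's-law style estimate of two-GPU scaling. `parallel_fraction` is
# the share of frame time that can be split across GPUs; the rest stays
# serial (driver work, sync, frame composition). Numbers are illustrative.

def two_gpu_speedup(parallel_fraction, sync_overhead=0.0):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / 2.0 + sync_overhead)

for p in (0.95, 0.70, 0.40):
    print(f"{p:.0%} parallel -> {two_gpu_speedup(p):.2f}x")
```

    A well-threaded renderer (95% parallel) gets ~1.90x, a middling one (70%) ~1.54x, and a poorly suited one (40%) only 1.25x, which roughly matches the 80%-to-20% range people remember seeing.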

  • @WhoaMykey
    @WhoaMykey 1 month ago

    Facts! AMD, take notes: your Ryzen AI could be utilized to improve CrossFire efficiency and performance. Just a thought. Make multi-GPUs great again

  • @LeadStarDude
    @LeadStarDude 2 months ago

    I had a laptop with SLI that performed well for 10 years before it started showing its age in new release games. It has two GTX 980m 8GB cards. I actually still have it, and it still works fine for legacy games before everything went to ray tracing.

  • @mainsource8030
    @mainsource8030 4 months ago +1

    It's criminal that the moment they started charging 80 dollars for an NVLink bridge and made SLI bridges unusable is the same time they stopped supporting SLI. I bought 2x 2080 Ti and the $80 bridge, only to find out the setup was useless.

  • @nobody8717
    @nobody8717 13 days ago

    you can get 2 titans for about $300 right now, but you'd also need at least a 750w psu as those ran pretty hot.
    also, you'd lose out on some cool features.

  • @crrc4s
    @crrc4s 2 months ago

    I loved my SLI 980 setup. Still works pretty darn well today. Would absolutely love to be able to SLI two 4090’s today. It’s a real shame it’s not an option.

  • @thelaughingmanofficial
    @thelaughingmanofficial 3 months ago

    Because DirectX 12 and 12a can do Multi-GPU without needing external API calls. But most games don't have Multi-GPU enabled, except "Ashes of the Singularity" and maybe a few other games.
    Yes you can still do Multi-GPU, but the game needs to be running either DirectX 12 or 12a or Vulkan with Multi-GPU support enabled in the game. If the game doesn't support it, well you're out of luck.

  • @Michael-ft3vg
    @Michael-ft3vg 2 months ago

    My favorite GPU ever made was the HD7990. Such a cool GPU. Crossfire built in.

  • @NoOnionsUK
    @NoOnionsUK 1 month ago

    Still got my 2 Voodoo2 cards up in the loft. One day I will build that retro gaming PC!

  • @Ludak021
    @Ludak021 3 months ago +2

    It's quite simple. You could buy another used card and get a lot of performance that way instead of buying a new more powerful GPU. It's that simple. DX12 supports CF and SLI by the way. nVidia and AMD do not allow it.

  • @Personalinfo404
    @Personalinfo404 4 months ago +3

    my first gaming pc was an intel 2600k and 2 gtx 660ti in SLI

    • @METAPCs
      @METAPCs  4 months ago

      Nice! That’s a pretty rad first gaming PC.

    • @arenzricodexd4409
      @arenzricodexd4409 4 months ago

      Wow almost similar to mine back then. 2500K with 660SLI.

  • @ShredBird
    @ShredBird 3 months ago

    In short:
    Old rendering techniques meant that frames of a game were independent. The CPU could send all the draw calls for one frame to GPU 1, then while GPU 1 is working on that send the next set to GPU 2. Now you just output them in alternating fashion, boom done.
    New engines have lots of temporal rendering techniques that rely on past frame data. GPU 2 can't finish until GPU 1 is done, and if it has to wait, what's the point?
    If you're a developer, you can make an engine that avoids these techniques, but that puts you at risk of your engine being left behind technologically compared to other studios. Or you can work around it by letting both GPUs work on the parts of the frame that are independent, then have a single GPU finish the interdependent pieces. This is what happened a lot toward the end, and why games had far less than perfect scaling and GPU 2 ended up being pretty underutilized.
    Between tweaking engine and drivers etc, this is an era of gaming where gamers started becoming aware of latency, frame pacing and microstutter. And multi GPU really didn't help with any of that. It seemed like there might have been a second life for multi GPU for VR, as you can render two eyes independently, but that didn't seem to take off either.
    Modern APIs support multi-GPU over PCIe without a dedicated bridge. Developers can still support it. The most recent game I can think of that supports mGPU is Far Cry 6.
    Not to mention, mGPU would be wholly incompatible with many techniques like TAA, FSR and DLSS.
    In the end, many reasons killed it. Not that it's just hard to optimize for, but also it was becoming less and less worth the effort.
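
    The point about temporal dependencies can be shown with a toy timeline model. This is illustrative Python, not how any real driver schedules work; the frame costs are arbitrary units:

```python
# Toy timeline model of alternate frame rendering (AFR) on two GPUs.
# Each frame takes `cost` time units on one GPU. If a frame depends on
# the previous frame's output (temporal effects like TAA, motion blur),
# it can't start until that frame finishes -- which serializes the GPUs.

def afr_finish_times(n_frames, cost, temporal_dependency):
    gpu_free = [0.0, 0.0]          # when each GPU next becomes idle
    prev_done = 0.0                # when the previous frame completed
    finish = []
    for i in range(n_frames):
        gpu = i % 2                # frames alternate between GPUs
        start = gpu_free[gpu]
        if temporal_dependency:
            start = max(start, prev_done)  # must wait for frame i-1
        done = start + cost
        gpu_free[gpu] = done
        prev_done = done
        finish.append(done)
    return finish

independent = afr_finish_times(8, cost=10.0, temporal_dependency=False)
dependent = afr_finish_times(8, cost=10.0, temporal_dependency=True)
print(independent)
print(dependent)
```

    With independent frames, 8 frames finish in 40 units (perfect 2x scaling); add a dependency on the previous frame and it takes 80, exactly as if the second GPU weren't there. Note also that the "independent" case delivers frames in pairs (two finish at the same instant), which is the frame-pacing/microstutter problem in miniature.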

  • @Nahan_Boker94
    @Nahan_Boker94 3 months ago

    From my perspective:
    1. The trend to make technology much smaller, compact yet powerful. Especially for general markets. People today build PC like less cable, smaller parts, less complicated, and also more lightweight materials. Thats why mini ITX market exist and mostly used in office and casuals. Especially nowadays people living in smaller spaces.
    2. Power & Cost, while 2 midrange GPU is cost less than just one powerful GPU, its cost more in bills, maintenances, and other parts. Forcing you to buy specifics mobo, bigger PSU wattage, bigger case, better cooling but expensive, etc. Not ideal for casuals/people with budget, which contribute much more to GPU markets than few people that can afford/need to buy SLI GPUs and other specific parts.
    3. The development of softwares/games itself not needing SLI. Since this combinations usually used by tech influencers, miners and machine learning/AI which is much smaller market than people buy single GPU. And I rarely hear/see any general games or software needed/actually have features benefited by SLI config. Maybe back then there is few but not nowadays. So bcs SLI is not that popular compared to Single GPU.
    4. AMD giving up Crossfire and NVIDIA eventually dropped support or any development of it few years ago I think. No competitors and market for it. Why invest more?
    But I do hope they bring it back more. Since in today era of higher up GPU is so goddamn expensive, AI everywhere and videogames more hungry to VRAM (which actually AAA game industry get lazy optimizing textures and more caters to gaming & tech influencers plus assuming everyone using 4090). This is one good alternatives. But hoping theres solutions to its sizes and power consumptions. Also the availability of supporting parts is improved. But again, maybe I'm asking too much lol.

  • @DavidAlsh
    @DavidAlsh 1 month ago

    Weird that the drivers didn't hide the implementation details of multi-GPU configurations.
    A game engine shouldn't have to handle multi-GPU setups; the drivers should present a single GPU to programs/game engines and manage the scheduling themselves.

  • @mraltoid19
    @mraltoid19 4 months ago +6

    I think AMD could do well if they market "Crossfire" with RDNA4, since those are mid-level cards anyway. AMD could market (RDNA4) 2x Radeon RX-9700XT for 2x $550 as better-than-RTX-5090 performance for $500 less. Then they could do the same with the Radeon RX-9600XT: sell those cards (which have 7700XT-class performance) for $299 each, and let you crossfire them later for RX-7900XTX+-class performance.

    • @arenzricodexd4409
      @arenzricodexd4409 4 months ago

      AMD has more than a decade of CrossFire experience with past products. They ditched it because it wasn't worth pursuing.

    • @AsmodeusExx
      @AsmodeusExx 4 months ago

      Will be back...

    • @kevinerbs2778
      @kevinerbs2778 4 months ago +1

      You can already use two cards with all of AMD's RDNA-type lineup, from the RX 5000 series through the RX 7000 series. You can enable mGPU in the control panel of their software. You just lose "ReBAR" when you do.

  • @andersmalmgren6528
    @andersmalmgren6528 3 months ago

    I miss the good old times when Nvidia made dual-GPU cards. I had them all, from the GTX 295 up to the GTX 690, and water cooled every one. The 295 was so cool because it was not just a dual-GPU card, it was a dual-PCB card, and you sandwiched the water block between the two boards.

  • @Dr.Kanine
    @Dr.Kanine 2 months ago

    So what I'm wondering: if NVLink was the equivalent of turning 2 GPUs into 1, why did it take any different game programming to make it work? Couldn't they just treat the 2 NVLinked GPUs as a single bigger GPU?

  • @HaDe_Twins
    @HaDe_Twins 3 months ago +2

    No need to roast my first ever game like that😢 3:41

  • @aisolutionsindia7138
    @aisolutionsindia7138 4 months ago +1

    NVLink was deprecated simply because AMD gave up on CrossFire first; for Nvidia, SLI was just a checkbox to be ticked against the competitor. This is something they would have liked to do earlier but were forced to keep up the charade.

  • @kwikschannel4371
    @kwikschannel4371 2 months ago

    Man, SLI was how we got to play Crysis with all the FPS to make your buddies jealous. Still got my tri-SLI rigs from 2008 and 2013.

  • @OctagonalSquare
    @OctagonalSquare 3 months ago

    running mismatched cards in one system with different screens plugged into each has kind of become the more common multi-GPU system setup. That way you can offload lower priority tasks to a weaker GPU and keep your stronger one dedicated to the game. And I think this became common enough that people just didn’t care for SLI

  • @SAVikingSA
    @SAVikingSA 3 months ago

    I had three 970's. It was incredibly stupid but what a flex on my friends lol
    That being said, dedicating a card to PhysX in 2016's Doom was amazing, still some of the best particle physics I've ever seen.

  • @pstm53
    @pstm53 4 months ago +3

    this video is criminally underrated

  • @NF-pk5mo
    @NF-pk5mo 2 months ago

    3:40 hey man I never forgot about crossfire that game was my main fps when I was younger

  • @xpodx
    @xpodx 1 month ago

    They should bring back SLI, improved, so we can have 8K 144Hz gaming with 2-way or 4-way 5090s. After they release 8K 144Hz, of course. TCL announced a couple of monitors at 120Hz but still hasn't released them.

  • @phenixnunlee372
    @phenixnunlee372 3 months ago

    Also, AI is super RAM intensive so just using NVlink for more ram and processing would be amazing as well.

  • @klaspeppar5619
    @klaspeppar5619 3 months ago

    I once ran two Nvidia GeForce GTX 760s in SLI.
    In my experience it was a bit of a mixed bag: in most games the second 760 didn't really do much for performance, but in some games it gave a noticeable boost (not 2x, maybe more like 1.3x or at most 1.5x), and in other games, graphical glitches.

  • @Tribudo
    @Tribudo 4 months ago +1

    I remember seeing a video where a guy used crossfire with a custom driver to do combined graphics between an A series apu and a low end graphics card.

  • @baltakatei
    @baltakatei 1 month ago

    Because the, like 3, people who knew how it really worked moved on and their replacements couldn't get up to speed on the specific architecture ideas baked into existing designs before management demanded results, so they just made their own branch.