I've only done SLI once. Many moons ago I had a 2500K and a GTX 560. I picked up a used GTX 560 for £80, and my two 560s in SLI performed similarly to a GTX 670. It was fine for a few months, and then Nvidia did something to the drivers so my 560s wouldn't work in SLI anymore unless I used an old driver. Newer games required a newer driver, so my SLI became useless. Fast forward twelve months or so, when games really needed 2GB of VRAM, and my 560s started to work in SLI again with the latest drivers, but performed nowhere near a GTX 670 anymore because of the lack of VRAM. Since these shenanigans from Nvidia I've been nowhere near multi-GPU again.
I was the same way with my dual 580s. Those things tore Fallout 4 up, but I had to use old drivers or else the game would either crash or flicker, with objects popping in and out of existence.
YUP! Gotta' LOVE Team Green...
I must stop this nonsense. SLI and Crossfire do not sum the VRAM of the participating cards; they just do alternate frame rendering. It's the same VRAM, double the FPS. Let that sink in.
@@duddahgyeah7653 Only because game devs don't code it that way. Mantle had support for pooled VRAM a decade ago, and Vulkan and DX12 can handle it just fine. The only game I can think of off the top of my head that bothered to support it was Civilization: Beyond Earth, and that game certainly didn't need it. No one supports it because, even at their height, multi-GPU setups were a rarity, and no one wants to write extra code for 0.5% of the market.
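For the curious, here is roughly what that opt-in looks like in Vulkan: a minimal C sketch (assuming a VkInstance already created for API version 1.1 or later, where device groups are core) that enumerates linked-GPU groups, the mechanism a developer would build explicit multi-adapter rendering on.

```c
/* Minimal sketch, assuming a valid VkInstance created with apiVersion >= 1.1.
 * A group with physicalDeviceCount > 1 can be turned into one logical device
 * spanning both GPUs, letting the engine manage per-GPU memory itself instead
 * of relying on old driver-side alternate frame rendering. */
#include <stdio.h>
#include <vulkan/vulkan.h>

void print_device_groups(VkInstance instance)
{
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, NULL);
    if (count > 8) count = 8;  /* small fixed buffer for the sketch */

    VkPhysicalDeviceGroupProperties groups[8];
    for (uint32_t i = 0; i < count; i++) {
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        groups[i].pNext = NULL;
    }
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups);

    for (uint32_t i = 0; i < count; i++)
        printf("group %u: %u physical GPU(s), subsetAllocation=%u\n",
               i, groups[i].physicalDeviceCount, groups[i].subsetAllocation);
}
```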
"Being as useful as a handbrake on a canoe."
Later on Shark Tank: Sharks! Have you ever wanted to stop your canoe in the middle of the ocean but got those pesky waves rocking your boat?
My canoe's handbrake is very useful for drifting in the river
More like a gear shifter on a canoe
Buy a $12 folding anchor now.
Dragons Den (british) is waaay better than Shark Tank (murican).
The american version is just too tacky and flashy to stomach...
Talking of "messing around", a lot of people who purchased APU's and added a discreet GPU later along the line don't realise that they can run a
Dual Graphics setup in Windows by running the APU graphics alongside with the GPU graphics. It works perfectly well and can be utilised in various different ways, for example: one of my friends streams his gaming, his GPU plays the game and his Ryzen 5700g Radeon graphics records his stream, (which takes the resource strain of the recording from the GPU). I myself run a setup with a 5700g Radeon graphics running with an RTX 2060, (it works across both AMD and NVIDIA platforms perfectly well).
There are setup guides on UA-cam with instructions on how to set up Dual Graphics, its not very complicated to do only requiring a few bios settings and when both Graphic sources show up in device manager, you know its worked, and in my opinion to have APU graphics built on a chip, and not utilise them in some way just because you've added a discreet GPU is a waste, and i don't particularly like waste haha.
That's actually pretty cool
I remember I was able to do that in my first desktop PC, an A55 chipset board with an A4-3300. Except that I bought an Nvidia card back then 💀
I did that on my i5 6600K. I only had a 1060 6GB, which was tanking a bit while streaming, so I rendered and recorded with Intel Quick Sync.
On a higher-end GPU, especially an Nvidia one, this would be pointless.
NVENC runs on a separate part of the chip, and nowadays we even have multiple encoder streams available, so a high-end GPU just has no use for a "helper card" and hasn't for a while.
@@TheUltimateBlooper I agree you have that on a higher-end GPU, but let's face it: the chances are if you've bought an APU in the first place it's probably because you're on a budget and a high-end GPU is out of range. And if you have high-end GPU money to spend, you're probably going to match it with a high-end CPU. This is a budget alternative, and this channel tends to aim at a budget-orientated audience.
It's highly unlikely you're going to run a higher-end GPU matched with an APU, as performance charts all tell you that APU versions of otherwise identical chips perform worse.
There was a brief moment in time where it really looked like the DX12 multi-GPU thing was actually going to take off. I had CrossFire R9 Fury cards and was getting quite literally twice the fps of one card in Dying Light and Rise of the Tomb Raider. I was gaming at 4K native max settings in 2015 and it was glorious; also hot and loud, but nonetheless glorious. Even managed to trip the protection on an 850W PSU, which got resolved by replacing it with a 1150W one!
Now you can trip the power with a single gpu after dropping over 3x the money!
@@tomstech4390 funny how people can some justify that.
@@kevinerbs2778 a word out is of order
@@kevinerbs2778 as an owner of many SLI configs in the past and every high-end card since Kepler - yeah, some people absolutely can justify it. I currently run a 3090 FE and a 4090 FE - both - in my rig.
Though I'm a 3D artist by trade, so them cards do work rendering for me, aside from gaming xD
Add the original HITMAN reboot to that pile too! Square Enix were pioneering DX12 titles and they ran REALLY well in SLI!
I'm also one of the people who played at 4K nearly a decade ago, lol. Nowadays on a 4090 we can go well beyond that in many games though. GTA 4 ran fine at 10K, so did Shadow Warrior, etc. Far Cry 5 runs at 8K maxed at 60+ fps easily as well. GTA 5 I finished when it was new at 5K, same with MH World.
New "AAA" games can be really demanding, so not gonna run them above 5K while maintaining reasonably high framerates, but 4K is basically the default now on that.
We all miss SLI and CrossFire...until we look at our recent electric bill....
Haha exactly
Because it was never done well. The ideal would be having 2-4 low powered/efficient GPUs running 1 thing (or few things) each. That is the ideal scenario.
If you have 2 current 4090s in SLI and that's how it's done then there's no point and it's stupid.
Maybe not different cards entirely, but if this was brought back in the future, it would be cool to have a card that draws 4090-level power but has 3 or more chiplets that you could assign to tasks in games: say, one chiplet runs physics while the rest handle shadows or whatever.
Perhaps technically speaking that is how it works today, assigning tasks within the 1 chip, but it would be expanded. Some sort of "multithreading" but in gpu.
I also can be completely wrong, and you are welcome to correct me if that is this case as this comment is based on "what I think I know about it".
Dude, how many hours per month do you game for it to affect your power bill?!?!
I've done the math, and even with VERY expensive electricity you literally can't even notice a difference on your power bill by switching from an undervolted RTX 3060 to an overclocked RTX 4090 😂
Unless you're doing something like crypto mining literally 24/7
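A quick sanity check of that math, as a minimal C sketch; every number here (extra draw, monthly hours, tariff) is an assumption for illustration, not a measurement:

```c
/* Back-of-the-envelope cost of a hungrier GPU (or a second one).
 * All inputs below are assumed, plausible values, not measurements. */
#include <stdio.h>

int main(void)
{
    const double extra_watts   = 250.0; /* assumed extra draw under gaming load */
    const double hours_per_mo  = 60.0;  /* assumed ~2 hours of gaming per day */
    const double price_per_kwh = 0.35;  /* assumed fairly expensive tariff, £/kWh */

    double extra_kwh  = extra_watts * hours_per_mo / 1000.0; /* Wh -> kWh */
    double extra_cost = extra_kwh * price_per_kwh;

    printf("extra energy: %.1f kWh/month, extra cost: %.2f/month\n",
           extra_kwh, extra_cost); /* ~15 kWh, ~5.25 per month */
    return 0;
}
```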
@kishirisu1268
I'm not American and I'm FAR from "rich"... I sincerely doubt you cut your power bill in half by tweaking your pc power... That would mean your pc was using as much power 24/7 as ALL of the other electric things in your home combined... It makes ZERO sense...
1) If you could afford multiple GPUs and a PSU for them, you could afford the electricity.
2) It only consumed much more electricity if the second GPU was actually being used.... y'know, working.
If you had a really easy-to-run game like Minecraft that couldn't use the second GPU, it didn't matter; if you had a game that could use the second GPU, like Battlefield, then it made a big difference.
Great vid. I dug out one of my old desktops after watching this to play around with.
It's a Core 2 Quad Q9550 with dual 4870s. Still works fine.
The only modern game which supports multiple graphics cards/gpus is Microsoft Flight Simulator. With a dual card setup, the game will render the plane using one card and the terrain using the other. Now you'll have to find a Powercolor Devil 13 as well. Or a R9 295X2.
More proof of how outdated that game is.
I've got the powercolor devil 13! It's an HD 7990, basically two 7970s. Honestly I haven't even tried it because im scared to setup crossfire bullshit lol
@@naamadossantossilva4736 I'm talking about the brand new one, not FSX.
@@BrunodeSouzaLino The one that launched with DX11 in 2020? I was talking about that one.
@@naamadossantossilva4736 More like a sign that actual effort went into coding it. Every modern API supports multiple GPUs, if the dev takes the time to utilize it.
"Takes me back to the days when I was troubleshooting my pc more than gaming on it" spoken like a true pc gamer haha
I used 4 separate 7970s in quad CF for many years. They never failed me, just that sometimes only 3 or 2 cards would work in tandem. Then I switched to 2080 Ti SLI. No problems either. It's only when Nvidia totally ditched SLI with the 4000 series that I moved to a 4090.
6:20 The stakes are getting higher...
I see what you did there, sir.
Can you imagine if Nvidia made a card with dual 4090s just for the flex? You'd probably need a nuclear reactor from a US aircraft carrier to run that
Haha would be amazing though.
The 3090 was the last GeForce card to have SLI. It's worth flexing with that.
Clicking the power button required rocket launch power to just boot the pc with i9 14900k
Nah it would catch on fire before you would even boot it up
@@josephbryanasuncion4904 OCd 14900k and dual OCd 4090. Will have the power consumption of a small nation
I miss the SLI/CrossFireX era. I've seen a lot of people saying it was a bad technology because of frametimes; I couldn't disagree more. It was an awesome technology that made a lot of sense back in the day: you could buy a 60-class card first and later, when you got more money, buy another and outperform the 80-class card.
People somehow think that most bad console ports of that era not working properly with dual GPUs was the fault of the technology and not of bad optimization, which is the same as saying a modern GPU or CPU is bad because it stutters in Jedi Survivor.
You realize the potential that tech had when you look at well-optimized, PC-focused games of that period that gained almost 100% performance uplift with no impact on frame latency. An example I always give is Battlefield 3: look at videos of that game running on the GTX 690, it beats the GTX 1080 with perfect frametimes.
Dude, the idea is awesome; what killed it was the dumb lack of support from Nvidia and shit AMD documentation.
Nvidia could just say: hey, we've got this new thing called ray tracing, wanna max it out? BUY MORE CARDS to accelerate it. Or like PhysX worked long ago, until it got a driver lock: just buy a card to accelerate it, or to enable it if you have an AMD card.
Now we have PCIe gen 5 x16, ReBAR, HAGS, games that let you go nuts with the resolution scale, 8K monitors, virtual resolutions... and the GPU makers that used to try selling cards with absurd claims just stopped putting in the effort after the pandemic (I miss the 4K R9 Fury; even in Crossfire it could not perform).
... And I feel even luckier I scored my working GTX 690 for $68 shipped.
Love the 690!
haha, almost the same price I bought mine for, still on the way.
Got mine non-functional for about €30 with the box; a cheapo multimeter helped me find a broken 0-ohm resistor, burnt by a bad MOSFET on the back of the card. Replaced them and used the card for 4-5 years. Retired it due to multi-GPU issues and low VRAM.
Even as a paperweight it's a beautiful piece of industrial design.
Got one for €65 including shipping a while back. Such a great-looking card.
I remember a Spanish YouTuber who was excited about doing SLI with dual GTX 980s, only to find he couldn't record the game well because the recording only captured the FPS that one card produced.
I don’t know why people use the phrase “as useful as a handbrake on a canoe.” If you just put an oar on a swivel on the side of your canoe, it would work exactly like a car handbrake.
Good luck parking your canoe on a waterfall with 1 oar.
@@auturgicflosculator2183 To be fair, a car's handbrake would work just about as well in that case.
@@MarginalSC This is true. Cars park on hills with their handbrake. Boats use anchors. Hence the phrase. It's not a racing term.☺
Love dual GPUs. There's just something about them. I have 2x 7950 GX2s, 2x HD 3870 X2s, a GTX 690 and a GTX 760 X2 Mars, and also some faulty ones unfortunately, including a GTX 295, 2x HD 4870 X2s and an HD 3870 X2.
SLI may live again, but it will be in the form of 2 GPU dies bonded within an overall GPU chip package. The 5090 might be configured as such.
Considering the amount of power the 5090 needs, it will definitely be dual-die. Basically two 4090s stacked together (and maybe downclocked to prevent the card melting itself).
@@alexturnbackthearmy1907 it's a single GB202 die
Need to see how these old multi-GPU beasts perform in old games like Crysis or Far Cry 2 at 2K or 4K res with a fast modern CPU.
Long-time viewer here. You don't know it, but your first review of that Mars II is one of the first videos I watched from you, along with the AMD A6 CS:GO PC that you did.
That is exactly the sort of information YouTubers do actually know -_-
SLI is dead but multi-GPU configs aren't. Using a 6000 or 7000 series AMD GPU alongside any card as a dedicated Fluid Motion Frames card will double the frames of that card. Also, you can use AMD FreeSync with an Nvidia card.
Ah good point I’ll have to try that
@@RandomGaminginHD Awesome! It's hard to find tutorials online, but basically your Nvidia card goes into the top slot and your AMD card goes into the bottom one. The display cable goes into the AMD card. You want both the Nvidia and Adrenalin drivers installed, then in the Windows graphics settings you default all graphics to your Nvidia card.
The only downside is you can only monitor the FPS gain in AMD's own overlay. The gains won't show up in MSI Afterburner, and the game needs to be in fullscreen for Fluid Motion Frames to work.
I have a 6600 XT and a 2060 for my home theater PC and it's a fast combo!
@@Hairybarryy any chance you'd want to record a video detailing this? I've got a 2080 and a 6700XT I'd like to try this setup with once I get home from vacation
So if you have an Nvidia card, you suggest getting an AMD card just to use frame gen? I tried AFMF when it first released and it was mediocre. I don't think getting an AMD card just to use AFMF is worthwhile.
@t1e6x12 I wouldn't call AFMF mediocre; it works in nearly every game, and it's fantastic.
It's a lot better than it used to be, and it helps a lot with bottleneck issues. I have a 6800 in my dual Xeon rig and it helps with the choppiness I used to get without AFMF. It's truly a godsend, as I struggled to get high fps on my high refresh rate monitor.
Heck, it allows me to play Elden Ring at 4K 120fps without modding anything!
Wait how would actual sli work with one of these? Would that be considered Quad sli 🤔
yes, that's quad SLI.
It's definitely classified as Quad SLI. Both the GPUs get shown in afterburner and hwinfo as 2 cards each.
No it is a dual gpu card. There are two gpus on one card. So this would be dual sli but on one card.
I think there was a guy on yt, Fully Buffered, who made a YT video about that exact thing haha
@@heylookthatboi which are 4 GPUs total. If 2 GPUs on one card is still SLI, 2 dual cards is quad SLI.
As far as I’m aware Warframe is pretty much the last massively played multiplayer game getting modern updates that still supports SLI out of the box.
I played with the Mars 760x2 which to my surprise managed to do better than my GTX 980. I was shocked! Really want to test more SLI set ups some day.
I do not want to run the cards together but run 3 games on one card and three games on the other using two monitors
You gotta test Crossfire too. The setting gives more flexibility for compatibility, as I've seen.
Yeah will do for sure
There is currently only one listing for a Mars 2 on eBay...the seller wants $3,649.04 for it
It's pretty sick, but man, that's way too much. I do have the 760 Mars boxed; I bet that would also go for a pretty penny. Bought it used for €125 a year ago. Won't sell it, but it's worth a fair bit more on eBay.
That Witcher section was great. Steve talking away calmly while getting absolutely whupped by a guard. Good times :D
Also makes me wonder how Crossfire works these days. I used to have a pair of Polaris cards back in the day. Maybe even see a pair of Vegas too, especially playing with the HBCC memory segment settings. Or even modern RX cards paired up.
I’ll have to test it soon!
SLI Skyrim, the way Todd Howard meant it to be played
Burn the BLOODY dog, WAIT!? Was that TES:IV?
Watching you play W3 made me think something was wrong with my GPU. I remember when that was normal play in 2008 when we would get a new expansion for WoW and wonder why my 7800GTX felt choked. Games always releasing to make sure we are constantly having to upgrade.
Never thought I'd hear someone talking about CrossFire and SLI again.
I did GTX 660 SLI paired with 16GB DDR3 and an AMD FX 8320 back in late 2014, but got rid of it in late 2015 when SLI started dying rapidly. I fondly remember playing Star Wars Battlefront on high at 1080p with 70fps on SLI 660s.
I'd love to get my hands on one of these. I was just getting into PC hardware when they came out, and this was the first generation of cards I was around for from the start. So cool!
5:13
All good and fun until you look at those power connectors, Holy Moly.
The only good SLI is Voodoo 2 SLI :)
Remember when one card suddenly stopped working? Frozen scanlines and working scanlines together! Oh the days! :P
5:30 That horse is like "Let's go already!" lol
Frame generation has pretty much supplanted the main utility of multi-GPU: more frames at worse input lag. 😅
true
That title is really something, sir. This kind of feature is fun and exciting to try, but because most new games no longer support it, it makes no sense to buy a second graphics card of the same model to try it. Still, we can all see how it does in 2024 from your amazing content. I wanted to try this, but when I think about it again, it's not worth it now; it would only raise the electricity bill. Again, thank you for making this video, sir 👍🏻
Think you could do a review on the ryzen 7 3700x? See if it still holds up as a 8/16 these days. Cheers mate
I had SLI GTX 780s back in the day. Then when the 980 Ti came out I upgraded to that, mostly because SLI was a pain to work with and also the 980 Ti was almost equal to my 2x 780s. That being said, it was dope to look at the 780s in one PC at the time. The 980 Ti died a while back; currently running a 4060 Ti 16GB, mostly because budget-wise I can't afford these crazy-priced cards. Plus the hype for mid-to-high-end cards is lame nowadays.
similar story here, but with two 770s and then I ended up SLI-ing the 980ti later on
No worries on the original Skyrim glitching; trying to figure out what's causing each glitch could probably be its own documentary series, so I think pointing at it and going "well, there's that, but the SLI does seem to be mostly working" is all you could do here.
You can adjust the Havok physics in the config file. People have lists of adjusted numbers for any given frame rate target.
Thanks for another fine video Sir, keep 'em coming please :)
Fun fact about the GTX 580: it's the best GPU for SLI under Windows XP. This could make a really fun retro machine with a Z87 board (seek out one with XP drivers by checking the manufacturer's website) and a G3258 ($10 unlocked dual core; more than 2 cores is useless for XP games) from eBay. I'm sure Crysis would play great.
@RandomGaminginHD I still got an old waterblocked Radeon 5970 sat on a shelf here
I did SLI once and I am glad it got discontinued. Double the cost, maybe get some increase. Maybe.
That's how I used to roll - I'd buy 2 high-end cards (780, 980, 1080, 1080Ti, 2080Ti) and SLI them for a great overall experience. But then with the 3090 and 4090 I haven't bothered, especially since 4090 no longer has a SLI connector.
I could have run 3090s in SLI, my 1200W platinum PSU is enough for that, but by that time SLI was pronounced dead already :(
SLI was fun because it was a glimpse into the future for us. Adding 2 cards together suddenly allowed for 4K and 5K resolutions well before 1440p even became mainstream (let alone 4K). Paired with Nvidia's DSR at the time SLI just ruled!
The tweaking was indeed a bit of fun. Making games use SLI when the default profile didn't, or fixing issues here and there for better gameplay - it was interesting, even if a bit annoying.
One thing I *don't* miss is the SLI microstuttering. I used to play Far Cry 5 and Witcher 3 a lot on SLI - both of those titles scale really well - but I immediately noticed how much smoother those games ran on a single 3090 vs 2080 Ti SLI, even if the reported framerates were actually lower! Microstutter is something I surely do not miss nowadays.
I remember when AMD had a Crossfire "light" where onboard graphics could work with the ultra-low-end graphics cards. I had an R5 240 card paired with some dual-core AMD chip that had integrated graphics.
I still have 2x HD 4890 in my collection, it's nice to play older games when Crossfire kicks in. Though I need a lot of volume from headphones as they are NOISY. :D
All of those problems with Skyrim are because of the frame rate, not the dual GPUs. It is possible to fix it by changing the fixed time step value in a config file. That's how I run Skyrim at 144fps on my 12400 and 3070. I'm surprised more people don't know about changing that here, and in Fallout 4, which you fix in the same manner. They just slog through at 60fps for some reason, or just deal with the too-fast physics. Weird.
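For reference, the tweak described here is usually given as the Havok fMaxTime value in Skyrim.ini. The values below come from community guides rather than official documentation, so treat them as assumptions to verify against your own install:

```ini
[HAVOK]
; Set the physics time step to 1 / target fps. Community-cited values:
; 0.0166 = 60fps (default), 0.0111 = 90fps, 0.0083 = 120fps, 0.0069 = 144fps
fMaxTime=0.0069
```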
I had 3 GTX 680s and a 2600K/5820K up until Elden Ring's release. Now I have a 4070 Super and a 12900K and they already struggle at 1080p in some games. I hate this timeline
The concept was obsolete until LLMs became popular, and consumers realized they needed a godly amount of vram to run the best open source models locally
I had radeon HD5850s crossfired for a number of years. I got the first one in my system new, and bought a second one cheap off Ebay a few years later. It just felt super badass to be plugging in a second GPU.
Later upgraded to geforce 970, and even though it was an upgrade, I felt kinda sad pulling out that crossfired combo that just looked so cool in the case.
Should have added another 970 in SLI 😅
I liked the Dual-GPU era a lot better than the Super Expensive Single GPU era..... but I don't miss all of the inconsistencies. At one point I had a couple HD 5870's in Crossfire, and a couple GTX 670's in SLI. Then I tried running a couple RX 480's in Crossfire, but the inconsistencies got even worse. I tried the GTX 590, and a couple GTX 680's in SLI too.
The framerate jumps were tolerable, but what I mean by inconsistencies is the 2nd card having 70% utilization in one game, and 30% utilization in another. Back then, MMORPG games were all the rage in my household, and dual-GPU solutions played like dung in those games. SLI and Crossfire were great for Battlefield 3 and Crysis 2 though. The other thing I loathed was the VRAM not being shared, and this confused a lot of people. A lot of people thought they were getting 2GB of VRAM when they slapped two GTX 560 or 460 cards together in SLI, but they were only getting 1GB in actuality. I know someone in the comments below claims it's stackable, but it's not. Some people were tricked because some monitoring programs and games would double the reported amount of VRAM to match the total of the cards.
It was from people slapping together two lower-end cards that the saying "a stronger single GPU is better than two GPUs" came from.
Hey, we can't forget, this has the same amount of Driver support as the RX-580, am I right?
I previously had a dual-GPU card, a GTX 690 specifically. I didn't really use SLI most of the time towards the end of its time in my system, but using one of the GPUs for the game and the other for recording with NVENC in OBS was a pretty good experience, until driver support died off.
Actually been thinking about putting together a 6th gen Intel build with two MSI 980 Tis for some older games like IGI 1, IGI 2, older CoD games, Black Ops 1, 2, 3, Advanced Warfare, older Far Crys etc. Pair it up with some older gaming peripherals and a good old trusty 1080p 120Hz monitor, maybe 720p 120Hz if I'm feeling real nostalgic, and just have at it 😅
Skyrim's engine is capped at 60fps by default for most computers. I've tried to run it higher than that on plenty of hardware and the engine breaks after about 75-80fps. It's really weird, but 60fps in Skyrim is still plenty playable for me, and I'm sure for a game like that it should be for everyone. Hope that helps.
I had dual GTX 285s back in the day. It sounded like a good idea at the time but you had to keep up with it for it to run smoothly.
WTH? You didn't die profusely in the first two shooter games..you're slacking lol. Great Video.
😂 sorry I don’t know what’s got into me
@@RandomGaminginHD LMAO!! We all have our moments. I am sure you will get back on track of profusely dying in a grotesque military manner in game lol
Got an awesome experience with a couple of GTX 480s and then a couple of GTX 580s. Sure, some games wouldn't support SLI at all even back in the day, but those were games that didn't need much GPU power anyway, like the original SpellForce. It's sad there's no way to get some more FPS now, when an RTX 4090 is just not enough for all the settings maxed out.
the flickering could possibly be due to bottlenecking of some kind. I had something similar years ago with flickering textures on GTA V (i5 750, 16gb RAM, and some 3gb AMD Radeon GPU)
Good idea, but yeah I can't see SLI working very well nowadays. Most modern games have more support for a single, more powerful GPU over two weaker GPUs in SLI or Crossfire.
I had an SLI setup once
I wish I still had those Voodoo2s, but I do still have my Voodoo5, albeit it is broken
Had Crossfire and SLI since my 7970s back in the day (SLI for the 980 TIs I had for a while). Up until my current Radeon VII. If it supported Crossfire, I'd be rockin' 2 of them.
When you try SLI next time try Borderlands the Pre-Sequel. Ran at 4k with the Titan Z when I tried it and that was impressive. :)
Until recently Creation Engine games had their physics tied to frame rate, so uncapping it would make exciting and unexpected things happen in the world. I assume that's what's happening here
It was a great idea back then; unfortunately it did not have a great future with games.
I miss SLI and CrossFire... good old times! I remember one of the last AGP cards I bought, a GeForce 7800 GS, which was basically a native PCIe card with an onboard chip that "bridged" it to AGP. And the first Crossfire I had, an X1900 XTX and an X1900 XTX Crossfire Edition (yes, you had to buy different cards to crossfire them, and they were connected via a bulky external cable).
Haha for what it's worth, those Skyrim issues are from the framerate. I learned the hard way it needs to be capped at 60fps increments or you encounter all of those physics issues.
Linus did some testing with quad 980 Ti's running in SLI, it wasn't really doing much but it is cool though. just because 😅
I miss my triple SLI GTX 470s...
Never a cold room and the power company would send me thank you cards for my patronage, lol!
But seriously, it never affected my electricity bill. Go figure.
But I still have my 1080ti SLI setup and love it.
As for the W3 "Enhanced" woes, dump it and revert to the original for SLI smoothness.
I hear tell there are games out there that can use one card for PhysX and the other as a GPU.
I've never had something like this; my budget was never good enough back in the day. :D
I have an AMD HD 5970 with dual GPUs lying around if you want it.
3dfx Voodoo²(12MB) SLI
^Sry' Younglings, but that was(is*) my only experience... 😛
*Still alive in my FIRST build Socket 7 BEAST ♥
Curious, how much did the mars cost? From what I remember hearing, it was quite pricey to find.
Great video as always btw!
I think about $1500 new. This one was sent to me a while back though. I've kept it as it's one of my favourites.
@RandomGaminginHD Blimey, that's a lot. My local Tesco has a Mars for way less than that in the meal deal, plus you get a sandwich and a drink with it.
The utter chaos in Skyrim is because the game engine really hates being over a certain frame-rate. There's mods to fix it. =P
Yeah it sure makes for a laugh though haha
maybe you should retry Skyrim with the Mars II and the mods 🤔
Maybe you should test the Radeon R9 295X2 if you still have it.
Those PCIe x8 GPUs should go dual-chip, since they still use an x16 connector.
Ah, the Lucky Luke Skyrim experience. Starts with a Jolly Jumper. :)
This card was good. But the dual 780 version was the one I wanted.
Ah I’d love to own them all to be fair haha
Try disabling the SLI and instead using the second GPU as a dedicated PhysX accelerator in some of the handful of games that support it... Mirror's Edge, Batman Arkham, etc.
SLI was always a bit of a gimmick to be fair.
Yeah definitely more of an enthusiast option
Edit: AMD quote:
"Multi-GPU Configurations
For any hybrid-graphics configuration, AFMF 2 will use the displaying GPU for frame generation, allowing the render GPU to focus on the game."
That could be nice to redo with the AFMF 2 preview driver, as I read that the frame gen can use the power of a secondary GPU without dropping FPS when in use. I plan to keep my 6500 XT for that purpose when I upgrade in the future. Has anybody tried it out yet?!
Thank God that is a thing of the past, expensive enough these days......jeez!
Last time I did SLI was with the GTX 770, and it was the biggest waste of money. Never did it again after that.
Dual GPUs were the very first GPUs I had, an HD 6990. Good thing they've 'disappeared'. My favorite game at the time, The Sims 3, lagged like hell; I had to disable one of the GPUs. I was so heartbroken back then (granted, I was still a stupid high schooler). Performance was notoriously bad, and the price was equally bad (including the electricity bill). The best multi-GPU usage is probably combining your iGPU for low-load usage with a discrete GPU for high load, working in tandem flawlessly (similar to the small cores and big cores in a phone, I think).
Imagine if the 4090 had SLI support
What up Steve! Have a good day!
And you :)
I remember running 2 GTX 780 Tis in SLI; man, it was such a waste lol. It was already being bottlenecked by the 3GB VRAM at a time when games were starting to take advantage of higher video memory. Oh, good times! A lot of the games I played had limited support for SLI as well; it was more for bragging rights than anything.
Jeeez what an ABSOLUTE UNIT of a card!
It's a beast. It may even be bigger than the 4080 Super I looked at recently, and that's massive.
I have a laptop (Qosmio X305) with mobile 9800s in SLI. The advantage was so good for some games of its time.
The idea is good; the chosen hardware is far from it. If you want it to make more sense, do a video on multi-GPU in 2024: 2x RX 580 vs an RX 7600, for example. Dual-GPU cards are obsolete, except perhaps the HD 7990, which supports Vulkan and can perhaps play some mGPU games quite nicely. You're turning a good idea into mockery and oddity.
2080 Ti SLI was my latest 2x GPU rig... same amount of power as a 3090...
Gaming virtual machines can pool resources, including GPUs, but it's realistically only worth it if you want to play Cyberpunk 2077 with ray tracing at 200fps on 2x 4090s. It would be usable with less extreme choices, but the performance gain wouldn't be as rewarding for the effort.
How technology hasn't evolved to make dual-chiplet GPUs work better as if they were one is shocking; so many people would gladly buy a dual-chip GPU just like in the old days, even if it's more power-hungry.
It did actually; both Nvidia and AMD have done a lot to make multi-GPU setups run better than ever before. The problem is... no developer ever bothered implementing multi-GPU support, even in the form of a frame gen workaround (one GPU runs the game while another generates frames for it, basically doubling the available GPU power).
@@alexturnbackthearmy1907 yup, the best example is the old Deus Ex: Mankind Divided. Two RTX 2080 Tis match a single RTX 4090 at 4K.
In my mind it was a dual Radeon 580, until you mentioned GTX
..and I was thinking Voodoo²! Oh Hello grandson(or is it Great-grandson)? 😛
Will you make a Skyrim mod tutorial?
I still toss my mars 2 760 in my system to mess around. It's just fun to do so.
There's so much wasted potential for multi-GPU with all these high data speeds and large VRAM capacities we have now, which just keep getting higher, plus direct data transfer without waiting for the CPU. The hardware could do it better than ever before, purely through the PCIe slots, like with AI applications; and the chiplet designs they're coming up with would use multiple chips anyway. But it's all just been squandered.
I'll just stick with my GTX 680! It can even play The Division 2. With compromise, but not that much tinkering needed.