Two GPUs is doable and effective, but not via Crossfire or SLI. Lossless Scaling's frame generation can run entirely on the 2nd GPU, making the 1st one the render GPU, thus actually gaining 2x or 3x FPS. Toasty Bros should check it out; LSFG actually makes 2 GPUs in 2024 possible.
Edit: Wow, didn't think this would get upvoted. I might as well provide more info in case the Toasty Bros do see this. The setup is relatively easy, but there's a specific cable-to-monitor arrangement one must follow; it's pretty simple though.
One just has to make sure the display out (HDMI, DVI, DisplayPort, whatever) HAS to be connected to the GPU that's running LSFG. For example, GPU 1 does the rendering of the game or video (yes, LSFG works on videos too), while GPU 2 does the LSFG only. Every cable connected to the monitor(s) has to go to GPU 2.
After that, you just make sure in Windows 10 or 11 that you set GPU 1 as the main one, and in LSFG set GPU 2 as the one it uses. If you have two of the same GPU, just use RTSS or some monitoring tool to make sure LSFG is using the correct GPU, which the Toasty Bros are doing already. If they need more info, LSFG has an entire Discord.
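For anyone setting this up, a quick way to confirm which adapter Windows enumerates as which is to list them programmatically. A minimal sketch, assuming the third-party wmi package on Windows (the names and ordering shown are illustrative, not tied to this exact build):

```python
# List the GPUs Windows sees, so you can tell which index belongs
# to which card before picking one in Lossless Scaling's settings.
# Requires: pip install wmi (Windows only).
import wmi

for idx, gpu in enumerate(wmi.WMI().Win32_VideoController()):
    print(idx, gpu.Name)
```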
I'd be interested in seeing a video about this ngl, sounds like a cool concept
Actually had a question sort of along those lines. I'm someone who likes to have other things running on my second monitor, usually some kind of YouTube video or a stream. Doing so with some of my games puts my RX 6700 XT through the wringer and hurts performance.
What I was curious about is: could I grab something like an RX 6400, something super low-powered, and run my second monitor, and whatever apps I'm using there, off that weaker GPU, so my good GPU can focus solely on gaming? It'd be a lot of money to simply upgrade my 6700, and it does run the games I play perfectly fine when nothing is happening on my other monitor.
I'm curious if this is a viable idea, since I had heard about people throwing the Arc A310 into systems as a second GPU solely to use its newer video encoder, and it got me thinking if that would help my use case, as a small investment in something like an A310 or RX 6400 is a lot better than a huge investment in simply a higher-performing GPU.
The people who made the hybrid PhysX software that lets you use an AMD card as the main and an Nvidia card for PhysX are working on hybrid RTX software that lets you buy a budget RTX card like a 2060, pair it with a 7900 XT, and use Nvidia's feature set like DLSS and ray tracing.
Crazy that doing all that and buying two GPUs is cheaper than one top-end card. @@josephdias5859
can i get a gpu please
Me with 0 graphics card and 0 pc
L
I have 1 gpu no pc
F
I have a pc in a shopping cart, but not enough money to purchase😭
I have 1 keyboard
Back in the day I had dual 1070s with a 4770K. I remember a handful of games not working with both GPUs, but when games did work it was pretty cool. I played through Resident Evil 7 on a triple monitor setup with that PC, good times.
A 4770K for 2 1070s....
Solid Setup!!
There is a reason SLI and Crossfire don't work that well: neither Nvidia nor AMD would sell so many expensive graphics cards if they did. We need some geeks to sort out the problems for us, as SLI and Crossfire is a damn good idea. I was going to have a go at Crossfire with 2x 580s, don't think I will bother now though.
I had the 970s in SLI, so I learned the hard way that one GPU is best. Configuring SLI from game to game wasn't worth it.
@@mikeslemonade I have decided against it now. Just a pity both Radeon and Nvidia only think about profits rather than utilising what is available.
I remember running 3 RX 580s to get the performance of a 1080 Ti. Those were the days. Great video!
Same here 😂
Lol how did it run
@@smokejumper749 On the games that supported it, it worked great. Once I bought a 1080 Ti to replace them, I could see how the 1080 Ti was superior though, for sure.
Good old SLI days, I'm freaking OLD.
You're not old, and SLI is cool 😎
Did you run Voodoo 2 SLI? That’s where Nvidia got that technology from.
My friend with dual 980 Tis in SLI had the fastest PC at my high school.
@@Typhon888 Had that card with a SiS video card. I didn't have 2 of them, but even one was like a rocket.
I got into PC building right as it started to die, so I never got a chance to try it out.
PCIE Power.
Slot gives 75 Watts
8 Pin gives 150 Watts
6 Pin gives 75 Watts
So these cards with 2x 8-pins can draw 375 watts safely.
However... you are using daisy-chained power cables... but that shouldn't be a problem, as each cable should be able to supply about 400 watts.
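That math is easy to keep straight in code. A minimal sketch of the budget described above, using the commonly cited limits (75 W slot, 75 W 6-pin, 150 W 8-pin):

```python
# Per-card power budget: slot power plus each PCIe connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def card_budget(eight_pins=0, six_pins=0):
    return SLOT_W + eight_pins * EIGHT_PIN_W + six_pins * SIX_PIN_W

vega = card_budget(eight_pins=2)
print(vega)      # 375 W safe draw per card
print(2 * vega)  # 750 W for the Crossfire pair
```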
I've had issues with Vegas on daisy chain, and some not. I've had a fair amount of them lol.
Time to kick it up to some quad-SLI next!
the best
@@DanielCardei Hey there what's up Daniel? 😀
I'm waiting for your RX Vega Crossfire.
There are only a handful of Intel mobos, and even fewer for AMD, that have more than 2 PCIe x16 slots, stock or custom made.
The lesson here, kids: if you missed out on the SLI/Crossfire wave, you didn't miss much outside of relaunching your games and constantly reinstalling and wiping your drivers with DDU!! The only SLI config I had no big trouble out of was the 560 Tis; once the cards demand more power, oops!!
@@bornwisedistruction so far my experience with SLI and Crossfire has been pretty seamless. Only problem I've had is power draw. I have maxed my RM1000 out a few times now with my current system configuration.
I ran a 3870 X2, two GPUs, no Crossfire needed. 😂
@@GoonyMclinux I used to call them the ice cream sandwich GPUs, don't know why my brain always processed them that way. If you look at them long enough, you want to take a bite out of them! 😅
SLI 6600 GT, SLI 8800 Ultra, and SLI GTX 260 😅
I ran Crysis with just an i7-4770 and a GTX 660. That rig should have blown my old system away.
A 970/1070 is a perfect match for that processor, you could play a lot of games! I love that processor; it was in the first real PC I modified.
Mate, that's a pretty powerful PC compared to what Crysis was supposed to be run on.
I ran it with an i7 920 and 2 GTX 285s in SLI. Couldn't do 1080p max.
@Typhon888 such a cool setup idea
Mine: i7-7700K and a 1060 6GB.
Crossfire (AMD) and SLI (Nvidia) only required 2 PCIe x8 slots.
To get true full PCIe x16 + x16 Crossfire/SLI, it required enthusiast (Intel i9/AMD Threadripper) motherboards with a far higher PCIe lane count. The slots only need to be PCIe Gen3 to work, as no Crossfire/SLI exists on any PCIe Gen4 cards outside of enterprise-level GPUs.
Normal consumer-level chips only have 20+4 PCIe lanes to share with the motherboard.
Motherboards will no longer specifically state they support SLI/Crossfire... but if a board has 2 x8 slots, it will work.
For instance, the ASUS ProArt B550 Creator motherboard has 2 PCIe 4.0 x16 slots (x16, or x8/x8 when both are used).
SLI will work with this motherboard, confirmed with 2x GTX 1080s.
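The rule of thumb above reduces to a tiny check. A toy sketch (the slot widths are illustrative examples, not read from real hardware):

```python
# SLI/Crossfire rule of thumb: at least two slots wired x8 or wider.
def supports_dual_gpu(slot_widths):
    return sum(1 for width in slot_widths if width >= 8) >= 2

print(supports_dual_gpu([16, 4]))  # B450-style x16/x4 -> False
print(supports_dual_gpu([8, 8]))   # B550 Creator x8/x8 -> True
```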
I sold my Vega 64 for $900 during the crypto boom and bought one of the low end AMD cards at the time and used the profit to upgrade some other components. Hearing you got two of them for $200 tells me just how different the market is now.
Dual-running an RX 6600 and an RX 6700 XT in the same system for LLM research and frequent gaming. AMD is great for these scenarios; it was seamless for me (I remember the SLI/Crossfire old days, it wasn't always this easy).
I run 8B models on my 6600 XT all day... got my own finetunes I made of Llama 3.1. This is on Linux though, as ROCm is a PITA to get working correctly on Windows. Do my gaming on Linux also.
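For anyone curious what that looks like, here's a minimal sketch of loading a Llama-class model with Hugging Face Transformers and letting it spread across whatever GPUs ROCm exposes. The model id is a placeholder, and it assumes a ROCm build of PyTorch plus the accelerate package (on ROCm, torch's "cuda" device maps to the AMD cards):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-8B"  # placeholder; swap in your finetune
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves VRAM vs fp32
    device_map="auto",          # lets accelerate split across both cards
)

inputs = tok("Hello", return_tensors="pt").to(model.device)
print(tok.decode(model.generate(**inputs, max_new_tokens=32)[0]))
```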
Does it bring any perks for gaming, or does it just utilize one card at a time?
I used to run 2x GTX 660s in SLI back in the day…
Older versions of Fallout 4 worked rather well, and the non-Special Edition of DMC4 is great with multi-GPU setups 😊
It's actually insane how good the quality and the release frequency of these videos are, keep up the great job :D
I have 2 Vega 64s in my system, ironically. I have already tripped the breaker multiple times and have maxed out my RM1000 on more than 1 occasion. I even fried an outlet with this system. I have engineering samples, so they operate at a max of 250 watts while boosting to 1400 MHz, and you can't change the clocks either. The retail BIOS has absolutely no limits in place, so I can make them pull around 400 watts each if I wanted. Funny enough, the engineering sample BIOSes actually score higher in benchmarks.
I was hoping for the 7970 XT but gave up and got the 7900 XTX.
@@sasquatchcrew Leaks are coming out that they are releasing refreshes under those names, being the 7950 XT and 7990 XT.
Back in the day I had a Sapphire 7970 when it came out; years later I got a Sapphire R9 280X (pretty much the same chip/card).
I ran them both in Crossfire without a problem.
The only problem, however, was that Crossfire, when it actually worked in a game, was often plagued with rather bad frametimes.
In some games it worked quite well. One I particularly remember was Metro 2033 (the original). It ran so much better in Crossfire than with 1 card only.
But in general it was more or less a waste. Replaced them both later with an RX 580 when it came out.
However... there was also the time when PhysX was quite a thing (at least in some games). Adding an Nvidia GPU as a secondary to my RX 580 allowed me to use hardware PhysX while gaming on the RX 580 (shoved in a 560 Ti to make that happen).
I had 2x 7970s in CF, replaced with one Nvidia 680... same performance. Added a second 680, then replaced with one 980 Ti... same performance. Replaced that with a 1070 Ti, and the story ended with a 2070.
Not much of a gamer, but back in those days I had 4x 7990s paired with the i7 980X (6-core, hyperthreaded) and 24 gigs of triple-channel RAM (DDR3 1866) that worked awesome for mining Bitcoin etc. That rig finally gave up the ghost about 6 months back; it ran with a 33% overclock for over a decade. Looking forward to the parts for my new-to-me 8-year-old rig to arrive. It will be an upgrade to a 22-core Xeon with 128 gigs of RAM and an Intel Arc or two for the GPU side of things.
I ran 7300 GTs in SLI back in 2007; it actually was pretty great for the time.
1 8-pin can pull 150W max. That's 300W per card, plus 75W via the PCIe slot = 375W.
Double that for the two cards = 750W.
Edit: the official TDP is 295W, so that's 590W in theory.
I think the crashes are due to the GPUs spiking at high wattage and causing the power supply to trip.
The Crysis Remastered games disable Crossfire in the later drivers; you will have to use older drivers to get those games to work.
Love to see KF2 being played, I'm still playing it on my i7-4790/1070 rig.
I’m excited waiting on the new one
It worked fantastically. I had 2 R9 390X GPUs in Crossfire right when they launched, and it was a great experience. Almost every game worked with them, and overall I had no complaints.
I used twin RX 580s for years. Most games worked well; I just used generic AFR in the drivers.
I miss those days.
I really wish both AMD and Nvidia would bring back support for Crossfire and SLI at the driver, game, and motherboard levels. I really would like to know what it is like to run current-gen GPUs at their maximum potential.
Apparently there's UALink coming out that will mesh GPUs through the PCIe ports.
Yes, you have to roll back the driver for them to fully work.
Does that mobo support PCIe Gen 4 for the 7600?
B450 has 3.0 so no.
I'd be willing to bet two 1080 Tis in SLI would give you better results, especially if you can track down a high-bandwidth bridge from the 10 series. I ran two 1070s back in the day, and the bulk of what I played supported SLI with good scaling. The big upgrade from the 900 series to the 10 series was that HB bridge; it smoothed out the majority of the microstuttering I had experienced with the 700 and 900 cards.
Apply new thermal paste to the GPUs, set a target FPS, and undervolt the GPUs if you can. Those cards run well undervolted.
Praise the modern GPU!
What? Why?
Nah, hook up 4 GTX 1080s.
No, praise Jesus
@@DavidLopez-vn4be Jesus only gets like 15 FPS
Try running each GPU with its own power supply to see if it's the power draw or Crossfire. The other possibility could be the motherboard as well.
How about "DUAL" RX 580 8GB 2304SP?
You guys should have done a Prime Day budget build, wheel-spin vid, maybe being able to use Newegg too.
What I would have liked to have seen is the PC running Killing Floor with Crossfire turned off, to show the difference in FPS between Crossfire on and off.
I remember taking the side of my case off to put a box fan on my 2x R9 290s. Talk about putting some heat out.
“Oh god, I lost a screw” 😭😭 I felt that in my soul.
When I did dual power supplies, both cards performed better, showing that a dedicated power supply just for the GPUs was best to let them fully unlock.
I used to run 1070 SLI back in the day just fine, with no issues, until more games came out without support, which made me choose to upgrade to a single card. Loved the performance I got when I had it though.
But with a B450 motherboard you are running one card at PCIe 3.0 x8 speeds and the other card at PCIe 2.0 x8 speeds... Wish there was a test video with at least dual PCIe 3.0 x8 speeds or even 4.0 speeds.
There is a switch on the card that changes the LED colors; it's not because of the Crossfire.
There's also a switch that changes the VBIOS.
I have a 56, and that was the first thing I researched about the card.
I'm here, love your videos. Soon going to custom build one of the 3050 builds you guys did.
I'd go with a 40-series card or wait on the 50 series; with how much games are taking to run, you're gonna want an upgrade and update path. Also, definitely go with DDR5.
@@MichaelGuertin-kh5no ok thank you
@@Natebate9000Dicamillo You're welcome, best of luck to you!!! May your frames be high and your ping be low!!!
Dude, buy a 3060, not a 3050 @@Natebate9000Dicamillo
I've had 2 Vega 56s for a while, and this @ 10:10 just recently started happening.
Also, I don't know if it's related, but my PC won't work with 2 monitors. I had a dual monitor setup in the past, but it randomly stopped working; since then I've swapped monitors multiple times, but after a few weeks my PC would stop displaying to the new ones. After some time I just gave up and stuck to using one monitor.
I've been getting into game dev and debating if 2 GPUs are a viable option, and whether I should carry these 2 cards into my new rig or simply start from scratch...
First issue: the second PCIe slot is electrically x8.
That's normal for mainstream platforms
I used to run Battlefield 4 with dual Radeon HD 7850s on the Mantle API... haha... remember Mantle?
Who’s the first one loaded in?
I had two HD 7870s paired with my i7 980X back in the day; was a cool rig.
Can I do 2 Radeon 580s?
In addition to other comments about lossless frame gen... you can also pair an older-gen Nvidia card with a newer one and, in the Nvidia Control Panel, make the older card a dedicated PhysX processor. That removes load from both the CPU and the render card, which gives a noticeable uplift in titles that have physics rendering. It also has the benefit that a low-end, old-gen card is still usually a better PhysX card than your CPU is.
Nothing beats the cool factor of SLI/Crossfire in PC builds; I'm thinking of getting myself some Vega 64s for the heck of it.
Great content as always.
Sure, a 4090 is more powerful than five 1080s, but it costs more than twenty, so if there were support for it in games now, it could be a value. IIRC, one of the big drawbacks with SLI and Crossfire was that you were limited to the VRAM of a single card, as they only paired the GPUs themselves for processing, and the usable VRAM was whatever the primary card had.
The 1080 was £575 at launch. 20 x 575 is 11,500, so you are not even close.
I always wanted to dual up my 1070, but by the time I could, it was an outdated technology 😢
I still have a PC with 2 1080 Ti cards in SLI. I've gotten Metro Exodus and Red Dead 2 to run in SLI, plus just about any DX11 game and some Vulkan games. The big problem is DX12, as it needs developer support to work in SLI, like what DICE did with Battlefield 1... it ran beautifully in SLI with DX12 selected, but that's the only DX12 game I've come across that works.
Still running a 1050 Ti with fans that don't spin, from a friend's old broken PC.
Any power supply can be jumped by putting a paper clip from the green wire to any black one, and it will turn on. You can use two smaller power supplies in a pinch. Done it many times.
The issue with everything just dropping out is called PCIe Surprise Link Down. One of the ways to fix it is to use the modded driver.
Betcha one of those old cards (at least) is bad. Try running them each individually. Crossfire/SLI was always problematic but those hard crashes are something else.
Finally, a Micro Center near me.
According to the power supply connector overuse definition in SilverStone's warranty information, a single PCIe 8-pin cable and connector's maximum current rating is 12.5 A, which is 150 W (+12 V × 12.5 A).
Early Mac Pro setups were specifically designed to run with dual CPUs and GPUs. My daughter's (from her college days) came with dual Xeons and dual AMD GPUs (don't recall which model). Running Boot Camp for Windows, there was literally no PC game it couldn't run. Of course, with current-generation GPUs, Windows, and games, it's no longer practical.
How would it perform using Blender?
This ALL depends on your motherboard (x4/x8/x16) and what slots you use.
I had 7970s in Crossfire back in the day, and it was so worth it 💯
My last dual-GPU setup was dual EVGA 8800 GTs in SLI.
I had one as my first GPU.
I've done this with 2x R9 295X2s in quad CrossfireX. I feel you; it's an achievement when it's working.
heyyyy Daniel 😎
English or spanish
@@danielivanov930 👋
@@danielivanov930 ✌
Vega 56 owner here. My card can pull 400 watts-ish. PCIe 8-pins are rated at 150 watts each; these cards have 2, plus 75 watts from the PCIe slot, so 375 watts together. Anything past that is a risk (PCIe slots are over-engineered, though I still wouldn't push it past that). Your cards are both extremely starved at 200 watts and are running at about half of what they could each pull without an undervolt.
Your black screen is more than likely caused by the starvation:
a. Set the power limit to plus 50.
b. Undervolt.
c. The HBM memory tends to begin failing over time; make sure you load the cards individually to test this.
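If you want to sanity-check the starvation theory on Linux, a minimal sketch that reads the amdgpu hwmon power sensors is below. The sysfs hwmon layout varies per system and driver version, so the glob pattern and file names are assumptions to adapt, not guarantees:

```python
# Print each amdgpu card's reported power draw and power cap.
from pathlib import Path

for hwmon in Path("/sys/class/drm").glob("card*/device/hwmon/hwmon*"):
    avg = hwmon / "power1_average"  # current draw, microwatts
    cap = hwmon / "power1_cap"      # power limit, microwatts
    if avg.exists() and cap.exists():
        card = hwmon.parts[4]       # e.g. 'card0'
        print(card, int(avg.read_text()) / 1e6, "W of",
              int(cap.read_text()) / 1e6, "W cap")
```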
Ran Nvidia SLI all the time, never had any problems in any game, and SLI made a lot of difference in frame rates and performance. But then again, you could just buy a card that was equal to the performance of your two cards in SLI. I concluded that this is how Nvidia figured it anyway when they came up with SLI: just to sell you another card, or the expensive card to skip SLI. All I can say is it was fun doing it; I even ran 3 Nvidia GTX 260 cards at one time, and 2 1080 Ti cards that ran as well as or better than one 3080 of today. These days two cards in one rig is just not worth the trouble or money. Just get a 4090 and fly.
0:01 Hah, I don't have 1 graphics card, I have 0! Integrated Graphics! I am better! I am smarter!
If I remember correctly, the 7900 XTX supports Crossfire... though I don't think many people have tried. I could be wrong though.
Why not test 6800 XTs? They Crossfire also.
What if you wanted to run more than 4 screens? Could you still run dual GPUs without Crossfire?
I imagine they badly need to be re-pasted; I bet the junction temp is high.
I ran dual GPUs for years with Nvidia: 780 Tis, 980 Tis, then 1080 Tis. I never had these issues on the Nvidia side. It either worked or the second GPU sat idle; never had lockups like these Vega cards are doing.
How about 2x 2080 Tis over NVLink? Curious minds!!
Thx for the great content! 🎉
LTT already did 3090s
Undervolting in MSI Afterburner helps keep the voltage from going very high.
There is a point if you run a Ryzen 8000 with an add-on card alongside the iGPU.
In October of 2009 I built a computer around the AMD Phenom II X4 965 Black Edition, which was fairly new at the time. For graphics cards I ended up getting two XFX HD 5770s, which were new at the time, so I could run them in Crossfire; each GPU cost $400 US. For storage I got two 1TB WD Blacks and ran them in RAID 0. I can't remember which motherboard I had, but I do remember I had 8GB of RAM.
I clearly remember Crossfire being hit and miss when it came to games, to the point that some games would flat out crash on you if you were using Crossfire, and you had to disable it to play. In the end, Crossfire was more of a headache than it was worth. I ended up using that system for like 6-7 years if I recall correctly; I stretched it as far as I could.
Around a week ago I found one of the XFX HD 5770s I ran back then. I just ended up chucking it in the bin, but perhaps I should have installed the card in my computer to see what it could do in 2024 with my i7 9700K.
Thank you for doing this video, I was actually thinking about doing this and spending around $600 instead of saving up for a really expensive GPU.
Man, my first and only SLI build, circa 2015ish, was a 4790K with 32GB of Corsair Vengeance and 2 970 Windforces. I still have all but the 970s sitting in my closet lol. Upgraded the GPU to the 2080 Ti back in 2018; it's still in my new build, and I should be upgrading once the 50 series is announced.
9:14 new meaning to “whistle while you work” 🤷🏾♂️🤭
I have been using 2 GPUs my whole life. Recently I've had a 7800 XT and a 6750 XT. I do a lot of work in Adobe products, so having 2 GPUs and 3 monitors makes it easy: I can have one GPU dedicated to Adobe and one for gaming on 1 monitor via that card's output. While gaming, I recently realized that both GPUs share a certain amount of usage: the dedicated gaming GPU runs at 90 percent, overclocked, and I notice the other GPU at 30 percent usage. For example, in Cyberpunk 2077 on ultra I get 330 FPS all the time while Adobe products are running in the background. PC RAM is 64GB Kingston Renegade; the CPU is a 5800X, overclocked. Almost all the games I play are over...
I bought an SLI motherboard back in the day to be able to run 2x 560 Tis as a budget GPU upgrade. Can't remember having any issues.
Can I use a 2nd GPU just for more displays?
I remember having two 1080 Tis; wasn't really worth it. Good content, was wondering for most of the video why you were looking at the ceiling/past the camera though.
I used to Crossfire 2 R9 290Xs with a water-cooled 8-core FX chip; it was pretty awesome back in the day 😎
Does it still crash like that with the latest drivers?
10:15 Still happens to AMD 6000-series cards. Has happened to my 6950 XT a few times.
Me doing Crossfire between integrated graphics and an R5 240 😞
I remember in 2009 I bought an i9 with 16GB of RAM and a 256MB card... and games were so slow. And everyone was telling me to go Crossfire... I never did, but that was the thing back then. I don't see anyone doing this anymore.
My guess is the thermal interface is failing and the cooler is clogged up. Can't wait to see Watt it was!
I'll see myself out.
Best one I had was SLI GTX 260s, running on an MSI P35 mobo with a C2Q SCLAR. If you're old enough, you know what that GOAT CPU is.
Back in the day I remember running Crysis 1 with my old Radeon 9800 Pro 128MB / Pentium 4 3GHz HT / 256MB DDR, then a GeForce 8800 GTS 640MB, then a Radeon X1950 XTX 512MB, and it ran pretty well with that card at 1024x768 on medium/high.
Another question: do you guys turn off the iGPU in the BIOS when you have a dedicated GPU? Example build: 7800X3D/7900 XTX.
No, no one does this
@@BeastyTank402 That's not true at all
@@DannyB1291 It's pointless, but OK. Why ask if you know?
@@BeastyTank402 That's also not true. I'm just asking a few knowledgeable YouTubers to see if they do or don't, as a poll type of thing. Not advice.
Crysis 1 original on PC doesn't launch with SLI or Crossfire. On my system it launches on 2x 580s but crashes before the 3D map loads up. DX9 - DX10 is what it is using.
The problem with this video is the MB. As it is B450, the 2nd PCIe slot runs at x4, so you have 1 card at x16 and 1 at x4. It would have been better to use an X470 board, where both PCIe slots are connected to the CPU. BTW, even today multi-GPU is supported on AM5; the issue is that the last Crossfire driver is super old.
What if I told you I have no GPU?
Fun fact: DX12 multi-GPU on 2 RTX 3090s was doable last year. PS: why did we do it? Because we wanted to see if we got an improvement in AI computing (spoiler: you are better off using them individually). Then we were curious about the classic question: can it run Crysis? Yeah it can; on BF3 we had 400+ FPS at 4K, all ultra, with both GPUs at 84-89%. Was it painless? No, but I'm not sure if it was the system's fault for not supporting it or our "tinkering" for our AI projects.
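On the "better off using them individually" point, the usual pattern is to pin independent jobs to each card rather than coupling them. A minimal PyTorch sketch (the matmul is a stand-in workload, not their project):

```python
import torch

def run_job(device: str) -> float:
    # Dummy compute pinned to one GPU.
    x = torch.randn(4096, 4096, device=device)
    return (x @ x).sum().item()

if torch.cuda.device_count() >= 2:
    print(run_job("cuda:0"), run_job("cuda:1"))
```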
The trouble with Micro Center is that every time I want to mail order, it can't be shipped. In-store only.
I am in the process of building a retro XP gaming machine with a QX9770, 4GB of RAM, and 2 8800 Ultra GPUs in SLI. SLI was a lot of fun with the 8000-series Nvidia cards back in the day.
Use the Pro drivers; Adrenalin sucks with Crossfire. You'll lose a bit of FPS, probably like 1-5%, but the stability gain makes it worth it. I'm running two Radeon VIIs on an LGA 2011 board with Crossfire support. Works great for me; I run both GPUs in most games pre-2014. I also use a virtual machine half the time, but that's why I built the system.
I don't think SLI or Crossfire will come back, mostly because developers have moved on from it. As someone who used SLI, I found it was dropped real fast when developers no longer saw it as viable; I was on 2 GTX Titans back then. I would never go back to it, or to Crossfire if it did come back; the hassle it causes for users and developers is understandable. This is a relic of the old 3dfx cards, which shows just how dated and flawed it was.
Interesting video on a footnote of graphics card history even so.
Me watching a PC video without a PC 😂
Great video guys
What model Zalman case is that?
Neo # ????
i3 Neo