Are SLI and Crossfire DYING?
- Published 25 Aug 2024
- Get an unrestricted 30-day free trial of FreshBooks at www.freshbooks...
Why have multi-GPU systems seen a sharp decline in popularity?
Linus Tech Tips Merch Store: www.lttstore.com
Follow: / linustech
Leave a reply with your requests for future episodes, or tweet them here: / jmart604
My roommate has an SLI setup. It fails on half the games he uses it on (crashes, stuttering, no performance gain, etc.) It's great when it works, but that's IF it works.
Which games are those specifically? From what I have seen, most modern games love SLI.
Christopher Kidwell I’ve used SLI since its inception, and I’ve used Titan Xs in SLI for the past two generations, but to be honest the benefits are... questionable. Having an additional card back in the day was the difference between running Witcher 3 at 4K max settings smoothly or with a jittery sub-40 frame rate, but nowadays it’s just not worth the hassle. Yes, with Shadow of the Tomb Raider I’ve seen a ~110% increase from a second 2080 Ti (yes, that’s more than double), but other DX12 games (and you need DX12 for ray tracing) don’t support SLI at all.
It can be done, as Tomb Raider proves, but nobody bothers.
And even before DX12 the gain was usually around 20% if not less unless the game was specifically optimized for SLI.
Christopher Kidwell so basically you’re expected to pay another $1200 (or whatever 2080Ti costs these days) for one game... pretty questionable investment.
And don’t forget that some games, like Assassin's Creed, actually lose performance with SLI.
If implemented correctly, DX12 multi-GPU can scale 1:1, much better than SLI ever did, but for developers it’s not worth the effort when less than 1% of players use it. And for players it’s not worth it because nobody supports it.
@@christopherkidwell9817 most modern games get a ~20% improvement in frames, at double the cost. Many games get negative scaling: when the game engine uses persistent effects from previous frames, it wastes a shitload of draw cycles pulling them from the memory of one card to the other before it can start drawing the next frame. This is one instance where significant negative scaling occurs.
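The negative-scaling case described in that comment can be sketched with a toy model (illustrative numbers only, not measurements from any real game):

```python
def afr_frame_interval_ms(render_ms, copy_ms, gpus=1):
    """Average time between finished frames under alternate-frame rendering.

    Toy model: each GPU renders every n-th frame, but must first copy any
    persistent per-frame data (temporal effects, shadow maps, etc.) from
    the other card's memory before it can start.
    """
    return (render_ms + copy_ms) / gpus

single = afr_frame_interval_ms(20, 0, gpus=1)    # 20.0 ms  -> 50 fps
light  = afr_frame_interval_ms(20, 2, gpus=2)    # 11.0 ms  -> ~91 fps
heavy  = afr_frame_interval_ms(20, 25, gpus=2)   # 22.5 ms  -> slower than one GPU
```

Once the inter-GPU copy takes longer than the render itself, two cards finish frames more slowly than one, which is exactly the negative scaling the comment describes.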
@@christopherkidwell9817 Two examples of games we tried recently were Insurgency: Sandstorm and Titanfall 2. Yes, you can google some fixes and workarounds, but it's not always as simple as "start the game with SLI enabled, and go". Sometimes you have to go to the Nvidia forums and look for different game profiles, and sometimes they work, sometimes not.
Dying? No?
Died over 4 years ago.
@singularon1 oh so you will be that "back in my days..." kinda guy, about pc gaming.
@singularon1 Sure you can still install 2 or even 4 cards if you want to. Just plug in 1 of them.
No, it died with 3Dfx
@@flxdrv5020 back in my day it was fun getting the new Start button and not having to type at the prompt
You don't need it anymore, so why would we... it's always been for maniacs who need the highest settings.... The average gamer does medium.....
SLI and Crossfire are dead, so they raised the price to 2x.
nice one xD
Isn't it happening already?
more so since they needed a bridge for it... nowadays we don't. I don't see the point, since ray tracing is also useless with SLI... It's more for commercial use, for movies, now...
I'd laugh if it wasn't true
And they accept 60fps on a 144hz screen with ultra settings😂
nonono, you're forgetting the most important possibility: dual *RGB* gpu
RGB gets 50+ frames per SECOND OH MY GOD MASTER RACE
Consoles are so weak beacause no RGB . Ssssss
dual rgb heat sources
You know you're a member of the master race when your RGB needs its own GPU
That means like... An extra... 600 frames per second?
Tbh a pc with 4 graphics cards looks sick af
Nicholas Ryabchuk your pocket/wallet must be thick asf, right?
@@fi5075 I never said I had that, my laptop struggles to run minecraft at 30fps without smooth lighting
@@tarkfarhen3870 BURN
Just throw em in there for show and don't plug them in
Agreed, and a temperature that high might just need a trip to the hospital
You have to look at reality,
Going for a 2x Rx 580 crossfire ..
Dx11 left the chat.
Vulkan API left the earth ..
T-T
Ayan Mondal same, but I don’t play many new games anyway
AMD has left the chat.
Vulkan API and DX12 are actually growing. Many devs are adopting them slowly, just like DX9 was also adopted slowly back then. It's just the current development tools; slowly, people will adapt as cross-platform becomes a thing.
🤣
Ayan Mondal AMD sucks
Sli and crossfire are dying
Me : i cant even afford a single GPU
Ahaha i feel u sir.. Just have ryzen.. And just use low, med settings
Everyone knows that's not why the price doubled, since SLI is basically dead.
The 750 Ti is the best budget GPU! It's best if you can arrange a used one.
@@bluelivesmatter8502 750 ti? That can barely run Witcher 3 at 20 fps. I think a 970 is one of the best budget cards and used ones are only about $20 - $30 more than a 750ti with about 200% performance increase. But each to their own :)
@@bluelivesmatter8502 lol, I cry with laughter at my 750 Ti PC
Speaking of dying... *Grim Reaper*, our sponsor, providing a fast and reliable transition to the eternal state
Hahahaha
They figured out how to make both GPUs run as a single unit, but they haven't given it to consumer-grade gaming hardware.
Maybe SLI/CrossFire would work better if they enabled that feature on consumer hardware.
Not to mention games probably wouldn't need to be coded differently if all the GPUs were seen as just one powerful GPU.
Memory needs to be shared across both cards. That's the major bottleneck of SLI technology. How frames are delivered in a video game is different from how compute-heavy applications work, where one could break up processing evenly across GPUs without having to pull from the other card's memory.
@@RationalAndFree I was thinking of Unified memory, but sadly you need something to coordinate both GPU dies on a single card, and this problem has been worked on for more than a decade. Because even with shared Unified memory where it's treating both dies as literally one thing, you got to remember that a GPU also holds cache on the die. So, you would need a bus that is fast enough to be as if they are the same unit, otherwise, the lag between the two will be great enough to drop your frames down. Let's look at the Ryzen Threadripper 1950X. To access memory locally to the CPU is 70 ns. To access memory across the Infinity Fabric is 120 ns. That's just less than double the time it takes. And if we are looking at the new stuff, such as the 3900X, 3950X or the new Threadripper 3 lineup and Epyc Rome, we are looking at an Active Infinity Fabric that deals with congestion, but still has a lag between the Nodes on the silicon when crossing the Infinity Fabric. So taking that information, you are talking about reducing your frame output by trying to just do 2 GPU dies on a single card natively.
Whereas with compute applications, you are not necessarily hindered by time sensitivity, and it's very possible to schedule out workloads that do not require unified parallelism.
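The latency figures quoted in that comment make for a quick back-of-the-envelope check. This is a simplistic weighted-average model that ignores caching and congestion; the 70/120 ns numbers are the Threadripper 1950X figures from the comment itself:

```python
def avg_latency_ns(local_ns, remote_ns, remote_fraction):
    # Weighted average memory latency when some fraction of accesses
    # has to cross the interconnect to the other die's memory.
    return local_ns * (1 - remote_fraction) + remote_ns * remote_fraction

uniform = avg_latency_ns(70, 120, 0.0)   # 70.0 ns: all accesses stay local
split   = avg_latency_ns(70, 120, 0.5)   # 95.0 ns: half cross the fabric
```

Even a 50/50 split costs about 36% extra average latency, and per-frame GPU rendering is far more latency-sensitive than batch compute, which is the comment's point.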
My GeForce GTX 750 Ti has handled every spreadsheet I’ve thrown at it! Also no problem with 1080 video processing.
I just bought one for 40
Mine is the long card with two cooling fans, $75 new, 3 years ago. (There is a short card version, one cooling fan, don’t know if there is any performance difference)
But can it Load Clippy without crashing windows
Load clippy? Yup no problem loading it, running it? I’ll get back to you on that.
Buy two video cards for $1500 or one for $1500. Seems logical to me.
Agreed.
There are so many companies not optimising their games for dual-GPU setups.
You won't get even close to double the performance.
A single good card is better than 2 mediocre ones.
Graphics cards were a hell of a lot less expensive in the past, even factoring in inflation.
Many would buy the most high end card they could afford, then buy a second one down the road, which allowed one to sit with your set up for a bit longer than sitting with a single card.
At the current price of the cards, and the ~20% performance improvement on average, it no longer makes sense. Buy a single card and sit on it for 4-5 years as the graphical power being delivered isn't improving at the rates it was in the past.
You could easily do a powerful Multi-GPU setup for just 500$, but that was before the GPU price hikes. Imagine being bullied so much by Nvidia that people believe 400$ is no longer high end for a single GPU.
@@sulphurous2656 remember that AMD chased right behind Nvidia with price hikes. Nvidia isn't the only one who's raking everyone over the coals.
@@sulphurous2656 what kills me is that I paid around $300 for a GPU not 2 years ago and it lags badly in video editing nowadays. A CPU is good for much longer, but I think that's a hallmark of an overpriced item.
IMO 95% of the issue is the lack of software/programming work; much like the lack of multi-core support, the lack of multi-GPU support is common.
It's a vicious circle: developers don't optimize their games for dual GPU because most people don't have it, and most people don't have it in part because developers don't optimize their games for dual GPU.
It's why my Haswell Devils Canyon 4790K is still relevant today. Lazy game developers. Also, because most people are broke, my GTX 1080 will be still relevant for a few more years.
Loo
A lot of games in 2019
Are not even
Optimized to run on a single gpu haha
I feel like this is meant to be an acrostic. Could you make it many games instead of a lot of games?
I was excited about SLI and Crossfire back in 2009 for multi-monitor setups... Now it's 2019, and 1 GPU can do all of that.
Love the pop-out effect at 0:23
nice!
The industry needs another Crysis. I hope that’s cyberpunk or red dead pc, but honestly we just need another “omg that looks amazing it justifies an upgrade” type game
sorry, but we are already deep, deep in the land of diminishing returns and max perceptible prettiness. Improving graphics past the point we are at today has nothing to do with fidelity and raw power, but rather skilled art design, texture design, lighting and graphics tricks. Most people can't tell the difference between high and max, or ray tracing on, and it makes absolutely no difference to their play experience (it looks great!) except that it runs worse.
I'm from the future. MSFS 2020
What's the point of SLI? The Human eye cannot see more than 2 GPUs...
Maybe different purpose. I mean some dual GPU for computing is reliable rather than for gaming.
Raihan Wira Danurdara r/ woosh
Your comment makes no sense. "The human eye cannot see more than 2 GPUs" means the human eye can see 2 GPUs, and SLI is 2 GPUs. Boom, facepalm.
But the purpose of having 4 gpus is to heat up your room...
Only for winter
My crossfire rig will live forever!
I miss multi-GPU. I had 2x GTX 460, then 2x GTX 770, then a GTX 1070. I used to just buy a second one on the used market when the new generation came out, for a nice boost. But alas, as you see with the 1070, it was no longer needed. I did upgrade to a 2070, but it basically does 1440p 144Hz everything, so for my monitor it's currently pointless to try SLI anyway (not that Nvidia allowed NVLink on the RTX 2070).
Pretty much this.
ayyyyy, i had a 770 sli set up too. only wish i had bought the 4GB variants instead of the 2GB ones
In gaming, it seems that multi-GPU is, in fact, dead. For other things, it's far from dead.
well, i plan to use it for streaming, and i stream games, so i hope i will figure out a way to dedicate a GPU to each task without making gaming impossible
Hey this is great info but real talk when are people going to start addressing how multiple graphics cards can benefit editing workloads like vfx/fusion? And why does hardly anyone do tests on Davinci Resolve??? Asking for a friend.
i can answer that for you...No..they are not dying...they are already dead
NANI?!?
I agree with your video's point that it's the code that matters most for multi-GPU use. I use FCPX to edit videos, and unlike other editing suites it is multi-GPU aware. I have run massive tests and found that two RX 560s perform about 10-15% better, even though on paper they equal less performance than one RX 580. They also run cooler and more efficiently, and I won't be as bummed when they burn out from being used all the time to encode video effects.
As long as code is written well to use parallel computing, multiple CPUs/GPUs will benefit the end-user experience. Good video you all made about this topic. Make some more about this if you can find enough content to do so. (:
I remember when my friends used to flex this in the days of xbox, where are we now. Haha
I'm glad I could build my sli set up, It does look beastly compared to a single GPU.
One of the things that was nice, though, was that if you had a GTX 1080, instead of dropping $750-800 on an RTX 2080 Super you could get a second used GTX 1080 for ~$300, skip the hassle of selling your old GPU to afford a new one, and get theoretically better performance in supported titles anyway. And as you said, DX12/Vulkan is actually also supposed to improve multi-GPU scaling (and allow for mixed GPU setups, so theoretically I could throw a similarly performing but newer RTX 2060 Super in there if explicit multi-GPU were more common, and get RTX support too!). The issue is those are entirely based on titles supporting it, and it is in fact becoming far more rare, sadly. I'm just waiting to see if CDPR has SLI/DX12/Vulkan multi-GPU support for Cyberpunk 2077 (and how the game in general performs); if they do, that will be my deciding factor, because that's the ONLY game rn that I care about. It also doesn't help that Nvidia doesn't allow voltage tweaking anymore, otherwise I'd overclock my GTX 1080 with more voltage. Mine's hitting 46C at full load in a custom loop, and that's overclocked at 2114MHz core (also debatable whether it would make a difference with Pascal, but still...). But that's a whole different topic.
Ah, I remember my old GTX670 and GTX980 SLI setups. Wouldn't even cross my mind now.
ya, same here. Even tho I haven't personally had more than one GPU in my systems at any given time, I do remember watching loads of videos back in the day when it was a big thing; even single dual-GPU cards were a big thing. Even then, tho, I had a hunch it wouldn't last long, as I knew cards would just get better to the point where having more than one GPU in your system was pointless.
I use the Matrix for all my gaming needs. Ever since Neo struck an accord it's been good.
I wish multiple-GPU setups worked by having a dedicated sister card designed specifically to be the secondary card in an SLI/Crossfire setup. Pack as much compute performance as possible onto the primary card, which will quite happily run standalone and do all the heavy lifting, but then include additional features like real-time ray tracing support, or additional compute units for separate tasks like simulation workloads, on the sister card. I doubt that's even possible, but twin-GPU setups look sweet.
I'm still rocking 2 1080 Ti's. Though mostly I keep SLI off because I mine crypto. Usually if I'm playing a game, I turn off mining on just one card and keep mining with the 2nd.
I have two GTX 1080s; I run a virtual machine on each, so we can play LAN on it.
Just give me a three-way NVLink so I can play most games at 8K 60p, damn it. Titan RTX SLI is so close to a perfect 8K gaming experience.
I can't wait for the DP to HDMI 2.1 adapter
Dying implies that it's still alive.
Awesome video as always guys, keep it up 👍
I just would love to see more SLI/CrossFire support for VR titles, they would really need it and also profit the most.
This is actually the ultimate use case IMO. It's a pity so few VR titles take advantage of it.
Yup, you said it.
Ditched my ex-ex-680 SLI setup because SLI never was developers' priority. One time, I did some tests in Splinter Cell Blacklist and the game worked better when using only one 680 over two. Meaning, I had more fps while using 1 GPU over two.
And it wasn't the only game that behaved this way.
After that, I switched to a single GTX 980 and have stuck with mono-GPU since then!
When I built my PC over 3 years ago I knew that shit was already dead.
RTX desperately needs solid multi card options.
maybe a separate, cheaper card for ray-trace rendering. But I am pretty sure that 2-3 years from now, the number of RT cores will increase so much that the multi-card option will again be dead.
A day late and a dollar short. It's already dead. It was popular when the best cards were around $200 a pop. Now they're north of $500. Some are arctic north of that at $750. And apparently, some of us are insane enough to pay that much for a card that will be at the bottom of the stack in about 2 years but not insane enough to spend it twice for a few extra frames.
Zeus Legion Buddy the 2080ti cost $1200
Rendering frames on alternate cards overlapping increases the frame rate but doesn't decrease the latency, and it's the latency that affects actual game play. So getting a card that's twice as fast helps your game play in a way that using two cards that take twice as long to render each doesn't even if you get the same frame rate.
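That comment's throughput-vs-latency distinction can be made concrete with a small sketch (illustrative timings, assuming ideally staggered alternate-frame rendering):

```python
def afr_fps_and_latency(render_ms, gpus):
    """Frame rate and per-frame latency with n GPUs alternating frames.

    With perfect staggering, a new frame completes every render_ms / gpus
    milliseconds, but each individual frame still takes the full render_ms
    from input to display.
    """
    frame_interval = render_ms / gpus
    return 1000.0 / frame_interval, render_ms

two_slow = afr_fps_and_latency(20, gpus=2)   # (100.0 fps, 20 ms latency)
one_fast = afr_fps_and_latency(10, gpus=1)   # (100.0 fps, 10 ms latency)
```

Same frame rate either way, but the single twice-as-fast card halves the latency you actually feel in gameplay.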
My question is: what is that case with two vertical mounts, one on top of the other?
It's a special case that was built only for ltt
Yea sadly the guy was gonna market it but never spoke about it again
Do you remember a time -not so long ago- when you were seeing linus in almost every video of this channel?
02:53 Being able to pull original files from the old shoots instead of grabbing them from old YT videos is showing. Hail the Vault
I actually belong to the somewhat 3% of the population who still believes in the multi-GPU setup despite the impracticality of it. It might sound shallow, but an SLI or Crossfire setup just looks AWESOME
We really need a new game to come out with excellent sli/crossfire support that crushes current gen GPUs like crysis did. Only then do I see a resurgence.
When you have to compensate for your small peepee but can't afford a big lifted truck
I stopped multi-GPU setups (crossfire was last one) when I had the Radeon 5870s, which produced way too much heat together.
I have a suggestion for a video.
What about using two non identical GPU's for multiple monitors? It's a pretty obscure idea that I've seen some people talk about. Basically, say you have two monitors, one running your game, and one running your crap like google chrome or whatever. You set things up where your higher end GPU takes care of the game, while your lower end GPU takes care of everything else. Some people have claimed getting a 10fps boost or so by doing this.
You have your low end GPU set as your primary graphics (I know this sounds counter intuitive), so that way all programs default to it. Then you set your game to run on the higher end GPU, so that way it's exclusively running on it.
I think it's possibly a good idea for anybody running multiple monitors (which is a good chunk of people). You just gotta buy a cheap low end GPU, doesn't have to be anything fancy, just has to render chrome and discord and junk like that.
I have not done this myself btw. I'm hesitant on buying a low end GPU to try it out, because I have a hard time finding information on it. Hence why I'm asking for a video of it.
I didn't buy two 2080tis for performance, i bought them to flex on all the peasants on the internet.
i wish we'd see cards like the AMD R9 295X2 from back in 2013 again
2 GPUs .. 1 card
it was, and probably still is to this day, the coolest card IMO
Haven't been using multi-GPU since back in the Voodoo 2 days.
lapp.tech heh I still have a pair in a 98 box for old games.
Every SLI/Crossfire video talks about micro stuttering but I've only ever experienced it with 4x Titan X. Regular 2x setups were always good for me.
I have seen it on SLI GTX 1080 cards.
@@tenshi7angel Not sure what you're doing then, because I've been running SLI since I had a pair of 7950 GTs with no micro stutter. I suppose it depends on the games being played. Only seen it with 3 and 4 way cards, which ironically never had micro stutter in synthetic benchmarks.
So do people still use SLI setups for heavy modeling/rendering? You only talked about the impact on gaming without mentioning other fields of work.
Yes. It makes a huge difference.
The issue with the Linus channels is they only care about how tech affects them, which is mostly gaming. If you look at how they stress test stuff, they do the Adobe test and Tomb Raider. They never compile extensive code, render models, try it with medical software (which is designed for god-like PCs), do audio rendering, or any of the other uses for a comp. The issue is many think if a comp is good at gaming it's good for those, when that's not true at all.
I actually got an old id Tech art dev computer through a friend, and it couldn't run Doom 2016, the game it was used for. However, letting my roommate use it, she said it was one of the best damn art computers she'd ever used. No art program she used lagged on it, and she could render like a madman.
Then, as I got into music production, I had to build a music PC because my gaming one wasn't cutting it. Then I got a dual-PCI sound card for it and it became amazing.
I'd build a dev machine but it's a pain to transfer from dev to test/gaming PC and my projects get maybe 10 minutes saved.
@@backlogbuddies Thanks for the informative reply! I didn't even know dual PCI sound cards existed. I wonder if they'll ever develop hardware that can deliver the best of both worlds, so you don't have to choose between gaming or faster rendering. Until then, a high-end gaming PC is still my go-to rig for art/music and general usage.
@@Game-BeatX14 from my understanding it's the CPU that matters a lot for art, and the graphics cards don't have to be as powerful, but the SLI setup helps, and RAM is important. I don't get the art setup too much to be honest, but I know two weaker SLI cards outdo one card that's much stronger than either. The id machine had two 480s.
With music it depends on your DAW (I use FL), how good your sound card is, RAM, and the processor.
Basically, compiling stuff is much different than running stuff. You need a computer with specific processors. If you spend the money, you can have an all-in-one workstation and then a live machine
@@backlogbuddies but workstations use explicit mGPU, not sli/crossfire, right?
@@ChucksSEADnDEAD the one I got was two 480s
Yes, they are.
That's a picture of my truck... Are you taking shots at my life choices and video cards?
Right now the only compelling use for multi-card rigs is multi-monitor setups, where you want all of your powerful card to go to the gaming monitor and have a secondary non-SLI/Crossfire card handle the ancillary monitors.
Personally I have just 1 GPU, a GTX 1070 Ti, but those on a budget could gain from using SLI and Crossfire
Wow, so I'm one of the top 6% for using a 1440p monitor? I wonder how many of those use 1440p/144Hz monitors. Maybe I'm in the top 3%. Lol
you are indeed a rare being.
Maybe if you could do some sort of ‘splitting’ where each GPU processes a portion of the view frustum, rather than the whole thing with alternation. Ofc you might have issues with frame timing though, where one GPU could render first before the others, but there could be some sort of ‘waiting’ system that makes sure all GPUs are done their rendering before displaying it.
I don’t know the effect this would have on draw calls though, and CPU occlusion culling would probably be more expensive, since you’d have to check culled objects for each GPU. I guess GPU culling is a thing though, so maybe? I’m not well enough versed in this level of GPU magic.
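A rough sketch of that 'waiting' system: with split-frame rendering, the frame can only be presented once the slowest slice finishes, so scaling is governed by the worst slice, not the average (hypothetical timings):

```python
def sfr_frame_time_ms(slice_times_ms, compose_ms=1.0):
    # Split-frame rendering: each GPU draws one slice of the view frustum;
    # a barrier waits for every slice, then the slices are composited.
    return max(slice_times_ms) + compose_ms

balanced   = sfr_frame_time_ms([10.0, 10.0])  # 11.0 ms: near-ideal 2x split
unbalanced = sfr_frame_time_ms([4.0, 16.0])   # 17.0 ms: one busy slice dominates
```

This is why real split-frame implementations tried to move the split line dynamically between frames, to keep the load on each card roughly balanced.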
I'm glad I never bought that second 970 way back. I read on a forum that sli only really makes sense when the best isn't enough. Made sense to me at the time, but now it's not a good idea for many reasons. Hopefully it makes a resurgence.
what about 3D rendering or After Effects rendering??
is it worthless for this sector too??
2080ti pushes more frames than a 4x 980ti rig 😂
Nah
@@obeid5208 yes
@@TheXeffx nah
@@obeid5208 It's true that the RTX 2080 Ti performs better than 4x GTX 980 Tis, and there are a bunch of reasons for it: For starters, 4-way SLI really isn't supported anymore; even Nvidia dropped it in the 10-series. Another factor is SLI support in general: while some games do support SLI, most don't right out of the box. Some even impact performance negatively. Ideally, 4x 980 Tis would still perform a lot faster than a single 2080 Ti, as the 2080 Ti is approximately 2x faster than the 980 Ti, but the support just isn't there.
@@TooHTP The 980 Ti has about the same performance as the GTX 1070. There's no way the 2080 Ti is 4 times better than a 1070. Maybe, but I don't think so
Linus... please have all of your staff host episodes of TechLinked and Techquickie
The only reason I would have multiple video cards is to have a virtual machine powerhouse for when I have friends over and they didn't bring their towers over
They aren't dying, they're already dead
3:36 - WAAAAIT A MINUTE! American truck with russian license plate? lolwut?
I suppose right-hand-drive cars surprise you too?
@@niter43 Naturally! I'm just curious, is this an easter egg, or was it really the first Google result for "Pickup Truck"?
3:29..... that's a lie bro, the minimum CPU requirement for Fortnite was recently changed to a 4th-gen i5 and the GPU requirement was changed to a GTX 660, with the needed RAM amount being 8 GB....
It all depends on the developer. As an example, the last 2 games I played through were Far Cry 5 and Assassin's Creed Odyssey. Far Cry scaled almost perfectly with my 2nd GPU, while Assassin's Creed did not use it at all.
It does, however, look like more and more developers are choosing not to add support for SLI, and I can't really blame them, as not that many people use the configuration.
Anti-SLI propaganda! I’ll have none of it! No!
*cries*
"49 likes and no views"
YouTube drunk again!
Go home YouTube, you are drunk
The real question: has it ever been relevant? I can't recall when I last heard somebody say that they use SLI or CF
Crossfire/SLI is actually good when the game you play is supported and isn't laggy. For me, I usually play GTA 5 with my 2x RX 580, and it actually works great
*Nope*
RTX 2070 SUPER NVLINK
RTX 2080 SUPER NVLINK
Both outperform 2080ti and RTX Titan
We need a 4 way NVLink to push Red dead Redemption 2 upto 360 fps with maxed out settings @4k
I'm surprised these videos aren't done in the new cellphone based aspect ratio that LTT, MKBHD, and others have been using
Back in the early-to-mid 00s, when even the best GPUs were single slot, opting for up to 6 graphics cards was not unheard of in the Windows XP days of gaming. Today SLI and Crossfire are only needed if you run an 8K monitor or multiple 4K monitors, or are doing finite element analysis or other GPU-intensive engineering work while gaming at the same time. 3 or more graphics cards these days are pretty much reserved exclusively for the prosumer market.
Putting mid-tier cards in SLI used to be a good value back when the 900 series was hitting the market. But I wanted the best, so I bought a 980 Ti at full price with the hopes of running SLI in the future. But when the 10 series released, the massive performance upgrade was so apparent that no one really cared about running multiple cards anymore. However, the ability was still there, and hardcore enthusiasts still took advantage of it. I will admit it's a "niche" market, but that shouldn't mean Nvidia and AMD, as well as the developers themselves, should nerf its performance for that market either. They are the ones willing to pay for it, and I think we deserve better. I believe it would help the community grow, and it would provide an upgrade path for those who can only get one card now to get meaningful added performance from a second one in the future, when and if needed. As for SLI being dead? The need for it will always be there for purely computational work, and for gaming there are videos and benchmarks that provide what I would call "positives" for the future support of this kind of GPU configuration. So in short, I still have hope for SLI to be great for enthusiasts as well as for the run-of-the-mill gamer.
Sorry for the long post, I just feel strongly for the topic.
i actually have a multi-GPU setup in CFX. So far over 70% of the games I've played have been CFX-ready, and I benefit a lot from that; I've even played some of them in 4K, and older titles with 8K textures through GeDoSaTo. It's up to the game, and most newer games do not support CFX or SLI, but it happens that I play older ones. All my favorite games support CFX somehow, with the exception of League of Legends, which doesn't need it at all (I hope everyone understands why). For me it's not dead at all.
also, Premiere Pro has insane scaling from these GPUs; that's just amazing.
Forget about not being optimized for dual-GPU setups; most games straight up don't support it. Not to mention, in most cases two cards make the temperature climb faster, which reduces the clock speed of the primary card (Nvidia GPU Boost 3.0), resulting in a negative performance gain. Unless you are water cooling both graphics cards, which brings a whole new plateau of problems and maintenance with a custom water loop.
I feel like SLI/XFire's wheelhouse was bringing new life to older generation systems (buying a second used card vs starting over with a new, more powerful one) but Nvidia/AMD don't get any money from the used market. I feel like that, along with developer support, is what's REALLY killing multi gpu setups...
I've been an advocate of building smaller systems for years now because of the lack of benefits from multi gpu systems.
But what I don't understand is, why the custom PC manufacturers don't push for more mini itx parts, since a majority of their customers will be only using 1 GPU anyway. And if they're serious about sound, they'll get something external instead of something from the mobo anyways.
All of this, but pc part manufacturers are still pushing normal full sized mobos/cases for pci-e slots that will never be filled etc. 🥴
In VR it would make a lot of sense if the APIs supported it properly (or the devs cared/knew about it).
You have two viewports that need to be rendered at the same time with different perspectives on the scene - seems like multi-GPU could be the solution to hit 90fps+, but then again VR is not yet a mass market - damn.
What if instead of alternating frames each card took care of half of the screen. Or is that even harder to synchronize?
Alorand It was in theory supposed to be better, because the cards don’t have to coordinate themselves but it never took off
I wish they could find a way to make SLI/Crossfire just not require anything from the game devs.
Like that alternating lines or frames mechanic - why would that require anything from the game developer? It just requires frames to be rendered, and the two GPUs just divide that load.
As a VR player, I really wish SLI would have evolved and fixed the problems. I want as much power as I can get and not even a 2080ti is enough to run some VR games at the fidelity that I would want.
What about rendering? Blender? Photo scanning? Surely this would be the most targeted market, unless there's compatibility issues that I'm not aware of?
I have two way SLI in one of my rigs...I have run several tests and while playing most games...one card is almost asleep while the other one is running high settings and getting great frame rates. It just seems odd to me that it literally does nothing and sits idle. Now the content creator side of me loves it for the pure rendering power for video editing.
Setting up 2x RX 590 Crossfire.
Vulkan has left the conversation.
DX10 and DX11 killed each other.
Nvidia jumped off a cliff.
Game developers were erased from existence.
Only a handful of people actually know what it's like to have a multi-GPU setup: that sinking feeling when the game doesn't work well and you realize you wasted so much money, and also that euphoria when you see more than a 50% gain in performance when it actually works.
I think I remember a video where Linus said SLI was dead... That's why I'm always puzzled when people get SLI configs these days.
I remember a time when the buzz was around dual-CPU computers; then we got dual cores and those seemed to fade away.
When dual GPUs came out, I knew it could only be a matter of time until solo cards that outperform the setup for cheaper would be a thing.
This may be needless and not make sense for newer GPUs, but for people who have older GPUs like me, adding another 980 Ti to my rig with a 980 Ti is both money-saving and very, very helpful, getting 2080 Ti performance when overclocked.
Makes me sad that companies are so disconnected from consumers that the one use case where people would spend literally DOUBLE on graphics cards doesn't appeal to them
I've been running MSI 1070s in SLI for about 2-3 years now. In my experience, they have been solid and reliable in literally every game I play. Sure, from time to time I've had some games crash or get micro stutters, but I was able to correct the problem either on my end or through a driver update. In 2019 I'm still able to play all my games at 2K resolution, with max settings, and my frames never drop below 60 (with games like the Tomb Raider series, Mass Effect, etc). I personally think SLI setups are worth it (and are very aesthetically pleasing). But I might be one of the few who still runs an SLI setup, since I know no one else who does *cries in Chinese*
Yup. As a professional editor who can use multiple GPUs (as you obviously turn off SLI or Crossfire to use all the GPUs in professional applications), I wondered why gamers still use more than 1 card.
Does anyone else hear how James' voice in this specific video would've sounded way better if YouTube had better audio quality? Or is it just me
I got two GTX 970s and I get a lot of stutters with Metro: Exodus and Fallen order when levels start up but Doom Eternal runs absolutely perfect. Had these for like...4+ years, maybe longer, and pretty happy with it but I can see getting a single powerful card later on down the line.
I hope the next gen consoles have dual gpu setups. If they do developers will probably add support for more dual gpu setups on pc
Personally I think SLI was introduced because the 7xx-9xx series Nvidia GPUs were in production for a long time and needed something extra/new until the next generation. But then they were stuck with it. Now only the top 20xx series supports it, because you can't remove 'features'.
I don't know about SLI, but during the mining craze I got a used GPU off of ebay to try out crossfire instead of straight upgrading.
Don't try crossfire kids, live a happier life.
If you seriously consider it, spend the money on like a Nintendo Switch and you'll have a much better time, trust me.