I Combined AMD & NVIDIA in One PC and Got 1500 FPS

  • Published 21 Nov 2024

COMMENTS • 770

  • @lecctron
    @lecctron  1 month ago +656

    "sicko mode or mo bamba ? why not both ?" ahh video

  • @Ked778
    @Ked778 1 month ago +1788

    Ryzen 4070 mentioned 🗣️🗣️🔥🔥

    • @jacobgames3412
      @jacobgames3412 1 month ago +37

      RAHHH🔥🗣️🗣️🔥🗣️🗣️🗣️🔥🦅🦅🦅🦅🦅🔥🔥🔥🔥🗣️🗣️🗣️🗣️🔥🔥🔥🔥🔥🗣️🔥🗣️🗣️🔥🗣️🔥🔥🔥🦅🦅🦅🦅🦅

    • @moboy5277
      @moboy5277 1 month ago +37

      ZTT FOR LIFE🗣🔥🦅

    • @shinyevoli9899
      @shinyevoli9899 1 month ago +14

      RYZEN 4070

    • @latestcue
      @latestcue 1 month ago +10

      the leaks said ryzen 4090 is releasing November 23rd 2025

    • @kenny_the_second
      @kenny_the_second 1 month ago +11

      @@latestcue the leaks said the ryzen 5090 will release on February 30, 2026

  • @oscar21781
    @oscar21781 1 month ago +570

    Guess you're buying an $1800 MB because you can't leave us on a cliffhanger!

    • @lecctron
      @lecctron  1 month ago +129

      Im gonna pull a spiderverse and drop the sequel years later 😂

    • @myck7021
      @myck7021 1 month ago +15

      @@lecctron we’ll be waiting

    • @RavioliBurrito
      @RavioliBurrito 1 month ago +10

      @@lecctron bro that last spiderverse movie was a whole cliffhanger

    • @raamy2426
      @raamy2426 7 days ago

      @@lecctron lol it would probably be cheaper to change your cpu + mb

    • @sammyboy_x1
      @sammyboy_x1 6 days ago

      waiting

  • @nepp-
    @nepp- 1 month ago +466

    Long story short: adding a second GPU won't help gaming performance much. This is because graphics cards are already so fast that any extra distance between components can actually slow things down. That's why the memory modules on a graphics card always surround the main chip.
    Also, you *can* use an AMD and an Nvidia card at the same time in the same PC, for example running a game on AMD and using Nvidia for OBS; they can both do separate tasks at the same time, but not the same task at the same time. But because having two cards installed complicates things, it's honestly better to never worry about running two at once and instead either try overclocking or replace your card entirely if it isn't performing to your needs

    • @arenzricodexd4409
      @arenzricodexd4409 1 month ago +8

      @@nepp- no. Combine them to make them run a single game together. Time to Fuzion.

    • @nepp-
      @nepp- 1 month ago +19

      @@arenzricodexd4409 SLI/CrossFire works with another GPU of the same model and vendor, but AMD and Nvidia cards were sadly never designed to work together

    • @ivanvladimir0435
      @ivanvladimir0435 1 month ago +22

      The closest you can get to it is getting a second GPU that's dedicated to frame generation (using Lossless Scaling) while the first one does the rendering. Depending on the model you can turn 60 fps into 120, 180 or even 240 fps. I love it, but it's not real "GPU performance combined", but rather two GPUs doing different things that benefit the guy playing the game

    • @hideakikovacs2859
      @hideakikovacs2859 1 month ago +2

      How about for recording, as in using the NVENC encoder while I game on the radeon cards, does that work or am I missing something here?

    • @mjcox242
      @mjcox242 1 month ago +6

      @@hideakikovacs2859 yes and no.
      Yes you can do it, but it can lower performance due to transferring the image between GPUs. Most GPUs have encoder chips baked onto them, so it's faster to encode with the GPU you're using to game
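
      A minimal command-line sketch of that setup, assuming ffmpeg was built with NVENC support and the game is rendering on the Radeon: capture the Windows desktop and hand the encode to the GeForce's NVENC block.

        ffmpeg -f gdigrab -framerate 60 -i desktop -c:v h264_nvenc -b:v 20M capture.mp4

      As noted above, the frames still have to be copied off the rendering GPU first, so the win comes from offloading the encode, not the capture.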

  • @V1SKosity
    @V1SKosity 1 month ago +86

    sucks when someone with a good idea cant carry it out because of money problems. love you bro

    • @lecctron
      @lecctron  1 month ago +16

      happens all the time on this channel lol we got so close 💔

    • @shia9631
      @shia9631 1 month ago +3

      @@lecctron send me the banking details, ill sponsor a dollar.

    • @Gabriel_Ultrakill
      @Gabriel_Ultrakill 28 days ago +1

      Easy fix, have more money

  • @ReinForceOne
    @ReinForceOne 1 month ago +140

    Fun Fact - back in 2010 there was a short-lived solution to combine AMD and Nvidia - some motherboards like the ASUS Crosshair IV Extreme had a "Hydra" chip on them

    • @stuwilldoo7035
      @stuwilldoo7035 6 days ago +2

      i thought it was coming with either directX 11/12 ?

    • @lekoro1
      @lekoro1 1 day ago

      @@stuwilldoo7035 DX12 (and Vulkan) support a thing called explicit multi-adapter that kind of works as a hardware-agnostic SLI/CrossFire solution - including letting the Intel iGPU in the processor get in on the action
      only issue is only like 5 games supported it, because it is 100% on the developers of the game to implement this feature and it requires really low-level GPU programming to do
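
      A rough C++ sketch of the very first step a game has to take for explicit multi-adapter: enumerating every GPU the OS exposes through DXGI, so it can create a D3D12 device per adapter and split the frame's work between them itself (illustrative only, not from the video; assumes the Windows SDK headers are available).

        // List all display adapters (Radeon, GeForce, iGPU...) via DXGI.
        #include <dxgi1_6.h>
        #include <wrl/client.h>
        #include <cstdio>
        #pragma comment(lib, "dxgi.lib")

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<IDXGIFactory6> factory;
            if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

            ComPtr<IDXGIAdapter1> adapter;
            for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
                DXGI_ADAPTER_DESC1 desc;
                adapter->GetDesc1(&desc);
                // Each of these adapters could get its own D3D12 device; the game is then
                // responsible for splitting and synchronising work between them.
                wprintf(L"Adapter %u: %s (%zu MB VRAM)\n",
                        i, desc.Description, desc.DedicatedVideoMemory / (1024 * 1024));
            }
            return 0;
        }

      This is the same feature the Rise of the Tomb Raider reply further down is referring to.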

  • @PhonkMachine220YT
    @PhonkMachine220YT 1 month ago +105

    Fun fact: The current CEOs of AMD and NVIDIA are cousins once removed.
    And AMD also provided the EPYC CPUs used in NVIDIA's DGX A100 workstations.

    • @FuentesDj
      @FuentesDj 1 month ago +6

      @@PhonkMachine220YT I remember back in the day people pairing nvidia with radeon for physx

    • @_ch1pset
      @_ch1pset 1 month ago +2

      @@FuentesDj some games' PhysX was never updated and the CPU PhysX path is extremely unoptimized (Mafia 2 and Mirror's Edge), so having the dual setup still improves that experience. It's probably only a few games though. BL2 PhysX runs about the same on a modern CPU as it does on Nvidia GPUs, so it's definitely down to the version of PhysX in the game. I know this because, iirc, you can swap Mirror's Edge's PhysX files with updated ones and it fixes the issues with CPU PhysX, but this is not possible on Mafia 2.

  • @svegetax
    @svegetax 1 month ago +11

    Random dude here. Two things you're looking for.
    First, a driver from the past, before Nvidia decided to add driver blocks to hybrid GPU setups.
    Second, a motherboard that supports bifurcation, allowing the two PCIe slots marked in the mobo diagram in the manual to split the PCIe lanes x8/x8. Most GPUs don't saturate an x16 slot, so nearly nothing is lost performance-wise.
    Good thing is your current motherboard may already support it. Bad thing is finding a driver without the locks in place. Do an old search for "hybrid GPU" and you may find what you're looking for.
    Yes, we really did use to do this without issue. GL
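
    One way to sanity-check what each card actually negotiated after moving slots or enabling bifurcation (assuming the NVIDIA driver's bundled nvidia-smi; the Radeon side shows up in GPU-Z on Windows or lspci -vv on Linux):

      nvidia-smi --query-gpu=name,pcie.link.gen.current,pcie.link.width.current --format=csv

    If the width column reads x1 instead of x8 or x16, the card is sitting in one of the bandwidth-starved chipset slots discussed elsewhere in the thread.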

    • @WallcroftUK
      @WallcroftUK 5 days ago

      yea that's why some GPUs have an NVMe slot these days :D cus GPUs don't use the full x16

    • @lecctron
      @lecctron  1 day ago +2

      wow suddenly i'm excited to try this again, thanks so much !

  • @skellybelly_gaming
    @skellybelly_gaming 2 days ago +2

    2:46 THE NETTSPEND BACKGROUND IM DEAD (i need it pause)

  • @Ethorbit
    @Ethorbit 1 month ago +9

    4:32 It's completely fine to use a GPU in a slower slot.
    I use dual GPUs for GPU passthrough virtualization; both the GTX 1060 6GB and RTX 3060 run at their full rated speeds despite the motherboard throttling the PCIe link to accommodate both cards.

  • @0rchyd655
    @0rchyd655 1 month ago +53

    The reason your second GPU is losing performance is not a chipset bottleneck on your motherboard; the chipset only manages the last PCIe and M.2 slot.
    The real issue is that your GPU is plugged into a PCIe slot with only 1 lane, giving you just 1 GB/s of bandwidth.
    Also, you don't need a $1,700 motherboard. First of all, SLI only works with two NVIDIA GPUs, and not in the way you're thinking. If you want to connect a second GPU to the PCIe lanes of the CPU, use your top M.2 slot with a riser. It has 4 PCIe lanes that are directly connected to the CPU.

    • @shadow105720
      @shadow105720 1 month ago +3

      SLI capable means it gives x16 slots to both GPUs. So yes, it would help in this case even without actually using an SLI bridge.

    • @thelasthallow
      @thelasthallow 24 days ago +3

      that depends on the motherboard, there are cheaper motherboards in the $300 range that can do x8/x8, but the guy who uploaded this video doesn't actually know what he is doing or what he is talking about.

  • @EpicSOB_
    @EpicSOB_ 2 days ago +1

    XFX GPUs are especially stable and reliable; they are probably the best vendor for AMD cards, and I'm happy you chose that particular AMD card, putting team red's best foot forward.

  • @atooalyousif6488
    @atooalyousif6488 1 day ago +2

    I like this guy I subscribed after watching like 4 vids

  • @khorvietech
    @khorvietech 1 month ago +118

    The vid we all been waiting for

  • @Nebula_YT23
    @Nebula_YT23 9 days ago +1

    I literally can't believe LTT hasn't made a video like this before

  • @6XVK
    @6XVK 1 month ago +185

    The forbidden combo ahh video

    • @ArouzedLamp
      @ArouzedLamp 1 month ago +3

      @@6XVK ahhhhhhh

    • @RTWT990
      @RTWT990 1 month ago +3

      fun fact: AMD and Nvidia's CEOs are cousins, Jensen and Lisa are related, not joking

    • @6XVK
      @6XVK 1 month ago

      @@RTWT990 oh that’s sick

    • @fraxbnezl
      @fraxbnezl 1 month ago

      🎉

    • @davidcollins-ez5nn
      @davidcollins-ez5nn 1 month ago

      😂

  • @zcomputerwiz
    @zcomputerwiz 1 month ago +2

    I have a dual GPU setup for exactly the reason you mentioned!
    Radeon RX 6800 in the PCIe 4.0 x16 slot for gaming.
    RTX 4060 ti 16gb in the PCIe 3.0 x4 slot for CUDA, tensor stuff, etc.

  • @HowToArter
    @HowToArter 13 days ago +1

    2 GPUs actually make sense for content creators and streamers.
    1 card can be used for gaming, the other can be dedicated in OBS for recording/encoding. This way you can still stream/record without losing performance at all and with no delay whatsoever.
    The card dedicated to the recording doesn't need to be a powerhouse; a simple 1050 Ti can record 1080p 60fps flawlessly if there's another card handling the game itself.

  • @Kangaroo63
    @Kangaroo63 1 day ago +2

    I searched for this because i got curious

  • @kahboomsumrall1839
    @kahboomsumrall1839 1 month ago +7

    MSI 870A and P55 Lucid Hydra motherboards supported multi-GPU with AMD and Nvidia 14 yrs ago. While it worked, it was hit and miss on being better than SLI and Crossfire; the only benefit was you could use any GPU from the same brand, so GTX XXX or HD XXXX, or just mix them up. It was killed off due to lack of support, just like SLI and Crossfire themselves.

  • @Marius04020
    @Marius04020 1 month ago +15

    7:25 triggered ptsd

    • @i_ruby
      @i_ruby 1 month ago

      @@Marius04020 literally

    • @hiddenguy67
      @hiddenguy67 1 month ago

      nooooo

    • @Otonari_chan
      @Otonari_chan 1 month ago

      I literally stopped the video to check if it really was there or it was in my head

    • @dragonspear570
      @dragonspear570 23 days ago

      @@Marius04020 Bruh 😭

    • @DeviousDanielYT
      @DeviousDanielYT 6 days ago

      @@Marius04020 ddlc reference

  • @resetskatarina6400
    @resetskatarina6400 20 days ago +2

    Awesome Vid man. Loving the Croft Manor background music from TB Legend btw.

  • @LavadeepYT
    @LavadeepYT 22 days ago +3

    Dammmm That AMD GPU is _gasping_ for air

  • @exosine
    @exosine 10 days ago +2

    Holy crap, the legendary ryzen 4070 🔥 I knew it existed!!

  • @taktarak3869
    @taktarak3869 23 days ago +2

    You can actually use an RTX 30/40 series GPU as the main rendering GPU for games/applications that support DLSS, and an RX 6000/7000 series GPU for AFMF. That's what people actually did to get 200~300 fps on max settings in Cyberpunk when it first released.

  • @tonofbrix
    @tonofbrix 1 month ago +3

    I've had my GTX 1080 Ti for a long time and it still holds up in even the most graphically intense games in 2024, i love it so much

  • @ghgltggamer
    @ghgltggamer 25 days ago +3

    Bro mentioned Lunchly which was crazy 😂

  • @PhvexSeven
    @PhvexSeven 1 month ago +5

    every gamer's dream is combining an AMD and Nvidia GPU together

  • @duddahgyeah7653
    @duddahgyeah7653 1 month ago +4

    Godzilla had a stroke trying to understand why you've done this and fragging died.

  • @rajivbhuarya1351
    @rajivbhuarya1351 1 month ago +8

    Bro made team yellow 😐

  • @Cxrved
    @Cxrved 1 month ago +2

    Bro been here since like 250- subs luv to see you’ve grown so much❤

    • @vicvm.
      @vicvm. 1 month ago

      @@Cxrved I’m here since almost 1k

  • @ashleysloan1758
    @ashleysloan1758 1 month ago +3

    Know it's a good day when lecctron posts

  • @Hakeem597
    @Hakeem597 1 month ago +2

    My dude said "2010s" like it was the 1980s.

  • @Zenithsloth
    @Zenithsloth 1 month ago +5

    Day two of asking for this video idea... "Using Lossless Scaling's frame generation on the GT 710"

    • @lecctron
      @lecctron  1 month ago +1

      that will be part of the next video stay tuned bro

    • @tekeagle2136
      @tekeagle2136 1 month ago

      I use my 3060ti as the main render GPU and my 5600XT as the GPU for Lossless Scaling. I will not spoil much, but it varies. The main bottleneck is mostly my 5600XT and its 6GB VRAM, and partially the x4 slot it is in.

  • @RealCenti
    @RealCenti 1 month ago +1

    w vid

  • @WitmerXL
    @WitmerXL 1 month ago

    Actually you could combine AMD (ATI at the time) with an Nvidia GPU with technology called "Lucid Hydra" -- it was a feature on my Crosshair IV AMD board. I believe I had a 6870 (primary) paired with a GTX 275 (secondary). Funny enough, as far as I can remember, it didn't really cause any major glitches in games, but many games didn't see much of an improvement, while others saw massive gains. I remember Red Faction Guerrilla getting a major 20 - 30fps boost. DX & OpenGL would potentially show different results. Great vid btw!

  • @deeior
    @deeior 1 month ago

    What's your editing software? You're too good❤

  • @archangel3237
    @archangel3237 4 days ago

    This is why I bought an open form factor case. Convective cooling with no case fans needed

  • @FrizzRizz
    @FrizzRizz 1 month ago +6

    Best ending: amd and nvidia unite

  • @FrancisNull
    @FrancisNull 21 days ago

    I use a RTX 3080 for gaming and the APU of my 5600G to encode my Livestreams. It works flawlessly. No problems, no colored screens whatsoever.

  • @daydream605
    @daydream605 6 days ago

    This reminds me of when PhysX came out on Nvidia,
    I had an HD 5770 GPU and a GTX 460 in the same PC. I used the 460 (i think it was, anyway) to compute the PhysX calculations and the AMD card for raster.
    Worked really well, but not in every game, because it requires you to manually edit config files to tell it which GPU is for what. Lots of hassle but felt worth it at the time.
    There's so much distance between computer components, i wish they'd all come together to be nice and speedy.

  • @Drywest
    @Drywest 1 month ago +27

    blud completed mission impossible 😭🙏

  • @Megahertz88
    @Megahertz88 13 days ago

    Because of this video, I subscribed to your channel. Smart and interesting content about hardware, but in a funny way. Keep it going :)

  • @PyricDemon
    @PyricDemon 1 month ago

    I was not expecting the video to end after the price reveal lmao. SLI was dropped so fast that any board with it is an absurd price. I was interested in it for a time too.. sad to see it disappear

  • @kreempa.i.8349
    @kreempa.i.8349 1 month ago

    If your motherboard supports bifurcation, you can split the top PCIe 4.0x16 into two 4.0x8 lanes, and use an adapter to split it into two x16 slots. Each gpu will be limited to PCIe 4.0x8 speeds but it’s better than using the bottom slots which run at x1 speeds

  • @kikihun9726
    @kikihun9726 1 month ago +1

    Well, if you get an M.2-to-PCIe adapter, you can use the x4 lanes from the top M.2 slot for the GPU.
    90% of new ATX motherboards only wire PCIe x1 to the physical x16 connector for each slot.
    mATX motherboards have x4 and x8 slots from the chipset because of the size limit.

    • @crowntotheundergroud
      @crowntotheundergroud 1 month ago

      Yes, you are correct. This dude put the 1080 Ti into a Gen 3 PCIe x16(x1) slot. The "(x1)" means it only has a single lane of bandwidth lmao. Using the M.2-to-x16 adapter would have been better. Even so, he didn't play a single game that could utilize his setup. The only games that can utilize an AMD and Nvidia GPU together are the rare titles that have DirectX 12 explicit multi-GPU support.
      Edit: I was incorrect. Rise of the Tomb Raider has explicit multi-GPU support, but I doubt he had anything configured correctly.

  • @luciantermopan
    @luciantermopan 9 days ago

    You could try running the 1080 Ti with a riser cable from an M.2 slot, as the topmost slot has 4 lanes directly connected to the CPU. That way both GPUs would be connected to the CPU at the same time

  • @jonbjorkeback9499
    @jonbjorkeback9499 1 month ago

    You _don't_ need an SLI motherboard per se; what you do need is a board supporting bifurcation. That essentially means you can route the PCIe lanes to two slots (so PCIe x8 + x8 instead of one PCIe x16).

  • @Sinbad1999
    @Sinbad1999 1 month ago +1

    Should've done the full trifecta and added the intel arc gpu as well

  • @Unseen1094
    @Unseen1094 28 days ago

    I was thinking to combine my old gpus with hopes and dreams and then this vid came into my search

  • @jlebrech
    @jlebrech 10 days ago

    physx be like "woah so much room for activities"

  • @William_Kyle-Yuki_Yuuki
    @William_Kyle-Yuki_Yuuki 1 month ago

    I did this back in around 2015 using ATI HD 7600 and a GTX 980. Playing Witcher 3 on a Radeon GPU with Nvidia Hairworks was fun.

  • @agoogleuser2507
    @agoogleuser2507 1 month ago +4

    So did you just plug both GPUs in and they worked automatically? Or did you have to install new drivers? Or do you plug your monitor into the GPU you want to 'host' everything? What's the process? You made it seem like you just added a second GPU into your motherboard and they both worked together flawlessly.

    • @lecctron
      @lecctron  1 month ago +2

      had to install drivers, other than that it worked pretty much flawlessly across the 2 monitors

    • @agoogleuser2507
      @agoogleuser2507 1 month ago

      @@lecctron I see. What drivers?

    • @alexturnbackthearmy1907
      @alexturnbackthearmy1907 1 month ago

      @@agoogleuser2507 Also, the app can select which GPU it will use, and if it's different from the one the monitor cable is plugged into, that card works in pass-through mode when it's rendering something to be displayed, aka games (which can SEVERELY limit the output if you use something like a CMP 90HX in pass-through mode through your iGPU).

    • @agoogleuser2507
      @agoogleuser2507 1 month ago

      @@alexturnbackthearmy1907 What app?

    • @alexturnbackthearmy1907
      @alexturnbackthearmy1907 1 month ago

      @@agoogleuser2507 Games, rendering software, pretty much anything. And if it doesn't support it natively (in the settings menu), you can just temporarily disable one GPU, launch the thing you need on the working one, and then enable it back from Device Manager.

  • @להבצור
    @להבצור 1 month ago +2

    For the mobo, I think you can also use the ASUS ProArt B650-Creator. It has two PCIe slots connected to the CPU and costs around 400 dollars

  • @desertedpyro3238
    @desertedpyro3238 1 month ago

    If i could, i would buy that for you. Hope to see this work out in the future

  • @claxtoncurtis811
    @claxtoncurtis811 27 days ago

    Outside of software, the main thing keeping dual GPUs from being performant is the lack of communication between the two GPUs that is fast in a practical manner.

  • @nightfallvampire
    @nightfallvampire 12 days ago

    AMD and NVIDIA should collab now

  • @Bloowashere
    @Bloowashere 13 days ago

    look ya'll, it's the DSP of computer hardware.

  • @mohd5rose
    @mohd5rose 1 month ago

    There is one motherboard manufacturer (can't remember which one: Gigabyte/ASUS/MSI/ASRock/abit/DFI/Biostar/ECS) that did produce a motherboard supporting both Nvidia and AMD graphics running simultaneously, and the chip on the motherboard they were using at that time was called Lucid. This was way back when Nvidia SLI and AMD Crossfire were a thing.

  • @D1N0-_-
    @D1N0-_- 1 month ago

    great job next video should be intel+nvidia+amd gpus all in one pc

  • @Wishmaster145
    @Wishmaster145 1 month ago

    best unexpected ending of a video ever

  • @imussewingpartskapatid
    @imussewingpartskapatid 23 days ago

    the moment he mentioned the 1080 Ti i wrote this and then closed the video. lol

  • @goaliedude32
    @goaliedude32 1 month ago

    I used to run SLI/Crossfire cards. I kind of wish they would bring that back, fully supported. It was sick to buy 2 cards at a $300 price point that had nearly exactly the same performance as the $1000 card

  • @ValeroLucky
    @ValeroLucky 1 month ago

    Croft's Mansion ahh soundtrack at the beginning of the video 🗣️🗣️🔥🔥

  • @furkantugraakbas2962
    @furkantugraakbas2962 6 days ago

    this guy: "i combined two rival gpus to create a hollow gpu." imaginary gpu: amdvidia

  • @hsienkangliu1436
    @hsienkangliu1436 26 days ago

    There is another way to make dual GPUs useful as far as I know: AMD's 6000 and 7000 series cards have Fluid Motion Frames, which works like DLSS frame gen and is compatible with any card. So you can have one card dedicated to running frame generation, decreasing the load on the card doing the rendering.

  • @n.stephan9848
    @n.stephan9848 1 month ago

    The passthrough system of 2D GPUs and 3D accelerators in the 90s would be wonderful here. Imagine being able to plug one GPU into another, and that other one goes to the monitor. For fullscreen applications, whichever one's better does the rendering; in windowed mode, the UI is done by one and everything within the window's borders is done by the other.
    I hoped something like this would come to systems featuring both an iGPU and a dGPU.

  • @mekaHDD
    @mekaHDD 1 month ago

    My favorite youtuber back with another awesome video

  • @shadmansudipto7287
    @shadmansudipto7287 6 days ago +1

    If it goes up in any game, it's probably tricking the drivers to enable some Nvidia tech. Not the gpu doing any work.

  • @Planeidklol
    @Planeidklol 1 month ago

    It’s equally good to each other

  • @10Sambo01
    @10Sambo01 1 month ago +46

    Hmm, I'm not sure you fully understand how games work with GPUs...

    • @ArnoId-Schwarzenegger
      @ArnoId-Schwarzenegger 20 days ago

      @@10Sambo01 you don't know

    • @10Sambo01
      @10Sambo01 20 days ago +1

      @Manemyonem I've been an IT professional for almost 30 years. I know.

    • @sayankakalita7827
      @sayankakalita7827 17 days ago

      @@10Sambo01 yup I agree

    • @lockinhinddanger934
      @lockinhinddanger934 5 days ago

      @ArnoId-Schwarzenegger yes, he does. I wish youtube would boot this misinformation off the platform. Also, 1500fps in Fortnite isn't impossible; in fact I'd be more impressed with getting 900fps in CS2

  • @AsianFlex
    @AsianFlex 1 month ago

    Afaik, it's all still rendering off the 6900 XT; the 1080 Ti is doing no rendering besides the second display it's connected to. The 1080 Ti can be used for certain tasks like secondary rendering (2 games at once, one off the 6900 XT and one off the 1080 Ti) or for its encoder, but you can't "SLI" them together, in a sense, to gain more performance.

  • @Mo79792
    @Mo79792 1 month ago

    No way you got almost 50k subs. I was there from the beginning🎉

  • @ProfSnakes
    @ProfSnakes 1 month ago

    I did this a few years back. I had an AMD Fury and added a 980 I came across for cheap and was using for mining purposes. Since I had them both I went ahead and tried them out. Lo and behold, they could work together in certain instances. I tried mGPU in a couple things, but most of the time just left the 980 mining while I played on the Fury. The fun trick was after NVIDIA updated their drivers and stopped blocking access to PhysX when you had an AMD card installed. A few games instantly started running better by enabling PhysX on the 980 instead of having to do it on the CPU, while still rendering on the Fury.

  • @norbberting3849
    @norbberting3849 15 days ago

    Somebody sponsor this man!

  • @erikschiegg68
    @erikschiegg68 1 month ago

    But I do, yes I do. Running AMD and Nvidia in parallel eliminates the bottlenecks: when I do AI or video editing stuff on the Nvidia card, I can still e.g. keep gaming and doing stuff on several monitors without any problems. Get a good power supply. Upgrading the power supply and swapping in a second-hand RTX was the best tuning hack in years.

  • @tuhinfrags
    @tuhinfrags 1 month ago +1

    I used to use a 3060 as main gpu and a 2060 super as secondary gpu, 3060 for gaming and 2060 super for streaming and recording

    • @De-M-oN
      @De-M-oN 1 month ago

      might actually be slower, because the data must be copied over to the 2nd card before it can encode it. This also goes through system RAM, which OBS can otherwise bypass with the current Game Capture hook and NVENC implementation they have.

    • @tuhinfrags
      @tuhinfrags 1 month ago

      @@De-M-oN you are wrong, i gained almost 80-90 fps by doing so

  • @listX_DE
    @listX_DE 1 month ago +5

    Let's help him get $1800 for the motherboard 🔥🔥

  • @aRandomMenno
    @aRandomMenno 1 month ago +2

    Now break the minecraft fps wr! 🤭

  • @ROcheater
    @ROcheater 24 days ago +16

    Bro, this is not how dual GPUs work. You cannot make an AMD card work with an Nvidia card. Look at your task manager while in game and you'll see only one card is doing anything at all. You don't just plug in dual GPUs and magically get a bigger resource pool for gaming 🤦

    • @lockinhinddanger934
      @lockinhinddanger934 5 days ago +5

      The host is also claiming in 2024 that an RX 6900 XT is the best AMD card, which is BS; a 7900 XTX would smoke both cards, and on top of this the 4090 is actually the fastest consumer-grade GPU on the market. This video is full of so many holes I can't even use this cheese on a grilled cheese sandwich.

    • @rrxt
      @rrxt 3 days ago

      @@lockinhinddanger934 you foaming out the mouth I can’t even lie

    • @Torboy124
      @Torboy124 2 days ago

      @@lockinhinddanger934 You know this video was done for fun right? You really need to go outside and find something real to do.

    • @LeitoAE
      @LeitoAE 14 hours ago

      Right, but there is one trick I think could work to get more FPS out of 2 GPUs. Use Lossless Scaling and make your main GPU render the game while the second GPU generates frames in the LLS app. I tested it and it works, but unfortunately my second GPU was too weak and caused frame drops, because the whole system had to wait for it to generate the fake frames. Still, it is worth trying if someone has something better than a 1050 paired with an RTX 3070.

    • @ROcheater
      @ROcheater 14 hours ago

      @LeitoAE if you're referring to GPU passthrough, I guarantee your 1050 is not doing anything to benefit your frames. I don't know what your exact setup is, but the fact that you're saying you're getting fewer frames from whatever you're doing does, in fact, not make more frames.

  • @jaxbade
    @jaxbade 1 month ago +1

    Maybe going with an X99 Xeon or an old Threadripper equivalent could also work properly. They usually have more PCIe lanes, and the motherboards from that time were designed with SLI and Crossfire in mind.

  • @creativeartdesign4820
    @creativeartdesign4820 1 month ago

    Hi,
    I’ve been following your channel for quite some time and really appreciate the content!
    I’m looking for advice on building a new PC for both gaming and AI development, specifically for working with LLM models. I also do a lot of development using ReactJS.
    Currently, I have a decent setup with an Intel i9-9xxx series processor and an AMD Radeon IV GPU. However, the lack of CUDA support is limiting my AI-related work. Additionally, I frequently use Adobe Illustrator, and it tends to lag when working with larger files. My system has 32GB of RAM running at 3600MHz.
    I’m planning to build a new PC that can handle these tasks better without going overboard on the budget: something that’s efficient, not overkill. I’m mainly confused about which motherboard to choose, and I’m considering upgrading my CPU, memory, and GPU to a near-top-tier setup.
    Do you have any recommendations? Should I go with AMD or Intel, or would it be worth waiting for Nvidia's new CPUs? I’m not sure which direction to take and would really appreciate any guidance.
    Thanks so much!

  • @grtitann7425
    @grtitann7425 1 month ago +2

    Test for you.
    1- AMD GPU as primary.
    2- Ngreedia GPU as secondary.
    3- Install Arkham Knight.
    4- test if PhysX works using the Ngreedia GPU just for that

    • @pacomatic9833
      @pacomatic9833 1 month ago

      PhysX crashed all the time with one GPU, I doubt it'd be any stabler with 2

    • @grtitann7425
      @grtitann7425 1 month ago

      @@pacomatic9833 If you remember, PhysX launched with the Ageia accelerator.
      Ngreedia bought them and locked the tech behind their hardware.
      Back then, it was possible to use an AMD GPU as primary and the Ngreedia GPU just for PhysX.

    • @PupperTiggle
      @PupperTiggle 1 month ago

      might actually work, pretty sure nvidia already added support for having a second gpu dedicated just to physx

    • @grtitann7425
      @grtitann7425 1 month ago

      @@PupperTiggle it always worked, but Ngreedia went out of their way to make sure that it didn't work.
      They spent a decade on that anti-consumer crap.
      We went through hell back in the day to be able to use our GPUs.

  • @ready4fraud
    @ready4fraud 5 days ago

    The 1080 Ti has ALWAYS been a solid graphics card.

  • @Silentguy_
    @Silentguy_ 1 day ago

    I actually have run something similar to this for streaming. AMD 6800XT for playing games and Nvidia 1050 Ti for streaming and video encoding. Only reason I don't do that anymore is because I threw the 1050 Ti in a Xeon workstation I use for a media server.

  • @redkaled
    @redkaled 1 month ago +1

    I think the first Tomb Raider benchmark went up with 2 GPUs because the 1080 Ti took over the PhysX rendering, which freed up some resources on the 6900 XT

  • @Kwijibob
    @Kwijibob 1 month ago

    PCIe bifurcation, my dude. Split the x16 into 2 x8 slots :D

  • @UCs6ktlulE5BEeb3vBBOu6DQ
    @UCs6ktlulE5BEeb3vBBOu6DQ 26 days ago

    About combining GPUs: My server (2U) has 2x Nvidia Tesla P40 24GB, 1x Nvidia A2000 12GB, 1x Matrox G200eR (onboard) and 1x AMD Radeon Pro WX 3100 and it never crashed.

  • @jacobtravis5824
    @jacobtravis5824 4 days ago

    Guys. Let's all donate money to see the finished computer. We can do it.

  • @JeremiahPierre-tu5qz
    @JeremiahPierre-tu5qz 1 month ago

    Guys support this man so we can get a part 2!!!!!!!

  • @SnowBoostservice_official
    @SnowBoostservice_official 1 month ago

    this is insane, cool vid G!

  • @YourArcanist
    @YourArcanist 11 days ago

    Tbh they would make so much more money if they both made their stuff work together.

  • @samuel-br.man__3571
    @samuel-br.man__3571 25 days ago

    You're doing what i dreamt

  • @Chaos_God_of_Fate
    @Chaos_God_of_Fate 1 month ago

    The more realistic pairing would be the 1080 Ti with a 5700 XT. There may be some missing features with the 5 vs 6 series, but the 1080 Ti and 5700 XT are almost identical in benchmarks: one is better in some games, the other is better in others, but when averaged they're within margin of error. Still, this is a really neat test, and I've wondered a bit about this ever since the feature was new; I just haven't heard anything about it since it first became possible a few years ago. Thanks for the video, it's very interesting!

  • @frenaedits
    @frenaedits 1 month ago +6

    Now that is an original idea

    • @anadventfollower1181
      @anadventfollower1181 1 month ago

      @@frenaedits It's been done before, and I thought about it years ago - just no money to execute the experiment.

    • @arenzricodexd4409
      @arenzricodexd4409 1 month ago +2

      Is it? There were official products that tried to do this about 15 years ago.

  • @lpfan4491
    @lpfan4491 1 month ago

    This unironically feels like a "I am gaming on an AMD gpu, but I just have to use Nvidium in Minecraft" kind of thing. There are some gains (when they don't bite each other and make it worse), but it feels so unneeded.

    • @platinumsun4632
      @platinumsun4632 1 month ago

      Wait how does that even work? You can use shaders with Nvidium anyways.

  • @zpnk
    @zpnk 1 month ago +1

    That dude at 0:40 installing the second card gave me a brain aneurysm...

  • @nikifor5229
    @nikifor5229 4 days ago

    In my experience, having GPUs from several different manufacturers mostly just means you can connect several additional monitors; keep in mind that each monitor is rendered by the GPU it is connected to (thanks, captain). The more interesting part is the manufacturers' own technologies. I found a case on the web of using a GT 1030 to handle PhysX in Borderlands 2 or 3, I can't remember which, while all the main rendering happened on an RX 580. Another option: at the moment I use an eGPU with a laptop. The laptop has an AMD 780M iGPU and via the eGPU I connect an RTX 3060; in Baldur's Gate 3 on a desktop PC with an RTX 3050 I only had the option to enable DLSS, not FSR, while in my eGPU build I have both FSR and DLSS. So it seems to me that a performance gain from GPUs of different manufacturers only happens if the user parallelizes tasks himself, for example the main monitor with the game on a high-performance GPU and a funny cat video on the second monitor rendered by a weaker GPU. SLI and Crossfire themselves are very lame solutions: with Nvidia, both GPUs run at the same frequencies, so the most powerful one adjusts down to the weaker one, and in most tasks one card checks the work of the other, which is why SLI configurations very often had problems. AMD Crossfire has no frequency limitation, but I very much doubt that legacy AMD video cards with old drivers in Crossfire can be stable in games or work tasks.

  • @PVT_MaYhEm
    @PVT_MaYhEm 1 month ago +1

    I have a spare 1080 laying around... and I have a 6950XT...
    I also have a motherboard that supports bifurcation (Basically, SLI/Crossfire capable)
    I have 2x PCIe X16 slots... (x16/x0, x8/x8).
    As long as the motherboard supports PCIe Bifurcation, it can likely support it.
    What you're saying is possible. It has been done. This is a topic as old as it gets... no offense.
    The only caveat being, you can't run the same game off both gpus in almost every scenario... with a few exceptions where it actually works.
    NOW... if your PRIMARY GPU is an NVIDIA... you can set the AMD card in NVIDIA Control Panel to handle PhysX/Compute whilst the NVIDIA GPU handles the graphics load.
    This... usually works better if the secondary card is an NVIDIA card... even if not the same generation.
    As long as the signal is x8 or better, the GPUs can focus mainly on what they need to do.
    In many cases, people use this for virtualization... utilizing hypervisor OS to run virtual machines... tying those GPUs to their own virtual machines, and specifying the amount of compute per cpu, you can have 2 gaming pc's in one.
    I have a Ryzen 5900X, 6950XT, and 1080, all on water. I could do this without issue... but you pointed out the most obvious plot-twist... power. Even though I'm capable of running it with a 1200W PSU, the power consumption just from running the second GPU will increase by at least 100W at idle, and if both gpus are utilized for gaming, well, you can imagine the power bill is akin to running a space heater.
    Let's face it, Virtualization is the only reason to have more than one GPU anymore these days. Nobody is optimizing for multi-gpu gaming anymore.
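
    For the virtualization route mentioned above, handing one of the two cards to a VM usually comes down to a libvirt hostdev entry along these lines (illustrative sketch only; the PCI address is whatever lspci reports for the card being passed through, and the host needs IOMMU/VFIO set up first):

      <hostdev mode='subsystem' type='pci' managed='yes'>
        <source>
          <!-- PCI address of the GPU dedicated to this guest (example address) -->
          <address domain='0x0000' bus='0x0a' slot='0x00' function='0x0'/>
        </source>
      </hostdev>

    Each guest then gets its own GPU and drives its own monitor, which is the "2 gaming PCs in one" setup described in the comment.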

  • @coffeelofi46
    @coffeelofi46 14 days ago

    you could just use software that tricks the pc into thinking that your 1 monitor is actually 2 separate monitors, and you can then assign both of the GPUs to render the left side and right side separately, leading to 100% consumption of both GPUs (idk the software name, i just heard about it on the internet like a month ago, you just have to do your own research) :)