Now I know next time I buy a GPU I can get an RTX 4090 with no ports for just one lung and play my games on RTX graphics at 100-ish FPS on my deathbed
5:26 Wrong. The difference is caused by encoding and data transfer. The GPU has to take the frame, turn it into a 'video' (actually just a data stream), and the iGPU in your CPU needs to convert that into display output and send it out through the mobo. This extra communication between GPU and CPU, back and forth over the PCIe bus, also reduces performance a little. However, if you play heavier games, or your frame rate is high, the GPU and the bus connection are loaded more heavily, so the impact is more noticeable and the difference is larger. 15% or 130/170 fps is insanely high. That's 1 or 2 tiers of GPUs... Better to buy some normal GPU cheap than help a miner cash out on a used GPU.
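For what it's worth, the two figures quoted above don't quite agree; a quick worked check of the 130 vs 170 fps numbers from the comment:

```python
# Worked check of the quoted figures: 130 fps routed through the iGPU vs 170 fps direct.
direct_fps, routed_fps = 170, 130
loss_pct = (direct_fps - routed_fps) / direct_fps * 100
print(f"{loss_pct:.1f}% performance loss")  # → 23.5% performance loss
```

So 130/170 is closer to a 24% loss than 15%.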
It actually doesn't just go through the motherboard to the port: it goes through the motherboard to the iGPU, which sits inside the CPU and acts as a relay, and then gets sent through the motherboard to the port.
Slight misconception in your DVI specs reasoning: only single-link DVI is limited to 2560x1600 (and then only at 30Hz; 1920x1200 at 60Hz), while dual-link DVI can go up to 3840x2400 at 30Hz (and does 60Hz if you run it at 2560x1600)
Ummm, idk about that. It seems the DVI colors are richer and carry more effects, so how would comparing fps show that one was better than the other? They are not outputting the same graphics. Did the motherboard port choke the output? If so, using the GPU to render and then sending the output through the motherboard is useless, unless you believe higher fps with limited colors beats lower fps with great detail. But that puts you right back where you started: balancing performance and quality. And adding the GPU into the mix adds power consumption, heat and expense... therefore, in my opinion, getting a GPU without an output is silly.
Interesting clip, thanks, but I have a question or two. Do the Windows power settings matter here (GPU power level set to maximum performance)? And how many PCIe lanes is the card running at when routed through the iGPU versus outputting directly from its own DVI port as the discrete GPU?
If your CPU doesn't have integrated graphics, would installing a cheap low profile graphics card be another option to get monitor ports for that mining GPU?
Why would you buy an ex-mining GPU anyway? You have no idea how hard it's been thrashed, or how long it'll last. Interesting information from a technical point of view, but you wouldn't catch me buying one.
The GPU may be cheaper, but the hardware requirements are more specific: more CPU workload, a higher electric bill, less performance... I'd rather buy a new GPU built from an ex-mining chip with normal ports; the price is only slightly different, and it comes with a warranty nevertheless.
I was wondering if HDMI, mini-HDMI, or DP ports could be soldered onto the PCB, an appropriate backplate put on, and the vBIOS reflashed to that of a non-mining card. But all that cost and effort may not be worth it.
The frame rates seem OK even when they're lower, but what about artifacting and object pop-in from the motherboard port? I mean, do I really care about FPS rates with that bad an image?
Absolutely. Video encoders aren't tied to the ports so you could easily be playing a game on your main GPU while using the encoders on a secondary GPU for streaming. I typically game on a 6800XT but I've often used a 1050 Ti for streaming. I got a used 1050 Ti for around $50 USD. They don't need any more power than what a PCIe slot provides and still have great NVENC video encoders. I've got an identical setup running a Plex media server with a 1050 Ti and it can easily handle multiple video streams at once.
Some higher-end NVIDIA chips can handle more than one simultaneous NVENC encode. I have a 1050 4GB non-Ti (got it new for just under $100; it's half-height single-slot, which I needed for my PC) and since it's the same GP107 as yours (just cut down a little) it can handle one stream, but a GTX 1080 Ti can handle 2 and the Tesla P100s can handle 3! @@Silentguy_
I wonder how different the performance is when using different iGPUs. Especially with a more modern one (but not an expensive one, as that would be unfair), or an older but higher-end APU.
Watching this video gave me a silly idea: if I ran Linux with a cheap 3-port display graphics card and a high-end graphics card through a KVM, wouldn't this work too?
Tbh I am never gonna buy a GPU that was used by a crypto miner. Why? Well, I do not want to support them at all. Let them eat all of those GPUs if necessary. If someone wonders why a low-end GPU nowadays is priced like high-end used to be, it's basically because crypto mining sky-rocketed demand so high, and then it turned out that a desperate (but rich) gamer will buy a GPU at ANY price. The GPU has become a "luxury product". The average Joe has to either buy a laptop, game on an APU, or maybe a Steam Deck with a dock. Mining has ended, but prices on new GPUs have stayed.
For CPUs that don't have an iGPU, I wonder if it's possible to throw in a garbage card like a gt 710 or something similar and pass the mining card through that.
Was the card with ports a proper 580? Because the no-port card you were using might be more like a 570, and performance on those cards is slightly less than a proper 580.
I think he used the same card for the tests. And yes, 2048SP is technically 570 spec. But since AMD themselves released a 580 2048SP, that's what it is called.
Ah yes... The "No ports Gay Yao Ming"
It served long and hard in the gay yoai mine
This also works with compute accelerator GPUs, a Tesla M40 for example: with a few changes in the command prompt to switch it from compute mode to WDDM mode, you get GTX 980 Ti performance with 12GB. EDIT: the command is nvidia-smi -g 0 -dm 0 (idk why it doesn't let me reply)
So a titan x basically
@@thisthingmint250 and at a cheaper price
you mean 1070 with more power draw
@@Insomnifera and at a cheaper price
What are the changes you need to make in cmd?
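The command from the EDIT above, spelled out (a sketch; it assumes the Tesla is GPU index 0, needs an admin command prompt, and a reboot afterwards):

```shell
nvidia-smi -L          # list GPUs and note the Tesla's index
nvidia-smi -g 0 -dm 0  # set GPU 0's driver model to WDDM (0 = WDDM, 1 = TCC)
```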
01:55 this is done automatically by Windows.
The display output is done using the iGPU.
The computing for the game is done by the GPU.
Really depends on the game: anything running on OpenGL, if not told otherwise, will always just choose the iGPU, but anything running on Vulkan or DirectX seems to pick the strongest GPU 👍
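That "strongest GPU" pick is roughly what DXGI's high-performance adapter preference boils down to: prefer a discrete adapter, then the one with the most dedicated VRAM. A minimal sketch of the heuristic with mocked adapter data (names and numbers are made up for illustration, not queried from the OS):

```python
# Sketch of the "pick the strongest GPU" heuristic: discrete beats integrated,
# then more dedicated VRAM wins. The adapter list is mocked, not enumerated.
def pick_adapter(adapters):
    return max(adapters, key=lambda a: (not a["integrated"], a["vram_mb"]))

adapters = [
    {"name": "Intel UHD Graphics", "integrated": True, "vram_mb": 128},
    {"name": "RX 580 2048SP", "integrated": False, "vram_mb": 8192},
]
print(pick_adapter(adapters)["name"])  # → RX 580 2048SP
```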
And if you use Nvidia GPUs it's gonna be easier, as the Nvidia driver has the Optimus feature, which automatically switches between the iGPU and the discrete GPU depending on the load demand
@sihamhamda47 that's only for laptops, Optimus won't work in this case
NVIDIA mining cards require driver mods to be used for anything other than mining. LTT has a video on how to modify the drivers to use one as a rendering device and pass frames through to the iGPU. AMD specifically allowed this, but NVIDIA had it locked down so people were forced to buy cards with ports. So a BIOS and Windows setting will work on AMD, but not NVIDIA, unless you download and use modded drivers or (for security reasons) modify your own drivers. @@sihamhamda47
@@sihamhamda47 It's not quite that simple. Depending on the game, the game may still only see the iGPU and use that. It's impossible to force a game to use the dedicated GPU if it's not configured to.
If I recall from an LTT video I saw, you lose performance for a couple of reasons. One, like you said, the signal has a longer path to travel on the motherboard. But the bigger reason is probably that even though your GPU is doing the video computing, it still sends the frames back to your CPU to process and output via the motherboard video port, putting more workload on the CPU and adding extra steps in itself.
If the CPU is very powerful, you won't notice the performance loss because it is very minimal
@@uluhitah12 yes but who has a very powerful cpu together with a crappy gpu?
@@jeroenvdw almost everyone who can afford a PC can buy a mid-to-high-end CPU, but not a mid-to-high-end GPU. GPU prices have been pretty insane for the past ten years.
Yeah, the MUX Switch.
This won't work so easily with all types of mining GPUs; some of the later ones have severely cut-down PCIe lanes and/or a different VBIOS
Also normal 580s are like 50 bucks now but it's a nice thing for people to know and maybe use in a pinch in the future.
You can reflash them to a normal BIOS if you know how. I have done it on an AFOX Mining RX 580, turning it into a normal gaming card. There is a risk of bricking the card though.
it literally will
If you buy an Nvidia card you can run the Nvidia driver patcher to install it, then just flash a TechPowerUp BIOS. The PCIe lanes can be restored if you have the time and equipment; I've done it many times. Even if you don't, the hit to frames isn't that bad, I think it's like 5%. The thing they wanted crippled was AI tasks, which require the high bandwidth.
If you can find one for very cheap, using an ex-mining card with no ports might be a good thing. However, the abundance of used graphics cards on the market makes the savings pretty much negligible imo
Some games such as Hogwarts Legacy have an option to choose what GPU you want the game to use but not all games have this option. For games that don't have a GPU option, you will have to use Windows Graphics settings.
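For games without an in-game GPU picker, the Windows Graphics settings toggle just writes a per-app preference under the current user's registry; the same thing can be set from a command prompt (sketch; the game path is a made-up example, and "2" means high performance):

```shell
reg add "HKCU\Software\Microsoft\DirectX\UserGpuPreferences" ^
    /v "C:\Games\Example\game.exe" /t REG_SZ /d "GpuPreference=2;" /f
```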
That's the reason we need a "MUX switch" on gaming laptops. Basically, the discrete graphics card in a gaming laptop has no video-out port of its own and must use the video output routed through the iGPU.
I actually ran a similar kind of setup with an AMD 6800XT and an Nvidia 970 a few weeks ago to see how it would do.
Ran my 1440p display from the 970 and had the 6800XT do all the game processing.
The few games I tried worked at playable levels even at higher settings but it was very clear that this was a very inefficient way to run games and there were a few caveats with my specific setup that caused some losses in performance.
For starters, the DisplayPort of the 970 was several iterations behind the 6800XT and my screen did not refresh as quickly even though it claimed to be running at 144hz.
The 6800XT was also never fully loaded. Both GPUs were seeing over 50% load (even though the 970 shouldn't really have been doing any real work), which caused significant power usage (>300 watts). My frame times also increased drastically. If I had to guess, I'd say it was a result of the data having to be routed through two separate GPUs, both of which were not running at their full PCIe x16 rating because my CPU (9900K) was stretched for PCIe lanes given that I was also using an NVMe SSD running at PCIe x4.
So essentially I had 2 x16 GPUs running at PCIe 3.0 x4. Whole thing was a mess.
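The x4 bottleneck above is easy to put a number on. A quick sketch of usable PCIe link bandwidth (assumes 128b/130b encoding, which applies from gen 3 onward):

```python
# Approximate usable PCIe bandwidth: transfer rate (GT/s) x lanes x encoding efficiency / 8 bits.
def pcie_gb_per_s(gt_per_s, lanes):
    return gt_per_s * lanes * (128 / 130) / 8

print(f"PCIe 3.0 x4:  {pcie_gb_per_s(8.0, 4):.2f} GB/s")   # → 3.94 GB/s
print(f"PCIe 3.0 x16: {pcie_gb_per_s(8.0, 16):.2f} GB/s")  # → 15.75 GB/s
```

So each card was working with roughly a quarter of the bandwidth it was designed for.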
My setup consisted of an RX 570 connected by HDMI to a TV, and a VGA monitor connected to the motherboard. That was my way to save on a DVI-VGA adapter. I did some FurMark tests before selling my GPU: on the TV it was about 126 fps, while on the monitor it was about 113 fps. And yes, I am sure the monitor output was generated by the GPU; no way an Intel Coffee Lake iGPU could do only 10% less than an RX 570
The performance difference isn't as bad if your hardware supports Cross Adapter Scan-Out (CASO). 11th-gen Intel and Ryzen 6xxx series is where support starts, I think. Usually you lose about 10-16% performance by going through the integrated graphics without CASO.
I do the opposite of this on my laptop to unlock features and increase my performance. It a little more than doubles my frame rate through frame generation and upscaling.
Mux-less laptops are configured this way. AMD drivers support laptop graphics with desktop features if you have a mux. Alternatively, if you use Smokeless or an MSI advanced BIOS (press Right Ctrl + Shift and Left Alt + F2 all at once), you can select Advanced -> AMD PBS -> Primary Graphics/Display Adapter -> Dedicated GPU, plug in an external monitor, and restart. Then you can use AMD Fluid Motion Frames (frame gen) and RSR (upscaling). This is for Radeon RX 5xxx series and above.
The proper full mod is to add the video output circuitry and the ports, then flash a standard BIOS to the card. That unlocks the full potential of the card, but it's more expensive because of the components required for video output. The method you explained works well as long as the performance hit isn't much of an issue.
You should have stated from the very beginning that this method requires a motherboard that supports integrated graphics and a CPU that has an iGPU. There are plenty of motherboards with no integrated video output and plenty of CPUs, especially Ryzens, with no iGPU.
It's funny because we are getting to the point where a Ryzen APU could perform as well as or better than these mining cards.
@@batterypwrlow AMD's best integrated graphics is the Radeon 780M. The RX 580 still outperforms it, although the 780M gets close; but if one of these mining cards is worse than the RX 580, the 780M might outperform it.
Additionally, the CPUs with the 780M (or similar iGPUs) are going to be quite expensive, and since they're DDR5-only, the rest of the system would also be very expensive. If you can afford one of those, you can definitely pick up a brand-new, higher-end GPU like a 3070 Ti or RX 6700 XT or something like that instead. This video showcases a budget strategy.
@@batterypwrlow it's crazy because you're a liar and that's definitely not ever going to happen.🎉
This example was done with a 2048SP model, so it's closer to a capped 570 even before this workaround with the iGPU @@eduardoroth8207
@@eduardoroth8207 AMD insiders report the Ryzen 5 8600G iGPU *IS* outperforming a 1060 in benchmarks, and that's just the beginning of low-end GPU prices finally falling. It'll be slow, but it eventually *WILL* happen.
It's not quite as simple as that, especially for older games. Some games only see the primary GPU, in most cases this is the iGPU. So even by forcing a game to use the dedicated GPU option, it'll still actually only use the iGPU. This also comes down to firmware, drivers, and manufacturers' implementations of those.
thank you so much
I've been trying to find a video about these no-port cards to see if there's a difference.
When I was searching for this solution like 2 years ago, everyone was telling me it was impossible, but it was so simple.
I get 4k out of both my RX 470 DVI-only ex-mining GPUs using a DVI-to-HDMI adapter/cable and the wonderful, free AMD/ATI Pixel Clock Patcher utility. I also get audio through DVI, which was initially a surprise to me. I've tried using the iGPU on my 12700K and the steps mentioned in the video but for some reason it hasn't worked well for me. I think audio was a problem? Anyway, I went back to the DVI port and my adapter/cable. Nice video; thank you for sharing.
The difference in fps is not latency. It's just that when the GPU renders a frame, it has to send it back to RAM, then the CPU has to compose the image into the integrated GPU's framebuffer, and then the integrated GPU has to send the image to the display.
so basically the same copyback type of thing that Windows Vista did...?
@@SeeJayPlayGames The same thing laptops do to this day
5:28 you're close. When the GPU you've selected draws each frame, it has to be sent to the framebuffer of the GPU that has the display connected to it. If you're drawing 60 frames of 1080p, you're going to chip away 355.957 MB/s off the PCIe bus of each of those graphics cards, since the exchange of frames is transactional in nature. On top of that, the cards need to do extra processing to send and receive the frames rather than be busy with the game. It'd be much more efficient to just get a soldering iron and solder the missing ports on.
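That 355.957 figure checks out if you assume 24-bit color; a worked version of the arithmetic:

```python
# Frame-copy bandwidth for 1080p at 60 fps, assuming 3 bytes (24 bits) per pixel.
width, height, bytes_per_px, fps = 1920, 1080, 3, 60
bytes_per_sec = width * height * bytes_per_px * fps
print(f"{bytes_per_sec / 2**20:.3f} MiB/s")  # → 355.957 MiB/s
```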
Still love how clean, crisp and bright the Intel GPUs render.
None of this should be needed, though, as it's initialized automatically by the game; it just has to route that video through a GPU running at a lower clock.
underrated channel
Didn't know it was possible...Appreciate the info!
This method only works out of the box on older cards with AMD GPUs; NVIDIA GPUs require driver mods. There are a few videos about this, and they all found AMD GPUs mostly allow it while NVIDIA GPUs don't. Hence the LTT video explaining it and how to modify your drivers on NVIDIA cards to pass the rendered frames through to the iGPU.
You can also use this for a home server for plex/jellyfin
Great idea!
Absolutely. One of these cards could easily handle a dozen or more transcoding sessions.
As long as they aren't 4k anyways lol
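As a sketch of what one such transcoding session looks like on an AMD card under Linux (VAAPI; the device path and file names here are assumptions for illustration):

```shell
# One hardware-accelerated transcode via VAAPI (the render node path may differ per system).
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format vaapi \
    -i input.mkv -c:v h264_vaapi -b:v 4M -c:a copy output.mkv
```

Plex and Jellyfin issue essentially this kind of command internally when hardware transcoding is enabled.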
First, you need a gaming BIOS for the card you want to use like this, and I recall that for some cards you can find plenty of BIOSes while for others there are none available. Soldering HDMI ports onto mining cards is also easy (plus a few more components, also easily done). Whenever you want to buy a mining card, check if there are gaming BIOSes available for download, then proceed with buying.
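If a gaming BIOS is available, the flash itself is usually done with AMD's amdvbflash tool (a sketch; adapter index 0 and the ROM file names are assumptions, and always save a backup first since flashing can brick the card):

```shell
amdvbflash -i                # list adapters and note the card's index
amdvbflash -s 0 backup.rom   # save the current (mining) BIOS as a backup
amdvbflash -p 0 gaming.rom   # program the gaming BIOS onto adapter 0
```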
I'd say this video is pretty misleading from lack of knowledge.
1) First off, standard drivers may not work with these GPUs. On Nvidia they 100% don't work, so you rely on hacked drivers.
2) These GPGPUs may not be x16. For example, the CMP line is limited to x1 PCIe (or you can solder on additional surface-mount capacitors for x4-x8).
3) No RTX.
4) No DLSS.
5) Less performance out of the box (these cards have lower core counts), up to 30%.
6) Increased latency and possible stutters from the data transfer between the GPGPU and iGPU.
7) You need an iGPU, and having no ports is a huge disadvantage because most software assumes a single card, and not every game will work correctly with this setup.
The higher the class of GPGPU you buy, the less return you get in terms of usability.
Better off seeking a good deal on a standard after-mining GPU, like I did after looking through these cards. I bought a 1660 for $100 and after half a year sold it and bought a 3060 12GB for $200.
Such a setup can only be supplemental. It will quite honestly depend on your CPU's integrated GPU.
So say you have an Intel CPU with integrated graphics: the GPU with no ports will not behave as truly dedicated.
Output will still depend on the speed of your Intel HD Graphics. It will say it's dedicated, but if you look at your Task Manager it will tell a different story. Games oftentimes rely on what the BIOS reports as the dedicated GPU.
The GPU without any ports, the Sapphire Nitro+, actually has an HDMI port behind the PCI slot cover! You only need to drill a hole there, and there it is: a fully working HDMI graphics card!
It doesn't work by default because there are resistors missing. Watch this: ua-cam.com/users/shortsuLBAmM9bV9U
@@gpuspecs hmmm, interesting. I bought the exact same GPU for a friend and it worked flawlessly. Maybe there are different revisions?
@@ACoTam2 Some of them come with the resistors installed, some don't, some come with some of them installed. The one I got had half the resistors populated but needed additional ones to activate the HDMI, and then the PCIe decoupling caps were also missing, limiting the card to PCIe 3.0 x1 until I soldered some on.
thank you so much i was looking for this
This is exactly how gaming laptops work. An APU and a dedicated GPU. Windows can switch automatically. Also, if you have a UPS for your PC and it has a USB port (usually USB-B) plug that into your computer and it will show state of charge and battery usage when power goes out. From there you can have your system automatically save and safely shut down or hibernate when remaining power is at whatever you set it to. it 100% acts like a laptop.
It's how gaming laptops *should* work, but as I experienced, it's not quite that simple, and there are quite a few caveats and things that need to align and work with each other.
0:19 you can see the Sapphire has an HDMI port which is just hidden. Unscrew the backplate and there it is. If you reflash it with the gaming-version BIOS, performance may improve a bit, since the mining BIOS limits gaming performance to roughly 70%.
This was an interesting test. A more budget friendly GPU with some FPS loss.
It's very important to get a graphics card with x16 PCIe lanes. Some of these cards have only x4, which isn't enough for gaming; maybe x8 will work, but I'm not sure.
This works by copying the frame buffer to the iGPU, which is also where the performance bottleneck is... Might be worth it in some cases, might not.
The reason performance is lower is the same reason the SLI bridge was developed.
Your GPU renders frames, then sends them to the iGPU through the PCI Express bus. That's more work for the bus, and the performance difference gets more noticeable the lower the PCI Express generation.
Also, as far as I know, AMD cards are more sensitive to PCIe bus speed.
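The frame-copy overhead described above is easy to put a rough number on. This is a back-of-envelope sketch, assuming uncompressed 1080p RGBA frames copied once per frame over PCIe; the usable-bandwidth figures are approximations:

```python
# Rough arithmetic for the dGPU-to-iGPU frame-copy traffic.
# Assumptions: 1080p RGBA (4 bytes/pixel), uncompressed, one copy per frame.

def frame_copy_gbps(width, height, bytes_per_pixel, fps):
    """Extra PCIe traffic in GB/s generated by shipping frames to the iGPU."""
    return width * height * bytes_per_pixel * fps / 1e9

traffic = frame_copy_gbps(1920, 1080, 4, 144)  # ~1.19 GB/s at 144 fps
pcie3_x4 = 3.94    # approx usable GB/s, PCIe 3.0 x4
pcie3_x16 = 15.75  # approx usable GB/s, PCIe 3.0 x16

print(f"{traffic:.2f} GB/s of extra bus traffic")
print(f"{traffic / pcie3_x4:.0%} of a PCIe 3.0 x4 link")
```

On an x16 link this traffic is a small fraction of the budget, but on the x4 cards mentioned above it eats a meaningful chunk, which fits the observed FPS drop.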
Only catch is that driver support is limited; depending on the type of mining GPU, it may not work out of the box.
Does this mean I can use a high-end graphics card through my motherboard's VGA port?
The GPU will send the data (basically the video) to your iGPU, and you'll get an output like this, yes.
Which one has more lag: an HDMI-to-VGA converter, or routing through the motherboard?
@@1marcelfilms No lag, but a performance drop if you go the motherboard route.
the converter I would say ?
The mobo doesn't have much lag, and neither does the GPU.
Just the signal having to get converted can take "time" @@1marcelfilms
wait: what about SAM on, vs off? Have you tried HAGS on vs off? (I know HAGS suck at AMD, but would love to know if there's a difference in this scenario...) Cool video.
and one more thing: active northbridge cooling...
awesome video, I think I'll be picking one up from aliexpress!
Now I know next time I buy a GPU I can get an RTX 4090 with no ports for just one lung, and play my games with RTX graphics at 100-ish FPS on my deathbed.
Don't these clearly still have the headers on them to accept a DVI or HDMI female socket? I mean, couldn't you just solder one on with a $10 iron?
Some mining GPUs have lower clock speeds and half the RAM, so the lower clocks affect FPS.
5:26 Wrong. The difference is caused by encoding and data transfer.
The GPU has to take the frame, turn it into a 'video' (actually just a data stream), and your iGPU in the CPU needs to convert that into display output and send it out through the mobo.
This extra back-and-forth communication between GPU and CPU over the PCIe bus also reduces performance a little.
However, if you play heavier games, or your frame rate is high, the GPU and the bus connection are loaded more heavily, so the impact is more noticeable and the difference is larger.
15%, or 130 vs. 170 fps, is insanely high. That's 1 or 2 tiers of GPUs... Better to buy some normal GPU cheap than to help a miner cash out on a used one.
It actually doesn't just go through the motherboard to the port: it goes through the motherboard to the iGPU, which is in the CPU and acts as a relay, then gets sent back through the motherboard to the port.
Slight misconception in your DVI-spec reasoning: only single-link DVI is limited to 2560x1900 (30Hz), while dual-link DVI can go up to 3840x2400 (30Hz, but it also does 60Hz if you run it at 2560x1900).
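The single- vs. dual-link limits come down to pixel clock: each DVI link carries up to 165 MHz, and dual link doubles that. A quick sanity check, assuming a rough 20% blanking overhead (real timings use CVT/GTF, so these are approximations) and the commonly quoted 2560x1600 mode:

```python
# Approximate DVI pixel-clock arithmetic.
# Assumptions: ~20% blanking overhead; 165 MHz per DVI link.

def required_pixel_clock_mhz(width, height, hz, blanking=1.2):
    """Rough pixel clock a mode needs, including blanking intervals."""
    return width * height * hz * blanking / 1e6

SINGLE_LINK_MHZ = 165.0
DUAL_LINK_MHZ = 330.0

# 2560x1600 at 30 Hz squeezes under a single link (~147 MHz)...
print(required_pixel_clock_mhz(2560, 1600, 30))
# ...but at 60 Hz (~295 MHz) it needs dual link.
print(required_pixel_clock_mhz(2560, 1600, 60))
```

This is why the same connector supports a high resolution at 30Hz on single link but needs dual link to reach 60Hz.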
This is freaking insane.
Ummm, I don't know about that. It seems the DVI colors are richer and carry more effects, so how would comparing FPS show that one was better than the other? They are not outputting the same image. Did the mobo port throttle the output? If so, using the GPU to compute and then sending the output through the mobo is useless. Unless you believe higher FPS with limited colors is better than lower FPS with great detail. But that puts you right back where you started: balancing performance and quality. And adding the GPU to the mix increases power consumption, heat and expense... therefore, in my opinion, getting a GPU without an output is silly.
This might be good for running an eGPU on a laptop, since you're going back to the laptop's screen anyway.
Interesting clip, thanks. However, I have a question or two:
power-level settings in Windows,
GPU power settings at maximum performance,
and what PCIe link state (how many lanes) when operating through the iGPU vs. direct from DVI as the discrete GPU?
If your CPU doesn't have integrated graphics, would installing a cheap low profile graphics card be another option to get monitor ports for that mining GPU?
Why would you buy an ex-mining GPU anyway? You have no idea how hard it's been thrashed, or how long it'll last. Interesting information from a technical point of view, but you wouldn't catch me buying one.
This essentially mimics the bottleneck laptop GPUs have without the use of a MUX switch.
The dynamic range takes a hit too? why is everything so bright on the iGPU mode?
Pretty sure the iGPU footage was recorded with a camera while the dedicated GPU footage was a screen capture.
nice video. can u test latency?
ahh its like a laptop gaming without mux switch
699 subs, let me fix that!
You are comparing an RX 580 to an RX 580 SP. They have different specs; the RX 580 SP is not as strong.
The GPU may be cheaper, but the hardware requirements are more specific: more CPU workload, a higher electric bill, less performance... I'd rather buy a new GPU rebuilt from an ex-mining chip with normal ports; the price is only slightly different, and it comes with a warranty nevertheless.
For half the price its not really worth it. Now if they were 80% cheaper it would be a good choice.
Thx broh excellent video
I was wondering if HDMI, mini-HDMI, or DP ports could be soldered onto the PCB, an appropriate back plate put on, and the vBIOS reflashed to that of a non-mining card. But all that cost and effort may not be worth it.
It worked for me, thanks pal...
This is basically what laptops have been doing this whole time.
I wonder if it would work on an AMD mobo with a non-G processor.
It looks like the output is different for each card when it comes to colors- is that just the recording or is that actually in game?
The frame rates seem okay even when they are lower, but what about artifacting and object pop-in from the motherboard port? I mean, do we really care about FPS with that bad an image?
Wonder if these mining cards still have their video encoders, and whether you could offload streaming workloads to the mining GPU?
should; they'd have to nerf silicon rather than just leave off ports in order to break that.
Absolutely. Video encoders aren't tied to the ports so you could easily be playing a game on your main GPU while using the encoders on a secondary GPU for streaming. I typically game on a 6800XT but I've often used a 1050 Ti for streaming. I got a used 1050 Ti for around $50 USD. They don't need any more power than what a PCIe slot provides and still have great NVENC video encoders. I've got an identical setup running a Plex media server with a 1050 Ti and it can easily handle multiple video streams at once.
Some higher end NVIDIA chips can handle more than one simultaneous NVENC encodes. I have a 1050 4GB non-Ti (got it new, for just under $100 - it's half height single slot, which I needed for my PC) and since it's the same GP107 like yours (just cut down a little) it can handle one stream, but a GTX 1080 Ti can handle 2 and the Tesla P100's can handle 3! @@Silentguy_
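The second-GPU streaming setup described above can be sketched as an ffmpeg command line. This is a minimal sketch, assuming an ffmpeg build with NVENC support; the GPU index, file names, and RTMP URL are placeholders for illustration:

```python
# Build an ffmpeg command that encodes on a chosen NVIDIA GPU via NVENC.
# Assumptions: ffmpeg compiled with --enable-nvenc; the spare card's index
# (check nvidia-smi) is passed as `gpu`; source/destination are placeholders.

def nvenc_stream_cmd(src="gameplay.mkv", dst="rtmp://example/live/key", gpu=1):
    return [
        "ffmpeg",
        "-i", src,                # captured gameplay (placeholder source)
        "-c:v", "h264_nvenc",     # hardware encode instead of x264 on the CPU
        "-gpu", str(gpu),         # pick which NVIDIA GPU does the encode
        "-b:v", "6M",             # a typical streaming bitrate
        "-f", "flv", dst,
    ]

print(" ".join(nvenc_stream_cmd()))
```

The point of the `-gpu` option is exactly the scenario above: game on the main card while a cheap secondary card's encoder handles the stream.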
yoo please make a video doing the exact opposite if possible
using ur iGPU through GPU
why not solder hdmi into it?
I wonder how different the performance is when using different iGPUs. Especially when using a more modern (but not expensive as that would be unfair) or older but higher end APUs.
Watching this video gave me a silly idea: if I ran Linux with a cheap 3-port display graphics card and a high-end graphics card through a KVM, wouldn't this work too?
Me with a VGA running at 1366x768 resolution.
Hmmm yes, 480p…
I used my GTX 1080 like this when my VGA adapter broke; I didn't notice any difference.
Do RX 580/570 cards need a driver mod for this, like the P106 etc.?
Tbh I am never gonna buy a GPU that was used by a crypto miner. Why? Well, I do not want to support them at all. Let them eat all of those GPUs if necessary. If someone wonders why a low-end GPU nowadays is priced the way high-end used to be, it's basically because crypto mining sky-rocketed demand so high, and then it turned out that a desperate (but rich) gamer will buy a GPU for ANY price. The GPU has become a "luxury product". The average Joe has to either buy a laptop, game on an APU, or maybe a Steam Deck with a dock. Mining has ended, but prices on new GPUs have stayed.
I used to have a P106-90, a 3GB mining 1060 with no ports. It worked not too bad, but was more hassle to set up; on Nvidia I had to do a regedit, if I remember right lol.
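The regedit in question, as reported in community guides for P106 cards, marks the mining card and the iGPU as an "MS Hybrid" pair so Windows treats them like a laptop Optimus setup. Heavy caveat: the subkey indexes (0000/0001) and the `EnableMsHybrid` values below are assumptions from those guides, not verified here; check in regedit which subkey belongs to which adapter before changing anything:

```python
# Sketch of the community-reported P106 "MS Hybrid" registry mod.
# ASSUMPTIONS: subkey indexes and EnableMsHybrid values are taken from
# community guides and may differ on your system; verify before applying.

GPU_CLASS = (r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def hybrid_reg_cmds(mining_subkey="0001", igpu_subkey="0000"):
    return [
        # 1 = render-only GPU (the portless mining card), per the guides
        f'reg add "{GPU_CLASS}\\{mining_subkey}" /v EnableMsHybrid /t REG_DWORD /d 1 /f',
        # 2 = display GPU (the iGPU that owns the motherboard ports)
        f'reg add "{GPU_CLASS}\\{igpu_subkey}" /v EnableMsHybrid /t REG_DWORD /d 2 /f',
    ]

for cmd in hybrid_reg_cmds():
    print(cmd)
```

After a reboot, Windows should then offer the mining card as the "high performance" GPU in the per-app graphics settings, according to those same guides.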
I honestly didnt know that.
And i dont need it, my 3090 got ports^^
But who knows, i may need it some day.
bro skip driver installation section
For CPUs that don't have an iGPU, I wonder if it's possible to throw in a garbage card like a gt 710 or something similar and pass the mining card through that.
Yes it is.
thanks
But what if you have an X99 mobo with no iGPU ports 😢
Can I do it on an X99 AliExpress mobo with an E5-2660 v3?
Costs 50% less and lasts 90% less 😂
How am I going to turn on my PC and switch settings without a screen, though?
You need integrated graphics on your CPU to use the ports on your mobo.
That's interesting. Now show me how to do that on linux.
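On Linux the closest equivalent is PRIME render offload: render on the headless card, display through the iGPU. A sketch of launching a game that way from Python, where `game_binary` is a placeholder; the environment variables are the documented NVIDIA render-offload switches and Mesa's `DRI_PRIME`:

```python
# PRIME render offload environment for running a game on the second GPU
# while displaying through the iGPU. "game_binary" is a placeholder name.
import os
import subprocess

def prime_env(nvidia=True):
    env = dict(os.environ)
    if nvidia:
        # NVIDIA proprietary driver's render-offload switches
        env["__NV_PRIME_RENDER_OFFLOAD"] = "1"
        env["__GLX_VENDOR_LIBRARY_NAME"] = "nvidia"
    else:
        # Mesa (AMD/Intel) selects the secondary GPU with DRI_PRIME
        env["DRI_PRIME"] = "1"
    return env

# subprocess.run(["game_binary"], env=prime_env())  # uncomment to launch
```

Same idea as the Windows setup in the video, just opt-in per process instead of automatic.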
Doing this is really not worth it.
i feel like your pc is overkill compared to gpu
Was the card with ports a proper 580? Because the no-port card you were using might be more like a 570, and performance on those cards is slightly less than a proper 580.
I think he used the same card for the tests. And yes, the 2048SP is technically 570 spec, but since AMD themselves released a 580 2048SP, that's what it is called.
RX 580 2048SP, but it might be an RX 470 or 570, not too sure. Watch this video if you're interested: ua-cam.com/video/Wf6GIrhyhu4/v-deo.html
are there any 3090 like equivalents like this?
This is really scraping the bottom of the barrel... Why am i watching this with 2 high end PCs?
not everyone is using windows my boi
Does this work only on Windows 10?
Or just use a 2nd cheap graphics card and point the settings at the graphics card you want to do the work. That's what I did lol.
He didn't test it with CS or Fortnite.
So?
Wonder if you can use these mining cards for Stable Diffusion workloads. Cheap image generation setup. The more VRAM the better though.
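For the Stable Diffusion idea above, the practical gate is VRAM rather than ports. A back-of-envelope feasibility check, where the model and overhead sizes are rough estimates (fp16 SD 1.5 weights around 2.5 GB, plus working memory), not measured figures:

```python
# Rough VRAM feasibility check for running Stable Diffusion on a mining card.
# Assumptions: ~2.5 GB fp16 model weights, ~1.5 GB working overhead (estimates).

def fits_in_vram(vram_gb, model_gb=2.5, overhead_gb=1.5):
    """True if the card plausibly has room for weights plus working memory."""
    return vram_gb >= model_gb + overhead_gb

for card, vram in [("P106-100", 6), ("RX 580 2048SP", 8), ("P106-90", 3)]:
    print(card, "ok" if fits_in_vram(vram) else "too little VRAM")
```

So the 6-8 GB mining cards are plausible candidates, while the 3 GB variants fall short, matching the "more VRAM the better" comment.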
Nice.
this is awesome i hope i found it sooner
Nah, this is not viable. Buying a card with no ports is stupid. Interesting video anyway.
You need a pop filter bro.
please go much nearer to your microphone
also: aw yess, much more input lag and worse performance