I think you pretty much nailed it. It's about power delivery and how clean it is. A standard 6-pin on a modern PSU has the same power potential as an 8-pin; they can both carry three 12 V lines. The difference is that the 8-pin has one extra sense pin and one extra ground, no extra power. The assumption you'd have to make is that this extra sense pin somehow lets the PSU control the power delivery more reliably, and that's why we use 8-pins. So under that same assumption we'd have to infer here that two individual 8-pins are better able to control power delivery because of the two independent additional sense pins, as opposed to the one daisy-chained in line. Cleanest possible power delivery, I guess?
When I moved from a pigtail cable to two cables I actually had to reduce the overclock on my 1080. I was even able to reproduce the results... So your results may vary.
So do I really have to run more cables? I already have the 16-pin connector from the one cable; should I save some time and money and just use the pigtailed (standard) way?
One 8-pin can supply about 300 watts. GPUs are designed to max out the intake on one 6+2-pin at 150 watts, so one pigtail can deliver the 2x150 watts easily. Cards with a 200-250 watt TDP are really safe to use with one pigtail if you have a decent PSU.
Guy with electronics experience here: this makes no sense at all. Unless the actual plugs are somehow being directed to separate portions of the PCB and not tied together, it shouldn't make a performance difference. If the card needs more watts, it will just pull them; the wire will get hotter, but that does not mean less current will pass through it. The card might have a sensing line connected to each plug, limiting the power just in case when you only plug one of them in, because electrically it makes no sense at all to have less performance. You should test a variety of cards from different manufacturers to reach a conclusion, because it makes no theoretical sense.
He should test other power supplies. One of the ground wires is in fact a sensing wire; that is in the PCIe spec. Not many PSUs actually use it, however; the one he is using may. And yes, they just get hot and melt. Seen them melt.
Look at the comment on Jay's post. The guy basically did the math for this actual cable. Even at 100 degrees Celsius the resistance change due to heat and the related loss in voltage was within the tolerances defined in the ATX specification. It should be mentioned that at 100 C the insulation and plastic parts will probably catch fire. The cable, while very soft, will probably still work though.
"if the card needs more watts, it will just pull them, the wire will get hotter, but that does not mean less current will pass through it"... Maybe in this situation where the wires are very short, but that is a very bad generalization to make. Hotter DOES equal more resistance... which equals less current. Again, I know I am nitpicking, but making a broad statement like that is bad. That is like someone trying to tell you that your car that makes 100 hp at sea level will make the same hp no matter the elevation above sea level. Sure, at 100, or 200, or even 1,000 feet it might not make a difference (like the short cables here)... but at 10,000 feet that car is going to make less HP. In my example: resistance = heat = elevation; wattage seen by card = HP. The greater the difference in heat (elevation), the greater the loss in wattage (HP). So, I guess what I am trying to say is that making broad statements that boil down to "heat doesn't matter" is just a bad way to explain all this without a qualifier, such as "heat doesn't matter in a cable that is less than two inches long".
Silviu Stroe: voltage drop. It happens any time there is a sudden load at the end of a wire, the "load" in this case being the GPU. Don't know if you've ever seen it, but older vehicles (80s) that have a volt meter that actually reads the real voltage of the charging system reflect voltage drop any time a load is applied, for example a blinker. This is obvious to anyone who has ever installed amps and subwoofers in their car: if you have two amps, both of which require 8 AWG, you don't run one 8 AWG and then splice it to go to both. You run at least 4 AWG and splice that, or run two 8 AWG wires.
Because of the resistance of the cable the current will slightly drop: V = I x R. The voltage (V) provided by the PSU will remain constant, but twice the cable length providing current to the graphics card means the total cable length will have twice the resistance, so R becomes slightly higher, causing a small decrease in current (I). When using two cables the resistance (R) will not increase, so there will not be a decrease in current. But since the wires are already pretty thick, the loss will be really small. But you have proven the laws of physics really work, thanks for the video!
I think it has something to do with the cable resistance, and the fact that when you try to use the same cable to power the 2 sockets of the GPU, it demands more power through 1 cable, which might increase its temperature, changing its resistance and making it lose a little bit of power.
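To put rough numbers on the heat-to-resistance-to-voltage-drop chain described in the last few comments, here is a minimal sketch. All figures are illustrative assumptions, not measurements from the video: 18 AWG copper at roughly 6.4 ohms per 1000 ft, the standard copper temperature coefficient, three parallel conductors per direction, and a ~200 W load at 12 V.

```python
# Rough sketch of the heat -> resistance -> voltage-drop chain for a PCIe lead.
# All numbers are illustrative assumptions: 18 AWG copper (~6.4 ohm / 1000 ft),
# 3 parallel +12 V conductors and 3 grounds, 2 ft each way, ~200 W load at 12 V.
OHMS_PER_FT = 6.4 / 1000      # 18 AWG copper at 20 C (assumed)
ALPHA_CU = 0.00393            # copper temperature coefficient, per deg C
CONDUCTORS = 3                # wires in parallel per direction
LENGTH_FT = 2.0               # one-way length; the return path doubles it
CURRENT_A = 200 / 12          # ~16.7 A total

def loop_resistance(temp_c: float) -> float:
    """Round-trip cable resistance at a given conductor temperature."""
    per_wire = OHMS_PER_FT * LENGTH_FT * (1 + ALPHA_CU * (temp_c - 20))
    return 2 * per_wire / CONDUCTORS   # out + back, conductors in parallel

for temp in (20, 60, 100):
    r = loop_resistance(temp)
    print(f"{temp:3d} C: R = {r*1000:.2f} mOhm, drop = {CURRENT_A * r:.3f} V")
# Going from 20 C to 100 C raises the resistance ~31%, but the drop stays well
# under a quarter of a volt for a short lead: heat matters, just not much here.
```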
Duck Souls: they will add delay to the signal, so it will hurt your benchmark scores. The user won't see any difference. I had to test this for HP on servers with NIC cards. We had a 1% to 0.1% increase in latency and a 2% +/- on speed with 10 Gb NIC cards. The cool part is with 100 riser cards we got 1000 ms of lag and a 10 Mb NIC connect lol.
+Lee Landis - Fascinating... I'm surprised that the network card still even worked with one hundred risers; you'd think that it would have started throwing errors and basically stopped working much sooner, given that PCIe signals weren't at all designed to be carried over long distances. I'm actually planning to use riser cables to relocate my boot drive (an M.2 SSD in a PCIe-to-M.2 adapter) into my to-be-custom 3D-printed PSU shroud, and I did worry about latency given that I'll be needing two riser cables to reach there, but now I'm a lot less worried. :p
Robert Faber: it took a lot of work and was really touchy. Interference was the biggest hurdle to overcome. PCI Express is really strong and good; it only worked when it wanted to, but 100 was the maximum we got to work, 101 was too much. My boss wanted to know the max so we could send an accurate rating to a data center on the specs. The same data center complained the HP-made riser in the server was slowing down the NIC. We proved it was within the margin of error, but they pushed, which led to the 1-month project that got me laid off after they filled my job with a temp that was more liked.
It depends on the gauge of the wire used, among other things like connection type, material at the junction, whether it's modular or not, and length of cable. All of these things add resistance to the wire and therefore will add a small bit of voltage drop, which will change the operation of the VRMs on the card. I'm sure an engineer can chime in with more specifics; I only took a few engineering courses and I don't use the knowledge every day. My Seasonic power supply had an asterisk or something in the manual that said something like "If your GPU draws more than x watts, use two separate wires."
As someone who is currently studying computer and electrical engineering at LSU, this is a bit surprising. I suppose a higher resistance could account for some of the performance differences, but it should be negligible. I wonder if it has more to do with how the card draws and uses power, instead of the actual cables themselves. It could also be a factor in the power supply as well. There are so many variables here, but it is a very interesting inquiry that I never would have thought of. It'd be very interesting to take this to a lab on campus and see exactly what was going on. Great video.
Well, from what I remember from basic electronics, electricity takes the path of least resistance. Two resistors in parallel, aka two wires, are calculated by 1/Rt = 1/R1 + 1/R2, i.e. Rt = 1/(1/R1 + 1/R2). Also, the total resistance of two resistors in parallel is lower than the smaller of the two.
Jay, I am an EE with 30+ years of experience. In my experience connector pins present resistance. Parallel paths reduce that resistance. The formula looks something like this: assuming that each circuit path has the same resistance, the total resistance = resistance of each path / number of paths.
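A minimal sketch of that rule of thumb. Each pin-plus-wire path is assumed identical, and the 10 milliohm per-path value is made up purely for illustration:

```python
# Parallel paths: 1/R_total = sum(1/R_i). For n identical paths this collapses
# to R_total = R_path / n, which is the rule of thumb in the comment above.
def parallel(*resistances: float) -> float:
    return 1.0 / sum(1.0 / r for r in resistances)

R_PATH = 0.010  # 10 mOhm per pin-plus-wire path (illustrative assumption)

one_cable  = parallel(*[R_PATH] * 3)   # e.g. three +12 V paths in one connector
two_cables = parallel(*[R_PATH] * 6)   # six paths when a second cable is added
print(f"one cable : {one_cable*1000:.2f} mOhm")
print(f"two cables: {two_cables*1000:.2f} mOhm (half as much)")
```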
Using two instead of one wire "bundle" decreases resistance, and so the voltage drop is smaller under load. So two instead of one = more volts. But I'm still not sure why it gains performance; the core is powered from the VRM, which should be working fine even with a 0.5 V lower input voltage. If you want to see some changes in voltage, just connect a precision voltmeter to the input on the GPU and you should be able to see differences. But it can vary much more between PSUs than wires, because some PSUs can supply 12.1 V instead of 12 V, which will already give you these couple of points more in benchmarks. Sorry for the bad English.
I'd be curious to see the measurements under an oscilloscope. Maybe the graphics card pulls 100W average, but does short bursts of 500W, which would cause a huge voltage drop.
+Luca Fuoco: that's very common, so I guarantee that is happening. Ergo the first-party power supply recommendations are bullshit, and you should only trust places which fully test via oscilloscope so that you can see what the spikes are. To which end, those spikes are what your power supply needs to be able to provide in order to be stable.
The difference in GPU performance is due to the resistance of the wires connecting it to the power supply. All wire has resistance which causes a voltage drop across the connection. A wire’s resistance depends on 3 factors: the cross-sectional area (referred to as the “gauge”) of the conductor, the material of the conductor (i.e. copper or aluminum), and the total length of wire in the circuit. The total length includes both the positive (+) line as well as the negative (-) (Ground) line returning to the power source. A short thick wire will have very little resistance and a long thin wire will have a very high resistance. The voltage drop across the circuit is equal to the amount of current in amps (I), multiplied by the total resistance (R) of the wire used (V = I·R). The value V is how much Voltage is lost across the connection. Power (Watts) is the product of current and voltage or P = I·V. If, for example, a GPU uses 200 Watts from a 12 V supply, then 16.7 Amps of current are required (which is A LOT). A typical GPU connector is about 2 feet long (4 feet total of wire round trip) and should usually be made from 18-gauge copper conductors. The GPU 6 pin connector uses 2 positive wires and 2 negative wires to transmit power (the other 2 are for sensors). When using just one connector the Voltage drop calculates to 0.221 Volts. This means the voltage transmitted to the GPU is reduced to 11.78 V. When using two connectors the amount of current traveling through each is cut in half. So the voltage drop across two connectors is 0.111 Volts, half that of a one connector harness setup (So the GPU runs with 11.89 V). Thank you for your experiment! Good to know a reduction as small as 0.1 V has a measurable impact on performance. If you wanted you could improve performance even further by using thicker wire, or cutting the connector to a shorter more precise length for your setup. This is a very quick explanation so please let me know which concepts you would like me to clarify.
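A small script that reruns the arithmetic in the comment above, under the same stated assumptions (18 AWG copper, 2 ft one way, two current-carrying conductors per direction, 200 W at 12 V). It lands within a few millivolts of the quoted 0.221 V / 0.111 V figures; the small difference comes from the exact ohms-per-foot value assumed for 18 AWG.

```python
# Re-running the voltage-drop arithmetic from the comment above.
# Assumptions: 18 AWG copper ~6.4 mOhm/ft, 2 ft one-way, 2 conductors carrying
# +12 V and 2 carrying ground per connector, 200 W drawn at 12 V.
OHMS_PER_FT = 0.0064
LENGTH_FT = 2.0
WIRES_PER_DIRECTION = 2
POWER_W, VOLTS = 200.0, 12.0

current = POWER_W / VOLTS                      # ~16.7 A

def drop(connectors: int) -> float:
    """Voltage lost across the harness for a given number of connectors."""
    out_and_back = 2 * OHMS_PER_FT * LENGTH_FT / WIRES_PER_DIRECTION
    r_total = out_and_back / connectors        # connectors share the current
    return current * r_total

print(f"one connector : {drop(1):.3f} V lost -> {VOLTS - drop(1):.2f} V at the card")
print(f"two connectors: {drop(2):.3f} V lost -> {VOLTS - drop(2):.2f} V at the card")
```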
When you use two cables of the same gauge to carry current, it's the same as using a single wire but minus 3 gauge. Example: 2x 16 ga is the same as 1x 13 ga, so you can effectively get a bigger pipe for more current flow.
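A quick numeric check of that gauge rule of thumb, using the standard AWG diameter formula. Treating the two wires as perfectly sharing current is the simplification the next comment pushes back on; this only checks the cross-section arithmetic.

```python
import math

# Standard AWG formula: diameter in inches d(n) = 0.005 * 92 ** ((36 - n) / 39).
# Cross-sectional area scales with d**2, and every -3 AWG roughly doubles it.
def awg_area(n: int) -> float:
    d = 0.005 * 92 ** ((36 - n) / 39)
    return math.pi * (d / 2) ** 2

two_16 = 2 * awg_area(16)
one_13 = awg_area(13)
print(f"2 x 16 AWG area: {two_16:.6f} sq in")
print(f"1 x 13 AWG area: {one_13:.6f} sq in (ratio {one_13 / two_16:.3f})")
# Ratio comes out ~1.00, so doubling up 16 AWG is indeed roughly one 13 AWG wire.
```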
In theory this is the case, but in practice it does not work like that, because no two wires are the same; they will have different resistances due to the manufacturing processes when the wire was made, and so on. Electricity finds the path of least resistance, so one wire will see more current on it than the others. The best way is to have a single home run to the device and let the device's PCB marshal out the power. But sometimes the best way is not the most practical either.
My guess would be the PSU regulates the voltage and amperage for each socket and limits the output for each outlet, or each outlet has a maximum set output to prevent overheating or damage to the PSU components. A relatable example is to compare a PSU output to a two-socket wall outlet: if you attach an 8-socket power bar to a single socket, it can still provide power to 8 items. Now 8 items running off a power bar in one socket will not be able to draw the same amount of power as they would if they were each plugged into a direct socket in the wall; although the items would still work, they wouldn't be working at their full capacity. Powering 8 items off a power bar in one socket also presents a risk of damaging the power bar and wiring, which could result in an electrical fire, but in most modern houses the result would be setting off a breaker or blowing a fuse. Nice video, Jay.
Jay, this is the video that I like the most! I used an EVGA 750W PSU, an EVGA 1060 SSC, and the EVGA PowerLink. So, the 1060 is a 6-pin; the PowerLink can be set up to take an 8-pin input. I input 8-pin, and the output is 6-pin. The overclock of the 1060 SSC on Valley was running at 2151 stable! I think this is a silicon lottery winner, but the power delivery allows the GPU to eat as much power as it wants!
A couple of things could affect this. 1. Some power supplies have more than one 12 V rail, so if your second cord is coming from that other rail then you now have two rails sharing the load. Each rail is going to have a set of capacitors, so you effectively double your capacity in this situation, smoothing out the power more. 2. Even if it's a single-rail power supply, you are going to have less resistance in the cables. Thicker cables (or more of them) = less resistance. Because the card runs on 12 V it will pull relatively high amps. Power = voltage * amperage. If the card ran at, say, 24 V it would effectively use half the amperage for the same watts, therefore needing less thick wire.
Not enough to care. The air flow changes make a bigger change. Be careful about where it pulls from and where the hot air goes after the pass. Some cases put no thought in that at all with riser setups. :(
Jay! Some graphics cards use LEDs, RGB and such, which adds even more needed power than a so-called vanilla GPU. This would be the main reason why separate power cables are needed to power some of them.
You've said that you started in the car audio world. I did as well, but stayed there for 20 years. Bigger power/Ground wire, done properly, has netted me discernible sonic results. It didn't make sense, but it really did sound better. This, though the power requirements were easily met. I think these are things that are pushing the outer boundaries of power delivery, hi-fi and overclocking. A hardly noticeable difference in voltage consistency can make a difference while one is on the ragged edge. Well, actually it's a pretty smooth edge but one that can yield better performance if smoother. Of course these things don't make a difference to most, but. . . .
Something strange I encountered when using a single cable with 2x 8-pins to my GPU: there was a 5-second delay before the PC would begin to POST (power on). I struggled to troubleshoot this issue and finally, after using 2 separate cables, the delay is no longer there. My setup is an X570 MB / Ryzen 5600X with a 6800 XT GPU. The power supply was reused from my old SLI build and it's a beefy 1200W Seasonic.
This may be due to the fact that when cables get warmer they become less efficient, meaning more resistance, also decreasing the voltage, in turn reducing performance. So, spreading the load over 2 cables will reduce how warm the cables get, meaning they will be more efficient (less resistance). This is also why they supercool superconductors.
That tiny little bit of added resistance should make no difference unless you get like 1 meter long extensions, though I bet even those wouldn't be much of an issue.
You can suffer from some EM interference. It can vary a lot depending on the cable and where you route it. Ultimately, in average use with a great riser you won't see a change, but cheap junk that rests against another PCIe slot can be noticed.
GERSBOXERS: Extensions create some latency because it's a data bus (light only goes so fast), and they create interference, so it WILL lower performance, the same way an external GPU would, but it should be only 1-2%.
It has to be pretty long before it does anything. The ones that come with some itx cases and those 90 degree risers for mounting GPUs vertically change virtually nothing performance wise.
I did a google search on this question because I am having sound issues. Currently this is my set up; MSI Z370 GAMING PRO CARBON DDR4 SLI ATX Motherboard, Intel Core I7-8700K, Corsair Vengeance RGB PRO 16GB (2x8GB) DDR4 2666MHz, GPU EVGA Geforce RTX 2070, PSU Corsair HX850i High Performance. The CPU is cooled by an AIO; Corsair H115i RGB Platinum AIO. I have a ASUS Strix SOAR sound card installed to provide 7.1 surround sound for my Razer Tiamat 7.1 V2 headset. So I have one PCI-E going to the GPU and the splitter from the same cable plugged into the sound card. I was wondering if the sound issue I am having could be caused by this but I think it is unlikely since I have not had this issue from day one. Glad to see Jay had a video on this.
I think an important factor in your observation is: when you used the 2 cables, were they on different rails? If so, that will probably explain this. From what I know from my courses at university, an electrical component can cause small variations in the voltage on a cable. If you link a component that does this to a component that is very sensitive to power variations, then it can behave incorrectly. Of course here it is on a very small scale, but by re-using the same wire you are linking the components, while if you use multiple wires that are on different rails they remain separate.
But what was the voltage at the card with one cable vs. two? It would have been good to probe the connectors during the tests. The resistance of running the two PCIe connectors off of one cable is going to cause a voltage drop. W = V x A: if the voltage is lower, the current has to increase. This, in turn, creates even more resistance and can also increase temperature.
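A tiny fixed-point sketch of that "lower volts, more amps" feedback for a roughly constant-power load. The cable resistance values and the 300 W load are illustrative assumptions, not measurements.

```python
# A GPU is roughly a constant-power load: if the 12 V rail sags, it pulls more
# current, which increases the I*R drop a little further. A few fixed-point
# iterations show the loop settling. Cable resistances are assumed values.
def settle(power_w, r_cable, v_supply=12.0):
    v = v_supply
    for _ in range(20):                 # converges in a handful of steps
        i = power_w / v                 # constant-power draw
        v = v_supply - i * r_cable      # sag from the cable drop
    return v, power_w / v

for label, r in (("one pigtail cable", 0.013), ("two cables", 0.0065)):
    v, i = settle(300.0, r)
    print(f"{label:17s}: {v:.2f} V at the card, {i:.1f} A")
```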
LOL! It has its "own power supply", yes. A DC-to-DC "power supply", but it has to take voltage in to create voltage out, and the tolerance of the FETs used on the card is going to be affected by the voltage in.
Extremely interesting video! I'm super curious to know why that might be the case, and hope you do some sort of follow up with more testing and/or professional comment. Perhaps GPU and/or PSU manufacturers have some input?
Also fun fact, if you're getting the error nvlddmkm and changing TDR values didn't help and underclocking your GPU ALSO didn't help, try using two power cables from PSU to GPU. That actually seems to have solved my issue and increased my performance in Blender as well as games.
Jay, here's my hypothesis as to why this could happen. When amps are drawn across a wire, the wire will have a voltage drop. More amps drawn over a given wire will cause a larger voltage drop. So using two wires means that each one is carrying half the amps of the single-wire config, so there is a smaller voltage drop on each. This would be only a very minimal change in voltage that I doubt could be measured with a regular handheld multimeter, but a very small change in voltage could cause the tiny change in performance you saw. This is all just like how you can under-volt a CPU to lower the clock speed.
I know you said you're not an electrical wizard, but here's some correct terminology to use: you don't "draw voltage"; voltage is simply there if the wires are connected. You can draw both power and current. Keep up the interesting videos.
One cable harness increases the resistance seen by the power supply. It is different with 2, where the load can be better distributed. Approximately the same effect can be achieved with thicker cables.
Drawing more current through a single cable would minutely drop the voltage meaning less overvolt capability, but I expect what's happening is due to 'high frequency inductive effects'. A decent analogy is to think of a freeway, during normal hours everyone drives quite fast, but when everyone needs to travel at once things can get blocked up. Wires act like this too, specifically they resist fast changes in the flow of charge, so every time the card executes instructions, the power consumption spikes, meaning charge flow has to change rapidly. Adding a second cable (more lanes to the freeway) means less inductance and power can be delivered more effectively when it spikes.
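A back-of-the-envelope sketch of that inductance argument. Both the loop inductance and the load-step numbers below are made-up but plausible assumptions, and in reality the card's input capacitors absorb most of such a dip; the point is only that paralleling cables halves the inductance.

```python
# V = L * di/dt: a fast load step across a cable's loop inductance causes a
# momentary dip before the PSU and the card's input capacitors catch up.
# Both numbers below are illustrative assumptions, not measurements.
LOOP_INDUCTANCE_H = 0.8e-6         # ~0.8 uH for roughly a metre of paired wire
STEP_A, STEP_TIME_S = 10.0, 50e-6  # 10 A load step over 50 microseconds

def dip(inductance_h: float) -> float:
    return inductance_h * STEP_A / STEP_TIME_S

print(f"one cable : {dip(LOOP_INDUCTANCE_H):.2f} V transient dip")
print(f"two cables: {dip(LOOP_INDUCTANCE_H / 2):.2f} V (parallel cables halve L)")
```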
Who do you think you're watching, Linus? He doesn't have the staff for that. (Aside from my joke, I agree.) To prove something you must fail at disproving your theory. Science 101.
I was going to say this, redo both tests again after doing both of them initially to make sure the difference is persistent. The difference in results could simply be attributed to him having to restart the computer to change out the power cables, and not the power cables themselves.
Exactly... I would recommend doing the test 2 more times with a reboot for both test scenarios and averaging everything up. There should be no difference between the two. Unless one of your connectors has bad contact (a bad connection, i.e. higher impedance, which would result in a voltage drop within the connector itself), but this is very unlikely. The difference is mainly due to how the tests were done. Do the same test 3 times and you'll see the averages will be very similar. :)
At current view count, people have already spent nearly 13,000 hours combined watching this video. His few hours spent benchmarking/editing/uploading this video isn't an excuse for improper testing. Especially when he gets paid to do so.
@@callmetarif yes, it's OK to cut the pigtail if you don't need it, BUT be careful: when you cut it there will be a bit of exposed wire, so either tape it up, or leave it alone and don't cut it. If those cut wires touch, they will short.
Well, I've always used 2 cables on the video cards, but then again it's like in audio. When you have a 100 watt amp, then switch to a 200 watt amp, but run them at the same level, the 200 watt amp has more reserve power, so it will sound effortless compared to the 100 watt, which may be more constricted sounding, if that makes any sense to you. Even though the video card can run on 1 line, adding another line gives you more reserve power to draw on. So the card runs more effortlessly, especially when running demanding material that has peaks in power consumption. I could possibly give you all of the technical terms, but it's been well over 55 years since I graduated from electronics school. Like in audio though, the power supply design makes a huge difference. I had a Phase Linear 600 watt per channel amp and it sounded okay, or so I thought. Then I replaced that amp with a McCormack 150 watt per channel amp. The sound was unbelievable! It sounded as though the same speakers grew a set of balls. The bass was down to 16 hertz, whereas with the Phase Linear it sounded like the bass cut off at 43 hertz. And the mids & highs were more detailed & open. The Phase Linear weighed 36 lbs., but the McCormack weighs 53 lbs. The difference was in the power supply.
4th year Electrical Engineering student here. The cables each have a small amount of resistance to them. The more amperage you draw out of each cable, the higher the voltage drop is (Since Voltage = Resistance * Amperage). So the reason each cable is limited to 11 amps each is not because they would blow up from too much current, it's because the voltage drop would be so much that the cable can no longer supply a proper 12 volts. So adding in the 2nd set of cables reduces the voltage drop, allowing the graphics card to receive closer to the 12 volts it needs.
As someone else said: the more current you put through a wire, the more energy you lose to heat. This is why those big power lines run at massive voltages, to reduce power loss. I highly doubt you would see much loss at these currents and distances.
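The same I-squared-R logic behind those high-voltage transmission lines, in a few lines of arithmetic. The 1 MW load and 5 ohm line resistance are illustrative assumptions chosen just to show the scaling.

```python
# P_loss = I^2 * R, and I = P / V, so raising the voltage slashes the loss.
# Values are illustrative: delivering 1 MW over a line with 5 ohm of resistance.
def line_loss(power_w: float, volts: float, r_line: float = 5.0) -> float:
    i = power_w / volts
    return i ** 2 * r_line

for v in (10_000, 100_000, 400_000):
    loss = line_loss(1_000_000, v)
    print(f"{v:>7,} V: {loss/1000:8.2f} kW lost ({loss / 1_000_000:.3%} of the load)")
```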
I'm not surprised by the results... The wires themselves have a little bit of electrical resistance. Typically PSU manufacturers only use 18-gauge wire. Just today, before watching this video, I was custom sleeving some 6-pin and 8-pin PCI-E power cables to length (a much shorter length) and using thicker 16-gauge wire. This video makes me feel great about the work I've been doing.
@JayzTwoCents *Resistance is introduced at every part of the circuit when you route power from one place to another.* The most significant source is going to be the power supply (called output impedance, not often an advertised rating, though.) and maybe dodgy connections with loose connectors. As you pull more current, the voltage sags. How much sag occurs for a given current depends on the total _resistance_ or _impedance_ of the system. Even as an Electrical Engineering student, I was also quite surprised by the results, given that the GPU essentially has local regulators (often being termed "VRMs" in this particular hobbyist field) to convert the 12V at a lower current into the Vcore voltages with much lower impedance and higher current capabilities. These regulators should have good line-rejection and load-regulation, meaning they can maintain a very precise voltage to the important bits even with dips and sags on the input line, and as the load the GPU presents changes, respectively. So perhaps this has more to do with the ground, and the 0V (ideal) rising up several hundred millivolts as current is returned to the power supply, and this causing some sub-system to throttle performance to prevent logic errors from occurring with a "dirty" (electrically noisy) ground reference.
Yes, any time you're adding to the medium of the signal or power you're also increasing the resistance, though it's going to be rather insignificant for short distances. Also, you're doubling the exposure of uninsulated connections and allowing for more interference. The degree to which it affects performance can still be explored. Various websites and YouTubers have seen little impact on performance, as much as 2% in some tests.
The answer is absolutely yes - however, the relevance of this yes varies highly. It's crucially important for processors, as they need to complete their tasks in nanoseconds, and therefore how far away your RAM slots are from the processor directly affects performance - but other stuff has more flexibility.
@JayzTwoCents This is just some reboot FPS RNG. For GPU cables there's only +12V and ground, and all those +12V wires come from the same pin inside the PSU. More amps = more watts = more heat. An aluminium or copper cable has 0.0041-0.0043 ohms @ +20°C. Even if the temp goes up to 80°C the resistance won't go over 0.01 ohms, which doesn't affect the voltage going to the GPU. So I call this FPS increase just boot RNG. Myth busted.
peNKku the max stable clock wouldn't be affected if this was the case. microprocessors are extremely sensitive, and the added leads reduce resistance, and this single case shows it *may* increase stability in some cases.
Gap801: let's say an 8-pin connector has 4x 1 mm² of 12 V supply and the same for grounding. For an extremely long, 0.5-meter supply, 4 mm² is rated at 41.8 A at 12 V and has 0.000333... ohms of cable resistance. Heat resistivity won't go up 0.001 ohms even at 800°C, which has no effect on the voltage. A Titan Xp has a 250W TDP, that's 20.8 A. So there's plenty of cable in a single supply. Same goes for the CPU: sometimes you boot your PC and you can run higher clocks stable, and the next time you get a bluescreen.
I tried it and it didn't work, even with the same graphics card. Avg fps still the same and overclocking performance still the same. I think it's just that power supply that you have.
I haven't seen a lot of comments giving thoughts on the why of this. The way I understand the situation is it's the quality of power, not the amount. Your PSU probably splits the PCIe power cables onto separate rails. This allows the GPU to pull the same amount of power with less strain on the PSU. Your card may not be able to pull the watts that a single power cable can provide, but it lowers the quality of the power the closer it gets. A good comparison (although it's kinda in the opposite direction): think of speakers. If you have a 500-watt speaker and you hook up a 200-watt RMS amp, you can get the same volume as a 500-watt RMS amp turned up to where it's putting out 200 watts. At this point the volume of the speaker is the same, since both are producing 200 watts to the speaker, but the sound quality is better because you are farther from the limit of the amp. If you are pulling 50% or less of what can be supplied by one line with a pigtail, you will probably not notice a difference. However, if you are pulling 70-80% or more of what can be supplied by one line with a pigtail, then you will probably benefit from the better quality power of running two lines. Sorry for being long-winded.
It's most likely an impedance or cable capacity ("electron pressure") issue. With older cards this wasn't a big issue because they wouldn't use multiple power draws as their power draw was relatively low. Newer cards can use HUGE power draws that exceed anything previously seen. Thus the capacity to reach cable power max is much higher now. The cables can only carry so much. If you are approaching cable max the "electron pressure" reacts the same way any other "pipeline" would react, a drop in delivery. So having 2 independent cables decreases the "electron pressure" as they are flowing through different pipes. This means that better power flow. I have tried to explain this simply, but it somewhat complicated. Simply put higher draw cards need larger or more "pipes" to deal with the current.
@@bxeagle3932 They heavily restricted the free version now, you can only access your lastpass account with one device at a time, so you need to choose to either use your PC or phone but never both
Yeah, I'm confused by this result, because what you're doing is paralleling the two connectors, which means lower total resistance, which means more amps, but the power supply should be able to handle it, and paralleled connections will have equal voltages. So I think simple V=IR has nothing to do with it (like most people in the comments are saying). There must be some hysteresis loss. I know it's DC power, but I'm not sure if it's constant; I'm gonna have to look up how GPUs pull power to come up with a theory for myself.
Well, when you draw the same amount of amperes through one single wire as you would through two wires with the same cross-section, then the voltage will come down a little bit. The higher the current, the lower the voltage, as the wire will get warm and the resistance of the wire will go up. So yes, you will have a slightly higher +12V on two wires than on one.
I was wondering this JUST yesterday...... spooky. I feel JayzTwoCents is monitoring my thoughts. Ah well, i know how to fold them tin foil hats anyway....
This happens because, with the plugs being in series, the power has to flow through the card once before going through the second plug. This causes some power to be lost as heat into the surrounding environment; therefore the second plug isn't going to get the same amount of power. It's like putting 2 light bulbs in series: even if the power supply is rated much higher than the two light bulbs, there is going to be a voltage drop over the first bulb due to the resistance of the filament and energy being lost as heat and light, thus causing the second bulb to be slightly dimmer, which is one of the reasons lights in houses are wired in parallel.
2 things: resistance can increase for cables as you pass more amps through them. For PSUs that have multiple rails, however, the further away you are from the limit of the switching PSU (the transistors and such, assuming it has multiple for different rails), the cleaner the output, as the amps are lower. If you see PSU reviews that include a waveform test, you can see that the waveform varies less at 50% use than it does at 100%, so you get a more DC-like output.
Jay, remember back in the day when PSUs got mounted at the top of the PC case? I often think to myself: why are PSUs never mounted at the front of the case with the shortest cable lengths possible? What you have uncovered here is power drop and heat in the rail and wires. Normally you can't see real performance loss until, as you have done, you start overclocking. As you know, overclocking is very reliant on stable voltage, and the harder you overclock the more stable you want your voltage. I suggest taking the rig, overclocking it to max CPU/GPU/RAM settings as is, then making some kind of PSU mount over the middle of the motherboard. Order some custom wires with the shortest wiring route, then re-benchmark.
I love the videos, they're so random at times, but every time it's something worth watching and highly accurate. Keep it up. I'm pretty stoked my MSI GTX 1070 Armor GPU with GPU Boost is stable at 2036 MHz, and with my i7 7700K overclocked to 4.2 GHz I get a boost of almost 80 fps on the Heaven benchmark with all settings maxed. I didn't want to mess with the CPU because I was getting temp spikes near the 80C mark on air, but some tweaks in the BIOS got the temps down, and I went from 3.2 to 4.2 GHz on the CPU and temps never go over 54C, and that's after hours of gameplay. Now the stress tests take it to the 60/70 mark, but once I'm done the temps go back down, and that's normal. And all thanks to your videos.
Always use 2 cables for high-end graphics cards! Even though you can't always see the difference in benchmarks, you get far fewer frame drops, less lag and a more stable system!
Pigtailing increases the resistance in the cable that the electrons have to pass through. In physics, your 3 variables for this situation are V (potential difference in volts), I (current in amperes) and R (resistance in ohms). The equation to figure this out is V=IR, so if you use 2 different cables and the voltage is the same throughout the system (PSU to GPU @ 12V), the resistance will be lower, which increases the current and total charge that goes through the wire, effectively allowing for more performance as more electrons can reach your GPU. @JayzTwoCents
My thought on why this is the case is that it's because Nvidia's power limit is so heavily enforced. The pigtail cables cause a bigger voltage drop at the end of the connectors, and a lower voltage means more amps; and since the power is measured by the voltage drop through a diode, more amps means a bigger voltage drop through the diode, which means the card effectively thinks it's using more power, hence it's probably boosting less, giving less performance. If this is true, doing this on an AMD card would result in no difference in performance. But testing is needed to prove this.
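A sketch of that hypothesis only, and it is just a hypothesis: it assumes the board estimates power as measured current times a nominal 12 V, which is not confirmed anywhere in the video (real boards typically measure both current and voltage per input). The load and cable resistances are also assumed values.

```python
# Hypothetical model of the comment's theory: if board power were estimated as
# (measured current) x (nominal 12 V), a saggier single cable would make the
# card *think* it is drawing more power and hit its power limit sooner.
V_NOMINAL = 12.0

def estimated_power(true_power_w: float, r_cable: float) -> float:
    v = V_NOMINAL
    for _ in range(20):                        # settle the constant-power loop
        i = true_power_w / v
        v = V_NOMINAL - i * r_cable
    return i * V_NOMINAL                       # telemetry assumes a full 12 V

for label, r in (("one pigtail", 0.013), ("two cables ", 0.0065)):
    est = estimated_power(300.0, r)
    print(f"{label}: card believes it draws {est:.1f} W for a real 300 W load")
```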
The difference is not necessarily in the use of two connectors from the same cable, the key is to use the same number of power wires in both connectors (depending on board design) and that the gauge of the wires that are being used should be enough to provide the amperage required with no drop in the supply voltage. I used to design the electrical power for buildings in New York City, and the wire gauge ( or wire size) makes a big difference.
As a qualified electrical-mathmagician, I can tell you that the only difference that could affect anything would be the extra distance of the pigtail (causing a very small amount of extra resistance), or if you were to somehow pull so much current that the power supply throttled the output through the single cable in order to protect the conductors.
The second one is the issue. For example, the 3000 cards draw up to 1.29 times their rated wattage in transient spikes. So I guess more independent cables are better able to handle those.
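Putting numbers on that 1.29x figure: a quick sketch of how much extra current such a spike means, and how it splits across cables. The 320 W board power is an assumed example, not a figure from the video.

```python
# Transient headroom: RTX 3000 cards reportedly spike to ~1.29x their rated
# board power for brief moments. The board power below is an assumed example.
BOARD_POWER_W = 320.0
SPIKE_FACTOR = 1.29
VOLTS = 12.0

spike_w = BOARD_POWER_W * SPIKE_FACTOR
print(f"steady: {BOARD_POWER_W / VOLTS:.1f} A, spike: {spike_w / VOLTS:.1f} A total")
for cables in (1, 2):
    print(f"  across {cables} cable(s): {spike_w / VOLTS / cables:.1f} A per cable")
```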
I tweaked the title since so many people are confused about the difference between cables vs. risers for the PCIe socket despite the thumbnail... but this means that I now need to play around with riser cards and see if there is any tangible difference.
There's a huge difference if you use the crap ones Thermaltake was giving with their P3/P5 cases. I tried two different ones (original and replacement), both would only run at 2x with horrible performance. I'm one of the lucky ones though since many people had those riser cables short out and kill their systems. I ended up getting one off Amazon that works perfectly.
Obviously there is an amperage bottleneck with one cable. Depending on the PSU layout (single rail vs. multiple rails), it is either the lack of diameter of the cables, or the lack of rails, or both.
Nice proggy, I downloaded it for my GPU.
Yeah, I thought risers too. Anyway. Um. Yeah. Quite surprised by the result. Maybe the power delivery is a little cleaner.
BTW,
JayzTwoCents can you make a tutorial on how to overclock a GTX 1060 graphics card?
This video needs a revisit with the RTX 3000 Series
That's why I'm here. Nothing but problems with my 3070. Had it a few days now and can't even finish a game of CS without an error.
@@DaddyMorphh How is your 3070 hooked up? 1 cable daisy chained or two cables?
@@nickgomez28 1 cable. We've found the issue. My PSU is totally wrong. Tuned the overclock down in Afterburner till I can buy a better PSU. Working fine now.
So what is better for the 3000 series? 3 independent 8-pin cables, or triple 8-pin on only one cable?
Especially with the 3090, since you need 24 pins in total. So my question is... is it safe to use 2 cables and pigtail 1 of them?
rgb affects performance
Agreed
for high end, really fast RAM module yes it does
Also always driving with low fuel will burn up your fuel pump. There is no winning.
SodiePops True, it just needs to be turned off and you gain 8 fps.
An RGB hater
Denis Naumov
On the other hand, you'll be lighter and thus need less energy (=fuel) to get up to speed.
There's some balance to be made between faster degradation of fuel pump and less fuel consumption.
Just did this exact test with my 3070 and had an avg score of 7100 and avg fps of 170 with the pigtail config. Using the two dedicated power cables, my avg score went up to 7260 and avg fps to 174. Overall GPU performance has increased while gaming.
Meanwhile I couldn't figure out why I was crashing… and it was because I was using one cable on my 3080.
Yes! I did something silly as well. It used to be that running a GPU off a pigtail was fine. I got away with this on an R9 390, which could pull close to 300 watts, and a GTX 1080, which could pull 200 or so. This was not fine at all on my 3070 Ti. I didn't really notice any performance problems at first, but just the other night I was gaming and my rig crashed hard. I'm almost certain my Seasonic PSU tripped its OCP and shut the rig off, but not before my display glitched out and completely freaked me out. I was afraid I had killed something, or that my card was defective.
I've experienced this kind of shutdown only once or twice, but I trust my Seasonic PSU. It's been very reliable. I've seen it take many a brownout without damaging anything. I don't think I damaged anything in this case, but I certainly could have.
The sad thing is that I've been building rigs for a long time, and I should know better. Old habits die hard I guess, and if you look up information online, you'll see a lot of people saying "Meh. It's fine. It doesn't matter." It definitely matters on these new cards. Use two cables.
Today I decided to plug in two cables, each on a separate rail, which both should be more or less load balanced. I'm seeing an improvement in performance, as well as temperatures. I had a few random driver related crashes before doing this as well, with one cable the card "worked" ... but it wasn't "just fine".
@@accessiblelinuxgaming thank you for this, I was using the pigtail on my 3080 Ti, I will switch it.
Thanks for sharing!!
It’s time for me to get sleeved cables and get out of this pigtail config for my 3080
I'd like to see the guys at gamers nexus put this to a more scientific test.
Link?
Michael Wahlgren he said he’d like to see it
Where is it?? I'm here because my EVGA 3060 Ti has 2 8-pin connectors. I guess I could power the card with a single cable and 2 headers.
@@tamassebok711 3060 ti is fine. But 250w and up I would use two cables
@@bluex610 This. Anything pulling 255w+ needs more than 1 cable apparently. My 6900 XT hits substantially better minimum FPS because of it.
I would have thrown in cable extensions as well.
Cable length is another good suggestion!
JayzTwoCents It's not about the size :p
Preach! - JayzTwoCents (When he read it)
JayzTwoCents It's due to the efficiency of the 12 V rail(s): higher amperage per cable means more stable power, and less loss through heat means more juice and less ripple. The results are notable, but shouldn't have much impact overall. I did this test on my 1080 Seahawk and found similar results with my CableMod extensions.
Great video though, it's always interesting learning how small differences between configurations can add up to a more impactful change.
You can also add readouts from a program like AIDA64 or similar, where you can see how the 12 V rail is doing and how big the drop is when you put load on one or two cables.
Would love to see a refresh of this vid with modern parts given the huge power consumption of some cards.
+1 !
+1
The 2 cables might improve the performance when the power draw suddenly increases.
The PSU might not provide stable enough power at those moments if it has to push it all through one cable.
Hey Jay, the answer as to why is actually quite simple. I deal with this a LOT when designing and building RC racing drones and the flight-controller-to-ESC-to-motor power delivery systems for them. It solely has to do with the resistance of the cable that the current & voltage run down. Use your multimeter and go measure the voltage at the GPU END (this is key) of the cables. The dedicated cables will give you a higher value, because V=IR is ALL you need to know as to why dedicated cables outperform a shared cable. I'll let you run your own math on this one. ;) - Eganwp
Yup, and the measurements have to be done at full power, because the more amps you pull, the bigger the voltage drop is going to be.
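If anyone wants to try the multimeter test suggested in the two comments above, here is a trivial helper for the arithmetic: back out the effective cable-plus-connector resistance from the drop measured under load. The example readings are made up.

```python
# Back out the effective cable + connector resistance from a loaded measurement:
# R = (V_at_PSU - V_at_GPU) / I. The example readings below are made up.
def cable_resistance(v_psu: float, v_gpu_end: float, current_a: float) -> float:
    return (v_psu - v_gpu_end) / current_a

drop_r = cable_resistance(v_psu=12.05, v_gpu_end=11.83, current_a=20.0)
print(f"effective resistance: {drop_r * 1000:.1f} mOhm")   # -> 11.0 mOhm
```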
Rubbish, a CPU is MUCH more than clock speed; it actually doesn't mean much between platforms.
This is to do with amperage more than resistance; yes, wires have resistance, but it's so small it makes no difference at this length.
If he has a multimeter, he's much better off measuring the resistance along one power cable vs two, isn't he?
I have another theory for you. Current creates heat, heat creates resistance. Less heat = less resistance, so you get more current. The result would be the same if the gauge of the wire were increased.
Video starts at 2:37
Why dont you upload anymore xSplayd
lol
ty bic boi
@@ZuoXGFX because he sucks. Jk
It's because there is more resistance across the one wire when you pigtail it; this wastes a little bit of voltage.
Perfect answer right here!
Magicbjørn I had to revise this stuff for physics, so I guess there can be real application for what you learn😂😂
You're also limiting the amount of watts you're providing your card, thus less Overclocking potential.
Haha, nice :)! Yeah, never thought it would come in handy :p!
I just tested this with a multi-meter, and there is no difference in voltage.
I'm happy that every time I have a question, Jay has a video.
After watching this I went and changed my power cables from one cable with a pigtail to two, and I got an extra 25 MHz. Also, thanks, Mr. Jay!
Hey dude, I have a semi-modular power supply: a bunch of cables coming out of a hole that connect to all my components, and the PCIe slot on my PSU is empty. Right now my GPU is connected with a single 8-pin cable to 2 daisy-chained 8-pins (6+2) going to my GPU. If I wanted to do 2 separate 8-pins, should those be going to the PCIe sockets on my PSU?
@@mannyfresh7065 how is your gpu working if the cables aren't even plugged in?
Woolfyishere it's semi-modular, so it had a bunch of cables together (24-pin ATX, 8-pin CPU, and a 2-plug daisy-chained PCIe). What I was wondering is, if I wanted to run 2 separate PCIe cables, would I plug into the PCIe slot? I've figured it out since posting this question. I just ended up swapping out the PSU for a fully modular one anyway, so I no longer daisy-chain the connection to the GPU and now have 2 dedicated 8-pins going into the GPU.
@@mannyfresh7065 If your semi-modular PSU has 1 PCIe cable in the non-modular bundle, and also extra modular plugs for more PCIe cables, then yes, use an extra PCIe cable that came with the PSU and connect from your PSU to your GPU's second connector so you will have 2 separate cables connected.
Although if your card doesn't pull anywhere near 225 watts it won't really make a difference. The PCIE slot delivers 75 watts and 1 PCIe power cable delivers 150 watts. But if your card has two 8-pins then you probably wanna use two cables :)
everope thanks for your reply. I've actually switched to fully modular, as for whatever reason when I hooked up the extra PCIe cable it wouldn't turn on my GPU. But now my fully modular PSU allows me to have 2 separate PCIe cables.
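The power budget from the comment above (75 W from the slot, 150 W per PCIe cable; those figures come straight from that comment and match the PCIe spec), wrapped in a tiny helper so you can plug in your own card's wattage:

```python
# PCIe power budget: 75 W from the slot plus 150 W per 8-pin / 6+2-pin cable.
SLOT_W, CABLE_W = 75, 150

def cables_recommended(card_watts: float) -> int:
    """How many separate PCIe cables cover the card within spec (slot included)."""
    extra = card_watts - SLOT_W
    cables = 0
    while extra > 0:
        cables += 1
        extra -= CABLE_W
    return cables

for watts in (180, 225, 250, 320):
    print(f"{watts} W card -> {cables_recommended(watts)} separate cable(s)")
```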
Btw, I recently finished a custom SFF build and was hooking up cables (duh) and testing overclocking performance...
So I don't know if a single pig-tail Type4-PCIE cable affects performance but it DEFINITELY AFFECTS STABILITY!!!
My specs are:
SF750 PSU, i9-9900k, Rtx-2080Ti, AsRock z390 Phantom Gaming-ITX/ac, 32GB RAM G.Skill RGB 3200/CL14
The "case" is 13L (400x260x125), but I haven't closed it yet (as the side panels are custom like the rest of the case, and I haven't bent them yet :))
It was not overheating - while both the CPU & GPU are very powerful and replace the room heater very well in the winter, apparently the 360 rad (alu, 27mm thick, with Noctua NF-A12x25 fans) plus the 3 thin NF-A12x15s at the bottom can keep the temps down (confirmed now that the issue is solved - a CPU stress test @5GHz, under-volted @1.27V, keeps the temps under 70C, and my ambient is about 24C). The GPU stays even cooler (I guess because of the wider contact surface of the water block).
But still, even without any overclocking, a maximum of a few minutes playing Witcher 3 at top GFX settings resulted in the PC freezing. I noticed that after this happened, it could even happen when not under load, just browsing stuff in Windows.
Checked a ton of info on the net and finally decided to run two Type4-PCIE cables to the 2080Ti (which I had tried to avoid as I don't have much space inside ... apparently I'd better find custom cables with a proper, i.e. shorter, length - I calculated that I have 1 meter of extra cable guts that I don't need :)), and all the stability issues are gone!
Even turned the OC settings back up out of curiosity (not that I really need it yet) and still everything is stable now.
Later I read somewhere that GPUs over 200W should use two Type4-PCIE cable for exactly this reason - STABILITY !!
Good to hear it worked for you, because I think it's the same situation for me. My SFF with an RX 5700 XT has had months of instability while gaming and in regular use, to the point I didn't want to turn on my PC sometimes. After trying probably all the troubleshooting I could find on the internet, I was already considering swapping parts when I realized there was a second VGA port on my PSU (an EVGA), and well, so far it seems to work. I'm gonna give it some days before ordering some nice cable replacements and start using my PC like it should have been used a while ago.
@@ericc.30 ,
I assume you mean the 8-pin PCIE power cable (not a VGA port, since the RX 5700 XT doesn't have VGA ports at all: tpucdn.com/gpu-specs/images/c/3339-i-o.jpg :))
But I'm glad to hear it fixes your problems too!
I ordered short cables from cablemod, as I couldn't find other custom length cables in europe. Cablemods are crazy expensive, but ... well :)
Yeah, I mean those 8-pin cables; the thing is I don't know why they're named like that on the EVGA PSU lol. That's cool, Cablemod makes some really nice cables, I'll probably go the same way.
@@ericc.30 ,
I had no idea either, but apparently someone opened a forum thread, and finally called them on the phone ... so it's:
E.nhanced
V.endor
G.aming
A.pplicators
"I'm not a electrical-math-a-magician"
That's amazing.
James McReynolds *an
A grammar nazi 3 years late
So how can I download my graphics card?
George Kats delete system32
the baff shut the fuck up
the baff nice thanks mate
Get a 3D printer from the year 2057
the baff sir i need you to alt+f4 immediately, thank you
It would be interesting to see results on the 30 series with 3x 6+2 pins.
2:38 I actually enjoyed that..
RMBL21 XD
There is a reason why this works.
The instantaneous current delivery of large cable diameters increases the ability of a component to draw power from the supply.
Even more could be achieved by applying proper noise and magnetic shielding to each individual cable. This would reduce grunge riding on the cable reducing the load on the component's supply and filtration circuits.
The effect of component (internal) cable shielding on PC components would be less than the effect of cable diameter. The diameter would be responsible for the stability and the one or two tick bump. Shielding would add extra stability and might add half a tick.
All of this assumes a stable power supply such as the Corsair digital series. Cheap PSUs would not benefit much (if at all). A substantial shielded power cable from the wall to the PSU needs to be used as a baseline to maximise all effects.
I have designed Hi Fi power and signal cables for years, which is where these principles are best known.
@@stiltrsh Hi, good question! All power supplies deteriorate over time. That means the headroom they used to have will diminish after extended use. You will notice problems if you start getting random CTDs or BSODs. If everything is working fine, no need to change.
I think you pretty much nailed it.
It's about power delivery and how clean it is.
A standard 6 pin on a modern PSU has the same power potential as an 8. They can both do 3 12v lines. The difference is that the 8 has one extra sense pin and one extra ground. No extra power. The assumption you'd have to make is that somehow this extra sense pin is able to control the power delivery more reliably and that's why we use 8 pins. So under that same assumption we'd have to infer here that having two individual 8's must better be able to control power delivery because of the two independent additional sense pins as opposed to the one daisy chained in line.
Cleanest possible power delivery I guess?
When I moved from a pigtail cable to two cables I actually had to reduce the overclock on my 1080. I was even able to reproduce the results... So your results may vary.
So do I really have to run more cables? I already have the 16-pin connector fed from the one cable; should I save some time and money and just use the pigtailed (standard) way?
One 8-pin can physically supply around 300 watts. GPUs are designed to max out the intake on one 6+2-pin at 150 watts, so one pigtail can deliver the 2x 150 watts easily. Cards with a 200-250 watt TDP are really safe to use with one pigtail if you have a decent PSU.
Is this fact?
I have a Corsair rm650 and a 3060ti MSI gaming X.
Using a pigtail ...
Should I be worried?
@@anthonyplaysbass did you encounter any problems?
@@h5amm916 I ended up using two pcie power cables. Didn't run with the pigtail, better safe than sorry...
Yess this was a concern to me when I received my first 8+6 pin GPU recently... I was right to get 2 cables plugged after all!
Two separate cables or one cable pigtailed? I’ve got a rtx 3060
@@P3DOC he’s talking before 30 series
Guy with electronics experience here: this makes no sense at all. Unless the actual plugs are somehow being directed to separate portions of the PCB and not tied together, it shouldn't make a performance difference. If the card needs more watts, it will just pull them; the wire will get hotter, but that does not mean less current will pass through it. The card might have a sense line connected to each plug, limiting the power just in case when you only plug one of them in, as electrically it makes no sense at all to have less performance.
You should test a variety of cards from different manufacturers to reach a conclusion. Because it makes no theoretical sense.
He should test other power supplies. One of the ground wires is in fact a sense wire, that is in the PCIe spec; not many PSUs actually use it, but the one he is using may. And yes, they just get hot and melt, I've seen them melt.
Look at the comment on Jay's post. The guy basically did the math for this actual cable. Even at 100 degrees Celsius, the resistance change due to heat and the related loss in voltage was within the tolerances defined in the ATX specification. It should be mentioned that at 100C the insulation and plastic parts will probably catch fire. The cable, while very soft, will probably still work though.
"if the card needs more watts, it will just pull them, the wire will get hotter, but that does not mean less current will pass trough it"....Maybe in this situation where the wires are very short but that is a very bad generalization to make. Hotter DOES equal more resistance...which equals less current. Again, I know I am nitpicking but making a broad statement like that is bad. That is like someone trying to tell you that your car that makes 100 hp at sea level will make the same hp no matter the elevation above sea level. Sure, at 100, or 200, or even 1000 feet it might not make a difference (like the short cables here)....but at 10,000 feet that car is going to make less HP.
In my example:
Resistance = heat = elevation
Wattage seen by card = HP
The greater the difference in heat (elevation), the greater the loss in wattage (HP).
So, I guess what I am trying to say is that making broad statements that boil down to "heat doesn't matter" is just a bad way to explain all this without a qualifier, such as "heat doesn't matter in a cable that is less than two inches long".
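To put rough numbers on the heat argument, here's a minimal Python sketch using the textbook temperature coefficient of copper. The 13-milliohm round-trip harness resistance and the 16.7 A load (roughly 200 W at 12 V) are assumed round figures for illustration, not measurements of any particular cable.

```python
# Sketch: how copper resistance rises with temperature, and what that does to
# the voltage drop. All values are generic/assumed, not measured from a real PSU cable.

ALPHA_CU = 0.00393      # per deg C, textbook temperature coefficient of copper
R_20C = 0.013           # ohms, assumed round-trip resistance of one harness at 20 C
CURRENT = 16.7          # amps, roughly a 200 W draw at 12 V

def resistance_at(temp_c: float) -> float:
    """Harness resistance at temp_c, scaled from the assumed 20 C value."""
    return R_20C * (1 + ALPHA_CU * (temp_c - 20))

for temp in (20, 60, 100):
    r = resistance_at(temp)
    print(f"{temp:>3} C: R = {r * 1000:.1f} mOhm, drop = {CURRENT * r:.3f} V")
```

In this toy model even a worst-case hot cable only adds a few hundredths of a volt over a cool one, which fits the point above: heat raises resistance in principle, but over a couple of feet of wire it barely registers.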
voltage is U not V in calculations
Silviu Stroe Voltage drop. It happens any time there is a sudden load at the end of a wire, the "load" in this case being the GPU. Don't know if you've ever seen it, but older vehicles (80s) with a volt meter that actually reads the real voltage of the charging system show voltage drop any time a load is applied, for example a blinker. This is obvious to anyone who has ever installed amps and subwoofers in their car: if you have two amps, both of which require 8 AWG, you don't run one 8 AWG and then splice it to feed both. You run at least 4 AWG and splice that, or run two 8 AWG wires.
Because of the resistance of the cable the current will drop slightly: V = I x R. The voltage (V) provided by the PSU remains constant, but twice the cable length carrying current to the graphics card means the total cable length has twice the resistance, so (R) becomes slightly higher, causing a small decrease in current (I). When using two cables the resistance (R) will not increase, so there will not be a decrease in current. But since the wires are already pretty thick, the loss will be really small. Still, you have proven the laws of physics really work, thanks for the video!
I think that it has something to do with the cable resistance, and the fact that when you try to use the same cable to power the 2 slots of the gpu, it demands more power in 1 cable which might increase its temperature, changing its resistance making it lose a little bit of power.
2:30 greatest transition ever! :P
what about pci e extenders/risers
Duck Souls They will add delay to the signal, so it will hurt your benchmark scores. The user won't see any difference. I had to test this for HP on servers with NIC cards. We saw a 1% to 0.1% increase in latency and about +/-2% on speed with 10 Gb NIC cards. The cool part is that with 100 riser cards we got 1000ms of lag and a 10 Mb NIC connection lol.
+Lee Landis - Fascinating... I'm surprised that the network card still even worked with one hundred risers; you'd think that it would have started throwing errors and basically stopped working much sooner, given that PCIe signals weren't at all designed to be carried over long distances. I'm actually planning to use riser cables to relocate my boot drive (an M.2 SSD in a PCIe-to-M.2 adapter) into my to-be-custom 3D-printed PSU shroud, and I did worry about latency given that I'll be needing two riser cables to reach there, but now I'm a lot less worried. :p
Robert Faber It took a lot of work and was really touchy; interference was the biggest hurdle to overcome. PCI Express is really strong and good, but it only worked when it wanted to. 100 was the maximum we got to work; 101 was too much. My boss wanted to know the max so we could send an accurate rating to a data center for the specs. The same data center complained that the HP-made riser in the server was slowing down the NIC. We proved it was within margin of error, but they pushed, which led to the 1-month project that got me laid off after they filled my job with a temp that was more liked.
+Lee Landis - Sorry to hear that...
I think someone here is totally full of it.
It depends on the gauge of the wire used, among other things like connection type, material at the junction, whether it's modular or not, and length of cable. All of these things add resistance to the wire and therefore will add a small bit of voltage drop, which will change the operation of the VRMs on the card. I'm sure an engineer can chime in with more specifics. I only took a few engineering courses and I don't use the knowledge every day.
My Seasonic power supply had an asterisk or something in the manual that said something like "If your GPU draws more than x watts, use two separate wires."
As someone who is currently studying computer and electrical engineering at LSU, this is a bit surprising. I suppose a higher resistance could account for some of the performance difference, but it should be negligible. I wonder if it has more to do with how the card draws and uses power, instead of the actual cables themselves. It could also be a factor of the power supply as well. There are so many variables here, but it is a very interesting inquiry that I never would have thought of. It'd be very interesting to take this to a lab on campus and see exactly what is going on. Great video.
I was also worried that adding cable extensions would cause the same issue; maybe if you go more in depth you can test that as well. Awesome video, Jay.
Well, from what I remember from basic electronics, electricity takes the path of least resistance. For two resistors in parallel, aka two wires, you calculate 1/R1 + 1/R2 = 1/Rtotal, then take the reciprocal, so Rtotal = 1/(1/R1 + 1/R2). Also, the total resistance of two resistors in parallel is lower than the smaller of the two.
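For anyone who wants to plug numbers into that parallel-resistance formula, here's a tiny Python sketch; the 20-milliohm per-harness figure is just an assumed value for illustration.

```python
# Parallel resistance of identical cables: two harnesses sharing the load
# have half the resistance of one, so half the voltage drop at the same current.

def parallel(*resistances: float) -> float:
    """Total resistance of resistors (or cable runs) wired in parallel."""
    return 1 / sum(1 / r for r in resistances)

one_cable = 0.020                       # ohms, assumed round-trip resistance
two_cables = parallel(0.020, 0.020)     # two identical harnesses in parallel

print(one_cable, two_cables)            # 0.02 vs 0.01 ohms
```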
I think the big question to ask to help understand what is going on is whether your PSU is running single or multiple 12V rails?
Jay, I am an EE with 30+ years of experience. In my experience connector pins present resistance, and parallel paths reduce that resistance. The formula looks something like this: assuming that each circuit path has the same resistance, the total resistance = resistance of each path / number of paths.
Using two, instead of one, wire "bundles" decreases resistance, so the voltage drop is smaller under load.
So two instead of one = more volts
But I'm still not sure why it gains performance; the core is powered from the VRM, which should be working fine even with a 0.5V lower input voltage.
If you want to see some changes in voltage just connect precision voltmeter to input on GPU and you should be able to see differences.
But it can vary much more between PSUs than between wires, because some PSUs supply 12.1V instead of 12V, which will already give you those couple of points more in benchmarks.
Sory for bad engrish.
I'd be curious to see the measurements under an oscilloscope. Maybe the graphics card pulls 100W average, but does short bursts of 500W, which would cause a huge voltage drop.
Ollyweg 0 copy paste here, comments section is messy...
+Luca Fuoco
that's very common, so I guarantee that is happening.
Ergo the first-party power supply recommendations are bullshit, and you should only trust places which fully test with an oscilloscope so that you can see what the spikes are. To which end, those spikes are what your power supply needs to be able to provide in order to be stable.
Arek R.
probably the power limit in amps, that's why
@Luca Fuoco
Well that's just how VRM works, there are caps that help with that near the 6/8pin input, but well...
The difference in GPU performance is due to the resistance of the wires connecting it to the power supply. All wire has resistance which causes a voltage drop across the connection. A wire’s resistance depends on 3 factors: the cross-sectional area (referred to as the “gauge”) of the conductor, the material of the conductor (i.e. copper or aluminum), and the total length of wire in the circuit. The total length includes both the positive (+) line as well as the negative (-) (Ground) line returning to the power source. A short thick wire will have very little resistance and a long thin wire will have a very high resistance. The voltage drop across the circuit is equal to the amount of current in amps (I), multiplied by the total resistance (R) of the wire used (V = I·R). The value V is how much Voltage is lost across the connection.
Power (Watts) is the product of current and voltage or P = I·V. If, for example, a GPU uses 200 Watts from a 12 V supply, then 16.7 Amps of current are required (which is A LOT). A typical GPU connector is about 2 feet long (4 feet total of wire round trip) and should usually be made from 18-gauge copper conductors. The GPU 6 pin connector uses 2 positive wires and 2 negative wires to transmit power (the other 2 are for sensors). When using just one connector the Voltage drop calculates to 0.221 Volts. This means the voltage transmitted to the GPU is reduced to 11.78 V. When using two connectors the amount of current traveling through each is cut in half. So the voltage drop across two connectors is 0.111 Volts, half that of a one connector harness setup (So the GPU runs with 11.89 V).
Thank you for your experiment! Good to know a reduction as small as 0.1 V has a measurable impact on performance. If you wanted you could improve performance even further by using thicker wire, or cutting the connector to a shorter more precise length for your setup. This is a very quick explanation so please let me know which concepts you would like me to clarify.
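Here's a short Python sketch that re-runs that arithmetic with generic wire-table numbers (about 6.4 ohms per 1000 ft for 18 AWG copper). It lands close to the 0.22 V / 0.11 V figures quoted above; the exact result depends on the real wire and connector resistances, which are assumptions here.

```python
# Re-running the voltage-drop estimate above. Assumptions: 18 AWG copper at
# ~6.4 ohms per 1000 ft, a 2 ft harness (supply leg + return leg), 2 power
# conductors per connector, and a 200 W load on the 12 V rail.

OHMS_PER_FT_18AWG = 0.0064   # approximate, varies by wire manufacturer
HARNESS_FT = 2.0             # one-way length of the cable
LOAD_W, RAIL_V = 200.0, 12.0

current = LOAD_W / RAIL_V    # ~16.7 A total draw

def loop_resistance(parallel_conductors: int) -> float:
    """Round-trip resistance with the given number of conductors per leg."""
    one_leg = OHMS_PER_FT_18AWG * HARNESS_FT / parallel_conductors
    return 2 * one_leg       # supply leg + return leg

for cables in (1, 2):
    r = loop_resistance(parallel_conductors=2 * cables)  # 2 power wires per connector
    drop = current * r
    print(f"{cables} cable(s): drop ~ {drop:.2f} V, card sees ~ {RAIL_V - drop:.2f} V")
```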
that title is great
When you use two cables of the same gauge to carry current, it's the same as using a single wire of three gauge numbers lower. For example, 2x 16 ga is the same as 1x 13 ga, so you effectively get a bigger pipe for more current flow.
you need a bigger pipe.
In theory this is the case, but in practice it does not work exactly like that, because no two wires are the same; they will have different resistances due to the manufacturing processes when the wire was made and so on. Electricity finds the path of least resistance, so one wire will see more current on it than the others. The best way is to have a single home run to the device and let the device's PCB marshal out the power. But sometimes the best way is not the most practical either.
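The "minus 3 gauge" rule of thumb falls out of the AWG definition, where cross-sectional area roughly doubles every 3 gauge numbers. A quick, purely geometric sketch (it ignores the unequal current-sharing caveat just mentioned):

```python
# Cross-sectional area from the standard AWG diameter formula, to check the
# "two 16 AWG wires ~ one 13 AWG wire" rule of thumb.
import math

def awg_area_mm2(awg: int) -> float:
    """Area of a solid conductor of the given AWG, in square millimetres."""
    diameter_mm = 0.127 * 92 ** ((36 - awg) / 39)
    return math.pi * (diameter_mm / 2) ** 2

print(f"2x 16 AWG: {2 * awg_area_mm2(16):.2f} mm^2")   # ~2.62 mm^2
print(f"1x 13 AWG: {awg_area_mm2(13):.2f} mm^2")       # ~2.62 mm^2
```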
My guess would be that the PSU regulates the voltage and amperage for each socket and limits the output for each outlet, or each outlet has a maximum set output to prevent overheating or damage to the PSU components. A relatable example is to compare a PSU output to a two-socket wall outlet: if you attach an 8-socket power bar to a single socket, it can still provide power to 8 items. But 8 items running off a power bar in one socket will not be able to draw the same amount of power as they would if each were plugged into its own socket in the wall; the items would still work, they just wouldn't be working at their full capacity. Powering 8 items off a power bar in one socket also presents a risk of damaging the power bar and wiring, which could result in an electrical fire, though in most modern houses the result would be tripping a breaker or blowing a fuse. Nice video Jayy
Jay, this is the video that I like the most! I used an EVGA 750W PSU, an EVGA 1060 SSC, and the EVGA Power Link. The 1060 is a 6-pin card; the Power Link can be set up to take an 8-pin input. I input 8-pin, and the output is 6-pin. The overclock of the 1060 SSC in Valley was running at 2151 stable! I think this is a silicon-lottery winner, but the power delivery lets the GPU eat as much power as it wants!
I actually gained 25 MHz core clock on my 1080 Ti because of this video. God damn, Jay, and I thought I knew everything. Thank you so much!
Gained ~1 fps #worth
Tiboonn why are you complaining about FREE performance?
Justin Provido not complaining ^^ I added worth in the back of my sentence :)
A couple things could affect this. 1. Some power supplies have more than one 12v rail. So if your second cord is coming from that other rail, then you now have two rails sharing the load.
Each rail is going to have a set of capacitors. So you effectively double your capacity in this situation. Smoothing out the power more.
2. Even if it's a single-rail power supply, you are going to have less resistance in the cables. Thicker cables (or more of them) = less resistance. Because the card runs on 12v, it will pull relatively high amps.
Power = voltage * amperage. If the card ran at, say, 24v it would use half the amperage for the same watts, and therefore need less thick wire.
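A quick sketch of that last point: for the same wattage, doubling the rail voltage halves the current and roughly quarters the I^2 * R heat lost in the cable. The 15-milliohm harness resistance and 300 W load are assumed values, purely for illustration.

```python
# Same wattage, different rail voltage: current falls, and cable heating
# (I^2 * R) falls with the square of the current. Values are assumptions.

R_CABLE = 0.015          # ohms, assumed harness resistance
LOAD_W = 300.0

for rail_v in (12.0, 24.0):
    current = LOAD_W / rail_v
    loss = current ** 2 * R_CABLE
    print(f"{rail_v:>4} V rail: {current:.1f} A, ~{loss:.1f} W lost as heat in the cable")
```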
How much do gpu risers affect performance?
Not enough to care. The airflow changes make a bigger difference. Be careful about where it pulls air from and where the hot air goes after the pass. Some cases put no thought into that at all with riser setups. :(
Jay! Some graphics cards use LEDs, RGB and such, which adds even more needed power than a so-called vanilla GPU. This would be the main reason why separate power cables are needed to power some of them.
The entire LED and RGB setup on a GPU will consume 10-20w under extreme conditions, so nah.
You've said that you started in the car audio world. I did as well, but stayed there for 20 years. Bigger power/Ground wire, done properly, has netted me discernible sonic results. It didn't make sense, but it really did sound better. This, though the power requirements were easily met.
I think these are things that are pushing the outer boundaries of power delivery, hi-fi and overclocking. A hardly noticeable difference in voltage consistency can make a difference while one is on the ragged edge. Well, actually it's a pretty smooth edge but one that can yield better performance if smoother.
Of course these things don't make a difference to most, but. . . .
Something strange I encountered when using a single cable with 2x 8-pins to my GPU: there was a 5 second delay before the PC would begin to POST (power on). I struggled to troubleshoot this issue, and finally, after using 2 separate cables, the delay is no longer there. My setup is an X570 MB / Ryzen 5600X with a 6800XT GPU. The power supply was reused from my old SLI build and it's a beefy 1200W Seasonic.
This may be due to the fact that when cables get warmer they become less efficient, meaning more resistance, which decreases the voltage and in turn reduces performance. So, spreading the load over 2 cables will reduce how warm the cables get, meaning they will be more efficient (less resistance). This is also why they supercool superconductors.
very roughly
yeah it was just a quick explanation
What about the PCIE extensions? Like ones that move where your graphics card plugs in?
GERSBOXERS they do have a minute performance impact, just use as short cable as possible
That tiny little bit of added resistance should make no difference unless you get like 1 meter long extensions, though I bet even those wouldn't be much of an issue.
You can suffer from some EM interference. It can vary a lot depending on the cable and where you route it. Ultimately, in average use with a great riser you won't see a change, but cheap junk that rests against another PCI-e slot can be noticed.
GERSBOXERS Extensions create some latency because it's a data bus (light only goes so fast), and they create interference, so they WILL lower performance, the same way an external GPU would, but it should be only 1-2%.
It has to be pretty long before it does anything. The ones that come with some itx cases and those 90 degree risers for mounting GPUs vertically change virtually nothing performance wise.
I did a google search on this question because I am having sound issues. Currently this is my set up; MSI Z370 GAMING PRO CARBON DDR4 SLI ATX Motherboard, Intel Core I7-8700K, Corsair Vengeance RGB PRO 16GB (2x8GB) DDR4 2666MHz, GPU EVGA Geforce RTX 2070, PSU Corsair HX850i High Performance. The CPU is cooled by an AIO; Corsair H115i RGB Platinum AIO. I have a ASUS Strix SOAR sound card installed to provide 7.1 surround sound for my Razer Tiamat 7.1 V2 headset.
So I have one PCI-E going to the GPU and the splitter from the same cable plugged into the sound card. I was wondering if the sound issue I am having could be caused by this but I think it is unlikely since I have not had this issue from day one. Glad to see Jay had a video on this.
I think an important factor in your observation is when you used the 2 cables, were they on different rails? If so, that will probably explain this.
From what I know during my courses in university, an electrical component can cause small variations in the voltage on a cable. If you link a component that does this to a component is very sensitive to power variations, then it can behave incorrectly. Of course here it is on a very small scale, but by re-using the same wire you are linking the components, while if you use multiple wires that are on different rails they remain separate.
Wow, that card runs cooler than my watercooled GTX 1080 does! :O
But what was the voltage at the card of one cable vs. two? Would have been good to probe the connectors during the tests.
The resistance of running the two PCIe connectors off of one cable is going to cause a voltage drop. W = V X A. If the voltages are lower, the current has to increase. This, in turn, creates even more resistance and also can increase temperature.
LOL! It has it's "own power supply", yes. A DC to DC "power supply", but it has to take voltage in to create voltage out and the tolerance of the FETs used on the card is going to be affected by the voltage in.
Extremely interesting video! I'm super curious to know why that might be the case, and hope you do some sort of follow up with more testing and/or professional comment.
Perhaps GPU and/or PSU manufacturers have some input?
Also fun fact, if you're getting the error nvlddmkm and changing TDR values didn't help and underclocking your GPU ALSO didn't help, try using two power cables from PSU to GPU. That actually seems to have solved my issue and increased my performance in Blender as well as games.
Jay, here's my hypothesis as to why this could happen. When amps are drawn through a wire, the wire will have a voltage drop. More amps drawn over a given wire will cause a larger voltage drop. So using two wires means that each one is carrying half the amps of the single-wire config, so there is a smaller voltage drop on each. This would only be a very minimal change in voltage that I doubt could be measured with a regular handheld multimeter, but a very small change in voltage could cause the tiny change in performance you saw. This is all just like how you can under-volt a CPU to lower the clock speed.
now my big question is do the PCI Express Riser cables for the slot make any difference?
yes, next video hopefully.
You get a higher resistance, and that means you get a slightly lower voltage.
Would like to see what this does today on a 2080TI and 4K gaming.
Holy crap the new set is looking amazing. Love the tree in the wind out the window as well.
I know you said you're not an electrical wizard, but here's some correct terminology to use: you don't "draw voltage"; voltage is simply there if the wires are connected. You can draw both power and current. Keep up the interesting videos.
I've always used 2 separate cables for reliability but interesting to know it may have some small improvement as well!
That background tho ♥
2:36 Impressive lol
A single cable harness means higher resistance in the power delivery. It is different with 2, where the load can be better distributed. Approximately the same effect can be achieved with thicker cables.
Drawing more current through a single cable would minutely drop the voltage meaning less overvolt capability,
but I expect what's happening is due to 'high frequency inductive effects'.
A decent analogy is to think of a freeway, during normal hours everyone drives quite fast, but when everyone needs to travel at once things can get blocked up.
Wires act like this too, specifically they resist fast changes in the flow of charge, so every time the card executes instructions, the power consumption spikes, meaning charge flow has to change rapidly.
Adding a second cable (more lanes to the freeway) means less inductance and power can be delivered more effectively when it spikes.
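A toy version of that freeway picture, using V = L * di/dt. The inductance, step size and ramp time below are illustrative guesses rather than measured values, and mutual coupling between the cables is ignored.

```python
# Inductive dip during a sudden load step: V = L * di/dt. Two cables in
# parallel roughly halve the inductance (ignoring coupling between them),
# so the momentary dip before the card's input caps catch up is smaller.

L_ONE_CABLE = 0.5e-6     # henries, assumed inductance of a single harness
DI = 10.0                # amps, assumed size of a sudden load step
DT = 20e-6               # seconds, assumed time the GPU takes to ramp its draw

for cables in (1, 2):
    inductance = L_ONE_CABLE / cables
    dip = inductance * DI / DT
    print(f"{cables} cable(s): transient dip ~ {dip:.2f} V")
```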
You should have rerun the original test again to prove a drop again or at least mention that you did for the doubters :-)
Abyss' end Definitely this, I was hoping he'd do that because frankly I still can't believe it makes a difference!
Who do you think you're watching, Linus? He doesn't have the staff for that. (Aside from my joke, I agree.) To prove something you must fail at disproving your theory. Science 101.
I was going to say this, redo both tests again after doing both of them initially to make sure the difference is persistent. The difference in results could simply be attributed to him having to restart the computer to change out the power cables, and not the power cables themselves.
Exactly... I would recommend doing the test 2 more times with a reboot for both test scenarios and averaging everything up. There should be no difference between the two. Unless one of your connectors has bad contact (a bad connection, i.e. higher impedance, which would result in a voltage drop within the connector itself), but this is very unlikely. The difference is mainly due to how the tests were done. Do the same test 3 times and you'll see the averages will be very similar. :)
At current view count, people have already spent nearly 13,000 hours combined watching this video. His few hours spent benchmarking/editing/uploading this video isn't an excuse for improper testing. Especially when he gets paid to do so.
RGB PCI-E cables?
Is it ok to cut off the extra 8 pin on the pig tail cable? My gpu only needs 8 pins.
Angelo you know by now? Same question here 😉
@@callmetarif Yes, it's OK to cut the pigtail if you don't need it, BUT be careful: when you cut it there will be a bit of exposed wire, so either tape it up, or just leave it alone and don't cut it. If those cut wires touch, they will short.
Well, I've always used 2 cables on the video cards, but then again it's like in audio. When you have a 100 watt amp, then switch to a 200 watt amp, but run them at the same level. The 200 watt amp has more reserve power, so it will sound effortless compared to the 100 watt which may be more constricted sounding, if that makes any sense to you.
Even though the video card can run on 1 line, adding another line gives you more reserve power to draw on.
So, the card runs more effortlessly, especially when running demanding material that has peaks in power consumption.
I could possibly give you all of the technical terms, but it's been well over 55 years since I graduated from electronics school.
Like in audio though, the power supply design makes a huge difference. I had a Phase Linear 600 watt per channel amp & it sounded okay, or so I thought. Then I replaced that amp with a McCormack 150 watt per channel amp. The sound was unbelievable! It sounded as though the same speakers grew a set of balls. The bass was down to 16 hertz, whereas with the Phase Linear it sounded like the bass cut off at 43 hertz. And the mids & highs were more detailed & open.
The Phase Linear weighed 36 lbs., but the McCormack weighs 53 lbs. The difference was in the power supply.
I like that shot with the test rig and your monitor in frame.
I remember watching this two years ago when you released it. Here I am now watching it again after upgrading my 1080Ti to a 2080Ti
Same. Small world eh
@@weavercs4014 and second time around, I still can't be arsed to run a second 8 pin lol
@X X learn something new everyday lol
Can you do it with pci-e riser cables, like the ones in the A4-SFX?
Fans running at, uh... the speed of sound. Lol
4th year Electrical Engineering student here. The cables each have a small amount of resistance to them. The more amperage you draw out of each cable, the higher the voltage drop is (Since Voltage = Resistance * Amperage). So the reason each cable is limited to 11 amps each is not because they would blow up from too much current, it's because the voltage drop would be so much that the cable can no longer supply a proper 12 volts. So adding in the 2nd set of cables reduces the voltage drop, allowing the graphics card to receive closer to the 12 volts it needs.
As someone else said, the more current you put through a wire, the more energy you lose to heat. This is why those big power lines run at massive voltages, to reduce power loss.
I highly doubt you would see much loss at these currents and distances.
"electrical mathemagician"
revisit please for RTX 3000 series x2 and x3 8pins, thanks in advance!
I literally just asked myself this question the other day lmao wow
I'm not surprised by the results... The wires themselves have a little bit of electrical resistance. Typically PSU manufacturers only use 18 gauge wire. Just today, before watching this video, I was custom sleeving some 6-pin and 8-pin PCI-E power cables to length (a much shorter length) and using thicker 16 gauge wire. This video makes me feel great about the work I've been doing.
@JayzTwoCents *Resistance is introduced at every part of the circuit when you route power from one place to another.* The most significant source is going to be the power supply (called output impedance, not often an advertised rating, though.) and maybe dodgy connections with loose connectors. As you pull more current, the voltage sags. How much sag occurs for a given current depends on the total _resistance_ or _impedance_ of the system. Even as an Electrical Engineering student, I was also quite surprised by the results, given that the GPU essentially has local regulators (often being termed "VRMs" in this particular hobbyist field) to convert the 12V at a lower current into the Vcore voltages with much lower impedance and higher current capabilities. These regulators should have good line-rejection and load-regulation, meaning they can maintain a very precise voltage to the important bits even with dips and sags on the input line, and as the load the GPU presents changes, respectively. So perhaps this has more to do with the ground, and the 0V (ideal) rising up several hundred millivolts as current is returned to the power supply, and this causing some sub-system to throttle performance to prevent logic errors from occurring with a "dirty" (electrically noisy) ground reference.
Can PCI-E extender cables affect GPU performance, Jay?
dont be silly, seriously, the answer is no
What if they are those 16x (or 8x) -> 1x that you see sometimes ;)
That I have to think would affect performance.
Yes, any time you're adding to the medium of the signal or power, you're also increasing the resistance, though it's going to be rather insignificant for short distances. Also, you're doubling the exposure of uninsulated connections and allowing for more interference. The degree to which it affects performance can still be explored. Various websites and YouTubers have seen little impact on performance, as much as 2% in some tests.
The answer is absolutely yes - however, the relevance of this yes varies highly.
It's crucially important for processors, as they need to complete their tasks in nanoseconds, and therefore how far away your RAM slots are from the processor directly affects performance - but other stuff has more flexibility.
what about PCIE-8pin power sleeved cable extender
@JayzTwoCents This is just some reboot FPS RNG. For GPU cables there's only +12V and ground, and all those +12V wires come from the same pin inside the PSU. More amps = more watts = more heat. An aluminium or copper cable has 0.0041-0.0043 ohms @ +20°C. Even if the temp goes up to 80°C the resistance won't go over 0.01 ohms, which doesn't affect the voltage going to the GPU. So I call this FPS increase just boot RNG. Myth busted.
peNKku the max stable clock wouldn't be affected if this was the case. microprocessors are extremely sensitive, and the added leads reduce resistance, and this single case shows it *may* increase stability in some cases.
Gap801 Let's say an 8-pin connector has 4x 1mm2 of 12V supply and the same for grounding. An extremely long, 0.5 meter, 4mm2 supply is rated at 41.8A at 12V and has 0.000333... ohms of cable resistance. Heat won't raise that by even 0.001 ohms at 800°C, which has no effect on the voltage. A Titan Xp has a 250W TDP; that's 20.8 A. So there's plenty of cable in a single supply. The same goes for the CPU: sometimes you boot your PC and can run higher clocks stable, and the next time you get a bluescreen.
I tried it and it didn't work. Even with the same graphics card, avg FPS is still the same and overclocking performance is still the same. I think it's just that power supply that you have.
I haven't seen a lot of comments giving thoughts on the why of this. The way I understand the situation, it's quality of power, not amount. Your PSU probably splits the PCI-e power cables across separate rails. This allows the GPU to pull the same amount of power with less strain on the PSU. Your card may not be able to pull more watts than a single power cable can provide, but the quality of the power drops the closer it gets to that limit. A good comparison (although it's kinda in the opposite direction): think of speakers. If you have a 500 watt speaker and you hook up a 200 watt RMS amp, you can get the same volume as a 500 watt RMS amp turned up to where it's putting out 200 watts. At that point the volume of the speaker is the same, since both are producing 200 watts to the speaker, but the sound quality is better because you are farther from the limit of the amp. If you are pulling 50% or less of what can be supplied by one line with a pigtail, you will probably not notice a difference. However, if you are pulling 70-80% or more of what can be supplied by one line with a pigtail, then you will probably benefit from the better quality power of running two lines. Sorry for being long winded.
It's most likely an impedance or cable capacity ("electron pressure") issue. With older cards this wasn't a big issue because they wouldn't use multiple power draws as their power draw was relatively low. Newer cards can use HUGE power draws that exceed anything previously seen. Thus the capacity to reach cable power max is much higher now. The cables can only carry so much. If you are approaching cable max the "electron pressure" reacts the same way any other "pipeline" would react, a drop in delivery. So having 2 independent cables decreases the "electron pressure" as they are flowing through different pipes. This means that better power flow.
I have tried to explain this simply, but it is somewhat complicated. Simply put, higher-draw cards need larger or more "pipes" to deal with the current.
last pass is genuinely good, I recommend
3 years later, not so much lmfao
@@bxeagle3932 They heavily restricted the free version now, you can only access your lastpass account with one device at a time, so you need to choose to either use your PC or phone but never both
Who needs grammar anyway amirite
, amirite?*
Well, you do.
hi jayz i am one of your best fan im 13 years old i learn so much of you
Martinus stander you obviously didn't learn to speak probably
Should've used the 13years to learn English
Yeah, I'm confused by this result, because what you're doing is paralleling the two connectors, which means lower total resistance, which means more amps, but the power supply should be able to handle it, and paralleled connections will have equal voltages. So I think the simple V=IR has nothing to do with it (like most people in the comments are saying).
There must be some hysteresis loss. I know it's DC power, but I'm not sure if it's constant; I'm gonna have to look up how GPUs pull power to come up with a theory for myself.
Well, when you draw the same amount of amperes through one single wire as you would through two wires with the same cross-section, the voltage will come down a little bit. The higher the current, the lower the voltage, as the wire will get warm and the resistance of the wire will go up. So yes, you will have a slightly higher +12V on two wires than on one.
I was wondering this JUST yesterday...... spooky.
I feel JayzTwoCents is monitoring my thoughts. Ah well, i know how to fold them tin foil hats anyway....
It's the YouTube algorithm.
This happens because with the plugs being in series the power has to flow through the card once before going to the second plug. This causes some power to be lost as heat into the surrounding environment, so the second plug isn't going to get the same amount of power. It's like putting 2 light bulbs in series: even if the power supply is rated much higher than the two light bulbs, there is going to be a voltage drop over the first bulb due to the resistance of the filament and energy being lost as heat and light, causing the second bulb to be slightly dimmer. This is one of the reasons lights in houses are wired in parallel.
2 things: resistance in the cables can increase as you pass more amps through them. And for PSUs that have multiple rails, the further you are from the limit of the switching circuitry (the transistors and such, assuming it has separate ones for different rails), the cleaner the output, as the amps are lower. If you look at PSU reviews that include a waveform test, you can see that the waveform varies less at 50% load than it does at 100%, so you get a more DC-like output.
Jay, remember back in the day when PSUs got mounted at the top of the PC case? I often think to myself, why are PSUs never mounted at the front of the case with the shortest cable lengths possible? What you have uncovered here is power drop and heat in the rail and wires. Normally you can't see real performance loss until, as you have done here, you start overclocking. As you know, overclocking is very reliant on stable voltage, and the harder you overclock the more stable you want your voltage. I suggest getting the rig, overclocking it to max CPU/GPU/RAM settings as is, then making some kind of PSU mount over the middle of the motherboard, ordering some custom wires with the shortest wiring route, and re-benchmarking.
I love the videos; they're so random at times, but every time it's something worth watching and highly accurate. Keep it up. I'm pretty stoked that my MSI GTX 1070 Armor GPU with GPU Boost is stable at 2036 MHz, and with my i7 7700K overclocked to 4.2 GHz it gets me a boost of almost 80 fps in the Heaven benchmark with all settings maxed. I didn't want to mess with the CPU because I was getting temp spikes near the 80C mark on air, but some tweaks in the BIOS got the temps down, and I went from 3.2 to 4.2 on the CPU and temps never go over 54C, and that's after hours of gameplay. Now the stress tests take it to the 60/70 mark, but once I'm done the temps go back down, which is normal. And all thanks to your videos.
Always use 2 cables for high-end graphics cards! Even though you can't always see the difference in benchmarks, you get far fewer frame drops, less lag and a more stable system!
Pigtailing increases the resistance in the cable that the electrons have to pass through. In physics, your 3 variables for this situation are V (potential difference in volts), I (current in amperes) and R (resistance in ohms). The equation to figure this out is V=IR, so if you use 2 different cables and the voltage is the same throughout the system (PSU to GPU @ 12V), the resistance will be lower, which allows more current and total charge through the wire, effectively allowing more performance as more electrons can reach your GPU. @JayzTwoCents
My thoughts on why this is the case: it's because Nvidia's power limit is so heavily enforced. The pigtail cable causes a bigger voltage drop at the end of the connectors, and a lower voltage means more amps for the same power. Since the power is measured via the voltage drop through a diode, more amps means a bigger voltage drop through the diode, which means the card effectively thinks it's using more power, hence it's probably boosting less and giving less performance. If this is true, doing the same thing on an AMD card would result in no difference in performance. But testing is needed to prove this.
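A sketch of that hypothesis in code: if the board senses current (via a shunt or diode drop) but assumes a nominal 12 V when estimating power, a larger cable sag makes the estimate read high, so the card hits its power limit and throttles earlier. Every number here is hypothetical, chosen only to show the mechanism.

```python
# Hypothetical power-limit behaviour: the sensor sees the real current, but the
# firmware multiplies by a nominal 12 V, so a sagging input exaggerates power.

POWER_LIMIT_W = 250.0
REAL_DRAW_W = 245.0          # what the GPU actually consumes (assumed)

for vin in (11.9, 11.7):     # assumed input voltage: two cables vs one pigtail
    sensed_current = REAL_DRAW_W / vin            # the sense element sees the real current
    estimated_power = sensed_current * 12.0       # firmware assumes a nominal 12 V
    throttling = estimated_power >= POWER_LIMIT_W
    print(f"Vin = {vin} V -> estimated {estimated_power:.1f} W, throttle: {throttling}")
```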
The difference is not necessarily in the use of two connectors from the same cable. The key is to use the same number of power wires in both connectors (depending on board design), and the gauge of the wires being used should be enough to provide the required amperage with no drop in the supply voltage. I used to design the electrical power for buildings in New York City, and the wire gauge (or wire size) makes a big difference.
As a qualified electrical-mathmagician, I can tell you that the only difference that could affect anything would be the extra distance of the pigtail (causing a very small amount of extra resistance), or if you were to somehow pull so much current that the power supply throttled the output through the single cable in order to protect the conductors.
The second one is the issue. For example, the 3000-series cards draw up to 1.29 times their rated wattage in transient spikes. So I guess more independent cables are better able to handle those.
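Rough arithmetic on those transients, as a closing sketch: take an example board power, apply the ~1.29x factor mentioned above, let the slot cover its 75 W, and split the rest across the cables. The 320 W rating is just an example, not any specific card's spec.

```python
# Per-cable load during a transient spike, for 1, 2 or 3 separate PCIe cables.
# Board power, transient factor and the 12 V rail are taken as given/assumed.

BOARD_POWER_W = 320.0        # example rating
TRANSIENT_FACTOR = 1.29      # figure cited in the comment above
SLOT_W = 75.0                # what the PCIe slot itself can supply

peak = BOARD_POWER_W * TRANSIENT_FACTOR
via_cables = max(peak - SLOT_W, 0.0)

for cables in (1, 2, 3):
    per_cable_w = via_cables / cables
    per_cable_a = per_cable_w / 12.0
    print(f"{cables} cable(s): ~{per_cable_w:.0f} W / ~{per_cable_a:.1f} A each during a spike")
```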