SERIOUSLY?? MORE bad news for NVIDIA's 40 Series...

  • Published 22 Sep 2022
  • **Links to articles on this issue**
    WCCF article PCI-SIG Warns of Potential Overcurrent/Overpower Risk With 12VHPWR Connectors Using Non-ATX 3.0 PSU & Gen 5 Adapter Plugs (wccftech.com)
    wccftech.com/atx-3-0-12vhpwr-...
    Zotac Adapter Spec ZOTAC GAMING GeForce RTX 4090 AMP Extreme AIRO | ZOTAC
    www.zotac.com/us/product/grap...
    Article about Zotac Adapter specifically Zotac Reveals Surprising Fragility Of 12VHPWR Cable Adapters For GeForce RTX 40 Cards (hothardware.com)
    hothardware.com/news/zotac-hi...
    Steve's video on NVIDIA's report to PCI-SIG about the 12VHPWR cable (not the adapters) after 40 cycles • HW News - Melting GPU ...
  • Science & Technology

COMMENTS • 7K

  • @NathanTyph
    @NathanTyph 1 year ago +10158

    AMD seriously could not have a better launch window than they do right now

    • @KacKLaPPeN23
      @KacKLaPPeN23 1 year ago +289

      This problem will also affect AMD...

    • @666Necropsy
      @666Necropsy 1 year ago +360

      AMD's last cards were kinda gimped. I think they can blow it out of the water this launch. The bus width on the 40 series is a joke.

    • @DagobertX2
      @DagobertX2 1 year ago +140

      We don't know anything about their new GPUs; they could be in the same situation even if their GPUs are more efficient, as they hinted.

    • @thecomicguy2785
      @thecomicguy2785 1 year ago +200

      @@KacKLaPPeN23 Pretty sure AMD isn't switching to the new standard yet. By the time they do, this may very well be fixed.

    • @registeredblindgamer4350
      @registeredblindgamer4350 1 year ago +150

      @@KacKLaPPeN23 That would only apply to AMD if they use the same connector, or if the AIBs use it. So you could easily be wrong.

  • @krishnateja4011
    @krishnateja4011 1 year ago +2269

    The more news that comes out about Nvidia, the more excited I am about AMD GPUs.

    • @hamzasamar8823
      @hamzasamar8823 1 year ago +26

      same

    • @jsteezus
      @jsteezus 1 year ago +153

      I've never seen so many former Nvidia enthusiasts (not fanboys, but fans of their high-end products) saying they're excited for the AMD launch, or are waiting for AMD to launch before making their purchasing decision. I really think they're going to be competitive. And performance is already so good that even if ray tracing isn't as strong, if it's just as good as Ampere, that's plenty for like 90% of people.

    • @wahahah
      @wahahah 1 year ago +61

      Just makes me more anxious. If AMD fails to deliver, then what?

    • @alexanderschulte8248
      @alexanderschulte8248 1 year ago +16

      At some point AMD is going to move to this spec too, so I don't know why it makes you more excited.

    • @austint.9943
      @austint.9943 1 year ago +3

      If only they didn't break Destiny 2 every new gen.

  • @Jet-ij9zc
    @Jet-ij9zc 1 year ago +201

    I really hope we'll get a GPU gen where manufacturers heavily focus on power efficiency instead of going all-in on performance and giving up on everything else.

    • @Alte.Kameraden
      @Alte.Kameraden 1 year ago +15

      I know. I was very happy with my 970 and 1070 Mini GPUs. I got so excited seeing a 970 and 1070 shrunk down to about the size of a 950/1050. It was such a wonderful push in the right direction. All I can say now is: WTF is going on? I really think ray tracing is murdering the GPU market; the amount of oomph needed to make ray tracing work requires beefier GPUs than would have been necessary if RT didn't exist. So since the RTX series started getting introduced, bigger and bigger, more power-hungry GPUs just blew up.
      I really wish they'd abandon RT entirely. Of course Nvidia's solution is straight out of Cloudy with a Chance of Meatballs: "Bla bla bla bla, all I heard is bigger... and bigger is BETTER!"

    • @catcollision8371
      @catcollision8371 1 year ago +8

      I only purchase video cards that do not require any additional power connectors. :p
      The 1050 Ti was a great card, still is..

    • @korassssssss
      @korassssssss 1 year ago +2

      @@catcollision8371 I'm still using a 1050 Ti in my current rig; really good card, and only 70 watts.

    • @narmale
      @narmale 1 year ago +1

      LOL... cuz you never push the envelope by being cute and tiny

    • @callofduty611
      @callofduty611 1 year ago

      It'd be lovely, though I think we're reaching a certain limit even with some more extreme innovations.
      And as always, people will complain and never be satisfied. Just to use the GTX 680 from a long time ago as an example, with its smaller, less power-hungry die: performance was the same as the AMD 7970, or better/worse depending on the title... yet people went crazy over the die's codename being GK104, despite it delivering performance-wise.

  • @davidcsokasy5451
    @davidcsokasy5451 1 year ago +220

    He's totally spot on with his assessment of the connectors. It's not paranoia. I bought a custom PC during the height of the GPU shortages at the end of last year (it was actually cheaper, not to mention less frustrating, than spending all of my free time hunting for prices that weren't eye-watering) with an RTX 3080. The builder, who I won't name (cough, cough, CyberPowerPC), not only used a single split PCIe power cable, but wasn't very gentle when closing the case and crammed the excess power supply cabling inside with a bit too much gusto. Everything was fine for a couple of weeks, but I started noticing GPU performance degrading. Also, my PC started crashing sporadically during gaming sessions after about 30-60 minutes. Eventually, after a long gaming session one day, my GPU died. At first I thought it was solely a faulty GPU, because when I installed an older GPU (GTX 1080 Ti) it worked fine, so I started an RMA and shipped it off. When I received my RMA replacement, I decided to install some braided cable extensions to class things up a bit. During the process of installing the new GPU and extensions, I noticed the "real" problem. When the case was closed at the factory and the cables were crammed in, the single PCIe power cable on the power supply had been bent hard to one side. This had caused two of the conductors to pull partway out of the connector. The additional resistance from the less-than-ideal connections caused a hot spot and not only melted the cable connector, but melted the power supply connector as well. My RTX 3080 is an overclocked model and pulls like 350 W under heavy load. The higher-end RTX 4000 series cards will pull even more. Using high-quality cables with no defects is a must.

    • @camdavis9362
      @camdavis9362 1 year ago +26

      @WolfieMel are you trying to flex or something?

    • @justcommenting4981
      @justcommenting4981 1 year ago +17

      @WolfieMel well no, we can see now you are indeed attempting to flex. Quite silly.

    • @CaptainWafflos
      @CaptainWafflos 1 year ago +20

      @WolfieMel cringe king

    • @citizensnips3850
      @citizensnips3850 1 year ago +14

      @WolfieMel Haha, imagine being proud of saving money on a gaming PC. Too poor to afford a big-boy rig? :(
      Hey bud, do you realize how dumb this flexing thing is yet?

    • @citizensnips3850
      @citizensnips3850 1 year ago

      @WolfieMel Why are you poor? Stop being poor. Just make more money so you don't have to use vouchers. Isn't flexing fun?

  • @Scarhwk
    @Scarhwk 1 year ago +482

    I think the lesson here is "always be skeptical when a new generation draws more power." Drawing more power just screams "we're hoping nobody notices how meager a performance improvement this really is."

    • @davidlee1646
      @davidlee1646 1 year ago +79

      Yeah, this trend of hulking video cards is getting out of hand. It's like shoving SUV-sized boxes into PC cases. I can see the climate police going after gamers if this continues.

    • @zakkart
      @zakkart 1 year ago

      @@davidlee1646 There is no climate police

    • @timhorton8085
      @timhorton8085 1 year ago +17

      Welcome to the altar of maximum performance. The market demands higher performance, but engineers haven't had the time or incentive to improve performance per watt. If you don't want a maxed-out, giant card, don't get one.

    • @abby8043
      @abby8043 1 year ago +8

      We really should have just gotten a 30-series "Super" refresh.

    • @timhorton8085
      @timhorton8085 1 year ago +4

      @@abby8043 A rose by any other name.

  • @mikeyhope7577
    @mikeyhope7577 1 year ago +1506

    AMD can seriously take back a major part of the GPU market, as long as they aren't greedy with their pricing.

    • @NavySeal2k
      @NavySeal2k 1 year ago +75

      It already has, with two 5700 XTs and a 6800 XT in my home. I saw the opportunity to send a message to Nvidia, especially as a Linux and enterprise-feature user…

    • @RicochetForce
      @RicochetForce 1 year ago +97

      They need to seize on Nvidia's abandonment of the low- and mid-range market and utterly claim it. By the time Nvidia comes back to its senses, it'll have to fight for that market all over again.

    • @liquidsunshine697
      @liquidsunshine697 1 year ago +1

      @@NavySeal2k I'm RMAing my 5700 XT, it's been a nightmare, but I might just stick with them because fuck Nvidia.

    • @reinhardtwilhelm5415
      @reinhardtwilhelm5415 1 year ago +96

      @@RicochetForce It's more than that: with these cards, Nvidia has abandoned the high end too. The "4080 12GB" that's really a 4070 is a perf/$ sidegrade from the 3080. That's never happened with a GPU this expensive before.

    • @Bambeakz
      @Bambeakz 1 year ago +1

      And if it is true that the 40 series is really expensive to produce (and a lot were produced), then they can't even drop the price much 😅

  • @ncauhfsuohd
    @ncauhfsuohd 1 year ago +106

    Just sharing my recent experience to add to this: I recently decided to add two new HDDs in RAID 1 to my computer. To get to the SATA ports on the motherboard, I had to uninstall and reinstall the graphics card (that's 1 cycle). Then the computer would not turn on; the power supply had died on me. So I replaced it with a new power supply and cables (that's 2 cycles now). When I turned on the computer with the new power supply, my AIO pump had died. In troubleshooting that issue, I double-checked all my power cables (3 cycles). Solved the AIO issue and then turned back to setting up my new HDDs. One of the HDDs was not registering in the BIOS, so I had to get back to the SATA ports (4 cycles) to double-check that they were properly connected. Checked again in the BIOS and the same HDD still wasn't registering. So I replaced the SATA cable to that HDD (5 cycles).
    So even in a pretty normal use case, you can go through a lot of cycles.

    • @danr.1299
      @danr.1299 27 days ago

      Why not just unplug the 12VHPWR at the PSU side? That's what I do when I have to remove my 4090, so the cable is always plugged into the GPU.
      I know this is a year old, but I'm just curious: if you're worried about it, why would you unplug it at the card rather than leave the cable in the card and remove it with the whole cable?

  • @thomaswieland9284
    @thomaswieland9284 1 year ago +202

    As Steve, for example, has pointed out, old-style PCIe connectors have the same 30-cycle insertion limit, so that part isn't new; what's new is the insane amount of power that can flow through the new connector.
    Regarding the 4-pin sense input: the table from the PCIe spec (e.g. at 3:20) shows that with both Sense0 and Sense1 inputs open, the card is not supposed to pull more than 100 W at startup and 150 W sustained. If nothing is plugged into the 4-pin control input, those sense lines should read "open" and cards should be limited to that power. So either you'll seriously underpower your 40-series GPU, or those cards don't adhere to the spec and ignore the sense inputs.
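
    The decode this comment describes can be written out directly. A minimal sketch, assuming the four-row table shown at 3:20; the wattages are transcribed from the video (not quoted from the spec text) and the helper name is made up:

    ```python
    # Sense-pin decode for the 12VHPWR sideband, per the table shown at 3:20.
    # The wattages here are transcribed assumptions, not an official reference.

    POWER_LIMITS = {
        # (sense0 grounded, sense1 grounded): (initial W at power-up, sustained W)
        (True, True): (375, 600),
        (True, False): (225, 450),
        (False, True): (150, 300),
        (False, False): (100, 150),  # nothing plugged in: both pins float open
    }

    def allowed_power(sense0_grounded: bool, sense1_grounded: bool) -> tuple[int, int]:
        """Return (power-up limit, sustained limit) in watts for the pin states."""
        return POWER_LIMITS[(sense0_grounded, sense1_grounded)]

    # A spec-compliant card with nothing on the 4-pin sideband must self-limit:
    print(allowed_power(False, False))  # (100, 150)
    ```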

    • @dguisinger
      @dguisinger 1 year ago +4

      I was going to reply with the same thing; it doesn't seem to do what he says it's going to do.

    • @jetseverschuren
      @jetseverschuren 1 year ago +1

      Yep, noticed that too

    • @AliasAlias-nm9df
      @AliasAlias-nm9df 1 year ago +9

      Since all of these cards are shipping with adapters, and the 3090 Ti was sold with an adapter prior to widespread availability of ATX 3.0 supplies, I would conclude that the cards do not adhere to the spec. That said, we already operate in an environment where cards can draw more than we can supply. Just be cautious when selecting a power supply, and don't play with the cables.

    • @DimitriosChannel
      @DimitriosChannel 1 year ago +1

      Gonna need 1.21 Gigawatts!

    • @VikingDudee
      @VikingDudee 1 year ago +1

      They do, but the cards of yesterday also didn't draw as much power as a 3090 Ti or 4090, so within 30 cycles you'd most likely get away with it, with a crash or without noticing a single thing. These cards draw so much power and so many amps through such small pins that you have a higher risk of something arcing and heating up due to a loose connection over time. If they made the plug bigger, the pins and sleeves would be made with thicker metal and a bigger contact area.
      If you've ever installed outlets over the years: there are push-in ones where you just shove a wire into the back and it grips it, and there are ones where you use the screw. The push-in design has a very small contact area and in my opinion is a fire hazard on any circuit of 15 amps or more; I've seen them melt with a belt sander years ago. Always use the screws on outlets. This connector likewise just should have a bigger contact area, and then the melting of the plug probably wouldn't be much of a concern.
      I feel this was just an oversight, and I honestly don't think the person designing this plug had any clue how current works when rating and testing it. But leave it to the educated dummies.

  • @lord_bozo
    @lord_bozo 1 year ago +821

    Jay, you should stress test it with a range of "worst case scenarios": an ATX 2.0 PSU, a 3-hour gaming session, and a fire extinguisher. Take the hit so others don't have to. And it will make for an epic thumbnail and good content too. Come on man, you know Linus is going to try to beat ya to it.

    • @killerhawks
      @killerhawks 1 year ago +62

      While I'd love to see Jay "blow up" a PSU, let's leave that to the "experts" @gamersnexus, k..

    • @Top3Entertainment
      @Top3Entertainment 1 year ago +76

      Throw in a Gigabyte PSU, an NZXT case, and an Asus Z690 motherboard, and we'll really see if it's fireproof.

    • @CommanderJPS
      @CommanderJPS 1 year ago +38

      @@Top3Entertainment Fireproof, or a new bomb recipe for the Anarchist's Cookbook? 🤣💥

    • @Top3Entertainment
      @Top3Entertainment 1 year ago +7

      @@CommanderJPS judgement day

    • @killerhawks
      @killerhawks 1 year ago +3

      @@Top3Entertainment I don't think that motherboard will fit in the H1 case... LOL

  • @tspencer227
    @tspencer227 1 year ago +786

    As an electrical engineer: there's a reason the NEC has very, very specific requirements for bend radii on conductors, cables, and conduit. Good on EVGA for considering that in this application (especially given how much current power cables for modern GPUs carry, and therefore how much heat comes from I^2R losses), and I'm just waiting now for the IEEE to create a more stringent standard. Because yeah, shit's gonna start catching on fire.
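
    For scale, the I^2R numbers are easy to run. A back-of-the-envelope sketch; the 5 milliohm contact resistance below is an assumed illustrative value, not a measurement:

    ```python
    # Rough I^2*R arithmetic for a 600 W load on the 12VHPWR connector.

    power_w, volts, pairs = 600.0, 12.0, 6   # 12 V carried over six pin pairs

    total_amps = power_w / volts             # 50 A total
    amps_per_pin = total_amps / pairs        # ~8.3 A through each 12 V pin

    r_contact = 0.005                        # assumed ohms per mated contact
    heat_per_pin = amps_per_pin ** 2 * r_contact

    print(f"{total_amps:.0f} A total, {amps_per_pin:.1f} A per pin, "
          f"{heat_per_pin:.2f} W per contact")
    # If wear or a hard bend doubles one contact's resistance, its heat doubles
    # too -- which is how a single pin can cook while the rest look fine.
    ```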

    • @enntense
      @enntense 1 year ago +52

      Someone's got to make a hard 90-degree connector... right?...

    • @MrQuequito
      @MrQuequito 1 year ago +92

      These puny cables are drawing more amps than my wall outlets; that's scary, to be honest, and yeah, that's a fire waiting to happen.

    • @kanlu5199
      @kanlu5199 1 year ago +17

      They would need to introduce a 36 V rail for the power supply.

    • @Flor-ian
      @Flor-ian 1 year ago +15

      Mechanical engineer here. God, if only the IEEE were efficient.

    • @Alex-bi8ob
      @Alex-bi8ob 1 year ago +10

      Maybe I'm missing something, but don't these PSUs have overcurrent protection for each rail, which would avoid overloading in general? Not that it would prevent this cable-bending issue, though.

  • @emilybjoerk
    @emilybjoerk 1 year ago +2

    Based on the diagram shown, the "sideband" for the 12VHPWR cable is just a few voltage sense pins; there's no actual data communication like USB has. The device end would just sense them with some pull-ups. This means an adapter cable can wire up the sense pins appropriately to "communicate" the maximum power allowed, assuming all the PCIe cables are plugged into the adapter and rated at ATX 2.0 power limits for their plug sizes. I'm an electrical engineer.
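
    A sketch of the strapping idea this comment describes, assuming one table row per populated 8-pin plug (at the usual 150 W each); the mapping and function name are hypothetical illustrations, not a documented adapter design:

    ```python
    # Hypothetical passive adapter: ground a sense pin (True) or leave it
    # open (False) to advertise ~150 W of budget per populated 8-pin input.

    def straps_for_8pin_count(plugs: int) -> tuple[bool, bool]:
        """Return (sense0 grounded, sense1 grounded) for the plug count."""
        return {
            1: (False, False),  # advertise 150 W sustained
            2: (False, True),   # 300 W
            3: (True, False),   # 450 W
            4: (True, True),    # 600 W
        }[plugs]

    print(straps_for_8pin_count(4))  # (True, True) -> full 600 W budget
    ```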

  • @Warsign01
    @Warsign01 1 year ago +40

    Great to know, thanks Jay. Starting to see more reasons why EVGA may have backed out.

  • @johnnyvvlog
    @johnnyvvlog 1 year ago +531

    A single graphics card consuming more power than my entire server stack. That's totally insane, especially given current power prices.
    And with that tiny connector, it's doomed to cause massive problems and fires...

    • @WizeguyGaming
      @WizeguyGaming 1 year ago +15

      LOL. Like complaining about gas prices while driving a Bentley. Go get yourself a Kia. No shame in staying within the parameters of what you can afford.

    • @gggiiia147
      @gggiiia147 1 year ago +74

      @@WizeguyGaming bro, 600 W is still a lot of power for a single component, no cap

    • @C3l3bi1
      @C3l3bi1 1 year ago +38

      @@WizeguyGaming yeah, if that Kia consumed as much fuel as a Bentley, I would be pretty pissed.

    • @robertbolzicco9995
      @robertbolzicco9995 1 year ago +36

      @@gggiiia147 maybe gene doesn't pay his own hydro/electric bill lol. 600 W is insane, don't let anyone tell you different. Soon you'll need a dedicated electrical circuit just for the PC lol. Call an electrician!

    • @craigbrown6628
      @craigbrown6628 1 year ago +40

      Totally this, and in the context of a world where meeting power demand at a cost-effective price is becoming harder and harder, the whole 40 series launch feels a bit tone-deaf.
      You know, at this point I would actually have been more impressed if they had gone for ZERO gain in performance but a significant reduction in power.

  • @foglebr
    @foglebr 1 year ago +490

    Here's what sucks: it's going to take a fire or two before this issue is truly dealt with. Which is seriously sad. It's great that you and others are highlighting this issue; I just hope consumers actually absorb and apply the knowledge to prevent a dangerous situation caused by this plug design.

    • @jwar375
      @jwar375 1 year ago +14

      I hate the fact that you're probably right.

    • @Jon-pc6ch
      @Jon-pc6ch 1 year ago +14

      The problem will be when the average consumer buys a shoddy pre-built machine.

    • @tomglover98
      @tomglover98 1 year ago +11

      @@Jon-pc6ch not necessarily. Most people buying pre-builts don't touch the insides and rarely disassemble.
      It's more of an issue for clean freaks like me. I take my GPU out of the case so as not to recirculate dust inside. Welp... now I can't anymore, or only 30 times, before potentially damaging hardware.
      However, if it turns out it can be unplugged at the modular power supply end instead without damage, that would be an OK compromise.

    • @deViant14
      @deViant14 1 year ago +3

      @@tomglover98 we salute you for taking one for the team

    • @aitorbleda8267
      @aitorbleda8267 1 year ago +5

      More like tens of fires, at least.
      Seems unsafe... and they could have added a thermistor and two very fine wires to monitor the connector/cable temperature.

  • @glitch9211
    @glitch9211 1 year ago +22

    Reminds me of the good old days, when every new game seemed to require a video card and RAM upgrade. Within two or three cycles, it was a motherboard upgrade to support the RAM. Of course, this was when RAM was about $100 a MB.

    • @JorgetePanete
      @JorgetePanete 1 year ago +1

      And the reason is similar: unoptimized games (ignoring ridiculous hardware)

  • @jokerproduction
    @jokerproduction 1 year ago +6

    eTeknix just covered this with JonnyGURU and there is absolutely zero cause for concern. Literally zero.

  • @Varvarin22
    @Varvarin22 1 year ago +266

    *edit* The point of this comment is the connector itself breaking, not voltage/etc.
    I guarantee EVGA knew about the power cable issue; hence the release of the PowerLink product for the 3090 Ti and 3090 Ti Kingpin cards.
    I'm glad I purchased the PowerLink, as it keeps the main 16-pin cable fixed in place with screws and has additional capacitors to help with spikes.
    EVGA states in the PowerLink's marketing that it helps "keep temperatures lower", and it makes me wonder if that statement was sarcastically aimed at melting cables.

    • @Dave-kh6tx
      @Dave-kh6tx 1 year ago +5

      I can see extra capacitors helping slightly with spikes, if properly designed, but I don't see them helping with constant power draw or high temps at all.

    • @allanwilmath8226
      @allanwilmath8226 1 year ago +6

      It's not: when the voltage dips, the VRMs compensate by drawing even more current, which makes the heat worse right where the restriction is, and makes the VRMs less efficient too; the longer the VRM drivers stay on, the more heat is generated across the gates as they conduct current.
      What they should have done, if they wanted to reduce melting cables while keeping small connectors with fewer pins, is increase the voltage from 12 V to 24 or even 48 V, cutting the current by 2 to 4 times and reducing the effect of voltage drops across connectors and wires in the process.
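
      Rough numbers for the two effects described here; the sag voltages are assumptions picked for illustration:

      ```python
      # 1) A constant-power load draws more current as the 12 V rail sags.
      power_w = 600.0
      for v in (12.0, 11.4, 11.0):
          print(f"{v:4.1f} V -> {power_w / v:4.1f} A")

      # 2) Raising the rail voltage cuts current, and I^2*R connector heating
      #    falls with the square of it (relative to 50 A at 12 V).
      for v in (12.0, 24.0, 48.0):
          i = power_w / v
          print(f"{v:4.0f} V -> {i:4.1f} A -> contact heating x{(i / 50.0) ** 2:.2f}")
      ```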

    • @kevingaukel4950
      @kevingaukel4950 1 year ago +6

      I know EVGA broke relations with NVIDIA over disrespect, but I really wonder if EVGA saw this coming and was told they couldn't modify the power plugs.

    • @BKDarius27
      @BKDarius27 1 year ago +2

      Can you tell me what this PowerLink thing is? I may buy a 3090, so I would like to know.

    • @kuma8030
      @kuma8030 1 year ago

      @@BKDarius27 just search it up bro

  • @jayhill2193
    @jayhill2193 1 year ago +603

    Looks like Gamers Nexus and LTT couldn't have chosen a better time to go all-in on PSU testing. Their results and reviews are going to be interesting.

    • @MatthewSemones
      @MatthewSemones 1 year ago +5

      You think that was just a happy accident? :P

    • @mroutcast8515
      @mroutcast8515 1 year ago +18

      LTT 🤣🤣🤣🤣🤣🤣🤣🤣

    • @TribeXFire
      @TribeXFire 1 year ago +23

      @@mroutcast8515 I do believe that LTT will do good work. It will be interesting to see how their lab department develops.

    • @mroutcast8515
      @mroutcast8515 1 year ago +16

      @@TribeXFire mhm, just like their current product reviews: 5% benchmarks and 95% talking crap. It's an utterly cringe channel.

    • @eng3d
      @eng3d 1 year ago +6

      @@mroutcast8515 it's funny, but it also does the job

  • @zodak9999b
    @zodak9999b 1 year ago +7

    The table at 5:50 shows, on the bottom row, that if the sense pins are both open circuit, the max power allowed at startup is 100 W, and 150 W after software configuration. So with a cable that doesn't have those pins, there shouldn't be a problem, provided the card is designed to ATX 3.0.

    • @EmptyZoo393
      @EmptyZoo393 1 year ago +1

      They communicate via simple grounding, though. I doubt the graphics card or PSU manufacturers are going to mess with it too much, but those adapters will be copied by third parties that may just ground all of those pins.

    • @ano_nym
      @ano_nym 1 year ago

      Saw that too. But the card makers will probably just ignore it, because otherwise a lot of people with older PSUs will complain and want refunds because their cards aren't performing to spec. Or perhaps they'll include some workaround, like the adapter grounding the pins, and say to use the adapter at your own risk.

  • @coach357
    @coach357 1 year ago +29

    If you look at the "GeForce RTX 40 Series & Power Specifications" post on the Nvidia forum, you will see that they are using an active circuit inside the adapter that translates the 8-pin plug status into the correct sideband signals according to the PCIe Gen 5 (ATX 3.0) spec. It is not a passive adapter, so there is nothing to worry about if you have a good ATX 2.0 PSU with the minimum power required for the 4090.

    • @MegaTheJohny
      @MegaTheJohny 1 year ago

      Thanks for this comment. I have a 1300 W Seasonic Platinum PSU, but it is a few years old. Should I be safe to use it with an RTX 4090?

    • @thisisashan
      @thisisashan 1 year ago +1

      That has nothing to do with his concern here.
      His concern is that there are reports, from both Nvidia and users, of the smaller pins melting after being unplugged and replugged several times.
      That has nothing to do with what you're describing.

    • @coach357
      @coach357 1 year ago +4

      @@MegaTheJohny Your PSU will work fine with the NVIDIA adapter.

    • @coach357
      @coach357 1 year ago +2

      @@thisisashan You should rewatch the video and also check the Nvidia forum about the melting issue with the adapter. There is no more risk than with all the PCIe connectors we used before; the 30-connection rating has always applied. You really think NVIDIA would release new video cards with a stupid passive adapter that can catch fire? Seriously? Do your homework, and buy a pricey new ATX 3.0 PSU if you are afraid of burning your PC. And by the way, listen at the 7-minute mark and you will understand (I hope) why I made my comment, as what Jay says is absolutely false: the active adapter balances power between the PCIe connectors and prevents overcurrent on any one cable that could result in melting, regardless of the 30 connections.

  • @-EchoesIntoEternity-
    @-EchoesIntoEternity- 1 year ago +317

    never interrupt your enemies while they're making mistakes....
    AMD: *nodding silently*

    • @tablettablete186
      @tablettablete186 1 year ago +18

      - Sun Tzu (for real this time)

    • @paulustrucenus
      @paulustrucenus 1 year ago +22

      @@tablettablete186 I think it was Napoleon, not Sun Tzu.
      Edit: after a quick Google search, I can confirm Sun Tzu never said that. The quote is attributed to Napoleon.

    • @tablettablete186
      @tablettablete186 1 year ago +3

      @@paulustrucenus Wait, really!? I could have sworn I read this quote in The Art of War.
      Either way, thanks for the heads-up!

    • @paulustrucenus
      @paulustrucenus 1 year ago +2

      @@tablettablete186 Napoleon is another famous war guy, so the confusion is easy.

    • @deplorablerach
      @deplorablerach 1 year ago

      never ever

  • @ffwast
    @ffwast 1 year ago +938

    The good news about these power supply problems with the 40 series is that you already shouldn't be buying a 40 series anyway.

    • @justjami9619
      @justjami9619 1 year ago +11

      lol fair enough

    • @DarthCuda
      @DarthCuda 1 year ago +14

      Why not? I'm probably going to get a 4090.

    • @CrimsonFader
      @CrimsonFader 1 year ago +84

      @@DarthCuda Very expensive; power efficiency nonexistent; apparently a fire issue? Not sure, haven't watched the full video yet. DisplayPort 2.0 is missing (it only has DisplayPort 1.4a, I believe), which is not good for future VR headsets.
      Edit: I would wait for AMD just in case it's better at any of these at a better price. They might not be, but the AMD launch is only November 3rd.

    • @pixiesworld6367
      @pixiesworld6367 1 year ago +58

      @@DarthCuda I hope you enjoy those fake fps 😄

    • @terribleatgames-rippedoff
      @terribleatgames-rippedoff 1 year ago +49

      @@DarthCuda There are cheaper options around for room heating, you know...

  • @fayathon
    @fayathon 1 year ago +7

    This is one of the big reasons I sub to you, Jay: this is damned good info to have, and I will link it to anyone I hear is getting a 40 series, as a precaution, so we have fewer fires going forward. Thanks for the heads up.

  • @AndrewFrink
    @AndrewFrink 1 year ago +14

    Maybe I missed something here, but the chart @6:02 shows that with sense1 and sense0 both open (not connected to ground, as when the plug is missing), the cards should self-limit to 100 W at boot and 150 W sustained after that. So a compliant ATX 3.0-based 4090 plugged into those adapters should just limit itself to 150 W. My guess is that some of these PSU cables will have the side connectors, and that they will be connected or switchable.

    • @LavitosExodius
      @LavitosExodius 1 year ago

      The part you're missing is that you're relying on this to work correctly. This is a new thing altogether; there will be glitches in the system, and some cards may not work correctly. Sure, you can RMA them, but that's a little late if it blows your PSU.

    • @AndrewFrink
      @AndrewFrink 1 year ago

      @@LavitosExodius Well then the card isn't ATX 3.0 compliant and shouldn't have the sticker on the box. We have f#$king standards for a reason. The card vendor could just ship adapter cables for use with non-3.0 PSUs with a great big warning, but if I were in engineering at an OEM, I'd be pretty against that. Let a third party do that.

    • @LavitosExodius
      @LavitosExodius 1 year ago

      @@AndrewFrink No, simply put, all new products have bugs and glitches that need to be worked out. It would be wise to remember that. I'm sure the GPU makers will do their best to make sure each card is compliant, but you know, Murphy's Law and all that.

    • @AndrewFrink
      @AndrewFrink 1 year ago

      @@LavitosExodius I think we're just going to have to disagree. A card that doesn't respect the power config is not a "bug". Especially in a $1000+ GPU.

    • @LavitosExodius
      @LavitosExodius 1 year ago

      @@AndrewFrink You missed the point: this whole system was designed by humans, and there is a chance, however small, that it can fail. If every card failed like that, sure, that's not a bug. But only a handful? That would be a glitch/bug somewhere in the sense-detection method. Or, put more simply: don't roll the dice with these cards and hope your 650-watt PSU will power them safely unless it's a Gen 3 PSU. Even then I wouldn't roll the dice. Granted, there's an argument that if you're buying a 4080 or higher, you probably shouldn't have skimped on the PSU to begin with.

  • @mromutt
    @mromutt 1 year ago +477

    Man, I had already decided I don't want a 4080/4090, and most likely no 4000-series card at all, but this sold me on wanting nothing to do with them. I really hope Intel actually keeps going and improves, and that AMD doesn't let this opportunity to stomp on Nvidia slip through their fingers this go-around.

    • @exploranator
      @exploranator 1 year ago

      AMD IS CALLING YOUUUUU?

    • @RobertNES816
      @RobertNES816 1 year ago +23

      NVIDIA is going to be a thing of the past, like 3dfx. They're too cocky and too full of themselves. At this point I want NVIDIA to go out of business, and this is coming from a former NVIDIA fanboy.

    • @Lucrativecris
      @Lucrativecris 1 year ago +5

      I'm sticking with the 3060.

    • @ZackSNetwork
      @ZackSNetwork 1 year ago +10

      @@RobertNES816 So you want AMD to be the new king of the market, and Intel to take AMD's current place?

    • @Cyromantik
      @Cyromantik 1 year ago +5

      Intel already said they're committed and won't quit their push to make and improve their Arc cards, and considering they can source their own silicon, they should be able to back that up!

  • @mistirion4929
    @mistirion4929 1 year ago +151

    Electrical engineer here.
    Seriously, if I knew that my product (in this case a simple connector) was a serious safety hazard and I had to warn my customers that they "should not plug it in and out too often", I would be ashamed to sell it.
    Honestly, before I even dared announce it, I would redesign it completely if I knew it was this fragile and not durable at all. The amount of current flowing through these tiny connectors and wires is insane. If you make an adapter, do it right, and don't indirectly force your customers to buy the ideal solution just so they don't have to fear a fire hazard.

    • @B20C0
      @B20C0 1 year ago +19

      This will be even more interesting in Europe: 4090s could actually be banned from the European market if this isn't resolved before the EU launch. They would simply be refused CE certification.

    • @brandontrish86
      @brandontrish86 1 year ago +12

      As a fellow EE, I was thinking much the same. The moment I saw that plug, I immediately knew it was not safely capable of delivering 50 A of current. There's just not enough surface area. This is going to be a massive failure point until the day it's superseded.

    • @CommanderJPS
      @CommanderJPS 1 year ago +1

      This is unfortunately how it goes with such disposable tat; we have to keep manufacturing rubbish to earn more...
      What happened to making something that will last?

    • @monad_tcp
      @monad_tcp 1 year ago +2

      I don't get why they don't use a standard XT60 connector if they're going to pull 40 A over that link.

    • @monad_tcp
      @monad_tcp 1 year ago +3

      I'm not a professional EE, only an amateur; I know a bit, and now I'm worried about the ounces of copper they used and how big the power rails/traces are in the power section of the graphics card, where the VRMs (voltage regulator modules) are. 50 A is no joke; I've seen cheap motherboards with burned silkscreen from all that heat.
      One thing is for sure: I'm not buying the stock model of this GPU before some of the better manufacturers, who usually do overclocking and know their shit, get their hands on it and fix the design.

  • @FreakyOptics
    @FreakyOptics 1 year ago +12

    This video further convinced me not to get a 40 series. I knew deep down, after it was announced that the adapters have a connect/disconnect life, that it already sounded like a fire hazard.

    • @doghous3
      @doghous3 1 year ago +1

      After seeing this, I'm thinking the same. To me it sounds like the connector/wiring is under-spec. Melted wiring, come on! I may change my mind if a rev 2.0 comes out with appropriate connectors. I guess time will tell.

  • @coreymasson6591
    @coreymasson6591 1 year ago

    Thank you for supplying (no pun intended) an excellent article on this issue. Also, major respect for including Steve's link at the end. It shows that you care about your viewers and will do anything to keep them informed and safe.

  • @nonaurbizniz7440
    @nonaurbizniz7440 1 year ago +376

    I think I see EVGA's strategy with this change: they'll make a fortune producing new ATX 3.0 PSUs, thanks to the standard change and the flood of upgrades, with no need to bother with GPU sales.

    • @ZappyOh
      @ZappyOh 1 year ago +113

      EVGA bailed from the GPU market just before the actual deadly fires start, and before the skyrocketing warranty claims, plus potential lawsuits or even criminal investigations, begin.
      Smart move!

    • @subodai85
      @subodai85 1 year ago +7

      I'd say this makes complete business sense.

    • @nabieladrian
      @nabieladrian 1 year ago +53

      Every single day since the news, EVGA's move makes more sense.

    • @earthtaurus5515
      @earthtaurus5515 1 year ago

      @@ZappyOh EVGA didn't bail, Nvidia forced them out. Massive difference.

    • @steamroller9945
      @steamroller9945 1 year ago +29

      @@earthtaurus5515 ??? You got any evidence, bud?

  • @kaimeraprime
    @kaimeraprime 1 year ago +376

    I've used Nvidia for decades, but the 40 series is making me want to see what team red has to offer.

    • @Schwing27
      @Schwing27 1 year ago

      Same. This is the first time I'm not buying a new GPU every 2 years. Nvidia can F off.

    • @nando03012009
      @nando03012009 1 year ago +11

      I hear ya... I haven't used an AMD (ATI) card since the R9 Nano. I may look into an AMD card going forward. My 3090 should last 2 more years.

    • @MemTMCR
      @MemTMCR 1 year ago +20

      Having used AMD cards for as long as I can remember,
      let me warn you: weird game and PC crashes have been a legitimate problem on some of my systems.

    • @shawnd7
      @shawnd7 1 year ago +2

      @@MemTMCR Have you owned an RX 5700 XT? That's where I had my problems, but glad RDNA 2 fixed those.

    • @Dlo_BestLife
      @Dlo_BestLife 1 year ago +18

      Everyone should always look at all products available before making a purchase. Just my opinion.

  • @zeonos
    @zeonos 1 year ago +2

    I would love it if they put the connector facing down (same side as the fans), so the cable goes down into the bottom cable management.
    Having it face the side or front of the case requires you to bend the cables and causes issues in smaller cases.

  • @gwaites6329
    @gwaites6329 1 year ago +9

    Dangggggg hah. Thanks a ton for always doing the video coverage and getting us the info in your entertaining, easily understandable way. The implications for me are hard to say; I haven't considered buying a new PC for 2+ years now, buuuuuut I've kept up on the market. Now I have so much to figure out: potentially grab a 3080 to upgrade from my 1080 before it's too late, and decide whether I just give up on the 40 series or wait another year for things to stabilize. Either way, sooooo glad I'm starting a new job this year. If I keep myself from buying a sports/muscle car as a first car like a dumbass, I can save a good bit on my daily commuter and finallyyyyyy push for a PC refresh. But damnnnnnnn. Still can't believe power supplies are gonna get the shaft. Buuuuuut glad I still have my EVGA G3. Fucking love them, and glad they'll at least be making PSUs for me, yayyyy. Cheeeeeeeeeeeeeeeers gogogogogo. 3080 EVGA for me hah

  • @buddymac1
    @buddymac1 1 year ago +259

    30 cycles? That is HORRIFYING. I have a friend who's had his 2080 for 4 whole years now. He got one at launch and hasn't needed to upgrade his GPU since, but he has swapped/upgraded his motherboard, case, NVMe drive, PSU, CPU cooler, and probably more. Each time, he unplugged that thing at least once, maybe 2 or 3 times for a test boot outside the PC. Add the original build, and he's probably pushing 30 uses. And the SCARIEST PART: there are thousands of people out there, maybe even more, who have definitely passed that 30-cycle safe rating. Imagine in 5 years, when they're 4080s instead of 2080s, and these things start popping off because they're running SIX HUNDRED FUCKING WATTS through a CONNECTOR that is NOT SAFETY rated for that.

    • @Hungerperson
      @Hungerperson 1 year ago +23

      As someone who works in a computer repair shop: if a computer has some issues or takes a lot of troubleshooting, just one visit to our shop to fix a problem that could be unrelated to the GPU can burn a third or more of those cycles. It's terrible.

    • @link1565V2
      @link1565V2 1 year ago +1

      I still run my GTX 680 in an XP machine, lol.
      It has to be pushing 30 plug cycles.

    • @James-wd9ib
      @James-wd9ib 1 year ago +25

      If you're a manufacturer and you have to make consumers count BS "plug cycles", you should take a step back and look at your designs. I'm sure you'll find that you've been doing something very, VERY, V-E-R-Y stupid.

    • @an4rchy_yt
      @an4rchy_yt 1 year ago +4

      hence why EVGA left?

    • @an4rchy_yt
      @an4rchy_yt 1 year ago +4

      @@James-wd9ib cost cutting and material limitations. p l a s t i c s .

  • @Griffolion0
    @Griffolion0 1 year ago +327

    It seems this issue is actually more universal, but Nvidia just became the perfect demonstration case, given the insane power requirements of the new line of cards. We had almost two decades of not having to worry about this, and now those of us old enough to remember are going to relive the nightmares of cable adapters, fear of fires, fear of blowing up your cards, etc.

    • @Ghastly10
      @Ghastly10 1 year ago +31

      Exactly. Unfortunately, reading some of the other comments here, some folks are going the AMD vs. Nvidia route and not really understanding that this problem is with the new ATX 3.0 standard. By using the new ATX 3.0 standard, Nvidia has demonstrated one of its weaknesses.

    • @theKCMOPAR
      @theKCMOPAR 1 year ago +7

      I agree with this. I'm just bummed because I don't think Nvidia and AMD realize how expensive PSUs are. At this point you might as well do a completely new build.

    • @Lordpickleboy
      @Lordpickleboy 1 year ago +2

      We will see many, many house fires, I think.

    • @spankbuda7466
      @spankbuda7466 1 year ago +19

      And since you're old enough, you should know never to buy the first generation of anything, because these companies use consumers as their real-world test subjects.

    • @Nemesis1ism
      @Nemesis1ism 1 year ago

      No, only foolish people will have those sorts of problems. If you put a 450-watt GPU in a PC, you're an idiot.

  • @virtualxip
    @virtualxip 1 year ago +3

    @Jay: I don't think these are data lines per se. The PSU and the graphics card don't communicate back and forth; they're more like jumpers, at least that's what the displayed spec suggests.
    So basically there could be specific adapter cables for different PSU ratings, or you could have cables that include jumpers or DIP switches.
    (Of course that's risky if you don't know what you're doing, no doubt about that.)

  • @andrewsuryali8540
    @andrewsuryali8540 1 year ago +68

    Also, if you play around with ITX, it's pretty easy to get to 30 mating cycles, because you'll usually need to plug and unplug several times during a build just to get all the components to fit and tidy up the cabling; then every time you clean, you'll likely have to unplug the cables again to get at some components.

    • @lucianooscarmasciulli6200
      @lucianooscarmasciulli6200 1 year ago +15

      "It's pretty easy to get to 30 mating cycles"
      ..nice

    • @GoldenTiger01
      @GoldenTiger01 1 year ago +3

      Then stop being a noob and needing to plug/unplug so many times. They make digital calipers, so you can measure the connectors and any other space constraints.

    • @filthyrando3632
      @filthyrando3632 1 year ago +1

      @@lucianooscarmasciulli6200 nice

    • @MoonOvIce
      @MoonOvIce 1 year ago

      @@filthyrando3632 Filthy Rando 🤣

    • @JayOhm
      @JayOhm 1 year ago

      @Erick Nava Yes, that. Also, they make some nice 3D CAD software, so you can triple-check and actually see your final PC design, with cable routing and all, before you even start assembling anything! What kind of idiot would use trial and error instead of the proper engineering tools?
      Edit: Oh, and don't forget thermal simulations, to ensure your planned cooling solution can stay quiet without getting blisteringly hot, and to fix it if it can't, before it gets a chance to become a problem.
      Edit 2: And mechanical stress simulations, don't forget about them! These new GPUs aren't exactly lightweight.

  • @Quendk
    @Quendk 1 year ago +68

    This is what we need: the brutal truth.
    It seems like this Nvidia gen should be skipped until all the kinks are worked out. I don't think Intel is going to have a much better answer to power draw either.
    AMD really has the edge. I'm glad we have the competition to drive innovation.

  • @danishprince2760
    @danishprince2760 1 year ago +252

    I was really planning on getting a 4000-series card and finally upgrading after 8 years, but between the EVGA stuff, the 4070/4080 naming bullshit, the insane price increase over the 3000 series, and now this... my faith in Nvidia is just gone, and creating a good product can only go so far in keeping consumers.
    AMD is truly in a good position to grab market share, depending on what they show off in November!

    • @damunzy
      @damunzy 1 year ago +10

      Get the EVGA 3090 Ti for $1400. It'll last until the 7000 series comes out.

    • @nk-dw2hm
      @nk-dw2hm 1 year ago +13

      @@damunzy or literally any AMD card, and have better support that will improve over time, unlike Nvidia.

    • @MRBURNTTOAST111
      @MRBURNTTOAST111 1 year ago +7

      @@damunzy If you don't use ray tracing, there is literally no reason not to get an AMD 6000 card; it's so much more efficient. My RX 6800 undervolted only uses 200 watts at max and gets similar fps to a 3080.

    • @thenonexistinghero
      @thenonexistinghero 1 year ago +8

      Insane price increase? Sounds like you have no common sense. Inflation's been crazy. The 4090 is only $100 more expensive than the 3090, and accounting for the inflation between the releases, you're actually paying a bit less. Similar story for the 4080 16 GB. Only the 4080 12 GB got an unreasonable price hike compared to the regular 4080, especially when you consider that it's actually a 4070.

    • @danishprince2760
      @danishprince2760 1 year ago +23

      @@thenonexistinghero the 4080 is $500 more than the 3080, and the 4070 is $400 more than the 3070. Those are not reasonable increases due to inflation lol

  • @Digitalbath570
    @Digitalbath570 1 year ago +1

    Do we need to look at the ATX standard as old and come up with a new standard, instead of just iterating? Maybe allow for different PC shapes and different power delivery to key components.

  • @ujiltromm7358
    @ujiltromm7358 1 year ago +2

    I don't know who specced the connector, but Molex (it turns out Molex is a company, and makes the PCIe plug) validated the specs. That's at least one culprit.
    One thing I'm wondering about is connector contact area. The pins are tiny rods that make contact inside the pin wells. Now, we all know how we use thermal paste on CPUs to fill the asperities of the IHS and the cooler cold plate, to reduce thermal resistance and ensure higher thermal transfer. Well, those pins have asperities too, so I expect the pins not to make full contact with the wells, increasing electrical resistance there, and thus waste heat, leading to thermal runaway and ultimately melting the plug. I wonder if there's a way to make a "contact paste" (that's not solid like, well, solder) to reduce the resistance. I guess making sure the customer doesn't create a short would be a major issue, though...

  • @ErikCPianoman
    @ErikCPianoman 1 year ago +67

    Rated for only 30 cycles? And this is a full consumer launch product, not some sort of alpha/beta build or prototype? This is a fire hazard, and unacceptable. Period. Here's hoping this trend of brute-forcing generational improvement stops and efficiency takes precedence. 600 W of power draw is just too much.

    • @marcust478
      @marcust478 1 year ago +3

      Agreed. This is actually scary.
      Imagine wrecking your PSU and your card, and even setting your own house on fire, because of Nvidia and this madness.

    • @golfgrouch
      @golfgrouch 1 year ago +5

      The 30-cycle spec is the same as it's been for the last 20+ years on existing PCIe/ATX 8-pin connectors, and has not changed with the PCIe Gen 5 connectors or power adapters.

    • @ertai222
      @ertai222 1 year ago +1

      For sure. No GPU should ever need 600+ watts.

  • @willwunsche6940
    @willwunsche6940 1 year ago +473

    AMD has the perfect opportunity right now to win a lot of market share, positive press, and long-term mindshare if their new GPUs are priced well.

    • @honkhonkler7732
      @honkhonkler7732 1 year ago +10

      RDNA 2 would have done that had crypto miners not ruined the market for the entire past generation.

    • @sevroaubarca5209
      @sevroaubarca5209 1 year ago +3

      They just need to be better.

    • @paullucci
      @paullucci 1 year ago +21

      @@sevroaubarca5209 they just need to be better value

    • @pfnieks
      @pfnieks 1 year ago +10

      They won't, because they make more money now with 20% market share than when they had 40 or even 50%; you can expect them to give you a $100 discount at most.
      Price wars are over; they ended the moment AMD saw that people bought Nvidia products even when they were objectively worse and pricier.

    • @AlexanderDiviFilius
      @AlexanderDiviFilius 1 year ago +7

      With all the information available to them, I see no reason why AMD wouldn't be able to undercut Nvidia with their new GPUs. Even if they sell at a loss initially, they'll make it up before long, and gain a lot of market share in the process.

  • @mikkokuorttinen3113
    @mikkokuorttinen3113 1 year ago +6

    Great thanks to you, Jay, for the attention on this topic! In addition to the facts you're already pointing out, there's the additional stress on the GPU's power delivery from the actual length of the cables: the longer the cable, the higher the resistive losses and the more heat created. People doing their own home builds may not always remember to consider this.

  • @CoreyKearney
    @CoreyKearney 1 year ago +2

    The spec is 600 W @ 12 V; in ideal conditions that's 50+ amps. Sure, that's split across 6 pairs of wires, but that's still a bump in amps over a single run of 8-pin. Compound that with possibly uneven resistance across the pairs pushing more or less current through certain pairs, and that resistance heating up the dinky connectors. Those pins are tiny, and the wire gauge is what, 18? 16? This whole spec is a firestarter: too many watts over too small a connector and too thin a set of wires. Before I buy my next PSU, I'm going to be looking into custom 8-gauge PCIe connectors. This is some basic electronics failure to save a buck.
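
    Per-wire numbers for the gauge question; the AWG resistances are standard copper values, and the 0.6 m cable length is an assumption:

    ```python
    # Heat and voltage drop per conductor at 600 W over the 12VHPWR cable.

    AWG_OHMS_PER_M = {16: 0.0132, 18: 0.0210}   # approx. copper at 20 C

    amps_per_wire = (600.0 / 12.0) / 6          # 50 A split across six 12 V wires
    length_m = 0.6                              # assumed cable length

    for awg, ohms_per_m in AWG_OHMS_PER_M.items():
        r = ohms_per_m * length_m
        print(f"{awg} AWG: {amps_per_wire ** 2 * r:.2f} W of heat, "
              f"{amps_per_wire * r:.3f} V drop per wire")
    # Tolerable if the current splits evenly; the worry above is uneven
    # resistance steering extra amps through a few pairs.
    ```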

  • @clintono
    @clintono 1 year ago +206

    Jay, you have missed the fact that if the card does not receive a signal from the PSU, it MUST (per the standard) default to the lowest setting. So people buying a 4090 are going to end up with 150 W cards if they don't use an ATX 3.0 PSU.

    • @Secunder
      @Secunder 1 year ago +15

      Wait... you're joking, right?

    • @nanoflower1
      @nanoflower1 1 year ago +9

      @@Secunder Hopefully that's only at bootup, because you know most people likely to buy a 4090 already have high-end PSUs and won't be buying a new 3.0 PSU to go with it. They'll be quite upset if they aren't getting the expected performance (because the card limited the power draw to 150 watts).

    • @tehf00n
      @tehf00n 1 year ago +15

      @@Secunder he's right.

    • @clintono
      @clintono 1 year ago +10

      @@nanoflower1 At boot the card can only draw 100 W; max sustained power draw is 150 W.

    • @louiscorrigan3865
      @louiscorrigan3865 1 year ago +12

      Yeah, this comment is important. The spec is designed to prevent over-draw (at least judging by the table), so with no sense pins going to ground, the card will (at least by spec) only draw 150 W of peak sustained power.

  • @silentq15
    @silentq15 1 year ago +216

    After watching this, I'm starting to feel that a discussion needs to be had about why these GPUs are not getting more power efficient at all. It seems like they are chasing that "2x over the previous generation" buzz by just increasing the power. Something about that seems off.

    • @MichaelWerneburg
      @MichaelWerneburg 1 year ago +33

      Especially while the planet lurches through energy and climate crises.

    • @reappermen
      @reappermen 1 year ago +33

      Oh, they absolutely are getting more power efficient. It's just that Nvidia (and to a far lesser extent AMD) currently think it is not enough to have 15-30% better performance at the same power, so they also raise the power draw to gain more total performance.
      Plus, the GDDR6X VRAM that Nvidia uses is VERY power hungry, so upping the amount of VRAM pretty much linearly increases power draw.
      (I don't have exact numbers from memory, but higher-end 3000 cards pulled high double digits of watts just for the VRAM under heavy load.)

    • @BillyAltDel
      @BillyAltDel 1 year ago +21

      People have been able to undervolt the 30 series, drop like 100 W, and maybe lose like 3% performance. Ridiculous.
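
      The perf-per-watt arithmetic behind that claim, taking the comment's rough numbers and an assumed 320 W stock draw as the baseline:

      ```python
      # Efficiency gain from a ~100 W undervolt costing ~3% performance.
      stock_w, stock_perf = 320.0, 1.00        # assumed stock baseline
      uv_w, uv_perf = stock_w - 100.0, 0.97

      gain = (uv_perf / uv_w) / (stock_perf / stock_w) - 1
      print(f"Perf per watt: {gain:+.0%}")     # roughly +41%
      ```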

    • @JHattsy
      @JHattsy 1 year ago +2

      Yeah, I'm starting to feel like they're trying to turn GPUs into console release timings: they just want to put one out every 2-3 years and make the previous one useless.

    • @MartinKrol
      @MartinKrol 1 year ago

      @@MichaelWerneburg there's no climate crisis. Change, sure; crisis, not even remotely close. Not even close in the next 100 years.

  • @Kupidon14
    @Kupidon14 1 year ago

    Nice information! My whole life I've used power supplies with no data lines between the graphics card and the power supply; why is it needed now? And why don't they put a thermal cutout near the power connector?

  • @pollopesca5130
    @pollopesca5130 1 year ago +7

    I feel like we've gone back in time to 2000-2005, back when Intel's only solution for getting more performance was throwing more electricity at it, until they hit a breaking point and had to come up with a new, more efficient architecture (the Core 2). ATX 3.0 just lets manufacturers drag this out even longer before efficiency is a consideration again. My electricity bill tripled this summer, so I think I'll wait this gen out (and possibly others). My microwave shouldn't be more power efficient than my PC (x.x)

    • @wag-on
      @wag-on 1 year ago +1

      I agree; it's the wrong path of ever-increasing power demands. I want something fast but efficient too.

  • @nightshademilkshake1
    @nightshademilkshake1 1 year ago +167

    Jay, I really appreciate your clear and concise explanations. They're educational without being overly technical, and conversely not dumbed down and chock-full of silly gimmicks and sarcasm. I also appreciate the bravery you've shown recently in calling out manufacturers for their bad business practices. Way to go! You're striking a very nice balance among the tech reviewer options.

    • @jahmed525
      @jahmed525 1 year ago +2

      Ditto

    • @StarFury2
      @StarFury2 1 year ago +3

      Companies should start employing Jay to design power supplies and electrical conduits. Obviously so-called engineers and scientists don't know sht.

    • @adamtajhassam9188
      @adamtajhassam9188 1 year ago

      I hope the HX 1200-watt PSU is fine too; he mentions ATX a lot.

    • @12Burton24
      @12Burton24 1 year ago +1

      Sadly, what he says about 600 watts is just wrong. The 4090 is rated at 450 watts; if you keep strictly to 450 watts, the three 150 W inputs are fine for the card.

  • @Rick_Makes
    @Rick_Makes Рік тому +79

    I'm no engineer but surely a bank of pcie connectors and a separate data cable would be the smart idea for something with such a high draw. I'm wondering if that's part of the reason EVGA have dumped Nvidia. If Nvidia were making them use a connector that's barely fit for purpose it would really hold back EVGA from making the high end overclocking cards and that's one of the things EVGA are known for.

    • @2Fast4Mellow
      @2Fast4Mellow Рік тому +6

      That data pins combo table should have been reversed. Only when there are two powered data pins, the GPU can draw 600W, otherwise it is restricted to 100W (no communication). But I guess for backwards computability they (PSI SIG) have turned it around...

    • @mahuk.
      @mahuk. Рік тому +2

      Jayz already made a video about the EVGA situation. The long story short is that EVGA would be having negative sale numbers due to NVIDIA undercutting their sales. What purpose serves a partner deal when the person you're partnering with is screwing you over?

    • @tomglover98
      @tomglover98 Рік тому

      @@mahuk. he also stated nvidias lack of clear communication, timeliness and access to products on release, which does reflect what OP is saying.

    • @MikrySoft
      @MikrySoft Рік тому +3

      @@2Fast4Mellow The table is fine; it's the cards that must break spec to work with old PSUs. If you look at 5:50, both pins in the "open" state (as would happen if the connector wasn't there) is the lowest power state. To get the full 600W, the connector should be present with connections from those two sense pins to ground.
      The correct solution would be to make the card respect the ATX 3.0 spec and supply people with an adapter with two DIP switches so they can declare the max power themselves. Heck, make the data connector a separate part and supply 4 of them, hardwired to fake different power levels; it would just need ground from the main plug. Or supply/sell 4 different whole adapters (possibly 1/2/3/4-plug versions for different power levels).
      What's really missing is a way for the card to detect whether it's working with an ATX 3.0 PSU on a 12VHPWR cable or an older one through an adapter, and change its behavior to reduce transient spikes.
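
      For reference, the sense-pin combinations in that table map to power limits roughly like this (a Python sketch of the decode logic; the wattage rows follow the table shown at 5:50, though which of the two middle rows belongs to which pin is from memory and may be swapped):

          # 12VHPWR SENSE0/SENSE1 decoding, per the ATX 3.0 table in the video:
          # (SENSE0, SENSE1) -> (initial power at boot, sustained power after config)
          SENSE_TABLE = {
              ("GND",  "GND"):  (375, 600),
              ("GND",  "OPEN"): (225, 450),  # middle-row pin order assumed
              ("OPEN", "GND"):  (150, 300),  # middle-row pin order assumed
              ("OPEN", "OPEN"): (100, 150),  # no sense wiring / old PSU adapter
          }

          def permitted_power(sense0: str, sense1: str) -> tuple[int, int]:
              # A card that doesn't monitor these signals must assume the lowest row.
              return SENSE_TABLE.get((sense0, sense1), SENSE_TABLE[("OPEN", "OPEN")])

          print(permitted_power("OPEN", "OPEN"))  # (100, 150)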

    • @MyrKnof
      @MyrKnof Рік тому +7

      EVGA could already see the amount of warranty cases they'd have to deal with and noped out

  • @KH-cs7sj
    @KH-cs7sj Рік тому +4

    I never thought there could be so many problems with just connecting a cable back when it was ATX 2.0 or USB Type-A. Now these fancy new standards (ATX 3.0 and USB Type-C) are tinier but come with all sorts of weird issues. The Thunderbolt port on my Lenovo laptop just fried itself for no reason.

  • @ralphcarter3261
    @ralphcarter3261 Рік тому +13

    I know nothing about how these cards and PSUs are made, but I always thought that cards would get more power efficient as time goes on. It feels like in that respect Nvidia is going backwards.

  • @pixelpusher220
    @pixelpusher220 Рік тому +135

    The adapter issue is one thing, but clearly they've skimped heavily on the power connector innards on the cards. As Jay noted, serious bending/pulling is simply a part of building. And it could cause melting and fire. Just scary.

    • @fynkozari9271
      @fynkozari9271 Рік тому +8

      Where's the technological advancement? Performance increases, but power consumption stays high? Where's the efficiency?

    • @simbad3311
      @simbad3311 Рік тому +2

      @@fynkozari9271 For efficiency you must wait for the AMD cards; if I'm not wrong, a month or so...

    • @WizeguyGaming
      @WizeguyGaming Рік тому

      Considering he hasn't seen the new card, yeah.

    • @earthtaurus5515
      @earthtaurus5515 Рік тому

      Effectively, all custom cables will need to be redone for ATX 3.0 and ITX/SFF builds? There is no way to fit the PSU-provided cables without scrunching them up and shoving them into a corner somewhere. So this poses a serious problem for any ITX/SFF build using a top-end 40 series card.

    • @janknoblich4129
      @janknoblich4129 Рік тому

      @@simbad3311 How the turns have tabled

  • @dibs3615
    @dibs3615 Рік тому +289

    I've noticed a trend ever since I bought my GTX 460 768MB: cards were getting way too power hungry, to the point of causing meltdowns, fires and other issues. Then Nvidia started working more on power efficiency. Now they're neglecting power efficiency again, and it's starting to show through other issues like this.

    • @DenyBlackburn
      @DenyBlackburn Рік тому +10

      They implemented a pretty nice built-in undervolting tool in the GeForce Experience overlay though: press Alt+Z, then go to Performance. I could drop my power draw from 220W at full load down to 125W on my RTX 2080S with not much performance loss, and it saves the setting so you don't need to reapply it every time, unlike third-party programs like MSI Afterburner. Idk why everyone isn't doing that, tbh.

    • @Mizra-dq3lj
      @Mizra-dq3lj Рік тому +18

      But muh fps

    • @techkilledme
      @techkilledme Рік тому +4

      RTX 4080? More like GTX 480, am I right? But funnily enough, I think the 480 was like 250 watts at most.

    • @dutchdykefinger
      @dutchdykefinger Рік тому +4

      ​@@techkilledme it was, that was considered pretty high at the time though,
      I think the Radeon 6970 was 250 watts too, about half a year later.
      Even the Radeon Fury X and the R9 390X, both Radeons notorious for running hot, were only 275W TDP years later;
      neither of them scaled for shit with overclocking, so it wasn't worthwhile to even put more power into them.
      It was mostly performance-per-watt that AMD got seriously behind on around Nvidia's Maxwell; in terms of power use the flagships were never all that far apart, at least not the single-chip ones. Usually higher IPC tells a story about overclocking potential and scaling with clock speeds, though.

    • @austinbriggle3961
      @austinbriggle3961 Рік тому +9

      Honestly, I believe this is just the natural life cycle of specific CPU/GPU manufacturing technologies.
      When they introduce a new manufacturing process for the "next gen", they always start off with less power-efficient designs. There are two reasons for this. The first is so they can perfect the manufacturing process and iron out any kinks before moving on to more sophisticated designs. The second, and I believe the main one, is economics: if they go straight for the maximum performance they can achieve when they first introduce the process, they miss out on future sales from potential upgrades.

  • @riccardo1796
    @riccardo1796 Рік тому +2

    The connectors are Mini-Fit for the old standard and likely Micro-Fit for the new adapters, with some 2.54mm pins for the serial lines on the bottom.

  • @TayschrennSedai
    @TayschrennSedai Рік тому +2

    It's worth noting that we use these adapters in the server world today. However, they're the opposite: you pull from one or two motherboard ports that supply power. And those are 1200W Platinum-or-higher PSUs in the servers.

  • @RayeKinezono
    @RayeKinezono Рік тому +108

    I have no intention of getting any form of 40 series card. I'm still running an EVGA 2060 KO Super, which has performed admirably well. I had originally intended to get a 30 series, but now, with EVGA's departure, and nVidia's pricing and 'in their own world' mentality, I'm looking more and more into Team Red for my next potential GPU upgrade. (Which is starting to make more sense, as time goes on, especially since I've been pretty much exclusively on AMD CPUs for a very long time.)

    • @MADBADBRAD
      @MADBADBRAD Рік тому +1

      I’m on the same boat as you. I also got a 2060 KO from EVGA and hasn’t giving me any issues at all. I’m definitely going with AMD when I get ready to build a new PC.

    • @Teku175
      @Teku175 Рік тому

      Honestly I'm thinking the same as you. I currently have an EVGA RTX 3080 10GB FTW3 Ultra (and used to have a 2060 KO Super which wasn't enough for 1440p155 but I digress), and my next GPU is gonna be AMD in maybe 4-5 years. Seeing all the news about how nvidia treated EVGA (and other board partners) along with them constantly making tech that doesn't work with other GPUs (AMD/Intel) makes me dislike nvidia, despite the product itself being good. 40 series pricing was the final nail in the coffin too.

    • @xfrostyresonance8614
      @xfrostyresonance8614 Рік тому

      Same over here. Been with my 5600XT for 2 years now, it still works and still is getting AMD's feature back-trickle to the point of it still being entirely relevant for even next-gen gaming. I'm just waiting to see if AMD has a worthy mid-range card this gen for me to upgrade.

    • @VirtualAustin
      @VirtualAustin Рік тому

      Yeah, I managed to get an EVGA 3080 12GB FTW3 Ultra with an EVGA 1000-watt power supply, and now with EVGA out I don't think I'll be getting a GPU for at least 2-4 years, because even though AMD is a viable option, I like having RTX with DLSS + ray tracing, plus Reflex for competitive gaming.

    • @halfbelieving
      @halfbelieving Рік тому +2

      I'm still on my 1660 Ti and Ryzen 2600. I don't really feel the need to upgrade, especially when I have a Series X. Sure, I would be able to take advantage of my 144Hz 1440p screen with more games, but lately, with the ridiculous pricing, I'm content with waiting longer than I set out to.

  • @samthemultimediaman
    @samthemultimediaman Рік тому +50

    The way they designed the new ATX plugs, I'm wondering if they had an actual electrical engineer working on it. For the current needed, the gauge of the wire and the size of the plug are very undersized; it seems very slapped together. They should have used 8-gauge wire and some heavy-duty connectors. The power draw of new GPUs has moved beyond conventional computer wiring.

    • @mycosys
      @mycosys Рік тому

      many

    • @kall399
      @kall399 Рік тому +5

      No, they probably had some designer who doesn't know shit about electricity design something "sleek"

    • @qwesx
      @qwesx Рік тому +14

      Nah, there were definitely actual engineers developing this thing. But then the pointy-haired boss came into the room saying, "That's nice and all, but can't we just re-use the old plug format and cables? That'll make it so much cheaper for the manufacturers!" Then the engineers did the best they could to figure out a way that could work, by reducing the longevity and cycle count.

    • @ilenastarbreeze4978
      @ilenastarbreeze4978 Рік тому +4

      Honestly, I'm not an electrician, but I do like electrical stuff and have researched it some for my own projects, and yeah, power connectors should not melt after a couple of plug-ins. The amount of power moving through that tiny connector is INSANE. I'd rather something be safe than pretty; I'm sure some gamers may disagree, but I like not having fires in my electronics.

    • @michelvanbriemen3459
      @michelvanbriemen3459 Рік тому +1

      They probably did, and then the engineers' recommendations were overruled because "progress needs to look slimmer and more efficient, not thicker and bulkier" in the minds of the higher-ups.

  • @Gettingadicted
    @Gettingadicted Рік тому +2

    Hi Jay, the issue with the connector doesn't end with the adapter. The "mating limit" is for the two connectors (male and female), so now we will probably see some savers in order to keep the VGA card's connector safe; otherwise you would need to replace the connector after 30 matings (to be within spec). This is totally nuts for the computer market.

    • @1337GameDev
      @1337GameDev Рік тому

      OHHHHHH. The 30 cycles applies to the FEMALE end of the card? Holy fuck that's bad.

    • @Gettingadicted
      @Gettingadicted Рік тому

      @@1337GameDev Yes, usually when you specify a mating/demating limit, it is for both the male and female connectors. That is why you will see "savers" on microwave equipment with SMA connectors (SMA is not the best when you consider mating/demating limits).
      In my opinion, the connector on the card should hold up a little better, but by spec you should replace it to make sure the contact is good (bad contact can be the start of a molten connector).

  • @____________________________.x

    Engineer here. We already have multiple types of power connectors rated at >500 mating cycles, built especially for this type of application. But no, they had to dream up this new Frankenstein connector, which is easily the most badly designed implementation I've seen since the Molex.
    There are well-known problems with insertion cycles on high-current connectors, BUT WE HAVE ALREADY SOLVED THIS PROBLEM. Until now, when they decided to UNSOLVE everything we've done.
    It's infuriating and so anti-consumer.

  • @theshijin
    @theshijin Рік тому +334

    Love that AMD's new socket being exclusively DDR5 had people worrying about extreme prices, but some DDR5 RAM is available at decent enough prices and the motherboards aren't that far off either; meanwhile NVIDIA announces their 4000 series and issues immediately show up lmao

    • @_Kaurus
      @_Kaurus Рік тому +22

      No one was worried about that other than YouTubers and sensationalists

    • @Chrisp707-
      @Chrisp707- Рік тому +15

      @@_Kaurus the average person actually kinda was. DDR5 boards and such for most are still quite expensive.

    • @lirycline6646
      @lirycline6646 Рік тому

      @@Chrisp707- Not really tho, the MSI PRO Z690-A ATX is a pretty good price

    • @TheMadYetti
      @TheMadYetti Рік тому +2

      Enjoy your DDR5 with a CPU carrying Microsoft's Pluton chip. AMD's betrayal is complete.

    • @Claude-Vanlalhruaia
      @Claude-Vanlalhruaia Рік тому +8

      @@lirycline6646 What you are saying is that a private jet that costs $5M is a good price compared to another private jet that costs $10M. Compared to DDR4, it is still expensive.

  • @Somtaaw7
    @Somtaaw7 Рік тому +70

    AMD and Intel's got a massive opportunity now. Hopefully the next gen is good and Intel sorts out their driver issues.

    • @achaosg
      @achaosg Рік тому

      With tensions between China and Taiwan, and having been an Intel fanboy since like 2003 due to AMD's overheating issues, I am personally a big investor in Intel... I own Intel stock, and with their new fabs and the reinvestment they have made in their business, I am a strong believer Intel will be the future GOAT. It's also going to be made right here in America, even if they do a little outsourcing.

  • @trailingrails9953
    @trailingrails9953 Рік тому +10

    Just seeing how small the connector was in comparison to the current standard was raising red flags, now my suspicions are confirmed. Of all the components to over-engineer, the PSU connections should be at the top of the list, especially when they knew damn well that GPU draw was only going to increase.

    • @Squall4Rinoa
      @Squall4Rinoa Рік тому +1

      The connector is engineered just fine; the conditions that can cause failure would cause (and have caused) the same failures with the Mini-Fit 8-pin (a crushed or split female contact).
      Melting failure is only possible with resistance resulting from poor contact and mishandling.

    • @degru4130
      @degru4130 Рік тому +8

      @@Squall4Rinoa Good engineering is supposed to account for "mishandling" (in this case normal things a majority of people do when building a PC). Especially when you're carrying this much current and *risk of melting* is even part of the conversation. Normally when you want to carry *more* power through a single connector you make it *bigger*, not smaller and more fragile. Just because there are other under-built connectors out there doesn't excuse this one.

    • @1337GameDev
      @1337GameDev Рік тому +2

      100%.
      Who cares about size when the CURRENT thickness was BARELY sufficient?
      They should have had a 16-pin + 4-pin data connector, with straight and right-angle variants. But no... now we get fire hazards...

  • @dorfkind9825
    @dorfkind9825 Рік тому

    Now I'm a bit worried. I have a Corsair AX1600i and don't know yet if there will be any problems with the PCIe connectors through an adapter.

  • @JohnSmith-ws7fq
    @JohnSmith-ws7fq Рік тому +79

    If they were going to revolutionize the power interface, they should have gone with 24V DC. It halves the amps (these things are pulling more than cooker circuits now in terms of amperage), yet is still extra-low voltage and safe to handle.
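
    To put rough numbers on that (a back-of-the-envelope Python sketch; the 600W figure is the 12VHPWR maximum discussed in the video, the rest is just I = P / V):

        # Current needed to deliver the same power at 12 V vs. 24 V
        power_w = 600.0  # 12VHPWR maximum board power
        for volts in (12.0, 24.0):
            amps = power_w / volts
            print(f"{power_w:.0f} W at {volts:.0f} V -> {amps:.0f} A total")
        # 600 W at 12 V -> 50 A total
        # 600 W at 24 V -> 25 A total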

    • @wayland7150
      @wayland7150 Рік тому +4

      As long as the plugs don't fit the old socket, that would have been fine. 600W is 50A through that tiny plug.

    • @glassman3333
      @glassman3333 Рік тому +10

      It would make sense. That's why electrical infrastructure over long distances is high voltage, and only transformed down where it's needed. This crap seems crazy to me. A PC component needs like 50 amps?! Why, because we're stuck on 12V rails? How short-sighted.

    • @joshuahulce5630
      @joshuahulce5630 Рік тому

      @@wayland7150 What about CPUs? They can easily have 100-200A flowing through them.

    • @XIIchiron78
      @XIIchiron78 Рік тому

      There might be implications for VRM and PSU design that make this undesirable? Other than the obvious break in a many decades long chain of compatibility, I'm not sure if there is really much of a market for power stages and other components that could be used either, so it'd have to be planned way in advance.

    • @reezlaw
      @reezlaw Рік тому

      That would make a lot of sense, even phone manufacturers have long realised that it's better to charge at a higher voltage to reduce amps and heat

  • @Powerman293
    @Powerman293 Рік тому +149

    ATX 3.0 makes me wonder if that was what pushed EVGA over the edge and made them decide to stop making graphics cards now. Having all these issues means way more warranty crap to deal with, cutting how much profit they can make. And I'm sure Nvidia tightened the leash even more so they can't make as much.

    • @RarestAce
      @RarestAce Рік тому +12

      EVGA will have to start making ATX 3.0 power supplies; they can't make ATX 2.0 supplies forever.

    • @geerstyresoil3136
      @geerstyresoil3136 Рік тому +11

      Yea, talk about fixing a "problem" that doesn't exist. I have no problem with adding more tried and true 8 pins.

    • @jMewsi
      @jMewsi Рік тому +1

      sure has to be something like that

    • @MemTMCR
      @MemTMCR Рік тому +7

      @@geerstyresoil3136 Letting your GPU and PSU cooperate is a pretty good thing. He's talking about how many people are just gonna keep using their old supplies because they think they probably don't need to upgrade.

    • @MCasterAnd
      @MCasterAnd Рік тому +4

      I doubt ATX 3.0 was the problem, given that everything JayZ mentioned about the new ATX spec is wrong. The graphics card doesn't default to full power; it defaults to the minimum (150W). The new ATX standard is perfectly fine.

  • @thedaywalker8823
    @thedaywalker8823 Рік тому

    Question, wouldn't it be better if you placed the power connector at the back of the card or maybe even facing downwards?

  • @jimmay7736
    @jimmay7736 Рік тому +4

    It's amazing to me how much power PCs are moving around at 12V on those wires and connectors; then go look at the 12V wiring and connectors used in cars.
    Other than starting the car or huge aftermarket audio, incandescent high beams are the biggest draw, and they are 60W @ 12V, only 5 amps - and their connectors are almost as big as 120V household 15A plugs.

  • @Vladek16
    @Vladek16 Рік тому +57

    5:59 : No, you misunderstood the spec. If the sense pins are absent (if your PSU is an old one, for example), the card defaults back to the lowest state of 150W max, not the highest.
    The guy who wrote the ATX 3.0 spec said so in an interview with PCWorld. Look for the video "Intel Talks New ATX 3.0 And ATX12VO 2.0 Power Specifications | The Full Nerd Special Edition"; it's at 14 min.
    It's also literally written under the table: "If the Add-in Card does not monitor these signals, it must default to the LOWEST value in this table"

    • @Maax1200
      @Maax1200 Рік тому +6

      So you get a 1060 card for the price of a 4090, awesome.👍👍

    • @oebeloeber
      @oebeloeber Рік тому

      @@Maax1200 stonks

    • @Melech
      @Melech Рік тому +2

      This. The problem with using the adapter is not too much power draw; a "dumb" PSU limits the card to 150W. Prepare for a lot of people complaining about their 4090 running worse than their old card.

    • @Melech
      @Melech Рік тому +1

      Btw, it looks like there isn't any actual communication on those pins, so it should be possible to make adapters with different power limits for "dumb" power supplies.

    • @Vladek16
      @Vladek16 Рік тому

      @@Melech The table addresses 2 of the 4 pins. Those two pins are used to specify the max power, and they are hardwired, yes. But the other two pins are used for smart communication between the PSU and the GPU.
      And modding those two hardwired pins is useless if your power supply is not strong enough to deliver the power, so don't do risky mods.

  • @Firecul
    @Firecul Рік тому +24

    For any GPU (or add-in card) designed for ATX 3.0, if those data pins are not connected, or the card "does not monitor these signals, it must default to the lowest value in this table" 3:13
    So the GPU should limit itself to 150W unless it communicates with the PSU (or someone decides to fake it in the cable for some stupid reason) and the PSU confirms that it can provide the appropriate power to go higher.

    • @bills6093
      @bills6093 Рік тому

      Yes, it almost certainly works that way.

    • @robertr.1879
      @robertr.1879 Рік тому +2

      That was also my thought; without the 4-pin plug you get 2x open signals = 150W max.

    • @dark4yoyo
      @dark4yoyo Рік тому

      They're just going to sell jumpers with the adapters to fix that lol

    • @dastardly740
      @dastardly740 Рік тому

      That is what I am wondering. If these adapters leave the signal lines open, then the GPU either ignores them or is limited to 150W. And ignoring both sense lines being open seems like a very bad idea.

  • @cottontails
    @cottontails Рік тому +2

    I contacted EVGA to see if I would need to upgrade to an ATX 3.0 power supply; I currently have the 1300W SuperNova G2. I also shared this video with them, and they responded with this: "Hello, We will not be making an adapter for any 40 series card as the specific manufacturer will come with its own adapter. We are not aware of any fire issues in regards to this. We believe a lot of people are a bit confused about how the 12 + 4 sensing pins will work. Any standard PSU will work with a 4 to 16 pin as it will evenly distribute the power over the provided adapter."

    • @ertai222
      @ertai222 Рік тому +1

      Wonder if they can actually confirm that, or if there's going to be a class-action lawsuit in their future when people lose their houses to fires.

  • @sadropol
    @sadropol Рік тому

    Looking at the chart you're referring to, if an adapter cable without sense pins is used, both SENSE0 and SENSE1 would be open, only letting the GPU draw 100W at boot and 150W after software configuration?

  • @billwiley7216
    @billwiley7216 Рік тому +11

    I actually did my latest build at the end of last year, when Alder Lake was released, with the expectation of a 4090/4080 GPU.
    I bought a quality 1300W PSU, thinking at the time this would cover up to the 4090 card with no issues.
    Fortunately, a few weeks ago, when the bottom dropped out of the pricing on the 3090 Ti, news was first leaking about two 4080 models, and price predictions were floating that the 4090 would be $2000, I pulled the trigger and bought a 3090 Ti.
    Honestly, with this information coming to light, I would be really upset that the top-tier Platinum PSU I bought some months ago for over $300 would not be compatible with a 4090, had I decided to go that direction.
    This information should have been made very plain to consumers a year ago, when these PSU specs and GPU cards were being designed with all of these requirements actually NEEDING these sensor capabilities on the PSU.
    So much for an upgrade path from a 30 series card with an older PSU, even if it is a 1600W model!
    Thanks for the information Jay!

  • @arcticwarfare1000
    @arcticwarfare1000 Рік тому +14

    Don't worry about max wattage; worry about the maximum current the wires can handle. When the voltage sags under a high load on the card, the current draw goes up for the same wattage. At the maximum of 600 watts, if the voltage dips to 10V you are potentially pushing an extra 10 amps on top of the 50 amps already being pulled. It's entirely possible that the pins are rated for a total of 60-ish amps, but the unplugging and reconnecting of the delicate pins will, over time, mangle and deform the female ends (more likely), meaning a hot joint with high resistance will form, creating a spot where heat accumulates and causes smoke/fire and so on.
    This is one of the most overlooked aspects of the PC building space, IMO.
    Just like how your rig couldn't overclock in Scrapyard Wars: the supply voltage to your PSU was affected by the extension lead's length adding resistance and dropping the supply voltage by a few volts.
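
    A quick sketch of both effects in Python (illustrative only; the contact resistance values are invented for the example, and the even six-way current split is the best case):

        # How rail droop and a worn, high-resistance contact both add heat.
        power_w = 600.0                  # worst-case 12VHPWR draw
        for rail_v in (12.0, 11.0, 10.0):
            total_a = power_w / rail_v   # current rises as the rail sags (I = P / V)
            per_pin_a = total_a / 6      # six 12 V pins sharing the load evenly
            for contact_ohms in (0.005, 0.050):  # fresh vs. worn contact (assumed)
                heat_w = per_pin_a**2 * contact_ohms  # P = I^2 * R in one contact
                print(f"{rail_v:.0f} V rail: {per_pin_a:.1f} A/pin, "
                      f"{contact_ohms*1000:.0f} mOhm contact -> {heat_w:.2f} W of heat")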

    • @mk72v2oq
      @mk72v2oq Рік тому +3

      If the voltage drops to 10V, your power supply is garbage in the first place. The max allowed voltage deviation is ±5%. Higher deviation can by itself damage and kill hardware.

  • @TheOneTrueCaius
    @TheOneTrueCaius Рік тому

    So is the hazard only if you go beyond the 30 cycles of plugging it in/out? Or is it still there depending on case conditions (bent cables, high temps, etc)?

  • @JimFeig
    @JimFeig Рік тому +8

    I think it's time for power supplies to communicate directly with the system. And the GPU should be able to access that power info via the system.

    • @justcommenting4981
      @justcommenting4981 Рік тому +1

      They'll make it shut down if it doesn't report the power they want. Given that the wattage calculations from the manufacturers always seem to want bigger and bigger wattage, I'm skeptical.

  • @Durbanite2010
    @Durbanite2010 Рік тому +143

    Even more reason not to even consider a 40 series GPU. It will be expensive enough (probably starting at £1200 or so for a 4080), but then having to replace the PSU, easily £150+ on top for an ATX 3.0 unit (which, like Jay said, won't be low wattage and will likely be at least 1000W), is just going to price people out.
    Hopefully AMD will bring out their 7000 series with RDNA 3 soon. If that doesn't require an ATX 3.0 power supply, I could see AMD dominating the GPU market for average users this upcoming generation.

    • @timotervola2734
      @timotervola2734 Рік тому +8

      Plus you turn the PC into a room heating element at these power levels :)

    • @None-lx8kj
      @None-lx8kj Рік тому +10

      I've gone from a 1080 to a 2080ti to a 3090. That's the last Nvidia card I will buy for a long time. Maybe ever.

    • @bambix1982
      @bambix1982 Рік тому +4

      I just bought a new 850w PSU when I bought my 3080. Not going to buy another PSU anytime soon so yeah, just another reason to pass on that overpriced power guzzler.

    • @djfirestormx
      @djfirestormx Рік тому +4

      @@bambix1982 I just bought a 1200W ROG Thor when mine died last March; how do you think I feel right now? It has power-sensing capability, but nope, not supported.

    • @Chrisp707-
      @Chrisp707- Рік тому +4

      I need to update my PSU anyway, but I won't be going Nvidia, because the regular 4080 12GB is basically a next-gen 3070 (a 4070) labeled as a 4080.

  • @Macho_Man_Randy_Savage
    @Macho_Man_Randy_Savage Рік тому +157

    Damn, if getting a 4080/4090 in this climate isn't bad enough, you now have to consider an ATX 3.0 PSU to go with it 😅
    I'd feel really anxious knowing my PCIe plug has a finite life, even if I set and forget it 😬

    • @SoundwaveSinus9
      @SoundwaveSinus9 Рік тому +15

      Not only an ATX 3.0, but a high-wattage one. The 3000 series already had huge spikes which fried PSUs.

    • @sebastians1511
      @sebastians1511 Рік тому +3

      I would buy an ATX 3.0 PSU, but none are available :( I'll wait till they come out and then decide whether I buy NVIDIA or AMD.

    • @TheMeragrin
      @TheMeragrin Рік тому +1

      Well, the fact is your PCIe plug has the same 30 cycle limit.

    • @xe-wf5iv
      @xe-wf5iv Рік тому

      @@SoundwaveSinus9 Those spikes are one of the reasons for the ATX 3.0 standard. With the new standard it will be possible to run a GPU with less wattage overhead. In other words, you won't need a 1000-watt supply.

    • @DefianceOrDishonor
      @DefianceOrDishonor Рік тому +4

      The only way you could get a 3080 for the first year or so they were available was to buy a bundle of hardware. I bought mine at MSRP but had to also buy a PSU / CPU / mobo. Even many of those bundles would sell out pretty fast, but yeah.
      Chances are all the 4000 series will be sold out and scalped like we've seen with the 3000 series, and you'll basically need to go the bundle route -- but the real icing on the cake here? I doubt any of those bundle deals will include ATX 3.0 PSUs; they'll likely offload older PSUs lol.

  • @pmonet31
    @pmonet31 Рік тому +7

    Might be beneficial with these connectors to utilize dielectric grease considering their thin sidewalls and questionable connection strength over time with use.

    • @timrc666
      @timrc666 Рік тому

      That's not what dielectric grease is used for.

  • @jakasrinaga
    @jakasrinaga Рік тому

    I agree with you about changing the position of the connector. Along with changing its shape, I suggest using a connector that can withstand heavy loads, such as an Anderson connector; that may be more suitable for a heavy load.
    Power supply designers should also look further into connectors suitable for heavy-duty loads delivering full power.

  • @thesilverwang
    @thesilverwang Рік тому +14

    Damn Jay, you really don’t want an Nvidia review sample 🤣

  • @matthewmcclure8294
    @matthewmcclure8294 Рік тому +131

    I wasn’t planning on getting a 40 series anyway, but I might be hopping the fence after everything that happened with the last launch and now with all of the things coming out recently with the 40 series AND the split off on EVGA’s end. Thanks for the info!

  • @KuKoKaNuKo
    @KuKoKaNuKo Рік тому +3

    Lol, high-end power delivery coupled with low-end cable/wire design.... GENIUS!

  • @AndrewAffolter
    @AndrewAffolter Рік тому +1

    How does the Corsair PCIe 5.0 / Gen 5 12VHPWR PSU power cable fit into all this when using your existing PSU with the upcoming 4090?

  • @russ4533
    @russ4533 Рік тому +73

    Can't wait for AMD, will be interesting to see their power draw and power connector choice.

    • @Alpine_flo92002
      @Alpine_flo92002 Рік тому +4

      They will have to use the same connector or hold back the ATX spec by using the old connectors

    • @ledoynier3694
      @ledoynier3694 Рік тому +5

      @@Alpine_flo92002 And AMD has a habit of giving just the GPU and memory power draw as the official power spec of their cards, "forgetting" all the VRM and other power draw to appear more power efficient. But power draw will very likely be very close to Nvidia's 40xx. (Nvidia's spec is the total card power draw.)

    • @Fearzzy
      @Fearzzy Рік тому +2

      The top-spec AMD cards are expected to draw around 300-350W, with lots of headroom for overclocking. You are safe without a fancy new PSU.

    • @Alpine_flo92002
      @Alpine_flo92002 Рік тому

      @@Fearzzy "Lots of headroom for overclocking" Always just reminds me of badly binned chips where either you get a GPU you can overclock to hell and back or a GPU that draws twice the power at 5% OC

    • @Fearzzy
      @Fearzzy Рік тому +1

      @@Alpine_flo92002 Well, this headroom might be very "reliable" or consistent and exploited by AIBs, so cards might be higher clocked / higher power straight out of the box. But the stock clocks / power will be a lot more reasonable than what we see from Nvidia right now.

  • @teardowndan5364
    @teardowndan5364 Рік тому +56

    If you want to send lots of power down a small connector, use a small plug rated for high current like an XT60 or EC5; then you don't have to worry about 30A going down one #16 wire and nothing on the others. 4mm solid brass pins and barrels should be quite a bit more wear-tolerant than Mini-Fit pins made of thin folded sheet metal.
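
    To illustrate that uneven-sharing worry (a toy current-divider model in Python; the per-contact resistances are invented for illustration and ignore the wire resistance in series):

        # Parallel pins act as a current divider: current concentrates in the
        # lowest-resistance contacts. Model six 12 V pins, one pristine, five worn.
        contacts_ohm = [0.005, 0.025, 0.025, 0.025, 0.025, 0.025]  # assumed values
        total_a = 600.0 / 12.0                    # 50 A total at 600 W on 12 V
        conductance = [1.0 / r for r in contacts_ohm]
        g_sum = sum(conductance)
        for i, g in enumerate(conductance):
            print(f"pin {i}: {total_a * g / g_sum:.1f} A")  # I_i = I_total * G_i / G_sum
        # The pristine pin ends up carrying ~25 A while each worn pin carries ~5 A,
        # which is how one wire can cook while the others stay cool.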

    • @colt45strickland
      @colt45strickland Рік тому +7

      A modified XT60 with the data pins would be great imo

    • @MaddJakd
      @MaddJakd Рік тому +7

      Tell that to the PSU manufacturers and/or the guys making these cables. Seems they just threw the most random intern at making SOME conversion cable. No experience required.

    • @MrQuequito
      @MrQuequito Рік тому +13

      They are not even 16 AWG; these are 18 at best. These cables are tiny; they have more insulation than copper. And yeah, 23+ amps going through these cables will melt them. It's like whoever designed the cards never accounted for the limitations of the PCIe connectors.

  • @decoxish
    @decoxish Рік тому

    Can you make a guide on how to tell ATX 2.0 and 3.0 PSUs apart? I can't find anything when I look at product pages.

  • @stigfullspeed
    @stigfullspeed Рік тому

    Hi Jay, I have a Cooler Master Silent Pro Gold 1200W; would I be able to run a 4090? Thx in advance for the advice :)

  • @Supreme-King
    @Supreme-King Рік тому +64

    Hopefully AMD starts releasing genuinely competitive products. The industry really needs healthy competition to push innovation.
    NVIDIA needs the Intel treatment: a kick in the nuts forcing them to drop the bullshit and start working, or go out of business.

    • @Atixtasy
      @Atixtasy Рік тому +7

      They have been (AMD). I have a 6700 XT and can run anything at 4K; the card was also like HALF the cost at the time lol. RDNA 3 is gonna change things easily, and the ONLY thing Nvidia has is its proprietary bullshit, which, given enough sinking time and the rate their stock price is falling, they might JUST HAVE to license out at the very least.

    • @Kaptime
      @Kaptime Рік тому +5

      AMD is competitive (or winning) in pure raster performance.

    • @animalyze7120
      @animalyze7120 Рік тому

      Exactly; that piece of humble pie really woke Intel up, and it will do the same for nVidia.

    • @RyTrapp0
      @RyTrapp0 Рік тому +7

      "but muh ray tracings" - I think I only have like 3 or 4 games that support it, and it definitely isn't worth the FPS hit to have it on, pretty minimal difference in 'live action'

    • @whdgk95
      @whdgk95 Рік тому +2

      It's always been more of a public rep problem than an actual performance difference. Most people, especially newcomers, will follow the popular advice which used to be intel cpu + nvidia gpu just a few years back. Then ryzen came in and changed the narrative. It's not about the actual performance, most people who build computers don't actually understand the hw and performance differences. If AMD gets some share back this time and manages to shift the narrative, the products themselves are already good enough.

  • @unrealdevop
    @unrealdevop Рік тому +20

    Nvidia is messing up big time; their actions, coupled with the crypto crash, are raising awareness and causing people to become more critical of their cards. They thought they were being slick, but now people are waking up to the dirty marketing strategies they have been using. I really hope AMD uses this to step up.

  • @patrickmallory8273
    @patrickmallory8273 Рік тому +1

    They knew about this, yes. If you remember the L-shaped power clip found on the 2060, it had this issue. The adapter would short inside the plastic housing, causing no POST on the motherboard. It was acting as a resistor, limiting power flow. In fact, it was so bad you could smell the plastic burning. When I stopped using the inline L-adapter, it booted right up.

  • @adamjc9683
    @adamjc9683 Рік тому

    Subbed for what you did for Kiapia. Good stuff.

  • @XwhiskeysXgaming
    @XwhiskeysXgaming Рік тому +11

    Going back to Team Red for the next build

  • @ivesennightfall6779
    @ivesennightfall6779 Рік тому +10

    It seems the latest trend in power plugs for electronics is to find out how small you can make the pins before they evaporate on first contact; running 100W through USB-C is crazy enough…

    • @RadioactiveBlueberry
      @RadioactiveBlueberry Рік тому

      Yeah, a connection that wears out that quickly is not a well-designed connection. We really need a more reliable standard, one that doesn't rely on friction as much. Maybe a clamp of some sort (which, btw, is already used for CPUs). As an example, BNC is the closest existing cable connector I can think of: you just poke one pin into the hole and rotate the surrounding ring 180 degrees to lock the connection. I'm honestly surprised that, considering the power draw, the ATX 3.0 designers took a many-smaller-connections approach instead of a few larger ones.
      For mobile devices, wireless charging is luckily slowly becoming a thing. Even though it's less power efficient, one ruined port doesn't make the whole device useless.

  • @ARK0307
    @ARK0307 Рік тому +1

    Should I be worried about the melting-cables thing if I have a 600W non-modular PSU and an RTX 3050 with bent cables? Can someone explain a bit in layman's terms?

  • @wyatt1880
    @wyatt1880 Рік тому

    I bought a Seasonic Focus GX 1000W Gold in 2019. Will this be good enough for the 4080 16gb or should I upgrade to a ATX 3.0 PSU?

  • @MrDrTheJniac
    @MrDrTheJniac Рік тому +26

    This honestly feels like a serious engineering goof. However, it looks like the card will actually default to the lowest power bracket unless told it can go higher; note that having both sense pins "open" (meaning no connection) sets the card to the low-power mode.

    • @VikingDudee
      @VikingDudee Рік тому

      I wouldn't worry with anything lower than a 4080, honestly; I don't think the 30 cycles would be too much of a concern on them. But for something as power-hungry as a 3090 Ti or a 4090, yeah, the plug in my opinion is too small for the amount of current it could draw. If the plug was a bit bigger, the pins could be bigger: more contact area, less chance of melting. Even the standard PCIe power connectors are also rated for 30 cycles, but nothing really draws as much power as these higher-end cards yet. Guess we will see if someone's system catches fire, or we'll see melted used cards on eBay lol.

  • @brucethen
    @brucethen Рік тому +8

    Looking at the sense-line diagram as an electronics engineer, it would appear that if those pins are not fitted, the 2 sense lines would be open and the power would be limited to 150W. However, if the extra pins are fitted and shorted to ground in a specific combination, that could be a problem. I would again suspect that a single 8-pin header would be configured for 150W, a double for 300W, and so on. My main concern with older power supplies, though, would be the power-on surge of the graphics card. I would be more worried about the 3090 with its disconnected sense pins.
    As an example, here in the UK we can support up to 3kW of mains power on a single socket, that is 230V at 13A. I had a power supply to test; this unit could provide 2kW DC, that is 100V at 20A, or 22V at 90A, or any combination in between. This is obviously far less than the 3kW maximum that our mains can supply. The power supply in question had been fitted with a standard mains plug, so I plugged it in and switched on. The mains trip flipped, and on further investigation I discovered that the power supply had a 31A power-on surge and should never have been fitted with a standard plug.
    Yes, the number of connections and strain on the pins would be a problem. The plug has connectors that are basically springy cylinders; each time they connect they are forced slightly more open, and when disconnected they have to spring back. Eventually this spring effect gets weaker, and bending the cable puts extra strain on the connection, shortening its lifespan. Once the connector loses its spring, the connection becomes weak and causes resistance, creating heat and melting the connector.

    • @BERSERKERDEMON1
      @BERSERKERDEMON1 Рік тому

      I would rather think of it as a Peltier effect problem...
      An adapter is still a piece of a different alloy (not the same amount of copper, or not as pure), which creates heating points at the connectors... especially with power draws like these...
      It's a shame. I was really interested in the new RTX AI game-enhancement stuff (and was even considering selling my RX 6900), but changing my graphics card + my PSU (and maybe my motherboard and CPU) is going to give my banker a heart attack... so I'm going to skip NVIDIA, I guess (to save my banker and myself).

  • @puddleduckist
    @puddleduckist Рік тому +8

    Purely crazy power-hungry GPUs! 😳 I'm sure the melting and fires will start popping up once they get into serious use! As always, thank you for keeping us all informed, Jay!

  • @DiZh0
    @DiZh0 Рік тому

    With "rail", does he mean a single connection on the PSU side? Cause I'm wondering: if your graphics card has 2 connections, should you run 2 individual cables, or can you take 1 PSU cable and use the branch-off on the cable? And if you can, depending on the graphics card's draw, how do you know when you can or can't?