We need to have a chat about these 4090 Benchmarks...

  • Published Sep 7, 2024
  • The RTX 4090 is finally here... but there are some things you should know...

COMMENTS • 5K

  • @Cryptic0013 · A year ago · +2590

    Remember folks: Time is on your side. Nvidia is, as Jay put it, sitting on a mountain of overbought silicon and developers aren't going to rush to use this new horsepower (they've only just begun optimising for the 2060 and next-gen gaming consoles.) If you don't hit that purchase button now, they *will* have to drop the price to meet you. The market pressure is on them to sell it, not you to buy it.

    • @earthtaurus5515 · A year ago · +67

      Indeed, and that's usually the case: tech does go down in price, barring any unforeseen madness like a global pandemic, GPU-based crypto mining, or both....

    • @DarkSwordsman · A year ago · +67

      I really do think games are at an all-time high of being CPU-bound due to how they work internally. For example: Unity is limited by draw calls. Every mesh with every material incurs a draw call, and it's why VRChat runs so badly while barely utilizing a single CPU core or the GPU. It's also part of the reason why Tarkov runs so poorly, though that's mostly down to the insane number of objects they have in the game and the, in my opinion, less than optimal LOD.
      Engines like UE5 with Nanite and Lumen, and games like Doom are prime examples of the direction that we need to go for future games if we want them to actually take advantage of modern hardware. The hardware we have now is so powerful, I don't think people realize what absolutely crazy things we can do with some optimization.

    • @corey2232 · A year ago · +27

      Exactly. And at those prices, there's no way in hell I'm touching these cards. I already thought jacking up the prices last gen was too much, so I'm going to happily wait this out.

    • @JSmith73 · A year ago · +12

      Especially after RDNA 3 hopefully brings a good sanity check to the market.

    • @theneonbop · A year ago · +8

      @@DarkSwordsman Draw calls in Unity are often easily fixable with optimization from the developers (early on in the project); I guess the problem with VRChat is that the developers aren't really involved in making the maps.
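
      The draw-call arithmetic this thread describes can be sketched in a few lines. This is an illustrative model, not the Unity API; the scene size and material names are made-up numbers.

      ```python
      # Toy model of the draw-call problem described above: each mesh/material
      # pair costs roughly one draw call, so an unbatched scene's CPU cost
      # scales with meshes * materials.

      def draw_calls(meshes):
          """meshes: list of material-name sets; one call per material per mesh."""
          return sum(len(materials) for materials in meshes)

      def batched_draw_calls(meshes):
          """Batching sketch: meshes sharing a material merge into one call."""
          return len(set().union(*meshes)) if meshes else 0

      # Hypothetical VRChat-style map: 1,000 props, 3 materials each.
      scene = [{"wood", "metal", "glass"} for _ in range(1000)]
      print(draw_calls(scene))          # 3000 calls submitted one by one on the CPU
      print(batched_draw_calls(scene))  # 3 calls once props sharing a material merge
      ```

      This is why the GPU can sit idle while one CPU core is saturated: the cost is in submitting the calls, not rendering them.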

  • @niftychap · A year ago · +1696

    Going to wait for AMD or buy last gen. In my eyes the performance of the 4090 looks amazing, but it's so easy to get carried away and end up blowing way more on a GPU than I'm comfortable with.

    • @originalscreenname44 · A year ago · +72

      I would say that for most people it's unnecessary. I'm still running a 2080 FE and it does enough for me to play anything on my PC. Unless you're working in CAD or streaming/creating video, you don't really need anything this powerful.

    • @CornFed_3 · A year ago · +113

      @@originalscreenname44, those of us that only game in 4K would disagree.

    • @victorxify1 · A year ago · +125

      @@hoppingrabbit9849 yea, 1440p 160 fps > 4k 90 in my opinion

    • @idkwhattohaveasausername5828 · A year ago · +70

      @@CornFed_3 if you're only gaming in 4k then you need it but most people are playing in 1080p

    • @2buckgeo843 · A year ago · +18

      Ya snooze ya lose bro. Get the 4090 and call it a day.

  • @SkorpyoTFC · A year ago · +624

    "Nvidia doesn't see this as a problem"
    Summed up the whole launch right there.

    • @TheCameltotem · A year ago · +9

      supply and demand 101. People who can afford this will buy it and enjoy it.
      If you don't got the money then buy an Intel ARC or something

    • @lgeiger · A year ago · +30

      @@TheCameltotem Ha! Let's see how many 4090s will break due to the strong bend of the cable. Not sure if everyone is going to "enjoy it" when Nvidia starts blaming them for breaking their 4090 because they bent the cables too much. This adapter is a problem, and I am so sure that it's gonna create problems in the future.

    • @raymondmckay6990 · A year ago · +23

      @@lgeiger Nvidia could have solved the connector problem by having the connector be an L shape where it plugs into the card.

    • @lgeiger · A year ago · +11

      @@raymondmckay6990 Exactly my thought! But I guess that's not a valid solution for a billion dollar company.

    • @PresidentScrooge · A year ago · +3

      @@raymondmckay6990
      What Nvidia should've done is plan ahead. If they plan to push a new standard, they should have worked with PSU makers 2-4 years ago so that at least the high-end PSUs would have that standard available. This is just pure arrogance by Nvidia.

  • @bander3n · A year ago · +317

    I love how I can watch multiple tech YouTubers and they give you a general idea about a product, while each one gives their input on a specific part that is important to them. Gives you an overall good idea about it. Very informative video, Jay. Love it

    • @pat7808 · A year ago · +15

      Jay, GN, and LTT. The holy trinity.

    • @punjabhero9706 · A year ago · +4

      @@pat7808 I watch all of them, and also Hardware Unboxed, Paul's Hardware, and sometimes Hardware Canucks (I love their style of videos)

    • @pat7808 · A year ago

      @@punjabhero9706 Yes love the canucks!

    • @marvinlauerwald · A year ago

      @@pat7808 aaaand red gaming tech/hardware meld

    • @HillyPlays · A year ago

      @@punjabhero9706 Because this community subscribes to so many others, I was recommended those channels and it MASSIVELY improved my understanding

  • @TheMatrixxProjekt · A year ago · +1365

    Completely agree with you on the bending-the-cable-to-fit-in-the-case issue. What they probably should have done is give the cable adapter a pre-made 90-degree bend, or made the plug an L-shape that leads the cable directly downwards (or gone the extra mile with a rotational system that allows it to go in any direction). There are so many design choices that could have remedied this, I feel, but instead they're giving builders a whole Nine-Tailed Fox to deal with. Really odd oversight for such an expensive product.

    • @MichaelBrodie68 · A year ago · +62

      Exactly - like the very familiar SATA L connectors

    • @kingkush3911 · A year ago · +24

      Definitely would be smart to have a 90 degree cable to avoid having to bend the cable and risk damaging the gpu .

    • @Pileot · A year ago · +21

      What's wrong with having the power connector on the motherboard side of the card, facing down? This is probably going to be the longest expansion card in your case, and it's not likely you are going to be running two of them.

    • @Xan_Ning · A year ago · +2

      EZDIV-FAB makes a 8-pin 180 degree power connector that wraps over onto the back plate. I expect them or someone else to make the same thing for the 12-pin. (EDIT: just saw that they have a long 12-pin (not 12+4 pin) to 2x8-pin, so I think they will have one for 12+4)

    • @gordon861 · A year ago · +7

      Was just going to say the same thing. I wouldn't be surprised if someone produces a 90 degree plug/extension to solve this problem.

  • @Appl_Jax · A year ago · +459

    Was EVGA the only board partner that put the connectors on the end? At least they had the foresight to recognize that it was a problem. They will be missed in this space.

    • @onik7000 · A year ago · +7

      The PCB is not long enough on most GPUs to put it there. Actually, the connectors on the 4090 FE are at the END of the PCB; only the fan and radiator are behind that point.

    • @TheFoyer13 · A year ago

      I have an MSI 3090ti with the 12 pin connector facing the side panel. These cards are so big, and so long, that if the power connector was at the rear, it would bend even worse. I guess the only benefit to that would be seeing the RGB "carbon" motherboard leds that are hidden by the PCIE wires but then it'd be uglier and the wires in the way of my fans. Now that they don't sell a lot of cases with optical drives, they aren't as long as they used to be. I guess it really comes down to what kind of case you buy. (My 3090ti is in a corsair 4000d, and it fits great, and the adapter doesn't touch the glass)

    • @EisenWald_CH · A year ago · +1

      @@onik7000 That's why they put in a little daughter PCB (when it's more than just power) or just a cable that connects the PCB's power-in to wherever you would like the connector to be (and then fix that connector to the metal or plastic structure of the heatsink). It's not like they can't do what EVGA does; they just don't care that much, or they feel it would look "out of place" or "bad". Cost is also a thing, but I feel it's very negligible in this case, as it's just a little extra cable and fixing points (a redesign would be a bummer though, with machining cost and all).

    • @BM-wf8jj · A year ago · +1

      It wasn't even foresight. Last year I ended up having to send back an FTW3 3080 because I wasn't able to get the glass side panel back onto my Corsair 280X with the PCI cables attached to it, smh. They could've at least sent out some type of communication to inform buyers that they fixed that issue.

    • @Old-Boy_BEbop · A year ago · +2

      @@onik7000 if only they could invent small pcb boards and capacitors with wiring to add the Power Connectors at the end... if only, what a world we would be living in.

  • @gremfive4246 · A year ago · +360

    Igor's Lab explained why the AIBs' coolers are so massive: the AIBs were told by Nvidia to build coolers for 600-watt cards (all those rumors back in July of 600-watt 4090s). In the end the 4090s would have been just fine with 3090 coolers, since Nvidia used TSMC over Samsung.
    That's going to add a lot of cost to AIB cards over the Founders Editions, and maybe it's one of the things that made EVGA say enough is enough.

    • @ChenChennesen · A year ago

      Bit of a tangent but what are the recommended psu sizes now?

    • @kimiko41 · A year ago · +17

      Gamer's Nexus did some power draw testing, stock was 500w and overclocked with the power limit / power target bumped was pulling over 650w. Seems to me the large coolers are necessary unless AIBs want to set a lower limit than FE.

    • @fahimp3 · A year ago

      @@ChenChennesen 10000W if you want to be safe with the transient spikes and future proof it.

    • @randomnobody660 · A year ago · +9

      @@kimiko41 Wasn't it mentioned in this video that the card got to 3Ghz while maintaining 60-ish degrees in a closed case with factory fan curves? That sounds like even the fe's cooler is arguably overbuilt already.

    • @TotallySlapdash · A year ago · +12

      @@randomnobody660 or NV is binning chips for FE such that they get an advantage.

  • @YAAMW · A year ago · +167

    The MOST valuable info in this review was the bit about bending the connector. MASSIVE thanks for pointing that out. Somebody IS going to have a REALLY bad day because of this. The second most valuable info was the bit about the over-engineered coolers. This is the first time I felt restricted by the GPU clearance in my Armor Revo Snow Edition because of these unnecessarily huge GPUs

    • @DGTubbs · A year ago · +5

      I agree. Shame on NVIDIA for downplaying this. If you screwed up somewhere in design, own it. Don't hide it.

    • @Chris-ey8zf · A year ago · +1

      Honestly though, if someone is that careless as to break off their connector by bending/pulling on the cables, they probably aren't responsible enough to be building/upgrading PCs. PCs aren't adult lego sets. You have to actually be careful with what you're doing. People that break things due to being that careless deserve to lose the money they spend. Hopefully it teaches them a valuable lesson for the future. Better they break a 4090 and learn than to mishandle a car or misfire a gun that can actually harm/kill others.

    • @EkatariZeen · A year ago · +8

      @@Chris-ey8zf Nope, that's just a sh¡t design, even if the user is careful they would be stuck with an unusable brick or an unsightly opened case until they get one of the 4 cases in the entire market where that crap fits.
      Jay just showed that he broke the power cable by bending it, so it's not just being careful about not breaking the cheap connector in the PCB.

    • @javascriptkiddie2718 · A year ago

      Vertical mount?

    • @MoneyMager · A year ago

      This video and comment aged "well" :)

  • @tt33333wp · A year ago · +60

    They could introduce “L” shape adapters. This would be a good solution to this bending issue.

    • @Digitalplays · A year ago · +32

      Then you can take two Ls when buying one of these

    • @johnderat2652 · A year ago · +4

      @@Digitalplays The 4090 would be a great card for Blender artists, AI training, and simulations.
      Not so sure about gaming though.

    • @mukkah · A year ago

      Interesting thought, and I wonder if it has been explored by R&D at Nvidia (it seems like an obvious thought now that you mention it, but it escaped me entirely up until now, so who knows)

    • @Chipsaru · A year ago

      @Anonymous did you see 43 FPS in Cyberpunk RT vs 20ish with 3090? Nice uplift for RT titles.

  • @theldraspneumonoultramicro405 · A year ago · +344

    man, i just cant get over just how comically massive the 4090 is.

    • @woswasdenni1914 · A year ago · +7

      There go my plans for making an SFF build. The cooler itself is bigger than the entire build

    • @itllBuffGaming · A year ago · +1

      @@woswasdenni1914 If the waterblocks for them are the same as the 3090's, it'll be a quarter the size. If you want a small build, custom liquid is going to be the way now

    • @MichaeltheORIGINAL1 · A year ago · +4

      I thought the 3090ti was huge but this thing is a whole nother story, haha.

    • @Cuplex1 · A year ago · +2

      Agreed, I think my 3080 is massive. But it's nothing compared to that beast of a card. 🙂

    • @watchm4ker · A year ago

      @@outlet6989 It'll fit in a full tower case, as long as you don't have drive cages to worry about. And you're thinking EATX, which is for dual-socket MBs

  • @shermanbuster1749 · A year ago · +67

    I can see a lot of 90 degrees adaptors being sold for these cards. If I were in the market for this card, that is the way I would probably go. Go with a 90 degree adaptor so you are putting less stress on the cables and saving some space.

    • @ArtisChronicles · A year ago · +7

      That's the one thing that would make the most sense to me. Problem for me is the damn things are so big. I do not want to run a card that big in my case.
      Those 90 degree adapters should exist regardless though. It's a pretty important piece overall.

  • @Porkchop899aLb · A year ago · +29

    New builder here; went with a 3080 Ti for 165 Hz 1440p. The 4090 will be a lovely upgrade for me in 3 or so years lol

    • @liandriunreal · A year ago · +6

      getting rid of my 3080ti for the 4090 lol

    • @kertosyt · A year ago · +1

      @@liandriunreal i just got a 1080 after my 970 died ..lol

    • @garretts.2003 · A year ago · +5

      3080ti is a great card. Certainly an excellent first build. The cards are getting so good that last gen is worth the savings for me personally. I'm still on a 2080ti running 1440 ultrawide decent. Most single player games I'm fine with at 60fps and online FPS can always drop the resolution if required. I'll probably upgrade to the 3080ti while prices are good.

    • @mattgibbia2692 · A year ago · +2

      @@liandriunreal You must hate money; a 3080 Ti is more than enough for anything out right now

    • @Amizzly · A year ago · +1

      @@mattgibbia2692 like what? Most games with high detail textures and RT on my 35” UW 1440p monitor are dragging ass with my 3080. Cyberpunk it’s like 45FPS even with DLSS on performance mode. With no DLSS it’s like 15FPS.

  • @latioseon7794 · A year ago · +328

    After the whole ordeal with 30 series and the "4070" thing, i hope nvidia gets a reality check

    • @eclipsez0r · A year ago · +45

      That performance sells lol this card gonna be sold out

    • @TheSpoobz · A year ago · +22

      Honestly, I'm just gonna stay with my 3080 Ti cuz of that

    • @kevinerbs2778 · A year ago · +26

      @@eclipsez0r That's the most disappointing part about this.

    • @samson_the_great · A year ago

      @@eclipsez0r yup, I already hit the plug up.

    • @omniyambot9876 · A year ago · +1

      People who would spend money on a 3090 Ti before the crypto crash have every reason to buy 4090 cards, especially with those insane performance jumps. Yeah, we hate Nvidia, they're dicks and overpriced, but let's stop being stupid here: their product is still absolutely competitive, and that's why people still buy them. They are not pointing a gun at you.

  • @vorpled · A year ago · +118

    der8auer had an amazing point which shows that they could have reduced the size of the card and cooler by a third for about a 5% performance hit.

    • @sircommissar · A year ago · +20

      I'd unironically rather not; I'd rather have a bigger card and better perf. Let some AIB have their trash perf for smaller size

    • @Beamer_i4_M50 · A year ago · +45

      @@leeroyjenkins0 1/3 of cooler size means 1/3 of power needed. Means 150 Watt less you have to pay for. Means cooler temps all around. For 5% penalty in performance.

    • @jtnachos16 · A year ago

      @@leeroyjenkins0 You are both utterly missing the point, and a perfect example of the type of idiot who lets scalpers (be they the official producer or a third-party) keep stake in the market. Which is what NVIDIA is doing right now by marketing their 4070 as a 4080. Additionally, your comments further paint the picture as someone who doesn't have the money for a 4090 in the first place.
      You've just demonstrated that you have no understanding whatsoever of how GPUs work. Guess what? With less power draw and 5% less performance, you are still in a range where overclocking can bring that performance back to a degree that is utterly unnoticeable in actual use (this ignoring that we are talking loss of 1-3 frames or so in most real-world loads). Furthermore, that reduced power consumption? Less wear on parts, means more reliability, not just on your gpu, but also psu.
      This is also ignoring that the transient spikes that were being caused by GPUs are still unsolved as of yet, and are capable of killing other parts in the system (yes, they are claiming the new power supply standard solves it, but those aren't commercially available yet and will likely have most consumers priced out for the first year or two. It's also a really bad precedent to be having a totally new power standard for PSUs popping up solely because one manufacturer refuses to work toward efficient use of power). Further yet, momentary power spikes were what was consistently killing 3080ti and 3090 cards, yet nvidia's response was, to my knowledge 'we added more capacitors' which isn't a solution, as those capacitors will still end up getting blown.
      Put bluntly, Nvidia tim tayloring it in search of tiny advancements in performance is absolutely a bad idea, from engineering, consumer, and environmental positions. Literally from every reasonable and informed position, it's a bad idea. Furthermore, that power draw actually is reaching the point where a high end pc will risk overtaxing standard american in-home circuits.
      The TLDR here, is that NVIDIA absolutely did not have to draw that much power for a substantial performance gap over last gen. They are being exceedingly lazy/arrogant and trying to brute force the situation in a way that is almost certainly going to result in failing cards and potentially damage to other parts in the system.

    • @rodh1404 · A year ago · +11

      Personally, I think NVidia took a look at what AMD has in the pipeline and decided to go all out. Because they'd probably have been beaten in the benchmarks if they didn't. Although given how huge the coolers they've strapped on to their cards are, I don't understand why they haven't just abandoned air cooling for these cards and just gone with water cooling only.

    • @HappyBeezerStudios · A year ago · +1

      I heard that was more an issue with the manufacturing process. The one they originally planned warranted the 600W and massive coolers, but the one they ended up using sits at 450W and doesn't need those monsters.

  • @dale117 · A year ago · +295

    What impressed me the most was the improvement in 4K resolution. Can't wait to see what AMD brings to the table.

    • @surft · A year ago · +10

      Excited too, but I'm going to be shocked if their top of the line can get close (single digits) in fps to this in most games. The uplift in rasterization alone is insane.

    • @roccociccone597 · A year ago · +8

      @@surft Well, leaks suggest they do match it, sometimes even beat it. I do expect RDNA 3 to be very, very good.

    • @TwoSevenX · A year ago · +4

      @@surft AMD and Nvidia both expect AMD to *win* in pure raster performance by 5-20% with the 7950XT

    • @jakestocker4854 · A year ago · +27

      @@roccociccone597 The leaks always say that though. Literally for the last 3 generations there have always been leaks that AMD has something huge coming, and then they release some solid cards but nothing like the leaks hype up.

    • @roccociccone597 · A year ago · +7

      @@jakestocker4854 Well, the RDNA 2 leaks were pretty accurate, and they mostly match Nvidia. So I'm optimistic AMD will manage to match or even beat Nvidia in raster and get very close in ray tracing. And I hope they won't be this expensive

  • @TheZigK · A year ago · +94

    Couldn't wait any more. I had money to upgrade my 1050 Ti when the market was inflated. Finally snatched a 6800 XT for $550 and have 0 regrets. Will still be watching to see how things evolve

    • @dennisnicholson2466 · A year ago · +4

      Last week I nabbed the 6800 XT after seeing a mod video that pushes this card to comfortably run like a 3090 Ti. I've been having some power draw issues; I thought I'd be safe using the same modular PSU that ran my dual 1080s.

    • @TheZigK · A year ago · +3

      @@dennisnicholson2466 I saw an article about the same thing! Seems like it requires a custom cooling setup to see real improvements though, and probably doesn't work with every game. If I find myself running up against the card's limits I might consider it

    • @ElectricityTaster · A year ago · +3

      good old 1050ti

    • @HansBelphegor · A year ago · +1

      Same but msi 6950xt

    • @user-ck8ec7pj1l · A year ago · +1

      I got that STRIX 3090 White edition he is showing for $999 2 weeks ago. Been eyeing it for months.

  • @Zeniph00 · A year ago · +123

    Impressive uplift, but happy to wait and see what AMD has. MCM tech has me very interested in what will come.

    • @Malc180s · A year ago · +4

      AMD has fuck all. Buy what you want now, or spend your life waiting

    • @georgejones5019 · A year ago · +21

      @@Malc180s Lmao. Probably a UserBenchmark fanboy.
      AMD has 3D V-Cache. The 5800X3D's tech will only improve with age, and they've stated it's not just applicable to CPUs, but GPUs as well.

    • @HeloisGevit · A year ago · +4

      @@georgejones5019 Is it going to improve their shocking ray tracing performance?

    • @AGuy-vq9qp · A year ago · +3

      @@georgejones5019 That sounds false. GPUs are a lot less latency-sensitive than CPUs are.

    • @mutley69 · A year ago · +7

      @@HeloisGevit Ray tracing is just another tactic to make you buy their next latest-and-greatest cards, and the last 2 gens of cards have been bad for ray tracing. This 4090 is the first card that can actually manage it properly

  • @luckyspec2274 · A year ago · +6

    17:00 Hi, I am from the future, JayzTwoCents was right about the cable bend issues

  • @connor040606 · A year ago · +52

    Thanks for the tips Jay with the adapter cable. Pretty sure you just saved a ton of people RMA headaches!

  • @bernds6587 · A year ago · +119

    Roman (der8auer) made an interesting point about the power target:
    setting it to 70% allows the card to run cooler, with its power reduced to about 300 W, while still delivering about 95% of its graphical performance. The power-to-fps curve definitely looks like the card runs overclocked by default

    • @R1SKbreaker · A year ago · +3

      Oh this is good to know! I have a 750 watt power supply, and I think I am going to splurge for a 4090 eventually, coming from a 2070 Super. I'd really rather not upgrade my power supply, and if I can hit 95% graphical power with my current power, then I am more than happy. 4090 is OP as is; I can deal with a 5% graphical power reduction.

    • @bobbythomas6520 · A year ago

      @@R1SKbreaker (coming from a person who owned a 750 watt power supply)

    • @ryze9153 · A year ago

      ​@@R1SKbreaker I would get a 3070 now and then wait for 50 series. That's my suggestion.

    • @R1SKbreaker · A year ago

      @@ryze9153 I'm actually just going to stay with the 2070 Super until the 5000 series. After I upgraded my CPU, I'm a lot more content with my current setup.

    • @ryze9153 · A year ago

      @@R1SKbreaker I'm hoping to have a 3060 ti or somethin like that pretty soon.
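
    The power-target observation at the top of this thread can be turned into a toy curve. The power-law model and its exponent are assumptions fitted to the single data point der8auer reported (70% power target, ~95% performance), not measurements:

    ```python
    import math

    # Toy diminishing-returns model: relative_perf = power_target ** alpha,
    # with alpha fitted so a 70% target yields 95% performance (der8auer's number).
    alpha = math.log(0.95) / math.log(0.70)  # ~0.14

    def relative_perf(power_target):
        """power_target as a fraction of stock (1.0 = 100%)."""
        return power_target ** alpha

    print(round(450 * 0.70))              # 315 -> watts at a 70% target on a 450 W card
    print(round(relative_perf(0.70), 2))  # 0.95 by construction
    print(round(relative_perf(0.50), 2))  # ~0.91: half the power, ~90% of the fps
    ```

    The 50% row is a prediction of this toy model, but it matches the "50% power, ~90% performance" reports elsewhere in these comments, which is what a strongly diminishing-returns curve looks like.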

  • @markz4467 · A year ago · +116

    You should start adding power consumption to the fps graphs. Something like - 158/220, where 158 stands for the fps count and 220 represents power consumption in watts.

    • @nervonabliss · A year ago · +3

      Or just next to the cards name

    • @ramongossler1726 · A year ago · +1

      No, power consumption is just an AMD fanboy argument, just like "no one needs RT anyway". If you are concerned about power consumption, buy a laptop

    • @GM-xk1nw · A year ago · +30

      @@ramongossler1726 Power consumption is a thing people who pay bills care about, you know people with responsibilities.

    • @GamerErman2001 · A year ago · +16

      @@ramongossler1726 Power consumption raises your power bill, affects what power supply you need and heats up your PC as well as your room. Also although this is a small matter for a single user several people using large amounts of power to run their computer creates pollution and can also cause black/brown outs.

    • @excellentswordfight8215 · A year ago · +8

      Bills aside, when using something like PCAT so that you actually get a good measure of GPU power draw, it would also be a good way of seeing how system-bottlenecked the card is.
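
    The "fps/watts" label suggested at the top of this thread is easy to generate from benchmark rows. A minimal sketch; only the 158/220 pair comes from the comment above, the card names and other numbers are placeholders:

    ```python
    # Sketch of the proposed chart label "<fps>/<watts>", plus fps-per-watt,
    # which makes the efficiency comparison explicit.

    results = [
        ("Card A", 158, 220),  # the example pair from the comment above
        ("Card B", 120, 300),  # placeholder row
    ]

    for name, fps, watts in results:
        label = f"{fps}/{watts}"
        print(f"{name}: {label}  ({fps / watts:.2f} fps per watt)")
    ```

    With these numbers the output reads "Card A: 158/220  (0.72 fps per watt)", so the efficiency gap is visible at a glance even when raw fps favors the hungrier card.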

  • @Bi9Clapper · A year ago · +28

    I'm sticking with my EVGA 3080 Ti; that baby will get me through until the 50 series easy. Honestly, by that point I'm hoping Intel cards are good. I'd love to try a good top-notch Intel card

    • @SuperSavageSpirit · A year ago · +3

      Intel's cards won't be good, if they're even still making them by that point and haven't given up

    • @christopherlawrence4191 · A year ago · +1

      If I don't have a card at all (my 1050 Ti died on me a year ago), should I wait or keep the 4090 I ordered?

    • @brandonstein87 · A year ago · +1

      ​@christopherlawrence4191 how is the card? I just got one

  • @rallias1 · A year ago · +51

    So, someone else showed how they were able to cut power down to like 50% and still get like 90% of the performance. I kinda want to see your take on that, and maybe in comparison to a 30-series or team red card with the same power limits.

    • @paritee6488 · A year ago

      tech ciiy!

    • @emich34 · A year ago · +1

      @@paritee6488 derBauer too - running at 60% power target was like a 2% fps drop in most titles

    • @Trisiton · A year ago · +1

      Of course, you get diminishing returns after a certain point. This is why there are laptop 3080tis that run at 180W and still get like 70% of the performance of a desktop 3080ti.

    • @sonicfire9000 · A year ago

      @@Trisiton those sound very interesting but I just have one question in my mind tbh: how are the batteries? Just asking so I don't accidentally run into a razer or alienware situation

    • @prabhsaini1 · A year ago

      @@Trisiton 180 watts? on a laptop? those things must only run for a solid 28 seconds

  • @commitselfdeletus9070 · A year ago · +2

    Jay makes the only ads that I don’t skip

  • @PixelShade · A year ago · +10

    I'm at a point where I am totally happy with the 6600 XT's performance... at least for gaming at 1440p. I kind of feel like games need to become more demanding to justify an upgrade. The 4090 is impressive, and I would totally buy it if I worked with 3D modelling/rendering professionally... but let's not kid ourselves: this is not really a "consumer" product, but rather Nvidia's professional-grade hardware made available to normal consumers.

  • @NiveusLuxLucis · A year ago · +81

    Thanks for talking about the connector, had the same concerns and nvidia's response is kind of unbelievable.

    • @earthtaurus5515 · A year ago · +10

      Nvidia is too damn egotistic, and they are fully aware of frying 12VHPWR connectors as well as the connector's very low durability. They just don't give a damn about anyone except their bottom line. So if you fry your PC or break their connector off the PCB, they think everyone will go out and buy another 4090. The only way they will do anything about it is if there is massive public backlash, especially if people start frying their PCs en masse due to the low durability of the 12VHPWR 4-way adapter.

    • @Cruor34 · A year ago · +2

      This one isn't a big concern to me... I buy a top-end GPU every 4 years or so unless something breaks (for example, my 980 Ti broke, MSI didn't have any left so they sent me a check, and I got a 1080 Ti). I build the PC and I don't ever touch it again, really; then I build a new PC years later. Who the hell is hooking and unhooking the cables constantly? On my list of concerns this is really low, unless I am misunderstanding; it's only an issue if you hook and unhook it 30 times.

    • @hashishiriya
      @hashishiriya Рік тому +6

      @@Cruor34 cool

    • @flopsone
      @flopsone Рік тому +3

      A 90-degree connector would make things a lot better/easier. Surely Nvidia or an AIB can find a little space and have the connector come directly out of the bottom of the PCB, which would direct the cable straight down to where most cases have the power supply.

    • @jordanmills7327
      @jordanmills7327 Рік тому +3

      @@Cruor34 That 30 times was under "ideal conditions", which means the cable being inserted straight in with no wiggle and without bending. It's almost certainly a lot less than 30 times if you bend or insert the cable in non-ideal conditions. Also, keeping it at a tight angle might wear the cable/port down over time, and considering this connector could be a serious fire hazard, that is completely unacceptable.

  • @hdz77
    @hdz77 Рік тому +9

    The only thing I got out of this benchmark is that the 4090's price is not justifiable. The 6950XT and the 3090 are more than enough for price to performance; if anything, wait and see what the RX 7000 series GPUs bring.

    • @jondonnelly4831
      @jondonnelly4831 Рік тому

      It is justifiable. Take the performance increase per dollar into account. The 4090 costs more, but it gets a lot more fps. If you have a 1440p 240Hz panel and a high-end CPU you will feel that increase. The 3090 will feel S L O W in comparison.

    • @LordLentils
      @LordLentils Рік тому +2

      @@jondonnelly4831 Old flagship GPUs were the beasts of their time yet the price increase wasn't as tremendous over a single generational leap.

  • @carlwillows
    @carlwillows Рік тому +16

    It will be interesting to see the performances of the 4080's with only 47% and 60% of the 4090's cores respectively.

    • @carlwillows
      @carlwillows Рік тому +3

      @@taekwoncrawfish9418 I don't think it will be quite so linear, but we shall see.

    • @ChiquitaSpeaks
      @ChiquitaSpeaks Рік тому

      @@carlwillows Benchmarks dropped, it's pretty linear

    • @ChiquitaSpeaks
      @ChiquitaSpeaks Рік тому

      @@taekwoncrawfish9418 same architecture lol right everything

  • @CR500R
    @CR500R Рік тому +78

    Thank you for testing the PowerColor Red Devil 6950XT! It makes me feel better about my purchase. It actually held its own against the 3090 & 3090Ti on a lot of benchmarks. Not many people test the Red Devil 6950XT. It's not the most popular of the 6950XT cards.

    • @KingZeusCLE
      @KingZeusCLE Рік тому +2

      Maybe not, but any of the other 6950 XT numbers still apply. They likely all perform within 1-2% of each other.
      Bitchin' card though. Even with the 4090 released.

    • @vespermoirai975
      @vespermoirai975 Рік тому +1

      Red Devil and XFX/Sapphire have always been my favorites when I've had AMD cards. Red Devil seems to be what EVGA was to Nvidia, with XFX/Sapphire coming close.

    • @ArtisChronicles
      @ArtisChronicles Рік тому +1

      @@vespermoirai975 idk if I'd call the Red Devils that, as the Red Devil RX 480 actually cut a lot of corners. It mostly applied to overclockers, but I'd still refrain from running FurMark on them unless you want to risk damaging them. Old card now, but still a relevant issue.

    • @vespermoirai975
      @vespermoirai975 Рік тому +1

      @@ArtisChronicles If I remember right there was a thermal pad fix for that. Could be thinking of the R9 290 though

    • @farmeunit
      @farmeunit Рік тому +2

      @@ArtisChronicles I had a Red Devil 580 and I loved it. That was their top-tier card, with the Red Dragon below it, and then the other models. I wanted a Red Devil 6800XT but prices were ridiculous. They finally got cheaper, but I got a 6900XT Gaming Z for less.

  • @vetsus3518
    @vetsus3518 Рік тому +145

    I'm with you… a little confused why they didn't create a 90-degree adapter if that was the temporary solution until the new PSUs are released… that would have at least allowed you to fit it within a 'normal' case. Also, I love the GPU mounted on the wall in the back. At least that's how it appeared to be. Looks like one of your custom printed GPU stands mounted to the wall with the card on it. It's a cool look. Try putting up some motherboards too…. I mean, since you're just a little tech channel. Lol

    • @Ben-Rogue
      @Ben-Rogue Рік тому +9

      That cable solution is just lazy. A 90-degree adapter with about 20cm of cable before they split it would be a lot cleaner and easier for customers to fit into cases

    • @ChaseIrons
      @ChaseIrons Рік тому +3

      Someone will make an adapter eventually. For my 3090’s I got u-turn adapters that have been excellent since launch. No bends needed at all

    • @joee7452
      @joee7452 Рік тому +1

      I am not a betting man, but what are the chances they didn't create one so that the aftermarket would, and could then charge $79 or $99 for it as an extra part? Remember they put caps on prices, so that would be a way to give, say, Asus an easy way to make an extra $50 or $70 on the 40 series that technically doesn't count toward the price of the GPU itself.

    • @MrMoon-hy6pn
      @MrMoon-hy6pn Рік тому

      @@joee7452 Since nvidia seems to treat their AIBs with utter contempt as shown with evga leaving the GPU market entirely because of nvidia, I somehow doubt that's the reason.

    • @lUnderdogl
      @lUnderdogl Рік тому

      I bet they planned to, but sometimes it's too late to implement.

  • @dereksinkro1961
    @dereksinkro1961 Рік тому +10

    5900x and 3080ti is a nice spot to be for 1440 gaming, will enjoy my rig and watch this all play out.

    • @duohere3981
      @duohere3981 Рік тому

      @reality8793 he thinks he’s smart 😂

  • @adamsmith4953
    @adamsmith4953 Рік тому +76

    Looks pretty impressive, I can't wait to see what AMD comes out with

  • @mcflu
    @mcflu Рік тому +40

    On one hand, seeing the current performance of the RX 6950XT compared to the 3000 series, I'm super impressed and looking forward to what RDNA3 brings next month. On the other hand, all of these cards are too much for me lol and I'm happy with my 3060ti 😁

    • @chexlemeneux8790
      @chexlemeneux8790 Рік тому +4

      Personally I'm totally fine playing in 1080p, and the 3060ti is more than capable of playing every game I own on ultra settings with at least 100fps. I got it in an $1800 CAD pre-built PC during the chip shortage, while people were paying that much for a 3080 by itself. I felt like I made out like a bandit and still feel super good about that purchase.

    • @craiglortie8483
      @craiglortie8483 Рік тому +2

      That's why I went with a 6700 XT for my upgrade, running 60fps on a 4K monitor. I watch YouTube and streaming services more than I play now, so it was the best choice for me.

    • @cerebelul
      @cerebelul Рік тому +1

      More impressive is the fact that it comes very close to the 4090 in some games at 2K/4K.

    • @FlotaMau
      @FlotaMau Рік тому

      @@craiglortie8483 Same. I mean, 4K at 60 fps for single player is really fine. I only envy more fps for competitive, but playing comp at 4K is not even a thing.

    • @craiglortie8483
      @craiglortie8483 Рік тому

      @@FlotaMau I play War Thunder fine with my Philips monitor. The settings I turn down are the same ones I would turn down to improve gameplay. I stay locked between 55-60 fps all game. I lose a few ms from the monitor, but nothing I don't already lose from age.

  • @Daniel218lb
    @Daniel218lb Рік тому +5

    You know, I'm still very happy with my RTX 2080 Ti.

  • @__last
    @__last Рік тому +2

    Hopefully I can get a 3090 for much cheaper now, since no game is gonna need a 4090 for at least 3-4 years.

  • @kratonos47
    @kratonos47 Рік тому +45

    The iFixit ad is the only one you don't want to skip

  • @BrettWidner
    @BrettWidner Рік тому +194

    This is actually very interesting. A lot of your numbers for the 4090 are INCREDIBLY different from LTT's. I'm actually quite perplexed by it, their 4090 was getting 2X the fps of your 4090 test on what looks like the exact same settings in Cyberpunk. 4K, RT On, Ultra Preset, DLSS Off.
    EDIT: Just want to clarify, I'm not accusing either reviewer of anything. Merely pointing out the vast differences, could be related to their test bench, could not be. Something one might have to think about if they're looking to buy this card.
    UPDATE: LTT ran the card with FidelityFX on by mistake. JayzTwoCents' numbers are accurate.

    • @Pleasant_exe
      @Pleasant_exe Рік тому +17

      And gamers nexus

    • @AsaMitakasHusband
      @AsaMitakasHusband Рік тому +7

      Yea i noticed that too lol

    • @marcelosoares7148
      @marcelosoares7148 Рік тому +30

      Hardware Unboxed too got different numbers but the strangest one was the 6950XT getting only 28fps in CP2077 on the LTT Benchmark

    • @B8con8tor
      @B8con8tor Рік тому +8

      Everyone's numbers will not match. It will depend on room temperature, open/closed case, CPU, memory, and so on.

    • @BrettWidner
      @BrettWidner Рік тому +46

      @@B8con8tor I get they're on different benches, but LTT's 4090 getting 2X the performance of Jay's 4090 on what, from what I can see, are the same settings in Cyberpunk?

  • @dunastrig1889
    @dunastrig1889 Рік тому +21

    Thanks! Raw performance numbers are always what I look for first. DLSS and FSR are nice options to have if you can't push the fps, but I want to see bare performance with all the bells and whistles. Now I'll look for a 12VHPWR 90° adapter...

    • @andytroschke2036
      @andytroschke2036 Рік тому

      Better to wait for ATX 3.0 PSUs to release instead of an adapter.

    • @antonchigurh8343
      @antonchigurh8343 Рік тому

      @@andytroschke2036 They are already available

    • @andytroschke2036
      @andytroschke2036 Рік тому

      @@antonchigurh8343 Where? All I can find is ATX 2.4 with a 12VHPWR connector. ATX 3.0 has several other additions

  • @faucheur06400
    @faucheur06400 Рік тому

    There are household power cables, SATA cables, USB cables, HDMI cables, jack cables, IDE cables and (if my memory serves) even Molex cables which have a 90° angle made of hard plastic. I can't believe that no one in the graphics card industry thought it would be a better solution than "yeah, bend those cables".
    The cables for my GTX 1080 have been touching the glass panel for six years already.

  • @Angsaar011
    @Angsaar011 Рік тому +66

    I got myself a 3070 Ti. Simply because that was what was available at the time of the shortage. I'm very curious to see what AMD brings to the party. Here's hoping it will be something exceptional.

    • @Its_Me_Wheelz
      @Its_Me_Wheelz Рік тому +4

      I nailed a great deal on a 3080 last month, and I'm all sorts of happy. It will last me a long time. Most likely somewhere around the 5000 to 6000 cards.

    • @Angsaar011
      @Angsaar011 Рік тому +4

      @@Its_Me_Wheelz For sure. I had my 980 ti up until recently. The 30-series are going to last a long time.

    • @Its_Me_Wheelz
      @Its_Me_Wheelz Рік тому +3

      @@Angsaar011 In all honesty, I was running a 2060 Super, and it ran everything I play with no problems. Mainly ESO, HLL, COD, and a few others of that kind of game. I had no intention of upgrading. But like I said, I got a great deal on the 3080, and so here I am.

    • @yyorophff786
      @yyorophff786 Рік тому

      You should have waited for this card.

    • @josephj6521
      @josephj6521 Рік тому

      @@yyorophff786 not at these prices.

  • @sidra_games4551
    @sidra_games4551 Рік тому +23

    There is so much happening in such a short timeframe that it just makes sense to wait a few months before deciding on a new build. How are the Intel chips gonna perform versus the new AMD ones? How will the new AMD cards perform? How will the lesser (70/80) Nvidia cards compare with this one? And keep in mind we are still waiting on PCIe 5.0 M.2 SSDs. It's new-build time for me as my last one is 5 years old. But I am gonna let the dust settle and once JAN-FEB rolls around figure out what's best.

    • @Silver1080P
      @Silver1080P Рік тому

      I've been waiting for what's next across the board for 3 years now; whether it's due to cost or lack of power, most of the things I've been interested in have been pushed to the side. I have a 3080 12GB and an i7 8700K so I'm happy enough for now. Will be looking at Intel's next CPU though

    • @jamesc3953
      @jamesc3953 Рік тому

      @@Silver1080P Do you find your 8700k bottlenecks your 3080? what kind of resolution do you play at?

  • @greenawayr08
    @greenawayr08 Рік тому +124

    Jay, have you considered including VR in your benchmarks? It's an ever-growing segment and really pushes performance, with variance from card to card. Just a suggestion.
    Thanks for the great content.

    • @pumpkineater23
      @pumpkineater23 Рік тому +14

      Agreed. Even mid-range GPUs are way more than good enough for gaming on a monitor. VR is what really pushes a card now. Flat-screen gaming is yesterday's tech. How does the 4090 improve MSFS in VR.. that's a more 'worthy opponent'.

    • @blinkingred
      @blinkingred Рік тому +7

      VR is growing? Last I checked it was stagnant with declining interest and sales.

    • @3lbios
      @3lbios Рік тому +3

      I'd love to see a 3090 vs 4090 comparison on a Reverb G2 headset in the most popular VR games.

    • @lePoMo
      @lePoMo Рік тому

      Has the VR landscape changed?
      VR requires a fluid, high framerate so much that no (sane) game developer takes any risks. Or has this changed?
      When I bought into VR (CV1), every game targeted the RX470-RX480 (GTX980/GTX1060). I somewhat got stuck on Beat Saber so didn't follow the evolution since, but to my memory, the only games back then to break with this were flatscreen ports to VR.

    • @privateportall
      @privateportall Рік тому +3

      Vr remains gimmicky

  • @vedomedo
    @vedomedo Рік тому +2

    I'm not gonna get the 4090, and I would probably have gotten the 4080 16GB if the pricing here in Norway was more in line with the $ pricing. However, here you have to pay the 4090 price for the 4080 16GB, and the 4090 costs like $2000-$2500, which is simply silly. Even if I sold my 3080, the difference is still what I would expect a 4080 16GB to cost in total. Who knows, maybe the 50xx will be more in line with "normal" pricing, or maybe the 40xx cards will go down in price after a while.

  • @damienlahoz
    @damienlahoz Рік тому +9

    Why does it feel like the review community is trying to make the 90s a mainstream product? I don't recall so much attention being afforded the Titans, which this essentially is. It's just weird. Every channel has dedicated significant time to covering this showpiece, and what, 1% of PC enthusiasts will even bother trying to buy one regardless of how it performs. It's obvious that Nvidia is trying to normalize certain price points, but it doesn't mean people have to play along.

    • @DanielFrost79
      @DanielFrost79 Рік тому +1

      I personally hope and wish people would NOT play along. These prices are fuck**g insane and ridiculous.

    • @TheKain202
      @TheKain202 Рік тому +1

      Because it's the only one they're allowed to make content about until NV starts shipping the lower models? And besides, as ridiculous as it sounds, the 90s have the best value.
      The 4080 and the 40""""80"""" had such a ridiculous price hike from last gen, or any gen before for that matter, that it's really hard to justify buying them.

    • @damienlahoz
      @damienlahoz Рік тому +1

      @@TheKain202 Value? You could say a Ferrari is great value compared to a McLaren, and you'd be right, but it's also astronomically expensive and outside the price point of 99.9% of consumers.

    • @Ferretsnarf
      @Ferretsnarf Рік тому +3

      The power draw is just absolutely insane as well, and is an enormous increase over the previous gen. We're essentially seeing the performance scale with the power draw... which isn't really that impressive. 600 watts is off the charts. Honestly, we're not far off from having to dedicate an entire circuit in your house to these PCs that would be running hardware like this.
      Between the price, the power, and the ludicrous size of these things, when is enough enough? Nvidia, come see me when you make a better card by making it better, not by getting the performance out of a proportional gain in both size and power draw.

    • @damienlahoz
      @damienlahoz Рік тому +1

      @@DanielFrost79 When the rubber meets the road, no, 99% of gamers aren't playing along. $2k is still $2k to damn near everyone. A lot of these people saying they are buying one have 3060s and ain't buying sht

  • @shadowjulien5
    @shadowjulien5 Рік тому +43

    Definitely waiting on rdna3 to see what they’ve got to offer. Then I finally want to get around to an itx build. Im also waiting to see what raptor lake has to offer because with the price of am5 boards I might actually end up going Intel after 4 zen systems lol

    • @ZackSNetwork
      @ZackSNetwork Рік тому

      Why on GPU’s that powerful and huge they need space and proper cooling? As hell as a high wattage PSU. SFF builds are good for low to mid range PC’s.

    • @Adonis513
      @Adonis513 Рік тому +3

      Never understood the point of putting high-wattage cards in ITX builds, very stupid.

    • @shadowjulien5
      @shadowjulien5 Рік тому +4

      Tbh the engineering challenge of getting that much power in a small space seems fun and I’ve had a mid tower for like idk 15 years lol I just wanna try something different

    • @Adonis513
      @Adonis513 Рік тому

      @@shadowjulien5 You're buying cutting-edge hardware just so it can be throttled in an ITX setup; unless you are doing a custom loop I see no point.

    • @ghomerhust
      @ghomerhust Рік тому +1

      @@Adonis513 That's the point of the challenge they talked about. If they can get it to run properly on air, that's a win. If they can fit cooling in that tiny box, it's a win. For some people, just chucking big-ass parts with big-ass numbers in a big-ass case with big-ass airflow, well, it's boring as hell, regardless of performance.

  • @09juilliardbayan
    @09juilliardbayan Рік тому +39

    Considering my budget, as much as I dream of having a 40 series, I see this as the perfect opportunity to buy a 30 series, which I have been waiting for for a looong time. It's all so exciting

    • @exq884
      @exq884 Рік тому +1

      same - looking at a 3090

    • @bloodstalkerkarth348
      @bloodstalkerkarth348 Рік тому

      @@exq884 wait to see if the 4080 is better or the new amd card

    • @zerogiantsquid
      @zerogiantsquid Рік тому

      I'm in the same boat. I saw a 3090 for $950 on newegg two weeks ago and sniped it. I'm kinda sad that the 4090 is so much better, but at the same time I was super excited to finally get my hands on a 30 series. Still a massive leap from my previous card.

    • @chillchinna4164
      @chillchinna4164 Рік тому +1

      @@zerogiantsquid Life is about being happy with what you are able to get, rather than being upset about not obtaining perfection.

    • @beH3uH
      @beH3uH Рік тому +1

      Just bought an RX 6900 XT for 800 euro lol, prices are good.

  • @earlsb23
    @earlsb23 Рік тому +1

    @23:20 - Frazier never knocked out Ali in the first fight. He *DID* knock him down in the 15th round (only knock down of the fight) and did end up winning.
    Love ya, Jay! 🤓

  • @OzzyInSpace
    @OzzyInSpace Рік тому +10

    With the placement of that weak as heck plug, I'll be holding off of the 40 series. I'll be perfectly happy with my 3080 TI for a while.

  • @HiCZoK
    @HiCZoK Рік тому +25

    The angled plugs on the 3080/90 FE were perfectly positioned.
    I still think the 3080 FE was a perfect card. Great price at $700, great performance, great build quality.

    • @monikaw1179
      @monikaw1179 Рік тому +4

      Except for the crappy thermal pads they used on the VRAM leading to ludicrously and unnecessarily high temps.

    • @HiCZoK
      @HiCZoK Рік тому +2

      @@monikaw1179 2 years going strong and I can't complain.
      I don't care what pads are used. The card runs great

    • @pumpkineater23
      @pumpkineater23 Рік тому +1

      And the 3080/90 are a more compatible size!

    • @nickwallette6201
      @nickwallette6201 Рік тому +5

      The only problem I had with the 3080 FE was that it didn't exist. I waited and waited and waited, with money in the bank to buy one. There were no cards, and no water blocks being made because _there were no cards._
      Now, I can go pick up a 3080 Ti if I want to, but I'm going to wait and see what the 4080 looks like. I want a reasonable blend of performance and efficiency. I've got no interest in reinforcing my desk to hold up a graphics card, and wiring a three-phase plug to power it.

    • @meurer13daniel
      @meurer13daniel Рік тому +2

      The 3070, 80 and 90 were perfect cards at MSRP at launch. Great presentation; I didn't see any controversy at the time.
      I think they looked at past launches and did the opposite of what worked. The 4000 series is powerful, but it's so atrocious at the same time.

  • @NostalgicVibes
    @NostalgicVibes Рік тому +9

    The 4090 really makes me want to do a full custom loop because of where the connector is located and because of how beefy they are. Would be nice to see a full custom loop once people start doing some builds.

    • @MrInstinctGamer
      @MrInstinctGamer Рік тому

      I can't wait to see one of these things draw like 400W of power while the engineering of the cooler keeps it cool at 80, so I wonder if a custom loop would really help. Def gonna need its own 360 rad probably 🤣

    • @linsetv
      @linsetv Рік тому +3

      @@MrInstinctGamer Well, according to der8auer, temps aren't even that bad.
      He also tested other power limits, like 70%, where the card draws 30% less power for only 5% less performance...
      With temps in the 60s while gaming... (Founders Edition)

    • @NostalgicVibes
      @NostalgicVibes Рік тому +1

      @@linsetvI agree. The cooling I don’t think will be so much of a problem from the testing that I’ve seen and the GamersNexus video breaking down the vapor chamber. The power draw is relatively close to a 3090 ti. The cooler is SO DAMN BEEFY as well that it should be able to handle it. My only real concerns are...
      1. The connector slot on the PCB becoming loose over time and eventually breaking off. (What Jay mentioned)
      2. You’re forced to use a larger case unless you do a custom loop
      3. THE PRICE (I have a 3080 ti so is it REALLY worth the upgrade?)

    • @linsetv
      @linsetv Рік тому

      @@NostalgicVibes Case size will probably be the most problematic :D
      And ofc the price but i guess this time it will get lower a bit faster

  • @dschlessman
    @dschlessman Рік тому +3

    Damn you called all these power connector issues. Good job!

  • @Halfrightfox
    @Halfrightfox Рік тому +13

    I've been on team green for over a decade, and this is the first time I'm interested to see what AMD will launch

  • @HazelnutColossus
    @HazelnutColossus Рік тому +12

    Given the ginormous size of the air cooler I think the water cooled partner boards are going to be the way to go for people interested in the 4090

    • @jonahhekmatyar
      @jonahhekmatyar Рік тому +2

      Lol, I will have to get an AIO version because I have other pcie cards that'll get in the way of a 3 slot card

  • @nicoarcenas
    @nicoarcenas Рік тому +4

    Couldn't help cheering for the 6950 XT while watching the benchmarks

  • @-zerocool-
    @-zerocool- Рік тому +1

    I wish AMD would bring back an ATI special edition, both for nostalgia and so it could beat every Nvidia 4000 card; that would be real nice.

  • @user-funnymadp
    @user-funnymadp Рік тому +2

    Remember, things being expensive is your problem, not the product's problem. Where you see overpriced, other people see it as worth the cost.

  • @TMacGamer
    @TMacGamer Рік тому +17

    I am probably going to sit this one out for now. I'm happy with the RTX 3070 that I have right now, and after seeing NVIDIA label an RTX 4070 as a lower-end 4080, the huge price increase isn't worth it for me. Maybe down the line when the rest of the 40 series comes out, if the prices are decent. But as of right now I think I will just watch and enjoy the competition playing out and see what happens next.

    • @adamahmadzai2357
      @adamahmadzai2357 Рік тому +1

      I don't think it's worth upgrading a 3070 for at least another two gens

    • @malazan6004
      @malazan6004 Рік тому +2

      @@adamahmadzai2357 Depends what resolution you play at, honestly. For 1080p and 1440p, no, not really, but for 4K, yeah, it's a big deal. Then again, DLSS makes 4K gaming on a 3070 much better

    • @beemrmem3
      @beemrmem3 Рік тому

      @@malazan6004 The 3070 has the power for 4K DLSS, but not the VRAM. I'm finding that out trying to run my HP Reverb G2

    • @beemrmem3
      @beemrmem3 Рік тому +1

      I’m sitting this out too. I was hoping for a 4070 12gb with 3090 perf for $550-600. Let’s be real, the 4070 is $900 and is called the 4080 12gb. This means the 4070 they actually release won’t be much faster than a 3070.

    • @LunchBXcrue
      @LunchBXcrue Рік тому +1

      Yeah, I have a 3080 and I'm just gonna skip this gen. Not worth the upgrade; I don't even use the 3080's full potential. Nvidia should have waited for games to catch up, because even brand new releases aren't utilizing these cards fully unless you have a stupidly expensive setup, and with motherboards becoming as expensive as CPUs, the money you would otherwise have saved from falling solid-state and DDR4 prices is wasted now that DDR5 is mainstream on most new platforms. I'm happy with my 3700X and a 3080; I only play at 1440p, so I'm good, thanks. That price drop pisses me off though: I could have gotten by on my 1080 Ti if I had known they would drop prices by that much.

  • @FilipoLipos
    @FilipoLipos Рік тому +44

    I am honestly impressed with 6950XT holding in top tier like this!

    • @andyastrand
      @andyastrand Рік тому +4

      As long as you don't want to turn on RT everything from the 6800XT up is a seriously good and relatively cheap GPU.

    • @GodisGood941
      @GodisGood941 Рік тому +1

      The 7000 series is about to shock everyone, I'm telling you

    • @Brad25King
      @Brad25King Рік тому

      Especially when I take EU prices in consideration..

    • @marcust478
      @marcust478 Рік тому

      @@andyastrand I was thinking about that; in my region I found one with good pricing (Sapphire Pulse 6800 XT).
      But IDK... I wanted to wait for the 7000 series. I'm really "holding" my wallet because I seriously want to buy;
      I just don't know if I should.

    • @grizzzlyadamz
      @grizzzlyadamz Рік тому +2

      I got my PowerColor Red Devil 6950XT as an Amazon deal for $800

  • @stueyxd
    @stueyxd Рік тому +12

    Just looking at the sheer size of the card, I think a lot of us would need a bigger case. And with the concern you're sharing about even the larger cases requiring such a stressed bend, I think I will give it a miss until aftermarket/third-party manufacturers correct this in our favour.

    • @Toutvids
      @Toutvids Рік тому +1

      Exactly, my full tower Thermaltake Core X71 wouldn't even shut the glass side panel with one of these mounted. If I went vertical mount, the card would be starved for air shoved flush with the glass. No thought about current cases on the market was given when making these GPUs.

    • @yurimodin7333
      @yurimodin7333 Рік тому

      just cut a hole in the side like a supercharger sticking out of a muscle car hood

    • @PDXCustomPCS
      @PDXCustomPCS Рік тому

      Imagine this in an O11D Mini.. 😅

    • @cole7914
      @cole7914 Рік тому

      Cost of the card, cost of a new PSU, and cost increase of electricity to run this monster. Nah… I’m good.

  • @knifeuu777
    @knifeuu777 Рік тому +11

    As far as the connector itself, it looks like someone will need to develop L-shaped adapters to safely route the cable.

    • @aussiebattler96
      @aussiebattler96 Рік тому +1

      Those will finally be viable and not useless lmao

  • @adamw9764
    @adamw9764 Рік тому +21

    I feel like we are hitting a point with graphics cards where we will need an entire external case and power supply just for the card. Great review!

    • @bygoneera1080
      @bygoneera1080 Рік тому

      We'll only get there if people stupidly keep buying 'bad' (meaning unreasonable or absurd) products.

    • @KaiSoDaM
      @KaiSoDaM Рік тому +1

      Oh yeah. Not to mention how dumping 600W of heat inside the PC case is a bad idea too.
      I'm pretty sure we will start using external GPUs like some gaming laptops did.

    • @ultravisitors
      @ultravisitors Рік тому

      That was true for the 3090 Ti; this is much more efficient and doesn't have those crazy 2-2.5X transient spikes the 3090 Ti had, and you can target 60% power and only lose 10% of the performance. If you want, you can even use it with only 3 pins attached on a much lower wattage PSU.
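
    The power-limit claim above is easy to sanity-check with back-of-the-envelope math. A minimal sketch, assuming a ~450 W stock draw (the wattage is an assumption, not a measured figure) and the 60% power / 90% performance numbers quoted in this thread:

    ```python
    # Illustrative perf-per-watt math using the figures quoted in the thread.
    # Assumed numbers: ~450 W at the stock power limit; at a 60% power
    # target the card keeps ~90% of its stock performance.
    full_power_w = 450
    limited_power_w = full_power_w * 0.60   # 270 W at the 60% target
    relative_perf = 0.90                    # ~90% of stock fps

    perf_per_watt_stock = 1.0 / full_power_w
    perf_per_watt_limited = relative_perf / limited_power_w
    improvement = perf_per_watt_limited / perf_per_watt_stock

    print(f"{limited_power_w:.0f} W, perf/W improvement: {improvement:.2f}x")
    ```

    At these assumed numbers, capping the card at 60% power improves performance per watt by about 1.5x, which is why undervolting/power-limiting threads keep coming up for this card.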

  • @drizzlydr4ke
    @drizzlydr4ke Рік тому +7

    EVGA really was smart putting the cables on the side; on my Lian Li Dynamic the cables were touching the glass (with the frontal cable plug), but having them on the side gives better room for the cables, even if you have a fan there. Nvidia really needs to put it on the side so it fits most cases with no issue

  • @joaquinneis
    @joaquinneis Рік тому +5

    Idk if it's just me, or if flagship GPUs don't make sense anymore. I'm a "former" flagship buyer; I had the 980 Ti, 1080 Ti and 2080 Ti. I see the 4090 and it just doesn't appeal to me anymore. Idk if it's because I'm not into 4K or what, but paying $1599 for something that will be worth less than half that in 1-2 years doesn't click with me. With the previous ones I lost some value, but nothing compared to this.

    • @rochester3
      @rochester3 Рік тому +1

      3090s still cost over $1200, what are you talking about?

    • @DanielFrost79
      @DanielFrost79 Рік тому

      I've always gone for the tier just below.
      I think I'm gonna pass on the 40 series.
      The thing is that where I live the 3090 Ti is still around 1600 USD, so I don't even want to think about the 40-series prices.
      It's too greedy of Nvidia, even if it is up to a 70% performance increase.
      Not worth it.

    • @TheKain202
      @TheKain202 Рік тому

      @@rochester3 just like the 3090 is now right? Oh, wai....

    • @rochester3
      @rochester3 Рік тому

      @@DanielFrost79 If you're looking to buy a 3090 for the same price as a 4090, you might as well buy the 4090

    • @DanielFrost79
      @DanielFrost79 Рік тому

      @@rochester3 Not really, no. Because the 4090 would probably be around 2400 here.

  • @manicfoot
    @manicfoot Рік тому +65

    Great analysis of the card, Jay. Given the energy crisis in Europe we're not sure we can afford to heat our home this winter, so buying a new GPU isn't really on my radar right now. Good thing is my PC is in a tempered glass case so it heats up my office quite nicely. I think playing games to keep warm this winter will actually be more economical than using a radiator 😅

    • @Liperium
      @Liperium Рік тому +15

      Technically the only thing a GPU converts its electricity into is heat. So it's technically as efficient as your electric radiator 😂 and a free side effect if you need the heating!

    • @TheVoiceOFMayhem1414
      @TheVoiceOFMayhem1414 Рік тому +4

      I run a 3090 + lots of high end components in north Norway and i dont need any heating in my gaming area i can say 😁 they do produce enough heat and 2/3 of the year i even need 2 open my windows haha 😅
      So the high powerdraw kinda translates to heating the house 😅

    • @EJD339
      @EJD339 Рік тому +2

      Holy hell. How much is your energy bill? I thought mine was bad in AZ when I had a 250 dollar bill for a month in a studio apartment in the summer.

    • @manicfoot
      @manicfoot Рік тому +3

      @@TheVoiceOFMayhem1414 Nice! My idea is to adjust the fan curve on my GPU so they don't kick in till temps exceed 70 degrees. I think that way my PC could generate some heat while idle thanks to wallpaper engine always running and working the GPU a bit. Could save some electricity! 🙂

    • @HitmannDDD
      @HitmannDDD Рік тому

      Get a 3090 (not TI) instead. Decent performance, less overall wattage, and it can double as a heater with the hotspot potentially hitting 100c.
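The radiator comparison in this thread is sound physics: nearly all the electrical power a PC draws ends up as heat in the room, so watt-for-watt it heats like a resistive electric radiator. A back-of-envelope sketch of what that heating costs, with the wattage, hours, and electricity price all assumed purely for illustration:

```python
# Back-of-envelope numbers for the "PC as space heater" idea in this
# thread. Essentially all electrical power a PC draws becomes heat in
# the room, so watt-for-watt it heats like a resistive radiator.
# The 450 W draw, 5 h/day, and $0.30/kWh price are all assumptions.

pc_watts = 450          # assumed whole-system draw under gaming load
hours_per_day = 5       # assumed gaming time per day
price_per_kwh = 0.30    # assumed local electricity price, $/kWh

kwh_per_day = pc_watts * hours_per_day / 1000
cost_per_day = kwh_per_day * price_per_kwh
print(f"Heat delivered: {kwh_per_day} kWh/day at ${cost_per_day:.2f}/day")
```

Swap in your own system draw and local price per kWh; the point is just that the heat isn't wasted if you needed heating anyway.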

  • @lostgenius
    @lostgenius Рік тому +4

    I was finally able to get a 3080 a few months ago, so I will be passing on the 40 series

  • @michaelboyce3227
    @michaelboyce3227 Рік тому +5

    19:50 "I can't for the life of me figure out why Nvidia would argue with me about this being a concern for causing damage"
    _next sentence_ 19:55 "because I guarantee you, if you break this off and send it back, then they're not honoring your warranty. They're going to say you damaged it"
    Lol I think you know exactly why, Jay

  • @Walamonga1313
    @Walamonga1313 Рік тому +5

    I'd rather buy an entire system with that kinda money. I'm still using my laptop with a 1050 lmao

    • @Noodlepunk
      @Noodlepunk Рік тому

      Same, I'm thinking of building a guest PC for that kind of money so a friend can come over and play with me.

  • @zukodude487987
    @zukodude487987 Рік тому +2

    I am still Happy playing Genshin Impact on my RTX 2070 super on my laptop and i will just wait for 5090 release before snagging a 4000 series.

  • @eddiec1961
    @eddiec1961 Рік тому +6

    I think it's worth waiting a little bit longer. I think you're right to be concerned about the socket; a 90° plug would be a better solution.

  • @rippow
    @rippow Рік тому +12

    I have faith in AMD, I bought a 6800xt for my new rig recently, and will wait to see what unfolds. Thanks for the info JAY!!

  • @Liberteen666
    @Liberteen666 Рік тому +16

    Jay I really appreciate your content. Thank you for being quick when it comes to informing all of us. I'm looking forward to building my 4K system in the upcoming few weeks after I see the aftermarket versions of 4090 and their benchmarks. Keep us updated!

  • @enterthebiscuit
    @enterthebiscuit Рік тому +2

    My monitor is a 4k TV that does 60Hz. All I ever want is for the 0.1% FPS to never dip below 60 because that's my definition of flawless. If I have to buy a card that can do 200FPS 'average' to achieve that then so be it, but raw frame-rate isn't a driver for me - I always like to see the 1% and 0.1% figures too. Currently running a 3060ti rendering at 1080p, and it mostly works. Hoping my next GPU will do native 4k to that standard... 🤞

  • @frankbeuker6609
    @frankbeuker6609 Рік тому +13

    I just bought the 6950xt. I think this is a great card. It runs all the games I throw at it. The energy consumption in combination with performance is also great. I hope they can make more compact cards that still perform well without a lot of power draw in the near future ....

    • @n9ne
      @n9ne Рік тому

      you made an awful choice but have fun i guess :)

    • @gottaeat3360
      @gottaeat3360 Рік тому +2

      @@n9ne hey that wasn’t nice man

    • @frankbeuker6609
      @frankbeuker6609 Рік тому +1

      that wasn't nice indeed. I respect everyone's choice. And understand that some people want to wait a little longer for the new stuff. I made this choice because I got the card for around 800 euros. A great deal in combination with a DDR 4 motherboard. Here are many parts good and on sale. for now I think a ddr 5 motherboard with all new parts is still too expensive. Hope u can build your dream soon Nine :)

  • @NorthernLaw_
    @NorthernLaw_ Рік тому +11

    4090 is quite the beast, I'll be waiting for a 4080Ti if they decide to make one, if they don't I'll be happy with that honestly.

    • @NorthernLaw_
      @NorthernLaw_ Рік тому

      @Deadhorse and the PSU will be an extra 1500 watts just to be sure

    • @earthtaurus5515
      @earthtaurus5515 Рік тому

      The 4090Ti won't be launched for a long while yet as apparently prototypes have been melting themselves 😶.

    • @NoBrainah
      @NoBrainah Рік тому

      I still have a 1080ti. I’m grabbing a 4090 asap then sitting out for another 4 years lol

    • @gmualum08
      @gmualum08 Рік тому

      @@NoBrainah that 4090 is like triple the price of the 1080Ti you paid for back in the day. I hope you got much deeper pockets, you'll need it

    • @NorthernLaw_
      @NorthernLaw_ Рік тому

      @@NoBrainah Might as well wait until GTA 6 comes out then get a 5090 or a 6090, whenever the game comes out. No good games to bother playing right now anyways

  • @CootBusiness
    @CootBusiness Рік тому +15

    Considering those results,
    and that I'd rather stick with a 2K monitor,
    I still have my eyes set on getting a 3080.

    • @Third_eyee
      @Third_eyee Рік тому +1

      same here, i'm hardstuck on either a 6950xt or a 3090ti

    • @legendmir1
      @legendmir1 Рік тому

      That’s where I’m at. It’s perfectly good performance and will be for a good few years.

    • @craigpratt8875
      @craigpratt8875 Рік тому

      You'll love it. I really think that it'll have the staying power of the 1080/1080ti

    • @thedoctor7158
      @thedoctor7158 Рік тому +1

      Same here. I have a triple 1440p monitor setup, only play on one and most common games not trying to show off all the bells and whistles don't even make my 2070 super cry and are easily playable. The only reason I sprung for a 3090 due to the sub $1000 price recently is 3d rendering using the iRay rendering engine is a big hobby of mine. Based on what I have read and watched, the 3090 is going to cut my rendering times down by 65%-75%. 30 minute render on 2070 super down to 7.5-10 minutes with the 3090. Good enough for me.

    • @mukkah
      @mukkah Рік тому

      @@thedoctor7158 bro rendering is a pretty damn legit reason for the horsepower, for sure. Gonna love those decreased times!

  • @buildr3303
    @buildr3303 Рік тому

    First thing I’m doing is cutting off those plugs and hardwiring. That power “adapter” looks like it was designed off an email marked “important” that ended with “…and get going asap. We need to send your design to manufacturing in 11 minutes” and that engineer procrastinated on UA-cam for another 6 minutes before starting on the “Medusa Connector.”

  • @jessmac1893
    @jessmac1893 Рік тому +6

    One number I’d love is power (amps) coming out of the display port. For those with longer extension cables for VR, they often lose connection. I end up using my 1070ti for VR instead of my 3080ti because the 1070ti consistently connects and powers it with an extension cord. Weird stat. But no one measures/reports it.

    • @Shadow0fd3ath24
      @Shadow0fd3ath24 Рік тому

      Part of that is because only 1-2% of people are VR players, and many have standalone VR headsets that don't interface with a PC

    • @GravitySandwich1
      @GravitySandwich1 Рік тому

      I have the Reverb G2. They brought out a new powered cable due to connection issues (USB wasn't providing enough power). Side note: I have a 1080ti, looking towards the AMD equivalent of the 4080. (I'm boycotting Nvidia)

  • @angryace4017
    @angryace4017 Рік тому +5

    I am so glad I am past the phase where I had to have the latest and greatest every release. I built new rig with 9900k and 1080 Ti GPU. Skipped the 2000 series and bought 3080 Ti while still rockin the 9900K and z390. Hopefully that will last me for a long while :D

    • @Darkhalo314
      @Darkhalo314 Рік тому +1

      That's exactly what I have. Gonna stick with the 9900k and 3080 Ti for probably another generation or two. I'll probably buy Meteor Lake and buy the RTX 6000 series when they release.

    • @ChillyNuts4u
      @ChillyNuts4u Рік тому

      you can squeeze another 5 years outta that easily, depends on what your playin i guess :P

    • @angryace4017
      @angryace4017 Рік тому

      @@ChillyNuts4u I'm old and slow now... takes too long to IFF for FPS games lol... mainly play wargaming titles

    • @zeromitch8792
      @zeromitch8792 Рік тому

      I'm gonna keep rocking my 2080 ti and 9900k, I'm not 4k gaming or playing vr games.. i'll wait for the 5090

  • @AvinashRaghavendra
    @AvinashRaghavendra Рік тому +10

    Coming from 980Ti, I am all in for 4090. Hoping the base AIB cards remain at the MSRP :(

    • @processedsoy
      @processedsoy Рік тому +1

      good luck! hope you get one

    • @MrGorpm
      @MrGorpm Рік тому +3

      I hope you've got a top of the line cpu to match, as from what I can make out, cpu bottlenecks are going to be a big issue for anyone not able to game on a 4K high refresh rate monitor. I for one, am not ready to build a complete new system (13900k, DDR5) just to accommodate the 4090.

    • @AvinashRaghavendra
      @AvinashRaghavendra Рік тому

      @@MrGorpm Yes, I got lucky that my system is 10 years old so building from scratch. Yes, got Intel 13700KF and Asus Prime Z790 on pre-order.. with DDR5 5600 64GB.. the only unknown entity in all of this is the pricing of the base 4090..

    • @AvinashRaghavendra
      @AvinashRaghavendra Рік тому

      @@processedsoy Thank you! 🙏

    • @duffel87
      @duffel87 Рік тому +1

      Just get the FE

  • @Yoyo34
    @Yoyo34 Рік тому

    Normally I am very bored while reading benchmarks or listening to people reading from them. But this song, made it a lot better. 😀

  • @roylee3558
    @roylee3558 Рік тому +15

    They need to put the cable plug on the motherboard side, then make the cable end a premade 90° (like how you have 90° SATA cable ends). This would keep the cable out of sight, and also relieve the bend pressure on the card's connection port.

    • @Trinity-Waters
      @Trinity-Waters Рік тому

      Do that kind of thing with high-end military hardware and it works really well and is robust.

  • @Jimtheneals
    @Jimtheneals Рік тому +10

    Regardless of price, I will wait till after the AMD launch and reviews before I make any decisions. But I am very excited to see what AMD has up its sleeve.

    • @pilotbsinthesky3443
      @pilotbsinthesky3443 Рік тому +2

      Now let's just hope all the reviewers check their bias at the door and actually give us an honest review of each.. I won't hold my breath on that one, but I'll be watching.

    • @kiloneie
      @kiloneie Рік тому

      @@pilotbsinthesky3443 That is why you gotta check more than just one reviewer, and hopefully one of them delivers.

    • @pilotbsinthesky3443
      @pilotbsinthesky3443 Рік тому

      @@kiloneie Yes, I've found a few showing 2-3X performance in MSFS 2020 vs the 3090. So I'm hoping the 4080 will be the sweet spot using DLSS 3.0

  • @ranulfcaceres
    @ranulfcaceres Рік тому +3

    i hope that NVIDIA has some room to grow or improve for the upcoming next-gen GPU... cuz i don't want to see a 50% improvement for the 4090 compared to the 3090 and then just a 5-10% improvement on the next gen for the same price, aka a 5090 @ $1700 or higher

  • @erickeith1466
    @erickeith1466 Рік тому +2

    Watching 4090 videos while still debating going from my 980 TIs to 1080s.

  • @coldgarden_
    @coldgarden_ Рік тому +20

    I want a 4090, but I'm going to wait a bit and see. I'm pretty happy with my 3080, and dropping $1,600 on a new card when I'm happy with the one I've got doesn't make sense. Eventually I'll pick one up when I feel the need to upgrade because the 4090 is quite impressive. I do want to see the 4080 and RDNA 3 before I pull the trigger.

    • @chrisl8527
      @chrisl8527 Рік тому +1

      I haven't seen one under 2 grand

    • @JorgeGarcia-lp8vz
      @JorgeGarcia-lp8vz Рік тому +1

      I just got a rog strix 3080 at 799.99 micro center glad I waited for prices to come down

    • @PinkFloydFreak55
      @PinkFloydFreak55 Рік тому +1

      me and my 3080ti are planning to have a looooong relationship together, especially considering it's replacing a 980ti

    • @gambino883
      @gambino883 Рік тому +1

      Congratulations on being rich. No avg income person would upgrade from a 3080.

    • @TastyMeat8675
      @TastyMeat8675 Рік тому

      Completely agree. I got a pretty new 3080 ti and this thing still runs games pretty damn decently. I might upgrade to the 4080 when it comes out but we’ll have to see

  • @MrGryphonv
    @MrGryphonv Рік тому +7

    I may have missed it in the video, but I'm really curious about the undervolting performance. I was able to cut down about 100w draw on my 3090 with an undervolt at about a 3% performance hit. If the 4090 can be undervolted with similar or better values it will also be a good selling point.

    • @Daswarich1
      @Daswarich1 Рік тому +2

      Der8auer tested tuning the power target and the 4090 got about 90% performance at 60% power target.

    • @MrGryphonv
      @MrGryphonv Рік тому +3

      @@Daswarich1 Those are amazing numbers. Well worth the compromise IMO
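The efficiency math in this thread works out as a quick sketch. Using the numbers quoted above (a ~100 W undervolt at a ~3% performance hit on the 3090, and Der8auer's ~90% performance at a 60% power target on the 4090), and assuming round stock draws of 350 W and 450 W respectively, which are illustrative figures rather than measurements:

```python
# Rough performance-per-watt comparison for power-limited GPUs,
# using the figures quoted in this thread. The 350 W (3090) and
# 450 W (4090) stock draws are assumed round numbers, not measurements.

def perf_per_watt(relative_perf, watts):
    """Relative performance (1.0 = stock) divided by power draw in watts."""
    return relative_perf / watts

# 3090: ~100 W undervolt at a ~3% performance hit (comment above)
stock_3090 = perf_per_watt(1.00, 350)
uv_3090 = perf_per_watt(0.97, 350 - 100)

# 4090: ~90% performance at a 60% power target (Der8auer's numbers)
stock_4090 = perf_per_watt(1.00, 450)
pl_4090 = perf_per_watt(0.90, 450 * 0.60)

print(f"3090 undervolt: +{uv_3090 / stock_3090 - 1:.0%} perf/W")
print(f"4090 power limit: +{pl_4090 / stock_4090 - 1:.0%} perf/W")
```

On these rough figures, both cards gain substantially in performance per watt when power-limited, which is why undervolting keeps coming up in these threads.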

  • @RIGHTxTRIGGERx
    @RIGHTxTRIGGERx Рік тому +8

    im always a few gens behind because i cant really afford the best of the best but rn im pretty focused on upgrading my cpu. ive currently got a 1660 super and i think i want to upgrade my graphics card sometime next year , i want to see how much the 30 series drops in price once all the 40s have been announced, released and sold. new tech and competition is a good thing for everyone.

    • @Bdot888
      @Bdot888 Рік тому +2

      Good idea! I waited for a while and recently went the used gpu route and got a 3080 for $550. But im sure prices will drop a little more, just keep an eye out!

    • @Erikcleric
      @Erikcleric Рік тому +2

      3090 prices dropped like Crazy in Sweden the past weeks. From 2600-ish USD to 1300 USD now. So I'm upgrading my brand-new rig which has the 3060ti to a 3090.
      4000 series, it's overkill for any game right now and the coming years unless you NEED 4K ultra at max fps...
      My 3060ti will go into a future desktop I'll get for my old room at my mom's place. Hate for it to just be abandoned.

    • @RIGHTxTRIGGERx
      @RIGHTxTRIGGERx Рік тому +1

      @@Bdot888 i might look into something used actually. Not something ive ever thought about doing but it could save alot of cash!

    • @RIGHTxTRIGGERx
      @RIGHTxTRIGGERx Рік тому +1

      @@Erikcleric definitely not in the market for anything over 1k but the way prices have been trending, i dont think that’s something ill have to worry about soon. I get not wanting to toss parts, it feels like such a waste. Ive got a bunch of old parts taking up space in my closet that im never going to touch again but i cant get myself to get rid of them lol.

  • @taaoquinn3731
    @taaoquinn3731 Рік тому +1

    My plan is to wait till GTA 6 comes out and see what that needs to run comfortably, and then get something mid-range on sale at that point.
    My RTX 2060 still runs everything I need.

  • @incar95678
    @incar95678 Рік тому +3

    I'm looking to see how the 40 series performs for Microsoft Flight Simulator 2020 as I'm considering moving to VR (I'm still going with a 980Ti)

    • @fightingsloth
      @fightingsloth Рік тому +1

      Look at Optimum Tech's 4090 review that came out today. Flight Simulator is absolutely incredible on a 4090, particularly with a new technology Optimum shows in his review where AI increases frames with a slight input delay added.

    • @incar95678
      @incar95678 Рік тому

      @@fightingsloth Thanks very much. All the best.

  • @blakehansen1208
    @blakehansen1208 Рік тому +5

    I am fully off the green train especially after the whole EVGA situation. I am not a power user so ultimately what the top spec card can do is of no concern to me. I will be waiting for RDNA3 to launch and I am certain I will find a card to fit my needs to a T. Unless it is somehow the most catastrophic launch in history, It will be a marked improvement over my 2080 either way. I want to see team green take one on the chin and buying a team red card will be my opportunity to help.

    • @Toutvids
      @Toutvids Рік тому +2

      I bought a used AMD RX 6800 XT on ebay for $500. I'm good for years. Not giving money to Nvidia anymore.

  • @randallporter1404
    @randallporter1404 Рік тому +6

    Really interested in seeing AMD's response to the 4090s.

  • @TikkaQrow
    @TikkaQrow Рік тому

    My desk has peripherals, a lamp, a monitor, and my PC. If I built a system with one of these cards, I have no doubt it would trip my 15-amp office circuit breaker.
    Power draw on these things is insane
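Whether a single gaming PC plus desk gear actually trips a 15 A breaker can be sanity-checked with simple arithmetic. This sketch assumes a 120 V North American circuit and uses hypothetical round wattages, not measurements of any real build:

```python
# Sanity check on the 15 A office circuit worry above. Assumes a 120 V
# North American circuit; all component wattages are hypothetical
# round numbers, not measurements of any specific build.

CIRCUIT_VOLTS = 120
CIRCUIT_AMPS = 15
# Common electrical-code guidance: keep continuous loads under 80%
# of the breaker rating.
usable_watts = CIRCUIT_VOLTS * CIRCUIT_AMPS * 0.8  # 1440 W

loads_watts = {
    "4090 system (with headroom for transient spikes)": 950,
    "monitor": 60,
    "lamp + peripherals": 40,
}
total = sum(loads_watts.values())
print(f"Estimated peak draw: {total} W of {usable_watts:.0f} W usable")
print("Would it trip the breaker?", total > usable_watts)
```

On these assumed numbers a single high-end system still fits within the usual 80% continuous-load guidance, though transient spikes and anything else sharing the circuit narrow the margin.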

  • @danepoitra9146
    @danepoitra9146 Рік тому +4

    I held off from going 30xx series. After watching this, I’ll definitely be upgrading to a 4090 from my 2080 ti. I’ll still wait to see what amd brings first though.

    • @MrZakarias01
      @MrZakarias01 Рік тому

      When will AMD announce the 7000 series?

    • @danepoitra9146
      @danepoitra9146 Рік тому

      @@MrZakarias01 they will launch in a month. Reviews for them will drop in about 3 weeks or so.

  • @ApexSim
    @ApexSim Рік тому +5

    Got 2 of those inbound: personal PC and racing rig. One replaces a 3080 that is going into my second racing rig, which has a 1080ti atm. And the other 4090 goes to the main racing rig, with a 2080ti which is going to be sold along with the 1080ti. That is complemented with 365 instant noodle packages, 1 for each day, and a thousand hamsters to provide the electricity needed to run the cards.

    • @Granyala
      @Granyala Рік тому

      Hamsters run on instant noodles?!

    • @zachr369
      @zachr369 Рік тому

      I have a evga 3090 ftw3 ultra in my sim rig for assetto in vr, it’s great.

  • @Harrisboyuno
    @Harrisboyuno Рік тому +3

    Love the iFixit promos as well. All around info and entertainment. Thanks so much Jay.

  • @DrRyan82994
    @DrRyan82994 Рік тому

    it looks to me like the low-end 4080 is going to be pretty great. everyone is scared to say the products will be good.
    people don't like the pricing, so suddenly they sound like the folks that said "the 970 is all you need for 1080p so why would i upgrade" like 5 years after.

  • @NoNeaux
    @NoNeaux Рік тому +7

    I honestly didn't think NVIDIA was going to throw this much of an improvement into the 4090; I thought it could be phoned in. Unfortunately priced, and I hate how they're handling the 4080(s). But this is shockingly good.

  • @IgoByaGo
    @IgoByaGo Рік тому +11

    I am waiting to see what AMD does. I had the 2080 Ti (Sold it when I got my 6900) and the Radeon VII (Still have it) I liked the price/performance better on the Radeon VII. Especially the tuneability. I played with a 3090 and bought a Red Devil 6900 when it came out. I liked the 6900 more because it ran sooo much cooler than the 3090, especially the ram temperatures.

    • @doublevendetta
      @doublevendetta Рік тому

      I'm amazed yours is still kicking. Also envious

    • @spankbuda7466
      @spankbuda7466 Рік тому

      I believe that you're lying about all of what you're saying. You're just another individual who is on that Nvidia "Hate" bandwagon and cheering for the multi-million dollar AMD & Intel as if they're the "underdogs". I'm shocked that you didn't mention EVGA in your fictional story.

    • @IgoByaGo
      @IgoByaGo Рік тому

      @@spankbuda7466 I didn’t say a bad thing about Nvidia. I still have my 970’s, Pascal was amazing, but I gave my 1080’s to my friend who was building a PC for his birthday when I got my Radeon VII. Now I will tell you, I did not like the 2080 Ti. I paid $1300 for the EVGA XC Ultra (the one that had the 3 slot mounting bracket), and it didn’t perform as well as I wanted. I wanted a 3090 but couldn’t find one when I bought my 6900. I borrowed my friend’s 3090, but the ram temps were super high (Suprim X). When I was playing Cyberpunk and stuff it was like 95 C all the way to 100 C. I’ve just been really happy with my 6900. However, if I decide to try for a 4090, I will have to get a new PSU. I have a Supernova 1000 that I got in 2017 when I built my first Ryzen system (I went from an i5 6600 to Ryzen 7 1800x, but I built my wife an i7 7700k with a 1070). I don’t pick one company over the other, but the last product Nvidia made that I loved was the 1080 and 1080 Ti. I’m liking the performance of the 4090, but I’m still waiting to see what AMD does.

  • @midnightlexicon
    @midnightlexicon Рік тому +14

    Wanna see what the 4080 FE has in store for us. Good to see the FE construction allows for good boosting without fan speed adjustment. Might stick with FE from now on.

  • @joshuaableiter4545
    @joshuaableiter4545 Рік тому

    I agree with JayzTwoCents' observations on the power connector.