RTX 40 series Price Analysis - How Much Should You Pay?

  • Published 8 Jun 2023
  • GPUs are grossly overpriced this generation, leading many to skip it. How much would the price have to come down before you would even consider buying this generation? What is a reasonable price?
    #gpuprices #gpuprice #gpupricedown
  • Science & Technology

COMMENTS • 349

  • @obi0914
    @obi0914 Рік тому +140

    Never underestimate AMD's ability to completely screw up.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +16

      That sums it up!

    • @obi0914
      @obi0914 Рік тому +7

      @@ImaMac-PC The last time I was excited for a Radeon was nearly 20 years ago, and only because they genuinely offered a good card... damn, I'm old.

    • @alvarg
      @alvarg Рік тому +21

      AMD literally had the ball handed to them this year with Nvidia's horrible launch prices. AMD could have swept in and just given us a 7900 XTX at like 700 bucks and the 7900 XT for 600, and they would have outsold Nvidia by miles.

    • @Icureditwithmybrain
      @Icureditwithmybrain Рік тому +10

      @@alvarg They snatched defeat from the jaws of victory.

    • @tomtomkowski7653
      @tomtomkowski7653 Рік тому +13

      Duopoly. Nvidia is setting stupid prices and AMD follows.
      Another Massive Disappointment.

  • @Z4d0k
    @Z4d0k Рік тому +31

    It’s easy, the more you buy, the more you save! Thanks leather jacket.

  • @magnusnilsson9792
    @magnusnilsson9792 Рік тому +39

    Just shift the prices one more time and they will become decent:
    4090 $799
    4080 $599
    4070Ti $499
    4070 $399
    4060Ti 16GB $299
    4060Ti 8GB $249
    4060 8GB $199

    • @Krenisphia
      @Krenisphia Рік тому +9

      I would agree except for the last three. They should be:
      4060Ti 16GB $249
      4060Ti 8GB $199
      4060 8GB $149

    • @jasonking1284
      @jasonking1284 Рік тому +1

      Looks about right...

    • @XX-_-XX420
      @XX-_-XX420 Рік тому +7

      Those are actual decent prices

    • @curie1420
      @curie1420 Рік тому +4

      people will say "bUt inFLatIoN"

    • @OneTrueMomo
      @OneTrueMomo Рік тому +1

      You know, the 10 series cards were priced like this.

  • @Sly_404
    @Sly_404 Рік тому +20

    As much as I agree with your recommended prices, there is no way the 40 series will drop to these levels outside of stock clearance after the 50 launch. Nvidia simply has 0 pressure to move units. If people don't pay at the set price levels, they shift production resources to commercial use products that make them the intended profit margins.

    • @AmericanCock
      @AmericanCock Рік тому

      Exactly. They are no longer catering to gamers; they have reached Apple level and moved on to catering to companies and commercial businesses.

    • @nowaywithyoueveragai
      @nowaywithyoueveragai Рік тому

      Pray for a miracle of AMD, Intel, or a Chinese GPU company coming out with some breakthrough technology...

  • @zdspider6778
    @zdspider6778 Рік тому +5

    The 4060 Ti is a 4050 in disguise, which should cost around $119-149. It has *_less than half_* the die size of the 3060 Ti, which means they can cut 2x more chips from the same wafer and double their profits. And they're trying to sell that for $400 ($500 if you want an extra 8GB of VRAM). Really, Nvidia? 😠 How about "no"?
    The 4070 is a 4060 in disguise, which should cost between $199-249 at most. Ideally, under $199. This should have been "the people's choice", with a decent performance boost and value, interesting enough and affordable enough that you could upgrade every generation if you wanted to. Gatekeeping it behind an artificially inflated price is anti-consumer bullshit.
    The 4080 is what the 4070 should have been, and the "4070 Ti" ("4080 12GB") should not have existed in the first place (in its current form). Launching a "Ti" should come _after,_ not first. Wtf are you doing, Nvidia? The 4060 isn't even out yet and there's already a "Ti" version of it? Dafuq?
    Why such small memory buses on all these cards? And why so little VRAM? Gimped cards for ridiculous amounts of money. 16GB of VRAM should have been standard LAST gen, and it's not even standard THIS gen? Anyone buying an 8GB card in 2023 is making a terrible, terrible decision, considering that some games TODAY use more than 8GB at 1080p. Those charts showing off the cache size and texture compression are just blowing smoke up our asses.
    If they don't sort out their shit pricing with the 5000 series, that's going to be another skip generation.
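
A rough back-of-the-envelope check of the "cut 2x more chips from the same wafer" claim above, as a minimal sketch. The die areas (≈188 mm² for the 4060 Ti's AD106 vs ≈392 mm² for the 3060 Ti's GA104) and the 300 mm wafer are assumptions based on commonly cited figures, and the dies-per-wafer formula is the usual first-order estimate that ignores yield and per-wafer cost differences between nodes:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate of candidate dies per wafer (ignores defects/yield)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Assumed approximate die areas: AD106 (4060 Ti) ~188 mm^2, GA104 (3060 Ti) ~392 mm^2
for name, area in [("AD106 / 4060 Ti", 188), ("GA104 / 3060 Ti", 392)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300 mm wafer")
# Prints roughly 327 vs 146 dies, i.e. a bit over 2x, which is the ratio the
# comment is pointing at (actual profit also depends on wafer price and yield).
```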

  • @user-lk5kn2tr7k
    @user-lk5kn2tr7k 9 місяців тому

    Love your graphs man.

  • @L0rd_0f_War
    @L0rd_0f_War Рік тому +20

    Love your content, but what's with the 18% increase for the 4060 Ti? That's higher than what Nvidia claims (15%) over the 3060 Ti, and meta-review averages are around 8%.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +17

      That is a good catch. I swapped the 4060 & 4060 Ti numbers. The 4060 Ti should be 15% while the 4060 is 18%.

    • @zdspider6778
      @zdspider6778 Рік тому

      The 4060Ti is a whopping *_2 PERCENT_* "faster" than the 3060Ti (at 4k, on average, meaning it actually LOSES in some games to the 3060Ti).
      Nvidia gimped it too much with a smaller memory bus, less VRAM, PCI-Express 8x instead of 16x, a smaller die than it should have had, and for much more money, considering that this is a _4050 in disguise,_ which should be like $119-149. They're trying to sell it for $400 ($500 if you want an extra 8GB of VRAM). Good fucking luck with that! What a bunch of clucking clowns! 🤡

  • @conza1989
    @conza1989 Рік тому +10

    I completely missed the small/medium die strategy video; I'll have to go look for that, it sounds brilliant. For this video, I largely agree: well researched, well put, every card is priced one tier too high. I am curious, from your perspective: is it really common for people to upgrade every generation? I've now skipped three generations so far, which feels like a long time. I thought most people keep a card for two generations, skip, then buy. Are people usually buying more frequently? E.g. buy a GTX 1060 / RX 580, skip the 20/5000 series, buy the 30/6000 series, skip the 40/7000 series. Obviously the last part is thrown off for some by the shortages.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      Thank you. Most PC gamers don't buy every gen. They buy a GPU and then, when it can't play the game they want, jump back into the market and buy another, which, as you say, is every 2-3 generations. Only enthusiasts like to buy the latest and greatest. But even then, the upgrade has to be worth it from a price-to-performance perspective.

  • @4sightfilmsLLC
    @4sightfilmsLLC Рік тому

    Fantastic breakdown and video!

  • @jerryquesenberry2520
    @jerryquesenberry2520 Рік тому

    Thank you for doing this.

  • @jaxonswain3408
    @jaxonswain3408 Рік тому +4

    I hate the 4070 Ti. 12GB of VRAM is the bare minimum for ultra-quality PC gaming; people who are paying $800+ should not be getting the bare minimum. 😑

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      Getting a gimped GPU for $800+ just feels wrong...

  • @rocketman3770
    @rocketman3770 Рік тому

    Those open-box 4070s that you mentioned with the steep price cuts are the Gigabyte models, which had that recent drama about the PCB cracking.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      Those Gigabyte models are using massive quad-slot air coolers, and the weight alone puts massive stress on the PCIe slot. From the jayztwocents video, the board seems rather weak. At least the 4070 is a two-slot air cooler that is lightweight by comparison.

  • @MARKELL00F
    @MARKELL00F Рік тому +7

    Amazing work. For me, from the beginning I saw Nvidia increasing prices or playing with GPU names, and that's misleading. Let's hope we see good performance and prices in the next RTX 5000 series, or RIP gaming GPUs.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +4

      Thanks! There is always next gen to look forward to...

    • @bfish9700
      @bfish9700 Рік тому +3

      If the AI bubble bursts before the 5000 series, they'll have to deliver value. Just like Ampere came after Turing (but that value creation was killed by mining), the 5000 series might deliver better value.

  • @BPMa14n
    @BPMa14n Рік тому +1

    Reasonable analysis

  • @joh2805
    @joh2805 Рік тому

    I own a G-Sync-only monitor, so together with DLSS, the GeForce feature set over Radeon (or Arc) is just substantially better and I'm kind of stuck with them. Additionally, PC hardware has ridiculous pricing and sometimes availability in my region, and there are more green GPUs to choose from.

  • @TheDrMihai
    @TheDrMihai Рік тому +1

    I bought a 4090 because I can totally afford it, but I still bought it right before the scalping stopped, and I got it AT MSRP and not a penny more, because even if it's a HALO top-tier product, in no shape or form is it a good deal =)) Totally agree with you!

  • @pinakmiku4999
    @pinakmiku4999 11 місяців тому

    Love such analysis!! Great video! Based on this video I tried to shop for my 4090 at around $1200. I couldn't reach $1200, but I managed to get a brand new 4090 FE for $1300 after taxes and offers. This is my first time paying so much for a GPU, but I also managed to get a used 3070 with a 2.5-year warranty for $200 and a new 3060 Ti for $250. Gonna flip those for profit to bring my cost for the 4090 close to $1200!

    • @ImaMac-PC
      @ImaMac-PC  11 місяців тому

      Thank you, and nice work on scoring that 4090!

  • @mistermcluvin2425
    @mistermcluvin2425 Рік тому

    Love your charts bro, some of the best I've seen. Not sure if we'll see your price structure anytime soon, but... I'd love to get a 4090 at 4080 prices.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Thanks. As with every GPU ever made, the price will come down...but it may not be until the 5090 is announced next year 😆

  • @4hedron852
    @4hedron852 Рік тому +9

    I'm going for a 3060 because AI support on Radeon isn't where I'd want it to be, considering one of my main motivations for upgrading is messing around with AI. Otherwise the 6700 or 6800 would be looking amazing. In fact I think very similarly, maybe even better, about the A770 16GB and the Pro A60. I might even want to try underclocking the 3060 to reduce power draw, making it similar to a more expensive A60.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +2

      The Pro A60 looks interesting; however, it only has half the cores of the A770.

    • @XX-_-XX420
      @XX-_-XX420 Рік тому

      @@ImaMac-PC I'd personally get intel just to spite Nvidia.

  • @SFO195
    @SFO195 10 місяців тому

    Can you do this same video but for AMD cards now that we have a 7900 and an upcoming 7800 XT? They're also doing the naming shenanigans

  • @mbmk6607
    @mbmk6607 Рік тому +4

    Another great analysis video from you. The 40 series is an easy skip, but you know plenty will unfortunately fall for Nvidia's marketing. It's a shame AMD colluded with Nvidia. It was a good opportunity for AMD to gain market share.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +3

      Thank you. AMD could have gained market share, but they need to make an effort. Sadly, they just don't seem to want to compete with Nvidia.

  • @OrcCorp
    @OrcCorp Рік тому +3

    I need Nvidia for my work as a content creator and graphic designer, but I'm not willing to pay more than 900€ for my card. This is the price point I came up with by comparing the performance of my 3080 to the new 4000 series, and by keeping some sense in my spending.
    I got my 3080 at release for 740€, which was a good and proper price for it, so even 900€ is a tad high to be honest. I'd be willing to spend that extra 160€ for a 4080 if it had near double the performance of a 3080 (let's say around 170%+), but it doesn't come even close to that.
    So this generation is truly a skip-gen, as even with a much lower price it's not a good upgrade.
    Perhaps the 5000 series will do better. A 5080 at a price around 800€, and with over double the performance of a 3080, would be a good deal that I'd happily pay for. We'll see how badly they f*ck up the 5000 generation 🤷🏻‍♂️

    • @XX-_-XX420
      @XX-_-XX420 Рік тому +2

      I'm afraid the 5080 at the very best will be about as fast as a 4090. If this trend continues, it'll probably not even beat a 4080, like how the 4060 Ti loses to the 3060 Ti a bunch of times.
      And it'll probably be like $1500 at this rate. I think they really don't care about GPUs for companies and normal people anymore, even if AMD releases a good product. Which, let's face it, AMD always manages to ruin everything for themselves one way or another. So I think you're just dependent on Intel to give you a good GPU for the money, as they at least try with the A750.

    • @OrcCorp
      @OrcCorp Рік тому

      @@XX-_-XX420 a 5080 at a performance level of a 4090 at 800-900€ would be a proper upgrade from my 3080.
      We will see how it goes when it's released. I'm happy to roll with my 3080 a year longer. Maybe catch a 4080 from the second hand market next year at around 700€. Or not 🤷🏻‍♂️

    • @XX-_-XX420
      @XX-_-XX420 Рік тому +1

      @@OrcCorp 4080 used at 700 might happen but that's the best I see happening with how things are going. Greedflation is insane so not sure what prices will do in a few years I'd governments won't step in (which they probably won't).
      And I mean for everything. Food prices have gone up insanely much in just this past half year. Like everyone is gonna be eating 1 meal less a day at this rate in 1 year if not sooner.

  • @peterkovacs184
    @peterkovacs184 Рік тому +1

    I would rather consider using the transistor count for each GPU, as performance mostly correlates with that number. Die size is not really a good performance indicator: the smaller the lithography, the more likely errors will be in a GPU of the same size, so prices naturally go up for the same "size", as you need to create more and use more material to get the "same amount" of GPUs.

    • @zdspider6778
      @zdspider6778 Рік тому +2

      Sure, the smaller the process node, the more defects the wafers will have. But only up to a certain point, and only in the beginning. Once the process is mature enough, they make up those losses (by cutting more chips from the same wafers) and it all evens out.
      It's not like they can go back to 16nm or 28nm. That's not how it works. It would ironically cost them MORE to revert to an older process once everything is set up for the smaller one. And the number of transistors doesn't tell the whole story. Performance isn't infinite; they are still limited by the laws of physics for how fast a capacitor can charge and discharge, meaning how fast registers can receive data from memory.

  • @dh8203
    @dh8203 Рік тому +36

    Toward the later half of 2022, I was thinking I would do a build with an AMD GPU when their new cards came out. Then AMD managed to torpedo themselves so hard at every opportunity I just haven't been able to bring myself to buy AMD.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +19

      I am utterly amazed at how AMD just torpedoes themselves over and over again.

    • @alvarg
      @alvarg Рік тому +6

      @@ImaMac-PC Because instead of undercutting Nvidia and making a profit via sales, they try to price-match Nvidia based on performance. But since Nvidia has a greater market share, as well as more features (or at least people believe they do), people don't buy AMD when they price similar to Nvidia.
      Which surprises me, as Nvidia has always charged a premium for their products, so naturally when the "budget for the gamers" brand price-matches Nvidia based on performance, no one is going to buy it.

    • @SIPEROTH
      @SIPEROTH Рік тому +2

      @@alvarg They actually undercut Nvidia every time. They just don't undercut them hugely like people want when announcing MSRPs, but their cards are cheaper and get even cheaper after a while, unlike Nvidia's.
      Even now the best offers out there are AMD cards, and some are amazingly priced.
      And they can't undercut Nvidia hugely when announcing MSRPs because that would start a price war. Nvidia would then react and bring better products for less.
      You might be happy about that, because then you would buy the cheaper Nvidia card you wanted, but AMD wouldn't get a sale; it would just be helping you buy the Nvidia card you wanted while they lose even more profit due to low prices, and they would end up closing shop because they get no value from making GPUs.
      AMD is really struggling in the GPU market, with Nvidia having over 85% of the market share.
      It's almost a monopoly. So how can they start challenging Nvidia in a price war when they have so little market share and profit in comparison, and their GPU tech isn't crushing Nvidia either?
      Obviously Nvidia would have a much better ability to offer better cards at lower prices if they wanted to crush AMD. If they wanted, they could shut them down and then increase prices again once they are the only ones left. Nvidia is just letting AMD exist because they don't want to be left alone and have government authorities bother them about being a monopoly and try to split their corporation.
      So before nagging about AMD, realize that it all starts from people not buying their cards and putting them in such a weak place that they can't even fight a price war. The only thing AMD can do is sell as they sell and hope they manage to find some magic bullet that allows them to have a GPU that performs much better than Nvidia's (or that Nvidia screws up), and then start gaining market share through that and be more able to hit Nvidia.

    • @REgamesplayer
      @REgamesplayer Рік тому +1

      @@SIPEROTH AMD doesn't undercut Nvidia; they launch their GPUs with ridiculous prices, gather bad reviews, and then proceed to lower them. Even when it comes to performance, something like the RX 7900 XTX is matched by the RTX 4080. Considering regional pricing, they are priced within 100 euros of each other in my region. Who is ever going to buy the RX 7900 XTX then?

    • @shanez1215
      @shanez1215 10 місяців тому

      @REgamesplayer That's another issue entirely. I have no idea why (likely volume), but regional distributors charge a lot more to ship/import AMD cards and the costs end up getting passed on. In the US the 7900 XTX is about $200 or more cheaper than the 4080.

  • @jrddoubleu514
    @jrddoubleu514 Рік тому +4

    I bought a 1080 Ti when it was like 700 quid back in the day. I've since bought one more, plus at least 5 Quadro M2000 cards so I can run multi-touchscreen arrays on various work-focused setups.
    That second 1080 Ti cost me 270 about a year ago.
    I've got a 1650 in a Razer Blade Stealth,
    and a 2080 Super in a Razer Blade Advanced I picked up a couple of years back.
    I bought my first 1080 for a main gaming rig build to replace my old build, so I could justify the price then, as I was upgrading from a GTS OC card (so long ago I can't remember what it was, but it was basically a lubricated poo pipe, so it did the job for its time).
    TBH, the prices these companies are asking are a bit outlandish.
    A 1080 Ti will do most anything on ultra, so unless you're specifically looking to go large with 4-8K ray tracing, or need those high frames to compete, you can save upwards of a grand just buying a second-hand GTX 1080 Ti.
    I seldom use the RBs with the 2080.
    I've got an 86-inch 8K TV, so one day I will upgrade, but not till those prices get a lot more reasonable.
    I mean, 11GB of VRAM. If I were getting 24GB of VRAM on a single-chip card, I'd be willing to pay 700 quid, but... not now. Even then, I'd be more inclined to price it at 500 or less, considering you can pick up an M40 or K80 with 24GB of VRAM for less than 200 quid apiece.

    • @4hedron852
      @4hedron852 Рік тому

      Those workstation GPUs, IIRC, don't have video output. However, IIRC, unless you really need the performance of a 4070 or more, it would be cost-effective to buy a 6800 XT / 3060 Ti and grab a graphics accelerator for VRAM-heavy toying around.

  • @cheetahfurry9107
    @cheetahfurry9107 Рік тому +5

    I just grabbed a Gigabyte 4070 to replace my 2060 6GB. I would not have done it, but I got it open box from Best Buy for $479 this week.

    • @jaxonswain3408
      @jaxonswain3408 Рік тому +1

      Not too shabby.

    • @KinngMe
      @KinngMe Рік тому +1

      I got a 4080 for $980 open box at Best Buy.

  • @dreamlocker
    @dreamlocker 10 місяців тому

    So if I'm looking for an upgrade, what model should I look for? I've got a 1080 Ti (buggy card, works only in debug mode) with an i7 7700K on a Z270 chipset (PCIe 3.0).
    Wanna stay on 2K 144 Hz gaming if that's realistic =)

  • @03chrisv
    @03chrisv Рік тому +7

    I took advantage of the Diablo 4 promotion with Nvidia, so my 4070's effective price was $530. Not amazing, but it was a pill I was willing to swallow. So far the 4070 has exceeded my expectations and runs everything I throw at it. I know it's seen as a 1440p card by youtubers and even Nvidia themselves, but it can run most games at native 4K 60. In fact, the only game I own so far where it can't hit 4K 60 is Portal RTX, which I drop to 1440p and use frame generation on. It looks absolutely gorgeous, and even though it's 1440p it still looks very sharp on my 4K OLED TV, so if in the future I have to drop the resolution to 1440p in other games even with DLSS, I know it'll still look good on my TV. The 4070 is definitely at a different tier compared to my PS5; it's almost like my PC is a turbocharged PS5 Pro by comparison 😆

    • @SIPEROTH
      @SIPEROTH 10 місяців тому

      The 4070 could have been a major hit if it was just a little lower in price. It's not the card that is bad, but the pricing.

  • @mariotop742
    @mariotop742 Рік тому +1

    Very good analysis... Stupid is not the one who asks; a sucker is the one who gives. In the end, if you want to pay the ultimate price, do it!!

  • @DC3Refom
    @DC3Refom Рік тому

    I agree with what you said, Mac. Also, to those people that tout "I got an open box deal": in the majority of Europe, including the UK, we do not have that option.

  • @dat_21
    @dat_21 Рік тому +1

    Yep, same observation. It's priced one tier too high. Almost like a well-thought-out, purposeful thing to do.

  • @garethwoodall577
    @garethwoodall577 Рік тому +1

    Spot on.

  • @DKTronics70
    @DKTronics70 Рік тому +7

    AMD have got complacent, not just because of Nvidia, but because they have all that nice console money, considering their hardware is in both the Xbox and PS5 consoles, and will probably be in the Pro versions of both, should Sony or MS go that route.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +6

      I wonder what AMD would do if Intel decided to go after that sweet console money?

    • @InternetListener
      @InternetListener Рік тому

      Every angry Nvidia customer who can't find an AMD discrete GPU but still wants to play will probably buy a PS5/Xbox... Guess what: instead of a deceiving 290€ GPU or a not-yet-launched 349€ 4060 (the 4060 Ti starts at 419€, 449€ FE), they go and buy a full AMD-based system at 549€/499€. Instead of selling some hundreds of thousands per year, or some millions like Nvidia has done, just imagine selling hundreds of millions of PS4/Xbox-to-current-gen models... are we talking 20 million per year, easily? On the same nodes as RX 6000/7000... and no need to spend on cache because you already have shared high-bandwidth memory... losing latency, but... the 1440p rendering and RT and 8 cores with 10 to 12 Tflops still have 5 years of headroom before being pushed to the levels the Zelda developers achieved on their last launch with a 0.7 Tflops GPU on an old cut-down Tegra X1... Any current Nintendo game ported to Apple Silicon would be a 1080p-to-Retina digital work of art.
      Intel should clearly start pairing Celeron (the efficient line) with a PS4 Pro / Series S class iGPU and i3/i5 with at least an A730M... but Xbox or even PS5 could always offer, for a fee (like Apple's Parallels or the former dual boot), Windows S (or Windows 11) alongside the custom OS (or a Linux boot) and jeopardize the cheap PC market with one BIOS and driver release. The AMD 4700S kit already has the first working drivers finished... copy-paste some Ryzen iGPU and discrete source... and it's done in weeks or days... they could also add all the DXVK for DX9/DX11 and improve OpenGL etc. with the Linux open source... and make Intel sweat...

  • @KinngMe
    @KinngMe Рік тому

    Just bought a 4080 open box, new, for $980 at Best Buy! They still have some; go buy ASAP.

  • @Humanaut.
    @Humanaut. Рік тому +3

    None of the above.
    Vote with your wallets.
    You can wait to buy, they can't wait to sell.

  • @DETERMINOLOGY
    @DETERMINOLOGY Рік тому

    @ 2:10 Decent chart, but I always look at where I'm coming from, where I'm going to, what the cost is between both, and whether it's worth it.
    For me, I had the GTX 1080, which was $599 at launch, and upgraded to a 4070 Ti, which was $799 at launch: a $200 difference, but the performance gains were well worth it. To upgrade from a GTX 1080 to a 4080 would have been $600 or so to gain 16GB of VRAM overall but only 15-30 fps more, which wasn't worth it, so I'm happy with the 4070 Ti by far.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      That 1080 was from more than 5yrs ago...you got your money's worth out of that one!

    • @DETERMINOLOGY
      @DETERMINOLOGY Рік тому

      @@ImaMac-PC Yeah, that's how I look at upgrades. For me to upgrade from the 4070 Ti, it would have to be sort of the same type of gains in performance.
      I can remember back when I had the GTX 1080 and got the RTX 2080, which wasn't worth it at all, so I kept the GTX 1080. Small performance jumps like that really don't cut it for me.

  • @Beehj84
    @Beehj84 10 місяців тому

    I agree with your pricing and thesis entirely. I currently have an RTX 3070. The only card I would buy is an aftermarket 7900 XT under $700 with Starfield included (because I'll be buying it, no doubt). Everything else is terribly priced. If the 4080 were appropriately priced, it would be the most attractive card in terms of power, features, efficiency, VRAM, etc.

  • @GamerZHuB512
    @GamerZHuB512 Рік тому +2

    I've had a weird GPU lifecycle. As my 1060 breathed its last, I went from it to a 6600, and then I got bit hard by the upgrade bug. Two GPUs and one monitor later, I tried out the RTX 3070, and to be fair, I didn't find myself missing Nvidia's features all that much. Furthermore, I found RT to be a mixed bag, since most of my games don't have it. But DLSS was absolutely fantastic in the few games I had that supported it.
    I remember an article from about two months ago titled "Nvidia isn't selling graphics cards, it's selling DLSS", which is 100% true, considering it looks great in most games, provided you tweak the version and all.
    However, I'd go a step further and say that Nvidia may wind up getting rid of their GPUs and start investing more in GeForce Now for the midrange sector. It seems like the most logical choice since it cuts costs and provides a steady source of income every month.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      I agree. They would like to sell more GeForce NOW and it will grow if they keep raising their prices. However, it really is dependent on your internet connection and is not great for some games due to the latency issue.

  • @Icureditwithmybrain
    @Icureditwithmybrain Рік тому

    I'm currently on an RTX 3070. I was planning on getting an AMD 7900 XT when the price reached $700, but now I'm not sure, because I recently got into local AI LLMs, and from what I read, Nvidia GPUs are better at running AI. But Nvidia cards are super expensive right now. I'm considering buying a used RTX 3090, but I'm worried about their overheating memory. I don't have a clear path forward.

    • @CyberneticArgumentCreator
      @CyberneticArgumentCreator Рік тому +1

      Overheating memory was an issue for people mining.

    • @XX-_-XX420
      @XX-_-XX420 Рік тому

      That's a tough one. I occasionally see "cheap" 3090 Tis where that issue is fixed, although I think it was worst on EVGA cards. It could probably also be fixed if you can work on the card yourself. Another option could be to try and snipe a 3080 12GB for about $500. They do sell for that much every so often.

    • @rekoj8085
      @rekoj8085 Рік тому +1

      Try selling your 3070 for like $350-400, then buy a 4080. I sold my 3070 Ti for $500 and then bought the 4080 FE from Best Buy, so really I only paid $700. At 1440p the jump from a 3070 Ti to a 4080 is incredible: performance, VRAM, thermals, etc. The reason I didn't go for a 4090 is that with the 4080 I am able to keep all other components the same, and I only game at 1440p 240Hz.

    • @xNickerishx
      @xNickerishx Рік тому

      @@rekoj8085 I'm building a new PC. I have an i9-13900K. What GPU should I get?

  • @Chejov1214
    @Chejov1214 Рік тому +3

    Cool vid, thanks. The only reason I opt for Nvidia is because I do some 3D stuff, and in Blender the green team works better :/ If it wasn't for that, I would go with AMD.

  • @fdjw88
    @fdjw88 Рік тому +1

    Radeon can't do Stable Diffusion right now. And SD is important to me.

  • @paulschaaf8880
    @paulschaaf8880 Рік тому +1

    The 40 series should shift a full performance tier down in pricing because when you look at their specs and the die used for each of them, that's where they should have launched to begin with. Then people would actually be buying their stuff. That means the 4090 should be $1200. The 4080 should be priced where the 4070 is, and so on.
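
As a minimal sketch of the "shift one tier down" idea in the comment above: take the 40-series launch MSRPs (USD) and give each card the launch price of the SKU below it. Treating every launch SKU as one tier is an assumption; the comment's own 4080 example skips the 4070 Ti, so the exact mapping should be read loosely.

```python
# 40-series launch MSRPs in USD, ordered from top tier down.
launch_msrp = {
    "4090": 1599,
    "4080": 1199,
    "4070 Ti": 799,
    "4070": 599,
    "4060 Ti 8GB": 399,
    "4060": 299,
}

cards = list(launch_msrp)
for i, card in enumerate(cards):
    # Each card inherits the launch price of the next tier down; the bottom
    # card has no tier below it here, so it simply keeps its own MSRP.
    shifted = launch_msrp[cards[i + 1]] if i + 1 < len(cards) else launch_msrp[card]
    print(f"{card}: launched at ${launch_msrp[card]}, one tier down ~${shifted}")
# The 4090 lands near $1200, which matches the comment's example.
```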

  • @mickaelsflow6774
    @mickaelsflow6774 Рік тому +2

    To your last question: VR on AMD is tough. It works in most cases, but every so often there is a problem with the encoder, or with game support (RRRE), or just plain performance.
    As for the price tiers you call "reasonable", that's... pretty much what I've been saying all along :D Except the 4090. Couldn't care less about its price. I'll give them the $1500, same as last gen. It's stupid, but all the "it's expensive", "it's top of the line", "it's for the prosumer" works for me to justify it. Anything else? Hell to the no. I see you, Nvidia. I see you!

  • @Eiden193
    @Eiden193 Рік тому

    I would have liked to see where Intel fits into this whole lineup comparing performance and price vs nvidia

  • @patb3356
    @patb3356 Рік тому

    great chart

  • @vaghatz
    @vaghatz Рік тому

    I think the best metric is mean fps in same gen games, not die size

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      Normally, that is all you should care about. Unfortunately, that only works in a competitive market.

  • @tylerhill40
    @tylerhill40 Рік тому +2

    Frustrating reality atm.

  • @jeremysohan
    @jeremysohan Рік тому +1

    Jensen: money.....money....moneeeyyy

  • @tbx59
    @tbx59 Рік тому +2

    The price gouging and wattage have soured me on getting a new graphics card from Nvidia anytime soon, and AMD doesn't seem interested in providing a compelling alternative, so I guess I'll wait and see what breaks first: my card or my will.

    • @4hedron852
      @4hedron852 Рік тому

      Honestly, if you don't need massive performance and are okay with beta drivers, the Arc Pro A60 doesn't seem like an awful pick: 12GB of VRAM, albeit barely better than the 3050. It does, however, have the same TDP.
      I don't know what card you're running, though. For all I know you're running a 1080 and the A60 would be a downgrade gaming-wise.

    • @Elvewizzy.
      @Elvewizzy. Рік тому

      If you need a new GPU, the only real option is the 4090 currently. You can undervolt it quite a bit for some impressive efficiency gains. IIRC it's like a 30% power consumption decrease at the cost of 5% performance.
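
A quick worked version of the arithmetic behind that claim, as a sketch. The 30%/5% numbers are taken from the comment itself ("IIRC"), not from measurements:

```python
# Relative to stock: ~95% of the performance at ~70% of the power.
rel_perf, rel_power = 0.95, 0.70
perf_per_watt_gain = rel_perf / rel_power - 1.0
print(f"~{perf_per_watt_gain * 100:.0f}% better performance per watt than stock")
# ~36%: a 5% fps loss for a 30% power cut is a large efficiency win.
```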

  • @bharatdabhi1044
    @bharatdabhi1044 Рік тому +2

    I agree with your pricing. I wanted to upgrade from my GTX 1650 to an RTX 4060, but those bastards rebranded it as the RTX 4060 Ti and are charging almost $100 more. I wouldn't pay more than $300 (about INR 24k-25k) for the RTX 4060 Ti. And here in India they charge $70-80 more.

  • @cyraxvisuals6203
    @cyraxvisuals6203 Рік тому +1

    I'm probably gonna end up with a 4070 Ti since I need it for a new build.

  • @darthpaulx
    @darthpaulx Рік тому

    You are completely right with the pricing.
    And your pricing already has inflation factored into it.
    Normally I would like to see the x80 series at not more than 700 dollars.
    AMD also added the XTX version to push prices higher.
    But it's a good thing that AMD didn't raise the price compared to the RX 6900 XT.
    The thing AMD did is add a 900-dollar card that should be a 600-700 dollar card, like the RX 6800 XT.
    I think AMD should have just called the 7900 XTX the 7900 XT,
    and the 7900 XT the 7900 non-XT, because there never was an RX 6900 non-XT.
    I hope that AMD will keep the RX 7800 XT at 650 USD like last gen.
    Seeing their 7600 not making any sense, I don't have high hopes.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Thanks. Don't get your hopes up on the 7800 XT. I covered what the 7800 XT should be in a previous video. Instead, AMD will give you the same performance for the same price. Not looking good for RDNA3.

    • @darthpaulx
      @darthpaulx Рік тому

      @@ImaMac-PC Yes, because the coming 7800 XT should have been a 7700 XT, while the 7900 XT should have been the real 7800 XT.
      I think it will at least have the performance of the 6900 XT. Still a little bit better than the 6800 XT, 10% or so.
      Well, let's see what will happen.

  • @sanzyboy3952
    @sanzyboy3952 Рік тому +3

    I'd say the 4070 is probably the best deal. Not perfect, but the best of the 40 series.

    • @xxNATHANUKxx
      @xxNATHANUKxx Рік тому +1

      Me too. Plenty of 4070s below MSRP already. If this continues, we might be able to pick one up at that $500 mark.

  • @MWcrazyhorse
    @MWcrazyhorse Рік тому

    I have a 2060 overclocked.
    It's basically as fast as the 4060.

  • @dannymitchell6131
    @dannymitchell6131 Рік тому +4

    Sadly I need Nvidia cards and ideally Intel CPUs (but some AMD ones would be OK) because I do 3D modeling for work, and CUDA has been an option for so long that you can guarantee it will work with your chosen software.
    Intel chips are great for using the viewer when working with anything like 3D or even media editing (the viewer is the window you do your work in; the render is the output from said work).
    I'd LOVE to build an AMD rig, but I have no idea if their products will work with what I'm doing... hell, I don't know if Intel cards would work either.
    For gaming I'd be fine with anyone, but for work? Time is money, and if there's a single piece of software that's not going to work, it's worth it to just spend more upfront.
    I've even thought about buying the cheapest current-gen AMD and Intel options just to see.
    I wish hardware reviewers tested for this. Like have a gaming review and a workstation review in separate videos, separate channels even.

    • @TheLinkedList
      @TheLinkedList Рік тому +1

      What's the software you are using? I strongly doubt there is anything an Intel CPU can do that an AMD option can't. They have the same instruction sets and features, and they're both x86. The only thing that comes to mind that differs with Intel is their QuickSync encoder.

    • @dannymitchell6131
      @dannymitchell6131 Рік тому

      @@TheLinkedList
      AMD chips can do it, but strong single-thread performance lets you brute-force a lot of geometry on screen all at once; just moving the model around will push a few cores and the GPU to 100%, especially with shadows and textures turned on.
      I use SketchUp to model and Thea to render, but I also have V-Ray and Unreal Datasmith to drop models into Unreal Engine. I have Solidworks and AutoCAD to learn too, but my job is done in SketchUp and rendered with Thea... it's always good to learn new things when you can. 😉
      My laptop has a 13th-gen i7 and a Studio RTX 3060 (1080p), and even plugged in I can bog down the CPU with the viewport; it feels like playing a game at 25fps.
      My desktop is a delidded 4790K at 4.5GHz with a GTX 1080. The laptop is much faster in the viewport and in render time; that's why I want to build a new rig... but my desktop is 1440p, so don't take that comparison too seriously.
      Obviously AMD CPUs will work just fine, but I know I could have gone Intel and made the viewer a lot faster (this is one of the things I do have some benchmarks for from the modeling community).
      I'm more than happy with the CPUs out there; I just don't know what sort of compatibility I'll have with any GPU other than Nvidia. 😔
      OpenCL is available for Thea, but some GPUs have only beta support.

    • @TheLinkedList
      @TheLinkedList Рік тому

      @@dannymitchell6131 If you're getting by with a mobile CPU, you'll be just fine with an AMD desktop. Intel mobile CPUs only boost single cores for really short periods anyway (even when plugged in), and it sounds like you'll be viewing models/renders for way longer than the single-core speeds will stay at their highest. A desktop CPU will maintain its boost indefinitely if cooled correctly. Rendering is also best done with massive cache sizes; a 7800X3D or greater would suit that workload well.
      Have a look on AnandTech for reviews; they tend to benchmark the scientific and professional rendering software. Going Intel or AMD for the CPU won't be much different, if at all, for what you do. Just get whatever is the best value for money and has a potential upgrade path, if that's what you think you'll be doing. Upgrade paths, however, are not the be-all and end-all; they're just nice to have.
      For a GPU, you could look into the ML series or whatever it's called from AMD (the Vega architecture professional stuff), or play it safe and get Nvidia. Unfortunately, they're all expensive. It might be worth pricing them all up alongside a Mac Studio; it might actually be the cheapest and best option in the end.

  • @SigmaGrindset-vg4oh
    @SigmaGrindset-vg4oh Рік тому

    Should I wait for RDNA 4 / the 5000 series, if I have a PS5 and it feels like enough for now?
    Sure, I'd love to max out my LG C2, but I am not happy to be scalped by Nvidia. Obviously, I can afford it even at the scalped price of 1600 USD, but money doesn't grow on trees and I need to be sure it's worth it.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      If you don't need to, wait.

  • @davefroman4700
    @davefroman4700 Рік тому

    Screw that. I bought a recycled 5700 XT with 12 gigs for $200 off the Chinese recycling masters. I've had it for over a year now with no issues, and it will play everything just fine. If I get another year or two out of it, I am happy with that. I can always get another one if it dies, for less than I would pay for any new card today.

  • @julianB93
    @julianB93 Рік тому +2

    The 4080 should cost what the 4070 Ti's MSRP is right now.

  • @gedeuchnixan3830
    @gedeuchnixan3830 Рік тому

    My last GPU was a Radeon, my current is a Radeon, and my next might be Intel's second generation, but certainly no nGreedia card. Also, Adrenalin is a way better driver than nGreedia's ancient WinXP control panel and the malware called "Experience".

  • @MarvRoberts
    @MarvRoberts Рік тому +4

    I just said screw it and replaced my 1660 ti with a used 3080 for $400 from ebay.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      That's a deal that will last for years. Best way to spend $400 in this market!

  • @kingcrackedhen3572
    @kingcrackedhen3572 Рік тому

    I got a 3070 with an Alphacool waterblock for $305.
    I was originally looking for a 3070 for $280 or less at the time, and I'm still happy with the deal I found, albeit a used card.
    I use Nvidia simply for Minecraft RTX (somehow my fps goes up when I turn it on).

    • @oskan8522
      @oskan8522 Рік тому +1

      Wow, that's insane. I got mine at first release for 650 euros 🥲

  • @bfish9700
    @bfish9700 Рік тому +2

    AI saved Nvidia this gen; thankfully no one is buying these GPUs at these prices. I think the AI bubble will burst in a year or two, and Nvidia will have to deliver value to gamers at that point or explain to investors why their revenue tanked.

  • @1oglop1
    @1oglop1 Рік тому

    I was considering an AMD GPU myself, but then I saw people selling AMD to get Nvidia for VR and AI training. There are also the driver issues and the lack of "GeForce Experience" convenience from AMD.

  • @kkrolik2106
    @kkrolik2106 Рік тому

    Second hand is the only way. I got a GPRO X080 for $100 (an RX 6700 without display ports); I'm using my motherboard outputs via the iGPU :)

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Now that's a way to get price to performance!

  • @phlips11
    @phlips11 Рік тому

    bought a 6950xt for $630. It's great. No issues at all. Everything max settings 1440p, no RT. Very happy with this purchase. I will be using this card for several years

  • @fail.thunder
    @fail.thunder Рік тому +1

    The 4090 is way too overhated, but if you think about it, the card is gonna last you years and years. It's the only card that had an outstanding jump in performance. The 4090 is a solid long-term investment. If you upgrade often, then stay away from it.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      That is an interesting perspective. If you keep it for 5-7 years then the cost is $200-$300/yr...I wonder what the performance will be like then?
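
A tiny sanity check of the "$200-$300/yr" figure above, assuming a 4090 bought at roughly its $1,600 launch price (the exact purchase price shifts the result):

```python
price_usd = 1600  # assumed purchase price, close to the 4090's launch MSRP
for years in (5, 6, 7):
    print(f"kept {years} years -> ${price_usd / years:.0f} per year")
# 5 years -> $320/yr, 6 -> $267/yr, 7 -> $229/yr: in the ballpark quoted above.
```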

  • @catriona_drummond
    @catriona_drummond Рік тому

    For 1080p, get an Intel card. They need some sales.

  • @Drebin2293
    @Drebin2293 Рік тому +1

    There are a lot more AMD users than most think. A decent chunk of Nvidia users on things like the Steam hardware survey comes from net cafes. Then you have the laptop legion that is almost exclusively Nvidia, because AMD hasn't had a real presence there in a while. But when it comes to desktop users? I know quite a few people with AMD cards, myself included. All you have to do is look at the gaming division sales numbers to know how many cards are shipping.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      I looked into the net cafe claim. When the pandemic struck, all the net cafes closed. All of them. So you would think that, after a couple of years, the numbers would have made a huge shift. Except they didn't. They were very consistent. So consistent that the "decent chunk" coming from them was insignificant. The problem with gaming division sales for AMD is that you need to remove the sales of the chips supplied to consoles. That's a lot of sales.

  • @istgyash
    @istgyash Рік тому

    The reason I bought the 4070 Ti when I still had the option of a 7900 XT is simply that I wanted to do productivity tasks as well as gaming. For me, DLSS 3 isn't as important as Nvidia's CUDA for application support; it's much, much more useful than AMD in Adobe applications and in 3D modelling. If AMD had brought better competition to the productivity market, I would have really ditched Nvidia...

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Thanks for the feedback. Many people need CUDA for productivity and that just seals the deal against AMD Radeon GPUs.

  • @jasonking1284
    @jasonking1284 Рік тому +1

    Will you help Jensen achieve his goals? If yes, then buy a 40 series today...

  • @gexnfx4223
    @gexnfx4223 Рік тому +2

    Everyone knows this is a skip-it generation. The 20 series was as well, and then Nvidia had to correct with the 30 series. We should see the same with the 50 series, if they care about GPU sales anymore.

    • @Some-person-dot-dot-dot
      @Some-person-dot-dot-dot Рік тому +1

      There will be another "miner boom" level event for the 50 series. Blackwell will most likely not be sold in large quantities; I'm predicting a very slow supply similar to the pandemic, an artificial shortage similar to what we saw in late 2022. Nvidia is fighting with TSMC on a deal that's already been made, and TSMC forcing them to buy the silicon isn't going to deter what's coming. Nvidia will probably try to shift the graphics card SKUs again. This means the 5090 would actually have the same die configuration that the 4080 16GB had, the 5080 would be on the same die configuration as the 3060 Ti (because of the double shift up), and the 5070 would be the 4060-class die, which was the 3050 die configuration. That would make their Blackwell lineup double-shifted, because they are shifting the lineup on top of the shift up for the Ada Lovelace cards. A double shift up. They will charge 90-class money for an 80-class card. Probably 1100 dollars for the 5070 Ti and 900 dollars for the 5070.

  • @theultimategamer8537
    @theultimategamer8537 Рік тому

    I guess I feel slightly more validated that I got my 4070 for effectively $500 with that Micro Center deal; I wouldn't have considered it otherwise.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +2

      That really was the deal...for a short period of time.

    • @theultimategamer8537
      @theultimategamer8537 Рік тому

      @@ImaMac-PC Yeah, I literally managed to pick it up on the last day it was available too.

  • @ceuser3555
    @ceuser3555 Рік тому

    $500 is my limit for the highest-end GPU. Anything above that is crazy.

  • @shasha023
    @shasha023 Рік тому

    I still have a GTX card. I tried to upgrade to a 2080, but they were too expensive at the time; then came the 3080 and you could never find them; then came the 4080 and I had my money ready, but it cost $1200. I told myself I'll wait for the 5080 and hopefully it will have a reasonable price.
    I'll never buy AMD. Maybe if Intel had a higher-class card I would buy it, but the A770 is too weak.

  • @lexluthor4156
    @lexluthor4156 Рік тому +4

    In the Chinese market, some lower-end RTX 4080s are already 7299 CNY, which is about 1000 USD tax included.

    • @stepbruv8780
      @stepbruv8780 Рік тому +2

      That's still $300 over a normal MSRP.

  • @benc3825
    @benc3825 Рік тому +1

    Let's look at it from their earnings.
    AMD's gaming revenue for Q1 2023 was $1,757 million, down 6%.
    Nvidia's gaming revenue for Q1 2023 was $2,240 million, down 38%.
    AMD defines its gaming segment as:
    discrete GPUs, semi-custom SoC products, and development services.
    Nvidia defines its gaming segment as:
    GeForce GPUs for gaming and PCs, the GeForce NOW game-streaming service and related infrastructure, solutions for gaming platforms, Quadro/NVIDIA RTX GPUs for enterprise design, GRID software for cloud-based visual and virtual computing, and automotive platforms for infotainment systems.
    For Nvidia, "gaming" includes a lot more than it does for AMD. Market share cannot go from 20% to 10% in a quarter; it is simply impossible. Maybe the graph really shows shipments; that is what happened recently when the press misunderstood JPR's market share numbers.
    From what I know, the market share is really around 25% to 75% for AMD and Nvidia respectively.

    • @XX-_-XX420
      @XX-_-XX420 Рік тому

      I think it's more like +/-80% Nvidia, then Intel is a bit bigger than AMD at around 10%, and then AMD with the remainder.

  • @Altimittkun
    @Altimittkun Рік тому

    RIP $100 cards. I remember I bought my R7 250 for $100 and my GTX 960 for $130.

  • @HDJess
    @HDJess Рік тому

    Well, the only reason I wouldn't consider a Radeon GPU nowadays is the lack of DLSS; the other stuff is just gimmicks. Actually, I liked my RX 6600 more than my current 3060 due to owning a FreeSync Premium display; it was just so smooth and with zero stuttering. With Nvidia, even if the monitor is G-Sync compatible, it somehow works worse.
    About upgrading, I'll most probably skip this gen; the prices are insane in Romania. RTX 4060 Ti = 505 EUR, 4070 = 685 EUR, and so on.
    Plus, I don't have any hard reasons to upgrade. The most demanding games I play right now are PoE and D4, and DLSS does wonders in D4 at 1440p on this 3060; I get 100+ FPS with max details.

  • @Jacklee-qh1cv
    @Jacklee-qh1cv Рік тому +1

    $399 for an 8GB card in 2023 is ridiculous, and the 7600 is just as bad.

  • @starofeden9277
    @starofeden9277 Рік тому +1

    TSMC announced another price hike the other day,
    so GPU prices are going up again for next gen,
    and Nvidia is helping with that.
    Every 5 years the prices go up;
    check the flow chart.
    It ain't coming down.
    The 4000 series is the new low for prices.

  • @australienski6687
    @australienski6687 Рік тому

    I purchased a used 3090 for $500. I have tried AMD GPUs in the past and I kept having issues with drivers and random crashes.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      Thanks for the feedback. Enjoy that 3090, you got a great deal!

  • @Skotty64081
    @Skotty64081 Рік тому +1

    I've historically only bought Nvidia GPUs, but I'm actually planning on switching to AMD next year. The 4080 is overpriced; the 4060 Ti is a "feel bad" product (it's technically better than the previous gen, but with a narrower bus, fewer PCIe lanes, and worse performance scaling at higher resolutions, it's the kind of product that comes packaged with a side of regrets); and 12GB of VRAM is unacceptable on the 4070 Ti at the $800 price point... 12GB is enough in a general sense, but come on... an $800 price point. I think only the 4070 is regret-free, but it's still not a good value. Anyway, I'm looking at the 4070 Ti performance level, but I just can't stomach 12GB of VRAM at $800, so I'm heavily leaning towards the Radeon 7900 XT.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      I agree with the "feel bad" on that 60ti. If that 7900 XT drops to $700, then I think it's a worthy upgrade.

  • @NormanTiner
    @NormanTiner Рік тому

    Of the bad Nvidia options, the 4070 is the most decent. If you can snag a used or Micro Center returned card then it could even be a good deal.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      I agree that with openbox prices, the 4070 is the "least bad" this generation.

  • @johngray3449
    @johngray3449 Рік тому +1

    All cards but the 4090 have fake names; the performance increase should be 60% over last gen.

  • @CreateNowSleepLater
    @CreateNowSleepLater Рік тому

    Good video, and I think the entire PC gamer AND content creator space agrees with you. Nvidia really is the only choice if you do content creation; AMD fails miserably at Blender and Premiere Pro. So the thing is, what choice does anyone have? It's not like the 5000 series is going to be lower. You might have some value proposition in terms of the increase in performance over the 4000 series, but they are not lowering prices. Sales are terrible and Nvidia doesn't care. They will likely sit on the 5000 launch, because what motivation do they have for a next gen? Their current gen isn't selling.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Thanks. You may be right about them delaying the 5000 launch, because... why would they need to?

    • @CreateNowSleepLater
      @CreateNowSleepLater Рік тому

      @@ImaMac-PC They still have the 16GB 4060 Ti (which won't help much because of the die size and memory bus) and the 4050. That is even before you get to the inevitable 4080 Ti and 4090 Ti. As we speak, I am agonizing over getting a 4080 or a 4070 Ti. One thing to note: you don't get dual encoders for AV1 unless you go 4070 Ti and up. If you stream and record at the same time, this is huge. I know the AMD cards have this, but as mentioned, their benchmarks for 3D stuff are garbage. You also have to hope that programs support AMD's implementation; I believe OBS and DaVinci do now. To your point, if the 7900 XTX was $800, it would be flying off shelves, but why would I spend $1000 when for $100 more I can get a 4080? This gen sucks lol.

  • @floorgang420
    @floorgang420 Рік тому

    Remember that you don't need a "too powerful" GPU now, thanks to upscaling (DLSS, FSR).
    If you've got a 4K120 display like an OLED TV, your target is 1440p at 90 fps (70-110 with VRR).
    And for that reason, if you don't need RT, the RX 6800/6900 are good enough. And if you need RT, the RTX 4070/Ti are the best for the price.
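
A small sketch of why a 4K output maps to a roughly 1440p render target with upscaling, as the comment above argues. The per-axis scale factors below are the commonly published DLSS/FSR quality-mode ratios; treat them as approximations rather than a spec:

```python
OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K display
SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

for mode, s in SCALE.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    print(f"{mode}: internal render ~{w}x{h}")
# Quality mode lands at ~2560x1440, so a GPU that comfortably does 1440p can
# drive a 4K120 display with upscaling, which is the comment's point.
```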

  • @RazSkull673
    @RazSkull673 Рік тому

    I have been a lifelong Nvidia user, and it's the same exact reason I don't switch brands from my iPhone: I stick with solid brands that work and are reliable. AMD does have all the bells and whistles regardless of specs. I just bought an open-box 4070 at $500 including taxes.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      There is something about getting something that "just works". Open box for $500...well done! At that price, you won't feel remorse about the purchase and it does work well at 1440p.

  • @pweddy1
    @pweddy1 Рік тому

    I feel like AMD's 6600 (XT) / 6650 XT are currently the new "RX 580s". They got crap reviews because they had a bad MSRP and were panned, but everyone recommended them after the prices dropped.
    The 7600 will also likely fall into that category.
    The big issue with Navi 3 is that they seem to have failed to achieve their design goals. The 7600 is performing like an RDNA 2 refresh, not like a new generation. You have 32 CUs that have a marginal improvement over the last 32 CU card. This is more like the difference between the RX 580 and the 590 than, say, the difference between Navi 1 and 2. There was a major technical fail somewhere, and we may never be told what happened.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Something majorly went wrong with RDNA3 and it's not the fault of "chiplets". I also think of the RX 6600/6600XT/6650XT as the new RX 570/580/590.

  • @taylorshin
    @taylorshin Рік тому

    I just purchased an RX 7600 for my secondary PC, and I'm still using a 6700 XT on my gaming (and main) rig. If the opportunity allows, I will also try out Intel's Arc (or Battlemage) series. Unless you have a specific reason to use CUDA-based applications, i.e. AI acceleration or media production, or you're a professional gamer who needs 4090 performance with DLSS 3, there is no reason to pay the premium to Nvidia at all.
    The prices of AMD cards are dropping and stabilizing as their real performance becomes clear. The driver fiascos don't last months; they at least fix critical bugs, and most issues have at least some workarounds.
    People avoiding AMD/Intel is just religious. Intel? Sure, there are still a few fiascos left, but they are catching up quite fast. And people are pissed off at their pope since he started charging the "correct" price for being fanatics.

  • @theultimategamer8537
    @theultimategamer8537 Рік тому

    I just hope that Intel has something to say about the state of the market with their next gen of GPUs.

    • @bfish9700
      @bfish9700 Рік тому +2

      They need to figure out how to write functional drivers and get a product out on time first.

    • @XX-_-XX420
      @XX-_-XX420 Рік тому

      @@bfish9700 The drivers are better than Nvidia's, tbh. Nvidia has the worst drivers. I'm having a good amount of issues with RTX 4000 vs. no issues with Arc and few issues with RX 6000.

  • @Uraim
    @Uraim Рік тому

    I got an XFX RX Vega 64 8GB used for 110 EUR + 5 EUR shipping. I think it was worth it.
    I upgraded from a dang stinky Radeon R7 240 2GB.

    • @ImaMac-PC
      @ImaMac-PC  Рік тому

      Much more usable GPU and that price is good. My recommendation is to watch the temps and undervolt as needed to keep it cool.

    • @Uraim
      @Uraim Рік тому

      @@ImaMac-PC On max load with +25% more power, the temps are 70-72°C.

    • @Uraim
      @Uraim Рік тому

      @@ImaMac-PC Temps are good. Using it at +25% power, which consumes around 240 watts, it's still 70-72°C, even after hours with no vsync but maxed-out load. No undervolt needed.

  • @ozanozkirmizi47
    @ozanozkirmizi47 Рік тому +1

    This is not just greed; it is also obvious humiliation of their own customers...
    I'm not gonna waste my money, nor support these kinds of policies...
    I see no problem in not buying any trap product.
    Good luck to them wasting their time in the hope of making me easy prey.

  • @johnc8327
    @johnc8327 Рік тому

    The 7900 XTX is the only non-Nvidia GPU that significantly beats a 4070 Ti. It's not surprising that they are expensive.

  • @rudybomba2
    @rudybomba2 Рік тому

    I had a 1660 Super and went with a 4070 Ti... I really didn't want to mess with my almost 10-year-old 850W power supply. Any superior AMD model would require me to upgrade both the power and the cooling.

    • @stangamer1151
      @stangamer1151 Рік тому

      I'd recommend replacing a 10-year-old PSU anyway. I had a very good PSU which served me ~11 years, and once, when I tried to turn my PC on, one of the capacitors just exploded. Thankfully this did not hurt any of the other PC components, but I'd seriously recommend not waiting until a PSU fails.
      The 4070 Ti is an expensive GPU and it should be treated well.

  • @Eianex
    @Eianex Рік тому +1

    I would not consider Radeon because of RTX. Period. And maybe G-SYNC...

  • @CostasLilitsas
    @CostasLilitsas Рік тому +1

    Well, I personally have two reasons to avoid AMD. 1 - After all these years of raster graphics I simply got tired of them. I fell in love with ray tracing when I first saw it in Control! It was the first time in 20+ years that I got excited about graphics again. And 2 - My monitor's vertical resolution is 1200 (3840x1200). I need that PPI because I use my computer for Cubase too; higher PPI makes UI elements too small. At that low vertical resolution the only upscaler that works well is DLSS, since FSR needs a higher pixel count to do a good job. So the only card I'm considering right now is the 4080 (currently on a 2070). It's good at ray tracing, it supports DLSS, and it has a chance of being adequate in VRAM (fingers crossed). On the other hand, Nvidia is stealing my money... no doubt about that... And I don't see next gen being any cheaper, thanks to AMD, plus it's a year and a half away. I recently played Dying Light 2, and in order to have global illumination active (a day and night difference) I had to play it in DLSS Performance. Should I wait and play nothing until next gen? I don't know, man... All the deals are crap everywhere I look... Damn it...

    • @bansheezs
      @bansheezs Рік тому

      Wait till FSR 3. It will be almost equal to DLSS and work on everything.

  • @marymartin2167
    @marymartin2167 Рік тому

    We need a new GPU maker that can provide the same or better performance than Nvidia and AMD. Intel needs to step up more.

  • @TheCoolLama
    @TheCoolLama Рік тому +4

    Very good video, well researched and put together, thanks!
    Nvidia GPUs are generally (much) cheaper than AMD in price/perf here in Europe (NL) and South East Asia, and always in stock. We don't get the sweet deals the US has, and this is most likely the reason Nvidia has the most market share and is overrepresented in the Steam HW Survey. The 6800 still goes for €540 (€400 second hand), while the 4070 goes for €600 and the Intel A770 for €400. The same scaling applies to lower tier cards (AMD is only cheaper at the high end). Nvidia has much better perf/W and features (a much better encoder vs AMD's utter shit quality at the same bitrate, DLSS, etc.). It's a no-brainer which is the (much) better value for money here, and even if price/perf were equal, I'd pick Nvidia over AMD (innovation vs copy-pasting half-baked "Chinese" clones). Pure raster performance, e.g. WH3 Ultra @ 1440p UW in €/fps: A770 €11.88, 6800 €10.35, 4070 €8.04 (a quick sketch of this €/fps math follows after this comment).
    That being said, there is no price/perf improvement over last gen: the 4070 is about on par and the rest are worse. RDNA3 is a complete disaster. Clear skip here. It seems Nvidia has adopted the tick-tock strategy Intel used to employ: odd-numbered generations are pretty good value for the money, so let's see what the 5000 series brings. Regardless, I've expected it to be this way for over a year now, vs all the hopeful sheep (hope = delayed disappointment). I'll upgrade once either vendor releases 3nm cards with >= 16GB VRAM, proper performance (>= 60 fps 1% lows in WH3 Ultra @ 1440p UW), much better perf/W, and at least 15% better price/perf than the 4070.
    Note, as an interesting observation from comparing stats in my spreadsheet: the fab node factor gen-on-gen scales pretty well, within a 10% margin of the expected perf/W improvement, regardless of transistor density. 3nm should be interesting not only in perf/W but theoretically also in price/perf, as the N3 node is only 15-20% more expensive with over 50% more density and better yields. Also, GDDR6(X) is at an all-time-low price of $25/8GB (not even counting staple discounts). If they still up the premium and therefore the margins on the 5000 series, that will be my last straw and I'll quit the occasional AAA games that I play on PC. I've noticed that I enjoy and play indie (Rimworld and co) and Paradox games way more, and those can easily run on a weak old GPU. I always buy a console when a new Rockstar game comes out (because I love the RDR and GTA series). I can afford the 4000 series and a 4090, but I refuse to be milked; those experiences and memories will last a lifetime (you always remember the bad purchases you've made, i.e. "I should have...", and people seem to never learn; not me).
    I wish AMD could, and even tried to, compete, but Nvidia is just too far ahead and can price however they want if they feel people will buy it and still make a good profit. I already saw this with the 1000 series: Nvidia's HEVC encoder was actually really usable, and they perfected it by adding B-frames on the 2000 series (that encoder is still the same one on the 4090), whereas AMD's encoder was pure shit quality at the same bitrate and not usable for either archiving or Plex use cases. That expertise fed into DLSS and now DLSS 3, the latter of which is pretty complex, and I don't think AMD will be able to match it (based on their history) any time soon (say 6-8 years, if ever). AMD can only compete in the price/perf department, and they REFUSE to do so, copying Nvidia's strategy instead (too-high pricing, delaying launches like Nvidia does, absolute shit marketing for over a decade (pure lies), the failure that is RDNA3 (~100W idle power consumption with multiple monitors), etc.). If you've also been in this market for decades like me, you know this happened before, just before the 2008 crash, when AMD and Nvidia price-fixed the market. I dare to bet this has been happening again since the crypto mining craze.
    Thing is, if people keep buying, prices will go up out of proportion again next gen. Corporations have one goal, which is to make more profit. I already knew smartphones would end up priced the way they are because of this. I know quite a lot of people who buy the latest and greatest iPhone every year; the price has always gone way up and they still buy it, but Apple at least has reasonable margins on the quality components they use, about 25-30%. This is why democracy never works: the masses will ruin everything. Remember, the first days of something new are always the best experience, like the internet or a new online game; it gets ruined later on by hackers and trolls. It will only get worse unless there is a saving grace from those corps (the IRONY). Don't expect it from the sheeple; the numbers will never be there. Just like in any revolution, it's only a few percent who make the change and have to enforce it on everyone else, or indoctrinate them with daily propaganda, else nothing gets done. This is why history keeps repeating: the same morons keep reincarnating over and over again.
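
    The €/fps figures quoted above reduce to purchase price divided by average fps. A minimal sketch of that arithmetic, assuming the average fps values are back-calculated from the quoted prices and €/fps numbers (they are illustrative, not measured benchmark results):

    ```python
    # Sketch of the euro-per-frame metric quoted above.
    # Prices are the quoted NL street prices; avg_fps values are assumptions
    # back-calculated from the quoted EUR/fps figures, not real benchmark runs.
    cards = {
        "Intel A770": {"price_eur": 400, "avg_fps": 33.7},
        "RX 6800":    {"price_eur": 540, "avg_fps": 52.2},
        "RTX 4070":   {"price_eur": 600, "avg_fps": 74.7},
    }

    for name, card in cards.items():
        eur_per_fps = card["price_eur"] / card["avg_fps"]  # purchase cost per average frame
        print(f"{name}: {eur_per_fps:.2f} EUR per average fps")
    ```

    Run as-is, this reproduces roughly €11.9, €10.3, and €8.0 per fps, matching the figures in the comment within rounding.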

    • @ImaMac-PC
      @ImaMac-PC  Рік тому +1

      Thank you! I echo your comments on the 4090... I also don't like to be milked! Human behavior doesn't seem to learn from history very well, and over time it tends to repeat itself.

    • @SIPEROTH
      @SIPEROTH Рік тому

      Lol, no. Nvidia isn't cheaper in Europe, and AMD isn't only cheaper at the high end. The RX 6600 is literally the best value card in Europe right now, and so are other cards like the RX 6700, etc.

    • @TheCoolLama
      @TheCoolLama Рік тому

      @@SIPEROTH Nah... Like I've said, Nvidia is cheaper. You pay more upfront, but you get more fps/€ here in NL (Europe). Just because you see €200 and think "cheap" doesn't mean it is. Do the math first. WH3 Ultra @ 1440p UW in €/fps: 6600 (€200) €8.46 (at 19.69 fps/100W) vs 4070 (€600) €8.04 (at 37.14 fps/100W).
      Per frame, the 6600 is 5.2% more expensive, and it uses 88.6% more power to reach the same fps as the 4070. It also only gets 23.63 avg fps @ 1440p UW vs the 4070's 74.66, so it's a useless card for me, i.e. I'd never buy your "cheap" card in the first place; it would literally be €200 pissed away, as I have no use for it.
      With the current capped electricity price of €0.40/kWh (no longer capped in '24; the market price varied between €0.76 and €1.12 per kWh over the last 6 months), the 4070 is MUCH cheaper to run as well, on top of already being cheaper in €/fps (see the sketch after this comment). You also get more features, which I do use (like the encoder).
      Yes, big "lol" to all people who don't use their intellect.
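
      The comparison above boils down to two numbers per card: purchase price per average frame and implied power draw. A minimal sketch of that arithmetic, using only the figures quoted in this thread (prices, average fps, fps per 100 W, and the €0.40/kWh cap); the per-hour figure assumes the GPU alone draws the implied wattage for a full hour of gaming:

      ```python
      # Sketch of the purchase-cost and running-cost comparison quoted above
      # (WH3 Ultra @ 1440p ultrawide figures as stated in the comment).
      KWH_PRICE_EUR = 0.40  # quoted capped electricity price

      cards = {
          "RX 6600":  {"price_eur": 200, "avg_fps": 23.63, "fps_per_100w": 19.69},
          "RTX 4070": {"price_eur": 600, "avg_fps": 74.66, "fps_per_100w": 37.14},
      }

      for name, c in cards.items():
          eur_per_fps = c["price_eur"] / c["avg_fps"]        # purchase cost per average frame
          watts = c["avg_fps"] / c["fps_per_100w"] * 100     # implied GPU power draw at that fps
          eur_per_hour = watts / 1000 * KWH_PRICE_EUR        # electricity cost per hour at that fps
          print(f"{name}: {eur_per_fps:.2f} EUR/fps, ~{watts:.0f} W, "
                f"~{eur_per_hour:.3f} EUR per gaming hour")

      # Power needed to hit a common fps target scales with 1 / (fps per W):
      ratio = cards["RTX 4070"]["fps_per_100w"] / cards["RX 6600"]["fps_per_100w"]
      print(f"RX 6600 needs ~{(ratio - 1) * 100:.1f}% more power for the same fps")
      ```

      This reproduces the €8.46 vs €8.04 per-frame figures and the "88.6% more power for the same fps" claim; the absolute per-hour costs are only a rough indication, since total system power and actual playtime vary.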

    • @9210069235
      @9210069235 Рік тому

      @@TheCoolLama Which is the best value card right now?

    • @Name-tn3md
      @Name-tn3md Рік тому

      @@TheCoolLama lol, smart geek wannabe

  • @blodrush25
    @blodrush25 Рік тому +1

    I am so done with Nvidia until the prices get cut in half.

  • @youcantata
    @youcantata Рік тому

    This video clearly shows that Nvidia is trying to lower the price resistance of mainstream/budget gamers on graphics cards. GPUs are getting so powerful that a small, cheap GPU will soon be enough for the average gamer at FHD, which would eat into Nvidia's mass-market revenue. To prevent that, Nvidia is limiting the performance improvement of budget graphics cards and raising their price level.