NVIDIA expects to get away with this...

  • Published Sep 20, 2022
  • New Customer Exclusive, Receive a FREE 256GB SSD in Store: micro.center/osd
    Check out Micro Center’s PC Builder: micro.center/va2
    Submit your Build to Micro Center’s Build Showcase: micro.center/zsd
    Get your JayzTwoCents Merch Here! - www.jayztwocents.com
    ○○○○○○ Items featured in this video available at Amazon ○○○○○○
    ► Amazon US - bit.ly/1meybOF
    ► Amazon UK - amzn.to/Zx813L
    ► Amazon Canada - amzn.to/1tl6vc6
    ••• Follow me on your favorite Social Media! •••
    Facebook: / jayztwocents
    Twitter: / jayztwocents
    Instagram: / jayztwocents
    SUBSCRIBE! bit.ly/sub2JayzTwoCents
  • Science & Technology

COMMENTS • 14K

  • @VGMoney
    @VGMoney 1 year ago +11424

    I hope AMD/Intel can really put up some competition so NVIDIA can't pull things like this

    • @Gilaric
      @Gilaric 1 year ago +105

      What if they join their pricing? Everyone raises their prices

    • @MrTheguywiththemoney
      @MrTheguywiththemoney 1 year ago +239

      Stop buying their gpus then...

    • @aznjay94
      @aznjay94 1 year ago +157

      I think AMD isn't lacking in the hardware department, but in the software department. NVidia's DLSS is making (or made) their cards more attractive than AMD cards....well until the price hike....

    • @IdeasAreBulletproof
      @IdeasAreBulletproof 1 year ago +31

      Hopefully neither of them gets greedy, and they put Nvidia in check. I know Arc A can only compete with a 3060 to 3060 Ti (and Arc B, if that ever comes), but RX 7000 should for sure put Nvidia in check, unless AMD is also greedy as f

    • @strelokscar2578
      @strelokscar2578 1 year ago +54

      They don't have to. It's up to us not to buy their trash pricing.

  • @LAZYYELLOWDOG
    @LAZYYELLOWDOG 1 year ago +5038

    The price hikes and loss of EVGA are major deterrents to me going RTX 4000.

    • @ErrorInvalidName
      @ErrorInvalidName 1 year ago +42

      Agreed, I've fully sworn it off; my 30 series is more than powerful enough for anything I'll ever do.

    • @CharleyRiddle
      @CharleyRiddle 1 year ago +95

      Losing EVGA is major!

    • @anivicuno9473
      @anivicuno9473 1 year ago +19

      Yeah, if they want me to pay these kinds of prices, I'd better be able to get an FTW3 card

    • @vascovalente3929
      @vascovalente3929 1 year ago +2

      Really? So they can present you a downgraded GPU, and that is not one of the factors?

    • @r3mxd
      @r3mxd 1 year ago +1

      ok and? just get a strix

  • @fundude365
    @fundude365 1 year ago +307

    Waiting to see what AMD can do.
    Might end up with a 100% team red system for the first time.

    • @rexyoshimoto4278
      @rexyoshimoto4278 1 year ago +5

      Money-wise, Nvidia's always been rather cruel with us, who enjoy their products and buy them exclusively. Also waiting to see what Intel's gonna do with Arc. Gamers Nexus, Linus and many other tech outlets are holding Intel's feet close to the fire. They're shouting, "Feast or famine! Make a great video card and we'll buy... or die in humiliation and obscurity."🙂

    • @Edel99
      @Edel99 1 year ago +2

      Yeah, that's what I'm waiting for.

    • @SoloWingGutman
      @SoloWingGutman 1 year ago +2

      Same

    • @NoBrainah
      @NoBrainah 1 year ago

      I did that last year. It was fucking trash but this year I hope they can do something with the cpus and make them better

    • @whataday443
      @whataday443 1 year ago +3

      @@NoBrainah How? They already have some of the best CPUs on the market. The 5800x3d is not fucking around.

  • @rexyoshimoto4278
    @rexyoshimoto4278 1 year ago +110

    You got that right. I remember when the Nvidia 970 was advertised as a 4GB card. It really was 3.5GB. A class action lawsuit followed.

    • @Thewaterspirit57
      @Thewaterspirit57 1 year ago

      Actually, the 970 does have 4GB of RAM; it's just that Nvidia set up the other 512MB wrongly, so games don't use the full amount. The 980 has the same RAM, but for some reason it's not broken.
      I think Nvidia just rushed the 970 to release alongside the 980 in 2014.

    • @derek400004
      @derek400004 1 year ago +9

      @@Thewaterspirit57 you are missing the point. The 0.5gb was not implemented in the same pipeline and was also significantly slower. Also, Nvidia misstated key figures like the ROPs at launch, which further misled customers by making it harder to discover the 0.5gb of VRAM was crap. Nvidia agreed to settle the lawsuit, which is basically an admission of guilt imo.
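
The split-partition penalty described in the reply above can be put into numbers with a toy model. The 196 GB/s and 28 GB/s partition speeds below are the commonly cited community figures for the GTX 970's fast and slow segments, used here as illustrative assumptions rather than official specs:

```python
# Toy model: effective bandwidth when sweeping a memory pool that is split
# into a fast partition and a much slower one (assumed GTX 970-like figures).

def effective_bandwidth(segments):
    """segments: list of (size_gb, bandwidth_gb_s). Returns GB/s for one full sweep."""
    total_gb = sum(size for size, _ in segments)
    total_s = sum(size / bw for size, bw in segments)
    return total_gb / total_s

gtx_970 = [(3.5, 196.0), (0.5, 28.0)]  # assumed partition sizes and speeds
print(effective_bandwidth(gtx_970))    # 112.0 GB/s
```

In this model, touching the last 0.5 GB roughly halves the effective bandwidth of a full sweep, which is why games stuttered once they spilled past 3.5 GB.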

    • @rexyoshimoto4278
      @rexyoshimoto4278 1 year ago

      @@derek400004 What point is there when they've agreed to settle the lawsuit? By settling, they further admitted it. The point is they gave no indication of this; does that mean they can say, "Well, it's only untrue if you ask"? Imo they knew what they were doing, admitted it, and paid out the class action lawsuit. You have to remember Nvidia is not in the business of fixing or repairing video cards. They're in the business of selling them.

    • @derek400004
      @derek400004 1 year ago +7

      @@rexyoshimoto4278 my response wasn't to you, but to the other commenter who said Nvidia did give the GTX970 4gb ram, which was missing the point.

    • @rexyoshimoto4278
      @rexyoshimoto4278 1 year ago +1

      @@derek400004 Point taken. I stand corrected.

  • @BlackHoleForge
    @BlackHoleForge 1 year ago +5809

    First they manipulate the price of the cards, then they cut off almost half of the cores and still charge almost $1,000. It's time we got some diversity in these cards. EVGA is looking a lot better today for their decision. If our leather-coated friend wants to be Apple so much, he should just go work for them. First we got shafted by the pandemic, then they shafted us by selling most of the cards to Bitcoin miners, then it felt like a paper launch of the 3070 series, then they artificially inflated the price by controlling the supply, and now this. I don't care how many cool gadgets like Resizable BAR or ray tracing they have; Nvidia right now looks like they don't know how to run their business, and they lost the mindset of their value customers, the ones who helped build them up to what they are today.

    • @r3mxd
      @r3mxd 1 year ago +20

      or stop being poor and just buy one lmfao.

    • @jamisonmunn9215
      @jamisonmunn9215 1 year ago +94

      Here is the thing, no one is buying them and as long as no one is buying them the right price will be found.

    • @richardyao9012
      @richardyao9012 1 year ago +4

      The 4080 uses a different die size according to techpowerup, which got its data from people at Nvidia running GPU-Z. They did not disable almost half of the cores. Their yields are likely not that bad.

    • @Artificial.Unintelligence
      @Artificial.Unintelligence 1 year ago +29

      I feel like no one addressed why EVGA can't just swap to AMD. And until Nvidia starts cleaning up their act and stops just throwing more power at the problem, I'll be going AMD. No one wants to be popping breakers in their office for daring to have more than one device on

    • @mujtabamujeeb786
      @mujtabamujeeb786 1 year ago +136

      Agreed.

  • @Whoever_knows
    @Whoever_knows 1 year ago +4191

    The more I read into Nvidia, the sleazier their business seems. No wonder evga split ties

    • @0fficialdregs
      @0fficialdregs 1 year ago +36

      Same. I'm moving to AMD when I can.

    • @r3mxd
      @r3mxd 1 year ago

      oh noes poor evga oh noes wahh holy fuck current gen of workers cry so much.

    • @darthwiizius
      @darthwiizius 1 year ago +6

      Shame they didn't do a deal with Intel; that way they could exit the GPU market slowly while helping Intel establish a third competitor. This would help their staff too, I think.

    • @CarlosXPhone
      @CarlosXPhone 1 year ago +1

      The timing seems suspicious.

    • @ronniewhitedx
      @ronniewhitedx 1 year ago +1

      Yeah, after this review I'm really happy EVGA cut ties with this scummy company. Absolutely disgusting, misleading consumers like that.

  • @bassx101
    @bassx101 1 year ago +12

    Micro Center is an excellent place to shop for electronics and pc builder enthusiast, so glad you have these guys as a sponsor. Good co op Jay 👍

    • @Fuzzhead392
      @Fuzzhead392 1 year ago

      I got all my parts at micro center to build my pc

  • @ApexOT
    @ApexOT 1 year ago

    Glad to hear about this. Thank you for putting this out.

  • @gnomekingclive
    @gnomekingclive 1 year ago +1373

    I remember paying $400 for my 1070; those were the days. Even then I thought that was ridiculous. Nvidia is spitting on us these days.

    • @mr.puddintater1805
      @mr.puddintater1805 1 year ago +13

      It's actually clearing their throat and adding mucus.

    • @cyllananassan9159
      @cyllananassan9159 1 year ago +1

      and the fanbois are taking it as if it was a porn star....

    • @StarsMarsRadio
      @StarsMarsRadio 1 year ago +4

      Yup. I thought they were shooting themselves in the foot with prices, but people keep buying cards that cost as much as a used car.

    • @MachineGunSquirrel
      @MachineGunSquirrel 1 year ago +63

      Bought my 970 for $300 and balked at spending $500 on a 980. Fast forward six years and now I'm spending $700 on a 6900xt. Granted, 6700xt is still reasonably priced in comparison with modern cards.

    • @BarAlexC
      @BarAlexC 1 year ago +4

      The funny thing is we had the 3060 Ti at 400 too, and that's ~90% of the 3070.
      I guess too much remaining 3000-series stock magically turns the 4070 into a 4080 12GB at an absurd price...

  • @WhiteshadowTV
    @WhiteshadowTV 1 year ago +527

    Can I just say this: If a company gets pissed off at your channel for speaking the truth, then that company isn’t worth our money. If nvidia can’t fix their issues, then they don’t really deserve the business that you can potentially give them.

    • @antioc5095
      @antioc5095 1 year ago

      Capitalism at its best. We need to remember to use that leverage in all industries. Stop supporting the companies that are intentionally duping us.

    • @Totenglocke42
      @Totenglocke42 1 year ago +3

      Manufacturing outrage for profit is not "speaking the truth".

    • @iamrocketray
      @iamrocketray 1 year ago +24

      Nvidia are getting too big for their boots, in my opinion. Although their ray tracing is better than AMD's, I went from an Nvidia 2080 to an AMD 6900XT and I couldn't be happier. It compares (if you discount the ray tracing) with the top tier of Nvidia's 30 series, and with me also upgrading my CPU from a Ryzen 5 2600 to a Ryzen 5 5600, I am good for 3 years at least. When I next upgrade, Nvidia are going to have to work hard to drag me back into their fold!

    • @samuelvanlane
      @samuelvanlane 1 year ago +20

      Not to mention, between Jay, Paul, Linus and Gamers Nexus they control a good 80% of the viewership for GPU reviews, and these guys are all pretty good to one another. If Nvidia fucks with any of them, it'll be a worse outcome for Nvidia than for the reviewers.

    • @neoasura
      @neoasura 1 year ago +3

      @@iamrocketray They don't need to drag you back; for every 1 person that goes AMD, they have 1000 going Nvidia. AMD competition is a drop in their bucket, and they know it. Just look at the used 3xxx cards on eBay, still commanding a high price.

  • @mikeybee98
    @mikeybee98 1 year ago +136

    I'm actually excited to try out AMD's next GPU lineup for the first time

    • @davidcsokasy5451
      @davidcsokasy5451 1 year ago

      Cosign

    • @Terbojeezus
      @Terbojeezus 1 year ago +5

      I took my first step away from Nvidia in years when the 6900xt was released and couldn't be happier. AMD has absolutely been delivering on their promises and I hope that holds true this November.

    • @KremsonKhan
      @KremsonKhan 1 year ago +3

      I switched back to AMD in 2020 and the last 2 years have been great! Like, a very good deal.
      Only the software needs help:
      some versions are super OP and some will break your PC, but overall with the right software it works like a charm!

    • @danyates1814
      @danyates1814 1 year ago +1

      I was excited when I bought a 5700xt on release day; however, it was the most infuriating experience. Shocking drivers, constant crashes, ridiculous temps. Never ever again will I buy an AMD GPU

    • @KremsonKhan
      @KremsonKhan 1 year ago

      @@danyates1814 Lol, I completely understand you. You just need to find the right driver version! Like for my 5600xt it is [RadeonSoftware-March2022]; the rest are all crashing or green lines!

  • @Jawzzy
    @Jawzzy 1 year ago +181

    You are absolutely right. They are just renaming lower-end cards to the next class and hoping nobody notices, while at the same time increasing the performance gap between classes so they can later slot in the Ti versions. That 12GB 4080 is what the 4070 was supposed to be. I really think blind greed got the better of them this time.
    We have insanely high prices and a huge number of second-hand cards flooding the market now that mining went tits up. Add to that the brutal increase in prices of things people actually need to survive, and the insane power draw of these new cards. I think Nvidia will have a heck of a hard time moving the new cards this time around.
    A lot more people will pay attention to AMD's release this time around. AMD, if they have a good product, may be able to snatch a solid chunk of Nvidia's market share.

    • @MrKvm96
      @MrKvm96 1 year ago

      S

    • @GraveUypo
      @GraveUypo 1 year ago +9

      Oh, but they already did that. The fact you're calling that 4080 12GB a 4070 proves it: it is in fact a 4060, not a 4070. The 4080 16GB is what the actual 4070 should have been. Nvidia has already successfully hiked naming before; they're just doing it AGAIN.

    • @azorees7259
      @azorees7259 1 year ago +3

      Yeah, I have been an Nvidia user for over 20 years now, and even I'm looking at what AMD brings out. I think it's time to change back to AMD.

    • @allxtend4005
      @allxtend4005 1 year ago +2

      Ti or Super versions... or did people forget the RTX Super cards? They don't have to call it Super or Ti; they can use anything and people will still buy it.

    • @menace2society00
      @menace2society00 1 year ago +1

      @@azorees7259 Fr bro, I literally love Nvidia's driver support and products, but man, the company makes it hard to like their products with decisions like these. Not to mention them fucking over their partner companies by making the Founders Edition so much cheaper

  • @jamesk7256
    @jamesk7256 1 year ago +281

    You know NVIDIA has become the enemy of the consumer when they announce "dropping GPU prices are a thing of the past" after years of dramatic price *inflation*.

    • @gustomucho81
      @gustomucho81 1 year ago +19

      Especially when they built the company on gamers, but then the crypto dudes lined their pockets for a couple of years, and now crypto is in a slump. Gamers were happy to be able to buy GPUs again at normal prices, only to have Nvidia slap them hard. I think it is going to take years before we see GPUs at normal prices again.

    • @only4posting
      @only4posting 1 year ago +3

      One recent Intel Arc card, with 6GB of GDDR6X, was announced at around 150 bucks (the Arc a180, I think). Let's imagine Intel is willing to make, say, 60 to $70 on each sale. Let's say the retailers make 20 or 30 bucks on each sale. That would mean building the card, with cables, eventual adapters, the box, the transport, AND the 6GB of GDDR6X, might cost Intel around 40 to $50! Yes, Intel would never spend billions on R&D, and several years, just to make some miserable $10 or $20 on each unit sold.
      That means... 6GB of GDDR6X memory probably only costs $40... or even $30...! 6 fukin GIGABYTES!
      The Nvidia RTX 4080 12GB will start at 899. The 4080 16GB will start at 1199. 300 dollars for 4 extra GB of memory that probably only cost Nvidia $20... or even less! Because when Nvidia tells TSMC or others "hey, we're going to buy 10 million parts", they get an additional % off!
      We all remember when high-end Nvidia cards used to cost 700 or $800. Even at those relatively low prices, Nvidia was still making some amazing profits!
      Today, they are selling cards that cost between $1000 and $2000... and those cards probably ONLY COST Nvidia a few hundred dollars to manufacture! Yes, only a few hundred dollars to manufacture these cards. And gamers are happily paying between 1000 and $2000 just to get one. Or even $3000, on eBay. So... WHY would they lower their prices...
      Yes, manufacturing an RTX 4090 card probably only costs Nvidia $300... or even $250!
      Sounds crazy, but it's probably not too far from the truth...
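
The commenter's back-of-envelope margin math, written out. Every number here is the commenter's speculative guess (rumored retail price, assumed maker and retailer cuts), not real cost data:

```python
# All figures below are the commenter's speculative guesses, not real numbers.
retail_price = 150  # rumored Intel Arc card price, USD
maker_margin = 65   # midpoint of the assumed $60-70 Intel profit per unit
retailer_cut = 25   # midpoint of the assumed $20-30 retailer profit per unit

# What remains is the implied cost to build and ship the whole card.
build_cost = retail_price - maker_margin - retailer_cut
print(build_cost)  # 60 USD, in the same ballpark as the comment's $40-50 estimate
```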

    • @nordicwarrior9566
      @nordicwarrior9566 1 year ago

      Drunk on those pandemic profits thinking it will continue forever, need to be put in their place hard!!!

    • @Malc180s
      @Malc180s 1 year ago

      @@gustomucho81 Get out of the past. Nvidia don't give a shit about you. Gamers?! They're a bunch of whinging know it alls, and they're best ignored. Nvidia owe them nothing. It's still the cheapest hobby on the planet, and still you all moan. Boo boo, my graphics card cost more than it used to... Nobody cares chump.

    • @rightwingsafetysquad9872
      @rightwingsafetysquad9872 1 year ago +1

      Gotta pay for R&D and nut fluffing somehow. But seriously, chip manufacturing prices scale factorial with size. NVIDIA's AD102 probably costs 10-20 times as much as Intel's chip.

  • @worldkat1393
    @worldkat1393 1 year ago +913

    The irony is that NVIDIA clearly did this to try to avoid backlash over a 900 USD 4070, but in the end they only made people even more angry.

    • @ericwagstaff2227
      @ericwagstaff2227 1 year ago

      For sure, 100%. Now that puts the whole lower tier of cards in question. Also, are they trying to pull a fast one to make you think the 4080 12GB is faster than it really is (if you are in the know)? Because with all the leftover 3090s that may get way cheaper, those would be a better value for the performance.

    • @VoldoronGaming
      @VoldoronGaming 1 year ago +11

      The 4070 is going to be at least 600.00 and won't come out until the 1st quarter of next year at the earliest, but I think at least 6 months.

    • @CertifiedClapaholic
      @CertifiedClapaholic 1 year ago +1

      And be what, the same ~6000 cuda core GPU for the same price?

    • @1vend7
      @1vend7 1 year ago +56

      I still think the prices are too inflated. I hope they don't sell much this generation; I hope most people buy used GPUs

    • @nunosanches3693
      @nunosanches3693 1 year ago +4

      @@VoldoronGaming The 4070 is already out; it's called the 4080 12GB. Next they will give you the real 4060 with the 4070 naming...

  • @christroy8047
    @christroy8047 1 year ago

    Jay - man thanks for your honesty and integrity. Well said brother.

  • @addusernamehere
    @addusernamehere 1 year ago +9

    Great consumer awareness video, Jay! Nice work!

  • @ericbonow4809
    @ericbonow4809 1 year ago +712

    The EVGA split was a big deciding factor for me to stay away and this deceptive naming seems to be in alignment with their business practices.

    • @crazydude5825
      @crazydude5825 1 year ago +38

      This ridiculous massive price increase is likely one of the reasons the split happened. If the price of the next iteration of a product is raised to literally 200% of the previous while demand is plummeting, I wouldn't feel confident in that business arrangement.

    • @davidbrennan5
      @davidbrennan5 1 year ago

      @@crazydude5825 I think its also the threat of cheap miner gpus hitting ebay, hard to sell a new card when you can buy last gen for pennies on the dollar.

    • @The_Blade_Gamer
      @The_Blade_Gamer 1 year ago +1

      Yeah, I'm out. I have a 3080 EVGA card and I'm staying there; I won't buy a new card

    • @cryptogizmos1694
      @cryptogizmos1694 1 year ago

      Eric I agree. This showed how little Nvidia cares about you, me, developers or their partners. They believe their own hype and when they start pissing off their customers, AMD doesn't have to beat them, just come close and show they want your business, and people bail.

    • @s1mph0ny
      @s1mph0ny 1 year ago

      EVGA created their own problems, they've been selling $300+ video cards without backplates which is insane.

  • @kaytrim
    @kaytrim 1 year ago +1884

    Nvidia is a money-grubbing company. I am 100% behind you on this Jay. The cards are great but the pricing and naming games are only to their benefit. I am glad and sad that EVGA broke up with them.

    • @jameswayton2340
      @jameswayton2340 1 year ago +62

      Every company is a money-grubbing company. Well, okay, not all of them, but 90-95% at least. That's what they do: make money. Every choice they make, every word spoken, everything they do has one purpose: making as much money as possible. They don't "like" their consumers and don't care for them in any way, just like politicians don't give a damn. "How much can I fuck with my buyers and get away with?" is what's at play.

    • @jkutnink87
      @jkutnink87 1 year ago +3

      @@jameswayton2340 A company is for profit; they are not money-grubbing, just for profit. It's up to us, the consumers, whether we want to spend our earned money on their products. After a while another company will show up with a product just as good or better, and cheaper. Guess who will buy it. But as for me, I have deep pockets and will more than likely purchase a 4090 and replace my 3080 Founders Edition computer.

    • @billyno-talent345
      @billyno-talent345 1 year ago

      99.9% of companies are money-grubbing companies. No company you've ever heard of isn't.

    • @BOZ_11
      @BOZ_11 1 year ago +4

      @@jameswayton2340 any company that works with reasonable margins and provides good service is not "money grubbing". You're confusing working for a living, with being a disingenuous shyster. Learn the difference.

    • @nonfungiblemushroom
      @nonfungiblemushroom 1 year ago +5

      @@jameswayton2340 There are gradations to these things. Every company exists to profit, but not every company engages in the same level of deception, anti-consumerism, manipulation, avarice, monopolism, etc. It's wise to be aware of the relative shadiness of any company and vote with your wallet accordingly.

  • @smitfs
    @smitfs 1 year ago

    thank you for educating us!

  • @tjstrato1
    @tjstrato1 1 year ago

    Thanks for the explanation Jay

  • @awesomejakex3798
    @awesomejakex3798 1 year ago +1202

    Nvidia’s marketing is completely predatory and they are fully aware that some consumers just won’t know any better and will get duped by the 4080’s naming scheme

    • @potal_memes
      @potal_memes 1 year ago +15

      more like 4080's naming scam...

    • @Saiku
      @Saiku 1 year ago +1

      Yep, exactly!

    • @TheGoreforce
      @TheGoreforce 1 year ago +3

      Most people don't care; they understand that typically the worse phone has less RAM. The Series X and S are only told apart because of the letter. The general consumer wants the best product for what they can afford, and sometimes chooses purely on aesthetics. The general consumer with an OEM full system build with a 4080 12GB doesn't care, as long as it does what they need it to do: play the latest games at really high quality. The price increase sucks. AMD may offer competition in pricing, and probably performance.

    • @FcoEnriquePerez
      @FcoEnriquePerez 1 year ago +5

      It has always been this way and will be forever, and the Nvidia shills will never stop paying them, even if it is for some bullshit

    • @rogerroger10-47
      @rogerroger10-47 1 year ago +4

      I have a 980 TI. 9>4 so it's better.

  • @rnetsrak
    @rnetsrak 1 year ago +680

    An even bigger difference between the 12GB and 16 GB versions of the 4080 besides memory and core count is the bandwidth, only 192bit wide vs 256bit wide and even 21Gbps modules vs 23 Gbps modules, so sad they call the 12GB version a 4080 :(
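
The bandwidth gap implied by those figures is easy to work out: peak bandwidth is the bus width in bytes times the per-pin data rate. The numbers below are the ones quoted in the comment above (the 21 and 23 Gbps module speeds in particular are the commenter's figures, not confirmed specs):

```python
# Rough GDDR6X bandwidth math, using the bus widths and per-pin data
# rates quoted in the comment above (assumptions, not official specs).

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

rtx_4080_12gb = bandwidth_gb_s(192, 21)  # 504.0 GB/s
rtx_4080_16gb = bandwidth_gb_s(256, 23)  # 736.0 GB/s
print(rtx_4080_16gb / rtx_4080_12gb)     # ~1.46x
```

By this math the 16GB card has roughly 46% more memory bandwidth than the card sharing its name.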

    • @priyanujdutta7549
      @priyanujdutta7549 1 year ago

      Could have brought back the "super" naming scheme and called them 4080 12G and 4080 super 16G. But nvidia chose to be misleading and rip people off.🤦‍♂️

    • @pentagramyt417
      @pentagramyt417 1 year ago

      It's just that the RTX 4080 12 GB is really an RTX 4070, so they changed it into an "80" with a higher price. Simple.

    • @arc00ta
      @arc00ta 1 year ago

      The 4080 12GB is the V6 camaro of video cards. All the looks but missing the most important bits. It'll still work but the people who know will laugh at you when you're not there.

    • @davidolszak6318
      @davidolszak6318 1 year ago +64

      Even the 16GB has a narrower memory bus than the 3080 10GB. But 192-bit? Nvidia must think that none of the people spending $1600/1200/900 bother to look at specs.

    • @4thMUSKETEERS
      @4thMUSKETEERS 1 year ago +41

      Imo they are just a 4070 and, like, a 4070 Ti. Expect a multitude of GPUs to launch above them. Nvidia will double-dip harder than ever before

  • @pharbath
    @pharbath 1 year ago

    I appreciate the no-nonsense style you bring to this video.

  • @fcordero111
    @fcordero111 1 year ago

    Well said. Glad you are speaking up and letting consumers know the truth.

  • @electrospank
    @electrospank 1 year ago +188

    I work in a different industry but this video hits home. We make an amazing product, the people that make and support the product are amazing, but the upper management and sales practices are creating a customer experience that is unnecessarily awful. It's been extremely frustrating to both love a product and see how many people have been unhappy for no good reason. No disclosure of the company but I've finally had it and I'm looking for a different position somewhere else. All I can do is leave.

    • @mikaelmillby7060
      @mikaelmillby7060 1 year ago +6

      Those are the challenges I'm also having, as people tend to work at a company mainly to get their paycheck, not for the vision or common goals, and can't feel customers' needs and frustrations. Thanks for sharing.

    • @scroopynooperz9051
      @scroopynooperz9051 1 year ago +24

      Lol the bean counters always fcuk everything up.
      Passionate, hard working engineers and creative people make these products.. and then the soulless bean counters and management types fcuk it all up xD

    • @shannonmcstormy5021
      @shannonmcstormy5021 1 year ago +6

      And what is especially bad about this type of behavior is that it doesn't need to be this way - Saving a few pennies but alienating your customer base, your internal and external vendors, your brand reputation, is not only unethical, but its just not worth it.

    • @TheTekknician
      @TheTekknician 1 year ago +4

      Corporate greed ruins everything.

    • @Disillusioned2022
      @Disillusioned2022 1 year ago

      I remember when that never happened

  • @mrsuperselenio5694
    @mrsuperselenio5694 1 year ago +165

    That last frame, "12 GB is the new 3.5GB", hits hard, man. As the owner of a GTX 970 who found that out not long after purchase, and currently in the middle of replacing my system, it brings back those vibes that make me want to go full team red instead of team green with regrets.

    • @wardiver6520
      @wardiver6520 1 year ago +2

      Same!!

    • @ABaumstumpf
      @ABaumstumpf 1 year ago +1

      But by that statement basically all cards are crippled - Nvidia and AMD. Yeah 12 - for the next couple years 12GB is plenty.

    • @adamdunne6645
      @adamdunne6645 1 year ago +11

      @@ABaumstumpf His point is that they're telling you it's one thing while it's actually another. The 12gb card really is just a 4070, while the 970 really only had 3.5gb vram usable while claiming 4gb.

    • @andrasbiro3007
      @andrasbiro3007 1 year ago +10

      I had two 970 in SLI. That was the first setup capable of acceptable 4K gaming. I bought them right after launch, so it took some time to learn that the micro stuttering is due to the memory and not the exotic setup.

    • @ABaumstumpf
      @ABaumstumpf 1 year ago +1

      @@adamdunne6645 "970 really only had 3.5gb vram usable while claiming 4gb."
      That card still has 4 GB despite your claim.

  • @systrex
    @systrex 1 year ago

    I used to go to the microcenter in Tustin, CA. They were close to going out of business for a while, I'm glad they're still around.

  • @rtssj
    @rtssj 1 year ago

    Hands up for bringing this to light

  • @overgrowncaterpillar
    @overgrowncaterpillar 1 year ago +89

    Which means it went from 499USD for the 3070 to 899USD for the "4070", and the 3080 went from 699USD to 1199USD for the 4080. Amazing inflation in prices. Stunning. Let's hope team Red releases more affordable cards.
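
Written out as percentages, using the MSRPs quoted above:

```python
# Generation-over-generation MSRP jumps described in the comment above,
# expressed as percentage increases (prices are the quoted USD MSRPs).

def pct_increase(old: float, new: float) -> float:
    return (new - old) / old * 100

print(round(pct_increase(499, 899)))   # 80  (3070 -> "4070"/4080 12GB slot)
print(round(pct_increase(699, 1199)))  # 72  (3080 -> 4080 16GB)
```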

    • @scproinc
      @scproinc 1 year ago +1

      And yet, it's (almost) all sold out on Newegg. Nothing will happen.

    • @hjelpen5387
      @hjelpen5387 1 year ago +4

      @@scproinc Exactly. All that happened is Nvidia understood from the last few years that people are willing to pay the enormous scalping prices for GPUs, so why not just raise them themselves to get maximum profit, when they sell out instantly anyway?

    • @banescar
      @banescar 1 year ago

      Not exactly, because performance is also higher.

    • @fionasherleen
      @fionasherleen 1 year ago

      @@hjelpen5387 there were miners

    • @robotspartan9100
      @robotspartan9100 1 year ago +4

      @@banescar Bad counter, as performance is ALWAYS higher generation to generation.
      Now, if the performance increase is double what it usually is (a 3060 = 2070 = 1080 roughly, so in this case a 4060 = 3080), then it's more of a name shift, which is a different conversation altogether. But that remains to be seen.
      Short version: if a 4060 = 3070, it's greed from Nvidia. If a 4060 = 3080, it's a badly implemented shift in SKU naming.

  • @benjaminoechsli1941
    @benjaminoechsli1941 1 year ago +669

    Another difference that isn't mentioned on that chart: memory bus width. You can't just lop 4 GB of VRAM off and call it a day, you have to change the bus, too (something about each bus width having only two memory configs it can handle, and they're multiples of each other).
    So the "4080 12GB" not only has less memory, it has a smaller bus (256-bit down to 192-bit). The performance gap between these two is going to be _much_ bigger than the box would imply. I agree, Jay, the FTC needs to get on this.
    Great video, sir. The rumor mill says AMD is going to have a very competitive product this year. If they don't go nuts with the pricing, they'll be the winners in customers' minds even if they lose the performance crown by a few %.
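
The bus-width/capacity coupling described above follows from each GDDR6X chip sitting on its own 32-bit channel, with chips available (at the time) in 1 GB and 2 GB densities. A minimal sketch, assuming one chip per channel (no clamshell mode):

```python
# Why VRAM size and bus width move together: each memory chip occupies a
# 32-bit channel, so the bus width fixes the chip count, and chip density
# (assumed 1 GB or 2 GB here) fixes the only capacities available.

def vram_options_gb(bus_width_bits: int, densities_gb=(1, 2)) -> list[int]:
    channels = bus_width_bits // 32  # one memory chip per 32-bit channel
    return [channels * d for d in densities_gb]

print(vram_options_gb(192))  # [6, 12] -> a 192-bit card is a 6 GB or 12 GB card
print(vram_options_gb(256))  # [8, 16] -> a 256-bit card is an 8 GB or 16 GB card
```

So cutting the "4080 12GB" down from 16 GB necessarily meant cutting the bus from 256-bit to 192-bit as well, with the bandwidth loss that implies.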

    • @TheReferrer72
      @TheReferrer72 1 year ago

      The clock is quicker

    • @pedroalbuquerquebs
      @pedroalbuquerquebs 1 year ago +6

      The 4080 12GB is the 4070; they just didn't want to market it that way. Because of that, the 4070 will probably be what the 4060 should have been. An eventual 4060 will just be the 4050, and there will probably be no 4050 series.

    • @justinvanhorne8859
      @justinvanhorne8859 1 year ago +4

      Totally agree. I was harsh on Jay yesterday and I owe an apology. Until 11:00.
      Then it's back to defending the poor corporations. He calls it justifying increases, but when a company is reaping RECORD profits, you have zero leeway when it comes to walking this line between corporations that love tech and corporations that are out to bleed tech for every last cent it's worth.

    • @meeponinthbit3466
      @meeponinthbit3466 1 year ago +2

      The bus is clearly tied to how many memory chips they put on the card. They have dedicated data lines to each one, so removing chips removes those data lines from the total. The speed per chip is the same.

    • @JeanPiFresita
      @JeanPiFresita 1 year ago

      I think their prices are going to be much more competitive; their cards keep getting cheaper in a pretty aggressive way, as if trying to make room for the ones to come. It's a win/win for AMD and gamers looking for a good quality/price ratio. For now I've already decided to buy an AMD GPU, I'm just waiting for the right time. Goodbye Nvidia, I'm tired of you.

  • @wag-on
    @wag-on 1 year ago +4

    Absolutely SPOT ON! I would definitely assume the same GPU chip being used - if the product is labelled as 'RTX 4080' with only a difference in on-board memory. Having the 'Ti' moniker I would expect performance variation from either clock/cores or both.

  • @ikerhermann380
    @ikerhermann380 1 year ago

    thanks, downloaded, all works!

  • @DJJ212
    @DJJ212 1 year ago +256

    I've waited 3 years, I can wait until November. Competition keeps people honest, I just hope AMD can put enough pressure on Nvidia to start to right things.

    • @bara555
      @bara555 1 year ago +13

      ikr? Still rocking a 960, so what is 2 months at this point?

    • @pandemicneetbux2110
      @pandemicneetbux2110 1 year ago

      I have no earthly idea why any of you guys are acting like that even is just a possibility, CLEARLY AMD is going to put pressure ffs they just did it two generations in a row. Basically, I've come to just find that nVidia fanboys act like their heads are filled with Intel's sand or something, like all they hear is marketing talking points and logos and never actually look at the performance and costs. nVidia has fucked us repeatedly in the past and will do so again, they act like a spousal abuser literally no one in the tech industry can even stand and the people still buying it just look like beaten housewives for saying it. "Well I hope they do" of course they will, already it makes zero sense at all to buy nVidia's lower end RTX 3000 series cards, and they had to slash prices tremendously at the top to even remain competitive with AMD.

    • @razor19906
      @razor19906 1 year ago

      You're a joke. The CEOs of Nvidia and AMD are family.

    • @xorinzor
      @xorinzor 1 year ago +16

      ​@@razor19906 You know she dismissed that, right? Unless you have any actual evidence, it just sounds like you're the joke here.
      And don't act as if Lisa Su hasn't done anything for AMD. She's why we have Ryzen nowadays.

    • @TAP7a
      @TAP7a 1 year ago +11

      @@razor19906 you are not a clown. You are the entire circus.

  • @insidethreewide7133
    @insidethreewide7133 1 year ago +322

    What matters the most to me is honesty and transparency. You called it right. Because of the pandemic, Nvidia has decided to change their business model, seriously affecting their moral clause. I have been with Nvidia from the 980 Ti and 1080 Ti to now the 2080 Ti. It all ends there now. I will be switching to AMD CPU and GPU on my next build, 1st quarter 2023. They will shoot themselves in the foot by denying sending you product for review. It will serve you well in validating their schemes. Your viewers are smarter than that.

    • @Pyroteq
      @Pyroteq 1 year ago +11

      Moral clause? Lol it's a graphics card to play games champ, not water and shelter.
      The demand for GPUs for AI is exploding. You're competing against data centres for the same silicon.
      You don't NEED to play games at ultra in 4k

    • @insidethreewide7133
      @insidethreewide7133 1 year ago +43

      @@Pyroteq well then by all means keep sending them your money. That’ll show ‘em.

    • @zekiii6653
      @zekiii6653 1 year ago +3

      @@Pyroteq I need to play them at 1080p with 1000+ fps, what are we talking about 🤣

    • @StrixWar
      @StrixWar 1 year ago +4

      Honestly, at this point I'm never going AMD in my main builds. I have used them enough to know Intel is more reliable. I have been using AMD since the A10 series, and each generation of AMD I have used has given me issues.
      A8 - A10 - FX 8350 - Ryzen 5 1600 - Ryzen 5 2400G - Ryzen 9 3900X: every one has given me issues, meanwhile my Intel CPUs run perfectly.

    • @Born_Stellar
      @Born_Stellar 1 year ago +4

      @@StrixWar I have heard people say this, but my experience is the opposite. I've only owned two Intel CPUs, a 2500K and a 6600K, but with both I experienced stuttering, where with the AMD I did not, even if the FPS was lower.
      I did have problems with the 2200G's graphics; I have to use a card with it now, but I never really expected integrated graphics to be all that great.
      I also had to re-install the IHS on the 6600K, otherwise thermals were unmanageable, even on a full water loop.

  • @clarkmeyer7211
    @clarkmeyer7211 1 year ago +1

    I agree. I don't know all that much about computer hardware, so I get told what specs to look out for, but the boxes never show the details I'm looking for. It definitely should be a requirement, especially when you are making a huge investment. It's like being told there is a house up for sale but you aren't allowed to know the details except for seeing the front of the house.

  • @dsalobaid
    @dsalobaid 1 year ago

    working fine thank you

  • @DocBrewskie
    @DocBrewskie 1 year ago +708

    Nvidia is so focused on high end and AI that average gamers are left in the dust. If AMD plays it right they could scoop up all of us gamers who aren’t going to spend $800-900 on just a gpu.

    • @Raven.Bloodrot
      @Raven.Bloodrot 1 year ago +107

      Honestly I don't think I'll buy another Nvidia card again after seeing how they truly are, it's disgusting.

    • @OwlUnknown
      @OwlUnknown 1 year ago +1

      @@Raven.Bloodrot I share this sentiment

    • @Jahh_2s
      @Jahh_2s 1 year ago +96

      @@Raven.Bloodrot 3070 owner here. Ever since the release of Ryzen I went AMD for every CPU I owned. Guess I'll do the same with GPUs now, 'cause I won't support these kinds of business practices...

    • @rigoti
      @rigoti 1 year ago

      Must be new if you didn't know budget cards are announced after the flagships

    • @Raven.Bloodrot
      @Raven.Bloodrot 1 year ago

      @@Jahh_2s I currently own a 2070 SUPER and was looking to upgrade this generation, but I also can't support this type of business practice either. I'll wait for AMD cards and go that route as well.

  • @playascap
    @playascap 1 year ago +257

    Jay, you seem burnt out from disappointment after disappointment from the industry, nonetheless I want to thank you for continuing to put out videos with integrity that you believe will help your viewers. Thank you!

    • @steves.6649
      @steves.6649 1 year ago +4

      Haters will hate. While fans and those seeking more GPU performance for apps like prosumer VR for example will buy the 4090. Thank you.

    • @VesperAegis
      @VesperAegis 1 year ago +2

      It's quite hard to be too disappointed if you focus on the incredible specs alone of the 4090. Double the 3090 (at least) is one of the biggest leaps in performance we've seen in many years. Let's focus on the actual quality of the device itself rather than its price point, which has simply risen with inflation as a premium product. The best product is always exponentially more expensive, and that aspect compounds when inflation rises.

    • @VesperAegis
      @VesperAegis 1 year ago +3

      @@steves.6649 Could not agree more. Balls to the wall on performance, I say.

    • @birrk1
      @birrk1 1 year ago +11

      @@VesperAegis are you working for nvidia?

    • @AsaMitakasHusband
      @AsaMitakasHusband 1 year ago +2

      @@VesperAegis found the meat rider

  • @aaronsimpson7206
    @aaronsimpson7206 1 year ago +34

    I was pretty stoked to buy a 4090 next month, but this is at least giving me second thoughts. Thanks for the insights!

    • @VonSpud
      @VonSpud 1 year ago

      Same here...

    • @azorees7259
      @azorees7259 1 year ago +4

      I was looking at the 4080, and if Jay hadn't mentioned the core count difference and the fact that the 12GB is basically the 4070, I would have gone for that, as many games at the moment don't need more than 12GB at 4K. So I'm glad he mentioned it and saved me from making a huge, expensive mistake. I can only wish that 50 or even 75% of Nvidia users boycott them and move to AMD. I wonder what Nvidia would do then :)

  • @Purenrgy
    @Purenrgy 1 year ago

    I was watching this as I was researching some upgrade choices. I think I will be waiting and switching to RDNA. Thanks, great information!

  • @VintageandVoltage
    @VintageandVoltage 1 year ago +153

    I used to be team green, and most of my computers had an Nvidia card in them. But this is exactly why I've become a pretty loyal AMD guy. I will happily buy something slightly slower/worse if it means supporting a company with good ethics. That's more important to me than a minor performance difference I can hardly perceive.

    • @nekuraookami
      @nekuraookami 1 year ago +5

      Thing is, the spec difference between AMD and Nvidia for cards at the same price point isn't as huge as it's made out to be on the benches; a lot of those differences are barely noticeable, if at all, in game.

    • @steffenkawa8374
      @steffenkawa8374 1 year ago +2

      I also had AMD cards for 15 years, but I had a lot of driver issues with certain games like Nier Automata. Because of that I switched to Nvidia.

    • @snozbaries7652
      @snozbaries7652 1 year ago +20

      AMD, just like Nvidia, is a corporation whose sole goal is to make more money than last year. The CEOs literally have a fiduciary requirement to make more profit than last year. I wouldn't give AMD too much credit until we see pricing on RDNA3. Corporations aren't your friends. It's quite the opposite.

    • @benjames5423
      @benjames5423 1 year ago +9

      @@snozbaries7652 Corporations are just groups of people. How those groups of people conduct themselves determines whether I want to forge a relationship with those people or not.
      While it is the duty of the CEO to ensure the group is making a profit, there are many ways they can do this. Nvidia have chosen the “easy” path, which seems to be almost entirely dependent on the now non-existent crypto mining segment of the market. That’s a group of people I don’t wish to associate with.

    • @jayb2705
      @jayb2705 1 year ago +9

      If AMD had the higher market share, they would pull the same stunts. These companies are not our friends, as soon as AMD got the performance crown with the 5000 series they hiked prices and only released higher end SKUs. And I say that as someone who has supported AMD through the bad years. What we need is competition and choice in the market that keeps all the companies in check.

  • @GriffinCorpOne
    @GriffinCorpOne 1 year ago +462

    I love Nvidia but after losing EVGA, I've started to open my eyes on what this company has become and the more I think about it, the more I want to go to AMD. I will wait for November!

    • @ProjectArjun
      @ProjectArjun 1 year ago

      I absolutely agree with you on that

    • @dylanspriddle
      @dylanspriddle 1 year ago +1

      AMD just needs to make good cards and sort out the drivers, and they will be king.

    • @antequamm
      @antequamm 1 year ago

      I currently have the Zephyrus G14 with the RTX 3060, and after seeing this I think Imma cop the G14 with the RX 6800S

    • @TheEudaemonicPlague
      @TheEudaemonicPlague 1 year ago

      What do you mean, "has become"? Nvidia was a dirtbag company in the 90s, and hasn't changed its spots at any point.

    • @papasmurf5598
      @papasmurf5598 1 year ago +11

      You're not going anywhere, so stop playing the victim. As long as Nvidia keeps laying the golden egg, you, me, and everyone else will keep on buying their GPUs.

  • @1unkn0wn
    @1unkn0wn 1 year ago

    Thanks J.

  • @lunaticstrike5319
    @lunaticstrike5319 1 year ago +14

    Quite an interesting video. Honestly, I had to replace my children's two old computers with old GTX 970s one year ago and ended up buying two notebooks with RTX 3060 graphics chips.
    Meanwhile I still run my 980 Ti and I will stick with it as long as it runs. I was planning to buy an RTX 3000 series but skipped it due to the insane pricing back then. Meanwhile I was hoping for the 4000 series, but I will skip that too. I am currently planning to buy the next AMD once it's released and completely avoid Nvidia, or maybe go for those new Intel ARC cards if they turn out to be good. I am not entirely sure.

    • @Ionstorm
      @Ionstorm 1 year ago

      Intel ARC's top line makes more sense than buying a 3060. Also, with laptops with RTX cards you need to be very VERY careful, because they all have different TDPs... honestly it's a shambles.
      Say you buy a laptop with an RTX 3070 in it: there could be one with a 100W TDP and one with a 200W TDP which performs way better, but no one will tell you the TDP without closely looking at the specs (that alone should be illegal imo). Anyway, yeah, just take care.

  • @crazynluv4293
    @crazynluv4293 1 year ago +109

    I am not computer smart enough to know the difference so having videos like this is really educational and enlightening. I would be very upset finding out that there were significant differences in the makeup of the card but it was called the same thing. Companies seem to forget that customers are people and people will get tired of being mistreated eventually especially if other companies see the mistreatment and step up. At the end of the day they're just helping their competitors.

    • @100Bucks
      @100Bucks 1 year ago

      This info will help you since you don't know much about computers. As long as you can achieve 60fps, you are good. There's an application for $5 called Lossless Scaling. This program scales low resolution up to high resolution. If you buy the application, I highly recommend using FSR or integer scaling. Integer scaling doesn't take any CPU or GPU power. To make this work properly for integer use only: pick "Resize before scaling", and the scale factor should always be set at 1. All your games will run in 1080p or 4K depending on what monitor you have; whatever your Windows display resolution is, that's what the scale will be. Integer scaling isn't talked about much and everyone should be using it. Integer scaling sharpens everything and wipes out blur completely, so you can keep all the in-game options off. With Lossless Scaling you won't care about any GPU. If you are already playing games at 60fps, you don't need a future GPU. Just apply Lossless Scaling to every game.
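Integer scaling, as described, multiplies each source pixel by a whole number, so there is no interpolation blur. A minimal sketch of how such a scale factor could be chosen (my own illustration of the general idea, not Lossless Scaling's actual code):

```python
def integer_scale(src_w: int, src_h: int, dst_w: int, dst_h: int) -> int:
    """Largest whole-number multiple of the source image that fits the target display."""
    return max(1, min(dst_w // src_w, dst_h // src_h))

# 1080p content on a 4K display scales cleanly by 2x in each dimension,
# so every source pixel maps to an exact 2x2 block of display pixels:
print(integer_scale(1920, 1080, 3840, 2160))  # 2
```

Because each source pixel maps to an exact block of display pixels, the result is sharp by construction; no GPU shader work is needed beyond the copy.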

  • @barrakuda17009
    @barrakuda17009 1 year ago +39

    I've been Nvidia loyal since an AMD card burned out on me like 20 years ago. But this kind of BS is really pushing me away from them. The fact that EVGA bailed on them is the icing on the cake. My next build may be my first AMD CPU/GPU

    • @quann06
      @quann06 1 year ago +1

      I'm starting to feel that way as well. I might switch back to AMD GPUs.

    • @charlieman8503
      @charlieman8503 1 year ago +1

      I had many 30-series cards, and the 6900 XT was one of the best cards I used when ray tracing is off.

    • @Piotr_Majchrzak
      @Piotr_Majchrzak 1 year ago

      It makes even more sense if we consider the performance-per-watt ratio, where AMD seems to be putting in much more effort than Intel or especially Nvidia.

  • @equinstarbeat
    @equinstarbeat 1 year ago +7

    The best part about all of this, is that even with these prices, board partners may not even be making a profit on these cards.

  • @johnmichaels4330
    @johnmichaels4330 1 year ago +1

    Thanks for the Micro Center ad. Got the hard drive for one of the old PCs I have for retro gaming. We have one in Houston, but I have to ask... how do we not have a Micro Center next to NASA where I live? I didn't even know we had them in Texas. They would make so much money closer to all us geeks.

  • @MrPaddy2010
    @MrPaddy2010 1 year ago +59

    Going full AMD. The only perk of Nvidia for me before was ray tracing, but at these prices I'm not that fussed. AMD keep getting better and better, and their pricing is waaay better.

    • @scroopynooperz9051
      @scroopynooperz9051 1 year ago +5

      AMD RX 7000 series ray tracing is expected to be better than RTX 3000 series but fall short compared to 4000 series.
      That said, will most people even notice? Wait for the reviews.
      Either way, if AMD doesn't get all greedy and fcuk up this opportunity, if they release say an RX 7700XT around $500 - $600 that is more or less the equivalent of an RTX 3090/ti, they will grab massive market share

    • @spankbuda7466
      @spankbuda7466 1 year ago +1

      Good! Now it will not be a hassle when I get my 4090.

    • @TheGrimeyVibes
      @TheGrimeyVibes 1 year ago +1

      I've had a 3070 and ray tracing for almost 2 years. It looked amazing in Cyberpunk, but the game was still shit. Never used it since; it's overhyped right now.

    • @sierraecho884
      @sierraecho884 1 year ago +1

      Unfortunately the software that I use that heavily depends on the GPU needs CUDA cores =( But I will still make the change to an AMD GPU.

    • @alpha007org
      @alpha007org 1 year ago +1

      Remember when "Tessellation" was a big deal? Now it's a feature every card has. This gen (4000/7000) won't have parity in RT, but I hope that when the next (next) gen comes out, it will be the same. Nobody will care, because both will be able to handle RT effects without massive performance loss.

  • @pieterrossouw8596
    @pieterrossouw8596 1 year ago +54

    Man, I was hoping to upgrade my GTX 1080 to an RTX 4080 this release and finally use a 4K gaming monitor. Maybe AMD can still make it a reality, but when a card's MSRP is so much higher than a house mortgage payment, you'd have to have your priorities pretty skewed to fall for it.

    • @pandemicneetbux2110
      @pandemicneetbux2110 1 year ago

      Again, why the F do you guys keep saying "I hope" and "maybe", as if it's even a possibility that AMD won't? An RX 6600 is already quite literally better hardware than a shitty "RTX" 3050, as if you can even use the RT at acceptable non-pudding framerates/appearance at 1080p with that thing. It's basically a GTX 3050, for about as much as a 5700 XT cost back in 2019, but with vastly inferior performance and worse software.
      Meanwhile let's talk about the future, you can not "bet" but "be completely certain" that AMD is going to come in swinging. Like that's just a basic fundamental fact. We already know that nVidia's TDP has literally skyrocketed recently just because of how hard AMD is pushing them against the wall, and now we're at freaking 450w, that's absolutely insane. You can tell how hard AMD is pushing them back and how much AMD claimed the performance crown just by the amount of watts, which similarly you can sorta tell with Intel and why Intel sucked next to recent Zen generations. AMD's power draw is like "man I'm not even trying I'm just playing with you" tier. So what we're assuredly going to end up as is that AMD is at minimum going to compete with them hard, and more than likely is going to be the new best gaming cards hands down. That's the entire reason nVidia is pushing those TDPs so high like Jesus Christ do you guys think that nVidia is doing that just to give you more performance or something? They are doing that because they are threatened.

  • @TuntematonMand
    @TuntematonMand 1 year ago +2

    I'm going red. Lots of people I know are already full red and they love it. I think it is time for myself to make the switch.

  • @Vash12788
    @Vash12788 1 year ago +8

    Saw this coming when they started selling so many cards during the pandemic that they couldn't keep them on shelves, even at double or triple MSRP. They made the stockholders really happy, and now they have to find some way to keep making the same amount of money or the stockholders get mad. A big change is coming; whether it's bad or good I don't know.

    • @Jarredlol
      @Jarredlol 1 year ago +2

      Artificial growth can't be maintained forever, I get what you're saying. Their chart wasn't organic, they pumped and at some stage now they're going to dump because they were manipulating their own market.

  • @SightedNZ
    @SightedNZ 1 year ago +315

    You're dead right Jay. I've been building computers for 20 years now and I've never seen the industry in such a F#%ked state. I just watched Steve's video on the partner cards and they're absolutely ridiculous. The size of the cards, the price of the cards and the superficial/shallow nature of all the marketing is disgusting.
    I'm definitely skipping the 40XX series cards and will run my 1080 Ti until it dies.

    • @carlito24uk
      @carlito24uk 1 year ago +15

      Aren't YouTubers partly to blame for this?

    • @GamerErman2001
      @GamerErman2001 1 year ago +16

      Can't wait for RTX 5000 series with 6 slot cards and computer cases with a place for an extra power supply so you can plug your GPU into a different circuit breaker's outlet.

    • @xXAlmdudlerXx
      @xXAlmdudlerXx 1 year ago +10

      Do GPUs even die at all? I've been running my 980 Ti for 6 years now and it's totally fine.

    • @Ineluki_Myonrashi
      @Ineluki_Myonrashi 1 year ago +13

      @@carlito24uk How does your logic work for blaming "youtubers" for this?? Who do you mean by youtubers, and how exactly do they carry blame?

    • @Black7Tech
      @Black7Tech 1 year ago +3

      🙏🏽 Amen to that!
      I'm running a Strix 1080 Ti as well 💪🏽, but in SLI (I know that for gaming it's basically dead, but...)
      Till it dies on me, Imma keep it.

  • @piguyalamode164
    @piguyalamode164 1 year ago +14

    I can't wait for us to have to decide between a 4080 Ti 16GB, a 4080 Ti 20GB, a 4080 Extreme, a 4080 16GB, a 4080 12GB, a 4070 (actually just a rebranded 30-series card picked at random), and a 4090 21.5GB PCIe Gen 5 USB 4.20.69.

    • @ExMachinaEngineering
      @ExMachinaEngineering 1 year ago

      And the day that guy decides to go for the 4090 for ultimate performance, Nvidia will announce the 4090 Ti.

    • @piguyalamode164
      @piguyalamode164 1 year ago

      (If you are wondering why the usb version matters for a video card, it doesn't, NVIDIA just realized that putting more random numbers in their card names increases sales)

  • @larryharry2768
    @larryharry2768 1 year ago +1

    Bought a used 1080 a few years ago for 300 bucks and still not in any rush to upgrade.

  • @trolldatshityeahyou4001
    @trolldatshityeahyou4001 1 year ago +9

    I'm mostly happy with my 1080 which is why I can wait for what AMD will show

  • @Meowskyy
    @Meowskyy 1 year ago +188

    At what point are the manufacturers supposed to start clarifying core counts in the name of the product as well? Might as well do that, since the video memory is in there anyway. This is just absolutely scamming the end users.

    • @frankytanky5076
      @frankytanky5076 1 year ago

      Idk, but this is literally a scam and it's annoying how it's allowed. People who don't know anything about computers are totally gonna just buy the 4070* not even understanding they just got ripped off.

    • @alexanderschulte8248
      @alexanderschulte8248 1 year ago

      Generally because core count doesn't mean much across different generations. A newer card could have fewer cores than the older generation but be much faster. Then you end up with the problem of people thinking the older, worse cards are faster.

    • @VoldoronGaming
      @VoldoronGaming 1 year ago

      Most buyers won't even know what CUDA means.

  • @madpistol
    @madpistol 1 year ago +179

    The sad part is that Nvidia makes an amazing GPU. It's marketing and their CEO that go and ruin it.

    • @nonfungiblemushroom
      @nonfungiblemushroom 1 year ago

      The only way to stop this is to vote with your wallet. Don't give in, wait this generation out, go team red, or buy used previous gen Nvidia if you absolutely have to. Don't give Nvidia any more money directly. We have power in numbers but only if we can persevere and not give in to temptation for the "best" when "very good" is a fraction of the price.

    • @racso5628
      @racso5628 1 year ago

      I certainly don’t disagree, but always remember that the real bosses are the shareholders. Everyone loves seeing those high rates of returns on their investments so execs, marketing, etc will do whatever it takes to make sure shareholders are happy.

    • @Breakzoras
      @Breakzoras 1 year ago

      Déjà vu. Ever heard of Activision Blizzard, EA, Ubisoft? These companies are all the same; they don't care about the consumer. And look where they all are today: in the hall of shame.

    • @AgneDei
      @AgneDei 1 year ago +1

      It may be that the manufacturing costs of those cards are just too high, and that would be reasonable considering the runaway power-draw increase through the newest generations of GPUs.

    • @doctordothraki4378
      @doctordothraki4378 1 year ago

      Yeah. Naming schemes are also ruining it (with people saying the 4080 12GB should be called the 4070). It's bad enough when USB, 5G, or HDMI have misleading naming schemes.

  • @ruadeil_zabelin
    @ruadeil_zabelin 1 year ago

    Interesting. Thanks for bringing that up. Now I know yet another thing I have to look out for when buying a GPU.

  • @Xithar_tri
    @Xithar_tri 1 year ago +4

    This is the main reason why I buy only after reading not just the initial reviews but also the ones a month later, as well as waiting for enough user benchmarks to see how well the cards perform in real life.
    In regard to buying the fastest, most expensive card: I made that error once with the 1080. Three months later the 2070 came out with mostly more performance in games at only around 2/3 of what I paid. So I swore to myself: never again. Usually now I'm not really looking much at the numbers in the names. Atm I have a 2080 Super and a 3060 Ti (had to buy the 3060 Ti when the 2080S had to be repaired after 1.5 years because of constant overheating). The 2080S performs a very small bit better, but the 3060 Ti uses less power, produces only about 2/3 the heat, and runs much quieter, which is why it's now the card I'll be using till I get the itch to upgrade again, which at such prices might take a looong time.

  • @JustIn-sr1xe
    @JustIn-sr1xe 1 year ago +327

    At this point, I want to see Intel and AMD really apply the pressure to Nvidiot. Ever since the 3000 series came out, I have wanted to see AMD have something super competitive. Something Team Green can't ignore and has to do something about, mostly in their pricing.

    • @marjanp
      @marjanp 1 year ago

      You actually have to buy their product, not just wait for Nvidia to drop the price. They're not a charity.

    • @kennyplayer7731
      @kennyplayer7731 1 year ago +2

      AMD still wins in efficiency and price. Just got the RX 6900 XT for 599; nothing can beat that. Who needs ray tracing? Marketing bullshit.

    • @arc00ta
      @arc00ta 1 year ago +1

      I too want this. I went all in with a 6900XT and it was the worst gaming experience I've had in a long time. I ended up having to trade out for Nvidia just so I can enjoy the games I want to play. If they can step up their RT abilities it would be really nice. I'm playing at 4K120 so I cannot compromise on performance.

    • @dockgamer973
      @dockgamer973 1 year ago

      AMDumb are no better; it's just that their cards aren't as good at all.

    • @SEANBERRY56
      @SEANBERRY56 1 year ago

      @@arc00ta What was so bad about your experience? I bought a 3080 Ti, overpriced as hell, and regret it now due to this price increase from Nvidia. Was it the drivers causing you issues?

  • @garrethopwood8178
    @garrethopwood8178 1 year ago +319

    I am one in the camp of "wait til November" to see what AMD brings out. Hoping it's good enough to go full team red.

    • @ruadeil_zabelin
      @ruadeil_zabelin 1 year ago +27

      Except that AMD's drivers are still bad. Working as a programmer on OpenGL, DirectX 11, and Vulkan projects, AMD has always been the bane of my existence. There are so many workarounds for AMD drivers in our code it's ridiculous. If you browse through the Unreal Engine's source code you find the same thing. If you're just gaming, I guess it's okay, since the developers of that game have already fixed the problems; but the fact that AMD makes us deal with those kinds of issues still to this day is ridiculous to me.

    • @KevinJohnson-cv2no
      @KevinJohnson-cv2no 1 year ago +4

      Keep hoping lmao

    • @GaaraSama1983
      @GaaraSama1983 1 year ago +9

      @@ruadeil_zabelin That's a bullet I'm willing to bite, just like I did when going with the Ryzen 3700X, knowing full well that Windows, games, and software are better optimized for Intel CPUs. Sometimes this is necessary to give a company a reality check, or it will never change. Also, since Ryzen gained market share, the optimization for their stuff improved too. So I'm hoping the same happens with AMD GPUs.
      I know that I will pay the price of worse performance and more issues, especially for stuff like VR, emulation, older games... but the primary reason I want a new GPU is native 4K/60fps in rasterizing with modern games. I think on that at least AMD/RDNA 3 will deliver.

    • @ruadeil_zabelin
      @ruadeil_zabelin 1 year ago +10

      @@GaaraSama1983 I've been using AMD CPUs without issue. In fact, the moment they announced the Ryzens and Threadrippers with loads of cores, I was immediately on board. But I haven't seen any real improvement in AMD GPU drivers in the past 15 years.
      It's becoming less of an issue with the newer stuff (DirectX 12 and Vulkan), though. From what I've seen so far, the implementation of those has been pretty good. There are some oddities in Vulkan, but Nvidia has similar oddities there, so that's fine. But indeed, if you're looking at older games (or older software I work on, in my case), anything that relies on OpenGL is pure hell on AMD GPUs. OpenGL and AMD just don't go well together. Luckily OpenGL is on the way out.
      Indeed, if you're purely getting a GPU for modern gaming, either is probably fine.

    • @WhiteWolfYT964
      @WhiteWolfYT964 1 year ago +4

      @@KevinJohnson-cv2no 🤡

  • @llpolluxll
    @llpolluxll 1 year ago +111

    This is the cycle. Honestly, I think it's a sign that AMD is going to put out something good. Nvidia wants to extract as much wealth from the system before their dominance wanes which ironically is going to push more people to the other side

    • @knrdash
      @knrdash 1 year ago +5

      A self-fulfilling prophecy.

    • @mykle1642
      @mykle1642 1 year ago +8

      Yeah, I think you're right. Stockholders have too much power; it's evident in almost every market. Guys on the outside just try to make their money worth more, yet they destroy products.

    • @TheWiredWolf
      @TheWiredWolf 1 year ago +2

      I think it's the opposite: AMD is bringing nothing exciting to the GPU market, so NVIDIA might be able to get away with this.

    • @knrdash
      @knrdash 1 year ago +1

      @@TheWiredWolf Neither does NV. Very little excitement since the 3000 series.

    • @angeldemore6922
      @angeldemore6922 1 year ago

      "Dominance wanes" is hopeful at best. Like, dude, AMD on top? LOLLOOOOOOOL LOLOLOL OOOLOLOL, that's actually hilarious. You are probably someone who thinks Radeon is even comparable to current Nvidia. Literally no one makes games with drivers specifically for Radeon, not like they do for Nvidia... Like, dude, it's not even in the same realm. Radeon is just showing up like they always did. Now Intel is in the market too.

  • @kltechnerd8177
    @kltechnerd8177 1 year ago +6

    I have always used Nvidia. This is the kind of stuff that makes me consider switching to AMD. I hope they step up.
    Thanks JayzTwoCents.

    • @azorees7259
      @azorees7259 Рік тому

      Same here. I've been with Nvidia for 20+ years and am now looking at swapping over to AMD. I am already looking at upgrading from my 3080, so if AMD brings out a card with similar speed and performance to the 4080 (16GB), I will be looking at that card.

  • @redjules1215
    @redjules1215 Рік тому +176

    I've never had an AMD card, but I'm definitely waiting to see what they have to show. The Nvidia cards look great, but the price, the power draw, and the general shenanigans are leaving a really negative impression.

    • @martin-1965
      @martin-1965 Рік тому

      I was AMD for years and then went Intel/Nvidia last Xmas with an i9-9900K and RTX 3080, but, as good as my setup is for now, I'm not going to be buying a 2000W PSU in a few years just to run the GPU and CPU. Currently, with no AC as I'm in the UK, my computer is the biggest power draw in my home, which is crazy. As such I think, as a casual gamer, I'll be back with AMD for my next build, or (sprinkles holy water) I'll just say screw it and go with an Apple iMac Pro or whatever is around then. With electricity prices through the roof in the UK at least, even with the Apple price premium, it would pay for itself through the savings on power consumption. Crazy world we live in, where the basics of life like power and fuel keep getting more expensive, and the things that were luxuries, like TVs, phones, even computers (not you, Nvidia, lol), keep getting cheaper 🙃

    • @GuiGuIXOx
      @GuiGuIXOx Рік тому

      I feel the same as you, if I change my card!

    • @patrickbateman3326
      @patrickbateman3326 Рік тому

      I only had an Nvidia laptop. It died.
      My last 2 cards have been AMD cards, never disappointed.

    • @Arkstellar
      @Arkstellar Рік тому

      The two AMD cards that I've owned (RX 580 8GB, and HD 7970) were great. No problems with them whatsoever. I have no problem with going AMD this time around. It will be enjoyable to see if AMD embarrasses Nvidia and blows them out of the water from a price-to-performance standpoint.

    • @Jooffee
      @Jooffee Рік тому

      99% of people don't know any better

  • @xgamecodes
    @xgamecodes Рік тому +19

    As a European, it's quite scary how big the power draw is, considering how sky-high electricity prices have been as of late. Even if I had that kind of money to buy a 40 series, I would not buy it.

    • @Bendoughver
      @Bendoughver Рік тому +1

      If you're gaming, then yeah, not very efficient. In terms of performance per watt, however, the 40 series is actually pretty efficient... It's kind of a double-edged sword that has been happening as Moore's law dies. You need more voltage on each side of a transistor once they get smaller and smaller, leading to higher and higher power usage. We've made many improvements in efficiency, but node improvements are really hard....

    • @gavinbuck8130
      @gavinbuck8130 Рік тому

      I think the 3070 has one of the better performance per watt ratings, it can also be undervolted for another drop in wattage with an unnoticeable drop in framerate.
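
The efficiency point in the replies above comes down to performance per watt. A minimal sketch of that comparison, with purely hypothetical FPS and wattage figures for a 3070-class card (not measured results):

```python
def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Simple efficiency metric: average framerate divided by board power."""
    return avg_fps / board_power_w

# Hypothetical numbers, stock vs. undervolted:
stock = fps_per_watt(avg_fps=144, board_power_w=220)
undervolted = fps_per_watt(avg_fps=140, board_power_w=170)  # ~3% fps for ~23% power

print(f"stock:       {stock:.3f} fps/W")
print(f"undervolted: {undervolted:.3f} fps/W")  # higher is better
```

Undervolting trades a small framerate loss for a much larger power reduction, which is why the fps-per-watt ratio improves even though raw performance drops slightly.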

  • @babaspector
    @babaspector Рік тому

    thanks for making this more known

  • @garys5540
    @garys5540 Рік тому

    That's it! I'm going back to Etch a Sketch!

  • @cizzymac
    @cizzymac Рік тому +201

    Nvidia's marketing and executive teams should be ashamed of themselves for this.
    But we know they won't. They just see dollar signs.

    • @Unknown-pc9yq
      @Unknown-pc9yq Рік тому

      They probably got raises for this.

    • @ChantingInTheDark
      @ChantingInTheDark Рік тому

      Corporate greed pisses me off.

    • @watchm4ker
      @watchm4ker Рік тому

      The shame will come when they find out miners aren't interested in GPUs anymore.

    • @WarbossRB
      @WarbossRB Рік тому

      Its the CEO that does this

    • @iHadWaterForDinner
      @iHadWaterForDinner Рік тому +6

      They're smashing models on their private jets. They don't care, and I wouldn't either if I were them.

  • @kennykidd6013
    @kennykidd6013 Рік тому +23

    It’s called 4080 solely to justify the price that they could never charge for a 4070…which worries me more for the budget gamers at this point. I bought a 3090ti because it was sooo cheap at the time and I’m sticking with that for now but when I do upgrade, I’m now 100% going team red from here on out. They’re worried about profits too but I feel like they’re more transparent on “this is what it is and what you’re paying for”.

    • @animefreak5757
      @animefreak5757 Рік тому +1

      AMD probably wouldn't act much better if they were in Nvidia's position. I think if they ever reach performance parity they will keep each other in check, but until then I root for the underdog in this two (hopefully soon to be three) horse race. So long as AMD has comparable price to performance, I'll keep buying their cards. The only reason I bought Intel last time was because Ryzen was oddly significantly more expensive in my area at the time.

    • @DeAtHvAnGeL
      @DeAtHvAnGeL Рік тому +1

      Doing the same: bought a 3090 Ti, coming from a 1080 Ti. Got the EVGA FTW Ultra for $1000. I will keep this one until I see something from AMD or Intel that would crush it in both regular performance and with RT enabled. Can't and won't support the scummy move from Nvidia.

  • @JasonMeeks79
    @JasonMeeks79 Рік тому

    Don't know how you think of different cards, but any time there's a change in memory, I always expect more than just that change, and dig into the specs myself.
    But I totally agree it should all be on the side of the box for easy comparison.

  • @bitsbfg1810
    @bitsbfg1810 Рік тому +2

    We feel the same. It's come a long way since the NVIDIA SLI days, and we wouldn't be surprised if the 4090 is shaved down to make the 4080 Ti fit and just named that.

  • @Coyote05WRX
    @Coyote05WRX Рік тому +48

    The 4080 12GB specs are more like what a 4070 Ti would be, imo. I like your idea of getting the FTC involved in Nvidia's packaging requirements.

    • @s1mph0ny
      @s1mph0ny Рік тому

      Nope, 20-25% weaker is right on track for the difference between 80 and 70 performance in past gens. When you look at the weaker memory system here, the 4070 will be lucky to be only 25% weaker than the 4080.

    • @onomatopoeia162003
      @onomatopoeia162003 Рік тому

      Wouldn't mind that either. If other places have to show how many calories and all that..

  • @kinggalactix
    @kinggalactix Рік тому +8

    Next thing you know, the 5000 series will be as expensive as a high end PC ($2000+).

  • @WilliamHeimPhotography
    @WilliamHeimPhotography Рік тому +4

    I've been waiting over two years now to upgrade from my 1080ti and haven't yet due to crazy cost and no availability. Just recently, I've seen the 3090ti in stock with a price drop to $1099 on Best Buy but didn't purchase because I was hoping to see the price drop even further. Now I'm kicking myself because the cards are once again "out of stock". With the announcement of these new cards coming (eventually) what would you think my best upgrade path is? I have quad QHD monitors (3 setup as linked triples) and use my PC for 4K editing and iRacing. I'm hoping for a significant performance increase over the 1080ti for these applications but don't want to overpay due to these "farmers" driving prices up. You videos are very informative but most of the information is way over my head. I just wish someone could tell me the best bang for the buck and where I can buy, without getting gouged on price. Thank you

  • @1Dshot
    @1Dshot Рік тому

    One of my pet peeves is companies with confusing naming conventions. MS & Apple have this issue with their Xbox consoles and ipad/iphones respectively. So many variants make it confusing and difficult to really tell what you need, and it slows down the buying process anyway 'cause you have to parse all the differences between them.

  • @ChampionHero
    @ChampionHero Рік тому +304

    The other thing you forgot to mention was that not only does the 12GB have a lower-spec core, the memory bus is also narrower than that of the 16GB version (which is already narrower than that of the 3080!). It's a seriously gimped card, actually closer to a 4060 Ti when you compare it against its peers spec for spec.
    FTR, I'm still running a GTX 1080 simply because the whole range of Nvidia cards since the RTX launch has been incredibly poor value for money. Sure, the 3000 series was great performance, but their obscene pricing has put me off. I remember paying £280 for my 970, and we're now expected to pay almost a grand for the same range?
    I'm waiting to see what AMD do (not that the 1080 really struggles at 1440p, tbh)

    • @Sanquinity
      @Sanquinity Рік тому +17

      To be fair, the 1080 (and 1080 Ti) is a really good card for its value. It's better than a 3060, at least.

    • @zaklamp89
      @zaklamp89 Рік тому +10

      1080ti user here, def. waiting on that AMD 7800XT or so.. which should be plenty of power for my 1440p 240hz setup :) and for now she still runs mighty fine

    • @squidwardo7074
      @squidwardo7074 Рік тому +13

      @@Sanquinity Not even close; the 3060 is about equivalent to a 2080. That being said, there is almost no need for more than a 2070.

    • @1000percent1000
      @1000percent1000 Рік тому +7

      I've had my 1080 for 4 years now and have yet to be bottlenecked by it. I don't play enough video games nor care enough about graphics quality to double the price of my card. Seeing reflections in games would be the only benefit I could tangibly get, and I don't care. Building an all AMD next time around.

    • @Sp4rKzTV
      @Sp4rKzTV Рік тому +3

      Unless you were incredibly lucky and managed to get a 3080 at launch at MSRP price (which was around 700USD, so in that case it's one hell of a card for the money), then 3000 series wasn't a very good value in general. Prices were in the upper 1000$ range few weeks after launch and stood there for more than 1 year. It's crazy... It's just coming back down now but still kind of expensive VS the launch price.

  • @boiler-tech
    @boiler-tech Рік тому +317

    Nvidia has become the monster that we all helped to create. As with so many things, supply and demand is a pretty simple concept. If we're all just so giddy because these cards are available again and we run out and buy them knowing what Nvidia is up to, and we just don't care, well...

    • @tobingj
      @tobingj Рік тому +8

      I didn't help. 👍👍🤪

    • @Buddy308
      @Buddy308 Рік тому +8

      Those of us who watch videos on YouTube will shun the cards announced so far. Many of us will shun the entire Nvidia product line, but my fear is that the majority of buyers will remain unaware of the scam and will fall for this customer-hostile, flagrantly dishonest naming scheme, actually rewarding Nvidia for their evil choices.

    • @Pers0n97
      @Pers0n97 Рік тому +9

      We didn't help; the crypto miners did.

    • @user-og6hl6lv7p
      @user-og6hl6lv7p Рік тому +2

      @@Pers0n97 Question: Was Nvidia a billion dollar corporation before bitcoin was created or after bitcoin was created?

    • @Pers0n97
      @Pers0n97 Рік тому +7

      @@user-og6hl6lv7p was Nvidia a company trying to act like Apple before the crypto bubble?

  • @KnapfordMaster98
    @KnapfordMaster98 Рік тому +10

    Just got an MSI 3080ti, very happy with it. I work in Blender with 3D rendering so I have a lot of incentive to stick with Nvidia cards. I expect this 3080 to suit me for a LONG time, hopefully by the time I need something more they'll have gotten their act together.

    • @jazzochannel
      @jazzochannel Рік тому +1

      "they don't have their act together, but I buy their stuff anyway." demonstrating that you have no principles, doesn't make you a principled person. after a LONG time when you need a new card, you'll simply buy whatever turd of theirs that you can afford AGAIN. and so will everyone else, until the competition shapes up. be real about it.

    • @KnapfordMaster98
      @KnapfordMaster98 Рік тому

      @@jazzochannel I actually bought my 3080 before any of this came to light (afaik). What I'm saying is, I like their product but I do not like the direction they're going. And I hope they get their act together so that their product is worth supporting down the line. If not, I will look elsewhere for the next best option, primarily keeping Blender rendering in mind.

    • @jazzochannel
      @jazzochannel Рік тому

      ​@@KnapfordMaster98 I understand your position, but it's not worth much. Don't take what I said personally. I bought the 2080 ti myself. The most overpriced product I ever bought, and I don't even work with graphics or AI. Nvidia has had criticism thrown their way for many years, the recent stuff is more of the same behavior, but the consumers don't care because there is nothing else on the market.

    • @jazzochannel
      @jazzochannel Рік тому +1

      My point is: _saying_ that we don't like Nvidia's practices is meaningless if we _buy_ their products anyway.

    • @KnapfordMaster98
      @KnapfordMaster98 Рік тому +1

      @@jazzochannel Agreed, and I will keep that in mind going forward.

  • @tomutrein5088
    @tomutrein5088 Рік тому

    I bought the 1060 3GB card when it released, after watching several reviews alongside the 6GB card, because I didn't need more RAM and the reviews said it was only a RAM difference. The $50 saving was the tipping point. I was massively disappointed by the performance of the 3GB card; it wasn't close to the performance of the 6GB card. Finally I got the answer as to why. Thanks for the informative video.

  • @DreamerC1
    @DreamerC1 Рік тому +80

    NVIDIA themselves said in their last earnings call that, due to the excess supply of 3000 series cards, they will manipulate the prices of the 4000 series so they can sell the remaining 3000 series cards alongside the 4000 series, while keeping 3000 series prices high too. Basically, they can't let 3000 series prices fall, because then nobody would buy the 4000 series.

    • @scroopynooperz9051
      @scroopynooperz9051 Рік тому +23

      So screw them: just buy a used RTX 3000 card as the miners are dumping them. Or, if you don't wanna go used, wait another 2 months for AMD to release the RX 7000 series.

    • @Mark-kr5go
      @Mark-kr5go Рік тому +2

      @@scroopynooperz9051 Why buy a used 30xx anyway? Get an RX 6800 or above for that VRAM longevity. The 8GB and 10GB 30xx cards were always a scam.

    • @lilkwarrior5561
      @lilkwarrior5561 Рік тому +1

      No. This is textbook resource allocation being cynically manipulated to more than what it is for people trying to spend as little on GPUs as possible with their limited funds knowing we're dealing with recession-like environments in our respective countries. It's entitlement for something most admit is more important than they realize as a non-essential good to their digital computing happiness.

    • @oldmoney1022
      @oldmoney1022 Рік тому

      @@Mark-kr5go Lol who needs more than 8 gb for gaming dude.

    • @sack8439
      @sack8439 Рік тому +2

      @@oldmoney1022 Have you seen the new games? Bruh stop being under a rock old man.

  • @13Dread
    @13Dread Рік тому +148

    Let's just say, I hope RDNA3 puts Nvidia back to be a consumer-centric company, and not a consumer-scammy-centric company. The way you explained the 4080 12GB is the exact way I felt when I discovered the CUDA count difference. It's even worse the way they gave an "official" explanation on their website in a Q&A where specifically people were asking them about this, and they said "well, it's like 30 series with the 3080 10GB and 12GB". No Nvidia, it is NOT.

    • @haggisman0812
      @haggisman0812 Рік тому +11

      What makes you think AMD will do anything other than offer what Nvidia is offering at $50 less? Not sure how AMD continues to enjoy so much good faith.

    • @sierraecho884
      @sierraecho884 Рік тому +2

      Me too; this is clearly misleading. We here know because we spent quite some time looking these things up, but other consumers won't know those details.

    • @13Dread
      @13Dread Рік тому +4

      @@haggisman0812 They are deviating from the standard architecture, if the rumors are correct. Basically, they are creating a GPU that is essentially what Ryzen was for Intel. Let's hope it is like that, because if it is not, we could be entering a dark age for GPUs.

    • @greenumbrellacorp5744
      @greenumbrellacorp5744 Рік тому +3

      @@haggisman0812 Yeah, the same happened with CPUs: once AMD got the performance crown over Intel, they "Intel-ized" their prices with a +50% increase; the second they won, prices spiked. So for GPUs, while they're the underdog, it's Nvidia minus 50%; if they start winning over Nvidia, no problem charging more. Neither AMD, Intel, nor Nvidia are our friends; the best we can hope for is that they compete.

    • @CimInc
      @CimInc Рік тому +1

      Same... Nvidia gets shadier by the generation....

  • @ash3sgaming812
    @ash3sgaming812 Рік тому +1

    So my theory about the CUDA core difference on the 4080 12GB vs 16GB involves the tensor core and RT core counts. The 12GB may have a higher RT core count to make up for the lack of memory and CUDA cores, to get the same performance out of the card with DLSS. So raw performance would be lower than the 16GB, but with DLSS running on both it could be equal in performance to the 16GB. Idk, we will see when it hits.

  • @storkman
    @storkman Рік тому

    get em Jay!!

  • @donaldduck7628
    @donaldduck7628 Рік тому +78

    Jay, you forgot the reduced bandwidth. They drop from a 256-bit bus down to 128-bit on the 12 gig card. The 16 gig card is 192-bit. The 4080s are cut in more than one way.

    • @marcusellby
      @marcusellby Рік тому +16

      The 16GB is 256-bit and the 12GB is 192-bit, but yeah, that's a major point as well

    • @stewenw4120
      @stewenw4120 Рік тому +6

      @@marcusellby Because it's a 4070. 192-bit has been the memory bus width of xx70 cards since the GTX series. It's a greedy, sh*tty move.

    • @slowanddeliberate6893
      @slowanddeliberate6893 Рік тому

      The higher boost clock in the 12GB allows the performance to not be too far off from the 16GB.

    • @WomboBraker
      @WomboBraker Рік тому +1

      Thanks for pointing that out man, I did not know this

    • @phondekarvidhesh
      @phondekarvidhesh Рік тому +1

      True, and 192-bit is like the 3060-level cards.
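
The bandwidth numbers debated in this thread fall out of a simple formula: peak bandwidth is the bus width (in bytes) times the per-pin data rate. A quick sketch; the data rates are assumptions based on the published GDDR6X launch specs:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth: bus width converted to bytes, times per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

# (bus width in bits, per-pin data rate in Gbps) -- assumed from launch specs
cards = {
    "RTX 3080 10GB": (320, 19.0),
    "RTX 4080 16GB": (256, 22.4),
    "RTX 4080 12GB": (192, 21.0),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {memory_bandwidth_gb_s(bus, rate):.1f} GB/s")
```

On these figures the 12GB "4080" ends up with roughly 30% less bandwidth than the 16GB model, and well below the previous-gen 3080, which is the gap the comments above are pointing at.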

  • @marksaxon
    @marksaxon Рік тому +13

    EVGA saw this coming....they didn't want to be involved with the shadiness. I'm sticking with my 3070 and I will wait until we see how the next 6-10 months go. A 4070 over $600-700 is a no go for me.

    • @codemonkeyalpha9057
      @codemonkeyalpha9057 Рік тому

      Yeah, I think that is spot on, they couldn't keep their ethical customer focus and sell a 4070 branded as a "4080" with a secret core cut. I think that might also point to how recent this change of naming was. I wouldn't be surprised if Nvidia only told their partners a couple of weeks back and that is why this sudden change of direction happened. Buying from Nvidia now is like buying cheap Chinese tat off Amazon, you have to expect lies, false reviews, and misleading descriptions. Sad really.

  • @yngndrw.
    @yngndrw. Рік тому +3

    You have to remember that it's not just the price of the card, but probably also the price of a larger case, and possibly also a new PSU.

    • @medavis
      @medavis Рік тому +1

      Seriously, these 40-series cards are stupid huge and power hungry to the point where I suspect ~80% of the customers who are picking up a 40-series are going to have to spend on either a new PSU, a bigger case, or both. The value proposition on these Nvidia cards is terrible.
      Anyone can make a GPU chip perform better by making it larger, putting a bigger cooler on it and throwing more power at it. What this extreme increase in size and wattage tells us is that Nvidia was not able to engineer higher density, more efficient GPUs -- which is the core of their business and arguably their entire reason for existing as a company. Instead they threw more silicon and copper and wattage at the problem and are now trying to cram that down our throats for an exorbitant price.

  • @robertvickers1441
    @robertvickers1441 Рік тому +2

    I honestly wish there was a mass 40 and 50 series boycott.

  • @manuelgaribi
    @manuelgaribi Рік тому +413

    I was literally about to buy a 3080 but the lack of compatibility with the new DLSS is bullshit. I bought a 6900 instead cause AMD improvements are almost always backwards compatible. Nvidia's behavior is too disappointing honestly.

    • @joshuasterling2144
      @joshuasterling2144 Рік тому

      The 6900 especially on water is very good. Right now you can get a 6900xt Sapphire Toxic liquid cooled card for 899 on Amazon.

    • @davidbrennan5
      @davidbrennan5 Рік тому +1

      I am going AMD. I will wait till the 3rd of November.

    • @julianlopez723
      @julianlopez723 Рік тому

      i didn't know this and was looking for a 3070ti. thanks

    • @GameCyborgCh
      @GameCyborgCh Рік тому

      wait the 3080 isn't compatible with DLSS 3.0? the heck nvidia

    • @mazhaa3989
      @mazhaa3989 Рік тому

      And AMD allows you to use FSR on nvidia cards loool

  • @Smaksum
    @Smaksum Рік тому +49

    When I watched the keynote, after the whole EVGA thing i already had lowered expectations. Then they announced the price, and that made up my mind to skip the 40 series. I have enough performance for my needs, i just hope AMD doesn't follow suit on price.

    • @233kosta
      @233kosta Рік тому +1

      I'm sure they'd jump at the opportunity to grab the market share, assuming they have the capacity to supply

  • @ifrit35
    @ifrit35 Рік тому +2

    When it comes to graphics card I've come back and forth between AMD and Nvidia so I'm really not concerned about what's called what since I'm going to get whatever is best from a price to performance point of view. And honestly, the high end of graphics card doesn't really tickle my fancy. These days I'm most impressed when I see what my Steam Deck is capable of with a measly 15W tdp.

  • @Ruiluth
    @Ruiluth Рік тому

    I wish there was a Micro Center in Oregon. I visited one for the first time a couple years ago when I was in Ohio and it was like a candy store. I even sent a letter to their office asking them to open a store in Portland to replace Fry's. No luck so far, but I freaking love that store.

  • @phobos2k2
    @phobos2k2 Рік тому +62

    I Started into PC building back in the early 90's and it was just a really fun hobby. I know there were some crazy prices back then, but it seems like there wasn't all this drama surrounding products. I know there wasn't. Now, I think with social media, companies can't hide dubious intentions like they maybe could have in the old days. It will get out and it will spread like a wild fire. I feel like 90's me with a copy of PC Magazine and my new 3dfx Voodoo card were maybe a touch naive, but definitely happier with the hobby.

    • @Play-On7
      @Play-On7 Рік тому +5

      Hey since you been in this hobby for so long I have a question to ask. Do you agree that 1997 to 2008 was the golden age of gaming?

    • @FoolOfATuque
      @FoolOfATuque Рік тому

      I remember my first Voodoo rush. It barely fit in the case. Lol!

    • @Ruddline
      @Ruddline Рік тому +2

      I am old too, and 90s me got so much better stuff. I could play anything: EverQuest at max settings with no lag, X-Wing vs TIE Fighter. Now I am broke 2020 me, and I don't want to pay 3K for a PC. I just can't, and I don't.

    • @Andre-nx5xl
      @Andre-nx5xl Рік тому

      @@Ruddline 90's me had a C64 until my first Pentium 100..in 97-ish..:P

    • @kmieciu4ever
      @kmieciu4ever Рік тому

      @@Ruddline I could afford Voodoo 3 as a kid. Now I can't afford a new graphics card as an adult :-)

  • @stuarth317
    @stuarth317 Рік тому +101

    I have to agree with this completely. The naming scheme feels like an intentional act to confuse consumers and let them think they are getting more than they are.
    Whilst the troubles gamers had with pricing and availability of the 3000 series weren't all down to Nvidia, this could have been a fantastic opportunity for Nvidia to show that it cares about its core customer base, but they actively CHOSE not to. It's all about the profits, and they will keep trying to get away with it as long as people keep buying their cards.
    I skipped the 3000 series to wait for the 4000 series, but tactics like this leave me not wanting to buy even if they are the best cards.

    • @HoboInASuit4Tune
      @HoboInASuit4Tune Рік тому +2

      Get team Red, Stuart! Will be interesting to see what chiplets will do to AMD's GPU. We've seen it radically boost its CPU lineup at least.

    • @markcallaghan8389
      @markcallaghan8389 Рік тому +1

      I prefer Nvidia cards. With all this high pricing and the confusing 4080 lineup, I am waiting to see what AMD has to offer in October, in terms of both their new cards' pricing and the pricing of the previous generation. I have a GTX 1070 Aorus that plays the games I am likely to want to play in the next 6 months at 1440p.

    • @stuarth317
      @stuarth317 Рік тому

      @@HoboInASuit4Tune If AMD can manage to pull off the same sort of performance gains and value for money that they have with their CPUs, especially with ray tracing support them I'll be more than happy to give them my money. Fingers crossed for RDNA3!

    • @paulie-g
      @paulie-g Рік тому +2

      @@markcallaghan8389 Even current-gen AMD has very good options for 1440p, like a 6750xt, say. I'll be picking one up once the prices drop even further ahead of/after the launch.

    • @markcallaghan8389
      @markcallaghan8389 Рік тому

      @@paulie-g Agreed. I have just seen an MSI RX 6700 XT for £420, which is around what I'm willing to pay for a card that does 62fps vs 27fps (GTX 1070, my card) in Red Dead Redemption 2 at 1440p ultra.

  • @snavsmatiq
    @snavsmatiq Рік тому

    What you said at 7:39 is exactly what I'm doing, lol. The new cards simply aren't worth the hassle; I'll stick with my 2070 I bought at launch.

  • @zangetsu8639
    @zangetsu8639 Рік тому +6

    Depends on the price, but I'd stick with a 2080 Ti. In my opinion, it's still the best card from Nvidia at the moment; everything after that is just too inefficient, oversized, expensive and way too hot, which also results in them being loud (on air cooling). Just the fact that the 2080 Ti is a bit better than a 3070 Ti says it all in terms of progress. In addition, prices for the 2000 cards are dropping drastically.

    • @Maxatal
      @Maxatal Рік тому

      Yes. 3000 series was supposed to be the refined version of the 2000 series where they get a handle on power draw, heat and size, something we’ve seen before. (1000 series kind of being a refinement of Maxwell). But they pushed it further for performance. Then we were hoping the 4000 series would tone it down. Nope, more is merrier apparently.
      My 2070 Super will last me till 2025 at the least.

  • @braf7349
    @braf7349 Рік тому +75

    Only reason I got a 3090 was because that was what I was able to find/get. Nvidia is getting complacent with what they have, and it's giving AMD/Intel room to grow (I really hope one of the two put Nvidia in their place). It's gotten to the point where I need a dedicated A/C unit to keep my gaming room cool. The cons are starting to completely outweigh any performance boost.

    • @digizilla164
      @digizilla164 Рік тому +7

      I hope AMD comes in swinging and this is time for Intel to start pushing and improving their own product.

    • @Shreddylife
      @Shreddylife Рік тому +2

      My EVGA 3090 runs at 60C while gaming. What brand do you have?

    • @sleeve9097
      @sleeve9097 Рік тому +1

      i got a 3060

    • @AxelKuno
      @AxelKuno Рік тому +4

      @@Shreddylife The 60°C doesn't really matter here; it's about wattage. My point being, a card at 50°C using 1000 watts heats up the room more than twice as much as a card at 70°C using 500 watts, provided their efficiency is the same. The wattage was crazy on the 3090 Ti.

    • @TAP7a
      @TAP7a Рік тому +1

      @@Shreddylife equilibrium temp is irrelevant, power output is the only meaningful factor for room comfort

  • @ScoundrullonYouTube
    @ScoundrullonYouTube Рік тому +5

    Jay is like our financially responsible yet cool father figure: he wants you to have the best time ever, but he also wants you to understand the situation you might find yourself in, and ultimately he will let you make your own decisions before you go out with your friends LOL

  • @giantnanomachine
    @giantnanomachine Рік тому

    Reminds me of the 970 basically being a 3.5GB card being sold as a 4GB card, with the last half GB being connected so slowly it’s basically useless.

  • @Zensuki2
    @Zensuki2 Рік тому +1

    I feel like the 4080 12GB should have been marketed as the 4070 Ti and released with the Ti series. Releasing it under the same name as the better-spec 4080 does seem like a scheme to trick new and unsuspecting buyers