If they don’t start shoveling a LOT of R&D into optimizing for power efficiency soon, their top-tier cards are going to require consumers to put PC-only circuits in their houses. Absolutely insane.
Yeah, the power draw is insane. I mean, for the longest time gamers did not really care how much power was consumed. But now we are approaching a ridiculous degree, where we are not even speaking of problematic heat and noise output anymore, but about having to change the electrical infrastructure of your house. Good news is: it needs only one generation of GPUs where people do not buy the products because of insane power draw. After that, I think we will be fine again :P
Jay, I think that at this point the content creator community discussing video cards needs to start using the phrase _"price fixing"._ The video card manufacturers are now executing the very same business practices that the RAM industry was accused of and sued over in 2002. This needs to be talked about _extensively_ in the tech press.
What happened then? I never heard of that before with RAM. Also, what do you mean, period? Do you mean how the 3080 12GB did not come with an MSRP and could be priced with a wild amount of variance; hence, no "price-fixing" being present (having an MSRP, instead)?
@@somerandomperson8282 I used the term 'dropping' similar to 'name dropping' where you just bring it up out of the blue. But I don't want to confuse anyone with what I mean, so I edited my comment. Thanks for the feedback.
Nvidia “we’re so happy to announce to you, the power of it all. It only requires this one simple configuration that you will all be so happy about. Everyone will just need to get a commercial power delivery to their dwelling units.”
Man I remember when I bought my 980ti for $650 and I thought I was crazy for spending so much on a GPU. Only thing higher was a Titan at that point. These GPU prices are getting crazy.
Missed buying a 1080 Ti at launch for a grand; had I grabbed one and sold during whatever the last 2 years was, that would have been years of totally free gaming. Instead I bought a second-hand 6900XT 🤙
It’s stupid now because of cryptocurrency. Back in the day, GPUs were basically for playing games or editing, so their value wasn’t as elevated as it is nowadays because of mining. And yeah, more people are streaming, so GPUs are one of the most in-demand components in tech right now. I’m mad every time I remember buying my 6900XT for $1400 two years ago… I was pissed, but there’s nothing we can do… it’s the market!
Even if the 4000 series is twice as fast, how can they justify those prices? I remember the top cards being under $1000. It's ridiculous, simply to play games...
I remember getting a top end card for $400 CAD back in 2010 (ATI HD5870). Now I'm lucky if I get a low end card for that price. Even a 3060 is $500CAD.
People bought the 30 series at highly inflated prices because people are dumb. I said this during the last crypto boom that both Nvidia and AMD were looking at this behavior and would price accordingly. Blame your fellow consumer
dude you don't even need a 1080 to play video games. I have a 1080 and I literally don't run into any issues at all playing with a 6 year old card. You could probably run a 980 ti and still be just fine. It's just people have expectations that they are entitled to having the most overkill equipment at all times. People need to chill out, be happy with what they have, and save their money because this economy is in dire straits right now. GPUs are the least of our worries right now.
If idiots didn't purchase GPUs at these prices, Nvidia and retailers would drop them. The consumers are to blame here because they showed how much they would agree to pay.
As a PC enthusiast, I am going to skip the 40 series. I can already feel the steep price and all the manipulation and schemes Nvidia and 3rd-party resellers are going to play, like in previous years. Don't let them scam you out of your hard-earned money.
Do you think its worth to buy a 3080 now? I'm dying to decide if I wait or not. My card is a 1660ti and I can run everything I want very well at 1080p, but I want to upgrade before my computer needs a bigger change to keep up with chipsets and gpu performance.
@@SethOmegaful Honestly, go AMD instead when they come out with their cards around the same time. They will probably have a way lower power requirement and a big uplift compared to Nvidia's 30 series. That's what I'm thinking at least; I have a fantastic 3080 OC Strix card right now.
yep. the nvidia cards are just too powerhungry. most people are not lucky enough to live in a cool enough environment to dissipate that heat, making their rooms waaay too hot. making you not bother actually using your new gpu. if amd can match nvidias cuda/ nvenc performance and built in support in software, thats it. im going all amd. i have always gone amd for cpu, and nvidia for gpu, but this may change soon.
I’m still annoyed with the price jump from the 10 series to the 20 series for the smallest gains. The 1080ti was $699, the 2080ti was $1200! Then the 3090 and 3090ti come at $1500 and $2000 respectively. All based on 102 class silicon of their generation. The 90ti is just a renamed 80ti. We didn’t get a GA-102-400 for the 90/Ti to be a titan. So from 10 series to 30 series the MSRP jumped just shy of 3X.
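For what it's worth, that "just shy of 3X" figure checks out; a quick sketch using only the MSRPs quoted in the comment above:

```python
# Flagship 102-silicon MSRPs quoted in the comment, by generation.
msrp = {"1080 Ti": 699, "2080 Ti": 1200, "3090": 1500, "3090 Ti": 2000}
base = msrp["1080 Ti"]
for card, price in msrp.items():
    print(f"{card}: ${price} ({price / base:.2f}x the 1080 Ti)")
```

The 3090 Ti at $2000 lands at about 2.86x the 1080 Ti's $699.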
90 and 90 Ti class GPUs are ripoffs; Nvidia is clearly targeting daddy's-money territory. A 3080 costs $699 and is only 6-8% slower. The 90 class is clearly a Titan-class GPU for data center and high-VRAM tasks.
@UCi6sz_Hw_80bUYMgVhCAtpg No, a 3090/Ti has nothing that makes it a Titan. A Titan had the excuse that it could use certified pro drivers and still play games for private use (instead of a Quadro card). A 3090/Ti has no certified pro driver support, so it's just an overly expensive gaming GPU that happens to be good at video editing, and even then you have no certified drivers. Certified pro drivers are drivers specific to one program, where "certified" means the driver will not crash and works flawlessly.
As soon as I heard that the 4080 would potentially require 420w, me and my 750w PSU and 1440 display noped outta there and picked up a 3080 12gb while the prices were down; that tactical maneuver is looking wiser by the day.
Went to AMD and got myself a 6950XT at MSRP . I'm not waiting for months just to "have the chance" to buy a 3000 dollar GPU and I sure as hell ain't paying double the cost of a 6950XT for a 3090 Ti.
And you won't see enough improvement to justify the cost, that's for sure. It's never worth it to upgrade to the very next generation from what you currently have.
@@lyianx Not quite true. I upgraded from an MSI 970 4GB to an MSI 1070 8GB back in 2015-2016 and that was a huge improvement. Sold my old 970, and it cost me $500 for a 70% increase from what I saw.
Yeah, making it bigger isn't really what i would call "innovation". At best it's a sideways evolution in regards to tech. Sure there is a place for more GPU power at all costs for certain applications, but a next gen GPU is supposed to deliver more for the same amount of power/money or less.
Performance per watt needs to be focused on a lot more by YouTubers and consumers. Intel and Nvidia are out of control; AMD seems to be putting in some effort.
It's all falling into place. Make GPUs super power hungry and suddenly people forget about performance and now want more efficient ones. What a great way to avoid making things better. Just make something worse so you can "fix" it in the next generation. Apple would be proud of Nvidia and Intel.
Yeah, considering companies like WPS are already having a hard time keeping up with the power demands of communities, having a PC in your home that has a 1k watt power supply being fully utilized when gaming is insane. Also, my 3080 rig produces a ton of heat, I can't even imagine what a 4090 rig would produce. It's basically a furnace with RGB fans at that point.
@@burrfoottopknot Hopefully the 40 series cards don't sell well. That's the only way a change will happen. But I think we all know they are going to sell out day one sadly. New generations should draw the same amount of power as previous gens, maybe a bit more, and perform better. That is true evolution of a product, not just putting more power hungry stuff into a chassis and calling it better.
@@burrfoottopknot more so a step down in processing size. 420W at the 4080 tier makes it look like they are still on the 10nm node, instead of the 7nm that the 6000 series for AMD is on. not only are the AMD cards less power hungry, but performance per watt is also higher. RDNA3 will hopefully be on 5nm, and thus showing Nvidia that they need to put way more R&D into more power efficiency. aka stepping down in processing node size from 10nm to 7 or 5nm. a bulk of their fab reserve is at TSMC, and TSMC has finished up research on 5nm and soon even 3nm processing nodes, so obviously there is room for improvement.
AMD has decent power consumption because they sacrifice some performance for it. Intel knows their chips are strong, yet run hot and heavy, so they go all in. This only works, though, because you can still run an Intel chip with a stock cooler and generally be fine. Nvidia, on the other hand, is jacking up both things, and they are making it so that you HAVE to make changes for it to function.
My computer's power consumption is genuinely driving my apartment's cooling needs. It's ridiculous and if anything I'm going to be angling for a more power efficient card even at the expense of performance. It feels with the price and power increases that we aren't getting better architectures, we're just getting bigger, thirstier, more expensive parts.
Agree. Just seems that we’re going through iterations of brute force rather than anything clever going on. Maybe the SOCs of apple and the like will be where it’s at in 5 years or so
I mean, here in Europe and anywhere the 240-volt standard is used, it's actually scary how close it's getting to that limit. I'm very worried about the 5000 series and how it will basically wreak havoc on EU and other 240-volt-standard houses.
A 4090 Ti and an overclocked 12900KS, a win-win for the winter. AMD is a BIG step toward lower power consumption, and undervolting the card is a must: it saves more than 100W with little FPS drop.
@@eddiegraham3241 They are no less greedy than the ones who buy the product. People who always have the latest and the greatest equipment need to take a minute and think through what's really important in life. No one is forcing anyone to make the purchase.
What was a major factor in killing the mighty US car industry? The energy crisis. Who exactly across much of Europe does Nvidia think it will be selling these cards to, in the volumes they want, when people can't afford to keep the fridge stocked and running? If AMD/Intel come out with a line of more power-efficient, good-enough GPUs for reasonable money, they will mop up.
The only problem I have with AMD is their cards are still not as powerful, which makes the decision very difficult for me. I would love to go AMD and stick it to Nvidia, but the performance really just isn't there yet.
@@peter2liter yeah, i'm not saying they're garbage or can't do anything, i simply want the best possible performance all around. AMD is still lacking, unfortunately. getting better though, which is nice to see.
@@DragoNate well if you think its worth it to pay 1000 more to get 5-10% more performance (to go from 150 to 165fps) then go for it. even at 15% more (you very rarely get a higher difference except on specific games) i would question the sanity of someone making the choice to pay that much but hey, its your money - your choice
@@jpteknoman The sanity comment only makes sense when money is the issue. When the price has no relevance, the choice is not about currency vs. performance. People, including people like Jay, play to the crowd about value for money while assuming the purchases are on a level playing field, which they are not. Basic economics: luxury products do not operate under normal economic calculations; they operate in the opposite direction.
Anyone remember the days when anything that was about $1k was a professional card (video editing, graphics rendering, etc.), and a high-end graphics card was $300-450? If prices are going up to $3k for high-end graphics cards, it will likely be 30 years before my next upgrade.

Edit: maybe I'm just too old... There was a time when graphics card companies would master the manufacturing process to bring down prices with VERY little or no loss (most of the time there were gains, because you could also fix the bugs). We need that to come back as well.

Additional edit: because I have seen this comment more than once and in more than one way, I should clarify: the $300-450 price was for lab-produced cards. They were not mass-produced in the typical sense, because figuring out how to mass-produce was the tricky bit. So cards used to have two stages for the public: lab cards and factory cards. Again, maybe I'm too old... remembering stuff too few do.

For historical purposes: some tech tubers have alluded to this era in other ways. Products used to have two different model names, short and long. The short name was on the front of the box; the long one was the model number/serial. The long name was useful for nerdy/geeky talk, because if you understood the model number system, you understood what was under the hood. It allowed faster conversation and drooling. And for conversation purposes, you could swap around segments of the model number to build your dream card. Just because something was new and could brute-force something from earlier times didn't mean there weren't more interesting combinations due to the behavior of the previous iteration.
The time they brought down prices and didn't make any less profit was due to increases in sales quantity. Right now, they have less resources at hand due to risen cost.
If you don't increase profit year over year and give all the benefit to stockholders, your company is viewed poorly. It's really stupid, and it's because the stock market is a Ponzi scheme.
Remember that the GeForce 256 from 1999, on a 220nm node, had 17 million transistors in 139mm². It had up to 64 MB of DDR RAM and 50 gigaflops of processing power. It cost 279 USD in 1999, or 488 USD in 2022 money. The RTX 3070 is on 8nm and has 17.4 billion transistors in 392.5mm². It has 8 GB of GDDR6 RAM and up to 20.31 teraflops. Its MSRP is 499 USD.
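Taking the comment's numbers at face value, the generational leap in raw value can be sketched quickly (all figures come from the comment above, not independently verified):

```python
# Quoted specs: GeForce 256 (inflation-adjusted price) vs. RTX 3070.
gf256 = {"transistors": 17e6, "gflops": 50, "usd": 488}
rtx3070 = {"transistors": 17.4e9, "gflops": 20_310, "usd": 499}

# How much more compute you get per (2022) dollar, and how much
# the transistor budget grew over those ~21 years.
flops_per_dollar = (rtx3070["gflops"] / rtx3070["usd"]) / (gf256["gflops"] / gf256["usd"])
transistor_growth = rtx3070["transistors"] / gf256["transistors"]

print(f"GFLOPS per dollar: ~{flops_per_dollar:.0f}x better")
print(f"Transistor count:  ~{transistor_growth:.0f}x more")
```

Roughly a 400x improvement in compute per inflation-adjusted dollar, which is the point the comparison is making.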
I dunno, man. I'm coming up on 40 years old, and I've been building PC's my entire life, but with the way things have been, and the way we've been taken advantage of in terms of this ridiculous pricing, I'm just not at all excited anymore for any of it. I'm running a 2080 Black right now, and I'll likely run that until it stops working, and then just wash my hands of PC building. I'm gonna buy a laptop and a console and call it a day, and Nvidia, Intel and AMD can all go kick rocks as far as I'm concerned.
Im right there with you. It used to be fun having tons of options to build a computer for a decent price. Now the fun has just been sucked out of it and its more as if they are doing us a favor for selling us components at 4x the price.
Yeah, I've built 3 PCs, one every 4 to 6 years, and I'm happy with my PS5. My PC only gets used for work and barely any gaming now since the PS5 launched. Wish I'd got one on launch day.
Couldn't agree more. I used to build a new PC every 4-5 years so much fun and didn't break the bank... the fun has been sucked out of it (similar feeling with retro gaming).
I think demand for 40 series cards will be pretty weak overall, though they will still probably sell out at launch. The cost of living (particularly energy) is so high that regular people have bigger priorities than upgrading their PCs with more power-hungry cards, and mining demand has dried up completely. The only people buying them will be the enthusiasts who have to have the latest and greatest. Everybody else has already bought a 30 series card, is planning to buy a 30 series card, or is happy enough with what they have got. I have been waiting to upgrade to the 30 series since launch, but if the 40 series is priced too high and there are no 30 series cards to buy for cheap, then I'm just sticking with what I've got until it breaks. I have learnt to be patient; having waited this long, I might as well keep going until the cost of upgrading actually makes sense.
PC gaming has never been worth it at the high end anyway. You're paying like 4 or 5 times as much for a bit more FPS and quality and some ray tracing and so on. It's never been worth it unless you are an enthusiast with money, and the rising living costs likely won't affect those people.
I 100% agree. I skipped the 3s completely. I may grab a 4 if it's not too ridiculous. But having to grab a psu at the same time will be a pain. Cards are getting like phones where the next year whatever you have will be surpassed in power. You'll have the new series come out. Then the next year the TI, then a new series, rinse, repeat. It's getting silly for something so pricey.
I bought a 3070 for msrp from Best Buy and I’ve seen them in stock a few times after as well. They’re not always in stock, but it seems like it’s getting easier and easier to buy cards now
I was planning on building a new PC when the 30 series came... Now I just feel so fatigued from this market, I don't even know if I care anymore. My 1070 still pushes alright frames at 1080p, and gas is expensive.
Jay and Linus are the only youtubers that I look forward to seeing their ads. Linus has the segues to his ads and Jay has the I Fix It kit ads. I love it. lmao
I get the feeling every 4090 and 4090TI will be water cooled. That'll keep temps in check, while allowing Nvidia to charge 2k for a 4090 and 3k for a 4090TI
Nvidia does not do water/AIO cooling, board partners will. they will still use their 3090TI air cooler on the FE version, but board partners will probably not be able to get away with a 3 or 4 slot cooler.
That's my plan too. Until the 4090, if that works out, my 1080 Ti has to hold on. I generally heat very sparingly anyway and get along fine at 16 to 18 degrees. Let's hope for the best, including on the pricing of the new top-end cards.
@@QoraxAudio But here in Germany… gas will be much more expensive than electricity ⚡️. Both our countries are working together on LNG terminals, so hopefully prices here in Germany will drop… In the meantime, for the coming winter, electricity will be cheaper (convector heaters, etc.). I myself bought a stove and wood for heating my flat this winter. 🔥
Bright side for me, I'm still rocking a 980 ti, when I upgrade it was going to be from scratch, everything brand new. So having to get a new PSU or new case was already part of the plan. I'm mostly concerned about price and the power bill.
Ayy me too. I have the MSI version and its still holding up pretty well. Although i don't play many games these days. Except i already upgraded the rest of my system a year ago. Considering upgrading my gpu to an AMD 6800 xt.
The power bill could be the problem for me. In my country, electricity could go up by 200 to 300%, which is nuts. Availability and pricing will be key, but if the power requirements are stupid, I will skip this generation.
I'm rocking a 970 lol I upgraded my computer with money I got from a bonus earlier this year and was able to upgrade everything except my GPU. Hoping to get a 3070 but if they're going to become unavailable before I can get one... idk what to do lol I'm not shelling out more than the $6xx the 3070 I want is worth, but I'm not buying used... Really don't want to spend that money on a card with no warranty left.
@@miranda.cooper This availability issue is just talk. It's not going to happen, because those people who have 3080/90s are going to sell them to put money towards a 4080/90. So buy second-hand if you can. Just make sure you aren't buying an old mining card; otherwise its lifespan will probably be weak.
I have a feeling that AMD and Intel could be golden just by trying to maximise the performance on their cards with a maximum cap of 300 watts. With currently inflating energy costs, especially in Europe, these cards gonna be way more interesting for consumers. Of course these cards have to be „reasonably“ priced against Nvidia.
The thing is though performance comes at a cost. Not just money for a card but also for power. Higher performance requires more power as it runs at “higher performance”, setting a cap on something like gpu wattage (especially as low as 300 W) means performance is limited no matter how hard people try.
Bingo. I checked my system usage. It has a 6-core i5 thingy and a GTX 1660 Ti. Playing Cyberpunk on it: 240 watts. Playing old games: 90-ish watts. Hell, video encoding without using the GPU? 130 watts. I can live with this.
Since I'm skeptical of the release of RTX 4000 series and also started to get impatient with just all these rumors going around, I just went ahead and got myself a used RTX 3080 from local marketplace for a pretty decent deal. This market is just a mess and Nvidia needs to get a grip, crypto has crashed, and who knows how long it's gonna take to recover, if ever, and inflation is high so extra expenses are lower for many, many people out there.
Same. Has been time to build a new PC for a while but I didn't want to give in to scalpers and the inflated market. When prices came down I finally bought my new hardware - got a 3080 Ti for a reasonable price brand new and I don't think I'll be wishing for more performance any time soon.
@@bretttanton328 No worries about being curious, got it for 690€, specifically the Zotac Gaming AMP Holo one. As to where, Slovakia. Could've gotten a RTX 3080 TUF for 720€ about 2 weeks ago, but that was taken from me immediately the next day after seeing the ad. Still happy with my purchase, nice looking card with decent temps. Also upgraded my PSU because of this, went with the Corsair RM850 as it was the most affordable and reasonable one to get locally.
More cores, higher frequency, more power draw, more money. I'm tired of this race to pull more and more power. Other computer technologies are getting more performance with the same or less power. They also do it for the same or lower price! Time for the GPU manufacturers to get back to that. Sure there will always be some flagship products that draw an absurd amount of power, but I should be able to buy a $300 GPU that draws 150 watts and more than doubles the performance of an RTX 2060. Heck, it used to be you could buy a decent mid-tier card for $150. Those days are gone.
Well, that's also a function of inflation, which has been at an average of 6% a year for the last few decades (using the real formula, which was discarded in 1970).
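To see what that claimed 6% average does over time, a quick compound-growth sketch (the 6% rate is the commenter's figure, not an official CPI number, and the 20-year window and $150 card price are just illustrative round numbers):

```python
# Compounding the claimed 6%/year inflation over 20 years.
rate, years = 0.06, 20
multiplier = (1 + rate) ** years
print(f"Cumulative multiplier: {multiplier:.2f}x")
print(f"A $150 mid-tier card then would be ~${150 * multiplier:.0f} in today's money")
```

At that rate prices triple in two decades, which would put yesterday's $150 mid-tier card near $500 on inflation alone.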
That's the reality of GPU tech: it actually is more power efficient, but the speed is a lot higher, and that means even though it's more efficient per frame, it ends up drawing more total power than last gen.
Man do I miss the days when you could EASILY build a beast of a PC for the price of a console. I feel like we need a major recession because these companies are just getting disgustingly greedy
Remember when Bill Maher wished for that to get Trump out? People die when that happens... but ya let's hope for that because you don't like a presidents mean tweets or hope for more affordable graphics cards 😂🙄
Honestly, these types of videos are starting to become my favorite JTC content. Just Jay talking into the camera , speaking his mind, distilling a lifetime's worth of computer knowledge and computer history and trying to peek out ahead into the PC landscape of the future. No editing pressure, no pressure to be funny or quippy, just my guy giving us the 411. I started with many other PC and gaming channels. JTC is starting to clearly pull ahead, in my opinion. The crown is yours for the taking. Keep em coming!
yee, i don't even watch LTT stuff anymore, JTC is the only PC hardware channel i watch regularly unless I'm looking for specific hardware reviews during a build.
TSMC is not letting them cut allocation, that was the full story. Nvidia is screwed with oversupply as they were expecting the crypto boom and GPU demand to continue
This video actually made me look at GPU pricing again. I'd set my goal years ago: once RTX 3070-level performance is available for 300 bucks or less, I'll upgrade. Anything more is just not worth it. And even with these "massive" price drops due to crypto miners selling off a bunch of cards... yeah, still 500-600 bucks in my country. Unless my current GPU dies, I'll be stuck with it for even more years. Its 5th anniversary is coming up.
Yes, what they are pushing right now is insane. Who has $1500, let alone $3000 for a card that can dim the lights in a small city? RTX9090: 440v/3-phase/25kW But checkout all the videos of how Minecraft looks AWESOME!!!! on it!!!! Too bad you will need a 30-year mortgage to get one :(
7:40 I think I'm just gonna wait and buy... nothing. The GPU bubble priced me out of the market to upgrade from my RX 580. Manufacturers explicitly trying to un-pop that bubble get zero instead.
@@AsquareM Yeah, I picked up a rx 6600 xt for $400AUD. This should be good for the next 5 years. I don't play fps type games, but even if I did, I still wouldn't understand how people justify spending more than $500USD on a gpu, I guess if you're into VR it may be worth it or you need it professionally.
Just bought a 3070 a few weeks ago. Got a great deal and was a great update from my 1070. Not gonna bother with 40 series at all, I’m just glad I was finally able to buy a GPU for under $600
I was able to buy a high-tier 3060 brand new at a very good price a month ago, and I've already overclocked it to (mostly) match a 3060 Ti. I will not give Nvidia money for the 40 series: one, because of all these rumors, and two, because for two years they laughed at us and were deaf to the gaming/arts/work market in favor of miners. So f Nvidia. These cards will be capable of gaming until the 70 series, and they'd have to kill DLSS to make us buy new things.
I'm really interested to see what happens with the 40 series (performance, price, power draw, transient power spikes, PSUs handling it, temps coming off the PC, do you need a dedicated circuit for the PC...), but I can't mistake that for wanting to buy one. I bought a 3080 only to improve my PCVR apps - which it did, a bit, I mean I won't go back to my RTX 2060 SUPER, but the improvements didn't blow my mind - and I'm not interested in upgrading again until GPUs at or below the x080 can go 100% above the 3080.
yeah i got a 3080. I dont plan on upgrading for a good long while, i went from a 970 so the performance increase for me was pretty massive. Was interested in getting the 3080ti but couldn't justify the extra $200 for a few more frames. like if its that bad and i already turned down everything i can turn down then its time to upgrade period. which im hoping won't be for another decade, or 5 years at least. i kinda do want games to improve graphically in a big way but my wallet is like. please dont, take ur time lol.
I'm glad I was able to get a 30 series card when I did (end of February). The 40 series might've priced me out just from the higher power requirement. It's nuts.
Holding on to my 1070 out of spite at this point, not necessarily cause I can't afford it. No new games coming out I feel are worth upgrading for anyways. GPUs haven't been fun to talk about for years now, and it doesn't look like that'll change any time soon.
For 6 year old cards, 1070s and 1080s are absolute beasts. 6 years ago, a 6 year old graphics card (570 and 580?) were struggling to keep up. But the high 10s can handle VR
@@V742 Yup. Had I known the 10 series was the last time GPUs would be fun, I would've bought a 1080ti. 1070 still produces acceptable frames, so long as I tweak the settings.
@@kimjongpoontv69 They stated >50%, not doubled. And the bigger the card, the smaller the increase, because of power limits. A 7900XT will be nice if it can hit 50%. A smaller card like a 7500XT should be doubled compared to the 6500XT. When Nvidia takes 450W and AMD only 350W (TDP on AMD cards is only GPU power, not the whole card), then AMD has a win. If their chiplet architecture works, it will be much cheaper for AMD to produce such cards, which will pressure Nvidia and stop the insane pricing. And even minimally defective chiplets can be put together for a bigger card, because it does not matter if 20% of the die is defective and disabled. If you disable a lot of a big die, like Nvidia did with the 2060 GTO, manufacturing such a chip is twice as expensive compared to just using the smaller chip.
AMD is already close to Nvidia, and better at rasterizing. And you can bet that AMD will have improved their ray tracing solution without dedicating extra silicon to it (tensor cores). Their hybrid approach was the smarter way, even if it's not fast enough yet.
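The chiplet-yield argument above can be sketched with a simple Poisson defect model; the defect density and die sizes here are illustrative assumptions, not real foundry figures:

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson model: probability a die has zero defects."""
    return math.exp(-defects_per_mm2 * area_mm2)

D = 0.001  # assumed defect density (defects per mm^2), purely illustrative

# Wafer silicon consumed per *good* GPU, assuming defective dies are
# screened out by testing before packaging (known-good-die flow):
mono = 600 / die_yield(600, D)            # one 600 mm^2 monolithic die
chiplets = 4 * 150 / die_yield(150, D)    # four 150 mm^2 chiplets

print(f"Monolithic: ~{mono:.0f} mm² of wafer per good GPU")
print(f"Chiplets:   ~{chiplets:.0f} mm² of wafer per good GPU")
```

The point is that a defect wastes only one small chiplet rather than a whole big die, so the wafer cost per shippable GPU drops, which is the cost pressure on Nvidia being described above.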
@@eliotrulez Well, if they utilize MCM it'll be a lot more than just 50%, that's for sure. I'd be extremely disappointed with 50% unless they retain the same prices.
I bought a 2070 super from nvidia right before the 30 series announcement. I felt like a total chump at the time. Well, you all know what happened and I felt redeemed. At this point Nvidia would have to really make some game changing moves towards customer satisfaction to get me to buy anything if theirs again.
looking at prices even after the price crash, my 2070 was an unbeatable investment, bought it end of 2019 for 400€ and I'm pretty confident I could live with it for another 2 years
Literally the same with me man. More than enough for current gaming. Even with the old motherboards such as the B550, they are going to the same design from the x570 lol.
Dude, I bought my 2070 Super ONE WEEK before our nationwide lockdown (the first one); I think it was just before or just after the 3000 series announcement. What I regret is not buying 2, because one week after my purchase the prices went up a lot, and then they went bonkers as we all know. I could've sold that 2nd card for 3 times what I bought it for... I'm not planning to upgrade unless I see a HUGE gain across the board, like you guys say, at least a 35% increase at $500; otherwise I'll keep this GPU for 5+ years. Maybe I'll upgrade on the 6000 series.
I am still on a waiting list from last years release for a 3070, so I will definitely not be jumping on this unless they meet the price at the same level while giving more power. Which is highly improbable.
And Moore's Law Is Dead just said the opposite: retailers have an oversupply of expensive GPUs that they may need to sell at a loss, and they requested a delay for the 40 series. Retailers were forced to buy loads of current-gen GPUs, in a market that doesn't have demand, in order for Nvidia to actually supply them with the 40 series when it becomes available. He also said that pricing may drop even further (especially on the used market).
Last gen is about to crash on the used market for sure. I think MSRP holds at retail, but once the 40 series comes out I can see 10-25% sales starting to happen on new last-gen cards.
TSMC has raised prices by around 20% for the 40-series chips. Nvidia is going to pass this hike on to the consumer and also add a little bit on top of that. So at the very least, the RTX 4090 Ti will cost a minimum of $2500. A $2999 price is not that far-fetched!!
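As a sketch of the pass-through arithmetic in the comment above (the 20% wafer hike is a rumor, and the extra manufacturer margin is purely an assumption for illustration):

```python
# Rough cost pass-through sketch, assuming rumored numbers:
# a ~20% TSMC wafer price hike plus a small extra margin on top.
def passed_through_price(base_msrp, wafer_hike=0.20, extra_margin=0.05):
    """Estimate a new MSRP if the full hike plus margin is passed on."""
    return base_msrp * (1 + wafer_hike + extra_margin)

# Starting from a $2000 flagship (the 3090 Ti's launch MSRP):
print(round(passed_through_price(2000)))  # 2500
```

This ignores that the wafer is only one component of the bill of materials, so it overstates the effect; it just shows how a 20% hike plus margin lands near the rumored $2500 figure.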
Unfortunately ReBAR (Resizable BAR) seems to have significant problems with some older titles, CS:GO being one; with ReBAR enabled my 12400 performs worse than my old 4790K did.
I have no interest in next gen. Just replaced my card from 2015 with a brand new RX 6700 XT two days ago. Love it, I can't believe what I was missing graphically. I will be good for a while.
As well you shouldn't. The longer we hold out, the quicker prices will normalize. Sadly people will pay Nvidia's extortion prices and nothing will change.
I made an error, and two months ago I switched from my old G752VS (gaming laptop) to a desktop with a 12th-gen i7 and a 3080 Ti. I don't regret it (a 1070M is far less powerful than its desktop counterpart), but looking at how annoying their practices are, I don't wanna be part of the problem.
@@MrlspPrt I don't think you made an error. I have probably waited far too long but I stuck it out. Next month the MB, processor, memory upgrade happens.
At this point in time, I'm thinking of going from my GTX 1080 to AMD's 7000 series. I think the 7000 series will be somewhat more popular than the RTX 4000 series at this rate, if and only if AMD really ensures that the power draw is reasonable.
That's reasonable, but consider that AMD draws its prices from the competition; if the 4000 series cards are revealed at a huge price point, AMD will follow suit.
@@oxfordsparky AMD has much less of the market and is always cheaper. You do understand what a fanboy is, right? People who continue to buy Nvidia and Intel every generation at higher and higher prices are the definition of fanboys.
Dudes, I'm on a 970. I was gonna wait for the 40 series and was expecting to wait till November/December for the 4070, but Jay has made me totally flip my mindset and I'm just gonna buy a 3080 now rather than wait months and struggle to get a 40 series anyway. Done deal. I'm sure I won't be disappointed.
Just get a used 6600 XT and sell it for basically the same price in 5-6 months. The 3080s will have dropped even more by then, and some used 40-series cards will be on the market.
It wasn't long ago they posted a video about saving money by not overdoing it: buying bigger power supplies than you "need", motherboards with more slots than you'll probably use, and cases that fit too many fans. If I had taken their advice a few months ago when I built a new rig, I'd be up shit creek right now, having to buy another power supply and case to better support heat rejection. I'm glad I listened to myself and "overbuilt" my newer build. At least now I know it will support the 40 series without any issues. If you listen to every YouTuber's advice it's probably going to end up biting you in the ass. Do what you feel like doing; stop listening to YouTubers' speculative advice.
DON'T buy anything. I'm on a 980 and there's really no reason to upgrade right now, because there aren't any games worth playing that need the upgrade. When the 40 series releases, those 30-series cards will drop in price, as will every other card. Just bide your time and wait for these miner cards to continue to dump.
I just bought a 3080, upgrading from a 1080 Ti, and the upgrade is very noticeable, so yeah, totally worth it at the prices they've been selling for since last month.
I've been holding on to my little old GTX1050 since 2017 - the 16/20 series didn't manage to convince me in terms of performance per dollar and between the silicon shortages, the pandemic, scalpers galore, and more recently inflation, I've entirely missed out on the 30 series. That said, I'm mostly a retro / "patient" gamer, preferring to pick up the latest and greatest games of the previous decade or two on Steam for under £5 apiece, so far be it from me to need bleeding-edge hardware. However, we're at the point in time where games from the late 2010s are going on sale for basically pennies and peanuts, and I keep finding that I'm teetering way too close to the limit of the 1050. With that in mind, I'll be really curious what the 4050 will bring (or whether it'll even exist - remember, we never got a 2050 either) and for what price.
@@lolthisnerdsaidmemes That depends on whereabouts in the world you are and which variant you're looking at, really. To me $480 sounds a bit too steep for a 3060, considering here in the UK they can be had for £320 (~US$380) for the 12GB Asus version brand new from Amazon with free next-day shipping, or even as little as £160 (~US$200) on the used market if you don't mind spending an hour or so giving it a good clean and replacing the thermal paste. If you happen to be in South America, Eastern Europe or Asia, $480 might be a bit more appropriate, considering hardware usually costs a bit more in those areas due to import taxes and whatnot.
After looking around, I would honestly recommend a 3050 if you can find one. Sounds like a good fit for you and your needs. 8GB of VRAM is plenty for games from around 2015 going on sale. Take GTA V, for example, which is technically a 2013 release (2015 for PC); a 3050 is more than enough for that and similar games, as well as other tasks. And I can find them here in the US for under $350. Seems like a win to me.
@@lolthisnerdsaidmemes I would say that it really depends on what your needs are and what kind of games you are going to play. Are you doing any sort of video editing, or are you mostly just playing games? And which games? I say that because, as I mentioned in my previous comment, an RTX 3050 8GB for under $350 USD seems like a great deal to me.
Love the PC world but honestly, if these prices are even remotely close to real, I think I'm done with this stuff. It's hard to not justify just snagging a PS5 or Series X, enjoying gaming there, and having a moderate PC build for productivity and everything else. It's hard to justify paying this much money just to play videogames anymore, but maybe that's just me.
Same man. I am a lifelong pc gamer but I will switch to consoles if prices stay like this. It's just a fucking game, I am not going to spend a fortune to play a game in better quality than a console.
You don't need a 3080-3090 to play games at a decent frame rate and resolution, my 6600xt kills it at 1080p-1440p and I got mine for under MSRP at $339 and the 3060ti is going for right under $400 right now...
@@cemsengul16 Same, I'm still using an undervolted RX 590 and I can't complain; it still manages 70-ish frames at high/max settings, so it's good enough for me. I mainly use my PC for music production tho, so gaming takes a backseat tbh. Hell, these days the consoles have 2070-adjacent performance, so the value proposition for consoles has never been stronger.
You don’t need a top end PC if you can’t afford it. Even if you buy a 4060 when it comes out it would annihilate the new consoles in performance across the board.
Got my 2080 Super at the start of 2020 (a month before covid), and I'm still happy with it. What I like _least_ is that I can use my PC effectively as a *space heater* when running FurMark and Prime95 together. I'll wait until power requirements come down and efficiency improves before I pull the trigger on an upgrade, provided my current card lasts long enough.
I did the exact same thing. I almost kept my old 1070 figuring I'd buy a new 3080, but then remembered how stock is usually low on a release. As it turned out, it was one of the best decisions I could have made.
I’m still using a 980ti and it performs perfectly on highest graphics settings for modern games at 60fps considering I only have room for a 1080p monitor. Old cards aren’t bad, but if I wanted to get to 144hz then I’d have to shell out
I lived with a Radeon HD 6970 until 2016, and that card was a space heater. I upgraded to an RX 580 after that, and it ran so cool. Now I've upgraded to a 3090 Ti and I once again have a space heater. Seems to me that every 6 to 8 years you'll get a good card that's not also a space heater.
Had a Red Devil 6900xt Ult. Best PC purchase I've ever made. It didn't even get out of bed when I maxed out RDR2, Chernobylite, Metro: Exodus or anything else.
I was all set to drop 1500-2000 on a new PC in Spring 2020. Not even thinking about it today. Not playing this game (or any game, pun intended) by their rules anymore. It would take a complete reset on the supply/price/value equation to get me back. Just sad how they all conduct themselves. I recall when 'entry level' gpu performance was only $300 and high end was $700, now it is $1000+ for entry and $3000+ for high end. I. Am. Out.
@@stevedixon921 Imagine how I feel when entry level gpu performance was $40-70 and high end was $250. I think I spent something like $150 on my hd4850 in 09, something like 160-175 on my hd7850 in '13. Both decent midrange cards, around the same price over 4-5yrs span. Still running the hd7850 so it's been around 9yrs now. But 'lower mid range' or nearly entry level is $400-500? Ouch. I think my geforce 6600gt was only around 120 when I bought it, burned up in less than a year and bfg replaced it with a similar 7 series like a 7200gs or something. But that was back when cards were still on agp ports vs pcie.
Honestly... I don't see many consumers noticing any big difference jumping from 30 to 40 anyway. I'd just get either if it's on sale and then stick with it for the next 5 to 8 years or so, maybe even longer if they're still good.
@@SuperSavageSpirit Nahh, I'm still chilling on a 1060 6GB; I think it'll hold out pretty well for a teensy bit longer. I think the 30 cards will be okay for a lonnnng time even at high res.
I'd laugh if I saw a $3k 4090 at launch. At that point, any "inflation" arguments they may have used to justify the pricing will be absolutely invalidated, as this is a move to capitalize on what they can potentially get from the market based on how crazily people overbid for graphics cards during the mining/scalping craze, and has absolutely nothing to do with the buying power of the dollar.
With Ethereum moved over to proof of work, as opposed to what was stupidly profitable mining, there shouldn't be any real threat to supply/demand from that side. I imagine the scalping will be much more short-lived too because of this :) We can only wait and see what happens with the 40 series.
I believe them when they use inflation as an excuse for bumping prices to keep their share highly valued on the stock market. To a corporation like Nvidia, share price is the only point of consideration. If they maintain the prices as they are tier for tier right now regardless of cost of manufacturing, the shareholders will complain Nvidia isn't fighting inflation enough and their share value is reduced. That's the price (no pun) of being a publicly-traded company. The customer takes the last place in the hierarchy of priorities.
@@paul.1337 You're misunderstanding me, I'm not saying inflation doesn't exist, nor am I saying there aren't real world impacts as a result of it (e.g. rising prices). What I'm saying if they release a $3k 4090 it will be more about Nvidia seeing what people were willing to pay (i.e. over twice MSRP in many cases) and them wanting a piece of that action being the primary cause for this increase in pricing and largely having nothing to do with inflation.
@@EversCS Ethereum is still proof of work... it is going to be moved to proof of stake. As for "stupidly profitable mining": right now miners are earning more Ethereum than before, it's just worth less now. Which means when the price goes back up, miners will be earning more than they were when it was "stupidly profitable mining".
Damn that's a real scummy move, Nvidia. It's not enough that you are sucking up ever greater amounts of that expensive electricity, you've got to screw people wanting to buy cards that they can actually afford *during an economic depression!?*
Yeah it's an eye opener for sure. Nvidia couldn't give a shit about the consumers who supported them for decades. They traded us for miners buying truck loads without hesitation.
@@thejohnbeck people forget this. And they forget that part of the recession/depression we're in is due to rampant inflation from money printing and restrictions on logistics and fuel.
There is no economic depression, we're not even in a recession yet. The government's deliberately slowing the economy to avoid hyperinflation. Companies have no reason to increase the prices other than the fact that they know everyone is still flush with COVID-19 stimulus cash and they want to take it out of your wallet in every way possible.
@@thejohnbeck Literally nailed it. People live quite comfortable lives if their major worry is whether they can "afford" a luxury entertainment item such as a GPU during an economic depression. Out of touch with reality, smh.
Still thankful I bought my 20 series card when I did. I had major buyers remorse at the start but that changed quickly when it was damn near impossible to get a 30 series. Good video, Jay!
Same mike, same. Bought a 2070 super founders from nvidia right before the 30 series announcement. Still works great for the games I play......(mostly Dayz)
I have a feeling Nvidia might suffer heavy losses this time. What happened during the 3000 series was a golden opportunity for Nvidia: crypto was going into a peak bull market, people were at home all day (gaming), and chip shortages were hitting. Nvidia produced the most cards they ever had back then and it still wasn't enough. Now I fear they will produce a huge number of cards that will just sit, purely because crypto mining is at an all-time low, chip shortages are slowly going away, and most of all, people are suffering from inflation right now. Most PC enthusiasts aren't going to upgrade this generation unless they're running something ancient. And if Nvidia decides to artificially inflate prices, that'll be a killing blow like the one Intel dealt itself, and it will bring the true victor, AMD, to take over GPUs next. Nvidia is repeating the same mistakes Intel did. Nvidia should fear a Ryzen-type release in the GPU world, aka the same performance at half the cost.
There are always enough people who buy Nvidia just because of the name. Be it because Nvidia launches first most of the time, or because "you can't game on AMD". Sometimes it's the only green thing they'll see all day.
I've given up on owning gaming rigs... From now on it's going to be SBCs and cloud gaming... I'll spend my money on gig internet and services. My power bill went down about $50 a month getting rid of all my towers and monitors and stuff... Now I have an SBC on each of my TV's and it's going great.
I have a 3080 that cost £900. Due to its very high cost, it's going to have to last for 4 to 5 years. Happy to reduce graphics details; things don't look much different anyway nowadays.
I am still running a GTX 1080, and it can still play "most" games. Not the prettiest, but they run. But I am pretty sure I will be grabbing a 4080/4090; just waiting for this clusterfuck to end.
So the 30 series was a fluke for a few reasons; it was the perfect storm to keep cards sold out for a long time. You had people like me who were still running 900 series cards, because I looked at the price hike on the 20 series and the lack of performance gain (mostly just RTX, which wasn't blowing skirts up at the time) and said hell no, I'll wait for the 30 series. Then you had the pandemic and a massive jump in demand for computer parts from people building or upgrading systems for the lockdowns. Then to top off that shit sandwich were the damned scalpers and crypto miners taking a bunch of product as well. So this time it will be different: all of the people like me who got ahold of a 30 series aren't in the market, there are no pandemic lockdowns to push a ton of new builds or upgrades, and there's no big demand for new cards from the crypto miners. The scalpers can try, but they will fall flat on their faces for the most part with the 40 series, other than with the morons who just HAVE to have it day 1 and don't care how much they spend. Now, I would still follow Jay's advice and not sell your current card before you get a new one; too much risk that something doesn't work out right and you end up with no GPU.
Exactly. I was gonna wait for a 40 series since I have a 3050 which does damn well for the games I play. But seeing the new power requirements I’m just gonna grab a 3080 from micro center tomorrow. Then just throw the 3050 into my brothers pc lol
@@brandonshurtugal Yep I got lucky enough to not have to wait "too" long to score an EVGA FTW 3080 Ti and I have zero intention of moving to 40 series.
I had a few 30-series cards, now ending with the 3080ti and it BARELY fits in my NR200. I'll go for a 40-series only if the power draw isn't insane on the 4080/4080ti.
The request from Nvidia to reduce wafer orders for the 40 series was my reminder that they don't really need gamers to buy cards. They were selling pallets of cards to miners, and now that that demand is gone, they are fine making fewer cards. I wouldn't be surprised if they tried to keep inventory low and prices high as long as possible; they are a for-profit company, after all. I was lucky to get a 6800 XT in 2020 and have been thrilled with the performance and the additional VRAM: 4K texture packs for days, and Smart Access Memory with my 5900X boosted it even more. Now a 6900 XT is $800-$900 as well; performance vs AMD really depends on the games you play and the specs of your monitor. A good 1440p 165Hz monitor is $300-350 and is such an upgrade over 1080p.
@@J-D I've had no issues with my AMD drivers on the 6800 XT. New game-ready drivers always roll out quickly around new game launches. My brother had a 3070 and it seemed like there were as many driver updates as Steam updates.
I saw all this insanity coming and decided to avoid it, so last week I improved my gaming setup about 1000 percent by buying a 6700XT as part of a whole new build. I'll be making popcorn and enjoying the show when all the new cards hit the market.
It's crazy how little time it took for the prices to stabilize; normally it should take at least 5 years. There's still more room for improvement in these prices though, they will go a little bit lower.
@@Boogerdick69 if you’re not looking to upgrade your psu you might just wanna go ahead and get the 3070. Cause I mean 3080 recommendation for psu is 850 and the power hungry 3090ti is a different beast. And with 40 series no telling how power hungry they will be.
@@billcipher534 Yeah, I'm in line on the Best Buy app for a 3070 right now. Founders for $500. Is it worth the upgrade, though? I mainly want it so I can get a better 4K experience on my TV. Some games run at 50-55 FPS with my 2080 on medium-high settings.
Jay, you have your MSRP prices all wonky. Below is the historical list of launch MSRPs for top-end NVIDIA desktop cards since the 600 series. The key point is that the x80 cards have crept up from $500 to $700 over the last 10 years. Titan cards have been all over the place, and the x90 is sometimes the "Titan" (as with the 600 and 3000 series), with some series having multiple tiers of Titan cards as well. I have never considered the Titan or its equivalents "gamer" cards since the 600 series, with the 690 starting at $1000 compared to the previous 590 at only $700 (the first x90 card). The issue is that I consider the x80 the top-end gaming card, like most sane people, and that price has started to go through the roof as well, to the point that even it can't be considered a "gaming" card anymore, although it is still built as one. A 3080 Ti at $1200? I am hoping market forces finally bring these stupidly priced cards back into the realm of what gamers can afford, not miners/companies. The x80 should be $500 and the x80 Ti should be $700; those were their historical prices for over a decade, until the 2000 series changed that.

680 = $500
690 = $1000
780 = $500
780 Ti = $700
Titan = $1000
Titan Z = $3000
980 = $550
980 Ti = $650
Titan X = $1000
1080 = $600 (orig) / $500 (drop)
1080 Ti = $700
Titan X(p) = $1200
2080 = $700
2080 Super = $700
2080 Ti = $1200
Titan RTX = $2500
3080 = $700
3080 Ti = $1200
3090 = $1500
3090 Ti = $2000
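The x80-tier price creep described above can be computed directly from those launch MSRPs (a minimal sketch; the figures are the ones quoted in the comment, not independently verified):

```python
# Launch MSRPs for the x80 tier, as quoted in the comment above.
x80_msrp = {"680": 500, "780": 500, "980": 550,
            "1080": 600, "2080": 700, "3080": 700}

first, last = x80_msrp["680"], x80_msrp["3080"]
pct_increase = (last - first) / first * 100
print(f"x80 launch MSRP rose {pct_increase:.0f}% from the 680 to the 3080")
```

That works out to a 40% rise over roughly a decade, before accounting for general inflation.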
I'll stick with this 3060 I bought. I'd like to buy a few more just to have them cuz I wanna build some more, but the average fellow can't just go buy 6 new gpus. I just hope in a year or two when I plan on upgrading I can still get something like a 3080ti new, at a reasonable price. Not at all interested in these higher power draw cards tbh.
Watch what the market prices are once the 40 series drops, and then grab a 30 series if they take a drop in price; that's what I'm thinking of doing. I have a 3070 Ti, but I also have a second machine that I could put it in IF I wanted to upgrade my current GPU.
Why would you need more than one? It's not like you can still use them (profitably) to generate crypto, right? I mean, you can buy as many as you want, but logically nobody is mining anymore, which is the reason for the eBay fire sales.
A 3080ti will never drop much in price as long as it's still reasonably usable. Top tier hardware is always expensive. Look up the prices for a 980ti for example.
Considering electricity costs in the UK have risen by 52 percent and are due to increase again in October... good luck to anyone planning to run a system with a 4000 series card.
@@ultrawidegameplayandbenchm9158 Light bulbs literally cost £7 a year to run 24/7. It's appliances like washing machines, dryers, kettles, fan ovens, PCs, fish tanks, etc. that draw stupid amounts of power.
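The running-cost arithmetic behind comparisons like these is straightforward (a rough sketch; the £0.28/kWh unit rate and the wattages are assumptions for illustration, so check your actual tariff):

```python
# Annual running cost of a device at a given average draw.
def annual_cost_gbp(watts, hours_per_day, rate_per_kwh=0.28):
    """Cost per year in pounds, at an assumed unit rate."""
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# A 10W LED bulb running 24/7:
print(round(annual_cost_gbp(10, 24), 2))
# A rumored 600W GPU gaming 3 hours a day:
print(round(annual_cost_gbp(600, 3), 2))
```

Even at 3 hours a day, a 600W card alone lands well north of £100 a year at that assumed rate, before counting the CPU, monitor, or the rest of the system.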
Saw a 6800xt for £670 and picked it up. My Vega 56 was struggling, and rumours of high prices and power draw for next gen made buying now a wiser choice in my view.
I am in the same exact boat. Have you undervolted it to get the TDP under 300W? What PSU do you own? (I have a 650W, but I know I will most likely need a single-rail 700W+.)
I have an EVGA GQ 650W; it's paired with an R7 5700X, 32GB of RAM, a 1TB boot drive, a 2TB SSD, and 5 case fans, with PBO enabled. The 6800 XT I have (Gigabyte) pulls up to 280W stock from what I've seen. I haven't tinkered with its power, as I've had no reason whatsoever to do so.
I am using 2 separate PCIe power leads; I've heard it's better to do so. Although 1 lead with a splitter from the same PSU handled my Vega, which had the 64 BIOS and probably pulled more power at times.
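The PSU-headroom question in this exchange can be sketched as simple arithmetic (the component wattages below are illustrative assumptions, not measured values, and the 20% margin is a common rule of thumb rather than a spec):

```python
# Quick PSU headroom check against rough nameplate draws.
def psu_headroom(psu_watts, component_watts, margin=0.20):
    """Return (total draw, True if PSU covers the draw plus a margin)."""
    total = sum(component_watts.values())
    return total, psu_watts >= total * (1 + margin)

build = {"GPU (6800 XT)": 280, "CPU (5700X)": 105,
         "board/RAM/drives/fans": 75}
total, ok = psu_headroom(650, build)
print(total, ok)
```

This ignores transient spikes, which on recent high-end GPUs can briefly exceed the rated draw by a large factor, so treat it as a lower bound on what you need.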
I was originally thinking about getting a 30-series card, but decided to wait because availability in my country was practically zero and the few cards we got were ridiculously overpriced, with some 3080s being sold by actual stores for up to $3000. We've only gotten reliable stock and (somewhat) reasonable prices in the last 2-3 months, and with the 40-series presumably launching around October I was like "Nah. At this point I've waited close to two years, so I can wait a few more months." The only thing I'm a bit worried about is power draw, so I'm going to wait and see if AMD have something that gives me the performance I'm looking for without giving the circuit breakers in my apartment anxiety attacks. I'm hoping the trend from current gen continues, where AMD gives acceptable performance (for my use) with ~100 watt less power draw. But I guess time will tell.
I know someone at AMD already said they're going to have a higher power draw than the RX 6000 series did, but it will probably still be less than Nvidia's RTX 4000 because of competition (I think Moore's Law Is Dead did a video on it, but I might've seen it somewhere else).
Power draw for RDNA3 will be only a slight increase over last gen. I wouldn't expect their top-end MCM part to run over 425W, and it's likely it will exceed 4090 Ti performance. Jay doesn't specify 4090 Ti power requirements, but AIBs have been told by Nvidia to prepare cooling and voltage regulation for 600W. So here are your options:

1. Buy Lovelace, plus a new case, fans, and power supply to deal with 3x transient spikes and huge heat output. Even a new motherboard may be needed to deal with transient spikes (see the investigation on GN, I highly recommend it).
2. Just buy AMD and drop it into your existing system.

I think people seriously underestimate how much raw heat even a 400W card with a high-end CPU pumps into a room. It is not comfortable regardless of how good your air conditioning is; that heat source is right next to you. Combine that with Nvidia's lazy approach to transients, and I think flipped breakers and sweat will push people to AMD this gen, even if they initially bought a Lovelace card.
@@yourhonestbro217 Norway. We've either had the worst low end models or halo products being sold for scalper prices pretty much since the 30-series (and rdna2) launched.
Considering the "reports" that AIBs have too much 30-series stock and Nvidia is trying to delay 40-series production with TSMC, if the price rumors turn out to be true, either AMD is going to be laughing all the way to the bank or it is going to wreck the market once more.
AMD won't be laughing to the bank. Purchasing power is getting too low to warrant a $600 graphics card, let alone an $800 card or more. There's people now scraping their wallet for fuel to go to work, who used to buy new graphics cards every 2 years.
@@michelvanbriemen3459 GPU makers forgot how small globally the segment is for $800 GPUs. A mere 5 years ago a $800 gpu was a Halo product, now they think it's a mainstream premium class card price segment. 99.99% of PC gamers globally can't afford a $800 GPU.
AMD, just like Intel, will price things based on how much they can get away with, just like Nvidia. These companies aren't your friends; their main agenda is making as much profit as possible. If they can get away with charging $999 for an RTX 4080, then that's exactly what they will do, just like they did with the 2080 Ti being twice the price of the 1080 Ti, where the excuse was the RTX tech. Now there will be another excuse, and I'm sure there will be another set of excuses for the 50 series, which will primarily be MCM, so they'll say the cache and interconnects are expensive. AMD is exactly the same. They did the same as Intel with Ryzen: they got a core advantage, then the performance advantage, and then they raised their prices because there was no competition and they could get away with it lol. Nvidia has always gotten away with it, and that won't change this coming gen.
@@zeitgeistx5239 That's true, I remember buying an R9 290 for €400, and the tier-equivalent now is double the price if not more. Back when I bought it that was borderline enthusiast-grade pricing. Now it's "cheap". The masses won't spend that money either, they can see sense in €250 but that's where it ends for most of them.
“To those of you saying you're gonna buy the 30 series when the 40 comes: now is probably the time to buy it.” As I'm one step ahead, I wanna buy a 20 series when the 40 is released.
No way I'm getting a card that draws this much power. Even if it's free, I won't do it. There are a lot of reasons to avoid that card. Heck, I don't think my house's electrical system can even support it...
@@asbestosfibers1325 It's not necessarily the power draw that's gonna be the issue; it's more the heat the card will produce. A 500+ watt card with 16,000 cores is gonna produce a ton of heat. A card like that will require watercooling at the very least.
@@asbestosfibers1325 My place was built in 1980 from what I know, so that's, hmm, 40 years ago? We already have issues here when we turn the AC on along with a vacuum cleaner (plus a few other things). I won't change and replace every single cable just because of a single GPU; a total makeover of the electrical system would cost a ton, and I've already calculated the whole thing, not to mention the repairs to the walls after everything is done. Also, I'm not sure the other people here would agree to it.
@@laszlozsurka8991 Indeed, and then the AC would need to run even longer and work harder; the heat is one of my issues with the card. Watercooling is cool, but my friends always end up with a leak in their tubes, dripping onto other computer parts, and I'm always scared of that. Also, playing games all day at 500+ watts can't be good for the power bill once you add the CPU, the monitors, and everything. So yeah, there are a million issues with a card like that.
I only learned about the US electrical grid recently and it blows my mind. I found out because most of them don't have electric kettles (no tea-drinking culture), but even so, they aren't worth it as they take too long due to the ~1.8kW circuit limit. Please don't @ me with abuse, as I know it's more nuanced than this, but I'm not here to recap the video I saw 🤣
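The roughly 1.8-kilowatt limit mentioned above falls out of a standard US 120V/15A branch circuit; a rough sketch (the 80% continuous-load factor is NEC-style guidance, and the component draws are assumptions for illustration):

```python
# Does a sustained load fit on a household branch circuit?
def circuit_fits(breaker_amps, volts, load_watts, continuous_factor=0.8):
    """Compare load against breaker capacity derated for continuous use."""
    capacity = breaker_amps * volts * continuous_factor
    return load_watts <= capacity

# A rumored 600W GPU + 250W rest-of-system + 50W monitor on a
# US 15A/120V circuit, with nothing else plugged in:
print(circuit_fits(15, 120, 900))
```

A 15A/120V circuit gives 1800W nameplate, or about 1440W continuous, so one high-end PC fits; add a window AC or a vacuum on the same circuit and the breaker trips.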
I'm honestly interested in what AMD puts out this gen. If they've put work into RT which it sounds like they have, and they're more power efficient AND the price is lower... might be time to make the switch to team Red.
AMD has always had the better products price-wise and in terms of long-term support. RTX was the only reason to ever buy team green, and it has never actually been viable, since a card powerful enough to play ray-traced games has only been within reach of the top 1% of enthusiasts. None of the 20 series counted, because RTX is a joke on them.
I'm actually a little frustrated with some of these takes, especially the ones claiming "things are somewhat back to normal". No, they are not. The 3080 launched at a $699 MSRP, so finding them at $800 is still over MSRP. Even if we take the long-shot view and say all 3080 cards can be found new for $699, this still IS NOT GOOD: we are looking at effectively 2-year-old graphics cards still selling at their launch MSRP. The RTX 20 series launched its mid-cycle "Super" variants, a decent jump in performance with no jump in MSRP, after just one year. If things were actually back to normal, all GPUs should be selling 20-30% below MSRP right now (especially heading into a new-gen GPU launch in a few weeks at most). Yet most cards still sell around 20-30% ABOVE MSRP. Effectively, that's a normalized price difference of 40-60% between what the prices are and what they should be under normal circumstances. This is not normal; we are very much still in a hyperinflated price period.
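The "normalized price difference" figure in the comment above is just a comparison against an expected late-cycle price; a minimal sketch (the 25% expected discount is the commenter's assumption, and the $800 street price is illustrative):

```python
# Gap between the street price and an expected late-cycle price.
def gap_vs_expected(msrp, street_price, expected_discount=0.25):
    """Percent by which street price exceeds the expected discounted price."""
    expected = msrp * (1 - expected_discount)
    return (street_price - expected) / expected * 100

# A 3080 at $800 street vs its $699 MSRP, expecting ~25% off late-cycle:
print(round(gap_vs_expected(699, 800)))
```

That comes out to a gap of around 50%, which is how a modest-looking "over MSRP" premium compounds with the missing end-of-generation discount.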
Give it time; AIBs are sitting on a mountain of 3080s, and the MSRP given for it didn't cover the cost of the card. Moore's Law Is Dead covered this pretty extensively, looking through the bill of materials for the 30 series of cards.
I agree, but Nvidia and the AIBs will never sell more than 30% below MSRP; they'll just rebin and rebadge their old stock into a 4050, 4030, or whatever. The most I can see prices dropping is to MSRP with a $50-$100 rebate, but that'll mostly be at the retailer level. But who knows; I wouldn't complain about $400-500 3080s.
You forgot one thing: $699 is the MSRP for the Nvidia Founders Edition. The third-party manufacturers' cards from Asus, MSI, Gigabyte, etc. have a higher MSRP (around $100+), so we're looking at an MSRP of ~$800. Also, there's inflation, the global crisis, and so on... You're right, the prices should be lower at this point (end of the 30 series' lifecycle), but let's be honest, the greed of these companies and manufacturers is too big. There won't be a time again anytime soon where you can get graphics cards as cheaply as in the past. I'm pretty sure the MSRP for the next card series will be higher again, because people are still buying; I mean, people bought graphics cards at 2-3x MSRP...
I've been wanting a PC since 7 or so years ago. Finally having the money for it, I built my first PC a month ago with a 2060. Today a 3060 costs exactly as much as the 2060 I bought :)) But even with the 40 series coming out I don't regret a thing, because I've wanted a PC for so long, and I wouldn't have had the money for a 40 series anyway, so I am more than happy with my 2060.
Good for you. I was in the same position about 6 years ago when I got my 1080. I'll probably be looking to upgrade, but I don't know if I'll be able to beat the scalpers; I may end up having to wait until sometime next year. Glad to hear you got what you worked for, though. Just don't be surprised if you find yourself needing a bit more power at some point, depending on what games or applications you run.
I mean, games aren't improving graphically enough to justify an upgrade. With DLSS, FSR, etc., and the new Unreal Engine being pretty impressive even on older hardware, the extra power needed and the heat generated by the 40 series make it not worth upgrading. But people will still fall for FOMO.
I still have a 2060 in the computer I use for Adobe and other graphics programs, and a 2070 in the mini-PC I use for gaming, and I got a 3070 at MSRP a while ago that I never used because I did not see the need right away. I have been building my own PC systems for 20 years now, and the GPU-scarcity con game has become an impediment for anyone trying to get into building a computer and a turn-off for long-time builders. I am still wondering what the average person thinks they are getting out of some of these new graphics cards, considering games that realistically use that much computing power are few and far between. Unless you are doing some high-level CGI rendering and tons of video encoding for work, school, a hobby, etc., the average user does not really see what the power increases are meant to do.
I really got into PC building youtube in 2019 but couldn't justify spending the money to build anything. Fast forward through the pandemic and I didn't think I'd ever be able to build anything with how expensive stuff got. My dad ended up buying me a 3080 Ti in February and I bought the rest of the PC myself. I'm hoping it will last me a long time because of the way nvidia prices and inflation are looking. Great video Jay, I absolutely love your content.
It definitely will. I'm still rocking my GTX 1070 at 1440p in most games, and it's just about time to upgrade for me, so that's what, 6 years on one card? And that's a lower tier than yours. That said, I mostly play shooters or sim racing with a bit of VR, so other than VR I don't really need any upgrades. I'm happy running medium settings and getting 80-120 FPS, still higher than console.
@@mgproryh I'm in the same boat. I don't care about ray tracing at all, I don't care about resolution above 1440p at all, but man, they got me on VR. I have a 1080 Ti, and while I can still play VR at very high quality, my frame rate is abysmal: either 45 fps, or for DCS as low as a 30 fps lock in VR. Not the best experience, to say the least.
@@gzaos Damn, really, on a 1080 Ti? I presume you're on a Quest 2 or another 4K headset? I use a Lenovo Explorer gen 1, so it's like half or a quarter the resolution (don't remember exactly) and runs a lot smoother even on the worse card. I loved my brother's Quest, but I knew I wouldn't be able to run it.
Sad that no one ever mentions AMD in these conversations. Leaks suggested ages ago that RDNA 3 was doubling performance, and to match that increase Nvidia had to juice their 40 series, hence the rumored power draws we're seeing. I don't think team red's cards are going to be anywhere near the same power draw.
Possibly because the AMD rumours from the last 3-4 years didn't pan out. If you were expecting AMD to beat Nvidia or Intel, they keep ending up as neither the cheaper nor the better choice in the midrange or budget range, and that's hard to root for: expensive hardware that isn't the best option, is unreliable and buggy, doesn't have the fastest performance, is missing features, etc. AMD has to break the habit of doing just an average job if it wants to compete with Intel or Nvidia. It's not impossible; it just needs to break those habits. The rumours get too aggressive and hyped, then it doesn't happen for 2-4 years, the tech doesn't improve for the next generation, the drivers and software are buggy, and so on. Around the time of the 3080 launch, RDNA2 was supposedly going to be in the next Ryzen APU series to compete with the PS5/XSX chips. It wasn't that believable, and several Ryzen launches later that APU still hasn't launched or been mentioned. That did not happen, and the 6600 XT didn't come out close to 3070-class specs.
@@Toliman. Let's calm down talking about Intel. They have no experience, and what they've given us so far has been crap, even on the technical side with compatibility. I'd take AMD over anything Intel, at least for a few years until Intel irons out all the problems they have, and are going to have, with their more powerful cards.
@@Bi9Clapper Intel has been doing iGPUs forever, plus this is what, their 5th or 6th attempt at a dedicated GPU? They usually come out with one, it bombs, and they cancel it and give up for 5+ years.
That's because AMD has been out of the high-end GPU space for years. They compete in the midfield, but even there it's mostly not at a compelling price-per-performance point, so you end up better off with a mid-tier Nvidia card unless you just want to support team red. Every launch there are rumors that this will be the new hotness from AMD, but after years of disappointment I'll be happy if they release something amazing and cheaper, but... not holding my breath.
@@Toliman. Nothing wrong with team red, unless you're a team green fanboy. I've been gaming on team red cards for nearly a decade now, ever since I switched because of instability on my second team green card, and guess what: my current card maxes out the 165 Hz refresh at 1440p on Ultra settings in almost every game I play. Notice I said that *I* play, so anything else doesn't mean sh!t to me. If *I* don't play it, it might as well not even exist! Oh, and did I mention it does it on a wimpy little 700-watt PSU? (A Kill A Watt shows about 565-595 watts total.)
The prices AND the energy consumption of graphics cards have gotten completely out of hand in the last few years. It's just crazy what's going on right now. Fortunately, I'm relatively fine because I only play at 1080p. But that doesn't make it any less insane!
Yeah, last gen prices dropped some six months before the new cards were released, and about 2 months before they started going back up again. I remember because I wanted to buy, but I was still a little in debt and wanted to clear that first. By the time I got it cleared, the prices had gone back up. I didn't worry, figured I'd get one of the new cards, but then they launched high and never came down, just getting higher and higher. So when I was able to get a good system for a decent price a short time back, I grabbed it.
I built mine last year: 5800X and 3070. Solid rig for a long time. But I'm considering getting a 3080 Ti just so I have another GPU for when this one craps out. I'm glad I waited so long before building my first PC.
I also went all out on my build (and like 1 tier above what I originally had planned for the build for most parts) and plan on keeping it for at LEAST the next 2 or 3 generations of parts.
I managed to get a 3070 from Best Buy Canada at MSRP and am just going to hold onto it until the 5000 series. To stick it to Nvidia, more companies need to enter the market, just like Intel, and create more GPU options. AMD adding ray tracing would have Nvidia on the defensive as well.
I mean, AMD does have ray tracing at the hardware level... It's just nowhere near as effective, partly because software leveraging GPU compute is often much slower on AMD, which is likely drivers, optimization, and more. So a lot of development is needed on the side of the software developers too.
Thanks for the updates. Prices for the RTX 3090 just came down even more, so I ordered one. Last year when I built my monster build, I used high-end components for everything except the video card. Now filling that last hole in my build with a 3090 at $1200 was a realistic choice, and it should last for years. Waiting for the 40-series cards seemed too much of a gamble, as the prices will just be insane even if you're able to get one. Since my build is an AMD 5900X with the MSI Godlike motherboard, I did consider an AMD 6900 XT, but since I wanted more options with ray tracing and DLSS I decided to stick with Nvidia.
I just did the same. £1500 for everything but the GPU. God-tier MSI board, MSI Gold-rated 850W PSU. Enough to handle the 4080 Ti when it arrives (it's reported to max out at 600W, so 850 is the minimum considering the CPU and drives).
Huge power demand. You may have to plug the GPU into one outlet and the CPU, monitors, speakers, etc. into the plug next door, which runs on another circuit, so you don't trip the breaker. And don't even think of plugging in your phone charger while gaming.
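To put numbers on the breaker-tripping scenario: a sketch of a single household circuit's budget, assuming a typical North American 15 A / 120 V branch circuit and the common 80% continuous-load rule. The circuit values are generic assumptions (not electrical advice) and the rig wattages are illustrative guesses:

```python
# Single-circuit power budget. The 15 A / 120 V circuit and the 80%
# continuous-load rule are typical North American assumptions;
# the rig draw figures below are made up for illustration.
breaker_amps = 15
volts = 120
circuit_watts = breaker_amps * volts          # 1800 W absolute limit
continuous_watts = 0.8 * circuit_watts        # 1440 W sustained budget

rig = {"GPU": 450, "CPU": 250, "rest of system": 150, "monitors": 100, "speakers": 30}
total = sum(rig.values())                     # 980 W under full load
print(f"Rig draw: {total} W of a {continuous_watts:.0f} W sustained budget")
```

Push the GPU toward 600 W and add a space heater's worth of anything else, and that 1440 W sustained ceiling gets uncomfortably close, which is exactly the "plug next door" scenario above.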
I feel you're a little behind on this. Ampere production finished months ago due to overstocking. Who knows how long it will last, but Nvidia tried to cancel some of their 5nm orders at TSMC, which suggests they're worried about oversupply. I wouldn't be rushing to get a new GPU just yet.
Oversupply of the 30 series. If they artificially raise the prices of the 40 series at launch, it will push more people who have been waiting to just settle for a 30. Supply will dry up quickly and history will repeat itself.
Given this, and especially as a Linux user, it's going to be AMD for me...and I suspect whatever I get will be a major improvement over the GTX 960 I've been using for seven years. :)
@@evacody1249 Buddy is running a 7-year-old GPU and you're recommending him $800-$1200 GPUs. These are obviously out of his price point, or he would have upgraded way before Covid and the crazy price hikes.
Last I heard, the 4000-series rollout was being 'delayed' to late 2022 / early 2023 to move more 3000-series cards. As usual, prices instantly jumped $100. I was expecting the opposite, as I was waiting for a 3080 to come down closer to MSRP, but no... 😡
@@leeloodog AMD is not that bad, tbh. I bought a 6900 XT at launch (not from scalpers) and I haven't had a single problem with it. Yes, you get lower performance than a 3090, but not by that much.
Definitely looking at the 30 series and AMD's RX 6000 series, mostly due to power draw concerns. The steady climb in power usage is simply out of control. And just remember that Pascal had a power draw about half of what we're seeing today. This is an issue that needs addressing.
The 40 series will be more efficient; it wouldn't be surprising for a lower-tier 40-series card to outperform a higher-tier 30-series card at lower power consumption. So far the power consumption rumors are just rumors, and you'll probably only need to worry if you're going for the highest-end cards like the 4090 (Ti), which won't be cheap. If you can afford a 4090 (Ti), just throw in a new power supply with it.
The really sad thing is that nobody will inconvenience themselves for a single generation and not buy their cards to punish them for absolutely anti-consumer practices like this.
All I need from AMD's 7000 series is to catch up in ray tracing and in the video encoding engine. That's it. I don't need them to "kill" Nvidia's best cards, I just need them to be equivalent :-/
Streamers, natch. It does a lot for the perceived usefulness of the cards even if the end user never uses the feature. @The Momaw: AMD is reintroducing B-frames into H.264 encode with a driver update for the 6000 series (I think it might already be out now?). That's been tested and brings the quality much closer to Nvidia's: still a little behind, but much closer. The 7000 series has a new media engine slated to bring AV1 encode as well; it remains to be seen if it brings further improvement to H.264.
@@winterscrescendo Thanks for the tip, I wasn't aware. It looks like the improvements to AMF have not really been widely adopted yet, so if you want to use it then you're deep into the "compile it yourself" weeds. Bit beyond my skill level but maybe Coming Soon(tm)
If they don’t start shoveling a LOT of R&D into optimizing for power efficiency soon, their top-tier cards are going to require consumers to put PC-only circuits in their houses. Absolutely insane.
The future is the past?
only US though.
@@OA-B essentially, we used to have insane power draw and decent performance but now we have insane performance and insane power draw
Yeah, the power draw is insane. I mean, for the longest time gamers did not really care how much power was consumed. But now we are approaching a ridiculous degree, where we are not even speaking of problematic heat and noise output anymore, but about having to change the electrical infrastructure of your house.
Good news is: it needs only one generation of GPUs where people do not buy the products because of insane power draw. After that, I think we will be fine again :P
We still have a little capacity to spare here in Germany. You're welcome! Just kidding - kinda s**ks for you guys.
Jay, I think that at this point the content creator community discussing video cards needs to start using the phrase _"price fixing"._ The video card manufacturers are now executing the very same business practices that RAM industry was accused of and sued over in 2002.
This needs to be talked about _extensively_ in the tech press.
Dropping usually means to stop doing something. I think you mean they should start using the term "price fixing".
What happened then? I never heard of that before with RAM.
Also, what do you mean, period? Do you mean how the 3080 12GB did not come with an MSRP and could be priced with a wild amount of variance; hence, no "price-fixing" being present (having an MSRP, instead)?
@@somerandomperson8282 no, drop as in name drop. it has meant "to mention" for at least 10 years. we are just old. 🤓
@@somerandomperson8282 I used the term 'dropping' similar to 'name dropping' where you just bring it up out of the blue. But I don't want to confuse anyone with what I mean, so I edited my comment. Thanks for the feedback.
Omg he said it in italics, it must be serious
The next GPU iteration will require a dedicated three phase circuit and an AC unit to cool it
I bet for a separate tower for the exclusive PSU(with RGB please)
2026: Latest Pc Modding craze: how to modify your old fridge chassis to contain your new pc.
I wonder what we need all that power for at the consumer level.
Nvidia “we’re so happy to announce to you, the power of it all. It only requires this one simple configuration that you will all be so happy about. Everyone will just need to get a commercial power delivery to their dwelling units.”
No bloody wrong
Man I remember when I bought my 980ti for $650 and I thought I was crazy for spending so much on a GPU. Only thing higher was a Titan at that point. These GPU prices are getting crazy.
Just don't buy them.
@@orkhepaj I haven't. I'm still rocking my 980ti.
Missed buying 1080ti at launch for a grand, had I sold during whatever the last 2 years was, years of totally free gaming. Instead I bought a second hand 6900XT 🤙
I remember buying a Geforce 9800 GX2 for £330. Inflation isn't the issue 😆
It’s stupid now because of cryptocurrency
Back in the day GPUs were basically for playing games or editing, so their value wasn't as elevated as it is nowadays because of mining. And yeah, more people are streaming, so GPUs are one of the most in-demand components in tech right now.
I'm mad every time I remember buying my 6900 XT for $1400 two years ago... I was pissed, but there's nothing we can do. It's the market!
Even if the 4000 series is twice as fast, how can they justify those prices? I remember the top cards being under $1000. It's ridiculous simply to play games...
I remember getting a top end card for $400 CAD back in 2010 (ATI HD5870). Now I'm lucky if I get a low end card for that price. Even a 3060 is $500CAD.
People bought the 30 series at highly inflated prices because people are dumb. I said this during the last crypto boom that both Nvidia and AMD were looking at this behavior and would price accordingly. Blame your fellow consumer
dude you don't even need a 1080 to play video games. I have a 1080 and I literally don't run into any issues at all playing with a 6 year old card. You could probably run a 980 ti and still be just fine. It's just people have expectations that they are entitled to having the most overkill equipment at all times. People need to chill out, be happy with what they have, and save their money because this economy is in dire straits right now. GPUs are the least of our worries right now.
If idiots didn't purchase GPUs at these prices, Nvidia and retailers would drop them. The consumers are to blame here, because they showed how much they were willing to pay.
Simple, they can justify those prices because they know people will pay. They've seen all the idiots buying 30 series at 2x-3x retail!
As a PC enthusiast, I am going to skip the 40 series. I can already feel the steep prices and all the manipulation and schemes Nvidia and third-party resellers are going to play, like in previous years. Don't let them scam you out of your hard-earned money.
Same. I've got a 3070 and feel no need to upgrade. What game do I even need a 40 series for to play decently?
Do you think it's worth buying a 3080 now? I'm dying to decide whether to wait or not. My card is a 1660 Ti and I can run everything I want very well at 1080p, but I want to upgrade before my computer needs a bigger overhaul to keep up with chipsets and GPU performance.
@@SethOmegaful I'd go for it if you're worried about 40 series prices. A 3080 will do wonderfully for the next few years.
@@SethOmegaful Honestly, go AMD instead when their cards come out around the same time. They'll probably have way lower power requirements and a big uplift compared to the Nvidia 30 series. That's what I'm thinking at least; I have a fantastic 3080 OC Strix card right now.
@@SethOmegaful either go with a 3080 or a 6800xt. I would say skip Nvidia 40 series.
I'm 54 and my wife and I are VERY worried about our future , gas and food prices rising daily . We have had our savings dwindle with the cost of living into the stratosphere , we are finding it impossible to replace it . We can get by , but cant seem to get ahead . My condolences to anyone retiring in this crisis , 40years nonstop just for a crooked system to take all you worked for
Honestly AMD's upcoming GPU's are looking very promising hearing that they have more of a focus on efficiency.
Yep. The Nvidia cards are just too power-hungry. Most people aren't lucky enough to live in a cool enough environment to dissipate that heat, making their rooms way too hot and making you not bother actually using your new GPU.
If AMD can match Nvidia's CUDA/NVENC performance and built-in software support, that's it, I'm going all AMD. I have always gone AMD for CPU and Nvidia for GPU, but this may change soon.
the AMD GPU problem is the drivers anyway
Thank you for dropping this info. I had already decided to go AMD, and I'm obsessed with efficiency.
@@MsTatakai I've used a variety of both Nvidia and AMD cards in the last 10 years, I've had less problems with AMD drivers than Nvidia's.
@@MsTatakai That hasn't been true in about 10 years. I've had more Nvidia driver issues than AMD.
I’m still annoyed with the price jump from the 10 series to the 20 series for the smallest gains. The 1080ti was $699, the 2080ti was $1200! Then the 3090 and 3090ti come at $1500 and $2000 respectively. All based on 102 class silicon of their generation. The 90ti is just a renamed 80ti. We didn’t get a GA-102-400 for the 90/Ti to be a titan. So from 10 series to 30 series the MSRP jumped just shy of 3X.
Yeah Nvidia is fucking greedy. I remember when the best Geforce was just $700. These are only GPUs for gaming man who do they think they are?
so the 4090ti will cost 2500
2999.99 i said 2999.99
so basically 0 dollars
90- and 90 Ti-class GPUs are rip-offs; Nvidia is clearly targeting daddy's-money territory.
A 3080 costs $699 and is only 6-8% slower.
It's clearly a Titan class GPU for data centre and high vram task
Just don't buy them, nobody's forcing you. Plus inflation is a thing. The 3090 Ti is in another league compared to a 2080 Ti.
@UCi6sz_Hw_80bUYMgVhCAtpg No, a 3090/Ti has nothing that makes it a Titan. A Titan had the excuse that it could use certified pro drivers and still play games for private use (instead of needing a Quadro card). A 3090/Ti has no certified pro driver support, so it's just an overly expensive gaming GPU that happens to be good at video editing, and even then you have no certified drivers. Certified pro drivers are drivers specific to one program, where "certified" means the driver will not crash and works flawlessly.
As soon as I heard that the 4080 would potentially require 420W, me and my 750W PSU and 1440p display noped outta there and picked up a 3080 12GB while prices were down; that tactical maneuver is looking wiser by the day.
Same here man, got a 3080 ftw3 10 GB Version. Seems like a wise move now, considering my quite new 750W PSU. 👍🏼
Yeah especially because it’s probably going to be 2-3 years before normal people will be able to find a 40 series card for sale.
I'm happy with my Asus ROG Strix 3070, bought last year, on a 700W Gold PSU.
Congrats on your new GPUs, guys! Enjoy them.
We could see the 3000 series under MSRP soon if the oversupply rumors are true.
I'm running a 3070 on a 450W psu no problem.
Went to AMD and got myself a 6950 XT at MSRP. I'm not waiting months just to "have the chance" to buy a $3000 GPU, and I sure as hell ain't paying double the cost of a 6950 XT for a 3090 Ti.
I'm in the same boat. I'm looking at a second-hand 3070 at $425, or waiting for sales to bring the 6800 closer to $550 before I bite.
And you won't see enough improvement to justify the cost, that's for sure. It's never worth upgrading to the very next generation from what you currently have.
Yeah, AMD might be the way to go next time around; plus RDNA 3 is rumored to get more performance than these Nvidia cards while being more efficient.
@@lyianx Not quite true. I upgraded from an MSI 970 4GB to an MSI 1070 8GB back in 2015-2016 and that was a huge improvement. I sold my old 970; it cost me 500 for a roughly 70% increase from what I've seen.
@@killerrf The 10 series was a diamond in the rough; we will never see those gains again, at least not at those prices.
When you require the power limit to scale up with the performance, it's not really a new gen of cards.
Along with the price
Yeah, it's just a higher tier of the same gen.
Absolutely agree. It's not next gen.
Soon you will get a portable diesel generator to go with your pc lol
Yeah, making it bigger isn't really what I would call "innovation". At best it's a sideways evolution of the tech.
Sure there is a place for more GPU power at all costs for certain applications, but a next gen GPU is supposed to deliver more for the same amount of power/money or less.
Performance per watt needs a lot more focus from YouTubers and consumers. Intel and Nvidia are out of control; AMD seems to be putting some effort in.
It's all falling into place. Make GPUs super power hungry and suddenly people forget about performance and now want more efficient ones.
What a great way to avoid making things better. Just make something worse so you can "fix" it in the next generation. Apple would be proud of Nvidia and Intel.
Yeah, considering companies like WPS are already having a hard time keeping up with the power demands of communities, having a PC in your home that has a 1k watt power supply being fully utilized when gaming is insane. Also, my 3080 rig produces a ton of heat, I can't even imagine what a 4090 rig would produce. It's basically a furnace with RGB fans at that point.
@@burrfoottopknot Hopefully the 40 series cards don't sell well. That's the only way a change will happen. But I think we all know they are going to sell out day one sadly. New generations should draw the same amount of power as previous gens, maybe a bit more, and perform better. That is true evolution of a product, not just putting more power hungry stuff into a chassis and calling it better.
@@burrfoottopknot More so a step down in process node size. 420W at the 4080 tier makes it look like they're still on a 10nm-class node instead of the 7nm that AMD's 6000 series is on. Not only are the AMD cards less power-hungry, but their performance per watt is also higher. RDNA 3 will hopefully be on 5nm, showing Nvidia that they need to put way more R&D into power efficiency, i.e. stepping down from a 10nm-class node to 7 or 5nm. The bulk of their fab reservations are at TSMC, and TSMC has finished research on 5nm and soon even 3nm nodes, so obviously there is room for improvement.
AMD gets decent power consumption because they sacrifice some performance for it. Intel knows their chips are strong yet run hot and heavy, so they go all in. That only works, though, because you can still run an Intel chip on a stock cooler and generally be fine. Nvidia, on the other hand, is jacking up both, and making it so you HAVE to make changes for it to function.
My computer's power consumption is genuinely driving my apartment's cooling needs. It's ridiculous and if anything I'm going to be angling for a more power efficient card even at the expense of performance.
It feels with the price and power increases that we aren't getting better architectures, we're just getting bigger, thirstier, more expensive parts.
Agree. It just seems we're going through iterations of brute force rather than anything clever. Maybe the SoCs from Apple and the like will be where it's at in 5 years or so.
I mean, here in Europe and anywhere the 240-volt standard is used, it's actually scary how close it's getting to that limit.
I'm very worried about the 5000 series and how it will basically wreak havoc on EU and other 240-volt-standard houses.
A 4090 Ti and an overclocked 12900KS: a win-win for the winter. AMD is a big step toward lower power consumption, and undervolting the card is a must: more than 100W saved with little FPS drop.
@@hermanhetherington2473 Then You will wait 4 more years
Lol buy amd and you have what you want
When it comes to Nvidia, I'm always expecting negative news
The punishment beatings will continue until our attitude improves 👀🤣
Businesses are there to make money. So far Nvidia has been very successful.
@@Safetytrousers keep defending a billion dollar company who is greedy
@@eddiegraham3241 They are no less greedy than the ones who buy the product. People who always have the latest and the greatest equipment need to take a minute and think through what's really important in life. No one is forcing anyone to make the purchase.
@@Safetytrousers watch that diminish after losing pandemic/crypto money.
What was a major factor in killing the mighty US car industry? the energy crisis.
Who exactly across much of Europe does Nvidia think it will be selling these cards to, in the volumes they want, when people can't afford to keep the fridge stocked and running?
If AMD/Intel come out with a line of more power-efficient, good-enough GPUs for reasonable money, they will mop up.
I already went with an AMD 6000 Series card because of the better power per watt performance and will continue to do so.
@@PatrickSchraner hell ya! 🤘🤘
I'm playing on my 15W Steam Deck.
@@Herr.Mitternacht I still have in use a 4W Radeon 9550
@@bestopinion9257 How's Elden Ring looking?
If I'm buying a graphics card in the next 24 months then it's very likely going to be an AMD card.
The only problem I have with AMD is their cards are still not as powerful, which makes the decision very difficult for me. I would love to go AMD and stick it to Nvidia, but the performance really just isn't there yet.
@@DragoNate AMD cards can more than handle their own when comparing dollar for FPS with NVidia, they simply lack the lighting effects capability.
@@peter2liter yeah, i'm not saying they're garbage or can't do anything, i simply want the best possible performance all around.
AMD is still lacking, unfortunately. getting better though, which is nice to see.
@@DragoNate well if you think its worth it to pay 1000 more to get 5-10% more performance (to go from 150 to 165fps) then go for it. even at 15% more (you very rarely get a higher difference except on specific games) i would question the sanity of someone making the choice to pay that much but hey, its your money - your choice
@@jpteknoman The sanity comment only makes sense when money is the issue. When price has no relevance, the choice isn't about currency vs. performance. People, including people like Jay, play to the crowd about value for money while assuming the purchases happen on a level playing field; they don't. Basic economics: luxury products don't operate under normalized economic calculations, they operate in the opposite direction.
Anyone remember the days when anything around $1k was a professional card (video editing, graphics rendering, etc.), and a high-end graphics card was $300-450? If prices are going up to $3k for high-end graphics cards, it will likely be 30 years before my next upgrade.
Edit: maybe I'm just too old...
There was a time when graphics card companies would master the manufacturing process to bring down prices with VERY little or no loss (most of the time there were gains, because you could also fix the bugs). We need that to come back as well.
Additional edit: because I've seen this comment misread more than once, I should clarify: the $300-450 price was for lab-produced cards; they were not mass-produced in the typical sense, because figuring out how to mass-produce them was the tricky bit. So cards used to have two stages for the public: lab cards and factory cards. Again, maybe I'm too old... remembering stuff too few do.
For historical purposes: some tech YouTubers have alluded to this era in other ways. Products used to have two different model names, short and long. The short name was on the front of the box; the long one was the model/serial number. The long name was useful for nerdy/geeky talk, because if you understood the model number system, you understood what was under the hood. It allowed faster conversation and drooling, and for conversation purposes you could swap around segments of the model number to build your dream card. Just because something was new and could brute-force something from earlier times didn't mean there weren't more interesting combinations, given the behavior of the previous iteration.
The times they brought down prices without making less profit were due to increases in sales volume. Right now, they have fewer resources at hand due to rising costs.
Supply and demand.
If you don't increase profit year over year and give all the benefit to stockholders, your company is viewed poorly. It's really stupid, and it's because the stock market is a Ponzi scheme.
Remember that the GeForce 256 from 1999, on a 220nm node, had 17 million transistors in 139mm². It had up to 64 MB of DDR RAM and 50 gigaflops of processing power. It cost $279 in 1999, about $488 in 2022 money.
The RTX 3070 is in 8nm, has 17.4 billion transistors in 392.5mm². It has 8 GB of GDDR6 RAM, and up to 20.31 teraflops. It's MSRP is 499 USD.
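For perspective, the ratios buried in those two spec sheets are easy to check with a quick script (all numbers taken straight from the comment above; treat them as approximate):

```python
# Rough generational comparison using the figures quoted above.
geforce256 = {"transistors": 17e6, "die_mm2": 139.0, "gflops": 50.0, "usd_2022": 488.0}
rtx3070 = {"transistors": 17.4e9, "die_mm2": 392.5, "gflops": 20_310.0, "usd_2022": 499.0}

def density(card):
    """Transistors per square millimetre of die."""
    return card["transistors"] / card["die_mm2"]

transistor_ratio = rtx3070["transistors"] / geforce256["transistors"]
density_ratio = density(rtx3070) / density(geforce256)
flops_per_dollar_ratio = (rtx3070["gflops"] / rtx3070["usd_2022"]) / (
    geforce256["gflops"] / geforce256["usd_2022"]
)

print(f"{transistor_ratio:.0f}x the transistors, "
      f"{density_ratio:.0f}x the density, "
      f"{flops_per_dollar_ratio:.0f}x the GFLOPS per inflation-adjusted dollar")
# → 1024x the transistors, 362x the density, 397x the GFLOPS per inflation-adjusted dollar
```

So at roughly the same inflation-adjusted price, you get around 400x the compute per dollar. The point stands either way; this just puts numbers on it.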
"maybe I'm just too old".... Welcome 🍻
I dunno, man. I'm coming up on 40 years old, and I've been building PCs my entire life, but with the way things have been, and the way we've been taken advantage of with this ridiculous pricing, I'm just not excited anymore for any of it. I'm running a 2080 Black right now, and I'll likely run it until it stops working, then wash my hands of PC building. I'm gonna buy a laptop and a console and call it a day, and Nvidia, Intel and AMD can all go kick rocks as far as I'm concerned.
I'm right there with you. It used to be fun having tons of options to build a computer for a decent price. Now the fun has just been sucked out of it, and it's more as if they're doing us a favor by selling us components at 4x the price.
We need more people like you in the community
Intel cards are a comin'
Yeah, I've built 3 PCs, one every 4 to 6 years, and I'm happy with my PS5. My PC only gets used for work and barely any gaming now since the PS5 launched. Wish I'd gotten one on launch day.
Couldn't agree more. I used to build a new PC every 4-5 years; so much fun, and it didn't break the bank... the fun has been sucked out of it (similar feeling with retro gaming).
I think demand for 40 series cards will be pretty weak overall, though they will still probably sell out at launch. The cost of living (particularly energy) is so high that regular people have bigger priorities than upgrading their PCs with more power-hungry cards, and mining demand has dried up completely. The only people buying them will be the enthusiasts who have to have the latest and greatest. Everybody else has already bought a 30 series card, is planning to buy one, or is happy enough with what they have. I have been waiting to upgrade to a 30 series since launch, but if the 40 series is priced too high and there are no 30 series cards to buy for cheap, then I'm just sticking with what I've got until it breaks. I have learnt to be patient; having waited this long, I might as well keep going until the cost of upgrading actually makes sense.
PC gaming has never been worth it at the high end anyway. You're paying like 4 or 5 times as much for a bit more FPS and quality and some ray tracing and so on. It's never been worth it unless you are an enthusiast with money, and the rising living costs likely won't affect those people.
I 100% agree. I skipped the 30 series completely. I may grab a 40 if it's not too ridiculous, but having to grab a PSU at the same time will be a pain. Cards are getting like phones, where whatever you have will be surpassed in power the next year. You'll have the new series come out, then the next year the Ti, then a new series, rinse, repeat. It's getting silly for something so pricey.
Only reason I upgraded from my 1080 ti is because the 3090 paid for itself
I bought a 3070 for msrp from Best Buy and I’ve seen them in stock a few times after as well. They’re not always in stock, but it seems like it’s getting easier and easier to buy cards now
I was planning on building a new PC when the 30 series came... Now I just feel so fatigued from this market, I don't even know if I care anymore. My 1070 still pushes alright frames at 1080p, and gas is expensive.
Jay and Linus are the only YouTubers whose ads I look forward to seeing. Linus has his segues into his ads and Jay has the iFixit kit ads. I love it, lmao.
It's stupid and I love it 😂
I only see Jay's. I stopped watching Linus years ago.
I get the feeling every 4090 and 4090 Ti will be water cooled. That'll keep temps in check while allowing Nvidia to charge $2k for a 4090 and $3k for a 4090 Ti.
Nvidia does not do water/AIO cooling; board partners will. They will still use their 3090 Ti air cooler on the FE version, but board partners will probably not be able to get away with a 3- or 4-slot cooler.
I absolutely agree. Or if not every one, it will be the norm rather than the exception. There might be one or two 4090s that are air cooled.
3090ti was their test bed and did fine on air.
@@Deviantsoundz yes but it takes up half of a pc's case
@@jerbid_ Maybe in a mATX or ITX case. Either way, it proved that they can cool 400-plus watts just fine.
Gas prices are on the rise in Germany (maybe like 8x the cost of last year). A 4090 Ti is my option to heat my flat in Germany this year. 🔥
That's my plan too. Until the 4090, if that works out, my 1080 Ti has to hold on. I generally heat very sparingly anyway and get along fine with 16 to 18 degrees. Let's hope for the best, including the pricing of the new top-end cards.
Lol not just gas, but also electricity has become more expensive over here in the Netherlands, so it doesn't really matter for us lol
@@QoraxAudio But here in Germany, gas will be much more expensive than electricity ⚡️.
Both our countries are working together on LNG terminals, so hopefully prices here in Germany will drop.
In the meantime, for the coming winter, electricity will be cheaper (convector heaters, etc.).
I myself bought a stove and wood for heating my flat up this winter. 🔥
@@QoraxAudio My first comment had a note of sarcasm 😂
Bright side for me, I'm still rocking a 980 ti, when I upgrade it was going to be from scratch, everything brand new.
So having to get a new PSU or new case was already part of the plan. I'm mostly concerned about price and the power bill.
Ayy, me too. I have the MSI version and it's still holding up pretty well, although I don't play many games these days. Except I already upgraded the rest of my system a year ago. Considering upgrading my GPU to an AMD 6800 XT.
Power bill could be the problem for me.
In my country electricity could go up by 200 to 300%, which is nuts.
Availability and pricing will be the key, but if the power requirements are stupid, I will skip this generation.
I'm rocking a 970 lol. I upgraded my computer with money I got from a bonus earlier this year and was able to upgrade everything except my GPU. Hoping to get a 3070, but if they're going to become unavailable before I can get one... idk what to do lol. I'm not shelling out more than the $6xx the 3070 I want is worth, but I'm not buying used... I really don't want to spend that money on a card with no warranty left.
the upgrade from a 980ti is going to cost you $3000 or so.
@@miranda.cooper This availability issue is just talk. It's not going to happen, because the people who have 3080/90s are going to sell them to put money towards a 4080/90. So buy second-hand if you can. Just make sure you aren't buying an old mining card, otherwise its lifespan will probably be weak.
I have a feeling that AMD and Intel could be golden just by trying to maximise the performance of their cards with a maximum cap of 300 watts. With currently inflating energy costs, especially in Europe, these cards are gonna be way more interesting for consumers. Of course, these cards have to be "reasonably" priced against Nvidia.
my 6900 didn't even run at a full 300W at stock default settings, so I'm inclined to agree with you.
The thing is, though, performance comes at a cost: not just money for the card, but also power. Higher performance requires more power, so setting a cap on something like GPU wattage (especially one as low as 300 W) means performance is limited no matter how hard people try.
bingo.
I checked my system usage. It has a 6-core i5 and a GTX 1660 Ti, and I played Cyberpunk on it: 240 watts.
Playing old games: 90-ish watts.
Video encoding, without using the GPU? 130 watts.
I can live with this.
Well that and food. Food's kind of expensive and important. 😁
AMD's slogan for the RDNA3 series might be "Fits in your current shit."
Since I'm skeptical of the release of the RTX 4000 series and started to get impatient with all these rumors going around, I just went ahead and got myself a used RTX 3080 from a local marketplace for a pretty decent deal. This market is just a mess, and Nvidia needs to get a grip; crypto has crashed, and who knows how long it's gonna take to recover, if ever, and inflation is high, so many, many people out there have less to spend on extras.
How much did you get the used 3080 for, and where did you get it? Just curious that’s all.
Nvidia is getting a grip on our wallets. I think they like the current situation; why wouldn't they? More money.
I bought an EVGA FTW3 Ultra 3080 for $730. Just check eBay a bunch; lots of non-mining GPUs show up all the time.
Same. Has been time to build a new PC for a while but I didn't want to give in to scalpers and the inflated market. When prices came down I finally bought my new hardware - got a 3080 Ti for a reasonable price brand new and I don't think I'll be wishing for more performance any time soon.
@@bretttanton328 No worries about being curious; I got it for €690, specifically the Zotac Gaming AMP Holo one. As to where: Slovakia. I could've gotten an RTX 3080 TUF for €720 about 2 weeks ago, but it was snapped up the day after I saw the ad. Still happy with my purchase; a nice-looking card with decent temps. I also upgraded my PSU because of this, going with the Corsair RM850 as it was the most affordable and reasonable one to get locally.
More cores, higher frequency, more power draw, more money. I'm tired of this race to pull more and more power. Other computer technologies are getting more performance with the same or less power. They also do it for the same or lower price! Time for the GPU manufacturers to get back to that.
Sure there will always be some flagship products that draw an absurd amount of power, but I should be able to buy a $300 GPU that draws 150 watts and more than doubles the performance of an RTX 2060. Heck, it used to be you could buy a decent mid-tier card for $150. Those days are gone.
What $150 card do you have in mind?
Then go buy a Mac lmao
Well, that's also a function of inflation, which has been at an average of 6% a year for the last few decades (using the real formula, which was discarded in 1970).
That's the reality of GPU tech: it's actually more power efficient, but the speed is a lot higher, and that means even though it's more power efficient, it ends up being more power hungry than last gen.
It would be nice, but unfortunately, as long as people buy all the stock day 1 and make them sell out, they have no incentive to do anything different.
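If you take that 6%-a-year figure at face value (it's the commenter's claim, not an official statistic), compounding alone does a lot of the work on card prices. A quick sketch:

```python
# Compound-inflation sketch using the ~6%/yr rate claimed above.
# The rate and the $300 starting price are illustrative assumptions.

def compound(price, rate, years):
    """Price after compounding `rate` annually for `years` years."""
    return price * (1 + rate) ** years

# A $300 "high end" card from 1999, inflated 23 years at 6%:
print(round(compound(300, 0.06, 23)))  # → 1146
```

So at that rate, yesterday's $300 card lands near $1150 in 2022 dollars, which is a big chunk of the sticker shock before any actual price hiking.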
Man do I miss the days when you could EASILY build a beast of a PC for the price of a console. I feel like we need a major recession because these companies are just getting disgustingly greedy
That is really an ignorant statement!!!!
So the best you can do is throw your hands in the air and hope someone else will solve your problem? If you want stuff to change, do something about it.
Except y'know, recessions only make the greed worse.
Welcome to capitalism, best economic system to ever exist, ever. Enjoy your stay.
Remember when Bill Maher wished for that to get Trump out? People die when that happens... but yeah, let's hope for one because you don't like a president's mean tweets, or because you want more affordable graphics cards 😂🙄
@@dive2drive314 Just curious, are you by any chance confusing a recession with a depression??
Honestly, these types of videos are starting to become my favorite JTC content. Just Jay talking into the camera, speaking his mind, distilling a lifetime's worth of computer knowledge and computer history and trying to peek out ahead into the PC landscape of the future. No editing pressure, no pressure to be funny or quippy, just my guy giving us the 411.
I started with many other PC and gaming channels. JTC is starting to clearly pull ahead, in my opinion. The crown is yours for the taking.
Keep em coming!
I agree. I am just now preparing to build a new PC, and learned pretty much all I need to know to finally decide what to do about a graphics card.
Agreed, love the talking head vids
Yee, I don't even watch LTT stuff anymore; JTC is the only PC hardware channel I watch regularly, unless I'm looking for specific hardware reviews during a build.
What do you mean by 411?
Came down here just to say the same thing lol
*Looks at his 1050Ti... "You're OK buddy, ain't giving up on you yet."* *Looks at wife's 750Ti... "Your days are numbered..."*
Well, Nvidia cutting allocation from TSMC sounds to me like they're going to jack up prices by claiming demand is higher than supply.
TSMC is not letting them cut allocation; that was the full story. Nvidia is stuck with oversupply, as they were expecting the crypto boom and GPU demand to continue.
You should have said "trying to cut allocation". TSMC said "no, fck u". They will only get delayed shipment.
This video actually made me look at GPU pricing again. I'd set my goal years ago. Once RTX 3070 level performance is available for 300 bucks or less, I'll upgrade. Anything more is just not worth it.
And even with these "massive" price drops due to crypto miners selling off a bunch of cards... yeah, still 500-600 bucks in my country. Unless my current GPU dies, I'll be stuck with it for even more years. Its 5th anniversary is coming up.
3070 performance at 150W is what I'm waiting for.
or a 3080 at 200W
Yes, what they are pushing right now is insane. Who has $1500, let alone $3000 for a card that can dim the lights in a small city?
RTX9090: 440v/3-phase/25kW
But check out all the videos of how AWESOME Minecraft looks on it!!!!
Too bad you will need a 30-year mortgage to get one :(
@@karlcorrz Just undervolt a 3080. 3% performance drop but 250w instead of 400. You could probably push it down even further
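Taking the undervolting numbers above at face value (they're the commenter's figures, not measurements), the efficiency win is easy to quantify:

```python
# Perf-per-watt sketch of the undervolting claim above:
# ~3% performance lost for a 400 W -> 250 W drop (illustrative figures).

def perf_per_watt(relative_perf, watts):
    """Relative performance divided by board power."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 400)        # baseline 3080 at full power
undervolted = perf_per_watt(0.97, 250)  # claimed undervolted result
gain = undervolted / stock - 1

print(f"{gain:.0%} better performance per watt")  # → 55% better performance per watt
```

A 3% frame-rate loss for a ~55% perf-per-watt improvement is why undervolting keeps coming up in these threads; actual results vary card to card.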
They all play the same games. I don't understand the obsession with a few FPS.
My GTX 960 just celebrated its 7th anniversary 😆 NIS is a game saver
7:40 I think I'm just gonna wait and buy... nothing. The GPU bubble priced me out of the market to upgrade from my RX 580. Manufacturers explicitly trying to un-pop that bubble get zero instead.
I just upgraded from my rx 580 to 3060. Got it for a good price. I'm tired of waiting and idc about 40 series at this point
@@AsquareM Yeah, I picked up an RX 6600 XT for AU$400. This should be good for the next 5 years. I don't play FPS-type games, but even if I did, I still wouldn't understand how people justify spending more than US$500 on a GPU. I guess if you're into VR it may be worth it, or if you need it professionally.
@@AsquareM Yeah, I got a 3060 a week ago for MSRP, f it.
If only this mentality was the majority. We'd all be able to afford so much more
I'll wait until I see used 3080 cards for $400 or less.
Just bought a 3070 a few weeks ago. Got a great deal and was a great update from my 1070. Not gonna bother with 40 series at all, I’m just glad I was finally able to buy a GPU for under $600
3070 rules have fun
I upgraded from a 1060 and got a 3070 Ti for MSRP :D
@@neokilpinen Let me know when you guys are upgrading again; I'll be on my 1060 for a while haha.
Same!
I was able to purchase a high-tier 3060 brand new at a very good price a month ago, and I've already overclocked it to (mostly) match a 3060 Ti. I will not give Nvidia money for the 40 series: one, because of all these rumors, and two, because for two years they laughed at us and turned a deaf ear to the gaming/arts/work market in favor of the miners, so f Nvidia. These cards will be capable of gaming until the 70 series, and they'll have to kill DLSS to make us buy new things.
I love how most high-end cards cost 4 times my whole setup
I'm really interested to see what happens with the 40 series (performance, price, power draw, transient power spikes, PSUs handling it, temps coming off the PC, do you need a dedicated circuit for the PC...), but I can't mistake that for wanting to buy one. I bought a 3080 only to improve my PCVR apps - which it did, a bit, I mean I won't go back to my RTX 2060 SUPER, but the improvements didn't blow my mind - and I'm not interested in upgrading again until GPUs at or below the x080 can go 100% above the 3080.
Yeah, I got a 3080. I don't plan on upgrading for a good long while; I went from a 970, so the performance increase for me was pretty massive. I was interested in getting the 3080 Ti but couldn't justify the extra $200 for a few more frames. Like, if it's that bad and I've already turned down everything I can turn down, then it's time to upgrade, period. Which I'm hoping won't be for another decade, or 5 years at least. I kinda do want games to improve graphically in a big way, but my wallet is like: please don't, take your time lol.
The RTX 5090 will come with its own diesel generator if this continues lol
I'm glad I was able to get a 30 series card when I did (end of February). The 40 series might've priced me out just from the higher power requirement. It's nuts.
Same. I got a 3080 Ti, and with an ultrawide I need a bit more juice, but I'm not letting them rob me just because. Screw them.
@@beasthunt Can't believe you actually bought the Ti version; you already let them rob you, my dude lmfao
@@beasthunt I got the 3080 Ti as well, from Best Buy at MSRP. But if these cards launch at affordable prices 😤 they will be way better performers.
@@razial5745 Agreed, bud. I believe the 4xxx series will be monstrous.
Planning on getting a 3060 for my MSI Trident 3, just for a while.
Holding on to my 1070 out of spite at this point, not necessarily because I can't afford an upgrade. No new games coming out that I feel are worth upgrading for anyway. GPUs haven't been fun to talk about for years now, and it doesn't look like that'll change any time soon.
For 6-year-old cards, 1070s and 1080s are absolute beasts. Six years ago, a 6-year-old graphics card (570 and 580?) was struggling to keep up. But the high-end 10 series can handle VR.
@@V742 Yup. Had I known the 10 series was the last time GPUs would be fun, I would've bought a 1080 Ti. The 1070 still produces acceptable frames, so long as I tweak the settings.
If AMD can stay relatively competitive while keeping their power draw down, there just might be another AMD renaissance occurring...
They're already rumored to have more than doubled the performance of the 6900 XT in one of their mid-to-high-end GPUs; the 7900 XT will be like 2.5x better.
I hope, i want to replace my rx5500xt 4gb with something....well....better
@@kimjongpoontv69 They stated >50%, not doubled. And the bigger the card, the smaller the increase, because of power limits. A 7900 XT will be nice if it can hit 50%. A smaller card like a 7500 XT could double the 6500 XT. If Nvidia takes 450W and AMD only 350W (TDP on AMD cards is only GPU power, not the whole card), then AMD has a win. If their chiplet architecture works, it will be much cheaper for AMD to produce such cards, which will pressure Nvidia and stop the insane pricing. Even slightly defective chiplets can be put together for a bigger card, because it does not matter if 20% of the die is defective and disabled. If you disable a lot of the die, like Nvidia did with the 2060 GTO, manufacturing such a chip is twice as expensive compared to just using the smaller chip.
AMD is already close to Nvidia and better in rasterizing. And you can bet that AMD will have improved their ray tracing solution without dedicating extra silicon to it (tensor cores). Their hybrid approach was the smarter way, even if it's not fast enough now.
@@eliotrulez Well if they utilize MCM it’ll be a lot more than just 50 % that’s for sure, id be extremely disappointed with 50% unless they retain the same prices.
I bought a 2070 super from nvidia right before the 30 series announcement. I felt like a total chump at the time. Well, you all know what happened and I felt redeemed. At this point Nvidia would have to really make some game changing moves towards customer satisfaction to get me to buy anything if theirs again.
I HAVE to buy Nvidia because my software development is locked into their software. AI support.
Looking at prices even after the price crash, my 2070 was an unbeatable investment. I bought it at the end of 2019 for €400, and I'm pretty confident I could live with it for another 2 years.
I am rocking my 2070 Super until I see a 35% performance increase that will only cost me $500. That is it, case closed.
Literally the same with me man.
More than enough for current gaming.
Even the older motherboards such as the B550 are going with the same design as the X570 lol.
Dude, I bought my 2070 Super ONE WEEK before our nationwide lockdown (the first one); it was, I think, just before or just after the 3000 series announcement. What I regret is not buying 2, because one week after my purchase the prices went up a lot, and then they went bonkers as we all know. I could've sold that 2nd card for 3 times what I bought it for...
I'm not planning to upgrade unless I see a HUGE gain across the board. Like you guys say, at least a 35% increase at $500; otherwise I'll keep this GPU for 5+ years. Maybe I'll upgrade in the 6000 series.
I am still on a waiting list from last year's release for a 3070, so I will definitely not be jumping on this unless they match the price at the same level while giving more power. Which is highly improbable.
Best Buy has Founders Edition 3070s in stock right now for $499; what are you talking about?
Got one for £280 today, not used for mining, and it feels good 😁
@@notmyyoutube84 where da fuk did you even find that
@@Jmack1lla germany as well. Even in many stores :D
@@notmyyoutube84 1. it was probably used for mining
2. mining isnt bad for hardware, its made to run at 100% 24/7
And Moore's Law Is Dead just said the opposite... retailers have an oversupply of expensive GPUs that they may need to sell at a loss, and they requested a delay for the 40 series. Retailers were forced to buy loads of current-gen GPUs the market has no demand for, in order for Nvidia to actually supply them with the 40 series when it becomes available. He also said that pricing may even drop further (especially on the used market).
Yeah, I like Jay but I don't put much weight in his analysis tbh.
Last gen is about to crash on the used market for sure. I think MSRP holds at retail, but once the 40 series comes out I can see 10-25% sales starting to happen on last gen.
@@SpencerHHO Yeah, Moore's Law Is Dead is wayyyy better on these rumours and information.
@@samaybhattarai5330 Considering they have a very, very high accuracy when it comes to leaks, yes, MLID is a better person to listen to.
TSMC has raised prices by around 20% for the 40 series chips. Nvidia is going to pass this hike on to the consumer and also add a little bit on top of that. So, at the very least, the RTX 4090 Ti will cost a minimum of $2500. The $2999 price is not that far-fetched!!
Lol straight up, people aren't ready for this supply chain shortage hit. Gonna leave some fools floored.
at this point it's more than likely.
If people choose to buy at such high prices, that's their choice. Never worth it to me.
Glad I have gotten back into console gaming. My PS5 pulls 220 watts and doesn’t make a peep
I completely agree.
Seems like Nvidia may be anticipating wider adoption of the Resizable BAR feature to increase overall system performance.
Unfortunately, ReBAR seems to have significant problems with some older titles, CS:GO being one; with ReBAR enabled, my 12400 performs worse than my old 4790K.
I have no interest in next gen. Just replaced my card from 2015 with a brand new RX6700XT two days ago. Love it, I can't believe what I was missing graphically. I will be good for awhile.
As well you shouldn't. The longer we hold out, the quicker prices will normalize. Sadly people will pay Nvidia's extortion prices and nothing will change.
I made an error: two months ago I switched from my old G752VS (gaming laptop) to a desktop with a 12th-gen i7 and a 3080 Ti. I don't regret it (a 1070m is far less powerful than its desktop counterpart), but looking at how annoying their practices are, I don't wanna be part of the problem.
@@MrlspPrt I don't think you made an error. I have probably waited far too long but I stuck it out. Next month the MB, processor, memory upgrade happens.
👍🏼
At this point in time, I'm thinking of going from my GTX 1080 to AMD's 7000 series. I think the 7000 series will be somewhat more popular than the RTX 4000 series at this rate, if and only if AMD really ensures that the power draw is reasonable.
That's reasonable, but consider that AMD takes its prices from the competition; if the 4000 series cards are revealed at a huge price point, AMD will follow suit.
Why do you care how popular a card is?
@@Saturnit3 They already cranked up pricing on last-gen CPUs and GPUs, so I'd fully expect those prices to get cranked again to milk the fanboys.
@@oxfordsparky Fanboys? I am using a 1050 Ti, mate. I will need to upgrade at one point or another; that shouldn't make me a fanboy lol.
@@oxfordsparky AMD has much less of the market and is always cheaper. You do understand what a fanboy is, right? People who continue to buy Nvidia and Intel every generation at higher and higher prices are the definition of fanboys.
Dudes, I'm on a 970. I was gonna wait for the 40 series and was expecting to wait till November/December for the 4070, but Jay has made me totally flip my mindset, and I'm just gonna buy a 3080 now rather than wait months and struggle to get a 40 series anyway. Done deal. I'm sure I won't be disappointed.
Just get a used 6600 XT and sell it for basically the same price in 5-6 months. The 3080s will have dropped even more by then, and some used 40 series cards will be on the market.
Might be better to wait if you can and see how the 4070/4080 is priced. Or the 3080 might be cheaper by then.
It wasn't long ago they posted a video about saving money by not overdoing it: not buying bigger power supplies than you "need", motherboards with more slots than you'll probably use, or cases that fit too many fans. If I had taken their advice a few months ago when I built a new rig, I'd be up shit creek right now, having to buy another power supply and case to better support heat rejection. I'm glad I listened to myself and "overbuilt" my newer build. At least now I know it will support the 40 series without any issues. If you listen to every YouTuber's advice, it's probably going to end up biting you in the ass. Do what you feel like doing; stop listening to YouTubers' speculative advice.
DON'T buy anything.
I'm on a 980, and there's really no reason to upgrade right now because there aren't any games worth playing that need the upgrade. When the 40 series releases, those 30 series cards will drop in price, as will every other card.
Just bide your time and wait for these miner cards to continue to dump.
I just bought a 3080, upgrading from a 1080 Ti, and the upgrade is very noticeable, so yeah, totally worth it at the price they've been selling for since last month.
I've been holding on to my little old GTX1050 since 2017 - the 16/20 series didn't manage to convince me in terms of performance per dollar and between the silicon shortages, the pandemic, scalpers galore, and more recently inflation, I've entirely missed out on the 30 series. That said, I'm mostly a retro / "patient" gamer, preferring to pick up the latest and greatest games of the previous decade or two on Steam for under £5 apiece, so far be it from me to need bleeding-edge hardware. However, we're at the point in time where games from the late 2010s are going on sale for basically pennies and peanuts, and I keep finding that I'm teetering way too close to the limit of the 1050. With that in mind, I'll be really curious what the 4050 will bring (or whether it'll even exist - remember, we never got a 2050 either) and for what price.
Hey! You seem like a wise one to ask this question...
Should I get a RTX 3060 for about $480?
@@lolthisnerdsaidmemes That depends on whereabouts in the world you are and which variant you're looking at, really. To me $480 sounds a bit too steep for a 3060, considering here in the UK they can be had for £320 (~US$380) for the 12GB Asus version brand new from Amazon with free next-day shipping, or even as little as £160 (~US$200) on the used market if you don't mind spending an hour or so giving it a good clean and replacing the thermal paste. If you happen to be in South-America, Eastern-Europe or Asia, $480 might be a bit more appropriate, considering hardware usually costs a bit more in those areas due to import taxes and whatnot.
We did get a 2050 in laptops, which was actually the 3050, when the 3050 in laptops was a 3050 Ti.
After looking around, I would honestly recommend looking at a 3050 if you can find them. Sounds like a good fit for you and your needs. 8GB of VRAM is plenty even for games from around 2015 going on sale. Take for example GTA, which is technically a 2013 release (2015 for PC). A 3050 is more than enough for that and similar games, as well as other tasks. And I can find them here in the US for under $350. Seems like a win to me.
@@lolthisnerdsaidmemes I would say it really depends on what your needs are and what kind of games you are going to play. Are you doing any sort of video editing, or are you mostly just playing games? And which games are they? I say that because, as I mentioned in my previous comment, an RTX 3050 8GB for under $350 USD seems like a bang-up deal to me.
Love the PC world, but honestly, if these prices are even remotely close to real, I think I'm done with this stuff. It's hard not to justify just snagging a PS5 or Series X, enjoying gaming there, and having a moderate PC build for productivity and everything else. It's hard to justify paying this much money just to play video games anymore, but maybe that's just me.
Same man. I am a lifelong pc gamer but I will switch to consoles if prices stay like this. It's just a fucking game, I am not going to spend a fortune to play a game in better quality than a console.
You don't need a 3080-3090 to play games at a decent frame rate and resolution. My 6600 XT kills it at 1080p-1440p, and I got mine under MSRP at $339, and the 3060 Ti is going for right under $400 right now...
@@cemsengul16 Same, I'm still using an undervolted RX 590 and I can't complain; it still manages 70-ish frames at high/max settings, so it's good enough for me. I mainly use my PC for music production though, so gaming takes a backseat tbh. Hell, these days the consoles have 2070-adjacent performance, so the value proposition for consoles has never been stronger.
Not even just the prices, I don't want my PC drawing as much power as my fridge. It's just stupid. Cards aren't getting better, just bigger.
You don't need a top-end PC if you can't afford it. Even a 4060, when it comes out, would annihilate the new consoles in performance across the board.
Got my 2080 Super at the start of 2020 (a month before covid), and I'm still happy with it. What I like _least_ is that I can use my PC effectively as a *space heater* when running FurMark and Prime95 together. I'll wait until power requirements come down and efficiency improves before I pull the trigger on an upgrade, provided my current card lasts long enough.
I did the exact same thing. I almost kept my old 1070 figuring I'd buy a new 3080, but then remembered how stock is usually low on a release. As it turned out, it was one of the best decisions I could have made.
I'm still using a 980 Ti, and it performs perfectly on the highest graphics settings for modern games at 60fps, considering I only have room for a 1080p monitor. Old cards aren't bad, but if I wanted to get to 144Hz, I'd have to shell out.
Power requirements will only continue on an upwards trajectory as there really isn't much more they can get out of silicon.
I lived with a Radeon HD 6970 until 2016, and that card was a space heater. I upgraded to an RX 580 after that card, and it ran so cool. Now I've upgraded to a 3090 Ti, and I once again have a space heater. Seems to me that every 6 to 8 years you'll get a good card that's not also a space heater.
The 7600 XT is going to be faster than the 2080 Super, and use about 50 fewer watts while doing so. (edit: for maybe a $400 USD MSRP)
I still have a RTX 2080 Ti but with all the BS Nvidia has been doing lately I might just go with the RX 7900 XT for my next GPU upgrade
Had a Red Devil 6900xt Ult. Best PC purchase I've ever made. It didn't even get out of bed when I maxed out RDR2, Chernobylite, Metro: Exodus or anything else.
That's quite early to upgrade from a flagship card tho. I'd wait at least 1 year into the 40-series release to see how the market is, and I "only" run a 2070S
I switched from nvidia to amd last year, best decision.
@@opreax2145 Same
2021 - thank you for awesome GPUs we cannot buy
2022 - thank you for amazing GPUs we cannot afford
I was all set to drop 1500-2000 on a new PC in Spring 2020. Not even thinking about it today. Not playing this game (or any game, pun intended) by their rules anymore. It would take a complete reset on the supply/price/value equation to get me back. Just sad how they all conduct themselves. I recall when 'entry level' GPU performance was only $300 and high end was $700; now it is $1000+ for entry and $3000+ for high end. I. Am. Out.
2023 - thank you for awesome GPUs we cannot power hah
@@stevedixon921 Imagine how I feel, when entry level GPU performance was $40-70 and high end was $250. I think I spent something like $150 on my HD 4850 in '09, and something like $160-175 on my HD 7850 in '13. Both decent midrange cards, around the same price over a 4-5 year span. Still running the HD 7850, so it's been around 9 years now. But 'lower mid range' or nearly entry level is $400-500? Ouch. I think my GeForce 6600 GT was only around $120 when I bought it; it burned up in less than a year and BFG replaced it with a similar 7-series card, like a 7200 GS or something. But that was back when cards were still on AGP ports vs PCIe.
Honestly... I don't see many consumers noticing much of a difference jumping from 30 to 40 series anyway. I'd just get either if it's on sale and then stick with it for the next 5 to 8 years or so, maybe even longer if they are still good
They won't still be good in 5 to 8 years unless ur on some peasant resolution
@@SuperSavageSpirit Nahh, I'm still chilling on a 1060 6GB; I think it'll hold out pretty well for a teensy bit longer. I think the 30 cards will be okay for a lonnnng time, even on high res
@Baby gimme Feet Rocking a 2060 Super on a 7600K xD. I should upgrade my CPU soon....
@@SuperSavageSpirit I had a 1080 for ~5 years and now a 3080. I think the 3080 will be good enough for at least the next 4 years.
@@fighterwalkthrough if you can manage to keep the VRAM cool enough…
I'd laugh if I saw a $3k 4090 on launch. At that point, any arguments about "inflation" and how it affects pricing will be absolutely invalidated, as this is a move to capitalize on what they can potentially get from the market based on how crazily people overbid for graphics cards during the mining/scalping craze, and it has absolutely nothing to do with the buying power of the dollar.
With Ethereum moving over to proof of stake, away from what was stupidly profitable mining, there shouldn't be any real threat to the supply/demand in that aspect. I imagine the scalping will be much more short-lived too because of this :) We can only wait and see what happens with the 40 series
I believe them when they use inflation as an excuse for bumping prices to keep their share highly valued on the stock market. To a corporation like Nvidia, share price is the only point of consideration. If they maintain the prices as they are tier for tier right now regardless of cost of manufacturing, the shareholders will complain Nvidia isn't fighting inflation enough and their share value is reduced.
That's the price (no pun) of being a publicly-traded company. The customer takes the last place in the hierarchy of priorities.
@@paul.1337 You're misunderstanding me, I'm not saying inflation doesn't exist, nor am I saying there aren't real world impacts as a result of it (e.g. rising prices). What I'm saying if they release a $3k 4090 it will be more about Nvidia seeing what people were willing to pay (i.e. over twice MSRP in many cases) and them wanting a piece of that action being the primary cause for this increase in pricing and largely having nothing to do with inflation.
@@EversCS Ethereum is still proof of work... it is going to be moved to proof of stake. "Stupidly profitable mining": right now miners are earning more Ethereum than before, it's just worth less now. Which means when it goes back up again, miners will be earning more than they were when it was "stupidly profitable mining"
I'm building a small thorium reactor in my basement to power that generation. I figured I'd start early.
Damn that's a real scummy move, Nvidia. It's not enough that you are sucking up ever greater amounts of that expensive electricity, you've got to screw people wanting to buy cards that they can actually afford *during an economic depression!?*
Yeah it's an eye opener for sure. Nvidia couldn't give a shit about the consumers who supported them for decades. They traded us for miners buying truck loads without hesitation.
These high end cards are luxury items, hardly the thing to get if you have affordability as a priority
@@thejohnbeck people forget this. And they forget that part of the recession/depression we're in is due to rampant inflation from money printing and restrictions on logistics and fuel.
There is no economic depression, we're not even in a recession yet. The government's deliberately slowing the economy to avoid hyperinflation. Companies have no reason to increase the prices other than the fact that they know everyone is still flush with COVID-19 stimulus cash and they want to take it out of your wallet in every way possible.
@@thejohnbeck literally nailed it. people live quite comfortable lives if their major worry is if they can "afford" a luxury for entertainment such as a gpu during an economic depression. out of touch with reality smh.
Still thankful I bought my 20 series card when I did. I had major buyers remorse at the start but that changed quickly when it was damn near impossible to get a 30 series. Good video, Jay!
Same mike, same. Bought a 2070 super founders from nvidia right before the 30 series announcement. Still works great for the games I play......(mostly Dayz)
I have a feeling Nvidia might suffer heavy losses this time. What happened during the 3000 series was a golden opportunity for Nvidia: crypto was going into a peak bull market, people were at home all day (gaming), and chip shortages were hitting. Nvidia produced the most cards they ever did back then and it still wasn't enough. Now I fear they will produce a huge amount of cards and they'll just sit, purely because crypto mining is at an all-time low, chip shortages are slowly going away, and most of all people are suffering from inflation right now. Most PC enthusiasts aren't going to be upgrading this series tbh, unless you're running something ancient. Oh, and if Nvidia decides to artificially inflate prices, that'll be a killing blow like the one Intel dealt themselves, and will bring the true victor AMD to take over GPUs next. Nvidia is repeating the same mistakes Intel made. Nvidia should fear a Ryzen-type release in the GPU world, aka same performance at half the cost.
i would like an updated analysis from you in say...4 months and 1 year
Yeah, they better not make a super butt-ton of cards, because nobody can afford them right now and people are still satisfied with their 30 series.
More 4090s for me
@@insight_lolubad and then one day no more 6090s for anyone
There are always enough people who buy Nvidia just because of the name. Be it because Nvidia launches first most of the time, or because "you can't game on AMD". Sometimes it is the only green thing they will see all day
You can have a 10 minute video, and that iFixit ad can play the entire time. Absolute perfection
Bruh quit simping
@@TheGohthecrow The internet is just full of simps.
@@markm0000 internet full of haters too.
@@TheGohthecrow I’d rather just stay a positive person. Thanks for your input
@@airforcedude08 positivity or negativity has nothing to do with it but ok "king". Shall I call you that too? "Stay positive." 🙄
I swear, at some point single GPU systems will genuinely NEED two power supplies to function normally.
Just buy a psu which can handle it 🤣
Define normally 🤣
that or they're gonna go 240V
@@annebokma4637 i guess this guy uses cheap ass china bomb psu's 🥴
He probably never heard of 1600W PSUs
I've given up on owning gaming rigs... From now on it's going to be SBCs and cloud gaming... I'll spend my money on gig internet and services. My power bill went down about $50 a month getting rid of all my towers and monitors and stuff... Now I have an SBC on each of my TV's and it's going great.
Have a 3080 that cost £900; due to its very high cost, it's going to have to last for 4 to 5 years
Happy to reduce graphics details; things don't look much different anyway nowadays
I am still running a GTX 1080, and can still play "most" games. Not the prettiest, but they run. But I am pretty sure I will be grabbing a 4080/4090 - just waiting for this clusterfuck to end.
You mean 10 years, at least me this is what i will do
Honestly so many new games look very similar to the eye regardless of graphics settings.
Bought myself a 3080 12gb a few days ago on a hunch that this is how things are gonna be. Thanks for the affirmation that I made the right choice lol
Same, I got the EVGA which did you get?
I’m with ya! I got an MSI 3080 Ti recently.
@@iamrobbb7503 Nice! I have just ordered the same but the Gigabyte vision variant.
i mean when the 40 series drops the 3080 prices will also drop so...
@@bird2049 Msi gaming z trio for me
So the 30 series was a fluke for a few reasons. It was the perfect storm to keep cards sold out for a long time. You had people like me who were still running the 900 series, because I looked at the price hike on the 20 series and the lack of performance gain (mostly just RTX, which wasn't blowing skirts up at the time) and said hell no, I'll wait for the 30 series. Then you had the pandemic and a massive jump in demand for computer parts from people building or upgrading systems for the lockdowns. Then to top off that shit sandwich were the damned scalpers and crypto miners taking a bunch of product as well.
So this time it will be different, as all of the people like me who got ahold of a 30 series aren't in the market. No pandemic lockdowns to push a ton of new builds or upgrades. No big demand for new cards from the crypto miners. So the scalpers can try, but they will fall flat on their faces for the most part on the 40 series. Other than the morons who just HAVE to have it day 1 and don't care how much they spend.
Now I would still follow Jay's advice and not sell your current card before you get a new one. Too much of a risk that something doesn't work out right and you end up with no GPU.
Exactly. I was gonna wait for a 40 series since I have a 3050 which does damn well for the games I play. But seeing the new power requirements I’m just gonna grab a 3080 from micro center tomorrow. Then just throw the 3050 into my brothers pc lol
Amen
@@brandonshurtugal Yep I got lucky enough to not have to wait "too" long to score an EVGA FTW 3080 Ti and I have zero intention of moving to 40 series.
Fuck the gtx 960 2gb
@@Stubbies2003 Get the EVGA extended warranty and wait for at least the 60 series.
I had a few 30-series cards, now ending with the 3080ti and it BARELY fits in my NR200. I'll go for a 40-series only if the power draw isn't insane on the 4080/4080ti.
Power draws going to be insane
250w minimum
The request from Nvidia to reduce wafer orders for the 40 series was my reminder that they don't really need gamers to buy cards. They were selling pallets of cards to miners, and now that that demand is gone they are fine making fewer cards. I wouldn't be surprised if they tried to keep inventory low and prices high as long as possible; they are a for-profit company after all.
I was lucky to get a 6800xt in 2020 and have been thrilled with the performance and the additional VRAM. 4k texture packs for days and Smart Access Memory with my 5900x boosted it even more.
Now a 6900 XT is $800-$900 also; performance vs AMD really depends on the games you play and the specs of your monitor. A good 1440p 165Hz monitor is $300-350 and is such an upgrade over 1080p.
@@garrettmancuso4417 You'll have very big regrets when they release their next GPU and driver updates slow to one every 2-3 months; ask me how I know....
@@garrettmancuso4417 nice
im pretty sure i have the exact monitor youre talking about lmao
Nah, don't waste precious money on useless AMD garbage, it's just pathetic trash.
@@J-D I've had no issues with my AMD drivers on the 6800 XT. New game-ready drivers always roll out quickly with new game launches. My brother had a 3070, and it seemed like there were as many driver updates as Steam updates
I saw all this insanity coming and decided to avoid it, so last week I improved my gaming setup about 1000 percent by buying a 6700XT as part of a whole new build. I'll be making popcorn and enjoying the show when all the new cards hit the market.
My 6700xt has been treating me well for the last year. Enjoy your new build!
Good choice, I got 6800xt.
Just picked up a 3070 for a decent price today. Had to take advantage of the market being decent cause its gonna go to shit again real quick.
I went for a 6900xt under msrp, never seen it past 250W playing at 4K, 50-60 degrees. Couldn't care less about new nvidia toasters.
Same here... Picked up a 3080 in the new system so I'll be set for the next 4-5 yrs. I only game at 1440p so it'll be good for me.
Just got my 3060. Plenty enough for the next 2-3 years. But crazy how long it took for the prices to stabilize.
It's crazy how little time it took for the prices to stabilize*; normally it should take at least 5 years
theres still more room for improvement in these prices tho, they will go a little bit lower
Yeah. Been debating if I should go for a 3070 since I have a 750w psu and a 2080. I might just wait and see how the new ones are
@@Boogerdick69 if you’re not looking to upgrade your psu you might just wanna go ahead and get the 3070. Cause I mean 3080 recommendation for psu is 850 and the power hungry 3090ti is a different beast. And with 40 series no telling how power hungry they will be.
3060 is not plenty if you wanna play every single game over 200fps at max settings
@@billcipher534 Yeah, I'm in the line on the Best Buy app for a 3070 right now. Founders for $500; is it worth the upgrade though? I mainly want it so I can get a better 4K experience on my TV. Some games run at 50-55 FPS with my 2080 on med-high settings
Jay, you have your MSRP prices all wonky. Below is the historical list of MSRPs on the top-end NVIDIA desktop cards at release since the 600 series. The key factor is that the x80 cards have crept up from $500 to $700 in the last 10 years. Titan cards have been all over the place, and the x90 is sometimes the "Titan", as with the 600 and 3000 series, with some series having multiple tiers of Titan cards as well. I have never considered the Titan or equivalent cards as "gamer" cards since the 600 series, with the 690 starting at $1000 compared to the previous 590 (the first x90 card) only being $700. The issue is that I consider the x80 the top-end gaming card, like most sane people, and that price has started to go through the roof as well, to the point that even it can't be considered a "gaming" card anymore, although it is still built as one. 3080 Ti at $1200? I am hoping market forces finally start bringing these stupidly priced cards back down to the realm of what gamers can afford, not miners/companies. The x80 should be $500 and the x80 Ti should be $700; those were their historical prices for over a decade, until the 2000 series changed that.
680 = 500
690 = 1000
780 = 500
780ti = 700
Titan = 1000
Titan Z = 3000
980 = 550
980ti = 650
Titan X = 1000
1080 = 600 (orig) / 500 (drop)
1080ti = 700
Titan X(p) = 1200
2080 = 700
2080 super = 700
2080ti = 1200
Titan RTX = 2500
3080 = 700
3080ti = 1200
3090 = 1500
3090ti = 2000
Think the 2080 release confused Jay since 2080 ti and 2080 launched the same day when all other x80ti models launched later than x80.
I'll stick with this 3060 I bought. I'd like to buy a few more just to have them cuz I wanna build some more, but the average fellow can't just go buy 6 new gpus.
I just hope in a year or two when I plan on upgrading I can still get something like a 3080ti new, at a reasonable price. Not at all interested in these higher power draw cards tbh.
Watch what the market prices are once the 40 series drops, and then grab a 30 series if they take a drop in price; that's what I'm thinking of doing. I have a 3070 Ti, but I also have a 2nd machine that I could put it in IF I wanted to upgrade my current GPU.
@@randoir1863 Did you even watch the video?
@@neondemon5137 lmao fr
Why would you need more than one? It's not like you can still use them (profitably) to mine crypto, right? I mean, you can buy as many as you want, but logically nobody is mining anymore, which is the reason for the eBay fire sales
A 3080ti will never drop much in price as long as it's still reasonably usable.
Top tier hardware is always expensive. Look up the prices for a 980ti for example.
With a recession an almost certainty soon, save your money guys, you're gonna need it. Buy a 30 series now? They're still $/£200+ over MSRP.
We're already in a recession. The D word is starting to appear.
@@chriswilson9331 🤔......Desserts? 🎂 🍨
Considering electricity costs in the UK have risen by 52 percent and are due to increase again in October... good luck to anyone planning to run a system with the 4000 series.
I'm having a hard time justifying to myself having a light bulb on
@@ultrawidegameplayandbenchm9158 Light bulbs literally cost about £7 a year to run 24/7. It's appliances like washing machines, dryers, kettles, fan ovens, PCs, fish tanks, etc. that draw stupid amounts of power.
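For anyone curious where figures like that come from, here is a purely back-of-envelope sketch. The bulb wattage, GPU wattage, hours, and tariff values are my own illustrative assumptions (not from the video or the comments); swap in your own numbers:

```python
# Back-of-envelope annual electricity cost: kWh used per year times tariff.
def annual_cost_gbp(watts, hours_per_day, price_per_kwh):
    kwh_per_year = watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# A ~5 W LED bulb running 24/7 at an assumed ~£0.16/kWh tariff:
bulb = annual_cost_gbp(5, 24, 0.16)    # ≈ £7/year

# A hypothetical ~450 W GPU gaming 4 h/day at an assumed ~£0.28/kWh tariff:
gpu = annual_cost_gbp(450, 4, 0.28)    # ≈ £184/year
```

Which is roughly why an always-on LED is noise on the bill while a high-draw GPU is not.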
Saw a 6800xt for £670 and picked it up. My Vega 56 was struggling, and rumours of high prices and power draw for next gen made buying now a wiser choice in my view.
I am in the same exact boat. Have you undervolted it to get the TDP < 300W? What PSU do you own (I have 650, but I know I will need single-rail 700+ most likely).
I have an EVGA GQ 650w, it’s paired with a r7 5700x, 32gb of ram, 1tb boot drive a 2 tb SSD, 5 case fans with PBO enabled. The 6800xt I have (gigabyte) pulls up to 280w standard from what I’ve seen. Not tinkered with its power as I’ve had no reason to do so what so ever.
I am using 2 separate PCIe power leads; I've heard it's better to do so. Although 1 lead with a splitter from the same PSU handled my Vega, which had the 64 BIOS and probably pulled more power at times.
I have 6800XT and 850W PSU, everything good here.
I was originally thinking about getting a 30-series card, but decided to wait because availability in my country was practically zero and the few cards we got were ridiculously overpriced, with some 3080s being sold by actual stores for up to $3000. We've only gotten reliable stock and (somewhat) reasonable prices in the last 2-3 months, and with the 40-series presumably launching around October I was like "Nah. At this point I've waited close to two years, so I can wait a few more months."
The only thing I'm a bit worried about is power draw, so I'm going to wait and see if AMD have something that gives me the performance I'm looking for without giving the circuit breakers in my apartment anxiety attacks. I'm hoping the trend from current gen continues, where AMD gives acceptable performance (for my use) with ~100 watt less power draw. But I guess time will tell.
In which country you live?
I know someone at AMD already said they're going to have more power draw than RX 6000 series did, but they're probably gonna be still less than Nvidia's RTX 4000 because of competition (I think Moore's Law Is Dead did a video on it but I might've seen it somewhere else)
Gotta get that overkill 1600 watt power supply
Power draw for RDNA3 will be only a slight increase over last gen. I wouldn't expect their top end part with MCM to run over 425w. It's likely it will be over 4090ti performance. Jayz doesn't specify 4090ti power requirements, but AIBs have been told by Nvidia to prepare cooling and voltage regulation for 600W. So here are your options:
1. Buy Lovelace, a new case, fans, and power supply to deal with 3x transient spikes and huge heat output
Even a new motherboard may be needed to deal with transient spikes - see the investigation on GN, I highly recommend it
Or
2. Just buy AMD and drop it into your existing system
I think people seriously underestimate how much raw heat is pumped into a room by even just a 400W GPU with a high-end CPU. It is not comfortable, regardless of how good your air conditioning is; that heat source is right next to you. Combine that with Nvidia's lazy approach to transients, and I think flipped breakers and sweat will push people to AMD this gen, even if someone initially bought a Lovelace card.
@@yourhonestbro217 Norway. We've either had the worst low end models or halo products being sold for scalper prices pretty much since the 30-series (and rdna2) launched.
Considering the "reports" that AIB's have too much 30XX stock and nVidia is trying to delay the production of 40XX with TSMC, if the price rumours turn out to be true either AMD is going to be laughing all the way to the bank or it is going to destroy the market once more.
AMD won't be laughing to the bank. Purchasing power is getting too low to warrant a $600 graphics card, let alone an $800 card or more.
There are people now scraping their wallets for fuel to get to work who used to buy new graphics cards every 2 years.
@@michelvanbriemen3459 GPU makers forgot how small the global market for $800 GPUs is. A mere 5 years ago an $800 GPU was a halo product; now they think it's a mainstream premium-class price segment. 99.99% of PC gamers globally can't afford an $800 GPU.
AMD, just like Intel, will price things based on how much they can get away with, just like Nvidia. These companies aren't your friends; their main agenda is making as much profit as possible. So if they can get away with charging $999 for an RTX 4080, then that's exactly what they will do, just like they did with the 2080 Ti being twice the price of the 1080 Ti. The excuse then was the RTX tech; now there will be another excuse, and I'm sure there will be another set of excuses for the 50 series, which will primarily be MCM, so they will say the cache and interlinks are expensive.
AMD are exactly the same. They did with Ryzen what Intel had done before: they got a core advantage, then the performance advantage, and then they raised their prices because there was no competition and they could get away with it lol. Nvidia has always gotten away with it, and that won't change this coming gen
Reducing their tsmc allocation means they'll jack up prices by claiming demand is higher than supply. I bet that's what's gonna happen.
@@zeitgeistx5239 That's true, I remember buying an R9 290 for €400, and the tier-equivalent now is double the price if not more.
Back when I bought it that was borderline enthusiast-grade pricing. Now it's "cheap". The masses won't spend that money either, they can see sense in €250 but that's where it ends for most of them.
“To those of you saying I’m gonna buy the 30 series when the 40 comes: now is probably the time to buy it”
As I’m one step ahead I wanna buy a 20 series when the 40 is released
Same here a 2060 seems to be the new mid tier pc gpu
@@GTSW1FT No brother, buy the 3050 if possible; 8 gigs of VRAM comes in handy ;-)
@@S7OVN The 3050 is dog shit. It's massively overpriced. AMD's equivalent is so much better perf/price
No way I'm getting a card that draws this much power. Even if it's free, I won't do it. There are a lot of reasons to avoid that card. Heck, I don't think my house's electrical system can even support it...
@@asbestosfibers1325 It's not necessarily the power draw that's gonna be the issue; it's more the heat the card will produce. A 500+ watt card with 16,000 cores is gonna produce a ton of heat. A card like that will require water cooling at best.
@@asbestosfibers1325 My place was built in 1980 from what I know, so that's, hmmm... 40 years ago?
We already have issues here, when we turn AC on and a vacuum cleaner (plus a few other things)
I won't replace every single cable just because of a single GPU. A total makeover of the electrical system would cost a ton; I already calculated the whole thing. Not to mention the repairs on the walls after everything is done.
Also also, I'm not sure the other people here would agree to that.
Getting that sweet three-phase current connection
@@laszlozsurka8991 Indeed, and then the AC would need to be on even longer / work harder. The heat is one of my issues with the card. Watercooling is cool, but my friends always end up with leaks in their tubes, leaking onto other computer parts. I'm always scared of that.
Also also, playing games all day at 500+ watts can't be good for the power bill either, when you add in the CPU, the monitors and everything. So yeah, there are a million issues with a card like that.
I only learned about the US electrical grid recently and it blows my mind.
I found out because most of them don't have electric kettles (because there's no tea-drinking culture), but even so, they aren't worth it there as kettles take too long due to the ~1.8 kW limit.
Please don't @ me with abuse as I know it's more nuanced than this but I'm not here to recap the video I saw 🤣
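The numbers behind the kettle gripe check out as rough physics. As a hedged sketch (the 15 A / 120 V outlet rating and the 3 kW UK kettle figure are my own illustrative assumptions, and heat losses are ignored, so real boil times run a bit longer):

```python
# Energy to heat water: E = mass * specific_heat * temperature_rise.
WATER_SPECIFIC_HEAT = 4186  # J per kg per degree C

def boil_time_seconds(litres, start_temp_c, watts):
    joules = litres * WATER_SPECIFIC_HEAT * (100 - start_temp_c)
    return joules / watts

us_outlet_watts = 120 * 15   # ~1800 W ceiling on a typical US 15 A / 120 V circuit
uk_kettle_watts = 3000       # common UK kettle rating on a 230 V supply

t_us = boil_time_seconds(1, 20, us_outlet_watts)  # ≈ 186 s, just over 3 minutes
t_uk = boil_time_seconds(1, 20, uk_kettle_watts)  # ≈ 112 s, under 2 minutes
```

So a US kettle is capped at a bit over half the power of a UK one, hence the longer wait.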
I'm honestly interested in what AMD puts out this gen. If they've put work into RT which it sounds like they have, and they're more power efficient AND the price is lower... might be time to make the switch to team Red.
AMD is going to absolutely OWN this generation of PC Hardware.
AMD has always had better products price-wise and longer-term support-wise. RTX was the only reason to ever buy team green, and it has never actually been viable, since buying a card powerful enough to play RTX games has been only for the top 1% of enthusiasts. None of the 20 series counted, because RTX is a joke on them.
RDNA3 will definitely be more efficient, and should beat nvidia performance wise too, except for maybe some crazy halo product that no one can afford.
And this is why I've switched to team red, despite some of the drawbacks. I refused to pay a premium just for a name. Thanks Jay!
Okay AMD fanboy
@@pessiescobar4707 okay person who read 2 words instead of the entire comment.
@@DragoNate 💯🤣🤣
0:33 That ad was actually amazing!
I'm honestly a little frustrated with some of these takes, especially the ones claiming "things are somewhat back to normal". No. They are not. The 3080 launched at a $699 MSRP, so finding them at $800 is still over MSRP. Even if we take the long-shot view that all 3080 cards can be found new for $699, this still IS NOT GOOD. We are looking at effectively 2-year-old graphics cards still selling at their launch MSRP. The RTX 20 series launched its mid-release "Super" variants, which were a decent jump in performance but didn't jump in MSRP, and that was after just a year.
If things were actually back to normal, then all gpus should be selling 20-30% below msrp right now (especially considering we are heading into a new gen gpu launch in a few weeks tops). Yet most cards sell around 20-30% ABOVE msrp still. effectively that's a normalized price difference of 40-60% between what the prices are, and what they should be under normal circumstances. This is not normal. We are very much still in a hyperinflated price period.
Give it time - AIBs are sitting on a mountain of 3080s, and the MSRP given for it didn't cover the cost of the card. Moore's Law Is Dead covered this pretty extensively, looking through the bill of materials for the 30 series of cards
I have found 3080s for $900 (CAD) the msrp is over $1000
I agree but Nvidia and AIB's will never sell below 30% below MSRP, they'll just rebin and rebadge their old stock into a 4050, 4030 or whatever. The most I can see the prices dropping is MSRP with a $50-$100 rebate, but that'll mostly be on a retailer level. But who knows, I wouldn't complain about $400-500 3080's.
@Bon appétit ! (CANADIAN PRICE)
You forgot one thing:
$699 is the MSRP for the Nvidia Founders Edition. The 3rd-party manufacturers' cards from Asus, MSI, Gigabyte etc. have a higher MSRP (around $100+), so we are at an MSRP of ~$800.
Also there's the inflation, the global crisis and so on...
You are right, the prices should be lower at this time (lifecycle end of the 30 series), but let's be honest, the greed of those companies and manufacturers is too big. There won't be a time soon where you get graphics cards for such cheap prices like in the past.
Pretty sure the MSRP for the next card series will be higher again, because people are still buying... I mean, people bought graphics cards for 2-3x MSRP.....
I've been wanting a PC since 7 or so years ago. Finally having the money for it, I built my first PC 1 month ago with a 2060. Today a 3060 costs exactly as much as the 2060 I bought :)) But even with the 40 series coming out, I don't regret a thing, because I've wanted a PC for so long I wouldn't have had the money for a 40 series anyway, so I am more than happy with my 2060
Good for you. Was in the same position as you about 6 years ago when I got my 1080. But I'll probably be looking to upgrade I just don't know if I'll be able to beat the scalpers and end up having to wait until next year sometime to upgrade.
Glad to hear you got what you worked for though. Just don't be surprised if you find yourself needing a bit more power at some point, depending on what games or applications you run.
Yeah bruh I waited 4 years to get 2060 this year
RIP
I mean, games are not improving graphics-wise enough to justify an upgrade. With DLSS, FSR etc., and the new Unreal Engine that is pretty impressive even on older hardware, the extra power needed and the heat that will be generated by the 40 series make it not worth upgrading. But people will still fall for FOMO.
I still have 2060 in the computer I use for Adobe and other graphics programs and 2070 in the mini-PC I use for gaming, and I got a 3070 at MSRP a while ago that I never used because I did not see the need right away. I have been building my own PC systems for 20 years now, and the GPU scarcity con game has become an impediment for anyone trying to get into making a computer and a turn-off for long-time builders. I am still wondering what the average person thinks they are getting out of some of these new graphics cards, considering the number of games that realistically use any computing power is few and far between. Unless you are doing some high-level CGI rendering and tons of video encoding for work, school, hobby, etc., the average user does not really see what the power increases are really meant to do.
Anything over $600 for a graphics card is too high lol
$300
$400 is my limit.
I really got into PC building youtube in 2019 but couldn't justify spending the money to build anything. Fast forward through the pandemic and I didn't think I'd ever be able to build anything with how expensive stuff got. My dad ended up buying me a 3080 Ti in February and I bought the rest of the PC myself. I'm hoping it will last me a long time because of the way nvidia prices and inflation are looking. Great video Jay, I absolutely love your content.
It definitely will. I'm still rocking my GTX 1070 @ 2K in most games and it's just about time to upgrade for me, so what's that, 6 years on one card? And that's a lower tier than yours. That said, I mostly play shooters or sim racing with a bit of VR, so other than VR I don't really need any upgrades; I'm happy banging it on medium and getting 80-120 FPS, still higher than console
@@mgproryh I am in the same boat. I don't care about ray tracing at all, I don't care about resolution above 1440p at all, but man, they got me on VR. I have a 1080 Ti, and while I can still play VR at very high quality, my frame rate is abysmal: either 45 fps, or for DCS I got as low as a 30 fps lock in VR. Not the best experience, to say the least.
It WILL last long. Take care of it. It will take care of you.
You got 7 years + on that for sure.
@@gzaos Damn, really, on a 1080 Ti? I presume you're on a Quest 2 or another 4K headset? I use a Lenovo Explorer gen 1, so it's like half or a quarter of the resolution (don't remember exactly), so it runs a lot smoother even on the worse card. I loved my brother's Quest, but I knew I wouldn't be able to run it.
Sad that no one ever mentions AMD in these conversations. Leaks suggested ages ago that RDNA 3 was doubling performance, and to match that increase Nvidia had to juice their 40 series, hence the suggested power draws that we're seeing. I don't think that team red's cards are going to be anywhere near the same power draw.
Possibly because the AMD rumours from the last 3-4 years didn't pan out. If you were expecting AMD to beat Nvidia or Intel, they have so far failed to be the cheaper or better choice in the midrange or budget range.
And that's hard to root for: expensive hardware that isn't the best option, is unreliable and buggy, doesn't have the fastest performance, is missing features, etc.
AMD has to break the habit of doing just an average job if it wants to compete with Intel or Nvidia. It's not impossible; it just needs to break those habits.
The pattern is that the rumours get too aggressive or hyped, then it doesn't happen for 2-4 years, the tech doesn't improve for the next generation, the drivers and software are buggy, and so on.
So, around the time we were seeing the 3080 launch, RDNA2 was supposedly going to be included in the next Ryzen APU series to compete with the PS5/XSX chips.
It wouldn't have been that believable anyway, and several Ryzen launches later that APU still hasn't been launched or even mentioned. It did not happen, and the 6600 XT didn't come out close to 3070-series specs.
@@Toliman. Let's calm down talking about Intel, they have no experience and what they've given us so far has been crap even on the technical side with compatibility. Id take AMD over anything Intel atleast for a few years until Intel irons out all the problems they have and are going to have with their more powerful cards
@@Bi9Clapper Intel has been doing iGPUs forever, plus this is what? Their 5th? 6th? attempt at a dedicated GPU? They usually come out with one, it'll bomb, and they'll cancel it and give up for 5yrs+
That is because AMD has been out of the high-end GPU space for years. They compete in the midfield, but even then it's mostly not at a compelling price-to-performance point, so you end up better off with a mid-tier Nvidia card unless you just want to support team red. Every launch there are rumors that this will be the new hotness from AMD, but after years of disappointment I will be happy if they release something amazing and cheaper but... not holding my breath.
@@Toliman. Nothing wrong with team red, unless you're a team green fan boy. Been gaming on Team Red cards for nearly a decade now ever since I switched because of instability on my second team green card and guess what... my current card maxes out the 165Hz refresh on 1440P at Ultra settings on most every game I play. Notice I said that *I* play, so anything else doesn't mean sh!t to me. If *I* don't play it, then it might as well not even exist to me!! Oh and did I mention, it does it using a wimpy little 700 watt PSU? (kill-a-watt shows about 565-595 watts total)
Wouldn't it be funny if 40 series released and no one cared.... I would laugh my ass off :)
I give it a 20% chance of happening, you know, after the initial madness from techies.
I think I'll just stick with my Steam Deck going forward. Electricity is already too expensive so 1000W+ gaming systems are just silly now.
The prices AND the energy consumption of graphics cards have gotten completely out of hand in the last few years. It's just crazy what's going on right now.
Fortunately, I'm relatively fine because I only play in 1080p. But that doesn't make it any less insane!
Yeah, last-gen prices dropped some six months before the new cards were released and about 2 months before they started going back up again.
I remember because I wanted to buy but I was still a little in debt and wanted to clear that before I did.
But by the time I got it cleared the prices had gone back up.
I didn't worry, figured I would get one of the new cards, but then they launched high and never came down.
Just getting higher and higher.
So when I was able to get a good system for a decent price a short time back I grabbed it.
I’ve waited 10 years to build my PC and now felt like the best time to buy. I went all out and got 3080ti and paid MSRP. Best case scenario for me
Same thing; I have a GTX 640 and have recently just purchased a 3080 Ti Vision.
I built mine last year 5800x and 3070. Solid rig for a long time. But I'm considering getting a 3080ti just so I have another gpu when this one craps out. I'm glad I waited so long till I built my first pc.
I also went all out on my build (and like 1 tier above what I originally had planned for the build for most parts) and plan on keeping it for at LEAST the next 2 or 3 generations of parts.
good choice! enjoy your monster of a card! i absolutely love mine 😊
@@dethtour I have a 3070 Suprim X, and a 3090 SLI setup. The difference won't be worth it. Wait man. 3070's are great.
I managed to get a 3070 from Best Buy Canada at MSRP and am just going to hold onto that until the 5000 series. To stick it to Nvidia, more companies need to enter the market just like Intel and create more GPU options. AMD adding ray tracing would have Nvidia on the defensive as well.
I mean, AMD does have ray tracing at a hardware level... It's just nowhere near as effective, partly because software leveraging the power of the GPU is often much slower on AMD, which is likely down to drivers, optimization, and more. So a lot of development still needs to happen on the side of the software developers of those programs.
5000? Brah. I got a 3080 recently and I don't think I'm getting a new GPU before the 6000 series
@@tektauron the fact that games are devolving as well makes this statement true.
@@tektauron Even then... why? Games are not even straining the 20 series yet, like you say.
AMD has already introduced RT though
Thanks for the updates.
Prices for the RTX 3090 just came down even more, so I just ordered one. Last year when I built my monster build, I used high-end components except for the video card. Now filling that last hole in my build with an RTX 3090 at $1200 was a realistic choice, and it should last for years. Waiting for the 40-series cards seemed like too much of a gamble, as the prices will just be insane even if you are able to get one. Since my build is an AMD 5900 with the MSI Godlike motherboard, I did consider getting an AMD 6900 XT, but since I wanted more options with ray tracing and DLSS I decided to stick with Nvidia.
I just did the same. £1500 for everything but the GPU. God tier MSI board. MSI gold rated 850w PSU. Enough to handle the 4080ti when it arrives (it's reported to be maxing out at 600w so 850 is the minimum considering CPU and drives.)
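For the PSU sizing reasoning above, here is a minimal sketch of the math. All wattages are estimates from this thread, not official specs: the 600 W figure is the rumored top-end 40-series draw, and the CPU and drive numbers are illustrative guesses.

```python
# Rough PSU sizing sketch; every wattage below is an estimate, not a spec.
def recommended_psu(component_watts, headroom=0.2):
    """Sum the component draw and add ~20% headroom (a common rule of thumb)."""
    total = sum(component_watts.values())
    return total, total * (1 + headroom)

build = {
    "gpu": 600,              # rumored top-end 40-series draw (from the thread)
    "cpu": 150,              # high-end CPU under load (estimate)
    "drives_fans_misc": 50,  # drives, fans, RGB, etc. (estimate)
}

total, minimum = recommended_psu(build)
print(total, minimum)  # 800 960.0
```

On these guessed numbers, 850 W covers the raw 800 W draw but leaves little headroom, which is why it reads as the bare minimum rather than a comfortable choice.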
Huge power demand. You may have to plug the GPU into one outlet and the CPU, monitors, speakers, etc. into the plug next door, which runs on another circuit, so you don't trip the breaker. And don't even think about plugging in your phone charger while gaming.
Don't forget about the heat it'll generate!
You'll need to build inside a deep freezer.
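The breaker-tripping joke above has real arithmetic behind it. A rough sketch, assuming a US residential 15 A breaker at 120 V and the common 80% continuous-load guideline; the device draws are illustrative, not measurements:

```python
# Quick check of how close a big gaming setup gets to one household circuit.
VOLTS = 120
BREAKER_AMPS = 15

circuit_capacity = VOLTS * BREAKER_AMPS    # 1800 W peak on the circuit
continuous_limit = circuit_capacity * 0.8  # ~1440 W for sustained loads

setup = {"gaming_pc": 1000, "monitors": 120, "speakers": 30}  # estimates
load = sum(setup.values())

print(load, continuous_limit)  # 1150 1440.0
```

A 1000 W+ rig plus peripherals already eats most of the sustained budget of a single 15 A circuit, so spreading the load across two circuits is not as absurd as it sounds.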
I feel you're a little behind on this. Ampere production finished months ago due to overstocking. Who knows how long it will last, but Nvidia tried to cancel some of their 5nm orders at TSMC, which suggests they're worried about oversupply. I wouldn't be rushing to get a new GPU just yet.
Oversupply of the 30 series. If they artificially raise the prices of the 40 series for launch, it will push more people who have been waiting to just settle for a 30. Supply will dry up quickly and history will repeat itself.
Getting into retro gaming fixed any issues I had with the current PC parts issues. I highly recommend it.
Yeah good idea for people who want to play the latest triple A titles...
Retro is fine but this is clearly a solution you created in your mind.
@@i3l4ckskillzz79 Triple-A sucks anyway. The only good one has been Elden Ring, and even then it's not as good as past FromSoft games.
@@i3l4ckskillzz79 of course it's not for everyone.
I'll be going back to AMD when it's time to replace my 1080. Neither company is a saint, but one is better than the other.
Given this, and especially as a Linux user, it's going to be AMD for me...and I suspect whatever I get will be a major improvement over the GTX 960 I've been using for seven years. :)
I'm using an AMD MSI Gaming X rx6700xt.
I would say right now if you can go for a 6800xt or 6900xt. Though a 6700xt is not bad either.
Linux user and still using my R9 Nano no intention of changing until i cant play the games that i play.
@@evacody1249 Buddy is running a 7-year-old GPU and you are recommending him $800-$1200 GPUs. These are obviously out of his price point, or he would have upgraded way before COVID and the crazy price hikes.
I just got an RTX 3060ti after using GTX 970 since 2015 :D
So NVIDIA's driver is still gonna be bad? Need to build a new machine and will go AM5.
Last I heard, the 4000-series rollout was being 'delayed' to late 2022 or early 2023 to move more 3000-series cards. As usual, prices jumped up $100 instantly. I was expecting the opposite, since I was waiting on a 3080 to come down closer to MSRP, but no... 😡
@@leeloodog AMD is not that bad, tbh. I bought a 6900 XT at launch (not from scalpers) and I didn't have a single problem with it. Yes, you get lower performance than a 3090, but not by that much.
Definitely looking at the 30 series and RX6000 series from AMD. And it's mostly due to power draw concerns. The steady climb in power usage is simply out of control. And just remember that Pascal had a power draw that was about half of what we're seeing today. This is an issue that needs addressing.
my 6600xt doesn't even heat up my room. a steal now for $300 on ebay
at least in the case of the 3070, it performs about 90% faster than the 1070 and only consumes 70W more
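The 3070-vs-1070 comparison above is really a performance-per-watt claim. A small sketch using the commenter's figures: the ~90% uplift is their estimate, and 150 W / 220 W are the cards' published board powers.

```python
# Perf-per-watt sketch; the 90% uplift figure comes from the comment above.
def perf_per_watt(relative_perf, watts):
    return relative_perf / watts

gtx_1070 = perf_per_watt(1.0, 150)  # baseline card at its 150 W board power
rtx_3070 = perf_per_watt(1.9, 220)  # ~90% faster, drawing 70 W more

improvement = rtx_3070 / gtx_1070
print(round(improvement, 2))  # 1.3
```

So even with the higher absolute draw, the 3070 delivers roughly 30% more performance per watt on these numbers; the complaint is about totals climbing, not efficiency regressing.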
When we have to plug our GPU into its own power supply and power cord…
I have 6700 XT and it’s plenty powerful.
The 40 series will be more efficient; it wouldn't be surprising for a lower-tier 40-series card to outperform a higher-tier 30-series card at lower power consumption. So far the power consumption figures are just rumors, and you'll probably only need to worry if you're going for the highest-end cards like the 4090 (Ti), which won't be cheap. If you can afford a 4090 (Ti), just throw in a new power supply with it.
The really sad thing is that nobody will inconvenience themselves for a single generation and not buy their cards to punish them for absolutely anti-consumer practices like this.
false, I am still rocking a gtx 1050 2Gb. My contribution does not seem to have changed anything
All I need from AMD's 7000 series is to catch up in ray tracing and in the video compression engine. That's it. I don't need them to "kill" Nvidia's best cards, I just need them to be equivalent :-/
I'm pretty sure the 7000 series is going to accomplish that, and likely at far more reasonable prices.
who cares about video compression
Streamers, natch. It does a lot for the perceived usefulness of the cards even if the end user doesn't end up using the feature.
@The Momaw: AMD are reintroducing B-frames into H.264 media encode with a driver update for the 6000 series (I think it might already be out now?). That's been tested and brings the quality much closer to Nvidia's: still a little behind, but much closer. The 7000 series has a new media engine slated to bring AV1 encode as well; it remains to be seen if it brings further improvement to H.264.
@@winterscrescendo Thanks for the tip, I wasn't aware. It looks like the improvements to AMF have not really been widely adopted yet, so if you want to use it then you're deep into the "compile it yourself" weeds. Bit beyond my skill level but maybe Coming Soon(tm)