it's crazy, it's the reason why I haven;t wanted to upgrade my 2060, I can't justify spending 1500 on a 2080 when that's what my whole build cost in 2019
Dare I ask what the total Wh will be and how much it will also cost in electricity bills? The insult will just be too much. Edit: Y’all I know they’re being sarcastic. That was a rhetorical question. I know exactly how power hungry this card would be, especially with my use cases. It’s totally worth every penny. (That’s sarcasm, ya babies.)
It's so sad to hear that the 4000 series is probably going to be more and more power hungry... I was so blown away by the 10 series when it launched and most of the offerings were significantly faster than the 900 equivalent AND sipped way less power while doing it. I wanna see that kind of generational improvement again someday.
expect to go over 1k wats and I wouldn't blink at at 2k or 5w power supply :D don't blame nvidia blame miners that use these to push the chips to the edge of their limits to mine coins and prove they can keep it up :D lol because doing that raises the bar to perform :D
@@AndyOO6 man, no. Yes, it’s possible to reach 1000W, the problem is that the rise in wattage is directly proportional to the temperature, if it reached a 1000W, the TDP would be 1000W, and it isn’t that easy to cool a thing like that. A 5000W power supply is almost impossible to happen because of cooling, stability and size. And don’t forget that GPUs need to go into laptops too. It’s almost irrealistic to say that it’ll reach 1000w, although it’s possible, it’s really hard to reach it, and the card would need more time to release, since nVidia would need to wait for new power supplies that can actually carry it (since a 1000W GPU would be used with a CPU like the I9, R9 or Threadripper Pro.
And maybe they would need to wait for new cases to arrive, because cooling a 1000W GPU would need at the very least 3 to 5 slots, and that’s a fucking TON of slots.
@@Tigerfish. almost impossible to something like that to happen. 3090Ti is a fucking beast, it’s not easy to make something this cheap and as powerful as something this powerful.
That doesn't really matter if you'll only sell a couple of 100, while they could sell thousands if they dropped the price by 500. Cost per card has dropped since release of this generation.
As I've said elsewhere, I bought into the idea that the prices in the last 18 months were beyond the control of the manufacturers. Then Nvidia posted insane revenues for 2021. Going forward, I'll be driving my hardware into the ground before I upgrade and then I will spend as little as I can to make the games I play work. Nvidia has bent me over the table for the last time.
Yep! and like I keep saying as well. Used hardware is a fantastic idea! I for one can't remember the last time I purchased something new. With the one exception being the hard drive and power supply. Mice, speakers, cables, ram, even MOBO's are generally perfectly fine used. You do also need to be fine with stuff from up to 2018. But then how much has been available up to recently again anyway? It sounds nice on paper to say you can do 4k video and blah blah blah. Outside of doing that for movies and commercials. Games will look fine at less than 4k. I doubt many truely use techniques to scale up to 4k.
2000-2008 was even better...(that's also including for the 3rd world, you could buy a 2 year old flagship card for US$20-30 and run all the latest games with no issues)
I like the fact that GPUs have been getting more and more efficient ever since like GTX 600 series (compared to GTX 400 and GTX 200) up till GTX 10 series and now we are going backwards to making GPUs even more hungry for watts. I hope AMD does better than Nvidia with the efficiency.
@@memethief4113 Yeah, this is being done more because of competition than anything. The nice thing is it forces AIB's to improve cooling, so if you want an efficient card you can just undervolt it back down.
This is the culmination of nvidia's laziness. The 200, 400 and 500 series cards were like that because nvidia didn't want to design more efficient architecture and found that cranking more amps into the silicon for more fps was a viable strategy. Were seeing a repeat of that history now.
AMD: pheeeu, thanks to Nvidia we now have room to play with power, nobody is even going to notice if our cards go power hungry, as long as they are not drawing more power than theirs.
Scalper prices lol The fact is you can't get them at MSRP most of the time anyway. The reason scalpers exist is because MSRP has been too damn low for so long. I'm no fan of Nvida, AMD, or Intel but when all of their products are more expensive than usual then consider that it's not some scheme. That being said I wonder why they keep going with high end chips. From my limited knowledge that means their defective chip rate will be higher at the factory. Before you say miners, I'm pretty sure 4 $500 cards will be as good or better than a $2000 card only considering performance, not even their wattage which would make the former even more promising.
@xionliing sh Crypto mining guys, ML devs, and grpahics designers should be blaming us gamers. We don't get to blame because our use doesn't make anything at all.
6:58 I'm sorry to burst your bubble LTT but it was me who snagged that card from spain and it was a bog standard scam, he's been doing it for months with burner accounts all at £700
Ofc they won't be any different. They're a corporation too, they work for money, plus they're building a new US factory that's an *investment* that needs to be returned on. Bye-bye decent pricing. 😞
A BIG part of the reason for these high prices is because of crypto mining, not because consumers are paying these ridiculous prices... and those miners make a lot of money with those cards and aren't really hurt by these prices... that's the problem.
@@MickayG That's the only reason why you would need such a powerful card. They aren't for gamers. I don't believe there's gonna be a game that will fully utilize it for awhile.
Just for point of comparison, the window air conditioner that keeps my play space at a chilly 60F year round, only pulls between 200 and 500 watts of power. And it keeps a big as game room ice cold. 500 watts of GPU power for F1 is insane.
Shhhh...the cards are already green. Don't need them to Hulk out. May piss off AMD and end up doing something Red Hulk related. Then Intel will do something Blue Hulk. And we just have this Hulk battle of GPUs. Why the fuck is this not an ad? I am a genius!
Almost all the efficiency gains come from lithography shrinks. Some are architectural but it's not a lot. Once you can't make the transistors any smaller, power consumption *has* to go up, in order for performance to go up by any noticeable amount.
exactly. I'm not going to upgrade my GPU until there's a model released that's efficient, with low power consumption, like the GTX 1000 series was super power efficient, but it's like they stopped caring after that generation.
@@CrimsonEclipse5 very true. But I just feel there are ways in which performance gains can be had without just increasing the size of the straw the gpu gets to suck on. Probably ways that may not have been thought of yet. But even so, I really don't think anyone NEEDS to game at 8K anyway. 4K on larger monitors or tvs I understand. If they can just keep the performance where it is and work on increasing efficiency I would prefer that over exponential growth in my power bill. If Nvidia worked as hard on efficiency as they do at beating AMD I'm sure they could come up with something.
Not really. Companies are learning from scalpers that most people are willing to pay $2k+ for a new graphics card. So why not just under cut the scalpers and sell it at that price from the start?
@@ADobbin1 That's not how it works. There is an absolute limit, and $2k is approaching that limit. So the most a scalper could sell a $2k card for would be about $2200 -- which isn't much of a profit after taxes.
Nice to see the future of gaming/productivity. What do we need? Two extra solar panels? A diesel generator? An industrial contract with the power grid companies ?! Maybe they are implementing an extra power package for all those nVidia 4xxx series users ?! I'm glad the price for energy is currently dropping .. isn't it?
Energy pricing hasn't really changed where I live. But, it would be smart to invest around 5k in a solar array just in general for your entertainment electronics.
So to add to this, a standard North american home electrical circuit will be on a 15 amp breaker. 15 amps x 120 VAC = 1800 Watts. At the point where our PCs start drawing 1500 watts we will probably need to start changing the wiring in our homes. And how far away from 1500watts are we? I'm guessing 2 years or less, Ryzen 7 is supposed to have a huge performance boost over Ryzen 5, and even the AMD 7xxx series is being reported to have huge performance gains over previous gen. Might need to wire a 30 amp circuit so you can run your PC and an air conditioner at the same time, a 3090ti would probably heat a well insulated house.
just wait till next gen hopper is supposed to use 700 watts ( this is the data center gpus) while lovelace (4090) around 450 - 600 watts. RDNA3 (7970) is expected to be around 375 - 450 watts.
Hasnt this been the case for something like 15+ years now? Or did you mean this card alone draws more than another high performance computer including its GPU?
Seriously! As someone that's new to the hobby and just built their first computer last year, I had one hell of a time trying to figure out which card to get. It seems like every build and every advice website nowadays just uses/advertises the latest and greatest card; I have no clue what the bare minimum is nor do I have any idea of what constitutes "good enough" for most games available right now.
I was waiting in line at a Microcenter in hopes to get a graphics card. I was there for a few hours so I started talking to the guys in front of me. When we got around to talking about what cards we wanted, one guy said he was going for the 3090. So I asked what kind of professional work he did and without missing a beat he said he wanted it to play Call of Duty Warzone. I died inside.
Yeah, admittedly kind of guilty of that myself. Been considering the 3090 or maybe one of the upcoming 40 series because with some games I'm noticing my still fairly potent 1080 Ti (which I've had for 3 years at least now, maybe 4? Yeah, whatever card I get next needs to be as long of a haul as well) struggling to keep up. Though I also play quite a few VR games (and seen performance issues in a few of them too) and have branched into a little Unity and Blender work, so maybe I'll actually have a use for the extra power unlike that guy.
I remember when the flagship GPU used to cost $500 a little over a decade ago. Now it's 4x the cost and only marginally better than the previous flagship for a third more.
the flagship gpus back then had more than 5x LESS performance than even a 3080. thats the cost of advancement. you cant expect to get 5x better shit for the same price as a 500-700 series gtx graphics card. thats poor mans thinking..
I bought a strix 1070 ti 4 years ago..... it should be about a 150 to 200 dollar card because of age. Could of sold it for 100 more of what i originally bought it for. I WANT SO BADLY DO BUILD A SYSTEM AGAIN..... I can but i like my high end gear and that high end gear is going to cost me a god d-amn leg and an arm to do so.
@@robynd6138 Not just the cost of advancement but also of economic inflation and Nvidia is still largely unchallenged as the king of GPUs, AMD is getting there so hopefully prices will lower once they duke it out
@@robynd6138 Back then? Define it, please. Back in 2005? A 3080 doesn't even perform 2x better than a 1080 as said by userbenchmarks. Feel free to check it yourself.
I have a 3090, the guy that sold them ( re-seller ) told me that he sold a bunch of Founders Edition here, people do actually like the BFG! I personally, with the energy crisis that some countries are experiencing, I'm hoping that tech giants will focus more on energy efficiency rather than just more performance
If they do want to keep the Europe market, they'll need to work on efficiency. Energy bills here (Belgium for exemple) are becoming insane. I (and many) will deffinitively check energy efficiency on the 4000 and 5000 products.
The correct approach will be finding ways to generate electricity from renewable sources. Then, competition should take care of also creating efficiency, but that always comes after creating power. Is like not selling plastic bags when I am not the one dumping them in the ocean. The root of the problem is deep down, true solutions must come from there.
The sad part is that the 3090 Ti will likely still sell out, giving Nvidia little reason to decrease the price along with the lessened sanctions on China.
I don't think so, 3090s are all over local ads at less than MSRP, Founders cards aren't selling at MSRP, and NewEgg/amazon has them in stock all day long so they definitely aren't moving
I mean why do you sanction china if you need their stuff You like getting fked huh? Sanctioning them, consequently buying stuff from them at a higher price Such a smart move by MURICA politicians
I got my 2080 Ti just before they released the 30 series, and I’m really glad I made my commitment. I was lucky enough to get it for a discount and haven’t been disappointed with it nor do I think I will be for a long time.
Same here. I recently put a water block on it (which was discounted). It hurt for a moment when the 3070 came available, but then the prices went up again so I was happy I got it when I still could.
I finally upgraded my gtx 780 to a rtx 2080 a couple years ago for my 4k gaming setup, and am now looking more into pc stuff because I need a cpu upgrade. The difference from the 2080 to the 3080 hurts my soul please give me some of your strength.
A bit frustrating we didn’t get this kind of scathing review on the 3090, 3080ti and 6900xt but I’m glad LTT is finally coming around. It’s also super refreshing to hear that it sounds like Anthony regrets the whole 8k bs that Nvidia tried to sell the 3090 as and paid influencers to promote. As always anthony is the goat.
Well as far as the 6900XT goes it has a much lower price point (MSRP was 1k) and a much lower power consumption (most cards are in the 300-330W when pushed pedal to the metal and only the XTXH versions hit higher). So, was it expensive ? Yeah ! Hell I have one lol (XFX Merc one, bought for 1150 euros). This however is just completely stupid, and it was already stupid when the 3080 launched.
Really seems like they're reaching for ANY improvements at all from Ampere by throwing much more power at it, even though the 3090 itself was really maxing out what Ampere could do. I really don't think the extra performance is worth that much more power consumption.
If that's the case, can we extrapolate that the Ada series are having similar problems, given their power requirements? I would hope efficiency would be something nvidia starts to worry about at some point.
@@danielharvison7510 High power requirements don't always mean that efficiency is low, imo GPUs having massive power requirements are fine for those who want them.... Only if the performance increase you get from that is justifiably massive. The 3090 Ti doesn't justify its higher power requirement for such a measly increase in performance, that's the problem I have.
I'm starting to think upgrading is a mistake. Power draw DOES matter.. I'd much rather have a high-end card at around the 200Watt range, that seems more reasonable. Do not care about more cores at this stage. Power efficiency. Please.
However there is no denying 30 series cards are more efficient than previous generations . I get almost identical performace from My 3060ti vs 2080Ti but with considerably less power draw.
It matters a lot. Especially in a world where energy is getting more expensive (here in the Netherlands 45 cents kWh on average) and scarce. In a world in climate crisis we should pay more attention to power draw.
@@anakinlowground5515 He wasnt talking about the 3090. The lower in 30s dont use anywhere near the ammount of power the 90 does. 3060 ti is around 200 watts.
I think I'm moving into more specialized hardware teritory with the current rising kwh prices. Makes more sense to game on a switch / my old playstation 4 / playstation 5 considering their power draw. Apple definitely got the memo with the M1 chips.
@@indeepjable watch these cards get sell out and get scalped. It might not be the same story around the world but in the west people are still paying thousands of dollars for top tier cards
@@haariger_wookie5646 Being an electrician helps, but with an addidtional 32A constant power draw x gamers,you may need a new powergrid for your city^^
Imagine paying over 2 grand for a single GPU and hardly getting over 60FPS in your games… dear Lord, we’ve definitely reached diminishing performance returns.
I mean sensible settings are a thing, not defending the card but I got a 3080 for 4k gaming and play everything 60-120fps depending on what it is.But for how much this card costs it should be 2x the 3080 not like +20%
The sad thing is that since they can't actually improve the efficiency and die they just bump up the power by changing the power limits and a few tweaks to the cores so that they don't actually burn up.
They got the memo alright... the shortage situation showed them exactly how much people are willing to pay, and this time they are happy being the scalpers themselves.
@@pikkyuukyuun4741 I maintain that gaming is an addiction, and as such, it is pretty easy to justify any kind of lunacy we commit. Be it "I am not going out drinking so in fact I am saving money". "Hey, number of hours per $ spent is super cheap!". We just enable our addiction and find ways to justify it all the while, the dealers keep testing the limits of how far we are willing to go, and again and again, we keep falling for it.
Very good point, did not know how to exactly express this idea but yeah. They realized how much people were willing to pay, and then they screwed their costumers over.
@@tesmat1243 Yeah, but if you just wanted to play games, iGPUs have been kinda decent and cheap for the last couple of years, not great but good enough to enjoy some games and if you absolutely had to have a GPU then laptops haven't been scalped too badly either, if you're desperate there are other options out there aside from paying $2,000+ for a super overpriced high-end GPU.
That's bananas. I'm actually looking forward to Intel's offerings. I've been an Nvidia user for over a decade now but Nvidia seriously needs a wake up call.
I only went with Nivida but when I got my gaming PC Nvidia card was so high I got XFX RX570 and I happy with that card but price on jump over a grand for little bit but IDK I very happy with AMD card in my AMD rig It hurt me to say Nivida is setting prices on a lot there cards highs when prices have drop My next card will be with AMD Nivida lost me in few years when I upgrade my system
It's ridiculous pricing, but we were already accepting 1080 at $600 and 1080ti at $700 from AIBs back then, so that is the bananas part. Unless we collectively turn our nose at these high prices, $2k+ for a consumer top-end and $1k+ for a tier 1 card will be the norm.
Nvidia is making it that way. Only AMD cards make any kind of sense for their prices. Which even those are still high. I hope these new Intel GPUs shake things up. For now tho, I'm glad I got a 5700xt and I'll be getting a 7000 series Radeon when those drop. AMD has just been better value for years now.
the whole industry does. CPUs are in the same boat, but less so, to the point where clownish price hikes there are completely overshadowed by beyond-parody price hikes in GPU land. If intel is successful with their attempt to price-snipe the consumer market here, I hope to god someone else does the same with CPUs. Who knows, in the future, the most popular systems may switch from 1060s and cheap intel CPUs to mid range intel GPUs and Nvidia entry level CPUs. Team cyan role reversal.
i miss the days where technological innovation was seen as making PCs smaller, cheaper and more efficient (the RPi 4 before shortages for example). but i guess if people still buy this bs, then team red, blue and green will all continue to focus on those who pay the most.
the laptop market has made leaps and bounds in recent years. Jarrod did a comparison showing a certain rtx 3060 laptop variant matching performance of the desktop one (while plugged in ofc).
@@lord_nin0 the rtx 3060 is kinda wack tho as seen by the absolutely massive difference between a 3060 and its ti variant $50 difference earns a nearly 50+fps increase in all scenarios that is insanity just shows you how bad the 3060 really is
They've took a bit off Apple's book. It's so sad how we've strayed away from the quote "There's a basic principle about consumer electronics: it gets more powerful all the time and it gets cheaper all the time."
@@theuprising2216 i use the Asus G17 3060 model and i agree, running MC in Linux with lagless shaders is enough to make the fans spin up in "silent" mode. its not the most scientific study out there but i would have expected a modern day card to handle a game over 10 years old.
@@JonatasAdoM Actually the mocked Apple’s book. It gets more powerful and more power efficient all the time. This are getting ridiculously power hungry while Apple is laser focused on efficiency without compromising performance.
They are incredible efficient, but mostly running OCed to get the last bit of performance. But same as with Intel CPUs - if you run them according to the actual stock specs they are rather efficient.
@@ABaumstumpf And that's been the big issue with Intel, if you run them at base spec they are super efficient but their performance per watt is also very mediocre - and you don't please potential customers with "mediocre" 😅
don't forget that these 3*** chips were developed circa 2019. they are selling old tech at inflated prices because they know the consumer has no choice. hopefully intel can bring some price competition to the market.
considering their only targeting hte mid to low end, it's very likely their going for the general consumer rather than epeen and crypto douche market. Thank christ. Once those fads die off, they'll be holding the reliable market, and hopefully they'll actually appreciate it. If not, nvidia and AMD will still have to come back to it, and compete for price then.
the problem with newer chips is that the transistors themselves are reaching a point where they cant physically go any smaller so instead of developing new technology, they ramp up the power consumption for an artificial performance gain.
Well they're still shrinking and developing new technology as well - but in the meantime while we wait for Lovelace & RDNA3, we get this thing because Nvidia couldn't handle a 6950XT topping the charts.
@@SnifferSock Agreed, artificial is not the best word, they are sacrificing efficiency and dumping power into the architecture. That shows they reach the limit of its current tech and are now scrapping the bottom of the barrel to get any extra performance. Going by power draw, they are deep into the land of diminishing returns per watt. That doesn't bode well for their 4000 series card if their tech is not a major leap and given the existing fab processes have not changed that much or gone to a smaller size tech. The safe prediction is that the 4000 series will be a minor improvement over the 3000 series. Not good when they are scalping this hard.
I hope a new technology comes out to make graphics processing way more efficient. 500 watts is too much and those ray tracing marks were 70 fps. That's embarrassing
@@Galf506 Anything Nvidia ever does is to counter anything AMD ever does or not if they don't have to because of a lack of competition, always and will always stay that way.
I think it has more to do with making money and being able to sell a new "best gpu ever" for more money upfront where they couldn't just raise the price of the 3090 base
I want to see a new 75w card that is twice as powerful as the last one. Or just power efficiency in general. More fps for more power seems pointless at this point, and certainly doesn't help with a looming energy crisis.
I guess we can hold some hope for desktop APUs with RDNA2? Extrapolating from Steamdeck's performance and the development of FSR and RSR, the next generation APU might finally be a solid budget/SFF choice.
i always kinda hate these comparisons cause it doesnt take into account the time. sure you could say someone buying this card is a bad buy when a 4070 might beat it at 800 msrp, but when will that be out, and how do you quantify the time between now and when ever next gen is out.
The issue is if this thing sells at this rate, Nvidia will keep on hiking prices. Eventually "entry" level GPU's like the "4070" will cost over a grand. The whole point of Anthony's rant here is that we need to stop paying so much for these cards. The only reason prices are going to where they are now is because people pay them, and if you think Nvidia won't hike all the prices on their next generation of gpu's I'd hope you're right but doubt you heavily.
@@CocaineCobain Yes. It's pretty sad how everyone complains about high prices... Then go pay for it anyway. I don't know who is buying these cards, and how they're affording them. I'll be using this 1070 until the day it dies.
you could argue that looking out for the next gens is stupid in itself. most ppl buy a GPU ether in a cycle of years or when they deem it necessary. there is always something better on the horizon. sure buying "right befor" the next launch isnt the best idea, but the biggest cards just come out late into the market. I just like to look back at previous gens to determine if the current cards are worth it. thats why i still rock a 980 (though i should have bought a 3080 at launch tbh). its not fair to compare a top of the line, fully enabled chip against anything but a another top of the line chip. the lower cards are always build towards a healthier price to performance ratio.
Crazy how I bought my RX 570 8gb in early 2020 for $110. And it still runs everything fine. The entire PC I built at that time only cost me $450. If I wanted to built it again, it would be $700-$1100 depending on what's available.
I'm building mine from scraps off the side of the road. +$200AU for an RX 580 (lucky me) and ~+55AU for a new PSU (as the ancient one i was using died) to think that if I bought everything brand new it would cost me probably $700AU+. It is rediculous.
@@carlosmonroy3441 Yeah if I include my monitors and peripherals it's more like $750 but the PC alone I built for just under $450. Really actually less than that cause I had just bought a xbox one x from a pawn shop for $150 and sold it for $350, so it nearly covered the PC alone 😂 I didn't buy the xbox to intentionally scalp btw, It was my main system but as soon as I got the PC, I didn't use it anymore at all, and just so happened to be when you couldn't get consoles AT ALL in the store or online. So I made out like a bandit 🙌
I honestly think most of the people buying the 3090 and 3090 Ti are using them for video editing and 3d modeling, they just happen to also game on that PC so they show up in the Steam hardware survey. There are also the types of people that post their specs even when nobody is asking, they are also buying the 3090 Ti cards. I actually just bought a regular 3090 FTW3 on Ebay for about 25% off MSRP purely for gaming but I also am one of those guys that wants to install every graphics mod, 4k texture packs, shaders, and set everything as high as possible. If it doesn't look like real life its not good enough LOL!
I just like to think back to the launch of the 30 series of cards, when we were all crazy excited with how much performance these cards had for your buck. Before the scalpers, before the shortage in materials etc... This shit was THE BOMB and Nividia was felt like they hit a home run. Back to reality and this has somehow been one of the worst times to upgrade, the most expensive cost to performance curve in recent times, and a major PR hit to Nividia. Talk about a 180.
@@ahmedo7875 not Crypto. Scalpers. remember they released in Oct the Mining craze didn't kick off till jan 2021. before that it was ALL scalping due to the shortage. Miners from feb 2021 made it worse.
@@Xyphren idk dude people were GPU mining for years before the 30X0 series was released. savvy miners saw the TH/s/W for these cards and jumped on them immediately. If you could get a pallet of them for anywhere near MSRP you were making BANK. They had a
Should have thrown a 3080 in there to show how dumb the 3090 is, let alone the 3090ti. People were disappointed with the 20-30% increase the 2080ti had over the 2080, for 50% higher price. I never understood why people seemed more fine with the 10% increase the 3090 got over the 3080 for double the price!
the 3080 -> 3090 price hike is easier to swallow because you get 24GB of GDDR6X, which is expensive stuff. You aren't getting your money's worth in games but in applications that use that VRAM, it's a much closer deal.
@@igameidoresearchtoo6511 24gig GPU's are actually very expensive if you want cuda accelerated ML. In that specific scenario, and only there, the RTX 3090 can be considered worth the cost. I am off course also considering that this is part of a job or education.
Actually this is genius, they simultaneously get rid of the old cards to clear their warehouses set price to 2K before new gen RTX 4090 launch. And now they can set prices for their new gen so much higher. Evil but Genius
Nvidia has already done this before; they stacked the 2000 series cards pricing on top of their old gen 1000 series card, and they got away with it until AMD actually brought out new gen 5000 series cards, which they then began to phase out the over-priced rip-off that was the 2060 and 2070 (which were essentially mid-range 60/60ti cards being launched at $329 and $499), and introduced Super cards to compete with performance to price that made a bit more sense.
@@thescorpionruler5882 Na, nobody is gonna be mining with these things. It won't be very profitable with that power draw, and on top of that the cash cow for mining (Ethereum) won't be minable at all pretty soon, nobody is gonna turn a profit if they try. Much better off buying the actually good cards in the series, like the 3070 & 3080, even the regular 3090 isn't worth it right now, a year ago sure, but not now. The only people buying this are the ones that are overcompensating for something, the people so rich or dumb they buy "the best" no matter what it costs, or scalpers hoping to sell to those guys lol..
I get the feeling that the point of this card is not to sell it's to make what they bring out next seem much better despite also still being overpriced, perhaps even more so and still being power hungry.
It’s been really cool to see the change over time, I liked the guy from the first video I saw him in, and the first time I heard him speak I knew, this guy is a genius.
I built a computer at the end of 2021 to celebrate finally finishing college, I was afraid that prices were not going to get better and I decided that I was ok paying to have it then rather than wait till some unknown point to save money because I had no clue how far off that might have been. I did buy a 3080 rather than a 3090 though, the gap between the two did not seem to warrant the price difference.
I’d love to build a new PC considering I’m still down here with a 1070, but out of principle it seems absurd to spend more than $1000 on just the GPU alone. Perhaps one day lol
1070ti owner here. Even on 1440p i'm still fine. I turn down some settings and that's it. No way in hell do i spend 500 bucks for a 10% improvement over my currently 5 year old card that cost me 300 bucks back then. I'll ride this thing until the gpu dies.
@@domsch1302 I’m running a 1440p monitor as well and you’re right, it’s manageable, but it’s by no means luxury anymore. We’ll just have to see how things change over the coming years.
I have a 1070 also, I run at 1440p and the games I struggle with are more unoptimized beta titles like Escape from tarkov. I won't upgrade until I get can get a solid upgrade for under 400.
I have a RTX 2070 Super that I’m selling for pretty low, I bit the bullet and got a 3080 12GB earlier this month. The Super was fine overall, just struggled in some instances with some newer games but I could play most of the games I have at solid settings at 1440p. Now it’s a bit overkill but now I can play at 165fps for my games
As for the "But why?" cause AMD are coming out with the 6950XT with higher memory speeds and increased core clocks, Nvidia are scared of AMD taking the perf crown.
@@phuzz00 that is not what their investors want. More expensive GPUs mean more revenue, which translates, roughly, to better stock prices. That is what the investors want.
I got my current card, the gtx 1080, when it came out. I can’t speak for everyone else, but personally, I tend to buy the top of the line products at the time, cause I know it’ll be 6-10 years before I upgrade. My 1080 is still satisfactory. I don’t need an upgrade, which is fortunate cause I couldn’t get one if I wanted to. But for my buying practices of only upgrading once every 7 years, it makes sense to buy the top of the line now, so 7 years from now it’s still acceptable. That being said, I would be satisfied with a 3080. But I ain’t paying no scalpers so I’m just gunna have to wait till the silicon shortage is less so, then maybe I’ll actually need an upgrade.
I'm sort of similar, I prefer to game at 1440p and my second hand 1070 ti still allows me to do that, and I won't be going near ANY new card, probably ever again at this rate. The last time I had a rig that pulled close to a kilowatt was GTX 280 in tri SLI, and that thing really did heat the room, and now they are back at it again..
@@RachoTLH north of 1k for it, while he can also sell his 1080 for like 400$ right now. Puts the card actually in a decent spot price wise for a big upgrade
The timing of slap in the face is so fitting on what happened yesterday. This is not looking good how GPU requires so much power in the future with little improvement. But this card is most likely for improving the flexing rather than performance.
Yeah Nvidia had to throw as much energy as they could in the 3090ti in order to get a small performance boost over the 3090 just so they could sell it as the new most powerful GPU. This product is ridiculous.
From a gaming perspective, at this point it's not the GPUs that need to get more powerful (OK for 8K maybe, but really who needs that), it's the game devs, or moreover the engine devs that should be producing more efficient code.
Agreed. I've observed this trend through the last decade. It seems like ever since computers became more powerful than potatoes, developers have been lazy about optimizing. There are games from a decade ago that look great. Hmm...
When I built my previous rig I got 2x1080's and a 4K monitor, I'm now using a 3090 and a 2k 144hz monitor and to be honest I can't tell the difference in picture quality.
I think that a large part of what drives the demand for these GPUs is VR. Even this 3090 Ti wouldn't have enough horsepower to run a flight simulator in 100% resolution with anti-aliasing (let alone supersampling), and get true 90 FPS without reprojection. For a few years there we were at a spot where the GPU hardware was ahead of, or at least on par with, the display hardware. But the insane pixel count of VR has flipped that equation around. Hopefully we will see software advances in eye-tracking and foveated rendering to reduce the ever-increasing demands of VR headsets.
As a PC part consumer (building my own PCs since the early 2000s), I do notice power draw. I have made CPU purchase decisions based on wattage, and I am now doing the same for GPUs. I just won't buy a part if it draws too much power for my taste.
The heat of 600+W is no joke. It's plenty to heat a normal room in the dead of winter on its own, but if you're like me where you have to run AC in the summer, you're going to need an extra 2000+ BTUs of cooling to offset that extra heat. I definitely pay attention to this, and design my system around a cap of about 300W peak gaming load. This is easy with a GTX 1080, but I'll have to think a bit harder for my next GPU, and probably do some substantial underclocking.
@@NeilMortimer Electric heaters are exactly 100% efficient since they're converting electricity to heat, but air conditioners are more than 100% efficient since they're only moving heat.
@@NeilMortimer you're 100% correct. I have a 450watt 3080 water cooled. That coupled with one hundred and change watts from the CPU will make me open my room windows during 40 degree or less days mid Winter. I have ac in my house but I have a portable ac just for my PC room otherwise the rest would end up at 64 degrees.
My guess: Nvidia is trying to get a bigger slice of the cake as long as the RTX 3090 is still around the same price as the 3090 Ti. As hardly anyone buys the 3090 either way, the few that will are probably going to think "ugh, I may aswell just spend the extra $100-200 for the Ti version". Then Nvidia gets likely around 250-500$ of extra income..
Feeling really great having bought a gtx1080 all these years ago... besides dlss I'm not missing anything, and fsr is there to save me when I (rarely) need it.
We've already been trained, through years of scalping, to expect to pay a lot more for GPU - and we've shown as a collective - we will pay it. The genie is out of the bottle and it can't be put back in. nVidia has a license to over charge from now on and we handed it to them.
Frankly it would have made business sense for them to raise the MSRP price in the first place. They would have caught a lot of flak; but I'd rather Nvidia have double the sales price than scalpers taking the money. More revenue means more R&D. Instead we got nothing for the higher prices.
I dunno, all those mining cards gonna hit the second hand market with energy prices going through the roof. Lot of people going to be happy to settle with used over new.
@@ryanj610 If Nvidia double the MSRP, then the scalper would do their thing and sell at double the current scalper's price as well. You really think raising MSRP is going to stop scalpers? Nah it's just gonna give an even bigger beatdown to regular consumers.
Were we really being trained? Maybe GPUs were underpriced for years. I'm not saying it's one way or another. Likely a combination of both. But I remembered being appalled at myself when I paid 380USD for a new 1070 when they launched. But after being with it as long as I was, and the value I got from it. I realized paying 900 for 4 years of decent gaming wasn't so outrageous. This kind of assessment is very personal and vastly different person to person. I think it's time we start thinking that people really value our hobby. That they love it so much they'll put that much money into it.
def not server gpu power draw, server gpus usually consume less than their gaming counterparts for a simple, reason, power draw is important on server environments, and most server tasks scale well with like 200 gpus, so, better have multiple of the slighghtly lower clocked cards, that's why the server version of the 3090 ti consumes 300w
I really appreciate that LTT went out of their way to avoid hyping this GPU release up because it's such an awful value in a market saturated by awful MSRPs... Thank you so much for this coverage
Looks like I have been completely priced out of PC gaming. Just going to use my 1060 as long as possible and buy a seriesX. Consoles offer a compelling value now with PC parts being the way they are.
Not certain if you actually “wrote” this or not, Anthony, but I genuinely appreciate your honesty and even some of the “self ownership/responsibility” around possibly pushing the desire for people wanting the “big boy” and likely more than what they truly need. Thank you for that.
The people who buy this card don't care about that, in fact they will be happy about it, because they will just sell the 3090Ti and get the next fastest, it doesn't matter how much it costs, or if it's just a tiny bit faster.
Yeah, they just don't care anymore. They know at least a couple suckers are gonna pay, and they know that the smarter folks are waiting for the next generation
@@kelvinhbo I might've done just that, but I can't be bothered to rebuild my rig and loop just to do it again in September. Besides my 3090 already has a higher power limit and the VRAM is overclocked very close to the stock speed here.
Was on amazon for some shopping and an 3070Ti popped out being sold by amazon at $812 and I couldn't believe it. Then I looked up that this card MSRP is $599! It has been so long since we had GPUs at MSRP that a $200 above MSRP is now seems like a steal. Unless your work or job depends on a new GPU, I would recommend you wait on getting a new GPU. Prices will continue to fall down this year due to - easing of Chip shortage (expected by summer) - tariff canceled for some stuff coming from China, - Intel GPUs to be released this year (if Intel keeps up its release schedule), - scalpers needing to offload unsold cards , - 40xx series cards (new GPU architecture which have potential to be what the 30xx series was to the 20xx series)
I got a 3080 at msrp. For 700 that craps on the 3090ti in terms of frames per dollar. We’ll see if the 40 series can outperform in terms of frames per dollar but if not then it’s not worth it.
@@nupagagyi Not necessarily, if the recent pandemic forced many people to upgrade within the 20 and 30 series cards, it's likely that a large portion of the customer base will not be as willing to part ways with their money quite yet, as their current systems are still fairly recent and are good for another couple of years. Probably the only ones going for high spec cards this season are extreme enthusiasts and potentially the few people who waited out the gpu crisis and skipped the 20 and 30 series altogether. We'll see, but I wouldn't assume that people are as willing to throw their fortune at new graphics cards as they were a year ago.
I have a 3090 and I don't think it'll be even worth it for me to go up to the 4000 series. Yes, it's rumoured to be twice the performance, but do you even need that with 24gb of VRAM?
Which is the reason the 50,60 and 70 line exist. The ones who are willing to pay for this level of hardware are the ones that enable your low cost options to also exist
@@dennisp8520 Having a hard time understanding how the 70 line counts as a low cost option for a consumer market. There are very few uses for more power than that for 80% of buyers.
I cant believe none of the AMD GPU's are on that top 10 steam list when some of them give a really good bang for buck compared to some of those cards on there.
nvidia is like intel before ryzen for the masses. Almost all laptops or prebuild pc's have a nvidia gpu because probably bigger margin and ongoing contracts. You see that bullshit exactly on the number of ryzen powered laptops over the last couple years. They were clearly the better chips in every aspect
@@MHWGamer the 6900xt is a crap miner but DAMN it its not a 3090 but better. its MADE for 4k. then below that the 3080 and below that the 6800xt. below that the 3070ti below that the 6800.. its a Perfect product stack with AMD's software based FXSR it means that in time the 6900xt will get better and better as its not hardware locked like DLSS
Availability was/is a huge problem for Radeon cards. I work at Best Buy and as awful as our 3000 series supply has been since launch, our 6000 series supply has been worse. For every Radeon card I have seen come in for customer pick up, I have seen three or four of their Nvidia counterparts come in.
This will be ... the biggest joke of a product in a few years The best example of excess, overkill, overpriced behemoth The future of gaming graphics will be small and lean
I think GPU performance itself has reached a point of diminishing returns, at the point where higher performance will not bring much benefit to the average consumer considering the price. I think Nvidia has reached the point that Intel recently did with 11th gen and 14nm++++++++++.
Yeah that's the problem. Node shrinks just aren't doing what they used to for efficiency gains. Physics is a bitch and if we want to keep increasing performance for more intense games with more intense effects, we're going to have to accept that all of the easy gains have long since been gotten and we're going to need to make some uncomfortable choices about power consumption or graphical fidelity.
Nope, they're just making GPUs for AI and not for gaming. FP16 and FP8 performance gains are massive and that's what matters for AI/ML applications which is their biggest market as of now.
Well not true. I think next generations GPUs will allow the average consumer to comfortably play on 4K, which even this card doesnt allow you to. As long as the current flagship cant deliver 60 fps (or preferably 100-144 fps) in 4K to match monitors, then there is a gap in the market. However Nvidia only cares about promoting its FPS in marketing, not wattage draw or heat - which 90 % of the buyers wont be able to figure out. AMD has a way more efficient GPU, and they seem to be on a good track with their next gen in terms of fps/power and hopefully pricing. Just stay cool for a few weeks/months and you will be able to buy this GPU in a smaller packaging for half the money ;)
@@veda9151 been rocking 1000w since 2012, for the marginal price increase and better performance, just makes sense. Going forward I'll probably move to 1200-1500
That whole "people pay more for our cards because RayTracing" from Nvidia is just plain wrong. They pay more, because they don't have any alternative. RT is a nice feature, but not required and most gamers just want to game and don't look for specific technical features. But as long as people are willing to pay, they will sell.
I paid more for RTX, and look to run it whenever possible. It massively improves immersion overall. I'll give you I rarely play anything competitive online.
@@williampinnock2256 I have no choice but to buy Nvida I need both Cuda cores for machine learning and Nvenc for transcoding videos Fuck this monopoly, I wish we get a third party, some chinese company would be nice
@@stitchfinger7678 I don't think he's referring to gaming so much as he's referring to the machine learning hardware scene. If you run ML-centered workflows, NVIDIA is pretty much all there is, even if it is oftentimes wrapped up in a cloud vendor subscription.
They really shit on the people who bought the 3090ti. Brought out the worlds most powerful card at a very high price, knowing round the corner they had the 40 series ready to be on sale within 5 months that absolutely decimated the 3090ti. It’s a disgrace. They didnt even bother optimising the 3090ti, it could perform far better than it does. You can tell by the fact that in spec it’s on a whole other level to the 3080 and 3080ti but performs barely any better. It’s nonsense.
Nvidia treats its customers like morons when it falsely compares this to the Titan RTX rather than the vanilla 3090. If you buy the 3090 Ti for gaming, you're proving them right.
What's the average percentage difference in performance?? I must of missed it in the video. I got a 3090 by pure luck at microcenter and I honestly almost didn't buy because of the price but it was all they had. Honestly worth it because it was right before the chip shortage. I got a msi oc 3090 for $1700.
@@shortyorc121 5 - 10% better on average . Enough to show up on a graph but honestly Ive tested myself and I can’t tell any difference in pc hardware performance until the difference is closer to 20%
There are people who have money, lots of money who don't care if it's 2k. They aren't morons. NVIDIA is trying to tap into that market. And I'm personally fine with that. It's better that the card doesn't have a 40% boost in performance for 2k because that would have been really shit if it did. If the main flagship cards remain on low price like 4080 for £660, then I don't care what prices the other cards are.
As someone who read the investor brief, I was *upset* (putting it mildly) at how they were spinning the GPU pricing. Maybe for the big institutions and shareholders who hold other companies don't get it and are the target audience. But it doesn't take a genius to figure out that all of that spin is just fluff to make it seem like the stock itself is capable of infinite growth.
@I suck at being bad once again... then don't buy it? If they don't sell any then the price will drop. Merch sales for UA-camrs aren't really about the product, it's about supporting the channel.
I brought the 3090 at MSRP and it saved me so much money for training deep learning models because my alternative is to pay a minimum of $2-$3 per hour for a cloud GPU instance with 16GB+ memory and they are generally 50% slower than my 3090. The $1500 price tag for 3090 is totally worthy. The 3090ti, on the other hand, sounds like a cash grab. 48GB memory would make more sense.
It is ridiculous that those cards cost as much as my PC I ve built in 2019 .
it's crazy, it's the reason why I haven;t wanted to upgrade my 2060, I can't justify spending 1500 on a 2080 when that's what my whole build cost in 2019
@@robcarr9968 Street price for a 2080ti is about 900 right now. I just picked up a FE for $870.
to be fair, im fine with things like this so long the 3080 is at 700-800, i mean, "3090 ti", aka titan ampere ti
Can you imagine that a few years ago, a "mid-range" GPU like the Radeon 7800 was available for $180? Those were great years.
@@pspublic13 right?? The Nvidia 10 series was the peak. The value of a 1050ti was unmatched
2% better productivity, 7-9% better fps, draws 100W more watts and 500$ more. WoRtH EvErY pEnNy
Dare I ask what the total Wh will be and how much it will also cost in electricity bills? The insult will just be too much.
Edit: Y’all I know they’re being sarcastic. That was a rhetorical question. I know exactly how power hungry this card would be, especially with my use cases. It’s totally worth every penny. (That’s sarcasm, ya babies.)
@@glass.hammer i think he is being sarcastic.
Prepare the memes.
Planet destroying video cards...
I'm preparing for the streamers that start popping breakers and starting fires in their walls because they don't know watt they are doing.
"Recommend a kilowatt" Words I never thought I'd hear in a GPU review.
Wow
Get ready for the 2k PSU for the 4000 series later in the year.
@@namantherockstar spam commenter might even be a bot report him for spam anyways
I have sapphire rx6900XT toxic, which has 400w TDP and almost a year old. 🤣 450w doesnt seem so outrageous.
@@Soutar3DG 3kw consumer power supplies when
It's so sad to hear that the 4000 series is probably going to be more and more power hungry... I was so blown away by the 10 series when it launched and most of the offerings were significantly faster than the 900 equivalent AND sipped way less power while doing it. I wanna see that kind of generational improvement again someday.
time to buy intell and amd
expect to go over 1k wats and I wouldn't blink at at 2k or 5w power supply :D don't blame nvidia blame miners that use these to push the chips to the edge of their limits to mine coins and prove they can keep it up :D lol because doing that raises the bar to perform :D
@@AndyOO6 man, no.
Yes, it’s possible to reach 1000W, the problem is that the rise in wattage is directly proportional to the temperature, if it reached a 1000W, the TDP would be 1000W, and it isn’t that easy to cool a thing like that.
A 5000W power supply is almost impossible to happen because of cooling, stability and size.
And don’t forget that GPUs need to go into laptops too.
It’s almost irrealistic to say that it’ll reach 1000w, although it’s possible, it’s really hard to reach it, and the card would need more time to release, since nVidia would need to wait for new power supplies that can actually carry it (since a 1000W GPU would be used with a CPU like the I9, R9 or Threadripper Pro.
And maybe they would need to wait for new cases to arrive, because cooling a 1000W GPU would need at the very least 3 to 5 slots, and that’s a fucking TON of slots.
@@Tigerfish. almost impossible to something like that to happen. 3090Ti is a fucking beast, it’s not easy to make something this cheap and as powerful as something this powerful.
I think they just got a bigger memo from their shareholders who like the idea of selling GPUs for thousands of dollars per unit.
Actually.. as being a shareholder of Nvidia-- the stocks actually were WORSE lol.
That doesn't really matter if you'll only sell a couple of 100, while they could sell thousands if they dropped the price by 500. Cost per card has dropped since release of this generation.
@@Xfade81 to sell more they first would need to be able to make more
@@namantherockstar Stay in school kid.
At least in Europe the card does not sell out instantly, like all the other FE cards. The price seems to be actually too high, even for scalpers...
Another highly overpriced GPU that's a few percents better than the previous one. Exactly what we needed!
Goofy NVIDIA, taking them a ton of months just to make a GPU that's slightly more powerful
Ahhhh Yes 7% more performance for 20% more power
I would get a 6900 xt for close performance and cheaper price tag.
nvidia doing what nvidia does best.. "is our product lineup confusing and overbloated yet? nope, lets release another card"
@@emilienbindelle overclock
It's hard to imagine that only a few years after LTT was asking "Is any GPU worth $1200?" we'd have people defending asking $2200 for a CONSUMER GPU.
I will never defend anything over MSRP, and even MSRP (like in this case) is _really fucking arguable_
Inflation will only accelerate from here. 3 years from now you'll be paying 4-5000 for a flagship graphics card.
Well I'm not buying one :)
@@Axiomatic75 this has nothing to do with inflation
also inflation is a bullshit copout 99% of the time anyway.
@@Axiomatic75 That's not inflation, that's artificial price increase. Totally different things. One is normal, the other is not.
As I've said elsewhere, I bought into the idea that the prices in the last 18 months were beyond the control of the manufacturers. Then Nvidia posted insane revenues for 2021. Going forward, I'll be driving my hardware into the ground before I upgrade and then I will spend as little as I can to make the games I play work. Nvidia has bent me over the table for the last time.
Lol 😂
Well said
Yep! and like I keep saying as well. Used hardware is a fantastic idea! I for one can't remember the last time I purchased something new. With the one exception being the hard drive and power supply. Mice, speakers, cables, ram, even MOBO's are generally perfectly fine used. You do also need to be fine with stuff from up to 2018. But then how much has been available up to recently again anyway?
It sounds nice on paper to say you can do 4k video and blah blah blah. Outside of doing that for movies and commercials. Games will look fine at less than 4k. I doubt many truely use techniques to scale up to 4k.
Yeah these greed mongers are pi$$ing me off...
Here come the nvidia shills..i can hear the "reeees" coming from a mile away
We literally don't know how good we had it pre 2019.....
You now know how bad the third world had it xD
@@JonatasAdoM honestly lol
2000-2008 was even better...(that's also including for the 3rd world, you could buy a 2 year old flagship card for US$20-30 and run all the latest games with no issues)
@@Lancia444 You're high
True, looking back when I thought paying $500 for a 5700xt was crazy, like why would you pay so much just for gaming? Ahh, the good times
Props to Anthony and the entire LTT staff for this huge F-U to Nvidia, this is getting ridiculous
Right
what do you mean GETTING ridiculous ?
Agreed.
NVDIA doesn't give a fuck though. They just want them dollars
@@psycronizer it's BEEN rediculous.
I like the fact that GPUs have been getting more and more efficient ever since like GTX 600 series (compared to GTX 400 and GTX 200) up till GTX 10 series and now we are going backwards to making GPUs even more hungry for watts. I hope AMD does better than Nvidia with the efficiency.
Well I mean they’re still more efficient than ever, they’re also just more power hungry than ever
@@memethief4113 Yeah, this is being done more because of competition than anything. The nice thing is it forces AIB's to improve cooling, so if you want an efficient card you can just undervolt it back down.
This is the culmination of nvidia's laziness. The 200, 400 and 500 series cards were like that because nvidia didn't want to design more efficient architecture and found that cranking more amps into the silicon for more fps was a viable strategy. Were seeing a repeat of that history now.
I don't really know all the power draws of all the AMD cards of this gen, but I hardly see how they could do worse in the power requirements.
AMD: pheeeu, thanks to Nvidia we now have room to play with power, nobody is even going to notice if our cards go power hungry, as long as they are not drawing more power than theirs.
Nvidia deserves a swift kick in the nuts for trying to normalize scalper prices, insult and rip their customer base.
Well said.
Scalper prices lol
The fact is you can't get them at MSRP most of the time anyway. The reason scalpers exist is because MSRP has been too damn low for so long.
I'm no fan of Nvida, AMD, or Intel but when all of their products are more expensive than usual then consider that it's not some scheme.
That being said I wonder why they keep going with high end chips. From my limited knowledge that means their defective chip rate will be higher at the factory.
Before you say miners, I'm pretty sure 4 $500 cards will be as good or better than a $2000 card only considering performance, not even their wattage which would make the former even more promising.
@@deoxal7947 What the heck are you on about?
@xionliing sh Crypto mining guys, ML devs, and grpahics designers should be blaming us gamers. We don't get to blame because our use doesn't make anything at all.
@@deoxal7947 Miners are scum because cryptocurrency sucks. But the others I understand
6:58 I'm sorry to burst your bubble LTT but it was me who snagged that card from spain and it was a bog standard scam, he's been doing it for months with burner accounts all at £700
Did you get your money back?
@@EdwardWB97 I did eventually but My money got tied up for about a month, caused huge cash flow problems that meant I had to borrow.
@@pixels_per_inch I'll do you one better, a stream. Hopefully friday with any luck.
@@Peterscraps I'm going to assume they are somehow able to pull out those funds and then probably delete the ebay account instantaneously?
@@pixels_per_inch an NFT to be fair.
They will continue raising prices until people stop buying. I really hope Intel can shake things up.
i dont even care how bad the intel cards are so long as they sell well priced mid tier cards
Ofc they won't be any different. They're a corporation too, they work for money, plus they're building a new US factory that's an *investment* that needs to be returned on. Bye-bye decent pricing. 😞
@@em0_tion oh im sure they'll charge just as much, but not for the first few generations. they gotta break into the market somehow after all.
A BIG part of the reason for these high prices is because of crypto mining, not because consumers are paying these ridiculous prices... and those miners make a lot of money with those cards and aren't really hurt by these prices... that's the problem.
@@MickayG That's the only reason why you would need such a powerful card. They aren't for gamers. I don't believe there's gonna be a game that will fully utilize it for awhile.
Remember the times when you could get a top of the line graphics card for 500$ and a dual GPU "monster" for not even double that?
Well, you can keep those old GPUs and not buy anything else, they're only bad if you conpare it to the modern ones
It’s a catch 22. Games and graphics have advanced quickly and so have the tech in the Gpu. Prices match and current world events add to that cost.
Remember when you could get a back pack for less than $250 ?
@@issamkholoud2009 u can get one very easily for under 250, just not a good one.
Remember the times when you could get a 64kb RAM IBM compatible PC for just 1200$ and an insane 40MB hard drive for just a half that?
Just for point of comparison, the window air conditioner that keeps my play space at a chilly 60F year round, only pulls between 200 and 500 watts of power. And it keeps a big as game room ice cold. 500 watts of GPU power for F1 is insane.
This is the harshest review, yet most deserved by Nvidia. I appreciate the honesty.
They need to start including tiny nuclear reactors on the cards.
fuck no, i want a coal plant on mine.
Shhhh...the cards are already green. Don't need them to Hulk out. May piss off AMD and end up doing something Red Hulk related. Then Intel will do something Blue Hulk. And we just have this Hulk battle of GPUs. Why the fuck is this not an ad? I am a genius!
The power of the sun yadda yadda
@@Mhytron coal or oil dude. the sun doesnt shine all day Keepo
Why tiny? LETS GET A FULL SIZE NUCLEAR REACTOR
I feel like we need to be more focused on efficiency than fps. There's no reason we need to draw 450 watts from the gpu alone.
Almost all the efficiency gains come from lithography shrinks. Some are architectural but it's not a lot. Once you can't make the transistors any smaller, power consumption *has* to go up, in order for performance to go up by any noticeable amount.
@@CrimsonEclipse5 We've reached plateau for silicon haven't we? Guess it's time to figure out better architecture? Or maybe go back to analog?
Moore's law is dead and we're encroaching on the limits of efficiency.
exactly. I'm not going to upgrade my GPU until there's a model released that's efficient, with low power consumption, like the GTX 1000 series was super power efficient, but it's like they stopped caring after that generation.
@@CrimsonEclipse5 very true. But I just feel there are ways in which performance gains can be had without just increasing the size of the straw the gpu gets to suck on. Probably ways that may not have been thought of yet. But even so, I really don't think anyone NEEDS to game at 8K anyway. 4K on larger monitors or tvs I understand. If they can just keep the performance where it is and work on increasing efficiency I would prefer that over exponential growth in my power bill. If Nvidia worked as hard on efficiency as they do at beating AMD I'm sure they could come up with something.
When the GPU costs more than the rest of the PC combined there's a problem.
Not really. Companies are learning from scalpers that most people are willing to pay $2k+ for a new graphics card. So why not just under cut the scalpers and sell it at that price from the start?
@@telengardforever7783 because the scalpers will then ask for 4k.
@@ADobbin1 That's not how it works. There is an absolute limit, and $2k is approaching that limit. So the most a scalper could sell a $2k card for would be about $2200 -- which isn't much of a profit after taxes.
@Grant Todd yeah right the rtx series is for workstations.
@@telengardforever7783 many many 3090s were selling at 3600-3800, so no, 2k is not approaching the limit, based on market its closer to 5k
Nice to see the future of gaming/productivity.
What do we need? Two extra solar panels? A diesel generator? An industrial contract with the power grid companies ?! Maybe they are implementing an extra power package for all those nVidia 4xxx series users ?!
I'm glad the price for energy is currently dropping .. isn't it?
Energy pricing hasn't really changed where I live. But, it would be smart to invest around 5k in a solar array just in general for your entertainment electronics.
nice comment^^ exactly my thoughts
It sounds like 1000w PSU gonna be the standard from now on where we used to be like 750 is more than what ya need
GPUs with internal nuclear generators maybe?
So to add to this, a standard North american home electrical circuit will be on a 15 amp breaker. 15 amps x 120 VAC = 1800 Watts. At the point where our PCs start drawing 1500 watts we will probably need to start changing the wiring in our homes. And how far away from 1500watts are we? I'm guessing 2 years or less, Ryzen 7 is supposed to have a huge performance boost over Ryzen 5, and even the AMD 7xxx series is being reported to have huge performance gains over previous gen.
Might need to wire a 30 amp circuit so you can run your PC and an air conditioner at the same time, a 3090ti would probably heat a well insulated house.
When your GPU draws more power than most computers, there’s a problem
the year is 2030, the high end GPUs has became it's own stand alone computing unit outside of the pc case.
just wait till next gen hopper is supposed to use 700 watts ( this is the data center gpus) while lovelace (4090) around 450 - 600 watts.
RDNA3 (7970) is expected to be around 375 - 450 watts.
Hasnt this been the case for something like 15+ years now?
Or did you mean this card alone draws more than another high performance computer including its GPU?
@@ninus3d that’s what I was thinking,there was even a gpu that had its own power brick built in
pretty much this, my PC with a 3070 and a 5950x, 32gb ram does not draw as much as a 3090ti.
I think we need a "Mid-grade Appreciation" video to help remind people that mid range cards are good enough for most people.
Not if you're gaming on 4k they arent.
covid be like :11% inflation since 2019
Seriously! As someone that's new to the hobby and just built their first computer last year, I had one hell of a time trying to figure out which card to get. It seems like every build and every advice website nowadays just uses/advertises the latest and greatest card; I have no clue what the bare minimum is nor do I have any idea of what constitutes "good enough" for most games available right now.
@@spyretto i think you completely missed the part where he said "mid grade"
gaming at 4k is not mid-grade
@@spyretto 4k isn't mid grade for most people
I was waiting in line at a Microcenter in hopes to get a graphics card. I was there for a few hours so I started talking to the guys in front of me. When we got around to talking about what cards we wanted, one guy said he was going for the 3090. So I asked what kind of professional work he did and without missing a beat he said he wanted it to play Call of Duty Warzone. I died inside.
That's actually hilarious. Imagine if he said fortnite haha
It still gets some of the highest fps in Warzone compared to other cards. Get off your high horse.
@@avgvstvs96 i think you are missing the point 😂
Yeah, admittedly kind of guilty of that myself. Been considering the 3090 or maybe one of the upcoming 40 series because with some games I'm noticing my still fairly potent 1080 Ti (which I've had for 3 years at least now, maybe 4? Yeah, whatever card I get next needs to be as long of a haul as well) struggling to keep up.
Though I also play quite a few VR games (and seen performance issues in a few of them too) and have branched into a little Unity and Blender work, so maybe I'll actually have a use for the extra power unlike that guy.
Well, don't judge people for buying what they can afford and willing to pay for whatever reason. It's their money.
I remember when the flagship GPU used to cost $500 a little over a decade ago. Now it's 4x the cost and only marginally better than the previous flagship for a third more.
the flagship gpus back then had more than 5x LESS performance than even a 3080. thats the cost of advancement. you cant expect to get 5x better shit for the same price as a 500-700 series gtx graphics card. thats poor mans thinking..
@@robynd6138 So 1080ti was a poor mans card? Its performance rocks still to this day.
I bought a strix 1070 ti 4 years ago..... it should be about a 150 to 200 dollar card because of age. Could of sold it for 100 more of what i originally bought it for. I WANT SO BADLY DO BUILD A SYSTEM AGAIN..... I can but i like my high end gear and that high end gear is going to cost me a god d-amn leg and an arm to do so.
@@robynd6138 Not just the cost of advancement but also of economic inflation and Nvidia is still largely unchallenged as the king of GPUs, AMD is getting there so hopefully prices will lower once they duke it out
@@robynd6138 Back then? Define it, please. Back in 2005?
A 3080 doesn't even perform 2x better than a 1080 as said by userbenchmarks. Feel free to check it yourself.
I have a 3090, the guy that sold them ( re-seller ) told me that he sold a bunch of Founders Edition here, people do actually like the BFG! I personally, with the energy crisis that some countries are experiencing, I'm hoping that tech giants will focus more on energy efficiency rather than just more performance
rtx 3070 FE is best
Like apple
If they do want to keep the Europe market, they'll need to work on efficiency.
Energy bills here (Belgium for exemple) are becoming insane.
I (and many) will deffinitively check energy efficiency on the 4000 and 5000 products.
@@mrdurtydeedz4277 exactly 😭😖
The correct approach will be finding ways to generate electricity from renewable sources. Then, competition should take care of also creating efficiency, but that always comes after creating power. Is like not selling plastic bags when I am not the one dumping them in the ocean. The root of the problem is deep down, true solutions must come from there.
The sad part is that the 3090 Ti will likely still sell out, giving Nvidia little reason to decrease the price along with the lessened sanctions on China.
we gotta coin nft
I don't think so, 3090s are all over local ads at less than MSRP, Founders cards aren't selling at MSRP, and NewEgg/amazon has them in stock all day long so they definitely aren't moving
Nothing sad about it, just don't be a broke btch
I mean why do you sanction china if you need their stuff
You like getting fked huh?
Sanctioning them, consequently buying stuff from them at a higher price
Such a smart move by MURICA politicians
@@cryingfreemanenjoyer5897 spending $2000+ on a card you don't need is exactly *how* you go broke. lol cope harder...
I got my 2080 Ti just before they released the 30 series, and I’m really glad I made my commitment. I was lucky enough to get it for a discount and haven’t been disappointed with it nor do I think I will be for a long time.
Same here. I recently put a water block on it (which was discounted). It hurt for a moment when the 3070 came available, but then the prices went up again so I was happy I got it when I still could.
I finally upgraded my gtx 780 to a rtx 2080 a couple years ago for my 4k gaming setup, and am now looking more into pc stuff because I need a cpu upgrade. The difference from the 2080 to the 3080 hurts my soul please give me some of your strength.
A bit frustrating we didn’t get this kind of scathing review on the 3090, 3080ti and 6900xt but I’m glad LTT is finally coming around. It’s also super refreshing to hear that it sounds like Anthony regrets the whole 8k bs that Nvidia tried to sell the 3090 as and paid influencers to promote. As always anthony is the goat.
But it is capable of 8K gaming in some games (mainly with DLSS). Of course it's mostly just for the marketing, but fact is that it can do it.
@@thenonexistinghero i feel it's just hype fluff and buzzwords lol
Well as far as the 6900XT goes it has a much lower price point (MSRP was 1k) and a much lower power consumption (most cards are in the 300-330W when pushed pedal to the metal and only the XTXH versions hit higher). So, was it expensive ? Yeah ! Hell I have one lol (XFX Merc one, bought for 1150 euros). This however is just completely stupid, and it was already stupid when the 3080 launched.
3080 ti is actually worth the money they're asking.
@@thenonexistinghero what ? you got shares in team green ?
Really seems like they're reaching for ANY improvements at all from Ampere by throwing much more power at it, even though the 3090 itself was really maxing out what Ampere could do. I really don't think the extra performance is worth that much more power consumption.
If that's the case, can we extrapolate that the Ada series are having similar problems, given their power requirements? I would hope efficiency would be something nvidia starts to worry about at some point.
@@danielharvison7510 no we cannot
At this point Ampere's going to be seen as 2021's Fermi
@@danielharvison7510 High power requirements don't always mean that efficiency is low, imo GPUs having massive power requirements are fine for those who want them.... Only if the performance increase you get from that is justifiably massive. The 3090 Ti doesn't justify its higher power requirement for such a measly increase in performance, that's the problem I have.
Yeah this is just dumb shit, just settle for waiting another couple of years and grab a 50 series.
I'm starting to think upgrading is a mistake. Power draw DOES matter.. I'd much rather have a high-end card at around the 200Watt range, that seems more reasonable.
Do not care about more cores at this stage. Power efficiency. Please.
This, dont want to have a dedicated nuclear reactor just for my gpu
However there is no denying 30 series cards are more efficient than previous generations . I get almost identical performace from My 3060ti vs 2080Ti but with considerably less power draw.
It matters a lot. Especially in a world where energy is getting more expensive (here in the Netherlands 45 cents kWh on average) and scarce. In a world in climate crisis we should pay more attention to power draw.
@@chexlemeneux8790 i am heavily doubting that claim when the 3090 draws the power consumption of an entire computer.
@@anakinlowground5515 He wasnt talking about the 3090. The lower in 30s dont use anywhere near the ammount of power the 90 does. 3060 ti is around 200 watts.
I think I'm moving into more specialized hardware teritory with the current rising kwh prices. Makes more sense to game on a switch / my old playstation 4 / playstation 5 considering their power draw. Apple definitely got the memo with the M1 chips.
They should really focus on getting the existing cards more available
They should, but there's more profit in pumping out highest tier cards
But where is the money on that?
that would likely be much more difficult than one would suspect
@@indeepjable watch these cards get sell out and get scalped. It might not be the same story around the world but in the west people are still paying thousands of dollars for top tier cards
@@naamadossantossilva4736 Selling more cards equals more money? If demand is not met, you're losing money. I mean, it really is economics 101.
Time for a 5 slot 4090 Ti with two of the new connectors adapted to a direct wall power plug!
GPU have their own PSU
Literally just needs a power connector at the IO end that plugs straight into the wall LMAO!
4090 TI with 240V AC input, at least you can grill your steak on it.
Dipping into the 3dfx IP; the Voodoo5 6000 had an external PSU.
@@haariger_wookie5646 Being an electrician helps, but with an addidtional 32A constant power draw x gamers,you may need a new powergrid for your city^^
Imagine paying over 2 grand for a single GPU and hardly getting over 60FPS in your games… dear Lord, we’ve definitely reached diminishing performance returns.
I mean sensible settings are a thing, not defending the card but I got a 3080 for 4k gaming and play everything 60-120fps depending on what it is.But for how much this card costs it should be 2x the 3080 not like +20%
The sad thing is that since they can't actually improve the efficiency and die they just bump up the power by changing the power limits and a few tweaks to the cores so that they don't actually burn up.
They got the memo alright... the shortage situation showed them exactly how much people are willing to pay, and this time they are happy being the scalpers themselves.
if only people waited for a bit instead of bending over for scalpers. oh well they voted with their wallets
@@pikkyuukyuun4741 a bit = 2 and a half years
@@pikkyuukyuun4741 I maintain that gaming is an addiction, and as such, it is pretty easy to justify any kind of lunacy we commit. Be it "I am not going out drinking so in fact I am saving money". "Hey, number of hours per $ spent is super cheap!". We just enable our addiction and find ways to justify it all the while, the dealers keep testing the limits of how far we are willing to go, and again and again, we keep falling for it.
Very good point, did not know how to exactly express this idea but yeah. They realized how much people were willing to pay, and then they screwed their costumers over.
@@tesmat1243 Yeah, but if you just wanted to play games, iGPUs have been kinda decent and cheap for the last couple of years, not great but good enough to enjoy some games and if you absolutely had to have a GPU then laptops haven't been scalped too badly either, if you're desperate there are other options out there aside from paying $2,000+ for a super overpriced high-end GPU.
Best “review” ever thank you for not hyping the fact that nvidia cares nothing about it’s widest customer base
Now introducing the 4070. The same performance as the 3090ti for less than half the price!
Could you imagine?
But it actually needs even more power
Edit: well actually 1k is still expensive as fuck, I should have done the maths 😂
Oh God that may be the plan to make people think a 4070 is worth $1k
For reference, the 980 Ti, 1080 Ti, and 2080 Ti all have TDPs of about 250 watts. At 450 watts, the 3090 Ti is a mini heater for your office.
That's bananas. I'm actually looking forward to Intel's offerings. I've been an Nvidia user for over a decade now but Nvidia seriously needs a wake up call.
I only went with Nivida but when I got my gaming PC Nvidia card was so high I got XFX RX570 and I happy with that card but price on jump over a grand for little bit but IDK I very happy with AMD card in my AMD rig It hurt me to say Nivida is setting prices on a lot there cards highs when prices have drop My next card will be with AMD Nivida lost me in few years when I upgrade my system
Same, and I'm saying that as a person who went to nvidia because 3dfx went under long ago. The whole industry needs a shake-up if I had to be honest.
It's ridiculous pricing, but we were already accepting 1080 at $600 and 1080ti at $700 from AIBs back then, so that is the bananas part. Unless we collectively turn our nose at these high prices, $2k+ for a consumer top-end and $1k+ for a tier 1 card will be the norm.
Nvidia is making it that way. Only AMD cards make any kind of sense for their prices. Which even those are still high. I hope these new Intel GPUs shake things up. For now tho, I'm glad I got a 5700xt and I'll be getting a 7000 series Radeon when those drop. AMD has just been better value for years now.
the whole industry does. CPUs are in the same boat, but less so, to the point where clownish price hikes there are completely overshadowed by beyond-parody price hikes in GPU land. If intel is successful with their attempt to price-snipe the consumer market here, I hope to god someone else does the same with CPUs. Who knows, in the future, the most popular systems may switch from 1060s and cheap intel CPUs to mid range intel GPUs and Nvidia entry level CPUs. Team cyan role reversal.
i miss the days where technological innovation was seen as making PCs smaller, cheaper and more efficient (the RPi 4 before shortages for example). but i guess if people still buy this bs, then team red, blue and green will all continue to focus on those who pay the most.
the laptop market has made leaps and bounds in recent years. Jarrod did a comparison showing a certain rtx 3060 laptop variant matching performance of the desktop one (while plugged in ofc).
@@lord_nin0 the rtx 3060 is kinda wack tho as seen by the absolutely massive difference between a 3060 and its ti variant $50 difference earns a nearly 50+fps increase in all scenarios that is insanity just shows you how bad the 3060 really is
They've took a bit off Apple's book.
It's so sad how we've strayed away from the quote
"There's a basic principle about consumer electronics: it gets more powerful all the time and it gets cheaper all the time."
@@theuprising2216 i use the Asus G17 3060 model and i agree, running MC in Linux with lagless shaders is enough to make the fans spin up in "silent" mode. its not the most scientific study out there but i would have expected a modern day card to handle a game over 10 years old.
@@JonatasAdoM Actually the mocked Apple’s book. It gets more powerful and more power efficient all the time. This are getting ridiculously power hungry while Apple is laser focused on efficiency without compromising performance.
Nvidia should be optimizing their cards for more power efficiency. This performance gap is totally unnecessary.
@@pixels_per_inch That would be cool. Will that be on the 4000 cards or 5000s??
@@pixels_per_inch I thought lovelace was on tsmc?
They are incredible efficient, but mostly running OCed to get the last bit of performance.
But same as with Intel CPUs - if you run them according to the actual stock specs they are rather efficient.
@@ABaumstumpf And that's been the big issue with Intel, if you run them at base spec they are super efficient but their performance per watt is also very mediocre - and you don't please potential customers with "mediocre" 😅
@@ABaumstumpf ehhh I don’t find them impressive now that we’ve seen ARM scaled up
don't forget that these 3*** chips were developed circa 2019. they are selling old tech at inflated prices because they know the consumer has no choice. hopefully intel can bring some price competition to the market.
considering their only targeting hte mid to low end, it's very likely their going for the general consumer rather than epeen and crypto douche market. Thank christ. Once those fads die off, they'll be holding the reliable market, and hopefully they'll actually appreciate it. If not, nvidia and AMD will still have to come back to it, and compete for price then.
consumer has a choice, nvidia simply knows people are very smart and they’ll buy it
consumers do have a choice though: to not buy into the hypetrain.
@@Archonsx libertarian Andy. No bitches?
the problem with newer chips is that the transistors themselves are reaching a point where they cant physically go any smaller so instead of developing new technology, they ramp up the power consumption for an artificial performance gain.
Well they're still shrinking and developing new technology as well - but in the meantime while we wait for Lovelace & RDNA3, we get this thing because Nvidia couldn't handle a 6950XT topping the charts.
Not happening with Apple tho
Not sure artificial is the right word..
@@SnifferSock Agreed, artificial is not the best word, they are sacrificing efficiency and dumping power into the architecture. That shows they reach the limit of its current tech and are now scrapping the bottom of the barrel to get any extra performance. Going by power draw, they are deep into the land of diminishing returns per watt. That doesn't bode well for their 4000 series card if their tech is not a major leap and given the existing fab processes have not changed that much or gone to a smaller size tech. The safe prediction is that the 4000 series will be a minor improvement over the 3000 series. Not good when they are scalping this hard.
Snapdragon 8 gen1 is made on 4nm technology.
I hope a new technology comes out to make graphics processing way more efficient. 500 watts is too much and those ray tracing marks were 70 fps. That's embarrassing
This is clearly a "the RX 6900 XT is outperforming the 3090 in 2 games. We have to stop it!!!" moment from Nvidia.
So, like when they released the 780Ti because the R9 290X was slightly better than the GTX Titan in a few games?
@@kire929 yep.
More like they wanted to compete with the 6950 xt
@@Galf506 Anything Nvidia ever does is to counter anything AMD ever does or not if they don't have to because of a lack of competition, always and will always stay that way.
I think it has more to do with making money and being able to sell a new "best gpu ever" for more money upfront where they couldn't just raise the price of the 3090 base
man i got my new 3090ti for 1500 and i’ve never been happier with my purchase
I just picked one up on amazon for 1326
I want to see a new 75w card that is twice as powerful as the last one. Or just power efficiency in general. More fps for more power seems pointless at this point, and certainly doesn't help with a looming energy crisis.
I guess we can hold some hope for desktop APUs with RDNA2? Extrapolating from Steamdeck's performance and the development of FSR and RSR, the next generation APU might finally be a solid budget/SFF choice.
75 watt era is dead next gen 4060 is expected to be 180 - 250 watt and Rdna3 7700 is expected at 180 - 230
That card cost 3K in Sweden. Crazy times, imagine the disappointment when the 4xxx series with perhaps a 4070 beating it at 800$ MSRP releases.
i always kinda hate these comparisons cause it doesnt take into account the time. sure you could say someone buying this card is a bad buy when a 4070 might beat it at 800 msrp, but when will that be out, and how do you quantify the time between now and when ever next gen is out.
The issue is if this thing sells at this rate, Nvidia will keep on hiking prices. Eventually "entry" level GPU's like the "4070" will cost over a grand. The whole point of Anthony's rant here is that we need to stop paying so much for these cards. The only reason prices are going to where they are now is because people pay them, and if you think Nvidia won't hike all the prices on their next generation of gpu's I'd hope you're right but doubt you heavily.
@@CocaineCobain Yes. It's pretty sad how everyone complains about high prices... Then go pay for it anyway. I don't know who is buying these cards, and how they're affording them. I'll be using this 1070 until the day it dies.
Jag väntar på 4000 serie :)
you could argue that looking out for the next gens is stupid in itself. most ppl buy a GPU ether in a cycle of years or when they deem it necessary. there is always something better on the horizon. sure buying "right befor" the next launch isnt the best idea, but the biggest cards just come out late into the market. I just like to look back at previous gens to determine if the current cards are worth it. thats why i still rock a 980 (though i should have bought a 3080 at launch tbh). its not fair to compare a top of the line, fully enabled chip against anything but a another top of the line chip. the lower cards are always build towards a healthier price to performance ratio.
Crazy how I bought my RX 570 8gb in early 2020 for $110. And it still runs everything fine. The entire PC I built at that time only cost me $450. If I wanted to built it again, it would be $700-$1100 depending on what's available.
Basically same except mine is an rx 580
I'm building mine from scraps off the side of the road. +$200AU for an RX 580 (lucky me) and ~+55AU for a new PSU (as the ancient one i was using died) to think that if I bought everything brand new it would cost me probably $700AU+.
It is rediculous.
Literally i think my dual monitor setup, with a ryzen 7 5700g and 1060 6gb cost me around 1k and i got lucky i got the 1060 for 200
@@carlosmonroy3441 Yeah if I include my monitors and peripherals it's more like $750 but the PC alone I built for just under $450. Really actually less than that cause I had just bought a xbox one x from a pawn shop for $150 and sold it for $350, so it nearly covered the PC alone 😂 I didn't buy the xbox to intentionally scalp btw, It was my main system but as soon as I got the PC, I didn't use it anymore at all, and just so happened to be when you couldn't get consoles AT ALL in the store or online. So I made out like a bandit 🙌
@@tacticalmattress fr for me well i got lucky my lady got me the Mobo, psu, ssd and ram which honestly what else could i ask for right
I honestly think most of the people buying the 3090 and 3090 Ti are using them for video editing and 3d modeling, they just happen to also game on that PC so they show up in the Steam hardware survey. There are also the types of people that post their specs even when nobody is asking, they are also buying the 3090 Ti cards. I actually just bought a regular 3090 FTW3 on Ebay for about 25% off MSRP purely for gaming but I also am one of those guys that wants to install every graphics mod, 4k texture packs, shaders, and set everything as high as possible. If it doesn't look like real life its not good enough LOL!
I'm in that use case. I stream, edit, and rely on Nvidia broadcast to filter my audio. Encoding is a big deal.
I just like to think back to the launch of the 30 series of cards, when we were all crazy excited with how much performance these cards had for your buck. Before the scalpers, before the shortage in materials etc... This shit was THE BOMB and Nividia was felt like they hit a home run.
Back to reality and this has somehow been one of the worst times to upgrade, the most expensive cost to performance curve in recent times, and a major PR hit to Nividia. Talk about a 180.
no kidding, the whole 30X0 lineup is staggeringly good at msrp
Crypto killed any chance of me building my first pc 😭
@@ahmedo7875 not Crypto. Scalpers. remember they released in Oct the Mining craze didn't kick off till jan 2021. before that it was ALL scalping due to the shortage.
Miners from feb 2021 made it worse.
@@Xyphren idk dude people were GPU mining for years before the 30X0 series was released. savvy miners saw the TH/s/W for these cards and jumped on them immediately. If you could get a pallet of them for anywhere near MSRP you were making BANK. They had a
Should have thrown a 3080 in there to show how dumb the 3090 is, let alone the 3090ti. People were disappointed with the 20-30% increase the 2080ti had over the 2080, for 50% higher price. I never understood why people seemed more fine with the 10% increase the 3090 got over the 3080 for double the price!
the 3080 -> 3090 price hike is easier to swallow because you get 24GB of GDDR6X, which is expensive stuff. You aren't getting your money's worth in games but in applications that use that VRAM, it's a much closer deal.
@@CrimsonEclipse5 Just get a gpu dedicated for that, most of them are at MSRP and in stock since they can't game or mine.
@@igameidoresearchtoo6511 you must not do editing/ gaming then.
3090 is ok if you also do ML stuff or rendering. 3090Ti is ultimately stupidity 😅
@@igameidoresearchtoo6511 24gig GPU's are actually very expensive if you want cuda accelerated ML. In that specific scenario, and only there, the RTX 3090 can be considered worth the cost. I am off course also considering that this is part of a job or education.
Actually this is genius, they simultaneously get rid of the old cards to clear their warehouses set price to 2K before new gen RTX 4090 launch. And now they can set prices for their new gen so much higher. Evil but Genius
Nvidia has already done this before; they stacked the 2000 series cards pricing on top of their old gen 1000 series card, and they got away with it until AMD actually brought out new gen 5000 series cards, which they then began to phase out the over-priced rip-off that was the 2060 and 2070 (which were essentially mid-range 60/60ti cards being launched at $329 and $499), and introduced Super cards to compete with performance to price that made a bit more sense.
I mean, is that really sustainable? How many people can really buy cards above 2000$?
@@thescorpionruler5882 Scalpers, miners...
This is brilliant up until they hemorrhage customers
@@thescorpionruler5882 Na, nobody is gonna be mining with these things. It won't be very profitable with that power draw, and on top of that the cash cow for mining (Ethereum) won't be minable at all pretty soon, nobody is gonna turn a profit if they try. Much better off buying the actually good cards in the series, like the 3070 & 3080, even the regular 3090 isn't worth it right now, a year ago sure, but not now.
The only people buying this are the ones that are overcompensating for something, the people so rich or dumb they buy "the best" no matter what it costs, or scalpers hoping to sell to those guys lol..
no one talks about 250 usd for a backpack or 70 usd for a screwdriver? i mean..... what?
I get the feeling that the point of this card is not to sell it's to make what they bring out next seem much better despite also still being overpriced, perhaps even more so and still being power hungry.
Love how comfortable Anthony is getting in front of a camera! Always love his segments!
It’s been really cool to see the change over time, I liked the guy from the first video I saw him in, and the first time I heard him speak I knew, this guy is a genius.
Honestly I like him the most - he has a no fuss narrative approach. Straight to the point. Thats what I like
I sometimes watch LTT videos. I ALWAYS watch Anthony videos.
He's definitely the most beloved member of Linus Tech.
Oh man the 6 minute mark when he says "slap in the face" was a perfect spot for the new Will Smith meme.
Can't wait for the RTX 4070 and RX 4700XT to clobber the 3090TI in 6-8 months.
@@handlemonium true this was unnecessary
I built a computer at the end of 2021 to celebrate finally finishing college, I was afraid that prices were not going to get better and I decided that I was ok paying to have it then rather than wait till some unknown point to save money because I had no clue how far off that might have been. I did buy a 3080 rather than a 3090 though, the gap between the two did not seem to warrant the price difference.
I’d love to build a new PC considering I’m still down here with a 1070, but out of principle it seems absurd to spend more than $1000 on just the GPU alone. Perhaps one day lol
1070ti owner here. Even on 1440p i'm still fine. I turn down some settings and that's it. No way in hell do i spend 500 bucks for a 10% improvement over my currently 5 year old card that cost me 300 bucks back then. I'll ride this thing until the gpu dies.
@@domsch1302 I’m running a 1440p monitor as well and you’re right, it’s manageable, but it’s by no means luxury anymore. We’ll just have to see how things change over the coming years.
I have a 1070 also, I run at 1440p and the games I struggle with are more unoptimized beta titles like Escape from tarkov. I won't upgrade until I get can get a solid upgrade for under 400.
@@domsch1302 only 10% improvement huh? Coping
I have a RTX 2070 Super that I’m selling for pretty low, I bit the bullet and got a 3080 12GB earlier this month. The Super was fine overall, just struggled in some instances with some newer games but I could play most of the games I have at solid settings at 1440p. Now it’s a bit overkill but now I can play at 165fps for my games
As for the "But why?" cause AMD are coming out with the 6950XT with higher memory speeds and increased core clocks, Nvidia are scared of AMD taking the perf crown.
I wish they'd try and compete on price instead.
@@phuzz00 that is not what their investors want. More expensive GPUs mean more revenue, which translates, roughly, to better stock prices. That is what the investors want.
@@phuzz00 same
They just dgaf about people
@@phuzz00 but if we stop buying from them, they'll realize what's wrong with them and they'll decrease the prices
Won't happen until AMD gets their ray tracing game together.
Even though prices are coming down and tariffs are being scrapped,
I do not see high-end GPU's become affordable anytime soon.
eh give it like, a decade or atleast 5 years it might become more reasonably cheap but im unsure
@@namantherockstar fak no
@@namantherockstar ik she really didn’t say that
i suggest you might think they will never come down again with that bloaded mrsp
@@indeepjable nah I don't think so ;(
with that cooler should've went with the amd solution of shipping it with a watercooler like the R9 Fury x
I got my current card, the gtx 1080, when it came out. I can’t speak for everyone else, but personally, I tend to buy the top of the line products at the time, cause I know it’ll be 6-10 years before I upgrade. My 1080 is still satisfactory. I don’t need an upgrade, which is fortunate cause I couldn’t get one if I wanted to. But for my buying practices of only upgrading once every 7 years, it makes sense to buy the top of the line now, so 7 years from now it’s still acceptable. That being said, I would be satisfied with a 3080. But I ain’t paying no scalpers so I’m just gunna have to wait till the silicon shortage is less so, then maybe I’ll actually need an upgrade.
You can buy adjusted MSRP from Best Buy right now. No scalpers involved, but still a high price. Bit north of 1k for 3080's.
Had the same mindset. Built my pc with a 1080ti and was planning to go for a 3080ti. Thankfully my card is still more than strong enough I can wait.
I'm sort of similar, I prefer to game at 1440p and my second hand 1070 ti still allows me to do that, and I won't be going near ANY new card, probably ever again at this rate. The last time I had a rig that pulled close to a kilowatt was GTX 280 in tri SLI, and that thing really did heat the room, and now they are back at it again..
@@RachoTLH north of 1k for it, while he can also sell his 1080 for like 400$ right now. Puts the card actually in a decent spot price wise for a big upgrade
I'm sitting here with a 1080 Ti and I'm doin' just fine. My only desire is to try Radeon's next line of cards to see how AMD compares.
The timing of slap in the face is so fitting on what happened yesterday. This is not looking good how GPU requires so much power in the future with little improvement. But this card is most likely for improving the flexing rather than performance.
So what you're saying is that nvidia gpu's are now in the same category as iPhones? Status symbols?
Yeah Nvidia had to throw as much energy as they could in the 3090ti in order to get a small performance boost over the 3090 just so they could sell it as the new most powerful GPU.
This product is ridiculous.
can't believe I had to scroll down this far to see a comment about the slap lol
@@danielharvison7510 no they're talking about the flexing the card will be doing from being so damn heavy
@@Rov-Nihil Getting a GPU brace is a negligible investment given the cost of the GPU.
From a gaming perspective, at this point it's not the GPUs that need to get more powerful (OK for 8K maybe, but really who needs that), it's the game devs, or moreover the engine devs that should be producing more efficient code.
Agreed. I've observed this trend through the last decade. It seems like ever since computers became more powerful than potatoes, developers have been lazy about optimizing. There are games from a decade ago that look great. Hmm...
“Just optimise the code” how rediculous, have you worked on a code base larger than hello world?
When I built my previous rig I got 2x1080's and a 4K monitor, I'm now using a 3090 and a 2k 144hz monitor and to be honest I can't tell the difference in picture quality.
@@AiphosGaming - Not to question your experience, but have you read comments on Epic's bloated source code for Unreal?
@@AiphosGaming "the food doesnt taste good", do you know how to cook other than microwaving something?
I think that a large part of what drives the demand for these GPUs is VR. Even this 3090 Ti wouldn't have enough horsepower to run a flight simulator in 100% resolution with anti-aliasing (let alone supersampling), and get true 90 FPS without reprojection. For a few years there we were at a spot where the GPU hardware was ahead of, or at least on par with, the display hardware. But the insane pixel count of VR has flipped that equation around. Hopefully we will see software advances in eye-tracking and foveated rendering to reduce the ever-increasing demands of VR headsets.
As a PC part consumer (building my own PCs since the early 2000s), I do notice power draw. I have made CPU purchase decisions based on wattage, and I am now doing the same for GPUs. I just won't buy a part if it draws too much power for my taste.
The heat of 600+W is no joke. It's plenty to heat a normal room in the dead of winter on its own, but if you're like me where you have to run AC in the summer, you're going to need an extra 2000+ BTUs of cooling to offset that extra heat. I definitely pay attention to this, and design my system around a cap of about 300W peak gaming load. This is easy with a GTX 1080, but I'll have to think a bit harder for my next GPU, and probably do some substantial underclocking.
@@NeilMortimer Either do that, or you need a massive cooling system upgrade.
@@anakinlowground5515 You mean a better air conditioner?
@@NeilMortimer Electric heaters are exactly 100% efficient since they're converting electricity to heat, but air conditioners are more than 100% efficient since they're only moving heat.
@@NeilMortimer you're 100% correct.
I have a 450watt 3080 water cooled. That coupled with one hundred and change watts from the CPU will make me open my room windows during 40 degree or less days mid Winter. I have ac in my house but I have a portable ac just for my PC room otherwise the rest would end up at 64 degrees.
My guess: Nvidia is trying to get a bigger slice of the cake as long as the RTX 3090 is still around the same price as the 3090 Ti. As hardly anyone buys the 3090 either way, the few that will are probably going to think "ugh, I may aswell just spend the extra $100-200 for the Ti version". Then Nvidia gets likely around 250-500$ of extra income..
With every video he does, Anthony just keeps getting better and better.
keep on D riding!
Feeling really great having bought a gtx1080 all these years ago... besides dlss I'm not missing anything, and fsr is there to save me when I (rarely) need it.
We've already been trained, through years of scalping, to expect to pay a lot more for GPU - and we've shown as a collective - we will pay it. The genie is out of the bottle and it can't be put back in. nVidia has a license to over charge from now on and we handed it to them.
Frankly it would have made business sense for them to raise the MSRP price in the first place. They would have caught a lot of flak; but I'd rather Nvidia have double the sales price than scalpers taking the money. More revenue means more R&D. Instead we got nothing for the higher prices.
I dunno, all those mining cards gonna hit the second hand market with energy prices going through the roof. Lot of people going to be happy to settle with used over new.
@@ryanj610 If Nvidia double the MSRP, then the scalper would do their thing and sell at double the current scalper's price as well. You really think raising MSRP is going to stop scalpers? Nah it's just gonna give an even bigger beatdown to regular consumers.
Were we really being trained? Maybe GPUs were underpriced for years. I'm not saying it's one way or another. Likely a combination of both. But I remembered being appalled at myself when I paid 380USD for a new 1070 when they launched. But after being with it as long as I was, and the value I got from it. I realized paying 900 for 4 years of decent gaming wasn't so outrageous. This kind of assessment is very personal and vastly different person to person. I think it's time we start thinking that people really value our hobby. That they love it so much they'll put that much money into it.
Damn, that consumer GPU pulling in a server GPU amount of watts
def not server gpu power draw, server gpus usually consume less than their gaming counterparts for a simple, reason, power draw is important on server environments, and most server tasks scale well with like 200 gpus, so, better have multiple of the slighghtly lower clocked cards, that's why the server version of the 3090 ti consumes 300w
4:35 server gpus draw less, its even in the video
@@greenumbrellacorp5744 They now have a 700 watt server card set to launch.
I really appreciate that LTT went out of their way to avoid hyping this GPU release up because it's such an awful value in a market saturated by awful MSRPs...
Thank you so much for this coverage
Looks like I have been completely priced out of PC gaming. Just going to use my 1060 as long as possible and buy a seriesX. Consoles offer a compelling value now with PC parts being the way they are.
Not certain if you actually “wrote” this or not, Anthony, but I genuinely appreciate your honesty and even some of the “self ownership/responsibility” around possibly pushing the desire for people wanting the “big boy” and likely more than what they truly need. Thank you for that.
Imagine paying 2k for a halo product 6 months before its obsolete
We don't call those fanboys " Nvidiots " for nothing.
The people who buy this card don't care about that, in fact they will be happy about it, because they will just sell the 3090Ti and get the next fastest, it doesn't matter how much it costs, or if it's just a tiny bit faster.
Yeah, they just don't care anymore. They know at least a couple suckers are gonna pay, and they know that the smarter folks are waiting for the next generation
Partially due to the demand, and now there's some more supply available is going to be a big reason for buying too
@@kelvinhbo I might've done just that, but I can't be bothered to rebuild my rig and loop just to do it again in September. Besides my 3090 already has a higher power limit and the VRAM is overclocked very close to the stock speed here.
ummm $70 USD for a f****** screw driver?!?!?! WTF are you guys thinking
Was on amazon for some shopping and an 3070Ti popped out being sold by amazon at $812 and I couldn't believe it. Then I looked up that this card MSRP is $599! It has been so long since we had GPUs at MSRP that a $200 above MSRP is now seems like a steal.
Unless your work or job depends on a new GPU, I would recommend you wait on getting a new GPU. Prices will continue to fall down this year due to
- easing of Chip shortage (expected by summer)
- tariff canceled for some stuff coming from China,
- Intel GPUs to be released this year (if Intel keeps up its release schedule),
- scalpers needing to offload unsold cards ,
- 40xx series cards (new GPU architecture which have potential to be what the 30xx series was to the 20xx series)
Can't wait to see Nvidia's pricing get boo'd on the 40 series, it's going to be ridiculous.
These youtubers complain for 10 seconds and still follow all these companies rules. The 5 mins of booing wont do anything.
And still, everyone's gonna buy their overpriced cards, because people wanna play.
I got a 3080 at msrp. For 700 that craps on the 3090ti in terms of frames per dollar. We’ll see if the 40 series can outperform in terms of frames per dollar but if not then it’s not worth it.
@@nupagagyi Not necessarily, if the recent pandemic forced many people to upgrade within the 20 and 30 series cards, it's likely that a large portion of the customer base will not be as willing to part ways with their money quite yet, as their current systems are still fairly recent and are good for another couple of years. Probably the only ones going for high spec cards this season are extreme enthusiasts and potentially the few people who waited out the gpu crisis and skipped the 20 and 30 series altogether. We'll see, but I wouldn't assume that people are as willing to throw their fortune at new graphics cards as they were a year ago.
I have a 3090 and I don't think it'll be even worth it for me to go up to the 4000 series. Yes, it's rumoured to be twice the performance, but do you even need that with 24gb of VRAM?
My entire computer costs less than both of these! This is so above my needs as a consumer it's absolutely wild
Which is the reason the 50,60 and 70 line exist. The ones who are willing to pay for this level of hardware are the ones that enable your low cost options to also exist
@@dennisp8520 no they're not, since they make an extremely small percentage of the market
@@dennisp8520 Having a hard time understanding how the 70 line counts as a low cost option for a consumer market. There are very few uses for more power than that for 80% of buyers.
I cant believe none of the AMD GPU's are on that top 10 steam list when some of them give a really good bang for buck compared to some of those cards on there.
nvidia is like intel before ryzen for the masses.
Almost all laptops or prebuild pc's have a nvidia gpu because probably bigger margin and ongoing contracts.
You see that bullshit exactly on the number of ryzen powered laptops over the last couple years. They were clearly the better chips in every aspect
@@MHWGamer the 6900xt is a crap miner but DAMN it its not a 3090 but better. its MADE for 4k. then below that the 3080 and below that the 6800xt. below that the 3070ti below that the 6800.. its a Perfect product stack with AMD's software based FXSR it means that in time the 6900xt will get better and better as its not hardware locked like DLSS
AMD GPUs were incredibly supply constrained
Availability was/is a huge problem for Radeon cards. I work at Best Buy and as awful as our 3000 series supply has been since launch, our 6000 series supply has been worse. For every Radeon card I have seen come in for customer pick up, I have seen three or four of their Nvidia counterparts come in.
To be fair, its more than possible to get the 3090 ti right now near its MSRP, but the 3090 is also sitting around $2000 for its available cards
I really feel like I can trust what this guy has to say when it comes to pcs. 👌
yes he great
I could buy almost 3 of my entire setups with this one GPU’s price.
mine is 4
I can buy 10. The value of my probook is 200 dollars give or take.
7 here
This will be ... the biggest joke of a product in a few years
The best example of excess, overkill, overpriced behemoth
The future of gaming graphics will be small and lean
GTX 480 comes to mind.
I would love to have a COST OF OWNERSHIP
Example how much does it cost to GAME 2 hours daily. This can make people open their eyes about the price.
I think GPU performance itself has reached a point of diminishing returns, at the point where higher performance will not bring much benefit to the average consumer considering the price. I think Nvidia has reached the point that Intel recently did with 11th gen and 14nm++++++++++.
Yeah that's the problem. Node shrinks just aren't doing what they used to for efficiency gains. Physics is a bitch and if we want to keep increasing performance for more intense games with more intense effects, we're going to have to accept that all of the easy gains have long since been gotten and we're going to need to make some uncomfortable choices about power consumption or graphical fidelity.
@@mikeydude750 I just hope we won't end up having to pair two of those electricity hogs together just to boot a game and heat the house during summer.
Nope, they're just making GPUs for AI and not for gaming. FP16 and FP8 performance gains are massive and that's what matters for AI/ML applications which is their biggest market as of now.
Well not true. I think next generations GPUs will allow the average consumer to comfortably play on 4K, which even this card doesnt allow you to. As long as the current flagship cant deliver 60 fps (or preferably 100-144 fps) in 4K to match monitors, then there is a gap in the market. However Nvidia only cares about promoting its FPS in marketing, not wattage draw or heat - which 90 % of the buyers wont be able to figure out. AMD has a way more efficient GPU, and they seem to be on a good track with their next gen in terms of fps/power and hopefully pricing. Just stay cool for a few weeks/months and you will be able to buy this GPU in a smaller packaging for half the money ;)
I thought I was going completely overkill on my 1000W PSU about 5 years ago. Now I'm worried it soon won't be enough!
Me too. Glad I got a Evga 1000W 80 gold psu in 2014 and it runs like a tank to this day.
I ensured my friend in 2018 that a good 650W would be enough for any single card gaming build.
@@veda9151 Well, you were right :) By the time he'll need to buy a new GPU, his PSU might already have died and/or you'll need a new connector anyway
Still rocking my 450W PSU with a 1650 and an i5. Have an HD Screen and everything is playable just fine.
@@veda9151 been rocking 1000w since 2012, for the marginal price increase and better performance, just makes sense. Going forward I'll probably move to 1200-1500
That whole "people pay more for our cards because RayTracing" from Nvidia is just plain wrong. They pay more, because they don't have any alternative. RT is a nice feature, but not required and most gamers just want to game and don't look for specific technical features.
But as long as people are willing to pay, they will sell.
I paid more for RTX, and look to run it whenever possible. It massively improves immersion overall. I'll give you I rarely play anything competitive online.
@@williampinnock2256 I have no choice but to buy Nvida
I need both Cuda cores for machine learning and Nvenc for transcoding videos
Fuck this monopoly, I wish we get a third party, some chinese company would be nice
@@The0Yapster Its literally not a monopoly
@@stitchfinger7678 Great argument
@@stitchfinger7678 I don't think he's referring to gaming so much as he's referring to the machine learning hardware scene. If you run ML-centered workflows, NVIDIA is pretty much all there is, even if it is oftentimes wrapped up in a cloud vendor subscription.
They really shit on the people who bought the 3090ti. Brought out the worlds most powerful card at a very high price, knowing round the corner they had the 40 series ready to be on sale within 5 months that absolutely decimated the 3090ti. It’s a disgrace. They didnt even bother optimising the 3090ti, it could perform far better than it does. You can tell by the fact that in spec it’s on a whole other level to the 3080 and 3080ti but performs barely any better. It’s nonsense.
Nvidia treats its customers like morons when it falsely compares this to the Titan RTX rather than the vanilla 3090. If you buy the 3090 Ti for gaming, you're proving them right.
If you buy the 3090 Tie at all, it does the same thing.
What's the average percentage difference in performance?? I must of missed it in the video.
I got a 3090 by pure luck at microcenter and I honestly almost didn't buy because of the price but it was all they had. Honestly worth it because it was right before the chip shortage. I got a msi oc 3090 for $1700.
@@shortyorc121 No average was mentioned, but i would say 8-10% across everything, gaming and productivity
@@shortyorc121 5 - 10% better on average . Enough to show up on a graph but honestly Ive tested myself and I can’t tell any difference in pc hardware performance until the difference is closer to 20%
There are people who have money, lots of money who don't care if it's 2k. They aren't morons. NVIDIA is trying to tap into that market. And I'm personally fine with that. It's better that the card doesn't have a 40% boost in performance for 2k because that would have been really shit if it did.
If the main flagship cards remain on low price like 4080 for £660, then I don't care what prices the other cards are.
I was so lucky to get my 3070 at $500 two weeks after launch. Couldn’t be happier.
Last minute luck
As someone who read the investor brief, I was *upset* (putting it mildly) at how they were spinning the GPU pricing. Maybe for the big institutions and shareholders who hold other companies don't get it and are the target audience. But it doesn't take a genius to figure out that all of that spin is just fluff to make it seem like the stock itself is capable of infinite growth.
They're in a position where they don't have enough product to satisfy the currently existing demand, so high pricing is all they can really boast.
can we just take a minute an acknowledge how photogenic anthony has become over the years? my favorite ltt videos
he's the type I'd rather hear than see
4:27 no matter how much i like LTT im not spending $250 USD on a bloody backpack
To be fair you've to be moron to buy anything from LTT store.
Then don't buy it?
Maybe it has a tiny refrigerator that keep your water cold. 😂
@@Karthunk Congrats, you win Most Generic Comment of the day.
@I suck at being bad once again... then don't buy it? If they don't sell any then the price will drop. Merch sales for UA-camrs aren't really about the product, it's about supporting the channel.
"willing to pay 300$ more to replace a GPU" well, we have to because you keep giving most your inventory to miners directly and scalpers indirectly
I brought the 3090 at MSRP and it saved me so much money for training deep learning models because my alternative is to pay a minimum of $2-$3 per hour for a cloud GPU instance with 16GB+ memory and they are generally 50% slower than my 3090. The $1500 price tag for 3090 is totally worthy. The 3090ti, on the other hand, sounds like a cash grab. 48GB memory would make more sense.
Any tips on how to learn this type of thing? I want to use my PC for more than gaming
this is the best review I've seen on LTT. Thanks Anthony!