Your old card won't even run modern games anymore. So you've probably also spent zero hours gaming. Hey, maybe you should wait until hell freezes over; there's more chance of that.
16GB for the 5080 is just criminal imo. It's barely enough right now, and the more features Nvidia introduces the worse it will get, especially combined with how much more demanding games seem to be getting. Sure, you can have enough VRAM if you skip frame gen, ray tracing, DLAA and whatever the new AI gimmick will be, but then what's even the point of going for Nvidia? I'm thinking of waiting for a 5080 version with the new chips that will have more VRAM later on, and if they don't release that, I think I will just go for the XTX, which might be close to $600 at that point and save myself $400+, or one of the new AMD GPUs if they are better.
Dude, 16GB is more than enough for modern gaming. I have a 12GB 4070 and have never encountered an issue with running out of VRAM. Even super demanding UE5 games like Silent Hill 2 Remake, Stalker 2, Black Myth: Wukong, and Hellblade 2 use between 7.5GB and 9.5GB of VRAM at Epic settings running at 4K with DLSS. Only a VERY tiny number of games push past 12GB of VRAM, and those are either unoptimized slop or the devs have retroactively updated their game to have better VRAM management and efficiency. And if 12GB truly isn't enough in a particular game, lowering the texture quality from max to high is all that needs to be done. You don't have to max out every setting in every game you play. So 16GB is definitely fine; by the time a lot of games truly require more than 16GB of VRAM, the 5080 will be very outdated and wouldn't be capable of running games on high settings anyway.
@@03chrisv If you want to use the Nvidia features like ray tracing, DLAA and FG, there are games even now that use more than 16GB of VRAM at 4K, like Cyberpunk for example, and a few others that come close. The same goes for 12GB at 1440p, and it's even worse for 8GB at 1080p. As I said, yes, you can choose to not use all these features, but then there is no point in buying an Nvidia card at the insane prices they ask for. In general, the "just reduce the settings" argument is completely idiotic when you pay $800+ for a current GPU and don't even get what that card was marketed to be able to do. Also, if you want, you can play at 1080p, DLSS Performance, at low settings and never run out of VRAM; that's not a good argument for whether 16GB is enough or not. Hardware Unboxed did the testing for VRAM, go watch that video in case you think I'm making things up about how much VRAM games need now. My advice is to stop doing all these mental gymnastics just to excuse a truly horrible company and their predatory practices, cuz it hurts all gamers, including you.
@@03chrisv I've definitely gone over 16 with modded Skyrim and MS Flight Sim, two games I play a lot of to this day. I'm worried how the next MS Flight Sim is going to run given how badly optimized it seemed in last week's beta; it may even max out a 5090 at the rate they are going. There are some other VR games I've seen go over 16 as well. Sure, these are all specific use cases. But I mean, there are those of us out there that play these games. I know because I'm one of those of us out there.
@@proress I ain't paying that much money for a used GPU. Maybe it's me being paranoid, but I would never trust buying something used; I want the safety of the warranty and a trustworthy shop.
@@kaiichinose9590 Nothing is truly enough right now, and I wouldn't pay 5090 money for it. I like my Hz and my fps. I might not be willing to have to use DLSS so often just because I want ultra everything at 4K with low Hz.
@@RumpleFoldSkin I meant AMD GPUs aren't powerful enough for me, though I do like the 9800X3D; that CPU paired with the 4090 is a machine. I like high frames and fidelity, so high end it is for me. But I also don't play into the whole needing 200 fps for esports games. I like story-driven games; 4K 90 and above is plenty for those types of games, and if I do play an FPS game I easily get well over 120 fps in those types of titles.
There's never been a better time to abandon AAA gaming and PC upgrades, and get yourself a nice new monitor to play your back catalog and older favorites on.
I have 3 PCs: one running an Intel Arc A770 16GB, one running the 3090 Ti FE, and one running the 7900 XT. All three cards do well. The only upcoming card on my radar right now is Battlemage.
@@phoenixrising4995 Many people think the 8800xt will save the gaming market. It won't. AMD is price fixing the market with Nvidia. AMD is not going to just start selling cards with massive price to performance ratio advantages over what Nvidia is selling.
AMD is price fixing the market with Nvidia. I guarantee you AMD is not going to sell the 8800xt at a price to performance ratio for much less than NVidia's offerings. People keep saying the 8800xt will have 7900xtx levels of performance for only $500. Lol, dream on.
I just picked up a 4070 Ti Super for this reason. I have zero faith in the 5000 series, but my 3060 Ti is struggling with 8GB of VRAM on my 1440p monitor. It is gasping for breath, and I just put a piece of duct tape over its mouth and tell it to hush.
Why is no techtuber arguing about the consequences of this obvious price hiking? Pretty sure the EU (is it called a watchdog in English?) is already looking at the lack of competition and the pricing, and at some point they will be sanctioned. Thank you 💙
Having two 3D V-Cache chiplets wouldn't remove the latency penalty: if a game spanned both, even with V-Cache, the inter-chip latency is still there; Infinity Fabric is still massively slower than intra-chip communication. I wish they were both V-Cache, but it's always going to be more performant to stick with one chiplet.
In one way you actually can't blame Nvidia. They can either make gaming GPUs that make them X dollars for every dollar they spend to make them, or they can make that AI crap that makes them maybe 100 times as much for each dollar they spend. It's in their interest to make their gaming GPUs as unattractive as possible. Kinda wonder why they even bother making gaming GPUs anymore. Maybe in case the AI bubble bursts, so they have something to fall back on.
It doesn't matter what they market the 5070 against; if anyone is buying a $500+ card without looking up benchmarks or getting themselves informed, they deserve to make a bad purchase.
12nm and 16nm were in the same family, same as 4N (Ada) and N4P (Blackwell). Point is, the node is not where we will see much improvement; it will mostly be pushing power and memory speed. All SKUs aside from the 5090 only have 4 more SMs than their predecessors.
Is it still actually surprising people how little they are getting each iteration of Nvidia cards? The slap in the face pricing didn't clue you in on this?
The new feature is DLSS 4. They will use it to artificially bump up the performance numbers compared to 40 series just like with DLSS 3 when the 40 series was announced.
Nvidia is intentionally nerfing the 5070-5080 so that people will shell out $2,000+ on a 5090 instead. I wish AMD was competitive at the high end so Nvidia couldn't get away with this crap.
I just went ahead and got a 7900xtx for around $700 on Black Friday. The chance there is a big uplift in the midrange at a decent price is very slim. That's if you can even get them for months when they release.
There is no fine wine in those cards anymore, because only UDNA will be improved with the next architecture... RDNA is gonna be like Vega: forgotten... You AMD fans got scammed in the biggest cash-grab marketing crime in history.
I switched to AMD with a 7800 XT and I am NEVER GOING BACK to Nvidia. With AMD performing excellently with Unreal Engine 5 and Lumen, I am all in on AMD going forward.
AMD performs horribly in UE5, what are you talking about lol. I have a 6950 XT and will never buy AMD again. FSR is completely unusable, and upscaling is required for UE5.
@@bfhandsomeface409 FSR is only OK at 4K in my experience; at lower resolutions it falls off a cliff. You shouldn't need to be upscaling at 1080p, and in many cases at 1440p, if devs did their work as they should.
@@tourmaline07 Therein lies the issue: devs are lazy and don't know how to use UE5. All the old heads are retired, and this new generation of devs is stupid compared to them.
My 3090 died just out of warranty, so I picked up a used one, and that died a few months later. I sent them to a guy I know who has repaired a lot of GPUs for a lot of the companies I do IT for, but he said neither was repairable. Right now I'm on a 7600 XT, but I was planning on getting a 5090. Still saving for it... but I'll be honest, I'm kind of disappointed; I've been buying Nvidia since the Riva TNT and never had a card burn out on me before. I won't be buying another 3090 again though.
@@v.cotoiu3568 Really? I hadn't heard that, although I water-cooled the first one and never saw any high temps on the sensors. Maybe there was heat elsewhere on it though. The second one I left on its factory heatsink.
@@s7r49 The 3090 has 12 VRAM modules on one side and the other 12 on the other side of the board. The PCB in between gets hot no matter how you cool the card.
@@v.cotoiu3568 I had active cooling on the back as well, but yeah, you're probably right. Well, maybe the 4090s will come down when the 5090s come out and some people want to unload them; the 3090 really wasn't cutting it for some of the VR games I play.
The one thing that no one talks about... is the absurd weight they will have... And we will see even more graphics cards breaking, with broken solder joints on the memory modules and the GPU....
Even my 3090 fe weighed too much. It kept coming out of the socket, so much so that I had to waterblock it to convert it into a 2 slot just to have it fit on a riser and have it vertical instead of horizontal. Can't wait till the 4 slot cards come out.
@@foxify52 Wow. My 3070 Ti FE wasn't that bad. But why not use brackets? The first thing I did after installing it in my case was add a GPU support bracket, and I've had zero issues.
@@xenosayain1506 The problem isn't for me; most people don't even know how to install a graphics card, and they think they can play ball with them and end up breaking them... And then you have the noobs who watch videos on YouTube who think they need mods on everything to get 10 more fps in COD and end up ruining everything.
@@starlightHT Mods on COD for 10 fps? I've never heard of that. Can you explain please? 😅 But I see what you mean. Anyone buying a prebuilt wouldn't know, and GPU sag could be a huge issue with that heavy a card.
Perhaps these RTX50xx series cards will have a better DLSS version associated with them. Generational performance uplift may be more to do with software rather than hardware in some launches, especially with the lower and mid tier offerings. Be interesting to see if AMD do the same with their FSR software too, or instead offer more beefy hardware in their low and mid range cards than Nvidia, or a balance of both?
The 5070 Ti seems interesting to me; it's basically a 5080 with 17% fewer CUDA cores, as it uses the same die and features the same bus width and 16GB of VRAM. If it's priced at $799 it'll be an OK card. If the 5080 performs on par with a 4090 as rumors suggest, then the 5070 Ti should be able to outperform a 4080 by 10% or so; that's NOT bad.
5080 definitely not matching 4090, at least not in raster. It's a 70 class die and expecting a 30% boost in performance from Nvidia is highly unlikely.
Last time the 70 class was on par with the old 80, so effectively a 60 class. This time it might be a 50 class, as it isn't even faster than a 4070 Ti Super. I guess you can safely buy a 4000 card on sale. 5000 will just be bad...
I am so glad that I bought an AMD 7900 XTX for €1100 at launch almost 2 years ago instead of an overpriced RTX 4080. Ray tracing wasn't that important to me since I had a GTX 1070 before. If I had waited for the 50 series, it would have been a huge letdown for me with these specs for the RTX 5070, 5070 Ti and 5080, and prices at a minimum of 40-series level.
Bought a 4070 Ti Super with 16GB VRAM for $756 on sale. The card is a blower configuration going for $900 at full price. I should be set to outlast the Trumpaggedon Economy, right guys? 😅
@SD-vp5vo so it is! Just like the 3060Ti I purchased in Sept of 2021, also for $750! Didn't have much choice back then. My 1060 6Gb had rolled over and died!
At this point, they are cutting down every single other GPU to maximize purchases of the 4090 and get their quarterly earnings up for the GPU market, like they did in Q3 this year. It's up $1 billion from Q2 and up $1.2 billion compared to last year.
I bet that the new exclusive feature of the RTX 5000 series will take work away from the CPU to achieve more FPS with DLSS and low resolutions.
Nvidia used to offer good products at a higher price, and rarely, once in a blue moon, a good price AND a good product. But ever since RTX 40 they are offering bad products at higher prices. There's always something pissing you off about their cards: low VRAM, or low VRAM bandwidth/bus, or only a 5% increase gen over gen, or a 90 card that has a fire hazard cable because it's drawing way too much power over a fragile connector.
The entire stack is being built and priced to make the 5090 seem like a great value. When in actuality, it is everything else that is simply a trashy value. No reason to do otherwise when gamers will scoop it all up anyways.
If y'all stop running out buying this over priced tech then they will have to drop the price. But y'all won't lol you do it to yourselves. So they gonna screw you.
The RTX 3090 Ti has more CUDA cores than the 4080, but the 4080 was way more powerful. Let's wait and see, because new tech might be playing a big role again.
Just buy second-hand 😁. Where I live, an RX 6800 can now be bought for $270, for example, and an RTX 4070 is around $400. If you don't want the latest and greatest, the second-hand market is the way to go.
I bought the overpriced but less expensive 4080S, with the faintest hope it would encourage Nvidia to charge lower prices on 50 Series GPUs, or *at least* the low-mid tier cards. This GPU sold well so hopefully like-minded buyers had some sort of impact, even if it's small. Shame they probably won't up the VRAM on their cards though, I'd hoped they would but the 4080S having 16GB instead of 20GB was the biggest red flag they weren't going to do that.
We should at least be getting past the VRAM issue we had with the 40 series. I mean, seriously, the 5080 to 5090 jump is insanity. The 5080 should have 24GB of VRAM, the 5070 16GB, and the 5060 12GB, then put that original 8GB on the 5050. This isn't even mentioning the CUDA core counts; I mean, 10k between the 80 and the 90?? Are they gonna make a 5080 Ti along with a 5080 Ti Super? 😂 I want to buy a 5080, but I'd be hoping it is better than the 4090, and it's seeming like it's honestly going to be minimal improvements except for tech, which is good and all, but this is a generational upgrade; I expect new tech from a driver update.
Then maybe waiting for the RTX 6000 series is a better option, though I may have to delay playing modern games for 2 more years. I play at 4K and I have a GTX 1080 Ti. I mainly play old games, games that were released up to 2016 or 2017. Games after 2016 I can't play at 4K; the GTX 1080 Ti is not enough for that. So if I stick to old games like Batman: Arkham Knight, Dying Light, Battlefield, Doom 2016, Doom Eternal etc., I can wait for two more years. Hmm, I guess I'll wait and see. The RTX 5090 is out of the question, it will be too expensive; I'm eyeing the RTX 5080. If it's better than the RTX 4090 I may buy it. We'll see in time.
I am still hoping some cosmic entity will guide the Jacket Man, and we'll get a 5060 with 10/12GB of VRAM, 20% faster than the 4060 Ti, and costing $350. Amen!
Nvidia realized they gave too much for too little with Ampere. Each successive generation since is one step down with little improvement. They're simply not repeating the mistake of naming it a "4080 12GB".
The 5070 Ti will likely be 4080 Super level of performance. For most people, that will be plenty, even for 4K. The 5070 will easily run 1440p. People just want to complain lol.
@@johnc8327 People are complaining because Nvidia is trying to make consumers pay more for less. They are gimping the hardware of all their graphics cards except the 5090 and still charge absurd prices for them. Obviously, people are going to complain then. The 70-class GPU usually matches or outperforms the previous generation flagship GPU. From the looks of it, the 5070 won't even match the 4080. That's pathetic at its best.
I think they should lower the 5060's VRAM to 6GB and the 5070's to 8GB. The peasants don't need to play at higher than 640p.
Here I'm gaming on a 1060 6GB at 1440p. 😮
The new upscaling and frame generation will do the rest. (sarcasm off)
While we're at it, let's give the 5080 12GB of VRAM and give the 5090 48GB of VRAM for some reason, so people are forced to buy the flagship model or feel some sort of regret for not buying the best card.
@@Aethelbeorn you're gaming some discounted platformers
@@summergamer7650 Just be happy that Nvidia doesn't call that 5080 16GB, 12GB and 8GB (sarcasm off)
The entire lineup is made so people only shovel out money for a 5090. Disgusting.
Apple does this same nonsense. They sell a 800 dollar 60hz phone so the 1300 Pro looks heavenly lmao
So 5090 not worth? 4090 better
@@SLAYERx3D Buy the 5090 and leave the used 4090 market viable for me.
Who people should be more mad at are Western game developers. They make unoptimized games that require a ton of VRAM. Yet this year Asian game studios have put out games that are beautiful to watch, fun to play, and require only 6GB of VRAM. Let that sink in.
@@D4C_LoveTrain1 😂😂
No company is your friend... I repeat: AMD fanboys and Nvidia fanboys...
Why does it matter when we're down at the lowly peasant consumer end? It would be taken seriously if you were a large-scale buyer; in this case, that's who these companies listen to.
I don't want friends. I want the best.
They'll be my "friend" if they want to compete for marketshare and offer better value. At that point, it's just in their best interest as a company to do that, although AMD seems to not understand this at launch.
I will still be getting a 7700 XT because, here, with taxes, a 4070 is unaffordable ($800) and the 4060 is a pile of junk for the price.
This is wrong, AMD is my friend. They gave me frame generation when Nvidia refused. They gave me 24GB of VRAM, and they give me hope for the future that FSR 4 will be incredible. All at a good price.
@@Aethelbeorn overcompensating
These specs look so atrocious. They should just cancel this absolute joke of upcoming shitty gimped GPUs. Nvidia has killed PC gaming.
"Apple-fication"
You can thank AMD for not competing and shooting themselves in the foot at the perfect opportunity, when they had a clear shot at Nvidia. Oh well, I guess AMD fanboys' incessant praise and lack of criticism for AMD's shitty value proposition just screws everyone in the end.
Radeon keeps looking better and better
@@asdfjklo234 Not even Apple is this shitty. I mean, look at their base M4; compare that to what Nvidia has to offer and Apple looks like a saint.
@@dpptd30 Huh? AMD have a better line-up than Nvidia, they just don't compete at the high end.
price to perf leans to AMD, the major downside is just their software package.
No it won't, because it was expected: as long as people keep paying, companies will give less and less.
Nvidia really wants its customers to play at 720p with DLSS, FG, DLAA, etc. With those VRAM specs, again.
Nvidia, the way you're meant to be played.
Might as well get a steamdeck at that point. 😂😂😂
Nvidia has absolutely been planning for this with no worries. Why were they so quick to cancel production of the 40 series? Because those prices are about to look like a good deal.
70 cards slowly turning into budget performance for ‘mid’ range pricing 🤦♂️
If AMD improves FSR, gives more VRAM, and sells at least 10% cheaper, I believe Nvidia will have a hard time. In a parallel universe, I guess.
The RX 7900 to 7700 used 5nm; only the 7600 used 6nm.
FSR needs a hardware-based solution; moving to 3nm won't change that.
@@Naffacakes98 Sorry, didn't know that.
@@kamipride9288 It's coming. FSR 4 will be an AI based upscaler that will utilize those AI accelerators on RDNA 4 and hopefully on RDNA 3 too.
@@kamipride9288 FSR 1-2-3 are not AI based. If they use AI things could be different. All GPUs are capable of processing AI with compatible software since they have thousands of cores to process parallel.
Nvidia learned from Intel... +5% performance increase every generation when no real competition on the market.
The problem is the normies and Rabid fanboys that will just buy Nvidia no matter what. If it still sells, nothing changes.
People buy Nvidia because there are no other options if you want to play current-gen AAA. FSR is unusable, and upscaling is required for UE5.
Then don't buy those games. If devs go bankrupt, they'll think twice about optimization.
Not this time. I could afford a 5090 but I refuse to pay that much.
@@Dempig You are the person this comment is talking about. FSR oftentimes looks just as good with the quality setting at 1440p and 4K, and you literally get better raster performance with AMD compared to Nvidia.
Not only that, but the VRAM discussion is settled and Nvidia is doing crimes.
Ray tracing, instant nvidia win, but then the question still remains "why buy a 4060 when it performs worse, can't use ray tracing and doesn't have enough vram for high quality settings with frame gen"
@@rgbgamingfridge I don't really care about the YouTube fake-outrage cancel culture. If I want to play a game, I'll buy it and the hardware required to play it.
12GB for the 4070 and 16GB for the 5080 is absolutely ridiculous!! They will launch Super versions with more VRAM later, but they want to milk us just a little more first... f*ckersss. Also sad that they are not using a newer and more efficient node like TSMC N3B, but I do get why they keep that capacity for their commercial AI GPUs that cost $100K or more each.
Turing introducing Ray Tracing so you could game at 24FPS... The whole thing was a joke.
.... I just changed my old 1070 for an RX 6700, bought used for $100... for me it's fine...
That's a really good deal. I also have a 1070 (which I won't be changing any time soon, but if I had that opportunity I would've taken it!). 👍
Bruh for 100$?? That’s an amazing deal
I have it almost the same: I changed from a 1070 Ti to a 6700 and I am so happy. Welcome to the AMD team.
Honestly I don't care how good this new AI feature is going to be it still will not make the 50 series worth buying since they're clearly going to be massively overpriced and each tier is going to be massively gimped.
Anybody that still continues to buy this new 50 series at these prices is helping Nvidia kill the PC market for good.
I agree. Unfortunately, fanboys will once again buy their shitty overpriced gimped GPUs. They will find the most pathetic excuses to defend Nvidia for this. They definitely are killing PC gaming.
@@Gamer-q7v There are literally no other options. AMD and Intel cards are unusable.
@@Dempig Intel GPUs are maybe not as stable, but they have gotten massively better with drivers. AMD GPUs tend to offer better price to performance than most Nvidia GPUs and usually have more VRAM. Nvidia GPUs do have their advantages. But to say AMD GPUs are unusable is pathetic.
@@Gamer-q7v FSR makes AMD cards completely worthless for most current-gen games that require upscaling. It's a fact. FSR is so bad it's unusable. I have a 6950 XT and will never buy AMD again. I will gladly pay an extra $100 to not have to deal with FSR ruining games.
@@Dempig It's not just about features such as upscaling and frame gen. Yes, DLSS is superior to FSR. RT performance is substantially better on Nvidia. However, that's not enough to say Nvidia is always better and all AMD GPUs are trash. AMD GPUs usually have more VRAM, and VRAM is becoming increasingly important as the graphics of games continue to improve. Also, Nvidia completely cheaped out on the low-end and midrange RTX 40 Series GPUs. Every GPU below the 4080 is named a tier above what it should be. Nvidia is ripping off gamers in the low-end and midrange segments by using technologies such as DLSS and frame gen to cut corners on the hardware and still charge absurd prices. This is what I can't stand about Nvidia nowadays.
If people are sensible, they will shrug their shoulders and keep the one they already have
Blackmail delivered. I'd say we are not missing much unless developers can work on their optimisation issues and ray tracing efficiency is improved.
I think you mean the 5050 (real name).
Remember to count the number of times they did a stack-slide.
These products can just be called Blackwell 100%, Blackwell 50%, Blackwell 33%, Blackwell 25%.
What's missing is Blackwell 75%.
Good luck with the greed and price fixing crime... I am not going to buy any overpriced graphics card...
I've got an R9 380 (from 2015) installed in my PC and I have spent a total of $0 (zero) during:
- mining craze
- the pandemic
and
- price fixing era...
If tariffs make PC components skyrocket, so be it. I am not going to buy any overpriced products.
I'll wait until healthy market conditions return and the price fixing crime eventually ends...
If that means I have to wait until 2032 - 2033, I say "Let's go... Let's roll on to 2033..."
No need to rush for me...
The very last Nvidia GPU I bought was an XFX GeForce 7600GT back in 2006...
So... In the last 18 years, I gave a total of $0 (zero) to Nvidia...
I used to think like that, but... it's not possible. I did my upgrades 9 years apart, but you have to realise that life is short, so you have to treat yourself to something... Just buy RDNA 5 GPUs and you'll be fine, in terms of price at least. Expectations are too high in terms of resolution, like we have to have 4K. Who decided that? I am happy and content with 1080p 60fps path tracing (nowadays people want 4K 120fps for some reason), and I am willing to buy, let's say, an RTX 4070 or an upcoming RDNA 8700 or 8700 XT, or who knows, if Intel shines in RT I might go for that, as long as I can play at that resolution.
Would a 7900 XTX not interest you? Those seem very reasonably priced atm for what they do right now. That R9 380 won't run modern titles - not even at slideshow pace - because it's flat out missing features (and driver support) now.
Dear friend @@tourmaline07
Unfortunately, even the cheapest RX 7900 XTX is 1076 Euro where I am.
So...
Dear friend @@bmqww223
Hopefully "battlemage" and/or "rdna4" options will be affordable for end users in 2025.
Fingers crossed.
I hope AMD and/or Intel come up with good-value GPU(s).
But as you know, they've abused us so much since 2018 that I've lost my confidence as a customer.
Your old card won't even run modern games anymore. So you probably also spend 0 (zero) hours gaming.
Hey, maybe you should wait until hell freezes over, there is more chance of that.
16GB for the 5080 is just criminal imo. It's barely enough right now, and the more features Nvidia introduces the worse it will get, especially in combination with how much more demanding games seem to be getting. Sure, you can have enough VRAM if you skip frame gen, ray tracing, DLAA and whatever the new AI gimmick will be, but then what's even the point of going for Nvidia? I'm thinking of waiting for a 5080 version with the new chips that will have more VRAM later on, and if they don't release that, I think I'll just go for the XTX, which might be close to $600 at that point, and save myself $400+, or one of the new AMD GPUs if they are better.
Dude, 16GB is more than enough for modern gaming. I have a 12GB 4070 and have never encountered an issue with running out of VRAM. Even super demanding UE5 games like Silent Hill 2 Remake, Stalker 2, Black Myth: Wukong, and Hellblade 2 use between 7.5GB and 9.5GB of VRAM at Epic settings running at 4K with DLSS. Only a VERY tiny number of games push past 12GB of VRAM, and those are either unoptimized slop or the devs have retroactively updated their game to have better VRAM management and efficiency. And if 12GB truly isn't enough in a particular game, lowering the texture quality from max to high is all that needs to be done. You don't have to max out every setting in every game you play. So 16GB is definitely fine; by the time a lot of games truly require more than 16GB of VRAM, the 5080 will be very outdated and wouldn't be capable of running games on high settings anyway.
@@03chrisv If you want to use the Nvidia features like ray tracing, DLAA and FG, there are games even now that use more than 16GB of VRAM at 4K, like Cyberpunk for example, and a few others that come close. The same goes for 12GB at 1440p, and it's even worse for 8GB at 1080p. As I said, yes, you can choose not to use all these features, but then there is no point in buying an Nvidia card at the insane prices they ask for. In general, the "just reduce the settings" argument is completely idiotic when you pay $800+ for a current GPU and don't even get what that card was marketed to be able to do. Also, if you want, you can play at 1080p, DLSS Performance, at low settings and never run out of VRAM; that's not a good argument for whether 16GB is enough or not. Hardware Unboxed did the testing for VRAM, go watch that video in case you think I'm making things up about how much VRAM games need now. My advice is to stop doing all these mental gymnastics just to excuse a truly horrible company and their predatory practices, cuz it hurts all gamers, including you.
@@03chrisv I've definitely gone over 16 with modded Skyrim and MS Flight Sim, two games I play a lot of to this day. I'm worried how the next MS Flight Sim is going to run given how badly optimized it seemed in last week's beta; it may even max out a 5090 at the rate they are going. There are some other VR games I've seen go over 16 as well.
Sure these are all specific use cases. But I mean, there are those of us out there that play these games. I know because I'm one of those of us out there
Buy a used 4090
@@proress I ain't paying that much money for a used GPU. Maybe it's me being paranoid, but I would never trust buying something used; I want the safety of a warranty and a trustworthy shop.
Whelp, I'll be buying a XTX lmao.
Amd better give us a good option, they can definitely step in
@@doublewoofwoof I doubt it
No one should be buying, promoting, or accepting these for review. Anyone who gets one should say "no thank you" and send it back.
Blackwell could be Turing 2.0, but Ada Lovelace sure ain't Pascal 2.0.
If I recall, they did a stack slide which made it appear that way.
Actually, the new 1070 was the 2080. Then they ran out of names for the new 1080, etc.
You know what? Jensen and Company can skate fast and eat grass.
And this is how AMD stays alive. And how AMD garners more support now.
Yeah, they stay alive on their CPUs, but I wouldn't be getting their GPUs. I game at 4K and AMD GPUs are not enough.
@@kaiichinose9590 Nothing is truly enough right now, and I wouldn't pay 5090 money for it. I like my Hz and my fps. I might not be willing to have to use DLSS so often just because ultra everything at 4K would otherwise mean low Hz.
@@RumpleFoldSkin I meant AMD GPUs aren't powerful enough for me, though I do like the 9800X3D; that CPU paired with the 4090 is a machine. I like high frames and fidelity, so high end it is for me. But I also don't play into the whole needing 200 fps for esports games; I like story-driven games. 4K 90 and above is plenty for those types of games, and if I do play an fps game I easily get well over 120 fps in those types of titles.
There's never been a better time to abandon AAA gaming and PC upgrades, and get yourself a nice new monitor to play your back catalog and older favorites on.
I have 3 PCs: one running an Intel Arc A770 16GB, one running the 3090 Ti FE, and one running the 7900 XT. All three cards do well. The only upcoming card on my radar right now is Battlemage.
8800XT is the only card I'm looking at now.
@@phoenixrising4995 Many people think the 8800xt will save the gaming market. It won't.
AMD is price fixing the market with Nvidia.
AMD is not going to just start selling cards with massive price to performance ratio advantages over what Nvidia is selling.
Fingers crossed that the RX 8800 XT won't disappoint; hope FSR 4 will be good with its AI wizardry.
I hope you and your boyfriend are pleasantly surprised :)
Same, the 8800 XT looks more appealing right now.
AMD is price fixing the market with Nvidia.
I guarantee you AMD is not going to sell the 8800 XT at a much better price-to-performance ratio than Nvidia's offerings.
People keep saying the 8800xt will have 7900xtx levels of performance for only $500. Lol, dream on.
We need some effing performance leaks already so I can know whether to just buy a 4070 Super now.
Glad I got the 4070TI Super a few months ago. It should last at least 5 years, hopefully more.
Thanks to the 16GB of VRAM. If it had 12GB, cut that time in half.
I just picked up a 4070 TiS for this reason. I have zero faith in the 5000 series, but my 3060 Ti is struggling with 8GB of VRAM on my 1440p monitor. It is gasping for breath and I just put a piece of duct tape over its mouth and tell it to hush.
@@mikeramos91 I got the 4070 Super OC. Just gonna sell it in a couple of years and buy a better one, that's what I always do.
I'm just looking for AMD's low-end cards, sub-$200, on RDNA 4. If not, I'll stick with my RX 6500 XT for another generation.
Why is no techtuber arguing the consequences of this obvious price hiking? Pretty sure the EU (is it called a watchdog in English?) is already looking at the lack of competition and the pricing, and at some point they will be sanctioned. Thank you 💙
It's hardly a necessity of life!
Nvidia GeForce's best series is the 30 series! Best value! ❤🎉
Having two 3D V-Cache chiplets wouldn't remove the latency penalty; if a game spanned both, even with V-Cache, the inter-chip latency is still there. Infinity Fabric is still massively slower than intra-chip communication. I wish they were both V-Cache, but it's always going to be more performant to stick with one chiplet.
If the value proposition is just slightly better than Lovelace plus some shiny new AI software, they will get away with this.
Whether we like it or not.
In one way you actually can't blame Nvidia. They can either make gaming GPUs that make them X dollars for every dollar they spend to make them, or they can make that AI crap that makes them maybe 100 times as much for each dollar they spend. It's in their interest to make their gaming GPUs as unattractive as possible. Kinda wonder why they even bother making gaming GPUs anymore. Maybe in case the AI bubble bursts, so they have something to fall back on.
My backlog will last well into 6000 series. Maybe they will make a 6060 Ti with 2350 cores by then.🎉
seeing Jensen's face now just causes depression...
Ah, the 70 series used to be such a powerful GPU for the price, but not anymore.
5:47 I bet it's ngreedia FG 2.0
Ordered a 4080 Super last week at $999; we'll see if I regret it in time.
It doesn't make sense what they market the 5070 against. If anyone is buying a $500+ card without looking up benchmarks or getting informed, they deserve to make a bad purchase.
5:28 Correction: Pascal was on TSMC's 16nm node and some Pascal dies were on Samsung's 14nm node. Turing was on TSMC's 12nm node.
12 and 16 were in the same family, same as 4N (Ada) and N4P (Blackwell). The point is the node is not where we will see much improvement; it will mostly be pushing power and memory speed. All SKUs aside from the 5090 only have 4 more SMs than their predecessor.
And this is why I never touch Nvidia
If only the 4070 Super went down in price. I would buy that and forget the 5000 series.
Is it still actually surprising people how little they are getting each iteration of Nvidia cards? The slap in the face pricing didn't clue you in on this?
They only want to sell the 5090... and even then, they won't even make enough to actually avoid scalpers. Dumb
The new feature is DLSS 4. They will use it to artificially bump up the performance numbers compared to 40 series just like with DLSS 3 when the 40 series was announced.
But the 40 series had a VRAM bump, whereas the 50 series won't 🤷🏻♂️
DLSS 4 makes the fps counter in Afterburner show a higher number while being stutterier.
@@phoenixrising4995 How could you possibly know that when it's not out yet?
Nvidia is intentionally nerfing the 5070-5080 so that people will shell out $2,000+ on a 5090 instead. I wish AMD was competitive at the high end so Nvidia couldn't get away with this crap.
I just went ahead and got a 7900xtx for around $700 on Black Friday. The chance there is a big uplift in the midrange at a decent price is very slim. That's if you can even get them for months when they release.
You will regret it. High end AMD cards are kind of pointless with how bad FSR is.
Where did you get it for that price?
There is no fine wine in those cards anymore, because only UDNA will be improved with the next architecture... RDNA is gonna be like Vega... forgotten... You AMD fans got scammed by the biggest cash-grab marketing crime in history.
$700 for a 7900 XTX?? Wow, that's amazing.
@@iuriibakach481 Found it at Amazon, it was on sale for a very short time and is back up now.
I switched to AMD with a 7800 XT and I am NEVER GOING BACK to Nvidia. With AMD performing excellently with Unreal Engine 5 and Lumen, I am all in on AMD going forward.
AMD performs horribly in UE5, what are you talking about lol. I have a 6950 XT and will never buy AMD again. FSR is completely unusable and upscaling is required for UE5.
@@Dempig FSR is not "completely unusable" lol! It works very good at high resolutions imo.
@@bfhandsomeface409 FSR only is OK at 4k in my experience , at lower resolutions it falls off a cliff-face. You shouldn't need to be upscaling at 1080p and in many case at 1440p if devs did their work as they should
@@tourmaline07 Therein lies the issue: devs are lazy and don't know how to use UE5. All the old heads are retired, and this new generation of devs is stupid compared to them.
Why do gamers complain these days? A 2K monitor paired with a used 3090 Ti and you're absolutely golden.
My 3090 died just out of warranty, so I picked up a used one, and that died a few months later. I sent them to a guy I know who has repaired a lot of GPUs for a lot of the companies I do IT for, but he said neither was repairable. Right now I'm on a 7600 XT, but I was planning on getting a 5090. Still saving for it... but I'll be honest, I'm kind of disappointed that I've been buying Nvidia since the Riva TNT and never had a card burn out on me before. I won't be buying another 3090 again, though.
@@s7r49 You need a 3090 Ti, not a 3090. 3090s are flawed from having the VRAM modules crowded and overheating. Not the case with the Ti versions.
@@v.cotoiu3568 Really? I hadn't heard that. Although I water-cooled the first one and never saw any high temps on the sensors; maybe there was heat elsewhere on it, though. The second one I left on its factory heatsink.
@@s7r49 The 3090 has 12 VRAM modules on one side and another 12 on the other side of the board. The PCB in between gets hot no matter how you cool the card.
@@v.cotoiu3568 I had active cooling on the back as well, but yeah, you're probably right. Well, maybe the 4090s will come down when the 5090s come out and some people want to unload them; the 3090 really wasn't cutting it for some of the VR games I play.
Most likely Nvidia will rely on a newer, exclusive DLSS 4 to create a performance gap 🤷🏻♂️🤦🏻♂️
The one thing that no one talks about... is the absurd weight these cards will have... And we will see even more graphics cards breaking and having problems with the solder on the memory modules and the GPU...
Just use a GPU support bracket by default and you should be okay.
Even my 3090 fe weighed too much. It kept coming out of the socket, so much so that I had to waterblock it to convert it into a 2 slot just to have it fit on a riser and have it vertical instead of horizontal. Can't wait till the 4 slot cards come out.
@@foxify52 Wow. My 3070 Ti FE wasn't that bad. But why not use brackets? First thing I did after installing it in my case was add a GPU support bracket, and I've had zero issues.
@@xenosayain1506 The problem isn't for me; most people don't even know how to install a graphics card, and they think they can play ball with them and end up breaking them... And then you have the noobs who watch videos on YouTube who think they need mods on everything to get 10 more fps in COD and end up ruining everything.
@@starlightHT Mods on COD for 10fps? I've never heard of that. Can you explain please? 😅
But I see what you mean. Anyone buying a prebuilt wouldn't know, and GPU sag could be a huge issue with that heavy a card.
I am happy with my 7900xtx until it dies
Perhaps these RTX 50xx series cards will have a better DLSS version associated with them. Generational performance uplift may be more down to software than hardware in some launches, especially with the lower and mid-tier offerings. It'll be interesting to see whether AMD does the same with their FSR software too, or instead offers beefier hardware in their low and mid-range cards than Nvidia, or a balance of both.
Why are they even bothering with below-xx80 GPUs? Last gen's performance, almost the same price, same VRAM, lol. Let me guess, a new feature only for the 5000 series.
THE MORE YOU WANT THE RTX 50 Series THE MORE IT WILL COST
The 5070 Ti seems interesting to me; it's basically a 5080 with 17% fewer CUDA cores, as it uses the same die and features the same bus width and 16GB of VRAM. If it's priced at $799 it'll be an OK card. If the 5080 performs on par with a 4090 as rumors suggest, then the 5070 Ti should be able to outperform a 4080 by 10% or so; that's NOT bad.
Are you mentally ok ??
The 5080 is definitely not matching the 4090, at least not in raster. It's a 70-class die, and expecting a 30% boost in performance from Nvidia is highly unlikely.
Glad I have a 4070 Ti Super and will wait a year to see the performance of all the new cards. Not good to just buy without knowing.
AMD about to take over midrange…
Last time the 70 class was on par with the old 80, so really a 60 class. This time it might be a 50 class, as it isn't even faster than a 4070 Ti Super. I guess you can safely buy a 4000 card on sale. 5000 will just be bad...
I am so glad that I bought an AMD 7900 XTX for 1100€ at launch almost 2 years ago instead of an overpriced RTX 4080. Ray tracing wasn't that important to me, since I had a GTX 1070 before.
If I had waited for the 50 series, it would have been a huge letdown for me, with these specs for the RTX 5070, 5070 Ti and 5080 at prices at least matching the 40 series.
I'm done with crazy PC gaming costs; I'm buying an Xbox and getting out of the rat race.
now you have to pay 60 bucks a year to play online
@@icecycles859 I have to pay that anyway for Game Pass on my PC, so what's your point?
@@welshminty Game pass is garbage. If you're using it to play fcking Halo then just buy the game. Game pass is draining your wallet.
@@Ignisan_66 The price of PC parts is garbage, mate, and no, I don't play Halo.
@@Ignisan_66 I think you missed the point of my post
Maybe they are making software for the lack of VRAM, like NTC? Still, 16GB of VRAM should have been the minimum.
I was really hoping that the 40 series was just a mistake and a flop by Nvidia, but nope they are doing it again :(
I miss the 30 and 10 series
No no no... I guess it's AMD for me this gen.
Looking at how small the jump in performance over the 40 series could be, I think I'll just skip this generation; it's just not worth it.
My 2080S is a supreme card, playing everything I want.
My next card will also be a top-tier card that I will also pay 500€ for, brand new.
Stalker 2 only uses like 8GB of VRAM on a 3080Ti.
12 GB is plenty…at least for that game.
Looks like I'll be hanging on to my 4070 TI Super unless AMD releases a card equivalent to a 4080 which seems doubtful.
Yeah I am good with my 3080Ti and will later just upgrade to 4080 Super when it gets cheaper.
No it won't.
my 4070 ti super is just fine
Bought a 4070 Ti Super with 16GB VRAM for $756 on sale. The card is a blower configuration going for $900 at full price.
I should be set to outlast the Trumpaggedon Economy, right guys? 😅
That's a solid card, you'll be good.
overpriced
@SD-vp5vo so it is! Just like the 3060Ti I purchased in Sept of 2021, also for $750!
Didn't have much choice back then. My 1060 6GB had rolled over and died!
Very nice!!! I decided to buy a GRE after the elections. I figured if tariffs are coming, better late than never?
@kokaboba time to batten down the hatches, because a big storm is coming...
Please buy. I'm a shareholder.
Yeah, it's not the consumer-facing products you should look at but their business-facing AI stuff, where they want their muffin buttered.
You've done well my friend 😁
we need another 10/30 series... and it's about time
What if I'm upgrading from a 1080 Ti to a 5090, is that still a rip-off?
Likely won't be able to find one at MSRP, get ready to spend at least $2500 - $3000 for just the 5090.
At this point, they are cutting down every single other GPU to maximize purchases of the 4090 and get their quarterly gaming GPU earnings up, like they did this Q3; it's up $1 billion from Q2 and up $1.2 billion compared to last year.
I bet that the new exclusive feature of the RTX 5000 series will take work away from the CPU to achieve more FPS with DLSS at low resolutions.
Nvidia used to offer good products at higher prices and, rarely, once in a blue moon, a good price plus a good product.
But ever since RTX 40 they have been offering bad products at higher prices.
There's always something pissing you off about their cards:
low VRAM, or low VRAM bandwidth/bus, or only a 5% increase gen over gen, or a 90-class card with a fire-hazard cable because it's drawing way too much power over a fragile cable.
Please stop with the tariffs already... not everyone lives in the US. But yeah, I agree with you. Only the 5090 looks good, the rest is crap.
Thx - 600W TGP for the 5090? So one needs at least a 1000W PSU in their case…
I am just going to buy either an RTX 4080 or an RTX 3090 if I have to buy a GPU next year.
The entire stack is being built and priced to make the 5090 seem like a great value. When in actuality, it is everything else that is simply a trashy value. No reason to do otherwise when gamers will scoop it all up anyways.
If y'all stop running out buying this overpriced tech, then they will have to drop the price. But y'all won't, lol, you do it to yourselves. So they're gonna screw you.
The RTX 3090 Ti has more CUDA cores than the 4080, but the 4080 was way more powerful. Let's wait and see, because new tech might be playing a big role again.
Why do I feel like the Nvidia market is not for us now? Like... we could build a full AMD PC for 1440p at that price.
Just buy second-hand 😁. Where I live, an RX 6800 can now be bought for $270, for example, and an RTX 4070 is around $400. If you don't want the latest and greatest, the second-hand market is the way to go.
I bought the overpriced but less expensive 4080S, with the faintest hope it would encourage Nvidia to charge lower prices on 50 Series GPUs, or *at least* the low-mid tier cards.
This GPU sold well so hopefully like-minded buyers had some sort of impact, even if it's small.
Shame they probably won't up the VRAM on their cards though, I'd hoped they would but the 4080S having 16GB instead of 20GB was the biggest red flag they weren't going to do that.
We should at least be getting past the VRAM issue we had with the 40 series; I mean, seriously, the 5080 to 5090 jump is insanity. The 5080 should have 24GB of VRAM, the 5070 16GB, and the 5060 12GB, then put that original 8GB on the 5050. This isn't even mentioning the CUDA core counts, I mean, 10k between the 80 and 90?? They gonna make a 5080 Ti along with a 5080 Ti Super? 😂
I want to buy a 5080, but I'd be hoping it's better than the 4090, and it seems like it's honestly going to be minimal improvements except for tech, which is good and all, but this is a generational upgrade; I expect new tech from a driver update.
12GB
Then maybe waiting for the RTX 6000 series is a better option, though I may have to delay playing modern games for 2 more years. I play at 4K and I have a GTX 1080 Ti. I mainly play old games, games released up to 2016 or 2017. Games from after 2016 I can't play at 4K; the GTX 1080 Ti is not enough for that. So if I stick to old games like Batman: Arkham Knight, Dying Light, Battlefield, Doom 2016, Doom Eternal, etc., I can wait for two more years. Hmm, I guess I'll wait and see. The RTX 5090 is out of the question, it will be too expensive; I'm eyeing the RTX 5080. If it's better than the RTX 4090 I may buy it. We'll see in time.
I am still hoping some cosmic entity will guide the Jacket Man, and we'll get a 5060 with 10/12GB of VRAM, 20% faster than the 4060 Ti, and costing $350. Amen!
Still rocking my 4090 😎 I’ll see how the 5090 roll out, if it is worth the upgrade or not
I'm going 7900 XT.
I got a GIGABYTE GeForce RTX 4070 WindForce 3 OC
on sale.
It included Indiana Jones and the Great Circle: Premium Edition as a pre-order bonus.
It's great.
Moore's Law is dead: we used to see 10000% improvement in 10 years; now they release a 30% (at best) improvement over 2 years.
It's not Moore's Law, it's corporate greed.
All large corporations have gone down this same path after the plandemic started.
Nvidia realized they gave too much for too little with Ampere. Each successive generation since is one step down with little improvement. They simply are not repeating the mistake of naming it a "4080 12GB".
The 5070 Ti will likely be 4080 Super level of performance. For most people that will be plenty, even for 4K. The 5070 will easily run 1440p. People just want to complain lol.
@@johnc8327 People are complaining because Nvidia is trying to make consumers pay more for less. They are gimping the hardware of all their graphics cards except the 5090 and still charging absurd prices for them. Obviously people are going to complain. The 70-class GPU usually matches or outperforms the previous generation's flagship GPU. From the looks of it, the 5070 won't even match the 4080. That's pathetic at best.
@@johnc8327 I'm more concerned about AMD giving up on high-end cards. Insane price of 4090 is a direct result of AMD not having an alternative.
@@johnc8327 Oh look, another Nvidia clown