The new AM5 chips might be Q2 this year, or Q3, so maybe at least wait for that, even if it means getting a non-X3D CPU (the X3D parts might release later). Then get something like a 7900 XT if you can find a good deal on it (like $250 less than the XTX) to tide you over until the 5090.
I was going to say go ahead and buy the monitor and upgrade your system later. But then I noticed you said OLED, and OLED displays don't last decades like LCDs do. My monitor is still bottlenecked in games, but it's amazing for productivity, media, even photo editing.
Thanks for the tips, guys! I decided to try that 32-inch 4K OLED monitor, and I'll try running games at QHD on it. If it looks good even at QHD, then I can wait to upgrade the PC when the 5090/7800X3D launch. That way I get all the benefits of OLED now, and the next upgrade will be the last step to 4K and a bit more fps.
Yeah, the Super and Ti Super cards are basically only going to be bought by the people who waited until now for some refreshes, and that's about it. Some Titan treatment might be nice to see though, since I kind of want to get an Nvidia card soon for a few reasons, but I don't want to buy the last gen's high end either. A 5k-series Titan showing up would be quite nice in that regard, even if I have to sell some computer stuff, take a second job for a few months, and maybe eat cheap for a month or two just to pay for it.
Pretty sure the 5090 will start out at $2,000 this time, since the 4090 is still being bought at that price. The 5080... if we are lucky, $1,000 (hope Nvidia learned their lesson that $1,200 won't sell). That would be such a huge gulf in pricing between the 5080 and 5090. I'm gonna assume Nvidia is gonna do something lame and make the performance difference reflect that.
The 5090 might be a lot faster but useless for gamers. It will be a productivity card, and only a very small percentage will end up in gamers' hands, other than as a reference point. The 5080 would be the more realistic comparison, and that most likely won't be a huge step from the 4080. This is how people are fooled into paying a lot for nothing. Clickbait.
The xx90 and xx80 release within 2 weeks of each other. This is Nvidia's history. So, the 5090 and 5080 will be announced in November 2024. It COULD be that the 5090 will release in November and the 5080 release in December. It could be earlier. But Nvidia does this for fiscal reasons to benefit their market value, keeping a stretch of up to 6 straight months of interest in their company and products.
If it performs much better than last generation, Nvidia could release an RTX 5090 with a GB203 chip when it really should've come with a GB202 chip. They'll then claim it uses much less power than the RTX 4090, then release what it should have been a year (or a bit longer) later.
No way is it coming this year. I'd bet if you wanted to sell your hardware to prep for Blackwell, November's a comfortable time, but I bet we won't see anything till Feb 1st.
what makes you think so? they don't seem to be producing much more RTX 4000 stock...almost all of the RTX 3000 stuff seems gone if it's not in a pre-built. I think late 2024 is looking more likely tbh
@@MB-rx2qy No, just undervolting it to 0.875 V and the GPU clock to 2500 MHz. No VRAM OC. Plus, I limit my FPS to 115 on the 120Hz OLED C1 and am almost always using DLSS when it's possible. Ray tracing I don't use most of the time. The result is extremely efficient usage with no wish for more performance at the moment in any game I like to play.💯👌🏻😄
Nice. Right now I use my 4080 with a 1440p 144Hz monitor, and there are long stretches of time when the GPU fans don't even need to turn on. I've been liking the quiet and efficiency of the higher-end 40 series cards (when used without going full-out "4K max" all of the time, of course).@@Serandi1987
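A rough sense of why an undervolt like the one described above saves so much power: CMOS dynamic power scales roughly with frequency times voltage squared. A small sketch, where the stock operating point (1.05 V / 2750 MHz) is my own assumption, not a measured spec:

```python
def relative_power(v_new, v_old, f_new, f_old):
    # CMOS rule of thumb: dynamic power ~ f * V^2, so the ratio of two
    # operating points is (f_new/f_old) * (v_new/v_old)^2.
    return (f_new / f_old) * (v_new / v_old) ** 2

# 0.875 V / 2500 MHz undervolt vs an assumed 1.05 V / 2750 MHz stock point.
print(round(relative_power(0.875, 1.05, 2500, 2750), 2))  # ~0.63 of stock power
```

In other words, giving up about 9% of the clock can plausibly cut dynamic power by a third, which matches why undervolted cards run so quietly.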
Ridiculously expensive GPU? You have no idea what you are talking about. For professionals, GPUs are money-makers; you recoup the investment within 6 months. Over a 24-month lifespan, you make 300% of your initial investment, which is 150% every year. When you don't understand the market and view the world with a closed mind, you just embarrass yourself. Nobody cares about gamers with no money for an expensive hobby.
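For what it's worth, the numbers in that claim are at least internally consistent. A quick sanity check, where the $2,000 card price is my own placeholder:

```python
card_cost = 2000                      # assumed purchase price
monthly_revenue = card_cost / 6       # "recoup the investment within 6 months"

lifespan_months = 24
total_revenue = monthly_revenue * lifespan_months     # 4x the cost over 2 years
net_return = (total_revenue - card_cost) / card_cost  # 3.0 -> 300% over 24 months
per_year = net_return / (lifespan_months / 12)        # 1.5 -> 150% per year
```

So "300% over 24 months" and "150% per year" are the same statement; the real question is whether the 6-month payback assumption holds.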
Yes, the next generation of graphics cards will be incredibly power hungry, not to mention the power draw and more melting connectors... where is the efficiency????
@@arenzricodexd4409 A lot of people are hyped when there's great value. Otherwise it's like "look, another super powerful card that I can't buy". Look at a Steam survey and see if the vast majority of people are buying 4090s or if they are still stuck with their 1060.
You not being on camera is a blessing, as I normally have to minimize your videos anyway since I don't want to be turned to stone. PCIe 5.0 x16 is going to be compromised by current motherboards using their top M.2 slot.
No real engineer likes the way ray-tracing/path-tracing is done using guessing algorithms. Even if it was artifact free... the card wastes so much energy.
What will happen to Intel's Battlemage? Will its performance approach the RTX 4080, at half the price of the RTX 4080? We know Battlemage is coming!!!
@@ofon2000 Intel right now is using TSMC 6nm, which is a variant of 7nm. 7nm starts around $9,000 per wafer. TSMC 4nm is probably part of their 5nm process family, and their 5nm starts at $17,000 per wafer. This info is pretty much public; just google it.
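To turn wafer prices like those into a per-chip figure, a common first-order estimate is dies-per-wafer from wafer area minus an edge-loss term. The die size below is a hypothetical round number, and this ignores yield entirely:

```python
import math

def gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Classic estimate: wafer area / die area, minus an edge-loss correction
    # for partial dies around the wafer's circumference.
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# Assumed: 300 mm wafer, hypothetical 380 mm^2 die, $17,000 5nm-class wafer.
dies = gross_dies_per_wafer(300, 380)  # 151 candidate dies before yield
print(17000 / dies)                    # ~$113 of raw wafer cost per die
```

Real per-chip cost is higher once defect yield, packaging, and test are factored in, but it shows why the node choice moves the bill of materials so much.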
*how to keep HYPE in check...* Look into what the PERFORMANCE per TRANSISTOR is for just normal rasterization, gen on gen. You can only TWEAK that so much, as they are simply little math units at the end of the day, so you're quite limited in what you can do, given the same number of transistors, to improve WITHOUT breaking backwards compatibility. Also look at performance per watt. You'll soon find that the numbers are NOT nearly as impressive over the years as many people think. With the RTX3000 series for example, they had to put on amazing COOLERS because the wattage got so high (as transistor count was increasing). You can't keep doing that. A lot of the performance numbers quoted will be CONFUSING. Things like Frame Generation, or FG with DLSS, or just ray-tracing values etc. That's not to say those are useless numbers, just that when you actually investigate, the numbers are not quite what one might think.
Have you ever done that? Because it doesn't sound doable. How do you account for binning, for example? In this calculation the transistor count of a 4070 Ti Super and a 4080 is the same, but the performance and the number of active transistors are very different.
Performance per watt is also a matter of "okay at what voltage?" I've googled some info about this and the fps/watt for 3080 vs 2080 ti is 110% if the 3080 runs at 320W but 125% if you undervolt it to 270W (which was the shipping config of the 2080 ti). So was the gen on gen increase 10 or 25%? You're also effectively measuring the cooler they shipped with because cooler silicon will run better.
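The operating-point problem above can be made concrete with made-up numbers (illustrative, not measurements): the same pair of cards yields a very different "gen-on-gen efficiency gain" depending on where the newer one sits on its voltage/frequency curve.

```python
def fps_per_watt(fps, watts):
    return fps / watts

baseline = fps_per_watt(100, 250)         # older card at its shipping power
new_stock = fps_per_watt(130, 320)        # successor pushed to stock power
new_undervolted = fps_per_watt(124, 270)  # successor pulled back to 270 W

print(round(new_stock / baseline - 1, 3))        # ~0.016 -> "+1.6% efficiency"
print(round(new_undervolted / baseline - 1, 3))  # ~0.148 -> "+14.8% efficiency"
```

Same silicon, same architecture, and the headline efficiency figure swings by an order of magnitude depending on the power target the vendor shipped.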
Yeah...no! Nvidia and AMD will, again, work together to keep performance improvements minimal, while Nvidia again raises the pricebar...with AMD happy to latch onto the bottom rung.
My guess is the biggest performance leap will be the RTX 5090 for $1,999 MSRP ($3k real price). The 5080 will tie today's RTX 4090 and the 5070 will tie the 4070 Ti Super... all cards below the RTX 5090 will be $100-200 more expensive for sure. This pattern is boring af.
it may as well be 2 years before these new products have wide availability so be patient? prices will be high and there will be little competition to provide any choice for 3-4 years - real ai won't be here for 5 years so why get antsy? #who the hell knows #who the hell cares
A DISLIKE. for uttering the n letter/word. No one cares about the n letter/word company anymore. I hope the n word/letter company and their price gouging will be met with same extreme level of prejudice as in this very post.
Hey Paul, the cash has always been in waiting, but Nvidia screwed the stack this time around. I have always bought an XX80 Ti card. I currently run a 3080 Ti because it was only so many percent off a 3090. I can understand Nvidia wanting people to buy a 4090 this generation, but the price was initially a shock for a gaming card, and the gap between the 4080 and 4090 was just too big. Even the 4080 Super did not interest me, even though it wins on looks. So where am I at? Well, I hope Nvidia will not hold a grudge for my turning down an offered 4090. Even if the price dropped, I am waiting for Blackwell now. These rumoured performance boosts make the 40 series the generation that I skipped. If Nvidia does not price the 5090 too high, then I'm officially waiting.
Nvidia already makes crazy money on the 40 series. If you don't count the times their GeForce sales were boosted by crypto, this generation is their best for GeForce sales in the entire history of Nvidia.
Very true, but enough of us voted with our wallets to force a price change. It might have been their best generation, but how much better could it have been had they made better choices? Hopefully the 50 series can be the redeemer for all.
Since I couldn’t buy a 4090 recently when I planned to upgrade, I’ve now got time and am ready to pick up a 5090 at the eye watering price when it comes out. I’m sure long run I’ll be happy with not getting the 4090 if it’s really such a significant performance increase.
Anyone who buys an RTX 40xx now needs their head examined, with the RTX 50xx just a few months away and double the performance. Look at all those idiots that bought the RTX 3090 Ti when the RX 7900 XT is faster. 🤣🤣🤣
B100 Blackwell, 20x the A100, hmm, and a 1000W GPU. Yeah, sure, right. Not for me, because of the 1000W. Sorry, but never; it is frankly insane. Any GPU over 200W is too much, nothing more, nothing less. Fine, 300W ultra max, and that's the end, final.
AMD's next GPU is FANTASTIC... _says AMD._ 😆 Who really cares about this PR shit until the cards are actually benchmarked? Nvidia has always grossly exaggerated their performance gains with futz charts that exploit very specific games in very specific scenarios and compare them to very specific GPUs to get those results. They're meaningless.
Exactly. As it should be pronounced. It's one big letter and one small, so it should be pronounced as a name and not as an abbreviation like RTX. It's "Tie" and not T.I.
Whokeys 25% off code: RGT
Windows 11 Pro $23.2: biitt.ly/581eD
Windows 10 Pro $17.6: biitt.ly/9f0ie
Windows 10 Home $15: biitt.ly/XYm9o
Office 2016 $28.8: biitt.ly/cy7Vb
Office 2019 $48.6: biitt.ly/YInvw
www.whokeys.com/
Prices will also be incredible. And not in a good way.
Let's hope AMD can get its ray tracing performance closer to Nvidia's with RDNA 5, and further improve FSR etc. And maybe Intel will up their dedicated GPU game. More real competition is the only hope we have for lower prices in the PC gaming video card market.
@able2 They won't even get close. They will be a generation or 2 behind like always.
I'm willing to pay it.
And people will still buy them up
@@ippothedestroyer The thing is that the RTX 50 series won't be launching this year. Expect them to launch in Q1, most likely Q2, of next year. AMD, however, will be launching RDNA 4 this year and then very likely RDNA 5 next year to compete with the 50 series. They are two generations behind, but they can catch the 50 series if things go alright.
Can't wait to pay $1000 for 12GB VRAM
That will be the Super version!
@@nossy232323 Dumb comment
and 700 dollars for a 5050 branded as a 5060💀
How much of VRAM do you usually see in use in games you play?
The super has 16GB at 1000 dollars so don't see that happening.
"nVidia's performance is off the charts!"
_changes scale on y axis_
"nVidia's performance is back on the charts!"
Underrated
If the low end cards use 4nm I'd expect we'll pay more for the same performance on the low end. Expect the 5060ti to be like 5% better than the 4060ti for 20% more money
Gaming Blackwell will probably use a custom 5nm.
@@arenzricodexd4409 They are already on that 5nm+, or 4nm. It would need a radical architecture change to bring any meaningful increases on the same node. Maxwell did this back in the day, but it went from compute architecture to gaming - I don't see that happening again.
@@avatarion Turing also used an enhanced 16nm. And Nvidia went from gaming to a more compute-oriented design with Turing. The 4060 and 4060 Ti are barely faster than the original 3060 and 3060 Ti, probably because Nvidia wants to leave some headroom for the 50 series. For example, the 4060 right now sits at 115W; for the upcoming 5060, Nvidia can make the chip a bit bigger with power consumption around 170W. GB102 will go beyond 600mm² like TU102 before it. And they are going to charge a hefty price for it.
@@arenzricodexd4409 Turing used an improved node from Pascal. Lovelace is already on that improved 5nm you talked about. They are not going to stay on the same node without major architectural changes, like I said. All of their RTX generations have been compute. They don't bother making a separate design for the GeForce anymore.
I don't think Nvidia will split the Blackwell dies across two different process nodes. The lower-end trims are the same silicon as the higher-end dies, just the ones with more defects.
I'm still using a 1080 Ti here. Might upgrade to a 3080 if prices on the used market go down more.
You will never upgrade.
1060 6gb here. 1080p gamer in no hurry.
@@ZackSNetwork Yeah Zak, you da man, keeping Jensen very happy ;)
@@dobermanownerforlife3902 Ditto, and an ancient Sandy Bridge CPU. It doesn't make sense to upgrade that with the same GPU being the limiter, but the price/performance ratio for GPUs right now is so fucked that I'm just content chipping away at my backlog. I'm very pragmatic.
They should if new prices adjust accordingly
Well obviously the company will say that. :D
I am more concerned about chip shortages again because of A.I. demand.
There is no AI demand
@@cesare4926 Are you kidding me?
@@Hexenkind1 Maybe it's some kind of troll / joke
@@cesare4926 So Nvidia's valuation exploding over the last 4 years to surpass the market cap of Alphabet and the entire GDP of Canada happened because PC gaming just randomly got 13x more popular than it was in 2020? Meta, Google, Microsoft, and OpenAI aren't buying hundreds of thousands of H100s each, and Nvidia doesn't have orders that max out their capacity for the next several years? Amazon AWS, Microsoft Azure, IBM Watson, Oracle, Tencent, Alibaba, Baidu, Lambda, RunPod, Vast, and dozens of other AI cloud services aren't filling datacenters faster than Nvidia can get silicon manufactured?
@@Hexenkind1 Nvidia's AI chips can't do ray tracing, bruh.
The day Nvidia or AMD brings 3080/6800 XT performance down to a $300 MSRP is the day I will upgrade from my 2060.
you can get used for around that
@Healthy_Toki According to the average eBay price finder, the average used 3080 goes for $467 + shipping and the average used 6800 XT goes for $402 + shipping. And as a personal choice, I like to have a warranty, and this gen is likely out of it. Also this gen went through the mining craze, and although I know many tech tubers have shown that cards that were mined on are just as good if taken care of, I still have trust issues.
most likely that could happen with rtx 50 series. probably not 300 usd but maybe 400 bucks.
@@Salty_Nutella Thanks dude. For that average eBay price finder tool site mention. I had no idea something like that existed.
I'm less bothered about the price of the card than its power consumption, as long as it's below $600.
Electricity is costly these days and you could easily end up costing yourself a lot more simply because your gfx card is ballooning the electricity bill every month.
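A back-of-the-envelope way to see how much a card actually adds to the bill. The draw, hours, and tariff below are all assumptions to swap for your own numbers:

```python
def annual_energy_cost(avg_watts, hours_per_day, price_per_kwh):
    # kWh consumed over a year, times the electricity tariff.
    return avg_watts / 1000 * hours_per_day * 365 * price_per_kwh

# Assumed: 320 W average gaming draw, 4 h/day, $0.30/kWh.
print(round(annual_energy_cost(320, 4, 0.30), 2))  # ~140.16 per year
```

On those assumptions, a 100 W difference between two cards works out to roughly $44 a year, which is real money over a multi-year ownership but rarely dwarfs the purchase-price gap.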
So basically they'll promise 2-4x again, but only deliver 50-70% again while doubling price.
..yup, and (as we see here) we'll get the usual Nvidia-sponsored YT channels gushing at Ngreedia's marketing bullcrap.
It is already lucky if you can gain that much per generation lol.
@@arenzricodexd4409 100%. And those 'gains' are becoming more and more obscure - "oh look, component #13245.5423 (nothing to do with raster or RT) is now operating 70% faster, let marketing know!"
I bet they gonna release some sort of Frame Generation 2.0 or DLSS 4 ONLY for the RTX 5000 Series :D
Frame gen 1.0 = 1 fake frame between 2 real ones. Frame gen 2.0 = 2 fake frames between 2 real ones. Frame gen 3.0…..
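The arithmetic of the joke above, as a sketch. This is a simplification that ignores generation overhead and the latency cost of interpolated frames:

```python
def displayed_fps(rendered_fps, generated_per_real):
    # Each rendered frame is followed on screen by N generated frames.
    return rendered_fps * (1 + generated_per_real)

print(displayed_fps(60, 1))  # 120: "frame gen 1.0", one fake frame per real one
print(displayed_fps(60, 2))  # 180: the hypothetical "frame gen 2.0"
```

The catch is that input latency still tracks the rendered 60 fps, which is why multiplying fake frames doesn't feel the same as native frame rate.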
I just sent my 3080 in for RMA due to a sensor failure, and since there wasn't any new stock I got a free upgrade to a 4080. Imagine my surprise when I saw that it costs 1533€. Not changing GPU any time soon.
I got an Asus TUF OC 7900 XTX for free from Amazon. Beat that lol. And it was Amazon India. I told them that my card was giving me black screens, and they just refunded me the money and told me to dispose of the card at my convenience. It was an undervolt and OC issue. Sorted it out and am enjoying the card for free.
That is basically stealing.@@crazycop7774
Honestly I don’t give a shit about performance anymore. It doesn’t get me excited. All I care about is price and how much faster it is compared to what I already have.
So You care about performance vs Your curent card is that correct? 😀
Hope it will be 1000$ more than the 4090 so I can sleep calm
Maybe the big jump in performance will be reserved for the top model for the wealthy, the RTX 5090. Normals buying xx60 and xx70 cards will get 20-30% more compared to the current generation, probably at a higher price than now.
Wonderful.
Sure.
Time to save up for the mighty RTX 5090 then.
Can't wait for the 5090 to drop. Already starting to save up for it lol.
We'll assume that once again, you've made a deposit with Jensen ;)
🎯 Key Takeaways for quick navigation:
01:48 *Blackwell GPU incredible performance teased.*
03:23 *Blackwell variants, architectural differences, memory types.*
06:35 *Blackwell's focus on ray tracing performance.*
07:29 *Supply constraints for Next Gen Nvidia GPUs.*
08:53 *Uncertainty about RTX 50 release date.*
10:45 *Nvidia's potential lack of competition from AMD in high-end GPUs.*
Made with HARPA AI
They were never going to say it's only so-so!
The numbers include upscaling and fake frames 🤣🤣
Funny how Nvidia saying something is "incredible" is more of an indication of where it's at in the production cycle, rather than anything else.
No matter how good the performance is, if the price is unreasonable, not many are gonna buy it.
It's not "the prices" that are wrong, but your own wallet. Let's just hope everyone makes enough money to buy the GPU they like and stops crying.
We say that, and then the GPUs are sold out in like hours on release. Hell, people are ready to pay over 50% more than MSRP. That's how much demand there is for Nvidia GPUs.
@@IGame4Fun2 I bet you're happy new car prices are up 40% in 4 years, and groceries are up 30% as well.
@@AsianPersuation24x7 Wouldn't affect him, as I'm guessing he lives in a basement/attic/guest house and doesn't pay for rent/food/utilities/insurance etc.
Last quarter Nvidia's "gaming" revenue was 2.9 billion. The quarter before that it was 2.8 billion. Clearly the unreasonable prices of the 40 series did not stop Nvidia from selling more.
Yes, you are right. Torn between a boring 4080 Super and the uncertain launch/supply of Blackwell, I gunned for the RTX 4090 recently instead. The feeling was satisfactory. 😁
*I took a huge chunk of my tax return and threw it in my savings until the 5080 are available.* 👍 *Thank goodness it was a large return. LoL!*
just got the MSI 4070ti super. very happy with it. no need to upgrade.
NVIDIA does not need to tell the truth. The uplift could be 100%, but they'll ship 10% if that's all they need to, and bank the rest. Exactly the same as Intel: 5% per generation.
Blackwell sounds like such an "Intel-like" name to me that I still get confused by it 🤦🏻♂️😂
NVIDIA makes so much money outside of gaming that they no longer need to hold back in relation to AMD. They are basically in competition with themselves at this point, and as far as Blackwell is concerned, the faster and sooner it comes out the better.
Suddenly Blackwell for the 5090 series uses GB103, and Nvidia reserves 102 and 100 for servers and AI.
Lol. A 10-20% perf increase maybe, then DLSS 4 to compensate for 12GB VRAM with AI-enhanced textures.
1000 bucks for a 60-class card 😢😢😢
I dont think Nvidia is gonna sell the full die on the 5090. Why would they? If AMD has nothing in the top end they won by default. Just like current gen. Would make way more sense to keep the uplift at 30-40%, charge the same and sell more RT and DLSS. That way you can show a "steady increase" gen on gen with higher margins and more headroom for 60series. I just wish Nvidia would deliver more in the mid range. Top end is won anyway.
Charge the same 😂😂😂
Buy AMD?
@@olha2 nope, for many reasons. Nope
Note to content creator: There are no ladies or gentlemen here.
If the next generation plays out anything like this generation, the best bet would be to buy the flagship and sit on it as long as possible. You get best-in-class gaming for at least 2 years; after that you can deal with it.
The "conflicting information" as always, is due to people feeding Paul bullsh#t.
Dear god, I hope they won't release the rumoured Lovelace Titan. Personally I think a Blackwell Titan would be way more worth it.
Price will be incredible! Next!
Im actually waiting to see if the 5000 series get announced for this year.
But if not, then i will just pull the trigger on a 4090.
Or
If maybe AMD happened to get better in RT and have something compelling in their new lineup, i may just end up going AMD this time around... we shall see.
I use my 4090 on a 4080 level and would use a 5090 just like the 4090 so result is, I just won't need a 5090 over the next 3 years and plus I'm still on AM4 with 5800X3D and pretty happy with it. I use FPS limit to 115 and play on 120Hz LG OLED TV. So I would also have to move to AM5 to use PCIe 5.0 of the 5090 and just don't need that performance. 😅
@Serandi interesting 🤔
I have an LG C2.
Is good to know I have options.
Ow, and thanks for your input. It is greatly appreciated.
@@AlejandroOni np bro 😊💪
Needs a hook: the PRICE of the new GPU will be INCREDIBLY EXPENSIVE!
Hopefully, they come out with a *RTX B2000* that uses 70W
They only recently released the RTX 2000 Ada, almost a year after the twice-as-expensive RTX 4000 Ada SFF; they might do something like that again.
I bought this PSU in November 2020: Corsair HX Series HX1000 1000W Platinum Rated Fully Modular Power Supply
Will that be fine for a 5090 when it comes out? Just wondering since the PSU was probably made before the concept of the 12VHPWR was envisioned.
I'm sure it will be fine. 1000W is a good baseline. You might only need to upgrade if the 5090 comes with 2x 12VHPWR instead of just one.
@@jjlw2378 Rumors say it will.
It also depends on the CPU and other components. I would say even if you have an Intel CPU and don't overclock either of them (CPU or GPU), you should be fine. However, 1200 to 1400W would be the safer bet.
@@jjlw2378 Oh geez, that would be a monster.
@@johnrehak I have a 5900X CPU; pretty sure the 5090 would still fall well within the overall power limits. I'm just worried about the age and connectors on my PSU. Maybe a year after I upgrade to a 5090 I will look at a full upgrade of everything else, depending on what CPUs are out then. Crazy that half the price of a PC is the GPU, and for about the same price you can buy every other component.
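For anyone eyeballing PSU headroom like the thread above, a quick back-of-the-envelope check helps. The wattages below are assumptions (rumored 5090 draw, a typical 5900X under load), not official specs:

```python
# Rough PSU headroom check. All wattages are assumptions/rumors, not specs.
def psu_headroom(psu_watts, gpu_watts, cpu_watts, other_watts=100, margin=0.8):
    """Watts left over if we only want to load the PSU to `margin` of rated capacity."""
    budget = psu_watts * margin          # e.g. load a 1000W unit to at most 80%
    draw = gpu_watts + cpu_watts + other_watts
    return budget - draw

# Hypothetical: 5090 rumored at 450-600W, 5900X ~150W under load.
print(psu_headroom(1000, 450, 150))  # 100.0 -> comfortable
print(psu_headroom(1000, 600, 150))  # -50.0 -> over the 80% comfort margin
```

So a 1000W unit is fine at the lower end of the rumored range but gets tight at the upper end, which matches the "1200 to 1400W is the safer bet" advice.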
guess I'll keep using my good old 1080Ti for a while then
"says Nvidia"
Well of course they would. If any company talks crap about their own products and services something is seriously wrong.
RTX 5080 is going to be $1799 and RTX 5090 is going to be 4999.99 or 197 easy payments of 99.99 at 23% APR
The performance may be incredible according to Nvidia, but how much uplift can I get from my 3060 Ti, and at what price? That is what I am wondering. If I run a game at 50 FPS average on my 3060 Ti, how much money will I need to spend on the new Blackwell GPUs to get 100 FPS in the same game... WITHOUT FRAME GENERATION?
Remember when Nvidia said the Titan X Pascal was an 8K GPU?
Please can you help me decide? I really want a new 4K 240Hz OLED monitor, but I'd have to build a new PC for it. Is it stupid to build a 4090/7800X3D setup now?
If you really need the new pc right now then go for it, but I would wait a couple more months since the ryzen 9000 series is coming out later this year and obviously the rtx 5000 series a bit later. Also I heard x3d chips are less effective as resolution increases, but I’d still get an x3d chip either way
OK, thank you for the tip, buddy. Yeah, I'll still go for an X3D; I play a lot of WoW and this CPU is a monster in that game. @@denisneghina5987
The new AM5 chips might be Q2 or Q3 this year, so maybe at least wait for that, even if it means getting a non-X3D CPU because those might release later. Then get something like a 7900 XT if you can get a good deal on it (like $250 less than the XTX) while waiting for the 5090.
I was going to say go ahead and buy the monitor and upgrade your system later. But then I noticed that you said OLED and OLED displays do not last decades like LCD displays.
My monitor is still bottlenecked in games but it's amazing for productivity, media, even photo editing.
Thank you for the tips, guys! I decided to try that 32-inch 4K OLED monitor, and I will try to run games at QHD. If it's good even at QHD, then I can wait to upgrade the PC when the 5090/7800X3D launch. So I get all the benefits of OLED now, and the next upgrade will be the last step to 4K and a bit more FPS.
To be honest, I completely checked out of the whole GPU race; the steps are too small and I just can't take the prices seriously anymore.
Cope. The step up to rtx40 was not small.
Yeah, the super and Ti super cards are basically just only going to be bought by the people who waited up til now for some refreshes, and that's about it. Some Titan treatment might be nice to see though, since I kind of want to get an Nvidia card soon for a few reasons. But I don't want to buy the last gen's high end either. A 5k series Titan showing up would be quite nice to see in that regards, even if I have to sell some computer stuff, take a second job for a few months, and maybe eat cheap for a month or two just to pay for it.
Pretty sure the 5090 will start out at $2,000 this time, since the 4090 is still being bought at that price. The 5080... if we are lucky, $1,000 (hope Nvidia learned their lesson that $1,200 won't sell). That would be such a huge gulf in pricing between the 5080 and 5090; I'm gonna assume Nvidia will do something lame and make the performance difference reflect that.
The 4080's $1,200 pricing is there to push people towards the $1,500 4090.
5090 50% increase. 5060 will be 0% increase in performance. Buy more, pay more, get less. Nvidia's new motto.
The 5090 might be a lot faster, but useless for gamers. It will be a productivity card, and a very small percentage will end up in gamers' hands other than as a reference point. The 5080 would be the more realistic comparison, and that most likely won't be a huge step from the 4080. This is how people are fooled into paying a lot for nothing. Clickbait.
Lol, the CEO clearly stated: enjoy the low allocation, gamers.
will the RTX 5090 come in Q4 this year?
I feel like no? Idk, what do you think?
@@Healthy_Toki I think the top card, the 5090, will come first, with the rest maybe later in Q1 2025.
They're planning to put out the B100 in Q2, so I doubt they'd wait another year for the cut-down version to come out.
The xx90 and xx80 release within 2 weeks of each other. This is Nvidia's history.
So, the 5090 and 5080 will be announced in November 2024. It COULD be that the 5090 releases in November and the 5080 in December. It could be earlier. But Nvidia does this for fiscal reasons to benefit their market value, keeping a stretch of up to 6 straight months of interest in their company and products.
If it performs much better than last generation, Nvidia would release an RTX 5090 with a GB203 chip when it really should've come with a GB202 chip. They'll then claim it uses much less power than the RTX 4090, then release what it should have been a year (or a bit longer) later.
Nvidia will undoubtedly bump the price up as the demand becomes greater.
Nah, they have enough braindead buyers to release at bumped-up prices. Unfortunately.
No way is it coming this year. I'd bet if you wanted to sell your hardware to prep for Blackwell, November's a comfortable time, but I bet we won't see anything till Feb 1st.
What makes you think so? They don't seem to be producing much more RTX 4000 stock, and almost all of the RTX 3000 stuff seems gone if it's not in a pre-built. I think late 2024 is looking more likely, tbh.
Fact is, I use my 4090 at a 4080 level and would use a 5090 just like the 4090, so the result is I just won't need a 5090 over the next 3 years. Plus, I'm still on AM4 with a 5800X3D and pretty happy with it. I use an FPS limit of 115 and play on a 120Hz LG OLED TV. So I would also have to move to AM5 to use PCIe 5.0 on the 5090, and I just don't need that performance. 😅
Are you power-limiting your 4090?
@@MB-rx2qy No, just undervolting it to 0.875V and capping the GPU clock at 2500MHz. No VRAM OC. Plus, I limit my FPS to 115 on a 120Hz OLED C1, and I almost always use DLSS when possible. Ray tracing I don't use most of the time. The result is extremely efficient usage with no wish for more performance in any game I like to play. 💯👌🏻😄
@@MB-rx2qy Worst-case power usage is about 250-300 watts. I really like that. 🚨👍🏻
Nice. Right now I use my 4080 with a 1440p 144Hz monitor, and there are long stretches of time when the GPU fans don't even need to turn on. I've been liking the quiet and efficiency of the higher-end 40 series cards (when used without going full-out "4K max" all of the time, of course).@@Serandi1987
@@MB-rx2qy 😍👌🏻
No real gains in performance is just boring, from both Nvidia and AMD. No interest in buying any GPU in the next 5 years.
Ridiculously expensive GPU? You have no idea what you are talking about. For professionals, GPUs are money-makers; you recoup the investment within 6 months. Over a 24-month lifespan, you make 300% of your initial investment, which is 150% every year. When you don't understand the market and view the world with a closed mind, you just embarrass yourself. Nobody cares about gamers with no money for an expensive hobby.
This is embarrassing 😂
Ask your sources about what power connector the 5090 will have. If it doesn't set my house on fire I might buy one.
Yes, the next-generation graphics cards will be incredibly power hungry, not to mention the power draw and more melting connectors...
Where is the efficiency ????
I'm starting to detest all this hype. Not because of the performance, but because we know that Nvidia will make the cards super expensive.
You, me and lots of folks. It doesn't take many years monitoring CPU and GPU releases to realise that marketing and youtube are full of bullsh#t.
Because that's how they should be priced?
@@arenzricodexd4409 A lot of people are hyped when there's great value. Otherwise it's like, "look, another super powerful card that I can't buy." Look at the Steam survey and see if the vast majority of people are buying 4090s or if they are still stuck with their 1060.
I like the chart: 1x - 11x - 18x. Actual performance made by software, not by the silicon :) The GPU itself is just 5% better :)
Would love to see PCIe 5. I don't think it will change much, but maybe after having it, someone will adapt some tech to it. Maybe AI workloads.
The 4090 doesn't even max out PCIe 3.0 x16!!! It's gonna be a long, long, long time before Gen5 x16 is needed by any gaming GPU.
Well they aren't going to say their new cards are shit.
You not being on camera is a blessing, as I normally have to minimize your videos anyway; I don't want to be turned to stone. PCIe 5.0 x16 is going to be compromised by current motherboards using their top M.2 slot.
Yeah, it's damning that PCIe lane count has stagnated for years on desktop.
No real engineer likes the way ray tracing/path tracing is done using guessing algorithms. Even if it were artifact-free... the card wastes so much energy.
Looking forward to a $4000 Blackwell GPU for 2x performance at 600W+. No one will be able to buy it anyways.
Say BATTLEMAGE
What will happen with Intel's Battlemage? Will its performance approach the RTX 4080, at half the RTX 4080's price? We know Battlemage is coming!!!
Half the price of a 4080? That's pretty much asking Intel to quit the discrete GPU market.
Prices are gonna be high, but cheap for Intel, since Intel chose 4nm, not 3nm.
I'm going Battlemage next gen
good luck brother!
4nm is, at the very least, still twice as expensive as the node they are using right now.
@@arenzricodexd4409 where are you getting this from?
@@ofon2000 Intel right now is using TSMC 6nm, which is a variant of 7nm. 7nm starts around $9,000 per wafer. TSMC 4nm is probably part of their 5nm process family. Their 5nm starts at $17,000 per wafer. This info is pretty much public; just Google it.
@@ofon2000 TSMC wafer price is not exactly a secret.
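To see what those wafer prices could mean per chip, here's a rough cost-per-die sketch using the standard dies-per-wafer approximation. The $9,000 and $17,000 figures are the ones quoted in this thread; the 200 mm² die size is a hypothetical mid-range GPU, and yield losses are ignored:

```python
import math

# Standard dies-per-wafer approximation for a 300mm wafer (no yield modeling).
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_die(wafer_price_usd, die_area_mm2):
    return wafer_price_usd / dies_per_wafer(die_area_mm2)

die = 200  # mm^2, hypothetical mid-range GPU
print(f"7nm-class wafer ($9,000):  ${cost_per_die(9000, die):.0f}/die")
print(f"5nm-class wafer ($17,000): ${cost_per_die(17000, die):.0f}/die")
```

The point of the sketch: silicon cost per die roughly doubles with the wafer price at the same die size, before yield is even considered.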
*how to keep HYPE in check...*
Look into what the PERFORMANCE per TRANSISTOR is for plain rasterization, gen on gen. You can only TWEAK that so much; shader cores are simply little math units at the end of the day, so you're quite limited in what you can do, given the same number of transistors, to improve WITHOUT breaking backwards compatibility.

Also look at performance per watt. You'll soon find that the numbers are NOT nearly as impressive over the years as many people think. With the RTX 3000 series, for example, they had to put on amazing COOLERS because the wattage got so high (as transistor count was increasing). You can't keep doing that.

A lot of the quoted performance numbers will be CONFUSING: things like Frame Generation, or FG with DLSS, or just ray-tracing values, etc. That's not to say those are useless numbers, just that when you actually investigate, the numbers are not quite what one might think.
Have you ever actually done that? Because it doesn't sound doable. How do you account for binning, for example? In this calculation, the transistor count of a 4070 Ti Super and a 4080 is the same, but the performance and the number of active transistors are very different.
Performance per watt is also a matter of "okay, at what voltage?" I've googled some info about this, and the fps/watt for the 3080 vs. the 2080 Ti is 110% if the 3080 runs at 320W, but 125% if you undervolt it to 270W (which was the shipping config of the 2080 Ti). So was the gen-on-gen increase 10% or 25%? You're also effectively measuring the cooler they shipped with, because cooler silicon runs better.
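That comparison can be made concrete. The fps values below are illustrative, back-solved to reproduce the 110%/125% figures quoted in the comment (they are not measurements), just to show how the operating point changes the apparent "gen-on-gen efficiency gain":

```python
# Perf/W ratio of a 3080 vs a 2080 Ti at two operating points.
# fps values are illustrative, chosen to reproduce the 110%/125% figures above.
def perf_per_watt(fps, watts):
    return fps / watts

fps_2080ti, w_2080ti = 100.0, 270.0      # normalized baseline at its shipping config
fps_3080_stock, w_stock = 130.0, 320.0   # ~30% faster at stock power
fps_3080_uv, w_uv = 125.0, 270.0         # undervolted to the 2080 Ti's wattage

base = perf_per_watt(fps_2080ti, w_2080ti)
print(f"stock:       {perf_per_watt(fps_3080_stock, w_stock) / base:.2f}x")  # ~1.10x
print(f"undervolted: {perf_per_watt(fps_3080_uv, w_uv) / base:.2f}x")        # 1.25x
```

Same silicon, two very different "efficiency gains", which is exactly why a single perf/W number per generation is misleading.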
All these channels feed on hype, and remember, hype = clickbait = mostly fake info.
The 5090 for $3,000 after scalper taxes. Thank me later.
What else did you expect that Nvidia was going to say?
Correct, and we immediately get to see which YT channels are 'sponsored' by Nvidia ;)
"incredible" They say that about every generation. lol
We all know they’re gonna charge Apple prices and lock a new version of a software behind the new gen, marking it as ‘proprietary’.
Yeah... no! Nvidia and AMD will again work together to keep performance improvements minimal, while Nvidia again raises the price bar... with AMD happy to latch onto the bottom rung.
A master of double talk... 11 minutes, a few old slides of information...
I don't get why you wonder; Nvidia is already double AMD. What are you smoking?
What about the Intel stuff?
My guess is the biggest performance leap will be the RTX 5090 for $1,999 MSRP ($3K real price). The 5080 will tie today's RTX 4090, and the 5070 will tie the 4070 Ti Super... All cards below the RTX 5090 will be $100-200 more expensive for sure. This pattern is boring af.
Entire video was just a spoiler for Alan Wake.
It may well be 2 years before these new products have wide availability, so be patient. Prices will be high, and there will be little competition to provide any choice for 3-4 years. Real AI won't be here for 5 years, so why get antsy? #who the hell knows #who the hell cares
A DISLIKE.
for uttering the N letter/word. No one cares about the N letter/word company anymore. I hope the N letter/word company and their price gouging will be met with the same extreme level of prejudice as in this very post.
Get a hold of this, you guys:
I don't care about the N-word company anymore.
And it comes with 500% price increase
Hey Paul, the cash has always been in waiting, but Nvidia screwed the stack this time around. I have always bought an XX80 Ti card. I currently run a 3080 Ti because it was only so many percent off a 3090. I can understand Nvidia wanting people to buy a 4090 this generation, but the price was initially a shock for a gaming card, and the gap between the 4080 and 4090 was just too big.
Even the 4080 Super did not interest me, even though it wins on looks.
So where am I at? Well, I hope Nvidia will not hold a grudge for my turning down an offered 4090. Even if the price dropped, I am waiting for Blackwell now. These rumoured performance boosts make the 40 series the generation that I skipped.
If Nvidia does not price the 5090 too high, then I'm officially waiting.
Nvidia already makes crazy money on the 40 series. If you don't count the times their GeForce sales were boosted by crypto, this generation is their best for GeForce sales in the entire history of Nvidia.
Very true, but enough of us voted with our wallets to force a price change.
It might have been their best generation, but how much better could it have been had they made better choices?
Hopefully the 50 series can be the redeemer for all.
A 98-bit bus? 😁
Since I couldn't buy a 4090 recently when I planned to upgrade, I've now got time and am ready to pick up a 5090 at the eye-watering price when it comes out. I'm sure in the long run I'll be happy with not getting the 4090 if it's really such a significant performance increase.
I didn't hear him say anything about power efficiency? 😂😂
WuZuP? AMD, FTW !
Also... this kind of hype is well known and boring. It's always the same thing: bla bla bla x2, bla bla bla... and in the end it's 30% at most.
Anyone who buys an RTX 40xx now needs their head examined, with the RTX 50xx just a few months away at double the performance. Look at all those idiots that bought the RTX 3090 Ti when the RX 7900 XT is faster. 🤣🤣🤣
RTX5090 will be 59% faster than RTX4090
335% faster
PCIe 5.0 will be a game changer for graphics cards
Nvidia do talk a lot of shit though.....
Hopefully the 5080 is close to the 4090, or a bit better.
😂
It better be or it'll be a huge disappointment.
The 4080 was faster than the 3090ti and significantly more power efficient. What is the point of your comment?
@@ZackSNetwork I'm aware it was.
The point of my comment is I felt like leaving it. Y'all get so weird over comments. It's very strange.
The 5080 close to the 4090? If everything goes as usual, I'd expect the 5080 to smash the 4090.
bs
Already sold one of the kids; putting the house up for sale next so I have enough cash to buy one. 🤣
mb
lfg!!
Yes, with DLSS 5 and Frame Gen and other bullshit it will be incredible!
5090 ... or nothing!
The B100 will be 20x the A100? Hmm, and a 1000W GPU? Yeah, sure, right. Not me, because of the 1000W. Sorry, but never; that is frankly insane. Any GPU over 200W... nothing more, nothing less. Fine, 300W ultra max, and that's final.
AMD's next GPU is FANTASTIC... _says AMD._ 😆
Who really cares about this PR shit until the cards are actually benchmarked? Nvidia has always grossly exaggerated their performance gains with futzed charts that exploit very specific games in very specific scenarios and compare them to very specific GPUs to get those results. They're meaningless.
2:14 "Tie"
Exactly. As it should be pronounced. It's one big letter and one small, so it should be pronounced as a name and not as an abbreviation like RTX. It's "Tie" and not T-I.