Why the RTX 4060 is really an RTX 4050
- Published Jul 4, 2023
- Is the RTX 4060 actually a RTX 4050 in disguise? It looks like Nvidia hasn't learned from the RTX 4080 12GB debacle.
Support me on Patreon: www.patreon.com/user?u=46978634
Follow me on Twitter: / highyieldyt - Science & Technology
What do you think about all of this? Is it really a "4060" if it uses the lowest-end chip Nvidia is producing? Imagine if Nvidia had targeted a similar 70%+ performance gain for the 4060, like they did with the 4090...
Nvidia's biggest problem is overperforming. Their GPUs performed better than expected, which is why they decided the full AD103 was too good to call a 4070 and went for the insane 4080 12GB idea, and all the trash that came after it. But not only that: cards like the 4060 Ti and 4070 Ti become terribly bad even without taking name and price into consideration, because their bus width and VRAM are highly incompatible with their performance, making them even worse than the last-gen 3070 Ti.
At least the 4070 and 4080 will be good used choices in 2025, at $300 and $500 respectively.
While AMD's biggest problem is the opposite: underperforming. And the fact that they named what should have been the 7800 XT the 7900 XT made it a lot worse. I think Nvidia's 4080 12GB plan partly enabled that too.
This gen is disappointing at best!
Nope, Nvidia are insulting gamers' intelligence. They priced models based on the mining-boom nonsense of the Ti launches.
This is the strategy phone manufacturers have used successfully for years. The flagship phones keep improving every year with large leaps in performance while the midrange and low-end phones stagnate in performance and features. The choice is either an expensive new flagship or the older generation and people always want the latest new thing so this ensures progress on the midrange and low-end is very slow and the customer base is steady.
If that were the case, a 4050 at $249 would completely destroy the RX 7600 (at $270, with some models as cheap as $250) from AMD.
And we could say the same for the other GPUs: a 4080 (aka the 4090) at $1200 would have destroyed the 7900 XTX, and a 4070 (aka the 4080) at $799 would have destroyed the not-yet-released RX 7800 XT (Navi 32?)...
So it seems Nvidia knew that, so why bother, especially when you have 80%+ of the market share?
Let's hope for better results with RDNA 4, but if Nvidia has indeed delayed "Ada Lovelace Next" by a year, that would mean RDNA 4 is not a threat to them...
@@CM0G it is not performance but the relatively high production costs and their drive for high margin, which dictated Nvidia pricing strategy. The AI boom has rescued them, so they are happy to divert wafers to the corporate market. The fact was both 4080 models were struggling to deliver the typical tier improvement 2 years of development brings. It is believed the 4090 was priced lower than expected, while the 3090 was only marginally faster than 3080 in 2020 but had large VRAM.
The testing of very high watts cards was to ensure a win vs AMD's chiplet approach.
Nvidia's own figures showed lacklustre gains in raster, so the emphasis is on ray tracing and fake frames as progress.
Those who pointed out that the limited VRAM would be an issue have been proven correct. When the models were specced, GDDR6 pricing was high, and Nvidia copied RDNA2's limited-bandwidth-plus-cache approach but skimped on VRAM. The demand for 1440p and 4K, plus developers' move to UE5, has made Nvidia's range poor buys.
Actually being slower than the 3060 in some tests and only matching the monolithic 6nm 7600 8GB is the result, a debacle not "overperforming".
The AMD drivers have improved the dual issue and improved efficiency now, the AMD range were designed to hit cost targets. While Nvidia went all out chasing high end performance and gimping everything below it, with products designed for the mining boom era not anticipating the 2023 requirements.
The specs make it clear Nvidia doesn't care, and clueless people will buy it because it has Nvidia slapped on it. We as consumers pay the price for that. I'm not even getting into the 16GB card on a 128-bit bus for 500 dollars, which imo is the biggest scam they are trying to pull off. AMD could be banking on all these faults, but instead chooses to follow the same crappy practices, staying one micro-step below Nvidia so as not to look quite as crappy a company. It's a terrible, fixed duopoly.
Many people say Intel is the only hope, but I highly doubt it. They are having it REALLY rough lately, and if they feel their GPUs won't be profitable in the short or medium term, there is a very real chance they will kill the project entirely.
Basically AAA gaming right now is not worth it, the only games that are worth are indie games (which have low requirements for the hardware) and the games of 2 or 3 AAA gaming companies.
Both companies are majority-owned by the same shareholders. So those shareholders have both the market share and the high profits at the same time. The situation is not going to change anytime soon.
AMD tried before, but you guys keep on buying Nvidia. Remember the GTX 970 and its 3.5GB? AMD was offering the 390 8GB for $90 cheaper, with double the RAM and 10% faster. Same goes for the 6700 XT: double the RAM and faster than the 3070 Ti, and you guys kept on buying Nvidia.
@@unavailable291 But the 390 consumed almost 300W or more, and the OC models that could actually beat the 970 drew even more. On top of that, AMD had shitty coolers and the cards ran loud. So I wonder why people accepted 3.5GB of VRAM? People got more performance than with AMD because of the better drivers, and all that at half the power.
This is what I have been saying this entire generation. The 4070 is a 4060, with 4080 pricing.
Compared to Ampere, the 4070 would really be more like a 4050Ti. The 4070Ti is basically what the 3060 was. The 4070 is a further cut down version.
@@maynardburger ??? the 4070 is about 5% slower than a 3090, so by your logic the 4050 ti should be only 5% slower than the $1500 flagship that came in 2020...
If the 4080 12GB wasn’t rebranded as the 4070 Ti, there’s a real possibility that every card below it would’ve been moved up a tier; making this an RTX 4060 Ti.
that would have been a catastrophe.
But isn't that exactly what they did?
Going by the history of the 70 class, all of them had 256-bit buses.
The '12GB forty-fakety' has a 192-bit bus.
It was promoted two full tiers, knowing full well it would be spotted and that they would "eat humble pie and drop down a single tier".
Net result: the 4070 Ti should have been the 4060 Ti. And Nvidia carried on adding "upscaling" to their lineup.
@@ochsosfollies 4070 being 4060ti would have been more reasonable.
The mobile RTX 4000 series GPUs are a watered-down RTX 4000 series.
The mobile RTX 4080 is just an RTX 4070.
Let's also not forget how the new 103-class die is basically a rebranded 104 die, given its die size and 256-bit memory bus. That shifts the whole lineup down a tier, making this card more like a last-gen 108.
Yes, this is super important to realize. It's really even worse than many people understand. It's also why there's such a massive gap between the 4080 and 4090 in performance (when you can find a non-CPU-limited scenario for the 4090...). And even the 4090 is more like a 4080 Ti, if we compare with Ampere. There's a lot of room for further cut-down AD102 parts, but it's unlikely we'll see many at all because of how high they priced the 4080 already.
very few people understand this but it's just so obvious. That doesn't mean "it's bad" or nvidia are scheming or whatever. It just means that die codenames are as fluid as actual brand names. And people think you can just magically directly compare an AD104 to a GP104 or a TU104 etc. Which is just not the case. Just as you can't really compare the GA103 to the AD103.
I mean, it was pretty obvious that NVIDIA's entire product stack this generation was elevated a tier above what it should have been the second they unveiled the original 4080 12GB (now the 4070 Ti). Everyone rightfully called NVIDIA on their BS, but then forgot to consider that for NVIDIA to offer such a card, they needed to re-shuffle their entire product stack.
The only class of GPU that actually lines up with its die size is the 4080 Ti and up, which NVIDIA has decided to launch at insane prices.
4090 gang
And yeah, the card now called the 4070 Ti is still a tier below where those usually sit; as the '4080 12GB' it would have been two tiers below its own branding. It was/is straight up a "4060" based on specs alone. It's really wild.
elevated price tier
@@fVNzO Aye, the 4070 Ti should have just been a 4070, although there has always been some flexibility in the Ti entries; historically NVIDIA used them to fill gaps in the market where AMD is more competitive.
The 4090 is a good example: we didn't see the full AD102 GPU (the 4090 Ti) for nearly a year after release, because AMD simply doesn't have anything to compete with it, so NVIDIA isn't pressured to release anything better.
Nvidia feels like they can use DLSS3, specifically frame generation, as a means to sell off lower-end chips as midrange. AMD choosing, or just not being able, to be truly price to performance competitive with Nvidia is allowing this to happen. We need real price competitive competition in the space in order for things to change.
AMD needs to start aiming for market share and not their shareholders. Fighting for market share now is better for the shareholders in the long term, IMO.
@@kotztotz3530 Both companies are majority-owned by the same shareholders. So those shareholders have both the market share and the high profits at the same time. The situation is not going to change anytime soon.
I will gladly pay $100 extra for DLSS3 and the better RT performance.
@@prajaybasu go simp for nVidia somewhere else!
@@prajaybasu I wish you the best of luck having "better RT performance" playable on 8GB of VRAM and 128-bit memory interface
Why have big YouTubers not mentioned this before? This is big. It's like they're downgrading every series one grade lower than it originally was, to make the top 90-class GPU look super insane in comparison.
Hardware Unboxed have more or less said a similar thing, they just didn't go into more detail into explaining exactly what Nvidia are doing like this guy did. It's why it's frustrating when people say that 40 series/Lovelace is 'disappointing'. Like no, it's actually super impressive stuff. It's only the naming/pricing that's destroying its reputation. This is basically a Pascal-like uplift in performance and efficiency, but instead of sane pricing, they've essentially more than doubled the pricing for nearly every tier. So like if they sold us a 1070 for $800 instead of $400.
Hey, I just want to express that Nvidia has been doing this since the 600 series of cards. The 680 was a GK104 chip. It was a 670 sold as a 680. They release the full die as a super expensive halo product in this case the very first titan. NVIDIA has always tried to screw us over every chance they can get.
Lovely $120 card, honestly. Finally a great budget option from Nvidia! Good to see them price their 50 series reasonably again.
What do you mean it's a 60 class card?
WHAT DO YOU MEAN IT'S $300???
WHAT DO YOU MEAN IT'S €450?
This is no surprise. Nvidia literally just took almost their entire product stack and bumped it up one price bracket. The 4060 is a 4050, the 4060ti is a 4060, the 4070 is a 4060ti, the 4070ti (which was going to be pushed even further as a 4080) is a 4070... the only two GPUs that are in the correct classes are the 4080 and the 4090, but they are priced like a 4090 and a 4090ti respectively.
Nvidia be like: "Oh my GOD, we just made a powerful 4050 that is beating out a 3060. $200 is too low for it." And the rest is history.
Yeah, you basically get RTX 3060 performance, but it's much cheaper for Nvidia to produce.
PS: Gratz on hitting 1K subs!
@@HighYield Thanks brother. 😍
Uhh... generally that is how the generations went: you bought a 1080 and it was around the same performance as (or better than) the 980 Ti (with Titans mixed in there if you wanted to compare them). So the 4050 should be at 3060/3060 Ti performance for less money. What they have done instead is take a shit on the collective market and hope that people continue being sheep and don't buy AMD.
@@Azureskies01 True. If Nvidia had called it the RTX 4050 Ti and launched it at $229, it would have been a decent card and a good upgrade for many. Slightly faster than the old 3060, low power consumption and still decent for 1080p, with all the new features. It's not a bad chip, just bad naming and pricing.
It was intended to be an RTX 4050
It's not enough to go on, I know, but I kind of felt this way about the base 4070. We haven't had an efficient 70-class GPU since the 1070, and even then that was the mighty high-end models that used dual 8-pin connectors. Again, probably wrong, but the 4070's "efficiency" seems to me to be largely because it should have been a 4060. The 4090 and 4080 seem to be the only correctly named GPUs in the 40 series.
Nope, even the 4080 is cut down a bit. It has more VRAM, but a smaller bus: 256-bit instead of 320-bit.
@@Chrisp707- wellp, it would limit the memory options. I think 4080 is a fine card, but it should have been 4070/4070ti at best, with half the price.
@@Antagon666
The 4080 should have been named 4070 at $600.
A real 4080 would be a cut 102 die with a 320-bit bus and 20GB of VRAM at $900.
Shrinkflation in the gpu market is a real issue. The used market is so much better to buy from. I bought my 3090 for $660. I'm going to run that card for as long as possible. Hopefully by the 6000 series, both Nvidia and AMD will have multi chip module working and can bring greater performance at reduced costs.
We don't need the latest and greatest node if we can use more of the lesser nodes. Hopefully by then the AI boom should be saturated as well.
You also could have bought a brand new high-end AMD GPU for $600-700 instead. I'm pretty sure the RX 6950 XT goes for around $700.
@homelesswizard3161 AMD just stopped producing the card and recently increased its price from $600, because they think it gives better value than the new 7000 series that's going to be released soon. Don't buy the card now, because you'd be paying more than the original price.
@@kenjaws7066 But it's still around $700, and for a really high-end GPU targeting 4K, that's the best value right now. Its performance is really close to a 7900 XT.
@@homelesswizard3161 I could have bought the 6950xt but I’m also feeding the ai boom. I don’t really game much. I mostly use the 3090 with open source llm models and deep learning. I really like cuda and the vram this card offers.
I do game on the side though. The 6950xt is very good but the 3090 is more than enough for me. I think it does better with Ray tracing as well.
Yeah, you effectively bought nVidia and called it a day of victory. You really showed them. Congratulations. The money you spent will go to nVidia pocket, because someone now will buy new nVidia gpu. You made it possible. I'm sure you will find many reasons. Excuses, excuses...
And 3090 wasn't even that good card.
The 50 class has been pretty weird lately: the jump from GTX 1650 to RTX 3050 brought a $100 (~67%) higher MSRP and a 40-55W (~50-75%) higher TBP, but double the VRAM capacity, RT/DLSS support and 74% more performance...
Now from RTX 3050 to "RTX 4050" it's AGAIN a $50 higher MSRP for just 59% higher performance, while TBP and VRAM have been kept at 115-130W and 8GB... other than that, it's just DLSS 2 vs. DLSS 3.
Honestly, I wouldn't be able to decide which was the better generational jump. I don't care for DLSS or RT, but hated the TBP increase... and I can excuse the VRAM stagnation of the "4050" because the 3050 had just doubled it... Still, I'm hesitant to say "4050", because another $50 added to the MSRP means double the MSRP of the 1650 in just two generations!
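The MSRP side of that comparison can be sanity-checked in a few lines. The prices here are the launch MSRPs quoted in the comment ($149 / $249 / $299), treated as approximate figures rather than official data:

```python
# Launch MSRPs as quoted in the comment above (approximate).
cards = {
    "GTX 1650": {"msrp": 149},
    "RTX 3050": {"msrp": 249},
    "RTX 4060 ('4050')": {"msrp": 299},
}

names = list(cards)
for prev, cur in zip(names, names[1:]):
    diff = cards[cur]["msrp"] - cards[prev]["msrp"]
    pct = 100 * diff / cards[prev]["msrp"]
    print(f"{prev} -> {cur}: +${diff} MSRP ({pct:.0f}%)")

# Two-generation ratio: has the entry-level MSRP really doubled?
ratio = cards[names[-1]]["msrp"] / cards[names[0]]["msrp"]
print(f"MSRP over two generations: {ratio:.2f}x the GTX 1650")
```

The +$100 step is indeed ~67%, and $299 is almost exactly double $149, which is the commenter's point: the dollar increments shrank, but the cumulative inflation of the 50 class did not.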
subbed, love all the tech break down
Nail on the head as usual thanks.
If Nvidia had actually sold this generation of GPUs at their actual class, i.e. everything from the 4080 down shifted one tier lower,
there would have been no way anyone would have bought the overstocked 30 series.
End result: a half-generational upgrade for the buyer.
Bait and switch in practice, with the 12GB 4080 as the decoy...
"The more you buy,
The more you save"
-- leather jacket man
What I see NVIDIA trying to do is market DLSS 3 FG frames as real frames. I mean, look at their whole marketing: it's all about DLSS 3 FG fps numbers. So yeah, that's the point. The whole generational uplift is just efficiency and DLSS 3, which I honestly believe RTX 3000 could do too, if not for a software lock (remember, Ampere and Lovelace are basically the same thing aside from the MASSIVELY increased L2 cache).
I think it's time for GPUs to become modular themselves.
I hate the idea of having to replace my GPU so early, just because it doesn't have that much VRAM.
I have a 3060 Ti, and I'm very happy with it. But game developers have been so incompetent lately that they don't optimize VRAM usage, and since gamers just buy whatever is out there, I don't see games being more optimized in the future.
I would love it if I could just install more video memory directly on the card...
Oh well. I'm just not gonna buy unoptimized games.
Not gonna lie, this video has taught me a TON about which components are what. It was nicely displayed and highlighted; I felt like I instantly understood the whole floorplan.
I am glad I took a chance on an Intel A770. I picked one up on a fire sale for $289 and couldn't be happier. I can play almost any game at medium-high settings and get over 60 fps in many games at 1440p. Drivers have improved a lot since March. And they keep getting better.
My gtx 1080 has been doing that for years... all for the $250 5 years ago... I just can't deal with this market
I think that with the poor sales of the 60-class lineup, if the 50-class cards have anything lower than 8GB on a 128-bit bus, they will be an automatic DOA.
Great and insightful as always.
I haven't bought a new GPU since the GTX 1080. Pretty much nothing from team red or green have truly impressed me in over 8 years. The 3080 was good, but mining and scalping ruined it. The 6700 and 6800 were good, but are only just now becoming cost effective, yet they've been out for two years. They were overpriced at launch even without mining/scalping. What they cost now is what they should have launched at, or at least only a little bit more than what they are now. The problem with buying one now is that the prices should be even lower. A 6700 should be a bargain GPU now.
I'm still holding on to my 1080 Ti... But if I had to upgrade right now, the best value-to-performance I see in my area is the 7900 GRE, which is going for around 600€. It offers high-end performance, not the best, but still great for that price tag, with 16GB of VRAM, which is plenty for a card like this and will probably survive the test of time well.
Most of these cards could have been like the 1080 Ti, but Nvidia does not allow this: they screw up either the price, the VRAM or the name, or all three at the same time. AMD is not popular in the GPU market; they can offer something amazing but will never sell as much as Nvidia, and this hurts AMD because they are trying to get into the market more and more, and they offer better value-to-performance than Nvidia most of the time. With Nvidia, you are buying a fake product, produced to trick the consumer into buying something it is not, at a higher MSRP, while AMD is just trying to keep up with Nvidia. Intel at this rate is not going anywhere... I did like what the A770 16GB offered, but the driver issues and slow release schedule make them a no-go for the higher-end market, which means they have to compete in the low-end market, and the low end is already being replaced by APUs to some extent. So, uh... yeah. I feel like AMD and Intel will battle for the low-to-midrange area of the market, while Nvidia continues to scam people and get away with it, which is sad.
I think Nvidia was really testing the waters with their GPU launches, seeing whether the demand is still there; if not, they can always just sell for less. I believe they've realized the "lower-end" customers are easier to undercut, because to some people what matters is just getting decent FPS at normal settings in 1080p. I can speak from experience: I have a 1050 Ti in my desktop and there's no way I'd shell out hundreds of dollars for a GPU upgrade. The one I'm currently thinking of is a 2060 Super. Sure, a used 2070 or 2080 is way better value, but they're still around $300, which is a gamble I just can't take, because I suspect my poor performance recently hasn't only been due to my GPU: CPU usage is pretty much always higher when gaming, according to Task Manager. So at this point I'm unsure if my i5 8400 deserves an upgrade, because every site says the bottleneck in a build like that shouldn't be the processor, yet it feels like it is.
It is actually WORSE than people realize. AD103's inclusion in the main lineup is more or less the equivalent of what x04 parts were before. In other words, every die after AD102 has basically been pushed down. AD103 = x04, AD104 = x06, AD106 = x07, and so AD107 is not equivalent to GA107 or whatever, it's effectively a new, even lower tier!
This is why Nvidia should've been more friendly to AIBs, especially EVGA. Yeah, it's fine if they want to be an AI company, but at the very least they could've let the partners vote on how the leftover chips from their AI business get classified. Because that's what it is now: gaming GPUs are what we get when the wafer doesn't pass the stability and spec requirements for workstations, servers, and supercomputers. It's the same microarchitecture every time.
Not looking hopeful for the Rtx 5000 series. Tiny chips on super expensive 3nm perhaps.
need more ppl talking about this topic
It's great for Nvidia that they can get the same-ish performance out of 150mm² that last gen needed 392mm² for, with some added features as well. Nothing is flowing through to consumers, though.
If you are talking about the 3060 Ti, that card has the same chip as the 3070, with a large part of it disabled, so the die size doesn't mean much for the 3060 Ti.
lol, I'm surprised nobody else seems to have picked up on the x050 die being used on the 4060
As someone who has limited knowledge of GPUs, can I ask a question?
Would an old high-end card like a 5700 XT be on par with brand new lower/mid options like the RX 7600 and 4060, or would it be better/worse?
You can get a used 5700 xt for £160 uk or a 4060 for twice that
The 5700 XT would be slower, but only by like 10-20% depending on the game, and at around 50% cheaper it is a really good deal if you have a PSU that can support it (it draws around 225W). The newer cards would only be better if you can't get a good enough PSU.
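A minimal perf-per-pound sketch of that comparison, assuming the ~£160 used / ~£320 new prices mentioned above and taking the 5700 XT as ~15% slower (the midpoint of the 10-20% range):

```python
# Normalised performance: RTX 4060 = 100, 5700 XT ~15% slower.
# Prices are the rough UK figures from the thread, not current quotes.
rtx_4060 = {"price_gbp": 320, "perf": 100}
rx_5700xt = {"price_gbp": 160, "perf": 85}

for name, card in (("RTX 4060 (new)", rtx_4060), ("RX 5700 XT (used)", rx_5700xt)):
    value = card["perf"] / card["price_gbp"]
    print(f"{name}: {value:.2f} perf points per pound")
```

On those assumptions the used card delivers roughly 70% more performance per pound, which is the point being made, power draw and warranty aside.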
@@transistorjump919 Yeah, a lot of people did not understand that AMD simply did not make any high-end RDNA1 parts at all. The 5700 XT was the 'highest' offering they did, but it was not actually a high-end part. It was more comparable to Polaris 10 (RX 480/580) than anything, coming in at 250mm². Just an ordinary midrange part, but with upper-midrange pricing.
VRAM is quite cheap for Nvidia: 16GB costs them around $30-50. They just won't do it, because it isn't in their business plan (you must upgrade to a higher-tier GPU for more VRAM, bud).
A big reason for the renaming of the RTX 4080 12 GB version was there was already another RTX 4080 with 16 GB. If that wasn’t the case nothing would probably have changed other than I would assume the 16 GB would be called RTX 4080 Ti, lol.
I would buy this card if it's around $200. I really think it's useful because of the low power consumption, and the games I'll be playing will be on medium settings at 1080p.
The main issue with this 4000 series isn't just the price/performance, it's the introduction of the RTX 4070 Ti right where the RTX 4070 would normally sit, thus making every card down the stack a lower-end product with a higher number and price tag. Now, you can say names are just arbitrary, but they aren't if you price the products by their names and not by their performance tiers and specs.
This in turn makes the RTX 4060 an RTX 4050 by specs and performance. The RTX 4060 we should've gotten costs $400 and is the RTX 4060 Ti, which in itself is marketed as the RTX 4070 and goes for $600. The 4070 Ti is what should've been just the regular RTX 4070, and they could never have gotten away with selling it for the absurd price of $800, so first they tried calling it the "4080 12GB", but that didn't work out and they ended up changing it, still fucking us over just the same.
Also, all the cards have cripplingly low amounts of VRAM.
Imagine if Toyota took the 2024 Yaris, upped its price to match the 2023 Corolla, and called it a Corolla although it's the same size as a Yaris; then renamed the Corolla to Camry and priced it like a Camry although it's just a Corolla inside and out; then made the real actual Toyota Camry the "Camry Premium" or something and priced it at almost twice what it was the previous year. That's the 4070 Ti situation.
This is exactly what Nvidia is somehow getting away with, and it's mind-boggling.
Actually the *4080* is what the 4070 would normally be, at least comparing to Ampere. I really don't think many people understand how truly insane Nvidia's naming and pricing greed for the 40 series is.
At this point, where the two main players don't want to compete, I would support the third player: buy an A750 or A770.
I waited more than 5 years to replace my RX 590, and there is still no feasible (new) upgrade from it at the same money I paid for it ($200).
So I took the logical step of switching to the third player.
Even the 4080 looks like a 4070 Ti, and the 4070 Ti which was going to be released as a 4080 12GB looks like a 4070
It's worse than that. Going by Ampere, the 4080 is the equivalent of what the 3070 was. And the 4070Ti is the equivalent of what the 3060 was. Even the 4090, which is touted as the 'good' one, is actually more like what the 3080Ti was.
@@maynardburger imagine the performance we would've gotten if they actually created the REAL ones. But then again they would've had another 1000 series which would last longer and hurt their sales. The 3000 was liked by pretty much everyone when they launched, for their price to performance and gains. But now 2-3 years later they are running out of VRAM. Sad times
The memory bus alone gives it away
It is a 4050. We're going to have to look at the technical specs before buying anything going forward. Nvidia and AMD are both doing this.
The RX 7600 uses Navi 33, the direct replacement of the RX 6600's Navi 23. Both have a 128-bit bus, both have 64 ROPs, both use 8GB of GDDR6. Navi 33 has more cores and TMUs. So at the low end, AMD isn't doing what NV is doing. This is why the RX 7600 performs like a 6650 XT, which is 1.5 tiers above the RX 6600 (25-30% better).
Even though the 4090 is "a monster of a chip", it is still an xx80-class card. Maybe a Ti, but they still got their wish: the 80 Ti for $1600!!!
Instead of a true generational uplift we are now expected to pay for what we got as an expected benefit of progress in the past.
My 3080 is comfortable where it is. Let them learn the hard way. A.I. is making Nvidia big$$$ now but the bubble will burst.
Then where do they look to? The ones that made them.... Gamers!!!
I immediately said so when the 4050 and 4060 laptops were released. The 4060 is a 4050 Ti: only ~15% more cores. That is not a class difference, it is a Super or Ti difference. Take the 2060 and 2060 Super, for example.
They are the same AD107, and the 107 chip has been the xx50 tier for years.
Ouch. I hope this video gains a lot of traction
60 class used to be mid-range? I thought 50/60 class were low end, 70 class was mid range, 80 high end and 90/titan as flagship
90/Titan didn't even exist for a long time. 80/Ti was high end, 70 was between high end and mid range, 60 was firmly mid range and 50 was entry level.
@@HighYield 90 was also really only used for dual gpu cards with 2x 80 series gpus on them before 30 series.
I don't understand how Nvidia "seems" to care about backlash so much that they "cancelled" a GPU, but doesn't care enough to fix the naming this generation when there's a ton of backlash and a lack of sales.
That was before they became the darling of the investment world with their AI and trillion-dollar status. Now more than ever, Nvidia 'belongs to the streets'. Wallstreet.
my first nvidia card was a geforce 2 MX, never went back. Thx amd ^^
symptom of a lack of competition
while the desktop market performance uplift for mid range sucked, the laptop market is actually pretty okay
Yet it can compete in performance with AMD's RX 7600, has the Ada Lovelace feature set, and can wipe the floor with it in RT. We can't criticize Nvidia for this one; the competition underperformed.
Nice video man, subbed... I have a 4060 45W laptop, and I think I might've been better off with a 3060 lmaoo 😬
Let's just hope they lower their prices by 50-70 by November or even earlier. Buying now brings big regrets.
Literally why I will probably wait another 3-5 years before upgrading. Probably will not go with NVIDIA next computer.
I really hope there will be no 107 die next gen... but knowing Nvidia, the 106 gets rebranded to 105 and the 107 to 106 🙄
Nvidia is scamming its most loyal customer base.
Do you believe in customer loyalty toward giant corporations in a practical duopoly? Buying a product from a brand that usually makes good products is reasonable logic, but "brand loyalty" to a corporation this large is like walking off a cliff while staring at an airplane you're following from the ground: a terrible idea, and the airplane doesn't care what you are doing. Don't get out of your car at the Grand Canyon and run around in circles in the middle of the night, blindly thinking the pit in the ground is so beautiful that it cares about you. If you're spending that much money, checking out the product you might buy is important. I've bought a lot of products I've regretted, but I don't blame the company selling them for my own lack of knowledge about widely available information. You are not being scammed when information about a product is widely available. The RTX 4060 is probably not a great product to purchase; sadly, not everyone has the time, or even the ability, to realize this.
This kind of action where you try and sell something which is of much lower quality than expected at the full price is generally known as a "scam"....
Honestly trying to compare the tier grading of the new generation cards to the old gen cards is no longer the same as the chipsets are different between the series tiers.
Now you need to look at the 50, 60, 70 and so forth by how the cards are placed within THIS generation's offerings or product stack, as that is all that really applies or matters.
What they call a card is irrelevant, you are buying a card based off the features you want including the amount of vram and the amount of money you are either capable of or willing to spend to get that level of performance you desire and are willing to pay for.
If a person wants a higher level of performance or features than a certain class of card delivers, then the answer is to move up to a higher-tier card in the product stack, but be aware that it will cost more money to get those extra features or performance.
You willingly have the choice to set your limits to what suites you and there is always console gaming if you are looking for a cheaper gaming alternative.
But thinking these companies owe you top tier performance at a bottom tier price point and whining because they do not is not going to get you anywhere.
Speed cost money, just how fast do you really want to go?
What is the RTX 4050 then, a HALF-length card?
I'm never buying Nvidia again. It's AMD/Intel from here on out. Selling us a 4050 Ti for $400 was the last straw in 15 years of bullshit from this company.
I'm a bit disillusioned now. I bought my 970 in 2015 and planned to buy a new GPU in 2019 or 2020. Then the cryptocurrency boom and the pandemic happened. Prices were crazy.
And now I feel like the only reasonable choice would be an Arc A770 16GB (where it's still available), but it's a leap of faith. Faith in Intel still working on the drivers.
So in the end it feels like I will stay with the 970 as long as it runs good indie games, and rely on my CPU for my amateur graphics and video rendering...
What about a cheap RX 6700 XT, if you can find one?
@@HighYield thanks for your answer :) it's good value for now, but I fear that for that kind of money it won't age very well.
There are two reasons why I'm not sure about buying it. First, in general I feel that I would get less than I did for the same price in 2015 (adjusted for inflation).
The second reason is my 650W PSU. The manufacturer's calculator suggests that even a non-OC version of the 6700 XT would put me dangerously close to 90% load... That raises a question, maybe you can answer it: such calculators use peak power consumption, right? If I limit my 13600K to 180W, then there probably shouldn't be any problems, since the 6700 XT draws between 100W and 250W in games, am I right?
In the end it's probably just me not having enough money in a bad market and a bad economy ;)
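The headroom math behind that PSU question can be sketched quickly. This is a minimal back-of-the-envelope estimate using only the wattages mentioned in the comment above; the rest-of-system figure is an assumption for illustration, not a measurement:

```python
# Rough PSU headroom check, using the figures from the comment above:
# a 650 W PSU, a 13600K power-limited to 180 W, and a 6700 XT drawing
# up to 250 W in games. REST_OF_SYSTEM is an assumed allowance for
# fans, drives, RAM, and board losses.

PSU_WATTS = 650
CPU_LIMIT = 180        # 13600K with a 180 W power limit
GPU_PEAK = 250         # RX 6700 XT worst-case gaming draw
REST_OF_SYSTEM = 75    # assumption, not a measurement

total = CPU_LIMIT + GPU_PEAK + REST_OF_SYSTEM
load_pct = 100 * total / PSU_WATTS

print(f"Estimated peak draw: {total} W ({load_pct:.0f}% of PSU)")
# With these inputs: 505 W, roughly 78% of a 650 W PSU
```

Since online calculators typically sum worst-case peaks like this, the ~90% figure they report already has margin built in; capping the CPU as described pulls the estimate well below that.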
@@HighYield I just have to bother you again. I'm in fact set on a new GPU, but I'm not sure which is the better deal: a 6700 XT (Sapphire Pulse or Asus Dual) for 390€, or an A770 (Bifrost) for 325€.
"And welcome our newest product, the RTX 4050, also called the 1030 v2"
So what would the 4050's true name be?
The 40 series just doesn't abide by the class naming system; better to just ignore it and judge the cards by performance.
Imagine if they didn't pull this sh*t. The 4000 series would've been the best generation if these cards had been properly branded.
Me waiting for the 5000 series just for the 5050 name
They should've made the 4060 Ti 12GB, added 200 MHz to the core and 1100 MHz to the memory, and called it the 4060.
It was kind of Nvidia not to sweep AMD: had they called it a 4050 and priced it cheaper, it would have made the 7600 look (even) worse in comparison.
If the 4060 were cheaper, then you would be right, but it's not; it's more expensive by $50. In this price range, where the buyers are low-end, that makes a big difference, except to shills. Much like when the shills bought the RTX 3050 over the RX 6600; the performance difference to this day is disgustingly in favor of the RX 6600. Bottom line: they didn't, and buying a last-gen card is better than buying a 4060. But a shill will still defend it and buy a 4060.
Seriously though, Nvidia's greed has absolutely bailed out RDNA3 from looking as bad as it really is. With more sane pricing, the 40 series would have forced AMD to sell high-end parts at midrange prices (kinda like they had to with Vega), which would have been painful, especially in the face of their chiplet strategy, which was supposed to help them maintain good margins.
@maynardburger AMD is doing the same as Nvidia, just not as extreme. Navi 31, 32, and 33 are super cheap to manufacture compared to Ada (less than half the cost), and even compared to Navi 21, 22, and 23. The chips are smaller and they're using 6 nm instead of 4 nm.
Then there is the ASUS STRIX 4060, the same tiny GT 4030-class GPU with an absolutely hilarious 3-slot cooler. Low-end trash in disguise.
The entire 40 series is such a scam. Half the memory bandwidth compared to the RTX 30 series, yet at a higher price. Really, 5 years of backwards development.
And yet Jensen Huang at Computex claimed we gamers were "lucky" to be getting more L2 cache and... cutting-edge AI frame generation and ray tracing!
Remember the future for graphics is AI image generation and no longer rasterization, at least according to him.
When he said this in his presentation my jaw dropped because he actually believed he was doing us a favor!
Of course during his Computex speech he looked coked up so I wouldn't be surprised to see him act like a clown on stage.
Nvidia backed themselves into a wall with their naming scheme.
In two generations, if they have a "50-series" model, it'll be called a "6050". NO ONE is going to buy that.
Ergo, every generation since the RTX 2000 series, they've been altering their tier lineup ever so slightly, in the process eliminating that possibility.
Isn't it cool that the company could fit a low-power chip and still achieve x60-class performance, at least by using AI? Not a fan of AI, but I like the idea.
Anyone who watched Jensen Huang's presentation at this year's GTC conference and his max cringe keynote at Computex already knows NVidia doesn't care. For Huang and for NVidia all that matters is feeding the AI/Data Center market.
I've been keeping track of the orders for their $30,000 H100 GPUs, and they're selling them in 10,000-unit orders to AI startups and data centers.
Consider that the price of 1 H100 = 100 RTX 4060 GPUs!
Recently Inflection AI raised $1.5 billion in VC funding; they have already stated they're going to expand their data center for AI training from 3,600 GPUs to 22,000 H100s!
Twitter is buying 10,000, OpenAI is buying at least another 10,000 through MS Azure, and AWS and Meta are also acquiring at least 10,000 more.
Their AI sales are going to be at least double what they've made from gamers, and with far fewer physical units sold.
Gamers are yesterday's news. They'll give us what they want, and we either buy or don't; they don't care.
And for those saying AI is a bubble: yes and no. This round of funding may be a lot of hype, but the dollars are real. When it dies down, think of AI funding as a form of permanent crypto market; the high prices are going to stay from now on.
NVidia has no incentive to cater to gamers and every incentive to cash in as quickly as possible.
NVidia's stock closed today at $423/share, let that sink in.
Wait till the theoretical RTX 4050 sporting an AD108 die with half the cores of an AD106 (2304 CUDA cores, 72 TMUs, 24 ROPs, 24 Tensor Cores max), 6GB of GDDR6 on a 96-bit bus, and PCIe 4.0 x4.
Nice!
Also don't forget their desperate propaganda to even sell those GPUs...
Whole freaking blog posts about "effective bandwidth", 50W lower power consumption with freaking calculated power bills, smeared frame gen, 8 GB of memory as the lord and savior, and super-duper ray tracing improvements.
It makes me just sick.
They really need to investigate whether Nvidia and AMD are manipulating the market. Like, Lisa Su's grandfather is literally the brother of Jensen Huang's mother. They are literally one family.
These were built for Dell so they can up-sell them as top tier gaming machines at Walmart.
I've got a 3050, and the 4060 for $300 has double the power, so I'm going to upgrade. (If anyone wants to know, I've got an i5-12400.)
Honestly, bar the 4090 and 4080 16GB, every card has felt like a one-tier-down card in terms of performance, whereas ALL the cards have been priced one tier up.
I believe they all got shifted: the 70 is the 60, and the 90 is the 80.
I was cleaning up my backyard after my doggie made a mess. I picked up a big turd and named it 4060.
"Branding does not fit", sure, but AD104/106/107 are designed for the laptop market, shoehorned into desktop card form factors. Even AD103 is a mobile/desktop hybrid that works okay doing double duty. AD102 is the only true desktop GPU this generation, and there is even controversy about that, given the area decrease versus GA102/100 and TU102/100. You can see Ada is primarily a mobile generation: the slim memory buses save power, and the x8 PCIe link on AD107 is all about low power and laptop board real-estate savings. Nvidia also knew Intel targeted Arc at Raptor Lake CPU+GPU kit sales, and knew Arc would fail on its first attempt. So Nvidia, with Ada's mobile emphasis ready, stepped right in to save contract manufacturers with Ada GPU supply: 50M units by my estimate for AMD and Intel Raptor, Phoenix, and Meteor mobile H attach. The other issue was no AD102-based 4080 Ti at launch, which left a big void in Nvidia's product stack. The 4080, a mobile/desktop hybrid, is really 70-class, and when an AD102 4080 Ti launches in Q4 '23 or Q1 '24, it will push the 4080 down into its actual 4070 price rung, which will also dump on RDNA 3. Blackwell, I suspect, will stay on depreciated 4nm to offer a mass-market desktop card on the economics of cost versus price and margin, returning to a desktop emphasis. That was Ampere's emphasis on a process tick, where Ampere also did mobile double duty but from the desktop design angle, as you can see in the GA bus design. So the pendulum will swing back to desktop design emphasis with Blackwell from Ada's mobile emphasis, while Ada continues to serve mobile H attach through 2024. mb
Haha, you're assuming they will never release a 4050. Wait and see...
I never knew the 750 Ti and 750 were Maxwell, I always thought they were Kepler! The more you know!
They are still supported by the newest drivers thanks to being Maxwell-based. And from what I've heard, they even beat higher-end GTX 700 series cards in some newer games.
You can tell it's a 4050 by the benchmarks. The 3050 Laptop and the desktop 3050 have about a 25% difference in performance. The 4060 and the 4050 Laptop also have a 25% difference in performance. So automatically that makes the 4060 a 4050.
Then what is the 4050? Is it better than a 3050 Ti?
Interesting: Yes
Surprising: Nope
But... it has electrolytes (aka DLSS3).
Shrinkflation smh but also lol
Nvidia keeps doing this, lowering card tiers and raising prices.
Nvidia thought their consumers were too dumb to notice.
64bit... accurate
Jake AppleStrong ?
Buy used! So many great GPUs at great prices!
@@N_N23296 oh wow lol that's soo upside down
how dare you nvidia I thought you were my friend :(
I mean, their name and logo LITERALLY mean envy. I put how much they care about us on the same level as Apple. If the GeForce Partner Program (GPP), EVGA leaving, many anti-competitive practices, lying, "GameWorks," artificial scarcity, etc., didn't make you realize they aren't your friend, then I don't think anything can.
Every single card in the 40 series is a downgrade by one tier: they canceled the real 4090 and named the 4080 the 4090, so the 4070 Ti became a 4080, the 4070 became a 4070 Ti... etc.!
It shows that both AMD and Nvidia screw over consumers.
It's true, this was never meant to be a real 4060; this is a 4050 in disguise.
It seems like the 4060 Ti should have been the "4060".
I think they will never launch a 4050, because it won't make any sense at all! Nvidia will push these GPUs (AD107 and AD106) as entry-level AI-intended accelerators, with much wider buses and more memory, to dominate the AI market share.
With a 96-bit bus and 6GB of VRAM, it makes perfect sense as an xx30-tier GPU for $250. At least for Ngreedia.