Also just to be clear, viewers have said in no uncertain terms that the then upcoming 16GB version of the RTX 4060 Ti is a waste because of the 128-bit wide memory bus, as in there won't be enough bandwidth to service the larger capacity.
@@transistorjump919 .. for most of the current titles. That's important to note, because future titles may assume and be designed to utilize larger amounts of VRAM and then the limited bus width may come to bite you hard.
@@transistorjump919 Because of Resizable BAR (Smart Access Memory at AMD, Clever Access Memory at Intel), memory bus bandwidth doesn't matter that much anymore, unless you have a very old motherboard or you are not smart/clever enough to enable it ;-). Nevertheless, the card is much overpriced and might only be worth it for people who use it for AI and make their money back quickly. Maybe that's the reason NVIDIA set this high price and does no promotion for it: AI needs lots of memory, not that much compute power. This card would kill the sales of their higher-priced cards, which sell only because they don't offer anything else with enough memory at a decent price (and AMD or Intel cards are harder to use for AI at the moment).
@@transistorjump919 Indeed. If one also looks at the numbers and what games are popular and actually being played, 8GB is plenty. No one playing Apex Legends or Fortnite at all seriously is running max textures; you're either a noob or happy with a disadvantage. Pretty games are distracting games, and most games being played are multiplayer games. It's almost as if this loud tech crowd is just one of those vocal minorities whinging for the sake of whinging. They don't vote with their wallets. They're full of 💩
NVIDIA has been lowering the RAM on all their cards since the 30 series in order to force you to upgrade sooner. It's not stupidity, it's planned obsolescence.
@@transistorjump919 No, you idiot. They used a scam to give people less than the 4GB the card was supposed to have: the cards were sold as 4GB cards but only had 3.5GB of full-speed memory.
@@transistorjump919 They still got successfully sued, so it doesn't count. You could technically argue the AMD FX had 8 cores, but the state of California says otherwise.
I don't know if you need a lot of VRAM when using frame generation. Is frame generation VRAM-dependent, or does it lessen the importance of VRAM amount?
@@SupraBagels This is the result of AMD's lack of competition. For f^ck's sake, Nvidia owns the market. Why would they be incentivised to do anything that borders on monopoly grounds, which the community would then complain about? Why do they NEED to do anything? They aren't a public service. If you don't like it, go buy elsewhere. What's that? The hardware surveys say the community is full of 💩
Nvidia made a "4050 Ti" with more VRAM than 4070 Tis that cost more than double the price, when this card doesn't have enough power to even push settings that use a full 16GB, and said: "🗣️🔥This is some heat! Drop it in July!"
@@transistorjump919 Nvidia themselves market this card as a 1080p GPU. Sure, games push over 8GB, but they're not hitting 16 at 1080p. Plus, the 128-bit bus bottlenecks it so hard that frame rates at native 1440p at high/ultra settings barely maintain 60fps.
@@sooubic lmfao, if you call transistorjump an Nvidia fanboy then you're clearly clueless about the usual people in this comment section; your blind hate is showing too much. While I agree the 4060 Ti 8GB and 16GB are bad, at least I'm not as much of a fanboy about AMD as you are.
Nope. 4080 12GB was fine. Also the 16GB 4060 Ti seems to run very cool which is nice. So either a 4060 Ti FE it is or some decent partner model 16 GB version. Just ignore the upsell stuff if you think this is not the last GPU generation and you are not yet sure whether or not this will be your last graphics card. Still waiting for more power consumption charts (idle, video playback, multi-monitor).
No, it's just an extremely expensive way to solve the 8GB VRAM problem (= new games designed around consoles). Even if the last 4GB is never really used, the old RTX 3060 was simply more cost-effective.
I have an NVIDIA RTX 4060 TI with 16GB of VRAM, 32GB of RAM, and an AMD Ryzen 7 5700X processor. I have to tell you, I have almost all modern games and I don't experience any latency issues. And my motto is: Your money is for you to buy, mine is for me to buy, your GPU is for you to play, whereas mine is for me to play. My budget is not yours, which is why I have a 4060 TI with 16GB. Additionally, I seek power efficiency because we have two gaming PCs at home and I'm the one who pays the electricity bill. I chose this GPU because I like it, and I also like the MSI brand. I bought it new, and as the Dominican gamer saying goes: He who dies for his GPU, death tastes like glory.
I thought the mining community lived in their own reality, selling heavily used cards at ridiculous prices, but it's actually Nvidia, getting high on their bottled-up mining-boom farts, who lost touch with time and reality.
I got an ex-mining RX 6800 for 400 euros last December and was lucky the guy was active on forums here, so I could see exactly how long he mined with it and how seriously. I got an awesome deal, so miners are better than Nvidia in my case. :)
@@how2pick4name AMD and Nvidia were not amused that their GPUs got scalped. There seems to be an empty gap between what miners and gamers are willing to spend, despite the improvements made this generation.
Got an ex-mining 6800 XT Sapphire Nitro+ OC for £330 and a 3070 for £230. The 3070 was in almost perfect condition. Mining always gives the best deals, after causing prices to skyrocket.
I personally think the 4060 Ti 16GB card was created not for gamers, but for those trying to get into AI. A lot of AI-related work requires more than 8GB of VRAM and performs best on Nvidia cards. I kind of wish Nvidia had marketed this card for AI acceleration instead of for gamers. It might have gone over a little better... maybe... maybe not.
@@transistorjump919 An entry-level card for AI matters: education, AI drawing, etc. A lot of applications. Having enough VRAM to run a large model is more important than processing speed, if you have to pick one.
Update 1-10-24: Today, the 4060 Ti 16GB priced at $450 is the best bang for the buck for generative AI purposes that strictly require an Nvidia GPU; the second-closest option is an additional $500. I didn't buy them for gaming, so I'm not concerned about that usage.
THIS. Yes, this card is not great for gaming, but it has enough VRAM for SDXL and local LLMs. The 128-bit bus is slower, but for AI, a model running 30% slower in VRAM is MUCH better than not running the model at all or having to run it from system RAM.
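To make the capacity point above concrete, here's a rough back-of-envelope sketch (a hypothetical estimator, assuming fp16 weights at 2 bytes per parameter; real usage adds activations, KV cache and framework overhead on top):

```python
# Rough VRAM needed just to hold a model's weights, assuming fp16
# (2 bytes per parameter). Illustrative estimate only: real usage
# adds activations, KV cache and framework overhead on top.
def weights_vram_gb(params_billions, bytes_per_param=2):
    return params_billions * 1e9 * bytes_per_param / 1024**3

# A 7B-parameter model at fp16 needs ~13 GB for weights alone:
# it fits on a 16GB card but simply cannot load on an 8GB one.
print(round(weights_vram_gb(7), 1))
```

That threshold effect is why a slower card with more VRAM can beat a faster card with less: below the capacity line the model runs at all; above it, it doesn't.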
@@zerorusher The thing is, it IS great for gaming at 1080p, which is the resolution most people over 30 play at. Two-thirds of my library looks awful at higher resolutions anyway; hell, with my eyesight I can barely tell the difference. But yeah, it's targeted at LLMs and AI generation, and it does the job for a decent price. I already owned a 3060 Ti, moved to the 4060 Ti 16GB, and don't regret it at all.
I bought an Nvidia 4060 Ti with 16GB and it's a very good card; I am fully satisfied with its performance! It is miles ahead of my previous GeForce 970, with the extra benefit of lower power consumption. My friend still owns his 3060 Ti, and my 4060 Ti outperforms it in every game we tested. The hate towards this card comes only from its price; otherwise the performance is fully justified.
Every card has good performance; just not every card has good PRICE TO PERFORMANCE. You could have bought a 7700 XT or a 7800 XT for more or less the same price!
@@mohandx4z I originally bought AMD cards but recently switched to Nvidia, and let's be honest: drivers, ray tracing and upscaling are all far better, as is the power consumption, and the systems are quieter.
@@GewelReal I never argued that point; I was arguing how ridiculously poor value this GPU is. I have a 4080, but that's because I had the money and my 2070 died. The 4080 is still poor value despite being an incredible card. Its power usage is INSANE, like 4K games at 230 watts.
@@WSS_the_OG Exactly my point. Hell, even the 3000 series was actually very good value if you look at the retail prices. The 3080 was £699 in the UK, which I thought was a bit much at the time, but looking back now it was an insanely good deal if you got one.
Hopefully Nvidia won't mess up the naming scheme of their own products again. I guess the intern dropped the naming shields and mixed them up or something.
Well hey, if you're a data science major wishing to build a "budget" PC with decent hardware for deep learning and AI workloads, that 16GB RTX 4060 Ti is a steal! Before, our only option was the 12GB RTX 3060, which was an okay card, but that 16GB will surely open possibilities for bigger and more complex models, with enough CUDA cores.
Coming from a 1080p gamer here: we still exist. I bought an RX 7600 OC and it can do 1080p no problem, and in most cases it handles 1440p pretty fine too. Not everyone is switching to 1440p YET.
The difference in VRAM-limited situations between the 8GB and 16GB 4060 Ti is enormous (no stutters and up to 4x the fps in, for example, Callisto Protocol with RT), so the extra $100 is well worth it. But at a $500 price point something like the 6800 XT makes more sense if you don't care about power draw.

There is approximately 13GB of RAM available to developers on the PS5, but they can't allocate all of that to the GPU alone; realistically they allocate around 8-10GB as VRAM (the TLOU1 remake was using 10GB of VRAM for sure). In my opinion, GPUs with 12GB of memory will be sufficient for a long time for someone gaming at 1080p. The most VRAM-demanding games will require some tweaking at higher resolutions, though. For example, R&C Rift Apart can easily fill 12GB of VRAM at 4K; I saw 3080 Ti gameplay at 4K DLSSQ where fps went from 25 to 100 once the game was no longer using over 12GB of VRAM.
You can buy a 192-bit bus Nvidia GPU; its name is the 4070. You can also buy a 256-bit Nvidia GPU; its name is the 4080… You are barking up the wrong tree. You can say that GPUs are expensive; that I agree with wholeheartedly! But asking why Nvidia doesn't make 256-bit GPUs, when they do make 256-bit GPUs, is not valid. Asking why the prices are bad is, IMHO, the valid complaint.
What difference would bus width make? The bandwidth of the 4060 Ti is 288 GB/s (not including the cache). That's a massive amount of data per second; even a 4K scene does not require that much data per second to render.

Bus width is NOT A LIMITING FACTOR when it comes to games. It can cause a bottleneck, but only if the game is heavily data-streamed; maybe a flight-sim kind of game at 4K with everything on ultra, where bus width may be a limiting factor. Most games are rendered and processed on the go, hence they rely heavily on GPU processing at higher resolutions. VRAM capacity can bottleneck, because rendering a scene at 4K sometimes requires over 12GB in memory, but bus width doesn't affect squat in 99.99% of games. This "more bus width = more performance" adage is from the stone ages.

Also, the 4060 Ti is NOT A 4K CARD. 128-bit is way more than enough for 1440p, and Flight Sim doesn't stream anywhere near that much data at that resolution. There is NO game on the mainstream market that will ever get bottlenecked by a 128-bit bus at 1440p or below, so bus width isn't the problem with the 4060 Ti. The MASSIVE reduction in processing units is the main problem: games that utilize the GPU's cores well will show negligible uplift over the 3060 Ti, or may even perform worse. Bus width has almost zero effect at 1440p or lower resolution, so stop whining about higher bus widths.
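For reference, the 288 GB/s figure falls straight out of the standard formula (a quick sketch; the 3060 Ti numbers are the usual published specs):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin
# data rate in Gbps. The 4060 Ti pairs a 128-bit bus with 18 Gbps GDDR6.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(128, 18))  # 4060 Ti: 288.0 GB/s
print(bandwidth_gb_s(256, 14))  # 3060 Ti: 448.0 GB/s (256-bit, 14 Gbps)
```

So the newer card actually has lower raw bandwidth than its predecessor; Nvidia's counterargument is that the much larger L2 cache absorbs part of the difference.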
@@ioannispoulakas8730 A 192-bit 4060 with 12GB doesn't make sense; 128-bit with 16GB is perfect. They should have increased, or at least matched, the processing units. It's literally barebones CUDA, texture mapping and render output units; the only reason there are any performance gains at all is the gen-on-gen architectural improvement per processing unit.
Got an RX 6800 XT for just over $450 on Prime Day as the centerpiece for my first build. Never even considered Nvidia due to such poor price to performance.
I think the move to lower VRAM is NGreedier wanting to kill off the amateur AI creators out there and move them to more expensive professional options, just like how you can download either a productivity driver set or game-ready drivers from NGreedier. If performance there is significantly limited by the frame buffer, that will be the proof.
You missed the part where you explain that the 4060 Ti is great for people coming from a 10-series card or WAY older systems. But I guess your target audience is people who waste money on graphics cards every time they come out.
If you have an old system, you likely have PCIe Gen 3, which makes it worse. Also, you could save $50 and get the exact same performance (or more, if you're on PCIe Gen 3) with a 3060 Ti or 6700 XT, which at this price range is a lot of money.
The price to performance is the same whether you upgrade from a 3060 or a 1060, lol. It's still objectively garbage and downright insulting to even offer such a terrible product. At this price range, you have way better options. And the 16 gigs of VRAM is just a mockery from Nvidia at this point: they created the VRAM problem, and now they are selling the solution, but with slower memory. What a joke.
Actually, you missed a lot of points. The 16GB model brings big advantages in some cases, but you evaluate its value on unrelated use cases such as FPS. Nobody buys the 16GB model for better FPS in well-optimized current games; games like Hogwarts already consume too much VRAM, and in two years I'm sure most new games will require more than 8GB, so the 16GB model will have the advantage there. Another thing is AI: tools like Stable Diffusion need more VRAM to handle their tasks, and people who work with such tools will buy the 16GB model. It is much cheaper than the 4070, not to mention the 4080 and 4090, whose prices are sky-high.
Good point about wider buses scaling better to 1440p; I think that's likely correct for a significant number of titles. Trouble is, this is not a GPU worth putting a 256-bit bus on, or even 192-bit. A wider bus costs in many ways: space on the GPU, traces on the board, it all adds up. Who knows what the BOM on this was, but the 4N node and the die space for the massive cache were a big part of it. Cache genuinely does mitigate slower memory, but because of that, faster memory then makes little difference to performance. You can see the same thing in the CPU space: put an X3D CPU on your board and suddenly RAM speed adds much less to performance than it did before.
It took only 7 months to make this vid obsolete. Anything below 16GB is not worth it for generative AI. It is not about speed; it is about life and death, whether a program can infer a result at all.
The 4070 isn't so bad in comparison after all. The cheapest 4070 has dropped to $579, and the cheapest 4060 Ti 16GB is $549... How can anyone justify spending this much on a low-tier GPU?
The 4070 is just meh in general but made to look good by the 4060 Ti. It's obvious Nvidia wants to clear the shelves of 3000-series cards. Either way, the lack of 4000-series sales and the consistent sales of 3000-series cards are both in their favor.
@@alrizo1115 The 4070 isn't meh compared to the 4060 Ti and other lower-tier cards; the 4070 and up are the only good cards from the 40 series... I own a 4070 and love it. Lower power consumption and faster than a 1080 Ti.
@@Matruchus Ampere and Ada are different architectures; you can't compare them like that, since Nvidia made them that way this generation, and they have the right to name their GPUs however they want. If you're just gaming and don't care much about frame generation or ray tracing performance, you can get 30-series or AMD GPUs at good deals. 😊
This is what people cried about, being sheep following Hardware Unboxed. The RTX 4070 was probably already set in stone when people made the insane number of videos crying about VRAM, so Nvidia put 16GB on a card to shut people up. I'd agree that for the 4060 and 4060 Ti it's pointless; 12 would be enough. Those cards aren't strong enough to utilize the VRAM, yet if Hardware Unboxed says they can, ALL OF A SUDDEN PEOPLE BELIEVE IT..... all sheep following YouTubers.
One year later and I still wish people would stop gauging GPUs on gaming only. The 4060 Ti is not the best, but it's a good GPU for 3D workloads and better at saving power at 160W, and that's coming from someone who renders for 2-3 weeks straight.
Guess we'll see how it goes for me. I'm coming from a 1660 Ti and wanted a more modern upgrade with 16GB while going as low as possible price-wise, so timing it with Cyber Monday meant I could nab a 4060 Ti 16GB for $450. Even if it's subpar compared to the others, the alternatives cost more than I wanted to spend or didn't have 16GB. In my niche situation of just wanting *any* valid upgrade that satisfies that, I might be in the target audience for this (whatever that is).
The price is down to $450 and it's kinda worth it. With 16GB of VRAM and a low TDP it's actually saving me money: I pay less on the electricity bill, and I can make some money by training AI models on my GPU, because 16GB of VRAM still trains models in a decent amount of time. And this GPU is decent enough to get 60fps in any game on a 1080p monitor.
@@Eleganttf2 It's baffling to me how everyone keeps harping on one of the least significant issues with the 60 series. 99% of the community yaps about bus width, but fewer processing units, VRAM and absurd pricing are the real issues. Nvidia fixed the VRAM, but took the already absurd pricing beyond comprehension. A 16GB 4060 Ti with this level of silicon should not be more than $350. If it had silicon equal to or greater than the 3060 Ti's, then $400 would've been acceptable for 16GB. $500 for less processing power than a 3060 Ti is just pure greed.
The 16GB version wasn't made just for gamers, lol. The extra memory is also for people who do 3D content or any content creation, rendering, video editing, etc. Duh. And you can still enjoy games on ultra at 1080p.
The other issue is unsold stock from the previous generation (3000 series and amd's 6000 series cards). Both companies overproduced for the crypto mining boom and when it crashed had a lot of unsold cards. They deliberately gimped their new generation in order to make the last generation cards still attractive so people would buy them.
Nope, the problem is that the assembly lines switched to the new architecture, and miners and gamers are now selling their 30x0 cards for half the price on eBay, really hurting Nvidia's low- and mid-tier 40xx cards. What comes after a big boom? A crash!
I guess Nvidia made it pretty clear: the 16GB "4050 Ti" is not for gaming (at least not primarily) but for AI work, and a large proportion of the gaming community just can't accept that they are no longer the VIPs to be served (or have their "feelings" considered). Nvidia is obviously aware of the massive number of decommissioned mining cards in the used market; if they had to compete on price with those used mining cards, they wouldn't see it as profitable, so they decided to prioritize the AI demand for which the 40 series' exclusive features are a necessity. A rational business decision, IMHO. Of course it's unpopular with the gaming community, but Nvidia knows the "reputation" of the company means nearly nothing to this market segment: people will still swallow the 50/60-series pill when the stockpile of 30-series mining cards eventually dries up. At the end of the day, even the "picky" folks will buy the new cards when Nvidia releases some decent products again, so why should Nvidia care about your feelings? This was already proven in the mining era: there was no boycott when the 4090 released, even though gamers had been mistreated for years. Gamers won't punish AMD/Nvidia for their past actions, so why should either company care?
Bought this card for $500 brand new as an upgrade from a 2060 Super 8GB, and I have no regrets. It runs 1440p at 60-70 fps in Wukong and SM2, and 150-200 fps at 1440p in Warzone. I was going to buy a 3060 Ti but they weren't available.
I upgraded from a 3060 Ti 8GB and damn was it ever a massive difference. I don't know what these content creators are smoking, but everything I play went up a solid 10-20 fps, everything plays much smoother, and larger AI models spit out tokens at 2-3x the speed.
Nvidia with their shady business practices, and people have the nerve to call out AMD because they "think" DLSS is being blocked from the biggest game launching in 2023.
@@arenzricodexd4409 Yeah. While I can tolerate some of the things they do, the tendency is to keep a GPU longer if its quality is good enough. If not, you can switch to integrated GPUs, which I did for 10 years, or become an expert in repairing GPUs; the options are limited. Either they accept a shrinking PC market because gamers' expectations are "too high", or they fix things and everyone is happy, without exception.
I think they both have their place. If all you are looking for is rasterization, go AMD, as that's their primary focus: you don't have to pay for tensor cores or RT cores, just basic rasterization. Personally I think consoles are great for that, and I have a PS5 myself.

If you want more than rasterization, and are willing to pay for tensor cores for DLSS/DLAA plus frame generation (which is spectacular), or need CUDA cores for apps, go with Nvidia. Nvidia has the extras to artificially double your frame rate, and it's actually quite good. I was playing a path-traced Cyberpunk the other day and decided to shut it off to see what the game looked like with default lighting in a busy area, and that's when I saw how big the difference actually was; it's one of the few games I keep DLSS + frame gen on for rather than native 4K. Not even a top-of-the-line AMD card for $1000 will get you there, and once UE5 sees wider adoption with its feature set, we'll find out how well AMD really does with a heavy focus on rasterization and none of the extras.

Not everyone wants to pay extra for tensor/RT/CUDA cores, and that's understandable, but know you'll pay for them if you want them; AMD not having them means you pay for rasterization only, and there's a market for basic playing of games. Nvidia with the extras will always cost more, and until they convince people their extras are worth it (Steam stats seem to show they have), you have the option to go with the minority player instead. There's a GPU for everyone here, and luckily there are two companies targeting different areas of the market. Maybe they should have eliminated tensor/RT cores altogether on the 4060 and put the die area into rasterization instead, but Nvidia is heavily focused on RT/DLSS at this point and it's now a bundle package. If it isn't for you, it isn't for you.
The RTX 4060 Ti chip is called AD106, which means it was intended as the 4060, not the 4050 Ti. The 4060 has AD107, which means it was intended as a 4050/4050 Ti.
Nvidia is not dumb enough to make their lies THAT transparent; of course they name the chips in a way that suggests everything is fine. But the 4060 Ti cannot be a 3060 Ti replacement because it lacks the typical generational performance gain of about 20-25% over the old card. Considering that, it becomes obvious that the 4060 Ti is, performance-wise, in 4050 Ti territory, no matter what Nvidia tries to tell you.
For artificial intelligence learners and hobbyists like me, 16GB of VRAM is quite meaningful on CUDA-enabled hardware. The A770 also has 16GB, but there is no CUDA support on Intel GPUs, and CUDA is the de facto standard in AI, especially for training.
Man, the prices here in Australia are even crazier. I was looking at the 4060 Ti 16GB and found one for $910 AUD, then I noticed I could get a 4070 for $950 AUD from the same store. I don't understand how that is justified or how the pricing works here, but it's ridiculous.
Another viewpoint. GPUs can take years to get from initial design to setting up a fab, ordering materials and starting full commercial production. Followed by global distribution etc. During the pandemic, production of raw materials and component parts dramatically reduced. The cost of everything went up quickly, and availability went down. Even big companies like NVIDIA rely on others. Contracts have to be fulfilled, yet lead times for parts and materials suddenly became several months rather than weeks. Businesses are trying to recoup the extra costs however they can, utilising what's available, whilst still playing catchup. Adding extra RAM to an existing board is a "quick fix"... and some people will buy it.
The ideal config for these cards, at around the $400 US price point, would have been a 12 GB version with a 192 bit wide memory bus. That's a card people (besides Nvidia fan boys) could not only defend credibly, but actually feel they're getting their money's worth. Nvidia could put 32 GB of VRAM on this silicon, and it would still suck due to that 128 bit wide memory bus. It also makes me wonder what Nvidia thinks is the right use case for the 4050 series; 720p? Or maybe 480p? Progress is dead, long live progress.
Actually now that the AI boom is here you may see some new manufacturers jump into the picture. Governments everywhere are pushing for semiconductor manufacturing so in a few years even that may help. Really thankful for the US China Cold War. Will really push the elites to actually progress rather than simply tread water.
The "clamshell" way just means you have two chips sharing ONE 32-bit bus, so capacity doubles but bandwidth doesn't. For small texture files that fit in the cache it's okay, but for big texture files that exceed the cache size you lose a lot of effective transfer speed compared to a bus that is actually twice as wide. So it's clear you will suffer as soon as you go hi-res with high texture settings. Will it still be better than 8GB on 128-bit, though? For sure, because the cache is faster than spilling those files into system RAM. With 32GB on 128-bit you'd have four people needing to use a single toilet at the same time...
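The trade-off described above can be sketched in a few lines (a hypothetical helper; the chip sizes and 18 Gbps data rate are the 4060 Ti's published figures):

```python
# In clamshell mode, two DRAM chips share one 32-bit channel, so
# capacity doubles while bandwidth (set by channel count alone) stays fixed.
def memory_config(channels, chips_per_channel, gb_per_chip, rate_gbps):
    capacity_gb = channels * chips_per_channel * gb_per_chip
    bandwidth_gb_s = channels * 32 / 8 * rate_gbps
    return capacity_gb, bandwidth_gb_s

print(memory_config(4, 1, 2, 18))  # 4060 Ti 8GB:  (8, 288.0)
print(memory_config(4, 2, 2, 18))  # 4060 Ti 16GB: (16, 288.0), same bus
```

Twice the capacity, identical bandwidth: exactly why the 16GB card helps when you run out of memory but not when you run out of bus.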
For me the 4060 Ti is the best GPU ever made) Best value for my money. I work with AI, so I need at least 12GB of VRAM these days; with the 4060 Ti I get 16, even better. Not sure why anyone needs to whine about this GPU. Not good for gaming? For me it would be okay for gaming too. In my Asus FX505DM laptop I have a GTX 1060 with 6GB of VRAM, and the last two games I played on it at high graphics settings were Hellblade 2 and Alan Wake 2; everything was smooth with a decent framerate.
In the years to come I could see the Nvidia 4060 Ti 16GB staying on top in value and technical edge vs AMD, if upcoming games rely extensively on DLSS 3 to guarantee high FPS at 1440p. Unless/until FSR catches up 😅
I wouldn't like the frametime impact compared to faster native FPS! DLSS is already great even without tricks. And the 3080/3090 get cheaper on the used market day by day, not to mention how far Intel has come with their Arc drivers and how interesting Battlemage will be.
You say only 3% performance, but in games that suffer from 8GB of VRAM the gap increases greatly. Although, I understand, it's still not worth it. Personally, though, I really like the card because it's extremely efficient.
The vid is BS. I'm an AMD fanboy, but I have to admit the 16GB version is a multi-purpose GPU. Don't look at it only from a gaming perspective: having 16GB on this card is just nice for AI, and the efficiency is good. It's more like an affordable AI card that can game as well, which makes sense.
I want to congratulate all the simps who completely fail to understand the purpose of this card. It is NOT GOOD FOR GAMING. This card is PERFECT for people who want to RUN AI AT HOME. Because in AI, VRAM is everything, and with two of these cards you get 32GB of VRAM for much less than it previously cost to buy one 4080, which only has 16GB anyway. You can buy three of these for the price of one 4090: that means 48GB instead of 24, and still more than enough cores to run LLMs. And forget AMD, they suck in productivity. Period.
They add VRAM to the one card that needed it least, overprice it, and then they'll say: "Why would we increase the RAM on the others? You didn't buy the 4060 16GB." It's just ridiculous.
It's not even that bad. I own one, and it runs all my games at 1440p just fine. It's hugely power-efficient, too, which is something you completely overlooked. I bought mine for £400 a year ago. No regrets!
You missed virtual reality. VRAM also matters for VR because you are using 2 individual displays on top of whatever you have going on with your desktop displays. So, if they priced it properly, it would have assisted some VR users
Maybe it's trash for gaming... but it's a great budget GPU for rendering (the 16GB version). I bought one from MSI as a reserve GPU for my rendering rig; nice performance and temps.
It's great for Stable Diffusion or any AI; I copped one since it has more VRAM than the 4070. Decently good for gaming too, since I'm coming from a 1070, and it's not like you have good alternatives from AMD.
Probably: Yes, a new GPU! Because they are completely oblivious to where the product actually stands in the market, they will simply be happy to get something new.
Great thoughts and analysis, Vex. Also, from the Daniel Owen channel's testing of the 8GB vs 16GB 4060 Ti, the Frame Generation they championed in marketing the 8GB 4060 Ti actually makes it consume more VRAM, at least in CP2077. That's ironic on Nvidia's side... 😂😂 Also, per the Yuzu devs, the 128-bit bus makes upscaling on the 4060 Ti perform worse than on the 3060 Ti with its 256-bit bus.
Even if you could afford the card the cost to upgrade your power supply would put you over the edge. I live in an off-grid solar powered house and when I switch on my computer and game with my 3060ti card, my computer outdraws my refrigerator or air conditioner. The 4060ti will lower that a great deal.
To be fair, NVIDIA is jacking up prices because they know they will get what they're asking, whether it's now or eventually, when the elevated prices of GPUs becomes a normality down the road. Real solution is to have a competent rival that's pushing prices down, which seems to be a hit or miss lately.
Lol. That super niche example you state at 14:25 might represent a bigger crowd than you think. I'm one of those creators who requires an Nvidia GPU with the most VRAM possible. Daz Studio users are that crowd.
Like Linus talked about on WAN Show, even if they asked DOUBLE the price of the DRAM/VRAM it would've been about 50 dollars, not 100. This is just greedy and shameless.
They are trying to keep the profit stream going from the crypto boom. But it's two different customer bases, and gamers aren't backed by billion-dollar investment funds working on a quick buck. But shareholders don't care and demand the same profits quarter to quarter.
The 16GB 4060 Ti is actually a good card. 16GB means future-proof, even for 1440p, and it doesn't have the limitation the 8GB 4060 Ti has. But it's just priced wrong. If its price comes down to $400, I think it will be a good choice.
What difference would bus width make? Bandwidth of the 4060 Ti is 288 GB/s (not including the cache). That's a massive amount of data per second; even a 4K scene does not require that much data per second to render. Bus width is NOT A LIMITING FACTOR when it comes to games. It can cause a bottleneck, but only if the game is heavily data-streamed; maybe a flight-sim kind of game at 4K with everything on ultra, there bus width may be a limiting factor. Most games are rendered and processed on the go, hence they rely heavily on GPU processing at higher resolutions. Bus width does not bottleneck games. VRAM can, because to render a scene, sometimes over 12 GB is required in memory at 4K, but bus width does not affect squat in 99.99% of games. This bus-width adage that more bus width = more performance is from the stone ages. Also, the 4060 Ti is NOT A 4K CARD. 128-bit is way more than enough for 1440p. Flight Sim also does not stream that much data per second at that resolution. There is NO game on the mainstream market which will ever get bottlenecked by 128-bit at 1440p or below, so bus width isn't a problem with the 4060 Ti. The MASSIVE reduction in processing units is the main problem with the 4060 Ti. Games that utilize many GPU cores well will show negligible uplift over the 3060 Ti, or may even perform worse. Bus width has almost zero effect at 1440p or lower resolution, so stop whining about higher bus widths.
@@TedBoltcutter It's baffling to me how everyone keeps harping on one of the least significant issues with the 60 series. Fewer processing units, VRAM, and absurd pricing are the issues. Nvidia fixed the VRAM, but pushed the already absurd pricing beyond comprehension.
It is funny that people think the 4060 Ti is a 4050 Ti; Nvidia wouldn't sell you a 4050 Ti that was this close to a 3070. TechPowerUp says it is 5% slower than the 3070. It is too bad and too expensive to be a 4060 Ti, but too good to be a 4050 Ti. It is also funny to care about the names of GPUs or GPU dies; they don't really matter tbh, the price is the problem. If the 4070 Ti was called a 4080 12GB but cost $600, everyone would cheer for it. Price to performance is the problem, not name to performance. Even if they used the AD102 die that is usually reserved for the best GPUs but it still performed like this, it would still be trash. If the 4090 had an AD107 low-tier die and kept its performance & power usage, nobody would care. The problem is the performance. We don't care how they do it, we just want the performance we paid for. Don't teach Nvidia that they can change the die naming scheme & unsuccessfully trick us.
NVIDIA owns the market There is only so much market share they can have before it becomes a problem Look at the competitions offerings What happens if Nvidia released a 3070/Ti as a 4060/Ti at $299 (as you said , naming schemes aren’t important now are they, just performance RIGHT ?) What HAPPENS if Nvidia released a $299 3070Ti with 12-16GB Vram 256bit bus, what happens?? Ask yourself ? There is a reason why Nvidias product stack under the 4090ti is Gimped to some degree If y’all were really what you preached about, we’d be seeing the market share swing the other way massively. But it isn’t. Y’all want More from AMD and Intel without supporting them but also want more for less from the company who owns majority of the market ? Does ANYONE SEE THE PROBLEM HERE ?
@@flimermithrandir If you think Nvidia would make you a 4050-price-class GPU that is 5% away from the 3070 in performance when there are so many 3070s still on the market, I can't help you. 😂 A 50% generational perf uplift from an XX50-class card, ha ha ha. It is like you people have never met Nvidia before; remember how the GTX 1050 was only 6% faster than the GTX 950 🤣. I knew then that Nvidia sees the XX50 class as a dumping ground. They created shit 4060 and 4060 Ti class cards; the name means nothing, I want the price down. They can call them 4050s or 4030s for all I care; if the price doesn't go down, it means nothing. The price is what I want down.
@@Ober1kenobi I'm not sure I understand your point. If Nvidia had launched the 3070 Ti as a 4060 Ti for $299, it would be a way better product than what we got; the 3060 MSRP was $299 and the 3060 Ti was $399. I don't even care if Nvidia called the 4060 Ti a 4090; if they charge me $299, it's better than the trash we got. It would be a 33% increase in price to performance. Don't let Nvidia trick you with names; they'll change the die naming SKUs in 2024 to trick us. I have used both AMD and Nvidia GPUs for my gaming, but I need Nvidia for my job; I use CUDA for machine learning. Edit: I'm waiting for Battlemage from Intel for my main gaming rig. I'm not biased against Intel and AMD; I use a 13700K CPU and 7900 XTX on my main gaming rig. My Nvidia rig is an employment necessity, even though it's convenient for me.
I mean I went to micro center the other day and picked up a refurbished RTX 4060 Ti 16 GB Gigabyte Windforce OC for only $382.00, so I see this as an absolute win! 🏆
In 2011 Nvidia CEO "Jensen" said to ignore customers if you want to stay in business. I wonder why they listened when people complained about the number of CUDA cores, the narrower bus, and the smaller chip. It is efficient and offers a better price/performance ratio in most use cases. For some content creation it is even faster because of higher clock speeds, plus a smaller percentage is disabled compared to the bigger AD103 chip.
@@PURENT Sure thing, but it is faster at 1080p than a 3090 Ti on average, +13% faster in Forza Horizon. So I liked it a lot from the start, but not everyone did.
@@mytestbrandkonto3040 That's just cope. They tried selling an inferior card using the same branding as the superior one, under the guise that all you were losing was 4gb VRAM.
@@PURENT The fact that it has a different chip was not hard to find by checking the data sheet, so I don't think Nvidia tried to hide it. I would have found it better if they had just stuck with their 4080 12GB, price discussion aside. The RTX 2070 was a TU106 die and the GTX 680 a GK104. So I had no problem with an 80-class card using a "4" chip, mid-size. They are more efficient under light to medium loads while delivering more than enough performance for my needs. People should not claim they were not intelligent enough to find the spec differences between the RTX 4080 with 12 or 16GB. Sounds like trash talk to me, from people that were not interested in buying this product anyway. I think Nvidia made a mistake there by listening. I saw the huge performance difference between the two versions and was immediately intrigued, because of how freaking fast the small version is compared to last-generation graphics cards. But what else to expect if I could not buy the Ampere GPU I wanted. It is not cope, it's affection.
also, historically, 50-class cards were around $150 - $175 tops. When you consider this, it _really_ puts into perspective just how hard nvidia is trying to screw us.
The 4060ti should've always just had 12GB. That's what they should've done. No 8GB model, no 16GB model, just a happy combo of the two at 12GB. EDIT: Maybe even just a 10GB model would've been better, since the 4070 is already at 12GB.
I love how hysterical and clickbaity people are these days on YouTube. Just straight-up nonsense and sensationalism. The 4060 Ti 16GB is great, and cheap as chips for anything productivity or AI (with gaming on the side); yeah, VRAM matters a lot for that. And no, not everyone working with AI, Iray, or other productivity tasks wants a 4090, or needs to spend that kind of money on one, because their workflow might not require it. And it's a price tag of, what, four times as much or something like that? Inb4 AMD: yeah, good luck with AI on AMD or anything productivity, because most devs cater to CUDA even today.
I thought I would regret buying a 3060 ti in July of last year because of the 40 series coming out but now that they’re out and pretty bad I guess my choice wasn’t bad lol
Just bought one. Haters are gonna hate but the 4060 Ti 16gb is an amazing value right now for playing at 1440p max with DLSS AND running stable diffusion!
The poor handling of the VRAM and the inability to really take advantage of it reminds me a lot of the VRAM "scandal" with the GTX 900-series cards being unable to actually use the full 4GB on the card. I owned one of those cards and sold it as soon as I could. It wasn't just the fraud. Wait. That's harsh. It wasn't just the ooopsie, it was also that the 970 didn't perform that much better than the 770. Went AMD for a GPU last year for the first time since they were ATI, and I love my AMD card. It has delivered above and beyond what it cost and makes zero excuses. It just delivers. NVIDIA has forgotten that gaming their customers is not the game they are supposed to be playing.
It should have been a 256-bit bus like the 3060 Ti, and more cores; it has far fewer than the 3060 Ti. Or they could have made it a 192-bit bus with 12GB of memory; that would have been a good compromise.
True that. But considering how much easier it was for Nvidia to switch their 4060 assembly to a clamshell VRAM configuration and sell the card for $100 more, it was probably genius in Nvidia's eyes! A 192-bit bus would have meant a total reconfiguration that would have made the card as expensive as the 4070!
I hate these clickbait titles you people do. The card is great, it just doesn't have the best price. That doesn't mean it's a bad GPU in any way, and it isn't. Just imagine how many people see a fat discount on an amazing Nvidia card and go "nuh uh, YouTube guy said it's bad", so they proceed to get a worse card... Why even bother making a review if you're just gonna sit there and review a price tag for 16 minutes straight, especially considering the price tag changes drastically depending on where you live? For me a 4060 Ti 16GB is $450 and a 6800 XT is $700; no sane person would go for the 6800 XT in this scenario, unless people like you trick them into it. If you want to buy a new card, the 4060 Ti is an amazing choice, period.
@@aregulargenericname8794 Yeah! I agree! It should be at least 192-bit! But actually the best for it is 256-bit! Even with the 128-bit memory bus it works not so bad, but I would never buy a 128-bit card if I had a choice, of course!
I hated the price, but that's the GPU I wanted LOL. A 3070-like card with 16GB VRAM, lower power consumption, and small enough to fit nicely in an ITX case. And no, AMD is not an option when you need the GPU for VR.
Preach. Sometimes there are situations where the card makes a lot of sense. Yes, it's pricey, but that's how it is. I'm still not paying insane amounts, so I'm good.
My sacrifice wasn't for nothing! Your video Vex made it all worth it, thank you sir.
My man
3.5, the issue was with the last 0.5
@@transistorjump919 They still successfully got sued, so it doesn't count. You could technically argue the AMD FX had 8 cores, but the state of California says otherwise.
I don't know if you need a lot of VRAM when using frame generation. Is frame generation VRAM-dependent, or does it lessen the importance of VRAM amount?
@@manefin It increases VRAM usage.
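For a sense of scale on why frame generation increases VRAM usage: at minimum it has to keep extra full-resolution frame buffers in flight. Here's a rough lower-bound sketch in Python; the buffer count is an illustrative assumption, and real-world overhead is higher because optical-flow fields and internal state aren't counted.

```python
# Lower-bound sketch of extra VRAM needed by frame generation just for
# full-resolution intermediate buffers. Real overhead is larger: optical-flow
# fields, motion vectors, and model state are not counted here.

def buffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of one full-resolution RGBA8 buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

def framegen_floor_mib(width: int, height: int, n_buffers: int = 3) -> float:
    """Floor with n_buffers extra frame/history buffers in flight (assumed)."""
    return n_buffers * buffer_mib(width, height)

for w, h, name in [(1920, 1080, "1080p"), (2560, 1440, "1440p"), (3840, 2160, "4K")]:
    print(f"{name}: at least ~{framegen_floor_mib(w, h):.0f} MiB extra")
```

Even this floor grows with resolution, which matches the observation that frame generation eats into an already tight 8GB budget.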
I like how NVIDIA continues to be persistent in disappointing their customers.
And they still do not give a fuck about gamers
I think they don't really care about gamers; they are more concerned with their AI business.
Yeah, Nvidia has been disappointing me, since the 3060 12GB is only 192-bit and the 4060 Ti is only 128-bit, while even the stinky 1650 was 128-bit.
@@collt3035 Yeah, also the drivers. My friend has an Nvidia video card; he gets OK framerates, and then it snaps and stays there... like what the hell.
?
That's what consistency means 😂.
at 500 dollars you can buy an RX 6800XT which is nearly 30 percent faster
The TITAN RTX ($2,499 USD) is 2.5 times as expensive as the RTX 2080 Ti ($999 USD).
@@yuan.pingchen3056 That doesn't make it a good argument to buy a 4060 for $500.
The 6800 XT is the best card at this time. Got mine a week ago!
@@Adnyeus The title of the video uses the emotional word "worst", but it is clear that the 4060 Ti is not the worst; the TITAN RTX is the worst.
@@yuan.pingchen3056titan rtx is not for gaming you gonorrhea stained pants
If AMD grows a brain and launches a 192-bit, 12GB card for the same $330 MSRP as the 12GB RTX 3060, then Nvidia will be in some deep shit.
AMD Probably gonna miss their shot at this
YEH, Who’s gonna buy it ? who ? Y’all talk but y’all don’t walk
“Vote with your wallets”
- Checks the Hardware Surveys
Well, that was a Lie
@@SupraBagels This is the result of AMD's lack of competition, for f^ck's sake.
Nvidia owns the market
Why would they be incentivised to border on monopoly grounds which the community would then complain about ? Like why do they NEED to do anything
They aren’t a public service
If you don’t like it go buy else where
What’s that, hardware surveys says the community is full of 💩
AMD already missed it with the RX 7600, which was released massively overpriced.
You will buy a 7600 and you will be happy....
Nvidia's next marketing move is that the 4060 Ti is meant to be used in SLI mode.
🤣 please not.
hahahahaha dead SLI 🤣🤣🤣🤣🤣🤣😂😂😂😂
Triple sli mode so you get 48gb on 128 bits 😅
@@NGreedia lol Imagin someone spending two thousand dollars on four useless cards. 😋
NVIDIA will release RTX 4060 ti 32gb on december 2% boost fps HAHA
Nvidia made a 4050 Ti with more VRAM than 4070 Tis that cost more than double the price, when it doesn't have enough power to even push settings that use a full 16 GB, and said: "🗣️🔥This is some heat! Drop it in July!"
@@transistorjump919 Nvidia themselves market this card as a 1080p GPU. Sure, games push over 8 GB, but they're not hitting 16 at 1080p. Plus, the 128-bit bus bottlenecks it so hard that frame rates at native 1440p on high/ultra settings barely maintain 60 fps.
@@transistorjump919 which Nvidia themselves cut down from the 3060 Ti. Stop coping fan boy.
@@transistorjump919 he listens to the mainstream Tech tubers and treats their word as Gospel
Most of the community is full of entitled Sheep
@@sooubic lmfao, if you call transistorjump an Nvidia fanboy then you're clearly clueless about the usual people in this comment section; your blind hate is showing too much. While I agree the 4060 Ti 8GB and 16GB are bad, at least I'm not as much of a fanboy about AMD as you are.
@@sooubic “This guy disagrees with my opinion! Time to break out the personal insults and disrespectful behavior!”
Grow the hell up.
Nvidia should've unlaunched this too
They should’ve unlaunched the 8 GB and dropped the 16 GB price by at least $150. Or more realistically, just unlaunch the whole generation.
Nope. 4080 12GB was fine. Also the 16GB 4060 Ti seems to run very cool which is nice. So either a 4060 Ti FE it is or some decent partner model 16 GB version. Just ignore the upsell stuff if you think this is not the last GPU generation and you are not yet sure whether or not this will be your last graphics card. Still waiting for more power consumption charts (idle, video playback, multi-monitor).
It’s fine at the end of the month Nvidia will say
“Jk”
No, it's just an extremely expensive way to solve the 8GB VRAM problem (= new games coming from consoles).
Even if the last 4GB is never really used, the old RTX 3060 was just more cost-effective.
@@mytestbrandkonto3040 Nvidia fanboy 💀
GREAT RTX 4050Ti 16GB
I have an NVIDIA RTX 4060 TI with 16GB of VRAM, 32GB of RAM, and an AMD Ryzen 7 5700X processor. I have to tell you, I have almost all modern games and I don't experience any latency issues. And my motto is: Your money is for you to buy, mine is for me to buy, your GPU is for you to play, whereas mine is for me to play. My budget is not yours, which is why I have a 4060 TI with 16GB. Additionally, I seek power efficiency because we have two gaming PCs at home and I'm the one who pays the electricity bill. I chose this GPU because I like it, and I also like the MSI brand. I bought it new, and as the Dominican gamer saying goes: He who dies for his GPU, death tastes like glory.
Will note the last quote in my life lesson book.
I thought the mining community lived in their own reality, selling heavily used cards at ridiculous prices, but it's actually Nvidia, getting high on their bottled-up mining-boom farts, who lost touch with time and reality.
I got an ex mining RX 6800 for 400 euros last december and was lucky the guy was active on forums here so I could see exactly how long he mined with it and how seriously.
I got an awesome deal so miners are better than Nvidia in my case. :)
Actually, the same can be said about consumers.
@@how2pick4name AMD and Nvidia were not amused that their GPUs got scalped. There seems to be an empty gap between what miners and gamers are willing to spend, despite the improvements made this generation.
Got an ex-mining 6800 XT Sapphire Nitro+ OC for £330 and a 3070 for £230.
The 3070 was in almost perfect condition.
Mining always gives the best deals after causing prices to skyrocket
Got an RX 5700 XT for $124 from a miner; you can't get a cheaper GPU that can run anything at 1080p atm.
I remember hearing someone say, "There's no such thing as a bad video card, only bad prices."
I personally think the 4060 Ti 16GB was created not for gamers... but for those trying to get into AI. A lot of AI-related things require more than 8GB of VRAM and perform best on Nvidia cards. I kinda wish Nvidia had marketed this card for AI acceleration instead of for gamers. It might have gone over a little better... maybe... maybe not.
@@transistorjump919 What if you're on a budget?
@@transistorjump919 An entry-level card for AI matters. Education, AI drawing, etc., a lot of applications. Having enough VRAM to run a large model is more important than processing speed, if you have to pick one.
bruh u can get arc a770 with 16gigs and bigger bus for around half the price
@@kriss4882 Yes, but how well would it perform in Stable Diffusion right now in comparison to an Nvidia card.
Trust me 16gb not enough for AI at all
Update 1-10-24 - Today, the 4060ti 16GB priced at $450 is the best bang for the buck for Generative AI purposes that strictly require an Nvidia GPU, second closest is an additional $500. I didn't buy them for gaming so not concerned about that usage.
THIS. Yes, this card is not great for gaming, but it has enough VRAM for SDXL and local LLMs. 128-bit is slower, but for AI, a model running 30% slower in VRAM is MUCH better than not running the model at all or having to run it from system RAM.
This
They can game as well. I use mine for both.
@@zerorusher The thing is it is GREAT for gaming at 1080p, which is the resolution most people over 30 play games at.
2/3 of my library looks awful at resolutions higher than that anyway. Hell with my eyesight I can barely tell the difference anyway.
But yeah it's targeted and LLMs and AI gen and it does the job for a decent price.
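A back-of-the-envelope check of the "16GB for local AI" claims in this thread: model weights alone need roughly parameters × bytes-per-parameter of VRAM. The sketch below uses an illustrative 7B-parameter model; the figures are estimates of weight size only, not total runtime footprint (KV cache and activations add more).

```python
# Rough VRAM footprint of LLM weights: params * bytes_per_param.
# Illustrative estimate only; activations and KV cache are not counted.

def weights_gib(params_billions: float, bytes_per_param: float) -> float:
    """GiB needed just to hold the model weights."""
    return params_billions * 1e9 * bytes_per_param / 2**30

fp16_7b = weights_gib(7, 2)    # ~13 GiB: fits a 16GB card, not an 8GB one
q4_7b = weights_gib(7, 0.5)    # ~3.3 GiB: 4-bit quantized fits almost anywhere

print(f"7B fp16: {fp16_7b:.1f} GiB, 7B 4-bit: {q4_7b:.1f} GiB")
```

This is why the jump from 8GB to 16GB is binary for this workload: a half-precision 7B model simply does not fit in 8GB, regardless of bus width.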
I already owned a 3060 Ti and moved to the 4060 Ti 16GB, and I don't regret it at all.
I bought an Nvidia 4060 Ti with 16GB, and it's a very good card; I am fully satisfied with its performance! It is miles ahead of my previous GeForce 970, with the extra benefit of lower power consumption. My friend still owns his 3060 Ti, and my 4060 Ti outperforms it in every game we tested. The hate towards this card comes only from its price; otherwise the performance is fully justified.
Every card has good performance; just not every card has good PRICE TO PERFORMANCE. You could have bought a 7700 XT or a 7800 XT for more or less the same price!
@@mohandx4z I originally bought AMD cards but recently switched to Nvidia, and let's be honest: the drivers, ray tracing, and upscaling are all far better, as is the power consumption, and the systems are quieter.
Nvidia and AMD's only excuse this generation is GREED. Let their GPUs rot on the shelves.
Regardless of radeon 7000's disappointment, they are a better buy than Ngreedia, not to mention radeon 6000 series.
AMD could have been the good guy, but the 7600 was a disappointment.
@@SupraBagels Yep; had they given it 10 gigs of VRAM, it would have destroyed the market at $249.
@@pranavmohite8544 And yet nobody buys AMD.
@@GewelReal sheep mentality is strong on that one.
I remember buying a 980ti day one for £549. A flagship GPU..... It was such a good deal and lasted me years.
28nm wafer was like 1/5 the price of 5nm
@@GewelReal Both were the smallest transistor sizes possible at the time, so not really.
To be fair, £549 in 2013 equates to £730 in 2023 money, adjusted for inflation.
@@GewelReal I never argued that point; I was arguing how ridiculously poor value this GPU is. I have a 4080, but that's because I had the money and my 2070 died. The 4080 is still poor value despite being an incredible card. Its power usage is INSANE, like 4K games at 230 watts.
@@WSS_the_OG Exactly my point. Hell, even the 3000 series was actually very good value if you look at the retail prices. The 3080 was £699 in the UK, which I thought was a bit much at the time, but looking back now it was an insanely good deal if you got one.
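The inflation adjustment a few comments up is simple to reproduce. Note the ~33% cumulative UK inflation factor below is an assumption taken from the comment's own £549 → £730 figure, not an official CPI lookup.

```python
# Sanity check of the "£549 then = £730 now" claim.
# cumulative_factor is assumed from the comment itself (~33% over the decade).

def adjust_for_inflation(price: float, cumulative_factor: float) -> float:
    """Scale a historical price by cumulative inflation over the period."""
    return price * cumulative_factor

print(round(adjust_for_inflation(549, 1.33)))  # 730
```

So a £549 flagship then is roughly a £730 flagship in today's money, which still undercuts current flagship pricing by a wide margin.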
What Nvidia should do next
RTX 5050: 10GB
RTX 5060: 12GB
RTX 5070: 16GB
RTX 5080: 20GB
RTX 5090: 24GB
The RTX 5090 needs 32GB of VRAM.
They won't. The 4050 has 6GB on a 96-bit bus. They never listen to users.
They gonna milk this and release 4080 ti 20GB a year later.
Hopefully Nvidia won't mess up the naming scheme of their own products again. I guess the intern dropped the name tags and mixed them up or something.
@@__-fi6xg nah, they knew what they were doing. It just backfired.
Well hey, if you're a data science major wishing to build a "budget" PC with decent hardware for deep learning and AI workloads, that 16GB RTX 4060 Ti is a steal! Before, our only option was the 12GB RTX 3060, which was an okay card, but that 16GB will surely open up possibilities for bigger and more complex models, with enough CUDA cores.
Coming from a 1080p gamer here: we still exist. I bought an RX 7600 OC and it can do 1080p no problem, and in most cases it can handle 1440p pretty fine too. Not everyone is switching to 1440p YET.
We don't only exist, we are the majority by far. As someone that plays both older and new games, and dabbles with AI this card is a godsend.
Idk why, but I found this channel and it's now my favorite PC channel. Good videos!
🫡
Yeah, Vex is fun.
I just got a 4060ti 16GB for $330 for Black Friday. Very impressed with the performance
Yep, I paid about the same for mine. The thing rocks. Very impressive.
The 2000 series will no longer be the worst generation; I can finally enjoy my 2070 Super.
You clearly are young and do not remember the burning Fermi cards, the self-desoldering and overpriced 8th generation, and the infamous FX 5th gen.
The difference in VRAM-limited situations between the 8GB and 16GB 4060 Ti is enormous (no stutters and up to 4x the fps in, for example, Callisto Protocol with RT), so the extra $100 is well worth it. But at a $500 price point, something like the 6800 XT makes more sense if you don't care about power draw.
There is approximately 13GB of RAM available to developers on the PS5, BUT they can't allocate all of that to the GPU alone. Realistically, around 8-10GB of it goes to the GPU (the TLOU Part I remake was using 10GB of VRAM for sure). In my opinion, GPUs with 12GB of memory will be sufficient for a long time if someone games at 1080p. The most VRAM-demanding games will require some tweaking at higher resolutions, for sure. For example, Ratchet & Clank: Rift Apart can easily fill 12GB of VRAM at 4K; I saw 3080 Ti gameplay at 4K DLSS Quality where fps went from 25 to 100 once the game was no longer using over 12GB of VRAM.
This comment helped me more than the video
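The console memory budget described above can be sketched as simple arithmetic. The figures below are the comment's own estimates (13GB usable, 3-5GB of CPU-side game data), not official Sony numbers.

```python
# Illustrative PS5 memory-budget sketch using the comment's estimates.
# The console has 16GB of unified GDDR6; the OS reserves a chunk of it.

AVAILABLE_TO_GAMES_GB = 13.0  # per the comment; the OS takes the rest

def gpu_budget(cpu_side_gb: float) -> float:
    """Unified memory left for GPU assets after CPU-side game data."""
    return AVAILABLE_TO_GAMES_GB - cpu_side_gb

# A game keeping 3-5 GB of CPU-side data leaves roughly 8-10 GB for the GPU:
print(gpu_budget(5.0), gpu_budget(3.0))  # 8.0 10.0
```

That 8-10GB figure is the practical target console ports are tuned around, which is why 8GB desktop cards land right on the edge.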
Why not just make the 4060 192-bit/12GB and the 4070 256-bit/16GB? How much more expensive is the wider bus? The GPUs have increased in price anyway...
You can buy a 192-bit bus Nvidia GPU; the name is 4070. You can also buy a 256-bit Nvidia GPU; the name is 4080…
You are barking up the wrong tree. You can say that GPUs are expensive; that I agree with wholeheartedly! But asking why Nvidia does not make 256-bit GPUs, when they do make 256-bit GPUs, is not valid. Asking why the prices are so bad is the valid question, IMHO.
@@haukikannel But the 3070 was 256-bit. So why not make the 4070 256-bit as well, so it can fit 16GB?
What difference would bus width make?
Bandwidth of the 4060 Ti is 288 GB/s (not including the cache). That's a massive amount of data per second; even a 4K scene does not require that much data per second to render. Bus width is NOT A LIMITING FACTOR when it comes to games. It can cause a bottleneck, but only if the game is heavily data-streamed; maybe a flight-sim kind of game at 4K with everything on ultra, there bus width may be a limiting factor.
Most games are rendered and processed on the go, hence they rely heavily on GPU processing at higher resolutions. Bus width does not bottleneck games. VRAM can, because to render a scene, sometimes over 12 GB is required in memory at 4K, but bus width does not affect squat in 99.99% of games. This bus-width adage that more bus width = more performance is from the stone ages.
Also, the 4060 Ti is NOT A 4K CARD. 128-bit is way more than enough for 1440p. Flight Sim also does not stream that much data per second at that resolution. There is NO game on the mainstream market which will ever get bottlenecked by 128-bit at 1440p or below, so bus width isn't a problem with the 4060 Ti.
The MASSIVE reduction in processing units is the main problem with the 4060 Ti. Games that utilize many GPU cores well will show negligible uplift over the 3060 Ti, or may even perform worse. Bus width has almost zero effect at 1440p or lower resolution, so stop whining about higher bus widths.
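The bandwidth figures thrown around in this thread come from one formula: bus width (bits) × per-pin data rate (Gbps) ÷ 8. A quick sketch, using the cards' stock GDDR6 data rates:

```python
# Theoretical memory bandwidth: bus_width_bits * data_rate_gbps / 8 = GB/s.
# Data rates are the stock GDDR6 speeds of each card.

def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s (cache effects not included)."""
    return bus_width_bits * data_rate_gbps / 8

rtx_4060_ti = mem_bandwidth_gb_s(128, 18)  # 288.0 GB/s
rtx_3060_ti = mem_bandwidth_gb_s(256, 14)  # 448.0 GB/s

print(rtx_4060_ti, rtx_3060_ti)
```

So despite faster memory chips, the 4060 Ti's halved bus leaves it with about 36% less raw bandwidth than the 3060 Ti; Nvidia's counterargument is that the much larger L2 cache absorbs part of that gap.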
@@jal.ajeera So you can fit more VRAM. I am mostly talking about the 4070/Ti.
@@ioannispoulakas8730 A 192-bit 4060 with 12 GB doesn't make sense; 128-bit with 16 GB is perfect. They should've increased, or at least equalled, the processing units. It's literally barebones CUDA, texture mapping, and render output units. The only reason there are any performance gains at all is the gen-on-gen architectural improvement per processing unit.
Got an RX 6800 XT for just over $450 on Prime Day as the centerpiece of my first build. Never even considered Nvidia due to such poor price to performance.
I think the move to lower VRAM is due to NGreedier wanting to kill off the amateur AI creators out there and move them to more expensive professional options, just like how you can download either a productivity driver set or game-ready drivers for NGreedier cards. If performance there is significantly limited by its frame buffer, that will be the proof.
I’d much rather use a 40 series card at 100-200w than a 30 series card at 240,000,000w
you missed the part where you explain that "if you are coming from a 10 series card, the 4060ti is great for people who are coming from WAY older systems." but i guess your target audience is people who waste money on graphics cards every time they come out
If you have an old system, you likely have PCIe Gen 3, which makes it worse. Also, you could save $50 and get the exact same performance (or more, if you're on PCIe Gen 3) with a 3060 Ti or 6700 XT, which at this price range is a lot of money.
The price to performance is the same whether you upgrade from a 3060 or a 1060 lol. It's still objectively garbage and downright insulting to even offer such a terrible product. At this price range, you have way better options. And the 16 gigs of VRAM is just a mockery from Nvidia at this point: they created the VRAM problem, and now they are selling the solution, but with slower memory. What a joke.
Preach. I came from a 960 and the 4060ti is great in my energy efficient 5.5 liter build. I can run whatever I want at 1080p
@@tabushka292 What else can you buy with 16GB of VRAM for the same price, with comparable performance, for AI work?
Actually, you missed a lot of points. The 16GB model brings big advantages in some cases, but you evaluate its value on unrelated use cases such as FPS. Nobody buys the 16GB model for better FPS in well-optimized current games. Games like Hogwarts Legacy consume a lot of VRAM, and in two years I am sure most new games will require more than 8GB, so the 16GB model will have an advantage there. Another thing is AI: tools like Stable Diffusion need more VRAM to handle tasks, and people who work with such tools will buy the 16GB model. It is much cheaper than the 4070, not to mention the 4080 and 4090, whose prices are sky-high.
Good point about wider buses scaling better to 1440p; I think that's likely correct for a significant number of titles. Trouble is, this is not a GPU worth putting a 256-bit bus on, or even 192-bit. Adding bus width costs in many ways: space on the GPU, lines on the board, it all adds up. Who knows what the BOM on this was, but the 4N node and the massive cache's space on the die were a big part. Cache genuinely does mitigate slower memory, but because of that, faster memory then makes little difference to performance. You can see the same thing in the CPU space: put an X3D CPU on your board and suddenly RAM speed adds much less to performance than it did before.
It took only 7 months to make this vid obsolete. Anything below 16GB is not worth it for generative AI. It is not about speed; it is about life and death, whether a program can infer a result or not.
Nobody is seriously doing AI work with a 4060 ti. This isn't even a good attempt at trolling.
The 4070 in comparison isn't so bad after all. The cheapest 4070s have dropped to $579, and the cheapest 4060 Ti 16GB is $549...
How can anyone justify spending this much for a low-tier GPU?
Nobody can, because the 4060 Ti is essentially a 4050 Ti. At most this should cost $250.
The 4070 is just meh in general but made to look good by the 4060 Ti. It's obvious Nvidia wants to clear the shelves of 3000-series cards. Either way, the lack of 4000-series sales and the consistent sales of 3000-series cards are both in their favor.
@@alrizo1115 The 4070 isn't meh compared to the 4060 Ti and other lower-tier cards. The 4070 and up are the only good cards from the 40 series... I own a 4070 and love it. Lower power consumption and faster than a 1080 Ti.
@@Matruchus Ampere and Ada are different architectures; you can't compare them like that, since Nvidia made them that way this generation, and they have the right to name their GPUs however they want. If you just game and don't care much about frame generation/ray tracing performance, you can get good deals on 30-series or AMD GPUs. 😊
They can justify spending this much because they have humor.
No. Laptop 4070 is the worst GPU ever made.
You'd think that 16GB would be more suitable for the 3070 or even the 4070, but nope! Let's put it on a card that won't utilize it that much!
Maybe next year you get what you want. 😉
This is what people cried about, being sheep following Hardware Unboxed. The RTX 4070 was probably already set in stone when people made the insane number of videos crying about VRAM, so Nvidia put 16GB on a card to shut people up. I'd agree that for the 4060 and 4060 Ti it's pointless; 12 would be enough. Those cards aren't strong enough to utilize the VRAM, yet if Hardware Unboxed says they can, ALL OF A SUDDEN PEOPLE BELIEVE IT CAN... all sheep following youtubers.
1 year later and I still wish people would stop gauging GPUs based on gaming only. The 4060 Ti is not the best, but it's a good GPU for 3D workloads and better at saving power at 160W, and that's coming from someone who renders for 2-3 weeks straight.
Guess we'll see how it goes for me - I'm coming from a 1660 Ti and wanted a more modern upgrade that has 16GB, going as low as possible price-wise, so timing it with Cyber Monday meant I could nab a 4060 Ti 16GB for $450. Even if it's subpar compared to others, the alternatives cost more than I wanted to spend or didn't have 16GB. In my niche situation of just wanting *any* valid upgrade that satisfies that, I might be in the target audience for this (whatever that is)
The price is down to $450 and it's kinda worth it. With 16GB of VRAM and a low TDP it's actually saving me money, because I pay less on the electricity bill, and I can make some money by training AI models on my GPU, since 16GB is still enough to train models in a decent amount of time. And this GPU is decent enough to get 60fps in any game on a 1080p monitor.
What difference would bus width make?
Bandwidth of the 4060 Ti is 288 GB/s (not including the cache). That's a massive amount of data per second; even a 4K scene does not require that much data per second to render. Bus width is NOT A LIMITING FACTOR when it comes to games. It can cause a bottleneck, but only if the game is heavily data-streamed, maybe a flight-sim kind of game at 4K with everything on ultra. There, bus width may be a limiting factor.
Most games are rendered and processed on the go, so they rely heavily on GPU processing at higher resolutions. Bus width does not bottleneck games. VRAM capacity can, because rendering a scene at 4K sometimes requires over 12 GB in memory, but bus width doesn't affect squat in 99.99% of games. The "more bus width = more performance" adage is from the stone ages.
Also, the 4060 Ti is NOT A 4K CARD. 128-bit is way more than enough for 1440p. Flight Sim also doesn't stream anywhere near that much data at that resolution. There is NO game on the mainstream market that will ever get bottlenecked by 128-bit at 1440p or below, so bus width isn't a problem with the 4060 Ti.
The MASSIVE reduction in processing units is the main problem with the 4060 Ti. Games that utilize many GPU cores well will show negligible uplift over the 3060 Ti, or may even perform worse. Bus width has almost zero effect at 1440p or lower resolution, so stop whining about higher bus widths.
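For anyone who wants to sanity-check the bandwidth numbers in the comment above: peak GDDR6 bandwidth is just bus width times per-pin data rate. The 18 Gbps and 14 Gbps rates below are the published memory speeds of the 4060 Ti and 3060 Ti; the 60 fps frame budget is purely an illustrative assumption.

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8 bits per byte) * per-pin rate in Gbps
def bandwidth_gb_per_s(bus_width_bits: int, pin_rate_gbps: float) -> float:
    return bus_width_bits / 8 * pin_rate_gbps

rtx_4060_ti = bandwidth_gb_per_s(128, 18.0)  # 128-bit bus, 18 Gbps GDDR6 -> 288.0 GB/s
rtx_3060_ti = bandwidth_gb_per_s(256, 14.0)  # 256-bit bus, 14 Gbps GDDR6 -> 448.0 GB/s

# At a 60 fps target, the per-frame transfer budget on the 4060 Ti:
per_frame_gb = rtx_4060_ti / 60  # 4.8 GB of traffic available per frame

print(rtx_4060_ti, rtx_3060_ti, round(per_frame_gb, 1))
```

So the 4060 Ti really does have ~36% less raw bandwidth than the 3060 Ti; the debate in this thread is about how much the larger L2 cache hides that in practice.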
Finally someone with a brain
@@Eleganttf2 It's baffling to me how everyone just keeps on harping on one of the least significant issues with the 60 series. 99% of the community yaps about bus width.
Fewer processing units, less VRAM and absurd pricing are the real issues.
Nvidia fixed the VRAM, but pushed the already absurd pricing beyond comprehension.
A 16GB 4060 Ti with this level of silicon should not be more than $350. If it had silicon equal to or greater than the 3060 Ti, then $400 would've been acceptable for 16GB. $500 for less processing power than a 3060 Ti is just pure greed.
The 16GB version wasn't made just for gamers lol...
The extra GB is also for people who do 3D content or any content creation, rendering, video editing, etc. Duh. And you can still enjoy games on ultra at 1080p.
Welcome to the mind of a young person that didn't have to actually work for money. "Bro just get a 4090!"
Yes, I am Blender user. I need CUDA and 16GB of VRAM. :)
The other issue is unsold stock from the previous generation (3000 series and amd's 6000 series cards). Both companies overproduced for the crypto mining boom and when it crashed had a lot of unsold cards. They deliberately gimped their new generation in order to make the last generation cards still attractive so people would buy them.
Nope, the problem is that the assembly lines switched to the new architecture, and miners and gamers are now selling their 30x0 cards for half price on eBay, really hurting the low- and mid-tier 40x0 cards. What comes after a big boom? A crash!
I guess Nvidia made it pretty clear: the 16GB card is not for gaming (at least not primarily) but for AI work. A large part of the gaming community just can't accept that they are no longer the VIPs to be served (or have their "feelings" considered). Nvidia is obviously aware of the mass of decommissioned mining cards in the used market; if they had to compete on price with those, they wouldn't see a profit, so they decided to prioritize the AI demand, for which the 40-series-exclusive features are a necessity. A rational business decision IMHO. Of course it's unpopular with the gaming community, but Nvidia knows the company's "reputation" means nearly nothing to this market segment; people will still have to swallow the 50/60-series pill when the massive stockpile of 30-series mining cards eventually dries up.
At the end of the day, even the "picky" folks will buy the new cards when Nvidia releases some decent products again, so why should Nvidia care about your feelings? This was already proven in the mining era: we saw no boycott when the 4090 was released, even though gamers were mistreated in the preceding years. Gamers won't punish AMD/Nvidia for their past actions, so why would those companies care?
Bought this card for $500 brand new as an upgrade from a 2060 Super 8GB, and I have no regrets. It runs 1440p at 60-70 fps in Wukong and SM2, and 150-200 fps at 1440p in Warzone. I was going to buy a 3060 Ti but they weren't available.
I upgraded from a 3060 Ti 8GB and damn was it ever a massive difference. I don't know what these content creators are smoking, but everything I play went up a solid 10-20 fps, everything plays much smoother, and larger AI models spit out tokens at 2-3x the speed.
Nvidia with their shady business practices and people have the nerve to Call Out AMD because they “Think” DLSS is being blocked from the biggest game launching in 2023
Yup, that is what Tensor Cores are made for. DLSS is great if most developers will not fine tune for different GPU architectures.
Yet they have their own proprietary stuff like DLSS and CUDA :-DDD kinda ironic to cry for being left out.
@@RuruFIN In the future AMD might become stronger at executing all sorts of stuff in their cores, but for now DLSS looks brilliant.
Both actually are doing shady things for a very long time now.
@@arenzricodexd4409 Yeah, so while some of the things they do I can tolerate, the tendency is to keep a GPU for longer if quality is good enough. If not you can switch to integrated GPUs, which I did for 10 years or become an expert in repairing GPUs. Options are limited. They will accept a shrinking PC market because gamers' expectations are too high or they will fix things and everyone is happy without any exception.
I think they both have their place. If all you are looking for is rasterization, go AMD, as that's their primary focus; you don't pay for tensor cores or RT cores, just basic rasterization. Personally I think consoles are great for that, and I have a PS5 myself.

If you want more than rasterization, and are willing to pay for tensor cores for DLSS/DLAA + frame generation (which is spectacular), or need CUDA cores for apps, go with Nvidia. Nvidia has the extras to artificially double your frame rate, and it's actually quite good. I was playing path-traced Cyberpunk the other day and decided to shut it off to see what the game naturally looked like in a busy area with default lighting, and that's when I saw how big a difference it actually was; it's one of the few games where I keep DLSS + frame gen on rather than native 4K. Not even a top-of-the-line AMD card for $1000 will get you there, and once UE5 sees wider adoption with its feature set, we'll find out how well AMD really does with a heavy focus on rasterization and no extras.

Not everyone wants to pay extra for tensor/RT/CUDA cores, and that's understandable, but know that you are going to pay for them if you want them; AMD not having them means you pay for rasterization only, and there's a market for basic playing of games. Nvidia with the extras is always going to cost more, and until they convince people the extras are worth it (Steam stats suggest they have), you have the option to go with the minority player instead. There's a GPU for everyone here, and luckily there are two companies targeting different areas of the market. Maybe they should have eliminated tensor/RT cores altogether on the 4060 and put that budget into rasterization instead, but they are heavily focused on RT/DLSS at this point and now it's a bundle package. If it isn't for you, it isn't for you.
The RTX 4060 Ti chip is called AD106, which means it was intended as the 4060, not the 4050 Ti. The 4060 has AD107, which means it was intended as a 4050/4050 Ti.
Nvidia is not dumb enough to make their lies THAT transparent. Of course they will name the chips in a way which suggest everything is cool. But the 4060 Ti cannot be a 3060 ti replacement because it lacks the performance gain over the old card of about 20-25%. If you consider this, it becomes obvious that the 4060 Ti is, performance wise, in the 4050 ti area, no matter what nvidia is trying to tell you.
For artificial intelligence learners and hobbyists like me, 16GB of VRAM is quite meaningful on CUDA-enabled hardware. The A770 also has 16GB, but there is no CUDA support on Intel GPUs, and CUDA is the de facto standard in AI, especially for training.
Yes this card is the king for AI at this price point.
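To ground why VRAM capacity, not speed, is the hard constraint these AI-focused comments keep raising, here's a back-of-the-envelope sketch. The model sizes are illustrative examples, and real workloads need extra memory on top of the weights (activations, KV cache, optimizer state when training):

```python
def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """VRAM needed just to hold the model weights, in GB (using 1 GB = 1e9 bytes)."""
    return params_billions * bytes_per_param

print(weights_gb(7, 2.0))   # 14.0 GB: a 7B model at fp16 overflows an 8GB card
print(weights_gb(7, 0.5))   # 3.5 GB: the same model at 4-bit quantization
print(weights_gb(13, 2.0))  # 26.0 GB: a 13B model at fp16 won't fit even in 16GB
```

This is the "life and death" point from earlier in the thread: a model either fits in VRAM and runs, or it doesn't, regardless of how fast the bus is.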
This is the biggest comedy that i have seen for a while.
Can’t wait for Nvidia to say
“Jk”
Featured you and this video in my new video releasing tomorrow. Good work!
This should cost $300, be named the RTX 4060, and have 10/12GB, not 16GB.
Man, the prices here in Australia are even crazier. I was looking at the 4060 Ti 16GB and found one for $910 AUD, then I noticed I can get a 4070 for $950 AUD from the same store. I don't understand how that is justified or how the pricing works here, but it's ridiculous.
Happy with my 6800 xt, just got it for $325 second hand.
That’s huge. Pop off
Another viewpoint. GPUs can take years to get from initial design to setting up a fab, ordering materials and starting full commercial production. Followed by global distribution etc. During the pandemic, production of raw materials and component parts dramatically reduced. The cost of everything went up quickly, and availability went down. Even big companies like NVIDIA rely on others. Contracts have to be fulfilled, yet lead times for parts and materials suddenly became several months rather than weeks. Businesses are trying to recoup the extra costs however they can, utilising what's available, whilst still playing catchup. Adding extra RAM to an existing board is a "quick fix"... and some people will buy it.
The ideal config for these cards, at around the $400 US price point, would have been a 12 GB version with a 192 bit wide memory bus. That's a card people (besides Nvidia fan boys) could not only defend credibly, but actually feel they're getting their money's worth. Nvidia could put 32 GB of VRAM on this silicon, and it would still suck due to that 128 bit wide memory bus.
It also makes me wonder what Nvidia thinks is the right use case for the 4050 series; 720p? Or maybe 480p? Progress is dead, long live progress.
Actually now that the AI boom is here you may see some new manufacturers jump into the picture. Governments everywhere are pushing for semiconductor manufacturing so in a few years even that may help. Really thankful for the US China Cold War. Will really push the elites to actually progress rather than simply tread water.
The "clamshell" layout just means you have two 32-bit chips sharing ONE 32-bit bus, so they work like a two-drive RAID-0 SATA setup with a bigger cache. For small texture files that's okay, but for big texture files that exceed the cache size you lose a lot of transfer speed compared to a bus that could move each texture over its own 2x32 bits. So it's clear you'll suffer as soon as you go high-res with high texture settings. Will it still be better than 8GB on 128-bit, though? For sure, because the cache is faster than using system RAM to hold those files. With 32GB on 128-bit you'd have four people needing to use a single toilet at the same time...
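The comment above is roughly right about the trade-off: clamshell doubles capacity by pairing chips per channel, but total bus width, and therefore peak bandwidth, stays the same, so bandwidth per gigabyte of VRAM is halved. A quick sketch using the published 4060 Ti figures (128-bit bus, 18 Gbps GDDR6):

```python
bus_width_bits = 128
pin_rate_gbps = 18.0
peak_gb_per_s = bus_width_bits / 8 * pin_rate_gbps  # 288.0 GB/s for both variants

for capacity_gb in (8, 16):  # single-sided build vs clamshell build of the same board
    print(f"{capacity_gb}GB: {peak_gb_per_s} GB/s total, "
          f"{peak_gb_per_s / capacity_gb} GB/s per GB of VRAM")
# 8GB  -> 36.0 GB/s of bandwidth per GB of VRAM
# 16GB -> 18.0 GB/s per GB: same pipe, twice the pool
```

In other words, the 16GB card can hold more, but it can't refill what it holds any faster, which is the "single toilet" problem in numbers.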
For me the 4060 Ti is the best GPU ever made :) best value for my money. I work with AI, so I need at least 12GB of VRAM these days; with the 4060 Ti I get 16, even better. Not sure why anyone needs to whine about this GPU. Not good for gaming? For me it would be okay for gaming too. In my Asus FX505DM laptop I have a GTX 1060 with 6GB of VRAM, and the last two games I played on it at high graphics settings were Hellblade 2 and Alan Wake 2. Everything was smooth with a decent framerate.
Those 128 bits are way too little; that 16GB of VRAM isn't backed by enough bandwidth. It's pointless, you won't be able to use that 16GB at its best at all.
In the years to come I could see Nvidia 4060 Ti 16Gb staying on top in terms of value and technical edge vs AMD if upcoming games rely extensively on DLSS 3 to guarantee high FPS at 1440p.
Unless/until FSR catches up 😅
I wouldn't like the frametime impact compared to faster native FPS! DLSS is already great even without tricks. And 3080/3090s get cheaper on the used market day by day. Not to mention how far Intel has come with their Arc drivers and how interesting Battlemage will be.
You say only 3% more performance, but in games that suffer from 8GB of VRAM the percentage increases greatly. Although, I understand, it's still not worth it. Personally, though, I really like the card because it's extremely efficient.
The vid is bs.
I'm an AMD fanboy, but I have to admit the 16GB version is a multi-purpose GPU.
Don't look at it only from a gaming perspective.
Having 16 gb on this card for AI is just nice and the efficiency is good.
It's more like an affordable AI card that can game as well which makes sense.
I want to congratulate all the simps who completely fail to understand the purpose of this card. It is NOT GOOD FOR GAMING. This card is PERFECT for people who want to RUN AI AT HOME, because in AI, VRAM is everything, and with two of these cards you get 32GB of VRAM for much less than it previously cost to buy one 4080, which only has 16GB anyway. You can buy three of these for the price of one 4090: that means 48GB instead of 24, and still more than enough cores to run LLMs. And forget AMD, they suck in productivity. Period.
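Setting aside the tone, the VRAM-per-dollar arithmetic in the comment above is easy to check. The prices below are round approximations of launch-era pricing (the commenter's ballpark, not official quotes):

```python
# name: (VRAM in GB, approximate price in USD -- ballpark figures, not official MSRPs)
cards = {
    "4060 Ti 16GB": (16, 500),
    "4080 16GB":    (16, 1200),
    "4090 24GB":    (24, 1600),
}

for name, (vram_gb, price) in cards.items():
    print(f"{name}: ${price / vram_gb:.0f} per GB of VRAM")

# Stacking cards, if the workload (e.g. sharded LLM inference) can split across GPUs:
print(2 * 16, "GB for $", 2 * 500)  # 32 GB, double a 4080's VRAM, for less money
print(3 * 16, "GB for $", 3 * 500)  # 48 GB, double a 4090's VRAM, for less money
```

The catch, as other commenters note, is that gaming can't pool VRAM across cards; this math only works for workloads that genuinely shard across GPUs.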
They add VRAM to the one card that needed it least, overprice it, and then they'll say: why would we increase the RAM on the others? You didn't buy the 4060 Ti 16GB.
It's just ridiculous.
4060 Ti 8GB was the one that needed more VRAM the most tho?
While some say it's good to have more options, the 3060 12GB was unpopular at first too, so we might see something surprising with this 4060 Ti 16GB.
It's not even that bad. I own one, and it runs all my games at 1440p just fine. It's hugely power-efficient, too, which is something you completely overlooked. I bought mine for £400 a year ago. No regrets!
You missed virtual reality. VRAM also matters for VR because you are using 2 individual displays on top of whatever you have going on with your desktop displays. So, if they priced it properly, it would have assisted some VR users
Nvidia used to pull this crap before. Remember the GT 640 with 4GB of VRAM? That thing was a potato; that amount of VRAM was completely useless.
Maybe it's trash for gaming... but it's a great budget GPU for rendering (the 16GB version). I bought one from MSI as a reserve GPU for my rendering rig; nice perf and temps.
The 4060 Ti is one of the best options for rendering though. Efficient and reliable. Obviously get the 16GB version
It's great for stable diffusion or any AI, I copped one since it has more vram than the 4070. Decently good for gaming since I'm coming from a 1070 too, not like you have good alternatives in AMD.
4050 6gb says
"hold my beer"
Somewhere, somebody is buying a 4060 ti 16gb and I wonder what thoughts are going through their head.
Probably: Yes, a new GPU! Because they are completely oblivious to where the product actually stands in the market, they will simply be happy to get something new.
"An AI accelerator for my amateur project for a bargaining price"
Don't worry, I won't buy it.
I bet it's better than my RTX 2060 for sure 😂
With right prices (there are much lower now in some countries), and upgrading from e.g. GTX 1070, wanting to use DLSS 3 and FG - it's a good deal ;)
I'm coming from the future to say that the title of the worst gpu ever made is the rtx 3050 6gb.
^ This.
it's sad because nvidia is now a trillion dollar company and doesn't care if we buy the 40 series or not and will probably continue to disappoint
Great thought and analysis, Vex. Also, from the Daniel Owen channel's testing of the 8GB vs 16GB 4060 Ti: the frame generation they championed in the 4060 Ti 8GB marketing actually makes it consume more VRAM, at least in CP2077. That's ironic on Nvidia's part... 😂😂 Also, per the Yuzu devs, the 128-bit bus makes upscaling on the 4060 Ti perform worse than on the 3060 Ti with its 256-bit bus...
Just buy the 4090 I don't know what is wrong with you people 😅
The price tag of that gpu
I don’t have spare kidneys
Not everyone is a spoiled rich kid who can't even stand up from his couch
Even if you could afford the card the cost to upgrade your power supply would put you over the edge. I live in an off-grid solar powered house and when I switch on my computer and game with my 3060ti card, my computer outdraws my refrigerator or air conditioner. The 4060ti will lower that a great deal.
To be fair, NVIDIA is jacking up prices because they know they will get what they're asking, whether it's now or eventually, when the elevated prices of GPUs becomes a normality down the road. Real solution is to have a competent rival that's pushing prices down, which seems to be a hit or miss lately.
16gb will age amazingly.
Lol. That super-niche example you state at 14:25 might represent a bigger crowd than you think. I'm one of those creators who requires an Nvidia GPU with the most VRAM possible. Daz Studio users are that crowd.
Like Linus talked about on the WAN Show, even if they charged DOUBLE the price of the DRAM/VRAM it would've been about 50 dollars, not 100. This is just greedy and shameless.
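The arithmetic behind that point is simple to lay out. The spot price below is an assumed ballpark for 2023 GDDR6 (an estimate for illustration, not an official figure):

```python
extra_vram_gb = 8        # the 16GB card adds 8GB over the 8GB model
spot_usd_per_gb = 3.5    # assumed 2023 GDDR6 spot price in USD/GB -- an estimate

bom_cost = extra_vram_gb * spot_usd_per_gb
print(bom_cost)              # 28.0 USD of raw memory chips
print(bom_cost * 2)          # 56.0 -- even at double the spot price, in the ~$50 range cited
print(100 - bom_cost * 2)    # the remainder of the $100 upcharge is margin
```

Even allowing generous overhead for the clamshell board changes, the gap between chip cost and the $100 price step is what the comment is objecting to.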
They are trying to keep the profit stream from the crypto boom going. But it's two different customer bases, and gamers aren't backed by billion-dollar investment funds working on a quick buck. Shareholders don't care, though, and demand the same profits quarter after quarter.
This is what happens when you shift your focus from gaming to bitcoin mining to AI. It's an excellent buy for AI but not much else.
The 16GB 4060 Ti is actually a good card. 16GB means future-proof, even for 1440p, and it doesn't have the limitation the 4060 Ti 8GB has.
But it's just priced wrong. If its price comes down to $400, I think it will be a good choice.
You just play at 1080p? Everything else gets hurt by the bandwidth!
no, laptop 4070 is the worst GPU ever made.
If they only fixed the memory bus and the price...
If only Nvidia could know that 🧐
The thing is that you're seeing it all just from a gamer's point of view... That memory may be great, e.g., for Substance Painter with tons of layers.
It is funny that people think the 4060 Ti is a 4050 Ti; Nvidia wouldn't sell you a 4050 Ti that was this close to a 3070. TechPowerUp says it is 5% slower than the 3070. It is too bad and too expensive to be a 4060 Ti, but too good to be a 4050 Ti. It's also funny to care about the names of GPUs or GPU dies; they don't really matter, tbh. The price is the problem. If the 4070 Ti were called the 4080 12GB but cost $600, everyone would cheer for it; price to performance is the problem, not name to performance. Even if they had used the AD102 die that is usually reserved for the best GPUs and it still performed like this, it would still be trash. If the 4090 had a low-tier AD107 die and kept its performance and power usage, nobody would care. The problem is the performance. We don't care how they do it, we just want the performance we paid for. Don't teach Nvidia that they can change the die naming scheme and trick us.
NVIDIA owns the market
There is only so much market share they can have before it becomes a problem
Look at the competitions offerings
What happens if Nvidia released a 3070/Ti as a 4060/Ti at $299 (as you said , naming schemes aren’t important now are they, just performance RIGHT ?)
What HAPPENS if Nvidia released a $299 3070Ti with 12-16GB Vram 256bit bus, what happens??
Ask yourself: there is a reason why Nvidia's product stack under the 4090 is gimped to some degree
If y’all really practiced what you preach, we’d be seeing the market share swing the other way massively. But it isn’t. Y’all want more from AMD and Intel without supporting them, but also want more for less from the company that owns the majority of the market?
Does ANYONE SEE THE PROBLEM HERE ?
It is a 4050. Proven over and over again.
It's funny how people can't even see the facts.
@@flimermithrandir If you think Nvidia would make you a 4050-price-class GPU that is 5% away from the 3070 in performance while there are so many 3070s still on the market, I can't help you. 😂 A 50% generational perf uplift on an XX50-class card, ha ha ha. It's like you people have never met Nvidia before; remember how the GTX 1050 was only 6% faster than the GTX 950? 🤣 I knew then that Nvidia sees the XX50 class as a dumping ground. They created shit 4060- and 4060 Ti-class cards; the name means nothing, I want the price down. They can call them 4050s or 4030s for all I care. If the price doesn't go down, it means nothing; the price is what I want down.
@@Ober1kenobi I'm not sure I understand your point. If Nvidia had launched the 3070 Ti as a 4060 Ti for $299, it would be a way better product than what we got; the 3060 MSRP was $299 and the 3060 Ti was $399. I don't even care if Nvidia called the 4060 Ti a 4090; if they charge me $299, it's better than the trash we got. It would be a 33% increase in price to performance. Don't let Nvidia trick you with names; they'll change the die naming SKUs in 2024 to trick us. I have used both AMD and Nvidia GPUs for gaming, but I need Nvidia for my job; I use CUDA for machine learning at work. Edit: I'm waiting for Battlemage from Intel for my main gaming rig. I'm not biased against Intel or AMD; I use a 13700K CPU and a 7900 XTX in my main gaming rig. My Nvidia rig is an employment necessity, even if it's convenient for me.
I mean I went to micro center the other day and picked up a refurbished RTX 4060 Ti 16 GB Gigabyte Windforce OC for only $382.00, so I see this as an absolute win! 🏆
I paid $360 for my 4060ti 16gb. Works great. I don't think it's "the worst gpu ever made"
For me it looks like Nvidia is focused on renaming stuff; as I remember, the 4070 Ti started out as a 4080 with lower VRAM.
In 2011 Nvidia CEO Jensen said to ignore customers if you want to stay in business. I wonder why they listened when people complained about the number of CUDA cores, the narrower bus and the smaller chip. It is efficient and offers a better price/performance ratio in most use cases. For some content creation it is even faster because of higher clock speeds, plus a smaller percentage of the chip is disabled compared to the bigger AD103.
Remember, the 4080 12GB didn't just have less VRAM; it also had fewer cores and a narrower memory bus.
@@PURENT Sure thing but it is faster in 1080p than a 3090 Ti in average , + 13% faster in Forza Horizon. So I liked it a lot from the start but not everyone did.
@@mytestbrandkonto3040 That's just cope. They tried selling an inferior card using the same branding as the superior one, under the guise that all you were losing was 4gb VRAM.
@@PURENT The fact that it has a different chip was not hard to find by checking the data sheet, so I don't think Nvidia tried to hide it. I would have found it better if they had just stuck with the 4080 12GB name, price discussion aside. The RTX 2070 was a TU106 die and the GTX 680 a GK104, so I had no problem with an 80-class card using a mid-size "4" chip. They are more efficient under light to medium loads while delivering more than enough performance for my needs.
People should not claim they were not intelligent enough to figure out the spec differences between the RTX 4080 with 12 or 16GB. Sounds like trash talk to me from people who were never interested in buying the product anyway. I think Nvidia made a mistake there by listening. I saw a huge performance difference between the two versions and was immediately intrigued by how freakishly fast even the small version is compared to last-generation graphics cards. But what else to expect when I couldn't buy the Ampere GPU I wanted. It is not cope, it's affection.
also, historically, 50-class cards were around $150 - $175 tops. When you consider this, it _really_ puts into perspective just how hard nvidia is trying to screw us.
The 4060ti should've always just had 12GB. That's what they should've done. No 8GB model, no 16GB model, just a happy combo of the two at 12GB. EDIT: Maybe even just a 10GB model would've been better, since the 4070 is already at 12GB.
I love how hysterical and clickbaity people are these days on YouTube. Just straight-up nonsense and sensationalism. The 4060 Ti 16GB is great and cheap as chips for anything productivity or AI (with gaming on the side); yeah, VRAM matters a lot for that. And no, not everyone working with AI, Iray or other productivity tasks wants a 4090 or needs to spend that kind of money, because their workflow might not require it. And it's a price tag of, what, about four times as much?
Before anyone says AMD: yeah, good luck with AI on AMD, or anything productivity, because most devs cater to CUDA even today.
Exactly. Good to finally see some sane comments after 3 straight years of AMD propaganda. Maybe youtubers are finally waking up!?
I thought I would regret buying a 3060 ti in July of last year because of the 40 series coming out but now that they’re out and pretty bad I guess my choice wasn’t bad lol
I think your choice wasn't bad at all! Especially in terms of dollar-to-performance ratio.
Just bought one. Haters are gonna hate but the 4060 Ti 16gb is an amazing value right now for playing at 1440p max with DLSS AND running stable diffusion!
LMAO!
Amd squad represent 💯💯
The poor handling of the VRAM and inability to really take advantage of it reminds me a lot of the VRAM "scandal" with the GTX 900-series cards being unable to actually use the full 4GB on the card. I owned one of those cards and sold it as soon as I could. It wasn't just the fraud. Wait, that's harsh. It wasn't just the oopsie; it was also that the 970 didn't perform that much better than the 770. I went AMD for a GPU last year for the first time since they were ATI, and I love my AMD card. It has delivered above and beyond what it cost and makes zero excuses. It just delivers. Nvidia has forgotten that gaming their customers is not the game they are supposed to be playing.
It should have had a 256-bit bus like the 3060 Ti and more cores; it has far fewer than the 3060 Ti. Or they could have made it a 192-bit bus with 12GB of memory; that would have been a good compromise.
196b or 192b?
Xx60 should be 192b, and xx60ti should be 256b.
Consider AMD or 4070 instead..
True that. But considering how much easier it was for Nvidia to switch their 4060 assembly to a clamshell VRAM configuration and sell the card for $100 more, it was probably genius in Nvidia's eyes! A 192-bit bus would have meant a total reconfiguration that would have made the card as expensive as the 4070!
This channel just uses deceptive content to sell AMD cards. I hate Nvidias pricing, but they are the higher quality option at this time.
I hate these clickbait titles you people do. The card is great, just not the best price; that doesn't mean it's a bad GPU in any way. Just imagine how many people see a fat discount on an amazing Nvidia card and go "nuh uh, YouTube guy said it's bad," and proceed to get a worse card... Why even bother making a review if you're just going to review a price tag for 16 minutes straight, especially considering the price changes drastically depending on where you live. For me a 4060 Ti 16GB is $450 and a 6800 XT is $700; no sane person would go for the 6800 XT in this scenario, unless people like you trick them into it. If you want to buy a new card, the 4060 Ti is an amazing choice, period.
It is kinda bad though, with that 128 bit lane
@@aregulargenericname8794 Yeah, I agree! It should be at least 192-bit; the best for it would actually be 256-bit! But even with the 128-bit memory bus it works not so badly. I would never buy a 128-bit card if I had a choice, of course!
For personal AI LLM applications this GPU is great. 3x 4060 Ti costs $1500 for 48GB of VRAM; that's pretty cheap.
I hated the price but that's the GPU I wanted LOL
A 3070-alike with 16GB of VRAM, lower power consumption, and small enough to fit nicely in an ITX case.
And no, AMD is not an option when you need the GPU for VR
Preach, sometimes there are situations where the card makes alot of sense, yes its pricey but thats how it is, Im still not paying insane amounts so im good
This is my exact situation and the reason I bought my 4060 ti.
Especially amusing considering the 3060 Ti retailed at $399 and performed similarly to the 2080/2080 Super. How far Nvidia has fallen!!
Another banger 720p card woooooo
😂