I'm getting happier every day that I managed to get a 6800 XT for 450€ 😀
They did that with the whole product stack, honestly. The 4070 'Ti' is already a 4060 'Ti', and they were going to call it a 4080. lol, this 4060 was probably going to be called a 4070. What a fucking joke.
I'm still trying to figure out what games people are playing to warrant an upgrade. The 30 series is quite capable of gaming at all resolutions. The money you'll save buying a used card is ridiculous. Not to mention, even AMD's 6000 series are good cards as well. Time to teach them both a lesson and vote with your wallet.
@HSS Revenir Even though the prices are horseshit and Nvidia is lying about 3x performance over the 3090 Ti, the card is still cheaper, more power efficient, and pretty much on par with the 3090 Ti.
Well, ALL of the 40 series is overpriced. First it was ripping on the 4070 Ti, now the 4060 Ti, when all of them are priced high. That said, when the 4060 Ti drops, people will buy it because it's a 40-series card, it will have DLSS 3 and such, and it will be great for 1080p / high refresh rates and can dabble in 1440p a little. Some people don't want to pay 4070 prices when that card drops, so they won't wait; the 4060 Ti will be the lowest card of the 40 series, and not everyone will be rushing to get an old-gen 3070 or 3060 Ti. Y'all should know this by now.
Probably in 2-3 years we won't have low-to-mid-tier GPUs; most of them will be replaced by APUs from AMD. Not surprising, with consoles reaching that kind of performance with their AMD APUs.
So if it's really RTX 3070 performance, but more expensive and with the new dumb power plug, I doubt that's a good deal. Energy efficiency is fine, but energy efficiency alone isn't really a killer feature. In most cases you have to overpay way too much up front and then just hope to see real savings after many years of use, assuming you keep it. Well, unless it's that weird bug in some new AMD cards where they idle at about 100W, but that's of course an exception.
I really don't get it, because from what I've seen Nvidia looks to have sold almost all of their 3000 series now, so I don't think these prices have anything to do with still trying to shift last-gen stock.
I like how people think that Nvidia only increased the 4000 series prices so they can push people to buy the 3000 series instead, and that once the stock of Ampere cards clears up, they will somehow magically drop the prices of the 4000 series. WHY would they do anything like that, especially since at that point consumers won't have any options left on the table anymore? They will throttle the supply, create artificial scarcity, and do whatever they can to keep the prices up. It's been proven time and time again: when people are forced to choose between getting a bad deal or not buying at all, they will choose the bad deal 100% of the time. Other people will flock to the used market and drive up the prices of those cards too. All the people who waited this long to get GPUs at decent prices waited for nothing, basically... again! As it seems, decent GPU prices are gone. I hear people saying that PC gaming is not dead, and whilst I agree, if this trend continues (as it has been for 2 years already) and the sub-$500 segment is priced out of the market (which is like 95% of PC gamers), then you can safely say that PC gaming will be dead.
@@MadBlazer89 Nvidia is not operating in a vacuum though. As you said, there is a very healthy used market with a lot of mining cards available for gamers. Plus, AMD might finally get their act together and start to pump serious volume into the market in order to take market share. There's a chance Intel will offer something for the low end that is semi-respectable as their drivers improve. Plus there are always the consoles for a very respectable gaming experience at like $400 all in. So there are options over and above just buying whatever Nvidia decides to offer. And if you look at the sales numbers, you can see that people aren't buying these cards at the prices Nvidia is trying to set, and Nvidia needs to sell in volume to make money. It's better to sell 1000 of something and make $10 per sale than to sell only 10 but make $100 per sale. A price war is coming; it's just a matter of who needs the turnover more and who has the better margins to start with.
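The volume-versus-margin point above can be sanity-checked with a trivial revenue comparison; the numbers are purely illustrative ones taken straight from the comment:

```python
# Volume vs. margin, using the comment's illustrative numbers.
high_volume = 1000 * 10   # 1000 units at $10 profit each
high_margin = 10 * 100    # 10 units at $100 profit each

print(high_volume)  # 10000
print(high_margin)  # 1000
```

So the high-volume strategy makes 10x the profit here, which is why sustained low sell-through tends to force prices down eventually.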
Performance numbers or specs have not been released yet so everything is just speculation. Forming an opinion based on assumptions is completely silly. I have a feeling a 4060ti will be closer to a 3080 than a 3070 and dlss features will improve it further. But only Nvidia knows, so let’s wait for the release.
The 4060 Ti is not pointless. It's just a replacement for the 3070, which you soon won't be able to buy anymore... It's the same as saying the 1060 is useless. It's a GPU that can draw graphics, so it's not useless. It's most likely too expensive, but so is everything else in new tech this year... So it's a bad deal, but among the other bad deals we have now and in the future... it belongs.
Also, I think AMD is mostly right that games may run out of VRAM at maxed-out settings, both raster and ray traced. So I think 1080p should use 10GB on a 160-bit bus, 1440p should use 12-16GB at 192-256 bit, and 4K and ultrawide should be 16+GB at 256-384 bit. It also makes sense to make a 20GB, 320-bit option. Finally, when RDNA 4 is in development, I think it will be a totally ground-up chiplet design, unlike RDNA 3, more in line with the Zen 3 and Zen 5 moments on CPUs. It would also be made to dethrone Nvidia in every segment, and the 5090 would likely be dethroned by adding more GCDs: stacking Infinity Cache vertically in the manner of 3D V-Cache, dual 96-CU GCDs with Infinity Links and X3D, and a front IO die for PCIe, the display engine, and the video codec engines. GCDs and MCDs would still be present, and going up to 192 CUs would be like Threadripper; call it a ray-ripper true flagship.
Really hope kopite is wrong. I mean, we saw the 3060 Ti being a 2080 Super, and now a 4070 Ti being a 3090/3090 Ti, and the 70 Ti and 70 class GPUs were never so far apart in performance. So hopefully the 4070 will be like a 3080 Ti, and the 4060 Ti could possibly be a 3080, in the best-case scenario of course.
This generation of cards is just way too overpriced for the performance gain. I will just wait; I have a 3080 anyway for 1440p gaming, so I don't need one.
Q2 I believe, but I feel they will keep doing what they usually do for mid/low tier: wait for Nvidia to price their GPUs first and then undercut them by $50. But who knows? Maybe their mid/low tier is cheaper this time (in the sense of being priced right) because of the new fabrication process. I ended up getting a new 6600 XT for $250 in my country, since I suspect the 7600 XT MSRP will be at least $300 ($350 most likely, and way higher in my country), so the price/performance would be the same in my case, or worse if I waited for a 7600 XT or RTX 3060, and my GPU was almost dead anyway.
@@ApocalipsisEnSabah I'd be lucky to get a 6500 XT for $250 here in Australia. I'm hoping the 7700 XT will be roughly $900 AUD or better. 🤞 Also hoping it's going to be worth it...
I'm building a nice ITX PC at the moment, and it's been a right ballache picking a card for 1440p. My laptop 2070 Max-Q, on the other hand, is doing sterling work, playing everything at ultra; I've only had to go down to high a few times. If I didn't hate the blur of 1080p so much, I'd probably be set for ages.
Might as well buy a used 3070 for less, basically the same performance. You're only missing out on DLSS 3 and power efficiency, but neither of those are that big of a deal unless you are specifically building a tiny SFF PC and you want as much performance as possible.
Let's say AMD matches RDNA 3's VRAM buffers to the performance tiers of RDNA 2; This will likely result in a 7700 XT 16GB and a 7600 XT 12GB. If the speeds stay at 20Gbps, that 7600 XT is going to have 480GB/s of bandwidth... 95% of the 4070 Ti's bandwidth. Even if they drop it to 16Gbps, the 4070 would need 19Gbps to match the bandwidth of a slower card.
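The bandwidth arithmetic above is easy to check: bandwidth in GB/s is just the bus width in bits, divided by 8 to get bytes, times the per-pin data rate in Gbps. A small sketch; the 192-bit bus for the hypothetical 7600 XT 12GB is an assumption implied by the comment's numbers, while the 4070 Ti's 192-bit / 21Gbps configuration is its actual spec:

```python
def gddr_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth in GB/s: (bus width / 8 bits-per-byte) * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

# Hypothetical 7600 XT 12GB: 192-bit bus assumed, 20Gbps GDDR6 per the comment.
hypothetical_7600xt = gddr_bandwidth_gbs(192, 20)
# RTX 4070 Ti: actual 192-bit, 21Gbps GDDR6X configuration.
rtx_4070_ti = gddr_bandwidth_gbs(192, 21)

print(hypothetical_7600xt)                # 480.0 GB/s, matching the comment
print(hypothetical_7600xt / rtx_4070_ti)  # ~0.952, i.e. ~95% of the 4070 Ti
```

Dropping the assumed card to 16Gbps gives 192 / 8 * 16 = 384 GB/s, which is the slower case the comment compares the 4070 against.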
Buying ex-mining equipment seems risky, because when computer hardware is defective the symptoms are random. I've never used ex-mining hardware, but I did use my PC for botting an online game 24/7 for years, which is more or less the same. The symptoms were random, as if the problem were software-related rather than hardware-related. The Windows install I'm using now is the old one from when I was botting a few years ago that had random problems, but now, with completely new hardware: no errors, no BSODs, nothing.
Last time I upgraded was in 2019: AM4 motherboard, AMD Ryzen 3 1200 CPU, and 16GB of RAM for $233. Got an RX 570 for $140, all new off Newegg. I'll wait for next gen to come out and buy previous gen. Sales have slowed down and fewer people are PC gaming, so they're bound to drop their prices. These companies gotta diversify, because Nvidia gave up on the integrated market while AMD is getting bigger in the server market, which is really where the money is. Intel's offerings aren't as good on the CPU side for price/performance. I feel AMD may be coming up over the next few years, and there's a reason Nvidia stock is down 50% in the past 6 months. All the while, TSMC is just killing it.
I think it makes more sense to give a future 5060 10GB of VRAM on a 160-bit bus, because even 1080p at maxed settings with maxed RT would run into VRAM issues. And if the 4050 and 4060 are being replaced by an ARM-based APU on both desktop and laptop, given the experience gained from the Grace Hopper superchip SiP package, then it makes more sense to make a superchip APU with a large 16GB of 256-bit GDDR6 at 20-24Gbps. Then the 70 card should be priced at the 60 level, the 80 is the new 70, and the 90 is the new 80.
With the prediction that the 4060 Ti will just match the 3070, that would make it a 3060 replacement, not a 3060 Ti replacement. Even if it matched the 3070 Ti, it would have to be the same price as the 3060 Ti to be even close to reasonable value.
The whole 4000 lineup got screwed after the gimped 4080 was rebranded as the 4070 Ti. Most likely the 4060 Ti was the 4070, but they're shifting them all to milk the prices. I bet the next one will be a 4050 Ti that was supposed to be a 4060 xD
@@COINGOBLIN78 Not if that is your main fun. No beer, drugs, bars, strip joints etc.... Gaming can keep a lot of people out of trouble if, they would do it and stick to it and go all out!!!! *Which you need to do!!!!!! Go Green, HULK OUT!!!!! Now go get that 4090!!!! AMEN!!!!
@@PaulJohn01 4070 Ti to 4080 is a $400 gap; the 4070 could easily be $649 ($150 gap) and then the 4060 Ti $499 for another $150 gap (source is a Chinese leak, could be totally wrong though, but it sounds about right).
@@m-copyright i don't think this connector will be a problem with the mid to low tier 4000 series cards since they probably won't draw enough power to melt the pins
@@JQNAH Normally it shouldn't since like you said they draw less power, but we shall see. Regardless Nvidia will continue to push this new connector. Redesign it a little bit, but it's here to stay.
@@m-copyright The fires specifically had nothing to do with the pins or cables; it was a high-resistance parallel path caused by debris and improper insertion. It would probably still happen even on a 200W card under specific circumstances. But you know, just plug the cable in all the way until it clicks and there will be 0 issues lol. The fires were happening for people with the connector only plugged in about halfway, who then also pulled the wire at an angle so the pins jammed at a severe angle internally and created the parallel path. The connectors do have a limited lifespan of ~30 plug/unplug cycles, but standard GPU power connectors have a lifespan as well (~50 cycles iirc, might be wrong), and for most people this shouldn't really be a problem. How many times do you plug/unplug your GPU during its lifespan? I'd guess 5-10 tops for most users.
Sounds like improved power efficiency with the new architecture is really the main benefit here unless you go 4090 for Nvidia. Also, I'm hoping we don't see Amazon/Newegg/everyone else selling "new" GPUs from brand names we've never heard of, because of the pressure washed mining GPU die situation. Buy from reputable brand names instead to avoid that problem.
The entire lineup from Nvidia and AMD is pointless, but you know, fanboy sheep will still buy overpriced GPUs and invent all the excuses...
Did you expect anything better from the 4000 series? Yes the 4090 is Amazeballz!
But everything else is just an exercise in how to Dick down your customers!!!
Idk, I enjoy the 4080. Maybe I do like gettin dicked down by Team Green.
@@ARealPain Do You. I get it, some people just like dick. Live and let live.
Seems like my gtx 1070 can't retire for another 2 years.
I’m not going to pay 1100€ for a 4070ti which is like 2x the performance of a 1080ti, 7 years later.
If Nvidia can't let go of its greed, then I bet Intel's Battlemage platform will be selling like hotcakes soon.
I figured the 4070 Ti was more than 2x the performance of the 1080 Ti, considering the 3060 roughly matches it.
@@innocentiuslacrim2290 Exactly.
@@artificialstyle998 According to TechPowerUp's GPU database, it's about 104% faster than the 2070 Super, which is roughly 4% faster than a 1080 Ti, so yeah, about 2x.
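One small wrinkle in the estimate above: relative performance figures chain by multiplication, not addition. Using the rough TechPowerUp-based numbers quoted in the comment:

```python
# Chaining relative performance: percentage gains multiply, not add.
gain_over_2070s = 1.04    # "104% better" than the 2070 Super
gain_2070s_over_1080ti = 0.04  # 2070 Super "~4% faster" than the 1080 Ti

ratio_vs_1080ti = (1 + gain_over_2070s) * (1 + gain_2070s_over_1080ti)
print(round(ratio_vs_1080ti, 2))  # 2.12 -> a bit more than "2x"
```

So the chained estimate comes out slightly above a straight doubling, though with numbers this rough "about 2x" is fair.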
Imagine thinking Intel won't be just as greedy as every other company. They literally invented the greed game way before Nvidia decided to take it over.
I feel more and more lucky that I have a 1080 Ti.
I hope people are buying nice 4K TVs instead of overpriced GPUs.
+ plus a current gen console from Sony or Microsoft.
4K TV + PS5 = 4K 120 FPS gaming
@@darylferreras6241 You mean upscaled 4K on low graphics settings.
@@darylferreras6241 The PS5 is roughly a 2060 Super; how can a 2060 Super run 4K?
@@FunFiction704 and at 60ish fps
Well, the RTX 4070 will be 3080 performance for the same price, £650, with a lower TDP. Or you can just grab a second-hand 3090 for £500-600 and call it a day.
I'm still on Vega64. Thinking of waiting for 5000 series Nvidia.
That's a good card though. What monitor you use?
@@BlackJesus8463 Was using a Nixeus 27" 1440p but stuck with the laptop until I get a new one... won't be for a couple months :( lol
Not to mention, a brand-new 3070, heck even the 3070 Ti, should be significantly cheaper than the 4060 Ti's retail price. The inflation of new-gen GPUs is sickening.
A 25% decrease in performance? Okay, take 5% off the price.
That makes sense for nvidia
yes, it's total garbage
I think the original 3060 12Gig is going to last a really long time if generational improvements are going to be this bad.
You spelled 6700 XT wrong ;)
Seriously though, the 3060 doesn't even benefit that much from all that Vram, and even the 6700 XT often doesn't need that much Vram. That being said, the additional Vram can be useful, and can make up for having less than the optimal amount of system RAM (but that's a bit unusual when that happens).
@@syncmonism Not everyone is a gamer… Nvidia is way better for productivity.
The A770 16GB, with its software side starting to be ironed out, is looking awfully good, isn't it?
Not really, when you can get a 6700 XT for around the same price.
@@artificialstyle998 The 6700 XT doesn't have good ray tracing performance; the A770 does. And XeSS is a decent bit better than FSR. Is that worth potentially worse drivers and such? Idk, but I am starting to seriously consider it personally.
@@fayefischer1751 Personally I wouldn't even consider ray tracing on an A770/3060 Ti/6600 XT tier of GPU. The performance hit / having to lower settings isn't worth it in my opinion.
@@fayefischer1751 If you can find an A770 way cheaper than a 6700 XT, then it's a deal, but for ray tracing/XeSS it's not worth it (ray tracing on mid-range cards isn't really worth it right now, not even on Nvidia ones). At the same price, the 6700 XT is an easy win in my opinion (unless you really need the extra VRAM of the A770). I would skip both, though, for the next Arc generation and FSR 3 if I didn't need an upgrade, or consider them at that point if their prices dropped, or if the new-gen GPUs aren't worth it and the Intel drivers have improved and FSR 3 is supported.
Yes, I would go Arc A770 16GB every day.
I was thinking of holding out as long as possible with my 960 4GB, but after seeing the prices Nvidia is going for, it wasn't worth waiting to buy a 4000 series that increases performance for the same or higher increase in price compared to the previous gen (same for AMD, but not as disgusting as Nvidia I guess).
Happy I could find a new 6600 XT at $250 in my country, when they're normally $350+ and 3060s are $400+. First time using an AMD GPU, and no problems yet after uninstalling the Nvidia drivers with DDU.
Excellent choice
Sounds like it's what many of us thought: that the 4070 Ti we got is more like what the 4060, MAYBE the 4060 Ti, was SUPPOSED to be. A 4060 (non-Ti) should be BETTER than a 3070 by a decent margin. For them to release a 4060 Ti that basically performs the same as a 3070 is an embarrassment and an INSULT, and should ONLY be taken as such by consumers. This IS a place where AMD should be taking advantage of Nvidia's greed, but given how the 7000 series has gone so far, they'll botch this one too.
I freakin sold my 3060ti to buy this overpriced nonsense. Got scammed by Nvidia. 3060ti = 2080 super. But 4060ti = 3070. WTF Nvidia.
The major issue with the 4060 Ti is the 8GB of VRAM. The 3070 is a great 1440p card, so the 4060 Ti would be as well. But as demonstrated by Hardware Unboxed, 8GB is not enough for games with HD texture packs. Steve showed that in games like Far Cry 6, the RX 6700 XT was faster than the RTX 3070 Ti, and it was because of the lack of memory on the Nvidia cards.
oh nice, the old AMD "fine wine" thing again with vram...
@@innocentiuslacrim2290 It's the mindset that has changed. The 3070 is still a good product, and people were desperate to get one mid last year. You think it became bad in 6 months? Absolutely not; it's people's expectations: they wanted a 40 series with a big performance jump at the same price as the 30 series. Think of the 30-series cards as having been made for a 4-year run and it starts to make sense. It's a mid-cycle refresh.
@@ever611 Well, it's kinda true, but it's not a deal breaker. I came from a 960 4GB and it held up pretty well for 8 years (I was still playing new games at 1080p or lower with FSR at 30-60fps, with settings optimized at high/medium/low depending on their fps impact and visuals; I could have kept gaming on it, but I found a new 6600 XT very cheap in my country compared to everything else). Meanwhile the 2GB version had problems like lower 1% lows/stutters/texture loading problems/etc. (my friend with the 2GB version upgraded way before me for those reasons in the games we both played, while I didn't have that many problems, even when our fps were pretty close), so spending the extra $25 on the 4GB EVGA card was a great deal in my case (my friend ended up upgrading to the only thing he could afford, a 1060 6GB overpriced during the mining boom, regretting he didn't get the same 960 as me). At the time everyone was saying the extra 2GB wasn't worth it (it was that, or spend $150 more, which I didn't have at the time, on a 970, even though those had the 3.5GB thing xD).
If someone upgrades every 2-4 years it doesn't really matter, but for those like me who upgrade after a long time, that extra VRAM is somewhat worth it.
The AMD "fine wine" thing is more about the drivers though (I was never an AMD user until now, just to be clear). Look at the 1060 vs the 580: the 1060 started out better, but at some point the 580 had better performance thanks to the drivers and maybe the VRAM (not sure if it's because AMD has better drivers since they don't forget about previous gens, or because Nvidia's are bad since they focus more on the new gen, or a little of both).
I don't feel 8GB is a problem for the 4060 Ti though; the problem is going back to GDDR6 when the refreshed 3060 Ti is GDDR6X (I guess they want to launch a refresh again so they can keep the prices at "MSRP" or higher like they've been doing lately; really scummy, as always).
There will be heavy problems, as that 8GB of memory will be paired with a 128-bit bus width!
That's not the worst thing about the 4060 ti. It would be better to have 20% more performance than to have more Vram, though 8GB was definitely not a lot even for the 3060 ti, let alone the 3070 or 4060 ti, and the 8GB of Vram definitely is an issue, just not the biggest issue imo. It's still true that 8GB is usually enough to go with the amount of GPU compute power that a 3070 has, or what the 4060 ti will have.
The 4060 ti is also looking rather weak given the name of the card, making it a very minor upgrade over the 3060 ti in every category but power consumption. Indeed, the 12GB of Vram of the 6700 XT and 6750 XT, as well as their prices, will definitely make the 4060 ti look bad.
12 GB vram should be the minimum standard in all future GPU. Governments should ban gpus with vram less than 12 GB.
This was more or less predictable as soon as the 4080s specs were revealed to be slightly over/under half of the 4090
Nvidia might suck, but AMD seems like they have no idea what they're doing, bumbling around in the dark, and all the anti-customer practices are starting to sound just like Nvidia.
Thanks for the advice on buying used. I have been waiting to see if there would be any low-end cards in the 40 series that would be cost-effective compared to a midrange used 30 series. I'm beginning to doubt that.
I was also hoping to see how good FSR 3 would be, whether it could rival DLSS in any capacity (I don't care about ray tracing), and let that inform my decision between AMD and Nvidia if I buy current-gen. I'm also curious about AV1 encoding, and I'm looking into whether there is any serviceable workaround for running programs that use CUDA on an AMD graphics card. I do some video work from time to time; probably not often enough to really worry about it that much, but it would be nice. I don't know if that is even possible; I'm not super experienced in this field.
I was in the same spot about new-gen vs previous-gen price/performance; I ended up getting a new 6600 XT very cheap in my country, and I am loving it, first time buying an AMD GPU also. (I was also holding out because I really wanted an Nvidia card for the few things I sometimes do that run better on Nvidia GPUs; not for ray tracing, since I don't care about reflections, and for lighting I prefer to use my own shaders to make the game look the way I like, with a way lower performance hit than RTX on. But after seeing how they're pricing the 4000 series, I feel it's not worth it; in the end my new card is better than my 960 at those few non-gaming things anyway, and for gaming it's better than I thought.)
Let's wait for the NVIDIA 6600 cards in 5 years... they will finally reach 3080 performance for 400 bucks.
I got an RTX 2060 prebuilt about a year ago during the shortage. I spent $950 for a Gigabyte Windforce 2060, Ryzen 3600, 16GB of memory (upgraded to 32), an SSD (added a 1TB Samsung 980 M.2), and an alright power supply. I was thinking of upgrading, but I don't see a point right now; the new stuff is too expensive and the prior gens work great.
My son and I are on 2060 Supers. We aren't upgrading either.
Got a 2080 for $200.
Anyone with any memory of the past two years could have seen this coming from a mile away. The message from Nvidia last gen with the gimping of the 3060 and 3050 came through loud and clear: midrange buyers don't *get* nice things.
No point in buying a "midrange" card when a PS5 can be had for almost the same price.
@@amirtak9886 If you already own a PC with a decent processor, I think it's better to buy a midrange graphics card than a console,
as you are able to play older games on PC, you can also use it for 3D and video rendering, and you are free to choose the input device of your liking (as long as the game devs bother to support it).
@@ComicAdrian Id agree with that.
Hmm, today's ketchup is kinda depressing... But new GPUs from the previous generation are still available, so deal-hunting on those should be OK.
Also, if you don't go beyond 1440p, you don't really need a card that costs more than $500 anyway. Personally, if I can't get a 50% performance boost from upgrading, I'll just wait a year for the next gen.
I am not going past 1080p till I get an RTX 5070.
Nvidia is pointless.
The 4070 Ti is pointless, the 4060 Ti is pointless, but people will buy them because they can't afford the 4080 or 4090.
It's not really about affordability. For me it's about not wanting to pay used-hatchback prices for something that has way less precious metal in it.
Maybe Nvidia did a tier lift, so the 4060 is something that was supposed to be a 4050?
Seems that it's better to leave this generation alone altogether.
I'm getting happier every day that I managed to get a 6800 XT for 450€ 😀
They did that with the whole product stack, honestly. The 4070 'Ti' is already a 4060 'Ti', and they were going to call it a 4080. lol, this 4060 was probably going to be called a 4070. What a fucking joke.
I'm looking forward to a 40 series graphics card, cut down to just 1 single Cuda core.
Couldn't care less about overclocking these days. As long as it works and performs well, I'm happy with the purchase.
I'm still trying to figure out what games people are playing to warrant an upgrade. The 30 series is quite capable of gaming at all resolutions. The money you'll save buying a used card is ridiculous. Not to mention, even AMD's 6000 series are good cards as well. Time to teach them both a lesson and vote with your wallet.
It's funny how a 3080 Ti is more expensive than the 4070 Ti in Australia. Isn't the 30 series meant to go down in price?
yeah i'm coppin a 4070 ti
No.
@@MarikHavair stay mad
@@JQNAH LMFAO
@HSS Revenir Even with the prices being horseshit and Nvidia lying about 3x performance over the 3090 Ti, the card is still cheaper, more power efficient, and pretty much on par with the 3090 Ti.
Well, ALL of the 40 series is overpriced. First it was ripping on the 4070 Ti, now the 4060 Ti, when all of them are priced high. That said, when the 4060 Ti drops, people will buy it because it's a 40 series card, has DLSS 3 and such, and will be great for 1080p / high refresh rates, and can dabble in 1440p a little.
Some people don't want to pay 4070 prices when that card drops, and people won't wait, so the 4060 Ti will be the lowest card of the 40 series, and not everyone will be rushing to get a last-gen 3070 or 3060 Ti.
Y'all should know this by now.
Thanks for the news, these video cards are hot freaking garbage.
Probably in 2-3 years we won't have low- to mid-tier GPUs; most of them will be replaced by APUs from AMD. Not surprising, with consoles reaching that kind of performance with their AMD APUs.
So if it's really RTX 3070-level performance, but more expensive and with the dumb new power plug, I doubt that's a good deal.
Energy efficiency is fine, but energy efficiency alone isn't really a killer feature. In most cases you have to overpay way too much up front for it and then just hope to see real savings after many years of use, assuming you keep the card.
Well, unless it's that weird bug in some new AMD cards where they idle at about 100W, but that's of course an exception.
I really don't get it because from what I've seen Nvidia looks to have sold almost all their 3000 series now so I don't think these prices have anything to do with still trying to shift last gen stock.
I like how people think that Nvidia only increased the 4000 series prices to push people to buy the 3000 series instead, and that once the stock of Ampere cards clears up, they will somehow magically drop the prices of the 4000 series. WHY would they do anything like that, especially when at that point consumers won't have any options left on the table?
They will throttle the supply, create artificial scarcity, and do whatever they can to keep prices up. It's been proven time and time again: when people are forced to choose between getting a bad deal or not buying at all, they will choose the bad deal 100% of the time. Other people will flock to the used market and drive up the prices of those cards too.
All the people who waited this long to get GPUs at decent prices basically waited for nothing... again! It seems decent GPU prices are gone. I hear people saying that PC gaming is not dead, and while I agree, if this trend continues (as it has for 2 years already) and the sub-$500 segment (which is like 95% of PC gamers) is priced out of the market, then you can safely say that PC gaming will be dead.
@@MadBlazer89 Nvidia is not operating in a vacuum, though. As you said, there is a very healthy used market with a lot of mining cards available for gamers. Plus, AMD might finally get their act together and start pumping serious volume into the market in order to take market share. There is a chance Intel will offer something semi-respectable for the low end as their drivers improve. And there are always the consoles for a very respectable gaming experience at like $400 all in. So there are options over and above just buying whatever Nvidia decides to offer. And if you look at the sales numbers, you can see that people aren't buying these cards at the prices Nvidia is trying to set, and Nvidia needs to sell in volume to make money. It's better to sell 1000 of something and make $10 per sale than to sell only 10 and make $100 per sale. A price war is coming; it's just a matter of who needs the turnover more and who has the better margins to start with.
Performance numbers or specs have not been released yet so everything is just speculation. Forming an opinion based on assumptions is completely silly. I have a feeling a 4060ti will be closer to a 3080 than a 3070 and dlss features will improve it further. But only Nvidia knows, so let’s wait for the release.
The 4060 Ti is not pointless. It is just a replacement for the 3070, which you soon won't be able to buy anymore... It's the same as saying the 1060 is useless. It is a GPU that can draw graphics, so it is not useless. It is most likely too expensive, but so is everything else in new tech this year...
So it is a bad deal, but among the other bad deals we have now and in the future... it belongs.
They locked out extreme overclocking. Normal tuning is totally fine. Please do not promote misinformation on youtube. @BootSequence.
Is this a 4050 labelled as a 4060? They already tried that by labelling a 4070 as a 4080.
The RTX 4060 will be $479, and the Ti will be higher than $499. Nvidia: where gamers go broke.
Whoever is thinking of buying a 4060 Ti could reconsider and get a 6950 XT.
Both Nvidia and AMD are abusing their support base; we should stop buying for a generation.
Also, I think AMD is mostly right that games may run out of VRAM at maxed-out settings, both rasterized and ray traced. So I think 1080p should use 10GB at 160-bit, 1440p should use 12-16GB at 192-256 bit, and 4K and ultrawide should use 16+GB at 256-384 bit. It also makes sense to make a 20GB VRAM card with a 320-bit bus.
Finally, with RDNA 4 in development, I think it will be a totally ground-up chiplet design, unlike RDNA 3; more in line with the Zen 3 and Zen 5 moments on CPUs. It would also be made to dethrone Nvidia in every segment, and the 5090 would likely be dethroned by adding more GCDs: stacking Infinity Cache vertically in the manner of 3D V-Cache, dual 96CU GCDs with Infinity Links and X3D, and a front I/O extender for PCIe, the display engine, and the video codec engines. GCDs and MCDs would still be present, and going up to 192CU would be like Threadripper; call it a ray-ripper as the true flagship.
lol, I'll pass. I'm waiting for the 50 series cards lol 🤣🤣🤣🤣 These cards are just a cash grab!
The Arc A750 isn't too bad for what you get... especially for 250 USD.
The RTX 3060 Ti performed better than an RTX 2080 Super. The 40 series is lame asf. The RTX 3070 beat the RTX 2080 Ti, to boot.
I really hope kopite is wrong. I mean, we saw the 3060 Ti matching a 2080 Super, and now a 4070 Ti matching a 3090/3090 Ti, and the 70 Ti and 70 class GPUs were never that far apart in performance. So hopefully the 4070 will be like a 3080 Ti, and the 4060 Ti could possibly be a 3080, in the best-case scenario of course.
Yeah, I think so too. At least at FHD it has to be at 3080 performance. And the price should be $399 instead of $499.
Nice T-shirt, Snows. Think I will spend my money on some new clothes instead.
I long for the old days when GPU news was good news. Now any time I see GPU news, it's always bad news.
Hey! 4060ti makes me feel good about having my 3080!
Scalpers would have taken the video cards if they were cheap.
I'll just use GeForce Now until something comes out.
This generation of cards is just way too overpriced for the performance gain. I will just wait; I have a 3080 anyway for 1440p gaming, so I don't need one.
Life is a simulation, there's a glitch in the matrix 0:37
Where is AMD's new midrange?
Q2, I believe, but I feel they will keep doing what they usually do for the mid/low tier: wait for Nvidia to price their GPUs first and then undercut them by $50. But who knows? Maybe their mid/low tier is cheaper this time (in the sense of being priced right) because of the new fabrication process.
I ended up getting a 6600 XT new for $250 in my country, since I suspect the 7600 XT MSRP will be at least $300 ($350 most likely, and way higher in my country), so the price/performance would have been the same in my case, or worse, if I had waited for a 7600 XT or RTX 3060 (and my GPU was almost dead anyway).
@@ApocalipsisEnSabah I'd be lucky to get a 6500 XT for $250 here in Australia. I'm hoping the 7700 XT will be roughly $900 AUD or better 🤞 and that it's going to be worth it...
Nvidia is run by such UA-cams...
The 4060 ti feels like a 4060 or a 4050 ti
I'm building a nice ITX PC at the moment and it's being a right ballache picking a card for 1440p. My laptop 2070 Max-Q, on the other hand, is doing sterling work playing everything at ultra, and I've only had to go down to high a few times. If I didn't hate the blur of 1080p so much, I'd probably be set for ages.
Best bet (if you wanna spend money on stuff): a 6900 XT and a 1440p 240Hz monitor with a Ryzen 7 5800X3D.
Might as well buy a used 3070 for less, basically the same performance. You're only missing out on DLSS 3 and power efficiency, but neither of those are that big of a deal unless you are specifically building a tiny SFF PC and you want as much performance as possible.
Let's say AMD matches RDNA 3's VRAM buffers to the performance tiers of RDNA 2; This will likely result in a 7700 XT 16GB and a 7600 XT 12GB. If the speeds stay at 20Gbps, that 7600 XT is going to have 480GB/s of bandwidth... 95% of the 4070 Ti's bandwidth. Even if they drop it to 16Gbps, the 4070 would need 19Gbps to match the bandwidth of a slower card.
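The bandwidth arithmetic above is easy to check. A minimal sketch (the 7600 XT's 192-bit/20Gbps configuration is this comment's speculation, not a confirmed spec; the 4070 Ti runs 192-bit at 21Gbps):

```python
# Peak GDDR bandwidth: (bus width in bits / 8 bits-per-byte) * per-pin data rate in Gbps
def gddr_bandwidth(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s for a GDDR configuration."""
    return bus_width_bits / 8 * data_rate_gbps

# Speculative 7600 XT 12GB: 192-bit bus at 20 Gbps
print(gddr_bandwidth(192, 20))  # 480.0 GB/s

# RTX 4070 Ti: 192-bit bus at 21 Gbps
print(gddr_bandwidth(192, 21))  # 504.0 GB/s -> 480/504 is roughly 95%

# The same speculative card dropped to 16 Gbps
print(gddr_bandwidth(192, 16))  # 384.0 GB/s
```

This is why bus width matters as much as memory speed: halving the bus halves the bandwidth regardless of how fast the chips are clocked.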
Buying ex-mining equipment seems risky, because when computer hardware is defective, the symptoms are random. I've never used ex-mining hardware, but I did use my PC for botting an online game 24/7 for years, which is more or less the same. The symptoms were random, as if it weren't a hardware problem but a software one. The Windows install I'm using now is the old one from when I was botting a few years ago, which had random problems, but now with completely new hardware there are no errors, no BSODs, no nothing.
I get the feeling NVIDIA wants to drop the selling of mid-class GPUs to gamers on a budget and nudge them into subscribing to GeForce Now.
4060ti is worth $300...
Where tf is the 4080ti?
And they won't tell you what the cards really went through, so they can charge more. So yeah, don't buy mining GPUs, because 9 times out of 10 you're being misled.
Anything Ti that costs less than $600 and can play 4K at 60 FPS at max settings is worth buying.
I could point out that the 780 Ti was cheaper, but even the Titan had different prices back then. WTF.
0:36
Weird. Was it just me?
Yes. Your GPU is dying; start saving for a 4060 Ti.
Last time I upgraded was in 2019: an AM4 motherboard, AMD 1200 CPU, and 16GB of RAM for $233, plus a 570 for $140, all new off Newegg. I'll wait for the next gen to come out and buy the previous gen. Sales have slowed down and fewer people are PC gaming, so they're bound to drop their prices. These companies gotta diversify, because Nvidia gave up on the integrated market while AMD is getting bigger in the server market, which is really where the money is. Intel's offerings aren't as good on the CPU side for price/performance. I feel AMD may be coming up over the next few years, and there is a reason Nvidia stock is down 50% in the past 6 months. All the while, TSMC is just killing it.
A pity that NGreedia does not scale the price down the same as performance. They might be worth looking at if they did
4050
I think it makes more sense to give a future 5060 10GB of 160-bit VRAM, because even 1080p maxed settings with maxed RT can run into VRAM issues. And if the 4050 and 4060 are being replaced by an ARM-based APU on both desktop and laptop, given the experience gained from the Grace Hopper superchip SiP package, then it makes more sense to make a superchip APU with a large 16GB of 256-bit GDDR6 at 20-24Gbps. Then the 70 card should be priced at the 60 level, the 80 is the new 70, and the 90 is the new 80.
You may want to get a refund on your set. It has a yellow tint.
With the prediction that the 4060 Ti will just match the 3070, that would make it a 3060 replacement, not a 3060 Ti replacement. Now, if it matches the 3070 Ti, it would have to be the same price as the 3060 Ti to be even close to reasonable value.
The whole 4000 stack got screwed after the gimped 4080 was rebranded as the 4070 Ti. Most likely the 4060 Ti was the 4070, but they are shifting them all down to milk the prices. I bet the next one will be a 4050 Ti that was supposed to be a 4060 xD
Anything less than a 4090 is senseless to buy!!!!!
I would say spending 2k on a GPU to play games is even more senseless.
@@COINGOBLIN78
Not if that is your main fun.
No beer, drugs, bars, strip joints etc....
Gaming can keep a lot of people out of trouble if, they would do it and stick to it and go all out!!!!
*Which you need to do!!!!!!
Go Green, HULK OUT!!!!!
Now go get that 4090!!!!
AMEN!!!!
@@COINGOBLIN78 If you have even a half-decent job, you can easily buy a 4090 setup.
When is the Titan set to release?
XTAAAASSSEEEEE
BRUH 😂😂😂
@Boot Sequence You're not going to get a 4070 for $600. The 4060 Ti will cost $649! Hahaha 😁😁😁😁 You think we're going to see a 4060 Ti for $449???
The price is apparently gonna be under 500, so probably 499.99.
@@psylina From what source? I really don't see a $300 difference between the 4060 Ti MSRP and the 4070 Ti MSRP, but we should find out in the next month or so.
@@PaulJohn01 4070 Ti to 4080 is a $400 gap; the 4070 could easily be $649 (a $150 gap) and then the 4060 Ti $499 for another $150 gap (the source is a Chinese leak, could be totally wrong though, but it sounds about right).
Better to go Arc A770 16GB or AMD RX 6700 XT than give Nvidia money.
Is the 4060 Ti using the same problematic 12VHPWR cables?
Yes, all 4000 series cards that have a supplementary power connector will have the new trash-fire-causing connector.
@@m-copyright I don't think this connector will be a problem with the mid- to low-tier 4000 series cards, since they probably won't draw enough power to melt the pins.
@@JQNAH Normally it shouldn't, since, like you said, they draw less power, but we shall see. Regardless, Nvidia will continue to push this new connector; they'll redesign it a little bit, but it's here to stay.
@@m-copyright The fires specifically had nothing to do with the pins or cables; the cause was a high-resistance parallel path created by debris and improper insertion. It would probably still happen even on a 200W card under specific circumstances.
But you know, just plug the cable in all the way until it clicks and there will be 0 issues lol. The fires were happening to people with the connector only plugged in halfway, who were then also pulling the wire at an angle, so the pins jammed at a severe angle internally and created the parallel path.
The connectors do have a limited lifespan of ~30 plug/unplug cycles, but standard GPU power connectors have a lifespan too (~50 cycles IIRC, might be wrong), and for most people this shouldn't really be a problem. How many times do you plug/unplug your GPU during its lifespan? I'd guess 5-10 tops for most users.
Sounds like improved power efficiency with the new architecture is really the main benefit here unless you go 4090 for Nvidia. Also, I'm hoping we don't see Amazon/Newegg/everyone else selling "new" GPUs from brand names we've never heard of, because of the pressure washed mining GPU die situation. Buy from reputable brand names instead to avoid that problem.
Bro 4060ti should be for 3070