Games :
Black Myth: Wukong - 0:00
CYBERPUNK 2077 - 1:01
Warhammer 40K Space Marine 2 - 2:08
Ghost of Tsushima - 3:04
Star Wars Outlaws - 3:52
Forza Horizon 5 - 4:56
Avatar: Frontiers of Pandora - 5:44
Horizon Forbidden West - 6:32
Starfield - 7:22
Alan Wake 2 - 8:29
System:
Windows 11
Ryzen 7 7800X3D - bit.ly/43e3VxW
MSI MPG X670E CARBON
G.SKILL Trident Z5 RGB 32GB DDR5 6000MHz - bit.ly/3XlBGdU
GeForce RTX 4070 GDDR6 12GB - bit.ly/3TxFng8
GeForce RTX 4070 GDDR6X 12GB - bit.ly/4eq00CV
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
For those who don't understand: the problem isn't performance, it's the price.
GDDR6 is a cheaper component, but the card's price isn't cheaper 😂
but performance is the same 🤷♂️
BINGO! there you go
It's not the same, it's around 4% slower. Go watch Hardware Unboxed's video; Nvidia just pulled another scammy move 😂 @@N5O1
The new 4070 uses 20 Gbps GDDR6. Don't be surprised if it costs more or less the same as the 21 Gbps GDDR6X.
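For context, the bandwidth gap the thread keeps arguing about is simple arithmetic. A quick sketch, assuming the 4070's 192-bit memory bus (which both variants share):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Return peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

gddr6x = bandwidth_gb_s(21, 192)  # original RTX 4070: 21 Gbps GDDR6X -> 504 GB/s
gddr6 = bandwidth_gb_s(20, 192)   # revised RTX 4070: 20 Gbps GDDR6  -> 480 GB/s

deficit_pct = (1 - gddr6 / gddr6x) * 100
print(f"GDDR6X: {gddr6x:.0f} GB/s, GDDR6: {gddr6:.0f} GB/s, deficit: {deficit_pct:.1f}%")
```

A ~4.8% bandwidth deficit showing up as only a 1-3% fps loss in these benchmarks suggests the card is usually core-limited, not bandwidth-limited.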
Wait for the 5000 graphics card
Unnecessary changes
They started making the GDDR6 4070 because there were no more GDDR6X chips; it's not like it was intentional.
@@Livebasura69 In 4K it changes.
@@pedrohenriquedrawanz8803 It's because it's cheaper for them to source and increases their profit margin. Nothing else.
They didn't have a choice, there is a shortage of GDDR6X memory right now. They should have at least priced it like $10 lower ideally, but it's NVIDIA no one should expect them to pull something like that.
It's a scam strategy; they have a lot of GDDR6 memory lying around.
So 1-3 fps less but same price? Fck Nvidia
Truthfully speaking, you aren't going to notice a 1-3 fps difference. Sure, they definitely should have priced it lower since performance is slightly lower, but the difference is so negligible in the majority of titles that it's not that big of a deal.
Disclaimer: *I'm NOT trying to justify NVIDIA's actions, since decreasing performance even by a small amount should ideally warrant a price decrease too*, but I think a lot of people are making this out to be a much bigger deal than it actually is, as if NVIDIA had released a completely new GPU that's 15% worse and priced it the same.
@@auritro3903 You say you're not trying to justify their actions, and then you justify them lol. It should be at least $25 cheaper; that way Nvidia would also compete with the 7800 XT a bit better.
@@Definedd If you need $25 off a GPU for losing 1-3 fps that you can OC and get back, you have other issues.
@@Definedd Last quarter Nvidia's gaming revenue was $2.9 billion. They don't need any price cut to compete better. Unless AMD threatens their GPU sales, they have no reason to make their GPUs cheaper. If using GDDR6 lowers production costs and increases margin for Nvidia, then it's even better for Nvidia.
So they made a worse version of the 4070 and are selling it for the same price, got it.
So basically a 4070 lite for the same price as a better one
And it’s no difference, so it doesn’t matter
1 or 2 FPS either way is not worse in my eyes; that's margin of error in an automated benchmark... I would have guessed there would be a bigger difference with the faster GDDR6X, but as you can see, every game was near enough identical... It totally surprised me, even...
they'll make an 8gb 4070 next
I buy it ♥
Pathetic AMDelusional fangirl 🤡 i like your trashname! I bet you trash pc components are turdzen and trashdeon 🤢
laptop version has 8 gb
🤡🤦@@eduardoixtepan4996
With a 128-bit bus, and still priced over $500.
Couldn't Nvidia slightly overclock these models to compensate for the memory difference? 😅
You can't compensate for a difference in memory bandwidth by overclocking the GPU. You'd have to overclock the RAM, which is a bad idea if you want reliable GPUs.
@@azazelleblack You only have problems with an OC if it's something extreme; 300 MHz or 500 MHz doesn't kill the card.
@@H3li0_ Son, I have been overclocking PCs since before you were born. Don't try to teach your grandma how to suck eggs. The point is that the memory is rated for a specific speed, and board vendors can't overclock the memory beyond what it's rated for without risking stability problems. They don't have the time or manpower to manually test every single graphics card to see what memory clocks it can manage, and if they find parts that fail validation at their rated spec, they can get them replaced or reimbursed by the memory manufacturer. They can't do that if they are selling overclocked cards.
They could, but what difference would that make? I've had numerous Nvidia cards that would clock beyond spec out of the box. When thermals are good and power limits are lax, these cards will often boost beyond their box specs.
You literally could put +750 MHz on the memory clock and +150-200 MHz on the core clock in MSI Afterburner and gain about 5 fps if you're lucky, and yes, that would make up the memory speed... even 2 fps here would pull you even with the GDDR6X card, FPS-wise. But I always say these days, with the power of today's tech, there's just no point in overclocking at all; maybe 10 fps extra when you're already hitting 250 or 300 fps is just silly, really. Now on the other hand, dude, say you still have a GTX 1070 Ti or something around there and you're getting 55 fps in whatever game on low settings... then absolutely, for sure, start overclocking that puppy, send it to the moon, 100% do it. 10 or 15 fps on top of 50 fps is well worth it. And with the new tech, we now have Frame Generation and DLSS if you have a 40-series card, and I tell ya what, it's CRAZY. I get way, way more FPS from those settings than I could ever get from any overclock, period. The other day I tested Alan Wake 2 at native 1080p with everything cranked to max ultra, ray tracing and path tracing and all the goodies on, and I think I was getting maybe 38-40 fps max. But then I turned Frame Gen on, set DLSS to the Quality setting, and I could not tell the difference from native res whatsoever; at some points it somehow even looked better than native. And the most amazing thing was the FPS sitting at or around 100. It was amazing, literally. There's no way I'm ever getting an extra 60 FPS from any overclock on any card out there; it's hard enough to get 6 fps with an OC, never mind 60. Very, very impressive, I say, whether people think these new settings are just gimmicks or fake stuff...
Well, I'm here to say, people, that they are not... They bloody work, and work well at that. Show me a card that can OC its way to an extra 60 fps, and I'll say we live in a simulation, guys, and Joe Biden is a lizard alien 🦎. But hey, sorry to the OP "Andre Freitas" and anybody else reading this for the rant; this is just to clear things up with my real-world, honest testing. I feel better now anyway... GG guys 👍.
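To sanity-check the overclocking claim above with numbers: a minimal sketch, again assuming the 4070's 192-bit bus and a hypothetical memory OC worth about +1.5 Gbps effective (how an Afterburner MHz offset maps to Gbps varies by card and tool, so treat the figures as illustrative):

```python
# Can a memory overclock close the GDDR6 -> GDDR6X bandwidth gap on the 4070?
BUS_WIDTH_BITS = 192  # same bus on both 4070 variants

def bandwidth_gb_s(data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a 192-bit bus."""
    return data_rate_gbps * BUS_WIDTH_BITS / 8

stock_6x = bandwidth_gb_s(21.0)    # stock GDDR6X: 504 GB/s
stock_6 = bandwidth_gb_s(20.0)     # stock GDDR6:  480 GB/s
oc_6 = bandwidth_gb_s(20.0 + 1.5)  # hypothetical OC: 516 GB/s

# The overclocked GDDR6 card would out-bandwidth the stock GDDR6X card
print(oc_6 >= stock_6x)  # True
```

So on paper a modest memory OC more than covers the gap; whether it yields even those 1-2 fps depends on whether the game was bandwidth-bound in the first place.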
The problem isn't the performance drop but the fact that Nvidia is downgrading a card while keeping the same name, and hiding it to make more money off their customers, while their prices are already TOO HIGH.
Nvidia is a pig shit company when you think of their levels of greed...
They made a 3060 Ti with GDDR6X for the same price as a regular 3060 Ti, and nobody cared about that. So now Nvidia does the opposite: since no one cared when they swapped GDDR6 for GDDR6X, they assume that swapping GDDR6X for GDDR6 is okay... and it is, as there's no noticeable difference.
@stixktv__9999 In all honesty, from my observations, I think they knew AMD had beaten them in the heat war: AMD's GDDR6 ran about 15 degrees cooler than all the GPUs with 6X RAM.
So naturally, similar performance from a more efficient system is the way to go; they forgot that for multiple generations, as noted.
I'm considering team red really hard lately.
I hope AMD wins the public in the future with good mid range GPUs
Sure 😂😂😂
Them, or Intel
AMD has confirmed they will only make mid-range GPUs, so there will be no 4090 or 5090 competitor.
I highly doubt it. Dlss is too OP.
Believe it or not, Nvidia fanboys always see it as DLSS Quality vs raw instead of raw vs raw.
That's why gamers have different opinions.
@clem9808 Nowadays DLSS at the Quality setting is the same as native, and it's better than TAA for sure! DLSS is the best antialiasing to me.
An unnecessary step...
They could have made at least 16GB GDDR6
It is necessary if they want to keep selling the 4070. GDDR6X is having supply issues; Nvidia already halted 4070 Ti orders to partners because of this.
NVIDIA does not learn…
Learn what? Nvidia has 90% of the GPU market share; what should they learn?
@@OmnianMIU yeah they do whatever they want!!!!!!!!
But I still prefer NVIDIA over AMD for its technologies and performance.
@FeherViktor-zl8bm Yes, of course, because NVIDIA has the best solution for PC gaming.
@@daemonx867 If we went through the amount of anti consumer shit nvidia has pulled in the last 10 to 15 years you wouldn't even touch an nvidia card ever again
I think a lot of people don't know why this test exists. For everyone saying it's a shame there's no difference: you are actually agreeing with what Nvidia is saying.
Care to elaborate?
@@iseeu-fp9po the rtx 4070 was originally designed with gddr6x memory but a few months ago Nvidia stopped producing them this way and used gddr6 instead. Nvidia claims that this will improve production costs and availability of the card without decreasing performance in any way, so the price remained the same. A few people obviously complained about this but this video shows that Nvidia may be right.
@@danimaio9745 Yes, but it's still an inferior product sold at the same price, even if the difference is very slight. There should at least be a discount. And who knows; there could be newer games down the line where the difference in VRAM bandwidth matters more.
The VRAM temperature reading is only found on the GDDR6X.
For those who don't understand why the new 4070 now has GDDR6 memory:
Nvidia is having problems getting large quantities of GDDR6X chips. It's not Nvidia's fault, but Micron's, as they have far too high a demand for GDDR6X chips and can't keep up.
And yes, the price is still questionable.
The price should be lower; that's the least they could do.
Still more powerful than a PS5 Pro...😂
No shit
I'm amazed they have the balls to ask 800 euros for a Zen 2 APU.
Those were released around 2019, and they say the PS5 has to last until 2028 🤣
@@Deathscythe91 Fr, idk what they're planning.
120 FPS IN 4K
120 FPS IN 8K
Haha, the writing on the PS5 box is very deceptive lol 😂
It will be extremely challenging for GTA 6 to hit 60 fps on the PS5 Pro haha 😂
I am having a lot of problems and I am undecided. What do you think I should do? The parts of the system I will buy are almost the same quality, but one has an RTX 4070 and the other has an RX 7800xt. What do you think I should buy? Their prices are the same. I am someone who wants great graphics in 1080p and 1440p ultra settings to play story games. Which one do you think I should buy?
The RTX 4070, because of its lower power consumption (TDP).
People are actually crying even though the performance is the same. It's 20 Gbps GDDR6, so it's still faster than the GDDR6 that AMD uses.
Does anyone else notice that the 6X has a slightly higher temperature? Maybe it's not as important as the price, but it's curious.
The GDDR6 version not being $50 cheaper shows a lack of good faith by Nvidia.
Nvidia is so going to do this in their 5000 series; I mean skimping on specs while maintaining the targeted performance.
We budget 60-70-tier consumers are done for. 💀
Glad I have my gddr6x. It should last me 7 years at least.
I used to love Nvidia, but I can't justify them anymore. Their prices are just so high.
Smh, they should have just stuck with the standard GDDR6 and kept the GDDR6X for a later model.
When Nvidia launched the 40 series there was lots of GDDR6X in supply; in fact, there was too much even late in the 30-series life cycle. That's why Nvidia ended up "upgrading" the 3060 Ti to GDDR6X later on. Next gen, Nvidia will adopt GDDR7, so Micron, as the sole supplier of GDDR6X, is most likely limiting production so they don't overproduce and end up with lots of unsold memory when the 50 series starts rolling out. This is how it's supposed to be... until Chinese companies started hoarding GDDR6X to mod Nvidia 4090s and 4080s with more memory.
Let's Budlite this one people!
Make a video about the FPS meter too: how to turn this FPS overlay on for free.
MSI Afterburner
If there was virtually no difference in performance anyway why even use GDDR6X to begin with?
Idk about that; the 0.1% lows are losing up to 15 fps in some cases. Seems GDDR6X gives more stable frames.
GDDR7 will make a significant difference.
How much is the price difference between those two GPUs?
Same price!
The bigger difference here is temperature. The GDDR6 runs noticeably cooler, and performance is the same.
Where's the memory temperature readout?
AMD's GDDR6 runs at 45 degrees. Nvidia is all types of faeces bagged as one.
NVIDIA is a joke 💀
It's a question of price). If the 4070 GDDR6 were about $100 cheaper, then it'd be fine.
If you refuse to buy the GDDR6 version and only ask for the GDDR6X, they will have to drop the price (as they should) of the non-X version.
If you let them get away with this "little thing, no big deal," they are testing us for future scumbaggery.
No reason to drop the price when sales are extremely good. Even if the GDDR6 is cheaper, AIBs can use that as a means of improving margin.
thanks again nvidia ❤
This thing should cost max. $400.
classic nvidia move
"Nvidia tries not to scam thier audience IMPOSSIBLE CHALLENGE"
Ohhh god, the performance doesn't change!
The human eye can only see the 'X' in the 'GDDR6X' name.
Basically, despite all the "tech channels" making a big deal (for clicks) about the move to GDDR6 VRAM, there is no real difference in performance.
Of course GDDR6X is a scam; the fps difference is just the clock speed, 20 Gbps vs 21 Gbps. A 22 Gbps GDDR6 would beat it. GDDR6X is just marketing to fool you.
The problem is that they released the non-X version at the same price as the X. At least at MSRP.
@epicnicity916 And? Same performance. What's your point, AMDelusional fangirls?
Do you want to talk about the 6800 XT = 7800 XT? 🤡
@@OmnianMIU Don't break your arm jerking yourself off for trillion-dollar companies that are literally ripping off customers.
@@OmnianMIU You don't gotta kiss Nvidia's ass bruh they ain't gonna pay you🤣
@EX0007 ok, but stop crying AMDelusional fangirl 😂
The difference really shows in PUBG, which isn't in this test...
that is a really fast memory there 🥱🥱
For the 4070 the memory isn't the main thing; it's limited by voltage, a bit too little...///
Limited by what voltage? Maybe you mean capacity? As for voltage, you can already fry eggs on them. How much higher do you want to go?
IDC nvidia either give me the same overpriced product or reduce the cost!!
Same price, comparable or less performance !
Good stuff Nvidia, good stuff !
' The more you buy GDDR6, the more you save '
4080 Super = 4080, just like 4070 GDDR6X = 4070 GDDR6.
Do this test again in 5 years' time with the games of that time.
It won't matter, despite the different memory type the specs are almost the same.
@@lharsay Yes, you are right, just like 1000 MHz RAM is not much different from 4000 MHz RAM. Check the YouTube channels on that.
it's 21Gbps memory vs 20Gbps memory here
Waste of Time!
Waste of Money!
Waste of Tech!
They should have given the new 4070 GDDR7 instead, so that people don't get screwed over...
not really
1 fps less isn't really a big thing; a mild OC already beats that difference.
Why would they do that and have to redesign a mid-tier card when the 50 series is coming out in like a couple of months?
@@pedrohenriquedrawanz8803 Then you can also OC the regular model too? Stop trying to defend this; it should be at least $15-25 cheaper.
@@Definedd I'm not defending it; I'm just saying it isn't really something to be worried about. It wasn't done because they want to steal your money; it's because there are no more GDDR6X chips.
4069
oh nvidia🥴
Tell me about it... how can anyone stay an Nvidia fan after so many scams on their part?
Just a 1-3% difference.
cheaper memory at same price stoncks
Another ryzen 4070 benchmark
DDR6? wtf?!
3% better. Same price. Thanks for the free service.
They downgraded it, not upgraded it, so it's 3% worse for the same price.
Not a big deal, still better bandwidth than what they originally planned for a 4070 10GB.
The same price, with worse specs and worse performance. Nvidia, wtf.
Hi, I am a fan of yours.
I think with this it's more than confirmed that the X memory doesn't really matter after all, and it's just an excuse for Nvidia to charge more for the cards. After all, none of the AMD cards have X memory, I believe.
The problem is that they're selling the non-X for more than the X. At least at MSRP.
@@epicnicity916 jesus for real? Man, fuck nvidia
Ngreedia😂
AMDelusional fanboy?? 🤭
How much AMD paid you for this comment?
Ryzen 4070 🥵👌💪
1%
Strangely, it seems that the core is bottlenecking the card, not the actual RAM.
@@fabiotiburzi The core bottlenecking the card?? What the hell are you saying?? As if there's nothing else doing the thinking in a GPU 🤡
Point one: learn to write properly and without insulting people. Point two: you can see perfectly well that the card loses only about 1 fps on average in every game, which makes me think the chip is the bottleneck, not the RAM.
@@fabiotiburzi Unlike you, an ignorant person talking out of his a$$, the educated and well-researched GPU experts who have made videos and articles on both of these 4070s determined that it IS the GDDR6 memory that is the issue.
@@fabiotiburzi Makes you think based on what theory, pardon me? Do you even know what you're talking about? Do you know what a video card is made of?
@@OmnianMIU Probably better than you.
same shit
lmaooo wtf is this?????????????????????????????????? WHY NVIDIA?????????????????????
😂😂😂😂😂😂
Fun fact: this is not the first time Nvidia has done this.
They did the same with the GTX 1650 GDDR5 and GT 1030 DDR4 versions.
Absolute scam.
The GTX 1650 was upgraded to GDDR6 for the same price; that was a win for the consumer, not a scam lol.
A scam.
What has changed? 😂
X
2% is not so much.
GDDR6X is a scam 🤣
GDDR6*
@jtk9259 Guys, you are really ignorant. GDDR6X was the old solution! Get some information before commenting.
You guys can complain all you want. No one is forcing you to buy the GDDR6 version, and the 6X is still available. And the 50-series is on its way. Besides, the 4070 is near end of life; why buy one now?
it won't matter because OEMs will pick up these GDDR6 versions in droves and pile them into X-Mas specials with an i5-14400f.
What?🤣
First person
0 fps difference 😁😁
There is no difference.
On top of all that, GDDR6 is a reliable board component, unlike the 6X 😂