Are you stupid? They mean the reviews of the actual card because now those ones are more likely to make someone accidentally pay money for this scam. You know. Like he said IN THE VIDEO. At least unlisting them until this is fixed would be a good idea, that way anyone searching for GT 1030 is going to be more likely to find THIS video and not the reviews of the original.
When you mentioned the GDDR4 era, it would be interesting to look at the “first chips” from each gen of GDDR when GDDR6 cards launch. Or even just at GDDR4 itself which was only on a handful of cards iirc
One of the best UA-cam channels and hardware sites on the internet. Good job once again on doing tests nobody else has the balls for. There was no other possibility than giving this card a devastating conclusion. Nvidia, wtf were you thinking again?
It is hardly a new practice, since it has been done ever since the MX400. In fact, the 1030 was an oddity among the low-end Nvidia cards for only having a GDDR version. The worst offender in that regard was the GT 640, which had 5 versions with the same name.
I have a GT 1030 and I am currently using it as my 2070 Super is being "repaired" by Nvidia. Had to do a fresh install of drivers because all games would crash upon launch.
I don't even understand why you are trying to do this. Certainly someone there or someone at Nvidia knows the difference between a reduced instruction set and a complex instruction set. Nothing else needs to be understood. Except for can you screw a customer better than we can screw a customer?
The only good things about the DDR4 model are that it only consumes 20W instead of the GDDR5 model's 30W and that it's still better than an integrated GPU, but when it comes to video games it's a horror compared to GDDR5; the games are very slow. (Tested with a Gigabyte GT 1030 OC GDDR5 and an MSI Aero ITX GT 1030 OC DDR4.)
I picked a Zotac one up last year as a console display card, but upon seeing its potential I used it in my workstation (where a GTX 1080 is a co-processor) and was surprised that I was able to get more than acceptable performance for my rare gaming occasions (WC3, WoW, SC, HotS, etc.). And people ask why I hate nVidia more than I hate RTG...
Glad you covered this. They should have called it GT1020 or some other shit. Crafty knob heads. They know what they're doing. They've been doing it since the days of DDR2 / 3 video memory. The GT1030 GDDR5 variants are bloody awesome cards. Such low power and small cards that can play so many games quite nicely.
This reminds me of the GeForce4 MX. It didn't have programmable shaders like the GeForce3 and was basically a rebrand of GeForce2 chips. It had no business being called what it was.
5:52 Hey! That's my old CPU! Had it paired with a RX 580 because I got fired before I could buy my original choice of CPU...which was a 7700k :D Just cool to see GN using hardware I'm familiar with.
Did you try running the Ashes of the Singularity benchmark? I got the same message on Intel HD integrated graphics, just had to click "OK" and it ran the benchmark anyway, at about 12 FPS.
I believe they were aiming this at the Home Cinema type PC for your digital library in your living room that never does anything else but play movies...
I like how Steve keeps tossing products every now and then. It keeps viewer engagement high, as it's not an everyday thing, so we stay put waiting for the next one ;D
I had a GT 1030 with the GDDR5. Overclocked it sooo much, over 2GHz, and the memory was also overclocked. Nearly killed it when I tried 2.3GHz. Best I got was 2.1, and I forget what I had the RAM at. With better luck on my GT 1030 I think I could have gone to 2.3GHz.
I'm more interested in looking forward to GN and Buildzoid doing a VRM Analysis this Winter 2018 Season on the Turing 1180's from EVGA and Galax etc. I stop GPU considerations for low end PC's at the 1060 on Pascal.
Nice PSA. Bought two used 1030s for my GF & her roommate, who just needs something stationary that can run PS & some light games, in conjunction with a used HP small form factor (i7-3770 & 8 gigs of DDR3 RAM). I made damn sure that what I got was the GDDR5 ones.
You'd also like our video where we 'review' the NVIDIA NDA: ua-cam.com/video/7SXmkk_yVMU/v-deo.html
Support GN with a Modmat purchase! store.gamersnexus.net/products/modmat
Article for more info on why DDR4 vs. GDDR5 matters: www.gamersnexus.net/hwreviews/3330-gt-1030-ddr4-vs-gt-1030-gddr5-benchmark-worst-graphics-card-2018
Gamers Nexus they should have named it a GT1029
You forgot to mention: the GT1030 DDR4 version can also only use 4 PCIe lanes, regardless of the width of the connector, so even when it has to dip into system memory it'll be bandwidth limited compared to the old version.
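To put rough numbers on that bandwidth gap, here is a minimal sketch using the commonly listed peak specs (64-bit bus at 6 Gbps effective for the GDDR5 card, 64-bit at 2100 MT/s for the DDR4 card, and roughly 0.985 GB/s of usable bandwidth per PCIe 3.0 lane); these figures are assumed from public spec sheets, not measured here.

```python
# Rough theoretical bandwidth comparison for the two GT 1030 variants and a
# PCIe 3.0 x4 link. Figures are commonly listed peak specs, not measurements.

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s from bus width (bits) and transfer rate (MT/s)."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

gddr5_1030 = mem_bandwidth_gbs(64, 6000)  # 64-bit bus @ 6 Gbps effective -> 48.0 GB/s
ddr4_1030  = mem_bandwidth_gbs(64, 2100)  # 64-bit bus @ 2100 MT/s        -> 16.8 GB/s
pcie3_x4   = 4 * 0.985                    # ~0.985 GB/s usable per PCIe 3.0 lane -> ~3.9 GB/s

print(f"GT 1030 GDDR5 VRAM: {gddr5_1030:.1f} GB/s")
print(f"GT 1030 DDR4 VRAM : {ddr4_1030:.1f} GB/s")
print(f"PCIe 3.0 x4 link  : {pcie3_x4:.1f} GB/s (spillover to system RAM)")
```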
How does this POC perform against gen 8 Intel integrated graphics?
Just...why?
Akin to the GT730 GK108 (96 cores) vs GK208 (384 cores), except this one doesn't have more cores and it lacks the 4GB of memory, unlike the GK208. lol.
Given the lack of 4GB for CUDA tasks, the "GTX1030" (grin) never looked any better than much older cards that pulled less power just to get the framebuffer out of system memory (like the ATI 9250 or 8600GT), and thus never looked good at all to me. For ultra-low-watt CUDA, 4GB is kind of a minimum that only the GK208 has. It really isn't any better than most iGPUs for low-watt tasks, and those also lack CUDA, lol.
Great vid GN crew. B)
"We want more transparency for the gamers" -nvidia
What's more transparent than naming all your products the same thing?!
No transparency!! That effect is way too expensive for DDR4!
Gamers Nexus I mean they could take the exact same parts and give them a new name to increase sales, oh wait they already have.
How much control does Nvidia have over their board partners? Nvidia just sells them the GPU... it's up to them to assemble it with RAM that doesn't suck. Would GPP have prevented this?
Titan X naming is worse
This is why I love GN. You guys don't take crap from anyone, and you give the most useful benchmarks of any channel on YT. I'm gonna check out your patreon this month.
I subbed to him because I love seeing the numbers and not just puffery. Gonna end up buying one of those modmats to directly support him!
Thanks for the support, guys!
Yeah me too. Legit trustworthy content, especially about cases, I've gotta support this!
Patreon is worth it just for access to the Discord chatroom alone (I mean, and the whole supporting-an-excellent-channel thing).
any channel on youtube? ....................
“I don’t know what the ad will be but we’ll figure it out.” *Runs ad for GN*
Green Man Those Ads are the best ones. And from a most trustful sponsor!
Those mod mats are very well made. The visual layout, with all the measurements and scales on them, says they put time into their product.
Green Man modmat ftw
should've run the GIF ad again
Can you imagine if they tried this with a 1080ti? No, because they know those consumers would know the difference and they'd be crucified. Proves even more that this was done with intent to deceive unknowing customers. I wonder if they're installing these in pre-built computers from other manufacturers? That would be even more egregious.
Pando I don't think they did it with the INTENT to deceive consumers, but to cut down on costs
They're getting crucified here right now, as they deserve.
David Whitfield it isn't confusing anyone watching this video or GN. The very fact we're watching this says we're at least interested enough in the field to know better. How about your grandparents, or the 90% of the general public who go to Best Buy because they need a computer or part? You think they know how to decipher the difference between models? Someone told them the 1030 was a decent card, so they got it. Or they bought a pre-built that had a 1030 installed. You expect them to know the difference between DDR4 and GDDR5 in the first place? You're looking at this from your perspective and fail to acknowledge that you're not, nor are any of us, the average consumer.
I agree with this comment
It is ABSOLUTELY ridiculous since there will be people who buy this card for light gaming with the expectation of getting SIMILAR performance to online benchmarks.
The ONLY ACCEPTABLE way to handle this would be to rename the card to reflect the performance such as GT1020 and have only one variant of that so benchmarks are consistent.
David Whitfield,
You do get that lower-performing cards target the very same market segment who are relatively uninformed? I can think of many scenarios where someone would buy this card EXPECTING the GT1030 experience and not simply as an addon card for more displays or whatever.
Marketing deception is never right. Also, the fact that there is only a 10% or so price difference suggests they KNOW people will get ripped off.
Would YOU buy a card for $90 with DDR4 and half the performance rather than a similar model for $10 more with over 2x the performance?
NO.
It should be called the GT1020 or similar hands down.
I don't know how anybody could defend this as being anything other than deceptive practices. And they wonder why consumers don't give them the benefit of the doubt.
Bit like AMD and their 560. Both are invidious ("nVidia's", sorry, couldn't resist the pun) practices. For instance, they (nVidia) should have called the GTX 1060 3GB just the GT 1060, as that gives it a different categorisation, and similarly this card should be, as Steve suggested, called the GT 1020. I have no problem with that whatsoever.
With the caveat that the second bad 560 was supposed to be a special OEM edition for Chinese internet cafes and not for the end-user market. I think it just ended up on some end-user cards because of all the mining stuff that was going on. But still a stupid thing for them to do... they could have added an obvious suffix to it and no one would have cared.
I think the 1060 and 560s are more egregious than this. On either of those you couldn't tell a difference; they are just internally capped. The 1060 only says 3 or 6GB, and the 560 tells you absolutely nothing. This 1030, on the other hand, will say "DDR4" right on the box or in an online listing. You have a pretty clear indicator that it's different.
Like this: www.techpowerup.com/img/rpj7gKMKNANEvvtC.jpg
or this: www.techpowerup.com/img/2vcO0lqHZPTb7t4m.jpg
There is one thing that really pisses me off though: doesn't the USA have any equivalent of our (UK) Trade Descriptions Act, which is legislation, or some Trading Standards Office? If ANY company pulled that sort of shit in the UK/Europe they would be penalised heavily.
The cards are available in Portugal, and all over Europe as well if you look them up, so I don't think they're getting fined. www.gigabyte.com/uk/Graphics-Card/GV-N1030D4-2GL#kf
Scamming low information, low budget buyers. Disgusting!
Sad, low energy card! Nvidia keeps sending its bad ones!
Is the price cheaper?
@Derek Jonez in the video Steve says it is only $10 cheaper.
It's sad because the people buying these for gaming are usually young and work their asses off just to get scammed.
Oxaile I was one of these; fortunately the DDR4 didn't exist yet and I got GDDR5. I changed it for a 1050 Ti later.
Next: gt 1030 with 1.5GB+0.5GB DDR4 memory
2014 flashbacks :O
Plot twist: the GPU is actually a GT *740*
Feels bad man
Oh, you mean 1.5GB DDR4 + 0.5GB DDR3 so it qualifies as a 1030 2GB version
More like 1.5GB of DDR4 + 0.5 GB of DDR3
Keeping the same name with such a difference in performance is just shady as f*ck.
Almeida they did the same with the gt730, same name yet one has 384 cuda cores when the other has 96
According to nvidia that's "transparency"
Hallison Michel if only they had decent high end gpu's. Rocking an r9 390 atm nothing to upgrade to
Nice to tell us that captain obvious.
Imagine buying a cheese bread only to find out that it's honey
the ddr4 version is basically a scam
Yeah, honestly they need to be sued for a name change. Just have a bloody judge force them to change the name...
I actually just got a 1080ti++ in my toothpaste the other day, it functions sorta like Intel HD graphics, but instead of having toothpaste on the chip, you put the chip in the toothpaste! Mucho good thermals!
Ziv Zulander and seriously, most people don't buy GPUs based on the memory name anyway. Just like the 1060 3GB and 6GB: the box just says 3GB or 6GB, and the actual product was cut down.
and think about how they are almost the same price.
loomnati noscopers Should be sued by consumer protection agencies on every continent
Why not just name it the gt 1020 -_-
It'd still suck, but we'd be OK with it if it had a new name. It's that clone naming that's truly problematic. It's one thing to be a bad product -- that's passable. It's another to be a deceptive product!
The regular GT 1030 needs to be the GT 1040, the lower-specced 1050 needs to be the 1040 Ti, then nVidia needs to change the name of the 1060 (6GB) to 1060 Ti.
If they do all of this they will have GT 1030, 1040, 1040 Ti, 1050, 1050 Ti, 1060, 1060 Ti, 1070, 1070 Ti, 1080, 1080 Ti.
I was thinking of the GT 1030 S(low)
tintinaus lol
Ziv Zulander
I'm L
"The way it's meant to be played"
"They played us like a damn fiddle."
The way you're meant to be played*
Damn!
What the hell, Nvidia
This is still better than MOST of their behavior. The entire company (ie the people) is morally bankrupt.
TDR REVENGE and the GTX 970!!!
Rapidly growing third-world markets without much of a used-device market can absorb a LOT of low-end stuff, and yes, those GPUs. And while Nvidia might be making their $10-$15 profit per chip (not too bad), the board partners would outright be making a killing, since they save a good chunk on that RAM. That might end up being their largest profit source this year for all I know.
Besides it doesn't even bloody matter. Just one person who bought a useless and slow unit thinking they're getting one that's twice as fast - for no fault of their own - is one person too many. And it's not going to be just one person. Such vendor behaviour is not to be tolerated.
What the hell, Nvidia marketing department; the low TDP is a big selling point and use case.
This card is a CSM card (corporate stable model), hence the DDR4; it's used for home theaters, server interfaces, network consoles. It uses 50% less power than the GDDR5 card and still supports 8K @ 60Hz, with HDMI 2.0b and HDCP 2.2.
It's literally perfect for a home theater PC or server. What other 20W card supports 8K at 60Hz????
Wow... I figured it wouldn't be quite as good as the GDDR5 version, but I didn't expect it to be THAT bad. I agree it's total BS, and a 1020 designation would be very appropriate. 1010 or something
I almost bought the DDR4 GT 1030, but got suspicious because of the price.
Gt 1010 gddr5 is gt 1030 ddr4
@@farhanabsar7905 I have a few of the DDR4 ones, but I literally only use them as known-good cards for checking whether an issue is a particular GPU, or for installing an OS or basic server graphics: places where it makes no difference.
So this makes AMD's Raven Ridge APUs all the more impressive.
Yeah, they get the performance they do using system memory over the Infinity Fabric!
O!Technology AMD has voodoo magiks up their sleeves.
Not really. I mean, of course they outperform that... abomination. The G5 card is a pile of useless trash, but this one... hah. Seriously, the Intel HD 630 on the CPU will perform just as terribly.
This thing gives Intel HD Graphics a run for their money.
Funny to think that your system RAM could be faster than the freaking GPU's VRAM.
Plenus it is, but Intel still has the better performance.
If you compare the i5 8400 with the Ryzen 5 1600 you get better gaming performance on the Intel chip even though its multiplier is locked and it has no Hyper-Threading, compared to the R5 1600.
I want to see a "Reviewed by Gamers Nexus" sticker on product pages. If a company gets a positive review here they should be proud of it and show it off.
KAT Erwhall Might be a good idea. Just like they did with the GN Glass Cube Awards at CES. I'm still loving GN for that "Least amount of RGB crap at booth"-Award they gave Noctua.
Genuinely would take note of such a sticker. Would just ultimately lead to GN selling out though. Money talks.
Money talks, but BS walks.
@@emlyndewar eh, I doubt they'd sell out. They could have many times, but they're for truth and honesty.
Remember guys: The GPP was established to eliminate confusion between GeForce graphics cards
That's right, we messed up and made it go away.
"we just wanted gamers to know what they are buying"
Incorrect
Wow, they murdered that poor GPU :( That is some really shady stuff. How many of these low-end graphics cards are sold compared to the more upper-level ones, let's say 1060 and up?
The 3GB 1060 also has more downgrades than just half the RAM.
What I liked about AMD cards the most was more VRAM for a lot cheaper than comparable Nvidia cards. Of course mining and memory price fixing has made it irrelevant.
While a 1060 3GB was great on launch it loses value a lot quicker than the 6GB version because new games tend to quickly ramp up VRAM requirements with each passing year. My old HD6770 remained relevant for a lot longer than some comparable Nvidia cards that had 512 or 768MB of VRAM instead of a whole GB.
Toms Tech you dumb, boi
Shame on you Nvidia.
When I first got into PC gaming in 2009, even after researching what I didn't know as a teen, I still didn't understand the impact of memory bandwidth. I almost bought a GDDR2 version of the 8600GT when they had a GDDR3 version. If it wasn't for the retailer explaining it to me I would've suffered back then. Can't believe that 9 years later this same-name thing for different memory tech hasn't changed, and I blame the lack of focus on low-end products by reviewers tbh, cuz that's where it's always happening silently, whether the cards are popular or not. It needs to stop.
1030 DDR4 vs Intel integrated graphics?
Should be pretty equal, I mean if i remember correctly Intel's HD630 was roughly half of a Ryzen 2200G
I can't speak for the DDR4 version, but I've seen comparison videos of the GDDR5 1030 vs the Intel HD 630 iGPU (just searched here on YT). The 1030 actually provides quite a nice upgrade for "casual" games like DotA or CS:GO, even GTA V on medium settings around 720p. Thinking about getting one to put in the computer my niece uses. She loves to play Sims 4 and it'd be a nice improvement for her.
the DDR4 card is basically using system-RAM-class memory, so it's going to be similar to an iGPU...
GT1030 D4 = Nvidia HD graphics 1000 ⚆ _ ⚆
As mentioned in this video the GDDR5 version is fine, but if you accidentally buy the DDR4 version you're basically fucked, since it's absolute garbage.
The pause for the ashes of the singularity benchmark was awesome
Don't forget the GT 730, where it's not just the memory and clock speed changes that are barely mentioned, if at all; the CUDA core count is also changed, and I didn't find anything on my box. The average 730 has about 384 CUDA cores, and mine, a low-end one that I thought was the same, has 96...
Edit: it's also important to note that even a GT 710 has more CUDA cores than my 730...
That sounds more like a faulty product or an AIB that does some crazy shit.
Crippling a product to sit below other products in your lineup that it is supposed to be above is something that I would say not even Nvidia is shady enough to do.
Melvin Klein nope, it's intended, and I found out through some digging on EVGA's store page, looking for the specific variation I have: GT 730 4GB DDR3. EDIT: I found out on EVGA's site, yet I'm using an MSI card, so it's more than just a single AIB.
But the problem is that sure, the memory is openly stated, but the core change isn't. I bought it used thinking 4GB, that's pretty good; then I found it's DDR3 and I was like, oh OK, so it's still not trash. But what really caused me to go digging is when I looked up GT 730 benchmarks and found out that the GT 730 2GB GDDR5 was kicking my card's ass.
You have the GT 730 version with the Fermi architecture and not the Kepler one. The GT 710 is Kepler architecture, and that's why in this case it has more CUDA cores available.
Heinz Traub and why isn’t this stated clearly? Lol, either way only reason I bought a gt730 in 2018 is because my 1060 is gone (long story) and I have a Ryzen processor
It is the responsibility of the AIB to print it on the box. If you take a look at Nvidia's website it shows all available versions, and at the top it says that you should check with the manufacturer which version they are shipping.
This whole mess is so easily avoided. Green and Blue have been acting really drunk lately.
Competition is sorely needed now that Nvidia think their crap doesn't stink. Best wishes to AMD, and to Intel's new GPU push.
Well, it's not only Intel and Nvidia, AMD does weird branding on cropped versions too. It's a bad marketing practice they all do.
Vierax, multiple examples pls or it does not count
Just look at how Polaris is a mess: the RX 500 series is a rebranding, and if you buy an RX 550 or 560 you have to make sure it's the complete chip and not a Lite version with stream processors deactivated.
And if you look at CPUs, Ryzen is the beginning of AMD copying Intel's shitty artificial segmentation with chipset limitations (OC, Crossfire) instead of the promised SoC model, and they added the Platform Security Processor, their own Intel ME.
The chipset limitations aren't artificial to my knowledge. Got any info on that?
EDIT: The chipset being limited for OC isn't a chipset problem; it's the VRMs on cheaper boards not being able to deliver. Different thing. As for Crossfire, that's supported on the chipset. You're thinking of SLI, which requires NVIDIA certification and adds to the cost.
I was wondering if you were gonna do this comparison. THANK YOU SO MUCH!!!!
They should have named it the GT 420 because who knows what they were smoking
I would say crack..
but the gt 420 was like, actually not that bad
gt heroin30
Isn't this the very definition of bait and switch? They have marketed the 1030's performance for ages, and now they sell what consumers would consider a minor revision, but it is in reality half the performance at a 10% discount.
Just GT is fine. Gigantic Turd.
Chris Hexx I don’t know. It looks like the gpu is about the size of a slightly larger than average turd.
The sltat-1030
BlueLightning Films it’s a steaming pile regardless 😀
NVIDIA shall name it the GTX 1090 Ti.
Nah it doesn't overheat enough
I just checked my local Microcenter. They have four 1030s: three real ones and one of the DDR4s. The DDR4 is the most expensive one! WTF!
Well it's the newest model, so it must be the most expensive! And the better value! And the better performer!
An even better scam. The implication is that the new revision is somehow better than the old. Even the sales people might be fooled. I shop at MC all the time. Not all their employees are up to snuff on what they're selling.
Take down older videos covering the real 1030, no. Update them with a warning, and a link to this video, yes.
16:24 or if Intel released a DUAL CORE i7 and slapped say a U on the end of the model number, Oh wait they already did that for YEARS.
Well, at least the U ones were super low wattage (17W or less) compared to the 28W and 45W i7's
thats just MORE of an argument to NOT call them an i7
Thanks for the tests GN.
This is actually a good comparison between DDR4 & GDDR5.
Now I realize just how far we've come.
So the GTX 970 had 0.5 GB of DDR4 attached?
Oh fuck off I have a GTX970, it's a good card. This one isn't.
Yeah, but people act as if GTX970 was somehow crippled when it was not, and nowadays if you search hard enough, you probably could find a used GTX970 for the same price as a new GT1030 (the "good" one).
GTX970 is a good card if you want to experience stutter simulator.
it is crippled by its VRAM, but far less than this poor card. Also better than AMD's modern stuff (VEGA)?
LOLWUT? It's a perfectly fine card somewhere between GTX1060 3G and 6G.
Damn, glad I came across this video. I was looking for a 1030 to boost the HD 530 graphics on the very modest Veriton N4640G in my living room; a very good candidate with its 35W power draw. I had no idea some of them were equipped with DDR4. I see them now and they cost the same as the ones with GDDR5. What a sucky move.
inb4 every single card in the upcoming Turing arch will be named just "GeForce GTX"
Who needs specifications? Just look at the price, surely the most expensive one will be the best value!
Keep up the good work. One of the reasons I subbed to your channel is your attitude towards the computer hardware market. True, direct, and down to business. No BS taken or given. No paid reviews. No fear of loss, unlike those big channels with small balls. Great stuff. Of course this won't earn you more subs, but it earns respect. Linus or any other big channel would not risk losing sponsorship (which is completely natural), be it from Intel or Nvidia or whoever. Big channels earn money, it's basically business for them, but GN echoes us consumers, spreading awareness and helping tackle the monopoly that every now and then clouds the computer hardware market.
I feel like the DDR4 variant was really meant for OEMs, so that they can boast about having a GT 1030 while also cutting costs and screwing over consumers (who admittedly don't read the descriptions carefully) into thinking they're getting the real deal.
C2Lception that could be a scenario. Or maybe Nvidia had a bunch of extra stock of extremely low-end GPUs, slapped DDR4 on them, and called it quits. Either way it's sleazy as all hell.
True, they all love their cash. Proprietary crap PSUs? Just so you can buy another overpriced one from them again! Proprietary motherboards? Right! Now you can't switch to a better power supply/system! The next one angers me the most: Oh, we'll advertise it's an i5/i7, except we'll use the cheap low power cut down ones that they use in laptops! Who cares if the clocks are slashed and core count halved? The consumer will never notice!
Oh, I can already imagine HP and the like creaming themselves over this chance to sell a subpar product in the name of saving a few bucks.
C2Lception HP is specifically guilty of this. I recently tore down one of those shitty HP desktops, I looked at the board and wondered wtf they were thinking with it. All the cables from the front panel connectors, all proprietary except for the Sata data cable (surprised they didn't find a way to make that proprietary). Uses one of their AC adapter power bricks to power the damn thing. The screws to keep the motherboard mounted were all proprietary. I'm just thinking what the hell, cause there's a spot where they could have put an actual PSU like it's clearly a spot where a PSU would be mounted. Also there's only this tiny little exhaust fan with zero front intake.
LordIron ye, depends on the socket (1156 is relatively free of this problem), but dell, lenovo, and acer prebuilts all use annoying proprietary connectors
Order more Modmat's cuz idk who is left to sponsor u guys. Love you guys
I'm pretty sure we can file a lawsuit for this.
This is a scam, false advertising.
you can buy a car that normally comes with a 2.4L engine, but you can also buy that same car with a smaller 2.0L for less. It's not false advertising if they put a tiny little different number somewhere on the box. But yeah still pretty shady if you ask me.
Nasty N8, you gave an example pulled out of your arse. As you mentioned, with cars you know what engine you're buying. With 1030s you do not know if you're buying the DDR4 version or the GDDR5, therefore it is false advertising, or at least it is not fully transparent.
WarmSoftKitty It did say DDR4 on the box, but it didn't say that the DDR4 is like 50-90% slower while only costing $10 less than the original. If it cost $30 less, then the customer would at least figure out that the DDR4 version must be much slower than the GDDR5 version. It also didn't say that it has a lower clock speed, and yet they still called it a GT 1030.
We could file a lawsuit for the GTX 1060 3GB too. The GTX 970 lawsuit was successful, just remember that.
noName yet "It did say DDR4 in the box but it didn't describe that the DDR4 is like 50-90% slower." They are not obligated to do that. Are you upset that they don't advertise the performance difference between a GTX 1050 Ti and a GTX 1070? What about the performance difference between versions of cards that have the same GPU but different amounts of video RAM?
Is this card crappy? Yes. Does that mean it shouldn't exist or that Nvidia should be prosecuted? No to both.
This sort of review is why this channel is so good. In-depth, unbiased, with no nonsense.
The title is great; "disgrace" is the most fitting word for this. Even if this were a "1020", the performance would be too low!
Shoulda called it the GT 10AVOIDATALLCOSTS
Should call it potato, which it is.
I'm starting to realise that generally when a GN video's sponsor space is taken up by the mod mat, that usually means the video's going to be spicy or otherwise controversial in some way. That's a good thing.
Imagine VW would sell clean diesels that aren't clean... oh wait...
Also: just how much difference can there even be in manufacturing costs between 2GB of GDDR5 and DDR4? It must be kind of noticeable to even warrant setting up the production/development of such a failure... then again, maybe it's going to be popular with the prebuilt PCs...
I just fell victim to this, but it's for a PC that will simply be playing one game in the first place, nothing modern, texture heavy, etc. So it is a good fit, especially since BF/CM dropped it below $50 USD and a low-profile card was needed. It works out in the end, but I had no idea that's what the D4 meant, and I could've sworn it said GDDR5 in the specs on Amazon; beyond that it absolutely looks like a stupid serial number. It works for my needs, but it's also semi-disappointing to see, if not for the fact that I found something in my budget/spec range anyway. Awesome video btw!
To avoid "getting punished" for mistaking the two variants, there's an easy way.. *Don't buy a 1030 under any circumstance* Problem solved. Let NVidia be " *punished* " instead. They just did this with the 1060 to a somewhat lesser degree, but every bit as bad.
I just dodged a bullet. I am watching this on an ATI/AMD Radeon HD 7500 Series card and just ordered an MSI GT 1030 2GH LP OC.
You can get a GTX 750 Ti from AliExpress for ~ 60 bucks, which is faster than the 1030 GDDR5.
Many people bought the 1030 because it's quiet and doesn't need a power connector... But if all you want is pure performance then go for it.
Putting the same name on different cards that have a huge difference in performance? That's a certified NVIDIA moment.
NVIDIA helped the RAM shortage issue by using DDR4
I bought a GT 1030 GDDR5, but I heard crap about the card that scared me; then I realized it was the DDR4 that people were bashing. Thanks Gamers Nexus for enlightening me.
lol. An AMD 2200G has better performance than a stock 1030 DDR4? Haha. Nvidia should be sued. BAIT AND SWITCH!
Yes. It is better. 2200G is much closer to 1030 GDDR5 than DDR4
I'm definitely with KAT Erwhall: there should be an official award for computer manufacturers to use stating they survived Gamers Nexus testing and reviews with a positive outcome. There's not a tech review entity out there that can claim to be better than GN. Maybe one or two on roughly equal footing, but none *BETTER* than y'all. You guys are the only UA-cam channel I know of that'll give it to us raw and unabashedly true, with manufacturers' feelings not remotely a consideration. You guys are definitely on the side of the customer base, and nobody out there could EVER credibly call GN shills for any company. Keep up the great work Steve, and we'll keep loving you for it.
#askgn
What happened to meltdown, Spectre, Spectre v2, the Google released "Variant 4" exploit, etc. ?
1. Are they all fixed now?
2. How much performance impact did the microcode patches cause?
3. Is there a way, besides using an older BIOS and Windows version, to disable the patches so you get extra performance for benchmarks?
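On question 3, a minimal sketch, assuming a Linux box: the kernel reports per-vulnerability mitigation status under /sys/devices/system/cpu/vulnerabilities, and recent kernels accept a mitigations=off boot parameter for benchmarking at your own risk (on Windows, tools like GRC's InSpectre toggle the documented registry overrides). The snippet below only reads the reported status; it does not change anything.

```python
# Minimal sketch: print the Linux kernel's reported CPU vulnerability /
# mitigation status (Meltdown, Spectre v1/v2, Spectre v4 / SSB, etc.).
# Read-only; requires a kernel new enough to expose this sysfs directory.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    print(f"{entry.name:24s} {entry.read_text().strip()}")
```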
Tagging along to this... what I want to see now, after half a year, is whether Intel still has the IPC advantage over AMD post all the patches. I don't think they are in the lead anymore.
On our servers it took quite a hit... for example, RAID arrays lost about 25% speed and some calculations take longer. Was curious whether the benchmarks have changed since the initial Ryzen launch.
Love seeing self sponsored content. Makes the videos even more trustworthy. Excellent video. Great work as usual.
or if my notebook is an i5 but only has 2 cores... yeah, I fell for this one before...
Well, you have to look up the specs of the CPU to know for sure. The i7-8550U can be found in fairly inexpensive laptops, and has 4 cores, 8 threads, a base clock of 1.8 GHz, and turbos up to 4 GHz. A friend's Surface Pro 4 from a couple of years back has a 2-core/4-thread "i7" though.
No sht lol. I bought a 7th-gen U-series i5 that has 2 cores and 4 threads. Then they released 8th gen a month later. Still pissed to this day.
I guess it has HT on both cores so it's not that bad, but an i5 SHOULD have 4 cores.
Thanks for the upload. I'm so glad someone keeps it real on UA-cam!
did any of the past "different products with the same names" gpus have anywhere near this much difference in performance? Getting 50% of the performance for a product with the same name is... stupid.
I owned two versions of the GT 730. One had DDR3 memory, the other had GDDR5 memory. I ran benchmarks and there was a substantial difference. For my purposes, the DDR3 version worked and was passively cooled so I kept it and sold the GDDR5 version. I never even considered giving Nvidia any grief over it.
I own the original 1030. It's OK, but these SCAMMERS can't get away with this low-life trick.
but they will
I just got scammed 😅. I bought a 1030 on eBay to upgrade an old HD 5770. The title said GDDR5, but I just noticed from the model number in the pictures that it's the DDR4 version. It's still an upgrade and has much lower power consumption... but still.
A bad rating for the seller is guaranteed
Wow, this is just a disgrace... This is literally a scam. Most people will never know the difference and will choose the slightly cheaper GPU and get HALF the performance.
Remember when Nvidia wanted gamers to "know what they are buying" in case someone reads NVIDIA instead of AMD on the box?
Just recently subscribed to your channel and I love your videos, I really do. It's great: no-BS tech news, reviews, and honest opinions.
What is even more amazing is that, with all this crap and these schemes from Nvidia... people still support them. Now that is the most amazing thing.
kong xiong After 20 years of supporting Nvidia, that's it for me. The 1060 was my last GPU from them, and now I'm waiting for AMD to release something new; I will be very happy to switch teams.
Because AMD doesn't have any good GPUs to compete at the higher end. This product is a scam, but their other, higher-end cards are better than AMD's counterparts, and AMD has big issues with cost and power consumption. A bad product doesn't mean shit if almost all of their other products are better than AMD's.
This card is a CSM card (corporate stable model), hence the DDR4; it's used for home theaters, server interfaces, and network consoles. It uses 50% less power than the GDDR5 card, and it still supports 8K @ 60Hz with HDMI 2.0b and HDCP 2.2.
It's not marketed to gamers.
It's literally perfect for a home theater PC or server. What other 20W card supports 8K at 60Hz????
1060 3GB, 1060 6GB
1030 D4, 1030 GD5
4080 12GB, 4080 16GB
Nvidia, the way you're meant to be played.
I don't think this review is to Nvidia's benefit, better call the lawyer ;)
The NDA only covers confidential information. This is public knowledge obtained through their own testing.
Some people are so stupid it's not even worth commenting...
Cpt_Wolf Some people can't take a fucking joke, it would seem ;)
Now this thing is $100. The world is messed up.
And they sell most of these GT 1030 cards to teens who don't have money. It's just wrong...
I'm no stranger to memory clocks and just about every other specification, but I admit I did not know about this crap. I ordered a home PC with an Athlon 200GE and a Gigabyte GT 1030, and I did not know there are two versions. Fortunately, my new desktop has the GDDR5 version. Thanks for this benchmark video.
NGREEDIA needs a class action
I'm one of those who bought the GT 1030 DDR4 version before I knew this GPU had DDR4 and GDDR5 versions. The seller didn't even inform me, and I didn't notice. Money wasted!
Slower than the GT750 in my old laptop from 2013 🤦♂️
Very well written and presented. His intonation really improved listenability, and he would do very well on a business news channel.
Scumvidia
Yup, AMD has never done shit like that .... wait ... they have! More than once ...
Gamer Catolico What's wrong with the 1060 3GB, huh?
@@longvo8800 It's cut down in more ways than just VRAM compared to the 6GB version.
Thanks, Steve. Hopefully people will see this video and steer clear of this mess when they're doing their research.
It would be smart to take down your gt1030 reviews.
HilaKleiner Because the old reviews were probably talking about the GDDR5 version and will not be accurate for the DDR4 version. The reviews will not be negative enough.
Are you stupid? They mean the reviews of the actual card because now those ones are more likely to make someone accidentally pay money for this scam. You know. Like he said IN THE VIDEO.
At least unlisting them until this is fixed would be a good idea, that way anyone searching for GT 1030 is going to be more likely to find THIS video and not the reviews of the original.
It's nice to see someone pointing things out to help those with a tighter budget. Good video
NVidia to change the name to GT 1030 FUQU4G which of course means FUlly QUalified 4 Gaming.
DoctorWho8675309 Or FUQ U 4 Gaming.
When you mentioned the GDDR4 era, it would be interesting to look at the “first chips” from each gen of GDDR when GDDR6 cards launch.
Or even just at GDDR4 itself which was only on a handful of cards iirc
See how the new NDA forces reviewers to make positive reviews of all Nvidia products?! This is just the beginning!!!
It doesn't. GN did a video on this very thing with an actual lawyer.
woosh
TalesOfWar Just didn’t feel like switching on your sarcasm detector today?
I can't believe GN is saying so many good things about this new card; it really does go to show you the NDA was true.
One of the best UA-cam channels and hardware sites on the internet.
Good job once again on doing tests nobody else got the balls for.
There was no other possibility than giving this card a devastating conclusion.
Nvidia, WTF were you thinking again?
16:57 was that the graphics card version of a mic drop?
It is hardly a new practice, since it has been done ever since the MX400.
In fact, the 1030 was an oddity among low-end Nvidia cards for only having a GDDR5 version.
The worst offender in that regard was the GT 640, which had 5 versions with the same name.
I have a GT 1030 and I am currently using it while my 2070 Super is being "repaired" by Nvidia. Had to do a fresh install of drivers because all games would crash on launch.
Thanks for publishing this! NVidia REEALLY needs to be called out publicly and LOUDLY for this bullshit.
I don't even understand why you are trying to do this. Certainly someone there or someone at Nvidia knows the difference between a reduced instruction set and a complex instruction set. Nothing else needs to be understood. Except for can you screw a customer better than we can screw a customer?
The only good things about the DDR4 model are that it consumes only 20W instead of the GDDR5 model's 30W, and that it's still better than an integrated GPU. But when it comes to video games it's a horror; compared to GDDR5, the games are very slow. (Tested with a Gigabyte GT 1030 OC GDDR5 and an MSI Aero ITX GT 1030 OC DDR4.)
I picked up a Zotac one last year as a console display card, but upon seeing its potential I used it in my workstation (where a GTX 1080 is a co-processor) and was surprised to get more than acceptable performance for my rare gaming occasions (WC3, WoW, SC, HotS, etc.).
And people ask why I hate nVidia more than I hate RTG...
Glad you covered this. They should have called it GT1020 or some other shit. Crafty knob heads. They know what they're doing. They've been doing it since the days of DDR2 / 3 video memory. The GT1030 GDDR5 variants are bloody awesome cards. Such low power and small cards that can play so many games quite nicely.
This reminds me of the GeForce4 MX. It didn't have programmable shaders like the GeForce3 and was basically a rebrand of GeForce2 chips. It had no business being called what it was.
even as a tester card I'm not picking one up! Great vid!!!
5:52
Hey! That's my old CPU! Had it paired with a RX 580 because I got fired before I could buy my original choice of CPU...which was a 7700k :D Just cool to see GN using hardware I'm familiar with.
Did you try running the Ashes of the Singularity benchmark? I got the same message on Intel HD integrated graphics, just had to click "OK" and it ran the benchmark anyway, at about 12 FPS.
I believe they were aiming this at the Home Cinema type PC for your digital library in your living room that never does anything else but play movies...
But the price is almost the same for those two.
Yeah, that's true. Way too expensive. But we haven't really seen Nvidia selling something cheap...
When are we gonna get new GPUs... and normal RAM prices? I wanna upgrade this year.
Great vid 👍.
Subscribed
I like how Steve keeps tossing products every now and then. It keeps viewer engagement high, as it's not an everyday thing, so we stay put waiting for the next one ;D
Good video. I hope the message reaches those interested in these cards.
I had a GT 1030 with the GDDR5. Overclocked it sooo much: over 2GHz, and the memory was also overclocked. Nearly killed it when I tried 2.3GHz. The best I got was 2.1, and I forget what I had the RAM at. I'm sure with better luck on my GT 1030 I could have gone to 2.3GHz.
I'm more interested in looking forward to GN and Buildzoid doing a VRM analysis this winter 2018 season on the Turing 1180s from EVGA, Galax, etc. I stop GPU considerations for low-end PCs at the 1060 on Pascal.
Great video, can't help cringing with every "2 ecks" instead of "2 times" but maybe that's just my own stuff I gotta work through.
Nice PSA. Bought two used 1030s for my GF and her roommate, who just need something stationary that can run PS and some light games, in conjunction with some used HP small-form-factor machines (i7-3770 & 8 GB of DDR3 RAM). I made damn sure that what I got were the GDDR5 ones.