An Nvidia GT 1010 is like a 3-legged horse. Very rare, but also very useless.
This card should have a remark: "For MS Office only"
what if the legs are in a triangle formation
@@lordmike9331 🤣
So, what you're saying is the card is meant for the recycling yard
@@lordmike9331 What kind of triangle?
THE DRAGON ON THE BOX WAS NOT EVEN RENDERED BY THAT GPU.
Maybe it was and it just took a loooooong time
@@rubeusvombatusI think pre-orders are still waiting for theirs to render.
@@imacg5 it'd be smart to, and I'm sure some companies do that.
@@rubeusvombatus not enough VRAM
i've seen a few GPU boxes before that use killzone art lol
I like how the box says GDDR4 despite the last GPU using it coming out in 2007
Real
Just like manufacturers mentioned GDDR2 on many cards, yet only a few cards actually used it.
@Leboobs22 yeah it doesn’t work like that, that would require a whole different motherboard
Not quite. Some GT1030 cards had GDDR4.
@@haydenw8691 That’s just DDR4, not GDDR4, which was barely used as it wasn’t much faster than GDDR3 and GDDR5 came out a year after
The card that was so pointless that Nvidia tried to hide it
well yeah it sucks compared to integrated graphics which is a joke in and of itself😭😭
It was so useless it saved many an old PC with a broken iGPU from becoming e-waste.
@@arenzricodexd4409 meh you wasted your money should have got a gt 1030😭
Try the GT 705 with 48 CUDA cores, it's over 5 times slower. 😂
@@raven4k998 if your only need is to drive a monitor it doesn't matter if you got a GT1010 or GT1030. For the most part just get whatever is cheaper. At this point it is silly to try to play games even on a proper GT1030 that uses GDDR5.
GT 1010: ❌
GT Ben Ten: ✅
save your money for the gt1030 it's the better option basically by a lot
@@raven4k998 well they go like this: GT1010 DDR4 < GT1030 DDR4 < GT1010 GDDR5 < GT1030 GDDR5. There is a 1010 that is better than a 1030, so you have to specify the GDDR5 1030
@@Mr28d23😂😂😂 this guy 😎🥰🥰
@@raven4k998 No, save your money for an ACTUALLY GOOD GPU. Not crappy display cards.
save up for an rtx or amd xt series card, these are cheap nowadays @@raven4k998
Nvidia originally released its GP108 as an entry-level desktop and notebook GPU in 2017, and since it never produced a low-end Turing GPU, it continued to manufacture the chip. Then to address demand from its OEM customers, Nvidia quietly released its GeForce GT 1010 based on a cut-down GP108 early last year.
What's that you say? This was a product with a specific purpose which fit a requirement from OEMs to produce inexpensive display adapters for their systems and it was never available at retail?
Clearly, that means it should be compared against the 4090 so we can all lambast how much it "sucks" in comparison.
Early last year?!! They released a new 10 series card?
@@tim3172 "Content"!
Kinda sad he got the GDDR4 version, and not the GDDR5 version (which even beats the DDR4 GT 1030!)
i had no idea that gddr4 was a thing. like all i knew was gddr3,5 and 6
GDDR5 versions seem to be OEM only in certain regions. The only commercial release seems to be DDR4. But then again there is very little reliable info on these cards out there.
It’s ddr4, not gddr4, despite what the box says
Gddr4 is ancient history at this point
@@johnsalamii Some AMD (or ATI) cards had GDDR4. But by the time 4 came out, it was already being replaced by 5, that's why it's so rare.
@@johnsalamii it was a thing for a very short period of time
It was AMD only. They invented it, but it didn’t have much gains over GDDR3, and was too expensive.
GDDR5 quickly came out and made it redundant.
Nvidia has never made a gddr4 gpu.
This gpu, and the gt 1030 variant, are ddr4
wait, you’re saying you’ve never seen the nvidia control panel lag? That’s impossible.
Exactly what I was thinking. I've never seen it not be laggy.
Every single Nvidia card I've had, 640, 760, 1050 Ti, 2060, 3060, all of them have lagged exactly like that in the control panel.
I've never once had the nvidia control panel lag either, interesting.
As an AMD and Intel Arc GPU user, I also never experienced Nvidia control panel lag.
I have a 3070ti, and the lag/delay it takes to show menus specifically in the Nvidia Control Panel is literally the same as in the video. I think it's just a control panel thing
Is this thing from 2004? Because the box art is certainly in the 2004 school of box art design.
to be fair, a GT 1010 in a 2004 retro rig seems like a perfect fit provided your board has pcie
@@oggilein1Drivers.
With the performance it has, probably yes
Nvidia GF FX 5xxx flashbacks
Lower cards from Colorful always have this design, props to them, they really know most of these cards will likely perform like they came out in the early 2000s 😂
It seems like this was never meant to be a commercial release to begin with.
"That's why Nvidia only made it available to OEMs who needed to serve business customers that sometimes demand standalone graphics cards in their desktops."
Interesting it did of course lol.
Ignorant businessmen who think "discrete = good" or genuine concerns?
@@timotheatae cards like these are really just display adapters. They let you connect more screens to the computer, they're not really intended to do anything more than that.
@@timotheatae Things like these serve a purpose. I have a server I put together with commodity AM4 parts, but since the CPU I have in it has no integrated graphics I still need something to output a console for my PiKVM, so a PCI 1x version of the GT710 fits that bill perfectly. It also leaves the primary x16 slot free to put a real GPU in to pass through to virtual machines.
Yeah I wonder if it was for things like casino/gambling machines - they tend to like having lots of high resolution displays connected to them but don't always need to render in super high quality. They're one of the biggest markets for the Radeon Pro series too.
@@timotheatae companies buy these in bulk for dirt cheap, usually for older systems that have even crappier graphics. ATM machines, cash registers, arcade machines, you name it.
as an example, the Pump It Up arcade machines use GeForce 9300GS and GT210 cards, sometimes the GT710.
they have a special OS written only for specific graphics and sound chips, and since it's a rhythm game there is no need for anything fancy.
if it had Windows XP drivers, it woulda been an energy-efficient Windows XP retro gamer's dream
i use a ddr2 gt220 and it's a perfect xp card
I doubt Nvidia would write Windows XP drivers for Pascal for what is essentially a bunch of leftover dies not good enough to become even a GT1030.
@@Δημήτρης-θ7θ I mean, the GTX 960 has a driver for XP, so I don't see why they couldn't give the 1010/1030 an XP driver as well
@@Sithhy the GTX 900 series came out the same year XP was still officially supported by Microsoft...
Not really. It still would be slow and couldn't max out games at 1080p ultra. It doesn't even launch Crysis at all. My old 650 Ti is a superior "low power" XP card.
It’s a great Saturday when I hear that smooth music and a random forgotten graphics card. Can’t wait to see it “in the benchmarks” 😊😊😊
Be prepared to buy a GTX 6010/6005 for $2000 when AMD goes out of biz (which seems to be where they are headed with their GPU division). Unfortunately InHell hired a fake engineer from AMD thinking he can actually produce GPUs, so there is no competition now in the market.
that is worse than your igpu🤣🤣🤣🤣🤣
Unfortunately, if you search for the brand new GT 1030 4GB on well-known shopping websites (mentioned in the video), you will find that some models are advertised as the 'GDDR4' version. MSI is one of those brands, but when you look up the exact model on MSI's official website, it clearly states that it uses DDR4. It seems like this might just be a marketing tactic from the vendors.
Why are people still looking for the almost decade-old GT1030? At this point those that want a GT1030 will only need it to drive a display. Those models with DDR4 should not be an issue.
@@arenzricodexd4409 here are two reasons from my personal perspective: 1. It does not need external power (TDP is around 30W). 2. It is excellent as a media decoder card, with full support for H.264, H.265, and VP9. VP9 is the main codec used by YouTube right now. Regarding the DDR4 version, I think it is just an NVIDIA scam (lower cost), since the DDR4 version only reaches half the performance of the GDDR5. As you mentioned, most people don't care about DDR4. Alas, they are paying almost the same amount of money as for the GDDR5 version.
It is still false marketing
Nope
@@arenzricodexd4409 Two reasons, in my opinion. 1. No external power is needed (TDP 30W). 2. NVDEC 3rd Gen (H264, H265 & VP9). The DDR4 is only half the performance of the GDDR5 version. Yet, they cost almost the same price. NVIDIA has been doing this practice for quite a long time. Yet, for most consumers, as you mentioned, they won't notice anything.
Someone recently submitted a score on FurMark 2 using one of these.
It got a score of 91.
For reference, an RTX 4090 gets around 30,000. XD
It’s probably more affordable to buy 330 of these cards than it is to buy an rtx 4090
@@n7x yeah but have fun getting them all to work nicely with each other
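(For context: that 330 figure presumably comes from the FurMark scores above, 30,000 ÷ 91 ≈ 330, so on paper you'd need roughly 330 of these cards to match a single 4090.)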
2:22 The HDMI port can supply 300mA at 5V. Normally that is supposed to be used to detect a powered-off monitor.
But some dongles use it to convert between outputs
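(Quick sanity check, assuming that 300 mA figure is right: 5 V × 0.3 A ≈ 1.5 W available from the connector, which is comfortably enough to run the conversion chip in a simple active HDMI-to-VGA dongle.)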
the hdmi port on my 1060 can easily power the composite and rf converters i frequently plug into it.
The 1030 already was something else. But at least it had a relatively modern chip and could run HEVC and so on. So that is great.
But a 1010? Just how much can you cut from a 1030 to make it a 1010???
It has 256 CUDA cores, the same amount as the Switch. Lol
Chip manufacturers don't want to discard large volumes of faulty chips, it's far too expensive. They find the level at which many of those faulty chips will function, and disable certain parts and/or downclock them so they can be used. Slap them on a standard PCB and sell them as a reduced cost or OEM card, thereby minimising the financial losses. It's a very common practice, and emerging or less affluent markets are keen to take them.
maybe it's a 9000 or even 8000 repackaged as a 1010
a friend of mine genuinely has a 1030 because whoever he bought the prebuilt from has severe mental issues, he has 3200 corsair ram that wasn't set to 3200 and a cpu that's better than mine but a 1030 because reasons
no the pc has not been upgraded from stock.....
@@onegrumpyboi2914 Hopefully he can find a decent GPU. GTX 1080s are quite cheap these days and also powerful. Or if he wants to go high-end, maybe the 6900 XT or something.
Maybe in 5-10 years you'll end up reviewing the bottom tier display adapter card they released this year (once it doesn't cost a ton anymore lol) - the RTX A400, which has a cut-down version of the same core that the RTX 3050 has, but with just 4 GB of VRAM, a 64-bit bus and less than half the shader cores.
Sounds like a good card to go head to head with the RX6400.
@@MyNameIsBucket well it is kinda nvidia's equivalent to the W6400 which is the "pro" version of the RX6400. They're really just meant for adding a bunch of video outputs to professional and industrial computers that have higher stability demands while rx 6400, rx 6300 etc are for adding outputs to office computers and similar. Maybe the gt 1010 was also aimed at that like it's quadro P400 cousin but they had a surplus or something.
The new *2*_10_
The 210 may be better for its intended job
@@BudgetBuildsOfficial i think internet got too heavy for it
"A means to connect a monitor to a machine without iGPU"
Hi Daniel fancy seeing you here. I watch your videos about old graphics cards.
At least the 210 was dirty cheap, this thing is as expensive as the gddr5 gt 1030, and is like 2x slower.
I suddenly feel better about my 64mb MX440
I can't understand his hate for the MX440. Sure, the naming was terrible with GF4 in the name and the lack of shader support, but the cards were cheap at the time, and to many of us who didn't have money for the first real accelerators like a GF2 or 3dfx V2 they gave something to play older titles at a decent framerate. At that time a new MX440 was still cheaper than a used GF2 and ran faster with more VRAM, it's not like every second-hand market is a healthy one.
I have a GeForce FX5200 in my Pentium 4 PGA478 system, and it is barely capable of running Windows Aero effects at 1920x1200.
@@gorky_vk that is a fair take, you could pick them up fairly easily for a tenner, but I was able to pick up a used Ti4200 for 50 quid as soon as I could. trying to play bf1942 on an MX440 was PAIN!
@@arnislacis9064 The cards were made with DX9 support - find an ATI 9500-9600 or 600 card and frames are double - the Nvidia 4200-4600 are for the older bus, but I guess a 4800 could do fine - the ATI 9800 and Nvidia 5700 can't be run on a default 300W power supply - so you have options like even an MX440, GeForce 3, or Nvidia 6200, just some old knowledge I gathered back in the day
I played NFSMW until my MX440 fried. I loved that card!
6:00 did you render that little ps1 looking rotating gpu using a GT1010 ?
Was thinking the same thing lol
Yes, thank god he did
At least the GT 210 was used by OEMs as a display adapter. Had a GT1030 for an emulation PC, which in turn was upgraded to a 1050 Ti.
The 1030 doesn't work so great for emulation... the DDR4 one at least, as far as I'm concerned
Production runs of chips that produce a high proportion of faulty, sub standard parts are an expensive write off, so they simply disable the relevant parts and sell them as more budget oriented or OEM cards, especially in the less affluent markets. Cut down cards are exceptionally common, they're often just mid and high range chips that didn't make the grade.
@@ZERARCHIVE2023 There’s also the GDDR5 GT 1030. Much better. Still not a good card at all, or a card made for gaming, but you can certainly make it do it with not the worst results ever recorded.
@@archgirl GDDR5 can run some games at 30FPS "stable"
Where DDR4 would get to... 5/10fps maximum ?
OC included ?
7600X's igpu is far and wide stronger.
X8SSAA SMS emulation : 30 locked.
Same settings ?
10 fps if I'm lucky 😂 with my 1030.
And my 7600X can still boost if I take away the limitations to 100fps+ so... yeah...
@@ZERARCHIVE2023 I was just pointing out that the GDDR5 version of the 1030 exists and is better than the DDR4 version you mentioned. I never said anyone should buy or use one, just that it’s a faster version of the card.
Of course a newer chip designed for gaming is likely to beat a 7 year old GPU that wasn’t designed or released for gaming.
My friend, I know you know what you're doing, but just a thought, please keep in mind that scammers flash modified BIOSes which may be making it appear as a 1010 when it's not.
Nope this is a GT1010 unfortunately.
@BudgetBuildsOfficial thank you for responding. I'm currently watching the video, so I had that thought during, not after. China always gets weird stuff. My personal favorite is how they have laptop CPUs in desktop sockets.
@BudgetBuildsOfficial I just thought of this & thought you may find it funny. You said "No it's a GT1010 unfortunately" and I thought he's probably wishing it was anything else.
That was my first thought until he ran GPU-Z. GPU-Z is really good at catching scam cards with a modified BIOS.
@@MoultrieGeek Thank you for taking the time to share your thoughts! I honestly do greatly appreciate your time & input! 😇
0:47 I wouldn't say rarest. What about the GeForce GTX 340?
since when was nvidia rebranding chrysler LA block engines?
The what now?
We have fallen so far from the light, lads.
you thought the igpu was the worst you could buy new you thought wrong dude🫵
I think I need a new hobby, maybe anime. 😑
@@raven4k998 If I recall correctly, all new AMD CPUs come with an iGPU. From the 7000 series onwards. And most Intel chips too.
And they're decently capable, too.
My point being that it's harder to avoid iGPUs than specifically buy one, these days. To my great dislike, because now I'll always have to double check that things are using my dedicated GPU, if I buy a new CPU.
And try to explain how to do that, and what it means to people I build PCs for... is pain.
@@wikwayerNo, don't do it! Don't go into the weeb cave, little German boy!
apparently there is a GDDR5 version of the 1010 and it's faster than the DDR4 version of the 1030 lmao
Looks to be OEM only from my findings. All commercial releases are DDR4. But then again there is virtually 0 rock solid info on these
one of my most anticipated videos of yours since I heard the card was announced!
also, when are the 6400 and 6500 XT getting put through....the benchmarks??
You know what other terrible card has a terrible dragon on the box? The Red Dragon RX 580. Stay away from ugly dragon graphics cards yall
ngl i actually like the tacky box art on that one. i have an affinity for old school edgy "gamer" branding
I like my RX 580 though…
@@_LGD Well I hated mine, the fans were super cheap and rattly, and combined with the stupid power consumption of the 580 (because AMD just had to make the 480 WORSE) it was louder than my vacuum cleaner most of the time.
Also this was back in 2020, which meant Radeon software and Drivers being so bad that i just jumped to team green... pascal just beats polaris in every single regard
@@AveragePootis ah, I think mines a revised card because it has “pulse” branding, I haven’t noticed its noise at all but drivers are still a bit wonky at times
msi I presume?
I think you're the best channel for all the more rare and obscure budget GPUs.
You called the MX440 terrible, turns out there is a 12.5 watt version and a 28 watt version. The 28 watt version can achieve 30-40fps in BG3 at 1080p low, with undervolting.
It's basically a gtx 1640.
Just in case anyone was serious about wondering about the question at 2:15, HDMI has a 5V rail with just enough power to feed a basic circuit like that
I have lag in the Nvidia control panel under manage 3D settings on a (full TDP) mobile 3070, pretty sure I had the same issue with a desktop 1060 (even on a fresh install, DDU, NVCleaninstall, etc), and have seen other people complain about the same thing. As bad as this card is, I do not think the NVCP lagging is its fault, the software is really really bad
Good video otherwise tho, don't mean to be negative
nvidia control panel has sucked for years which is why the nvidia app is better
Same issue with a desktop GTX 1660 Ti.
yup, have had this with every nvidia card ive ever tried, which is at least 5 at this point
i swear i learn about a new gpu every time a video comes out on this channel..
Pascal was great from the x50 series to the x80Ti. But they were so ashamed that they even removed the X from the GTX name.
that moment you buy a gt 1010 and realize you made the largest mistake of your life😱😱
I'd say the GT 1030 (or at least its GDDR5 version) was great as well, for its purpose.
this channel is the most relaxing to watch on yt idk y, .. love it!!.
Could you fix your discord link?
I always love the Lupin III anime series theme in the background. One of my favorite shows ever
18:52 but there is evidence that G-Sync works. See, GT cards didn't get G-Sync support until Nvidia added it for the GT 1030 (and 1010 when it came out) in 2019. I've fully tested my LP GT 1030 GDDR5 with G-Sync and let me tell you, Halo 2 cartographer runs like a dream in 3440x1440p120-144hz, all thanks to Displayport 1.4a. But yeah GT series never had NVENC at all. Look up GPU capabilities on the W!k!.
I can imagine some people buying this, or the GDDR5 version, if it had been released in 2017-2018 for around 50 bucks
If it was released around that year it would have cost around 30 to 40
@@astroidexadam5976 that's like GT710 tier back then so should cost more.
I meant most people don't buy that for gaming but for display output when non-G Ryzen and Intel F series don't have integrated graphics.
Uploaded 30s ago, I feel blessed
This graphics card is crazy. The fact that modern day integrated graphics like the 760M or 780M by AMD beat this thing is insane, and shows how beefy they've become throughout the years compared to before. It would be troubling if we somehow got an RTX 3040 or heck, an RTX 5020.
that's because this gpu is horribly obsolete, mostly due to the abysmal performance of gddr4
22:35, Dude, Transport Tycoon Deluxe soundtrack :D.
I didn't even know this existed.
Got a GT 1030 low profile for my HP ThinClient to get more modern codec support for web and streaming as well as 4K60 display out.
The built-in CPU is still kinda slow but the GT 1030 greatly helps with stuff.
Have been using this for some time as an HTPC, but currently it's collecting dust.
Where's the GT1020 available at for that _mid-range_ option?
At 1:37
Which i7?
You said you don't know which i7 it's supposed to be paired with....so I checked.
The card came out in Q1 of 2021, according to GPU-L. Which means that it is firmly in the i7-11700 range since they also came out in Q1 of 2021, according to CPU-L.
Now, there are six of those, only counting desktop variants:
11700
11700B
11700F
11700K
11700KF
11700T
ironically, Dapz, as in that Dapz, was the one that first got his hands on one.
yeah i was reading the title and was thinking "this gpu sounds familiar to me" and i googled it and got reminded he existed
Love your content mate glad you are back to regular uploads , keep up the excellent work
Would be a great GPU for retro gaming, or a HTPC
Retro gaming was going to be an entire section, but unfortunately with no 32bit support or the ability to run its own adapter, it’s kinda redundant. Same with the HTPC side as it’s too expensive to be better than the alternatives
@@BudgetBuildsOfficial I found one of these in a green bin but left it as there was a 980 and took that instead, if I could time travel I'd have taken this and eBayed it
Hey! I am not subscribed to you and I have never seen your channel, however this video was in my notifications!!??? How is that possible 😂 I think YouTube loves this video 😂
Well I hope you enjoyed the video
@BudgetBuildsOfficial Loved it! Right now watching some of your other videos 🙌😊
Nvidia had the chance to make the 2020 and didn't 😅
I know you are the ‘Budget’ build channel but could I recommend the Corsair 680x case to you? They aren’t cheap but not that expensive used on eBay at around £90-100. The reason I’m recommending it is that it has a hinged glass side panel, so you can open it in 2 seconds flat, rather than tackling four thumb screws every time you want to change some hardware.
Thanks to getting next to no sleep after my night shift, energy drinks and dwindling sanity, I'm finally awake to catch my manz at upload time once again 🙏
In 2008 I didn't have the budget for an 8800 GT, so I bought an Ati HD 3870 with GDDR4 and it wasn't bad at 1280x1024. I played through Crysis on it without complaint.
I actually remember hearing the news about the GT1010 around 2021-ish. I think the original point of it was to be a cut down GT1030 (duh), but to be incredibly cheap in a time when the chip shortage was in full swing. If that’s the case, the card probably didn’t have a lot of development time just to have SOMETHING out. Since I think even GT1030s were being scalped.
I’m guessing they didn’t actually see a reason to release the GT1010 en masse once the chip shortage and GPU prices calmed down, so they didn’t make the GT1010 in large quantities and didn’t sell them outside of Southeast Asia for some reason.
During the pandemic, every GPU was being scalped. Even the GT210. 😂 With the cost of producing wafers and creating chips, manufacturers don't like throwing away bad batches of chips if they can actually be made to work. Disable the bits that don't work and try to sell them as budget or OEM cards, at least they get some money back.
I was _really_ hoping you'd pull the cooler off so we could all stare in awe at the presumably _absolutely tiny_ sliver of silicon under it, but alas no such luck.
It always makes me chuckle hearing you talk in £s, as a fellow Brit, it is nice to hear someone talk in "real" money, without me having to convert.
love the videos keep it up, but what is that ram configuration
Now I need an RTX 2010, 3010 and 4010
Given they drop the X for these cards, the RT branding would seem fitting. The 3050 6GB is effectively the RT3030, and the 2050 is probably better named the RT3020, given it is a cut-down mobile 3050.
When it comes to lag on the nvidia control panel. I have a 3060ti and it still lags like that. I have no idea if it's even fixable, but I usually only use it after I reinstall Windows, so it doesn't really bother me that much.
Power management mode set to optimal performance does actually very little for performance as it wont clock the card higher then before but in idle will eat much more energy as the gpu will never clocks down when not in use. Its a pretty bad setting and best left on optimal power.
depends.
I remember, on I think it was my last card, that if I put it on optimal/power saving some games would not really run too well,
because the system would always try and clock down the GPU.
the game only used like 10% of my GPU for 60ish FPS, but as soon as the system clocked it down the FPS dropped to like 10FPS because some more advanced GPU tricks get disabled.
*won't
*than
*clock
*It's
@@GrainGrown Lovely people who are not capable of adding to the conversation but only begin about the form.
@@0MeALot0 You're welcome, kiddo.
This must cost like... dozens of cents per year in power.
Great review! Thank you for the effort!
I got a 1030 ddr4 as an upgrade after a 710 ddr3, for a Fuji Futro S920 build, and it was much better, and the 40W power supply was still stable. 😄 Crysis? The demo ran on both, under Win7 64bit.
I got a Futro S940, as an upgrade, and I use my old laptop's power supply, so I will get a gddr5 1030, eventually...
And everything was, is, and will be passive, since there are 1030s with passive coolers. 🙂
Compatible with Windows 7 32-bit. Like all the best things.
Actually im diggin the box art big time. What a find! Awesome
I bought a 2020 from China, sadly it gave my computer a virus
Huh
skill issues-_-
Great videos, I immediately subbed because of the accent cheers!!!
The vibe I get from this card is an updated version of an older card with some newer hardware and bios and drivers for modern support
5:53 i love this 3D icon thing, its really cool, how do you do it?
Check out our channel artist @chunkyboinoodles
Manufactured ewaste. At least it is too uncommon to trick manny people
It's *literally* produced to *AVOID* ewaste. The failed 1030 GPUs are binned and resold.
If they didn't do this, they'd just throw the parts out.
It's not useless just because it's not a not a GaMiNg card.
Manny = male nanny, BTW.
@@tim3172 but they could still pair it with more vram. You save the silicon but waste the ram and PCB.
I understand why low grade silicon is sold. The scummy practice Nvidia is still doing is holding the silicon back further to incentivise you to get the higher tier card
what a beast! i sent some GPU companies messages about them needing more mascots and not just random-ass black boring boxes, we want monsters
and to be fair every single nvidia GPU i've had including a 4080S lagged horribly in the driver screen.
GT1030 DDR4 users rejoice!!!
The lag in the nvidia control panel showcased at 5:18 is my complete experience with nvidia cards, one of them being a 1050 4gb in a laptop I own.
Been comparing the specs to the Vega IGP in my R5 5600G... 🤣🤣🤣
I mean these days it's a bad sign if any modern graphics card CAN'T run the OG Crysis :(
I hope one day you manage to find the GDDR5 one, it looks a lot cooler.
I literally have one of these sitting in my garage, had no clue what gpu it was til the video came out so thank ya
Lagging in the Nvidia control panel isn't normal? Both my 650 and my 1060 always did that.
Honestly impressed how well it was rendering pixels on the screen
Curious if Lossless Scaling can save the overall experience. Or work with the older cards.
There are only a few cards that should have GDDR4... The Radeon 1950xt, HD 3850/70 and 2600xt
Oh also... Remember the weird R7 450 you did a video on? I think I found a tweak that'll help! Go into the driver configuration where you select "gaming" or "compute" mode. Pick "compute". Then load up MSI afterburner and max the power limit to whatever it'll give. Then try running some benchmarks again!
HDMI is capable of transmitting a VGA signal. It's just that your monitor has to be able to receive that signal through a HDMI port.
Most can't.
This is absolutely not true. It's an active HDMI to VGA adapter using +5V from the HDMI plug.
4:10, why'd you have 3 different ram sticks? Looks cool, but it doesn't help stability, does it?
So today I put together an old HP ProDesk 400 G5 machine with an i7, 16GB RAM, and an HP-branded GeForce GT730 2GB, and it's not tooooo bad.. Given you're clearly experienced, what would you say is "the best" low profile single width card you can still get? RX550 4GB? Thank you.
Haven't watched the rest of the video, but 2:54
Is this a sticker placed over the box?
Yes
Didn’t know that I needed a video to review the gt1010 when I have a rtx 3060 but here I am
This reminds me of a video by _Dawid Does Tech Stuff_ , where he completely removed the cooler of a GT1030 to see if the card would still work, and the card actually never managed to overheat, even after some overclocking ! XD
Given the fact that this GT1010 is even worse than the 1030, I'm sure you can remove the cooler and the card won't even guess something is different 😂
I am kinda curious, how does the GT1010 compare with the nearly equivalent Quadro P400? They have the same CUDA core count and close enough clocks. The Quadro has NVENC and GDDR vs DDR for vram. Oh and the P400 has a full x16 link vs x4.
Forgot to mention that somehow my Quadro P400 seemed to perform better on PCIe Gen2 x8 than my GT1030.
True to form, you have classically displayed the card as you have done in the past near a bird bath in your garden. Perhaps it would have been more appropriate to display this one on the brim of a toilet?
Hey Mr. Builds, I found a GT 120 in a box recently but I believe it's broken. Would you want anything to do with it?
is the card power limited so much that even desktop tasks are sucking the power away from the vga adaptor?
I think it's kinda cool. I bet they're a lot cheaper in their original market, and cheaper still in whatever devices they are made for, bought in bulk by whatever industries require them.
In the case of AMD, the GPUs without a video encoder are laptop GPUs that they decided to sell for desktop (they count on the integrated GPU having a video decoder). I wouldn't be surprised if that were also the case for some GPUs that they didn't sell for laptops and they try to sell them in another market.
Love the video man :)
the fact the minimum power was based on an i7 is optimistic to say the least. I swear this thing is beaten by 2020s Intel integrated graphics (AMD's would probably blow it away).
sir.. did you check if the monitor was at 60Hz and not at 30Hz?
Cheers.
I have my ancient backup monitor hooked up to a HDMI to VGA adapter on a 2060 Super right now......... Watching this on main though, 100% HDMI 😅
2:25 i believe there is a spec of HDMI that can passthru analog video.
If i remember correctly there are multiple unused pins that can be repurposed for VGA.
You need to give this card a second chance with the 480i treatment.
So whats it for?
Replacing the base generic GPU a basic home PC comes with in the odd case its not embedded?
Maybe for library computers that don't actually have any graphical needs beyond basic OS animations?
selling brand new refurbished garbage
to connect a monitor to a PC without integrated graphics.
So, it costs as much as a dual Xeon DDR4 server motherboard? (like the Lenovo RD450X)
About that rare video card, I realized I should not regret the GTX 1050 2GB I bought. Thank you for this encouragement. Well, I can play more games, and even Crysis, on the GTX 1050 2GB than on the GT 1010... What a trip for that GT 1010 to not run Crysis, but overall, it's consoling it can run Solitaire, Minesweeper and even Purble Place. :D
This card is ideal in just one scenario:
When your office PC has a busted iGPU, busted output(s), or wasn't equipped with one.
what's the name of the music that plays during the benchmarks?