THE DRAGON ON THE BOX WAS NOT EVEN RENDERED BY THAT GPU.
Maybe it was and it just took a loooooong time
@@rubeusvombatus I think pre-orders are still waiting for theirs to render.
have you ever bought a gpu with box art rendered by itself?
@@imacg5 it'd be smart to, and I'm sure some companies do that.
@@rubeusvombatus not enough VRAM
I like how the box says GDDR4 despite the last GPU using it coming out in 2007
Real
Just like manufacturers mentioned GDDR2 on many cards yet only a few cards actually used it.
I use a laptop that SHOULD have ddr4 but they got lazy and used ddr3
@@Leboobs22 yeah it doesn’t work like that, that would require a whole different motherboard
@@eliweber724 no, ddr4 was out for years and they manufactured the motherboard with ddr3.
Btw I used 256mb ddr1 sticks on the board and it worked so idk. Won't work on newer ram; only older than the installed.
A Nvidia GT 1010 is like a 3-legged horse. Very rare, but also very useless.
This card should have a remark: "For MS Office only"
what if the legs are in a triangle formation
@@lordmike9331 🤣
The card that was so pointless that Nvidia tried to hide it
well yeah it sucks compared to integrated graphics which is a joke in and of itself😭😭
It was so useless it saved many old PCs with broken iGPUs from becoming e-waste.
@@arenzricodexd4409 meh you wasted your money should have got a gt 1030😭
Try the GT 705 with 48 CUDA cores, it's over 5 times slower. 😂
@@raven4k998 if your only need is to drive a monitor it doesn't matter if you got a GT1010 or GT1030; for the most part just get whatever is cheaper. At this point it is silly to try to play games even on a proper GT1030 that uses GDDR5.
GT 1010: ❌
GT Ben Ten: ✅
save your money for the gt1030 it's the better option basically by a lot
@@raven4k998 well they go like this: GT1010 DDR4 < GT1030 DDR4 < GT1010 GDDR5 < GT1030 GDDR5. There is a 1010 that is better than a 1030, so you have to specify the GDDR5 1030
@@Mr28d23 😂😂😂 this guy 😎🥰🥰
@@raven4k998 No, save your money for an ACTUALLY GOOD GPU. Not crappy display cards.
save up for an rtx or amd xt series, these are cheap nowadays @@raven4k998
It’s a great Saturday when I hear that smooth music and a random forgotten graphics card. Can’t wait to see it “in the benchmarks” 😊😊😊
Be prepared to buy a GTX 6010/6005 for $2000 when AMD goes out of biz (which seems to be where they are headed with their GPU division). Unfortunately InHell hired a fake engineer from AMD thinking he can actually produce GPUs, so there is no competition in the market now.
that is worse than your igpu🤣🤣🤣🤣🤣
Nvidia originally released its GP108 as an entry-level desktop and notebook GPU in 2017, and since it never produced a low-end Turing GPU, it continued to manufacture the chip. Then to address demand from its OEM customers, Nvidia quietly released its GeForce GT 1010 based on a cut-down GP108 early last year.
What's that you say? This was a product with a specific purpose which fit a requirement from OEMs to produce inexpensive display adapters for their systems and it was never available at retail?
Clearly, that means it should be compared against the 4090 so we can all lambast how much it "sucks" in comparison.
Early last year?!! They released a new 10 series card?
Is this thing from 2004? Because the box art is certainly in the 2004 school of box art design.
to be fair, a GT 1010 in a 2004 retro rig seems like a perfect fit provided your board has pcie
@@oggilein1 Drivers.
With the performance it has, probably yes
Nvidia GF FX 5xxx flashbacks
Lower cards from Colorful always have these designs, props to them, they really know most of these cards will likely perform like they came out in the early 2000s 😂
Kinda sad he got the GDDR4 version, and not the GDDR5 version (which even beats the DDR4 GT 1030!)
i had no idea that gddr4 was a thing. like all i knew was gddr3,5 and 6
GDDR5 versions seem to be OEM only in certain regions. The only commercial release seems to be DDR4. But then again there is very little reliable info on these cards out there.
It’s ddr4, not gddr4, despite what the box says
Gddr4 is ancient history at this point
@@johnsalamii Some AMD (or ATI) cards had GDDR4. But by the time 4 came out, it was already being replaced by 5, that's why it's so rare.
@@johnsalamii it was a thing for a very short period of time
It was AMD only. They invented it, but it didn't have much of a gain over GDDR3, and was too expensive.
GDDR5 quickly came out and made it redundant.
Nvidia has never made a gddr4 gpu.
This gpu, and the gt 1030 variant, are ddr4
wait, you’re saying you’ve never seen the nvidia control panel lag? That’s impossible.
Exactly what I was thinking. I've never seen it not be laggy.
Every single Nvidia card I've had, 640, 760, 1050 Ti, 2060, 3060, all of them have lagged exactly like that in the control panel.
I've never once had the nvidia control panel lag either, interesting.
As an AMD and Intel Arc GPU user, I also never experienced Nvidia control panel lag.
I have a 3070ti, and the lag/delay it took to show menus specifically on the Nvidia Control Panel is literally the same as in the video. I think it's just a control panel thing
It seems like this was never meant to be a commercial release to begin with.
"That's why Nvidia only made it available to OEMs who needed to serve business customers that sometimes demand standalone graphics cards in their desktops."
Interesting it did of course lol.
Ignorant businessmen who think "discrete = good" or genuine concerns?
@@timotheatae cards like these are really just display adapters. They let you connect more screens to the computer, they're not really intended to do anything more than that.
@@timotheatae Things like these serve a purpose. I have a server I put together with commodity AM4 parts, but since the CPU I have in it has no integrated graphics I still need something to output a console for my PiKVM, so a PCIe x1 version of the GT710 fits that bill perfectly. It also leaves the primary x16 slot free to put a real GPU in to pass through to virtual machines.
Yeah I wonder if it was for things like casino/gambling machines - they tend to like having lots of high resolution displays connected to them but don't always need to render in super high quality. They're one of the biggest markets for the Radeon Pro series too.
@@timotheatae companies buy these in bulk for dirt cheap, usually for older systems that have even crappier graphics. ATM machines, cash registers, arcade machines, you name it.
as an example the Pump It Up arcade machines use geforce 9300gs and gt210 cards, sometimes the GT710.
they have a special OS written only for specific graphics and sound chips, and since it's a rhythm game there is no need for anything fancy.
The new *2*_10_
The 210 may be better for its intended job
@@BudgetBuildsOfficial i think internet got too heavy for it
"A means to connect a monitor to a machine without iGPU"
Hi Daniel fancy seeing you here. I watch your videos about old graphics cards.
At least the 210 was dirty cheap, this thing is as expensive as the gddr5 gt 1030, and is like 2x slower.
Maybe in 5-10 years you'll end up reviewing the bottom tier display adapter card they released this year (once it doesn't cost a ton anymore lol) - the RTX A400, which has a cut-down version of the same core the RTX 3050 has, but with just 4 gb of VRAM, a 64-bit bus and less than half the shader cores.
Sounds like a good card to go head to head with the RX6400.
@@MyNameIsBucket well it is kinda nvidia's equivalent to the W6400 which is the "pro" version of the RX6400. They're really just meant for adding a bunch of video outputs to professional and industrial computers that have higher stability demands while rx 6400, rx 6300 etc are for adding outputs to office computers and similar. Maybe the gt 1010 was also aimed at that like it's quadro P400 cousin but they had a surplus or something.
Unfortunately, if you search for the brand new GT 1030 4GB on a well known shopping website (mentioned in the video), you will find that some models are advertised as the 'GDDR4' version. MSI is one of those brands, but when you look up the exact model on MSI's official website, it clearly states that it uses DDR4. It seems like this might just be a marketing tactic from the vendors.
Why are people still looking for an almost decade-old GT1030? At this point those that want a GT1030 will only need it to drive a display. Those models with DDR4 should not be an issue.
@@arenzricodexd4409 here are two reasons from my personal perspective: 1. It does not need external power (TDP is around 30W). 2. It is excellent as a media decoder card, with full support for H.264, H.265, and VP9. VP9 is the main codec used by YouTube right now. Regarding the DDR4 version, I think it is just an NVIDIA scam (lower cost), since the DDR4 version only reaches half the performance of the GDDR5. As you mentioned, most people don't care about DDR4. Alas, they are paying almost the same amount of money as for the GDDR5 version.
It is still false marketing
Nope
@@arenzricodexd4409 Two reasons, in my opinion. 1. No external power is needed (TDP 30W). 2. NVDEC 3rd Gen (H264, H265 & VP9). The DDR4 is only half the performance of the GDDR5 version. Yet, they cost almost the same price. NVIDIA has been doing this practice for quite a long time. Yet, as you mentioned, most consumers won't notice anything.
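On the decoder point above: if you want to check that NVDEC is actually doing the decode work on one of these cards rather than the CPU, a rough decode-only test is sketched below. This is only a minimal sketch under assumptions - it expects an ffmpeg build with NVDEC/CUDA support on the PATH, and "test_clip.mp4" is a placeholder filename, not anything from the video.

```python
# Decode-only test: decode a clip on the GPU and discard the frames, so the
# result reflects NVDEC decode speed rather than encode or disk speed.
import subprocess

clip = "test_clip.mp4"  # placeholder: any H.264/H.265/VP9 file you have around

subprocess.run(
    [
        "ffmpeg",
        "-benchmark",        # print timing/CPU stats at the end of the run
        "-hwaccel", "cuda",  # ask ffmpeg to decode on the NVIDIA GPU (NVDEC)
        "-i", clip,
        "-f", "null", "-",   # no output file: decode and throw the frames away
    ],
    check=True,
)
```

If hardware decode kicks in, CPU usage stays low during the run; if ffmpeg falls back to software decode, you'll see the CPU doing the work instead.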
6:00 did you render that little ps1 looking rotating gpu using a GT1010?
Was thinking the same thing lol
Yes, thank god he did
The 1030 already was something else. But at least it had a relatively modern chip and could run HEVC and so on. So that is great.
But a 1010? Just how much can you cut from a 1030 to make it a 1010???
It has 256 cuda cores, the same amount as the Switch. Lol
Chip manufacturers don't want to discard large volumes of faulty chips, it's far too expensive. They find the level at which many of those faulty chips will function, and disable certain parts and/or downclock them so they can be used. Slap them on a standard PCB and sell them as a reduced cost or OEM card, thereby minimising the financial losses. It's a very common practice, and emerging or less affluent markets are keen to take them.
maybe it's a 9000 or even 8000 repackaged as a 1010
a friend of mine genuinely has a 1030 because whoever he bought the prebuilt from has severe mental issues; he has 3200 corsair ram that wasn't set to 3200 and a cpu that's better than mine, but a 1030 because reasons
no the pc has not been upgraded from stock.....
@@onegrumpyboi2914 Hopefully he can find a decent GPU. GTX 1080s are quite cheap these days and also powerful. Or if he wants to go high-end, maybe the 6900 XT or something.
if it had Windows XP drivers, it woulda been an energy efficient Windows XP Retro Gamers' dream
i use a ddr2 gt220 and it's a perfect xp card
I doubt Nvidia would write Windows XP drivers for Pascal for what is essentially a bunch of leftover dies not good enough to become even a GT1030.
@@Δημήτρης-θ7θ I mean, the GTX 960 has a driver for XP, so I don't see why they couldn't give the 1010/1030 an XP driver as well
@@Sithhy the GTX 900 series came out the same year XP was still officially supported by Microsoft...
Not really. It still would be slow and couldn't max out games at 1080p ultra. It doesn't even launch Crysis at all. My old 650 Ti is a superior "low power" XP card.
2:22 The hdmi port can supply 300mA at 5v (about 1.5W). Normally that is supposed to be used to detect a powered-off monitor.
But some dongles use it to convert between outputs
the hdmi port on my 1060 can easily power the composite and rf converters i frequently plug into it.
I suddenly feel better about my 64mb MX440
I can't understand his hate for the MX440. Sure, the naming was terrible with GF4 in the name and the lack of shader support, but the cards were cheap at the time, and to many of us who didn't have money for the first real accelerators like a gf2 or 3dfx v2 they gave something to play older titles at a decent framerate. At the time a new mx440 was still cheaper than a used gf2 and ran faster with more vram; it's not like every second hand market is a healthy one.
I have a GeForce FX5200 in my Pentium 4 PGA478 system, which is barely capable of running Windows Aero effects at 1920x1200.
@@gorjy9610 that is a fair take, you could pick them up fairly easily for a tenner, but I was able to pick up a used ti4200 for 50 quid as soon as I could. trying to play bf1942 on a mx440 was PAIN!
@@arnislacis9064 The cards were made with dx 9 support - find an Ati 9500-9600 or 600 card and frames are double - the nvidia 4200-4600 are for the older bus, but I guess a 4800 could do fine - ati 9800 and nvidia 5700 can't be run on a default 300 w power supply - so you have options like even the mx 440, geforce 3, or nvidia 6200, just some old knowledge I gathered back in the day
At least the gt 210 was used by OEMs as a display adapter. Had a gt1030 for an emulation pc which in turn was upgraded to a 1050ti.
The 1030 doesn't work so great for emulation... as far as I'm concerned (DDR4)
Production runs of chips that produce a high proportion of faulty, sub standard parts are an expensive write off, so they simply disable the relevant parts and sell them as more budget oriented or OEM cards, especially in the less affluent markets. Cut down cards are exceptionally common, they're often just mid and high range chips that didn't make the grade.
@@ZERARCHIVE2023 There’s also the GDDR5 GT 1030. Much better. Still not a good card at all, or a card made for gaming, but you can certainly make it do it with not the worst results ever recorded.
@@archgirl GDDR5 can run some games at 30FPS "stable", where DDR4 would get to... 5/10fps maximum? OC included?
The 7600X's igpu is far and away stronger.
x8 SSAA SMS emulation: 30 locked. Same settings on my 1030? 10 fps if I'm lucky 😂
And my 7600X can still boost to 100fps+ if I take away the limitations, so... yeah...
@@ZERARCHIVE2023 I was just pointing out that the GDDR5 version of the 1030 exists and is better than the DDR4 version you mentioned. I never said anyone should buy or use one, just that it’s a faster version of the card.
Of course a newer chip designed for gaming is likely to beat a 7 year old GPU that wasn’t designed or released for gaming.
You know what other terrible card has a terrible dragon on the box? The Red Dragon RX 580. Stay away from ugly dragon graphics cards yall
ngl i actually like the tacky box art on that one. i have an affinity for old school edgy "gamer" branding
I like my RX 580 though…
@@_LGD Well i hated mine, the fans were super cheap and rattly, and combined with the stupid power consumption of the 580 (because amd just had to make the 480 WORSE) it was louder than my vacuum cleaner most of the time.
Also this was back in 2020, which meant Radeon software and drivers were so bad that i just jumped to team green... pascal just beats polaris in every single regard
@@AveragePootis ah, I think mine's a revised card because it has “pulse” branding, I haven’t noticed its noise at all but drivers are still a bit wonky at times
msi I presume?
one of my most anticipated videos of yours since I heard the card was announced!
also, when are the 6400 and 6500 XT getting put through....the benchmarks??
Pascal was great from the x50 series to the x80Ti. But they were so ashamed that they even removed the X from the GTX name.
I think you're the best channel for all the more rare and obscure budget GPUs.
Someone recently submitted a score on FurMark 2 using one of these.
It got a score of 91.
For reference, an RTX 4090 gets around 30,000. XD
It’s probably more affordable to buy 330 of these cards than it is to buy an rtx 4090
this channel is the most relaxing to watch on yt idk y, .. love it!!.
We have fallen so far from the light, lads.
you thought the igpu was the worst you could buy new? you thought wrong dude🫵
I think I need a new hobby, maybe anime. 😑
Hey! I am not subscribed to you and I have never seen your channel, however this video was in my notifications!!??? How is that possible 😂 I think YouTube loves this video 😂
Well I hope you enjoyed the video
@BudgetBuildsOfficial Loved it! Right now watching some of your other videos 🙌😊
Uploaded 30s ago, I feel blessed
I always love the Lupin III anime series theme in the background. One of my favorite shows ever
22:35, Dude, Transport Tycoon Deluxe soundtrack :D.
i swear i learn about a new gpu every time a video comes out on this channel..
I can imagine some people buying this or the GDDR5 version if it had been released in 2017-2018 for around 50 bucks
If it was released around that year it would have cost around 30 to 40
@@astroidexadam5976 that's like GT710 tier back then so it should cost more.
I meant most people don't buy that for gaming but for display output, when non-G Ryzen and Intel F series don't have integrated graphics.
My friend, I know you know what you're doing, but just a thought, please keep in mind that scammers flash modified BIOSes which may be making it appear as a 1010 when it's not.
Nope this is a GT1010 unfortunately.
@BudgetBuildsOfficial thank you for responding. I'm currently watching the video so I had that thought during not after. China always gets weird stuff. My personal favorite is how they have laptop CPUs in desktop sockets.
@BudgetBuildsOfficial I just thought of this & thought you may find it funny. You said "No it's a GT1010 unfortunately" I thought he's probably wishing it was anything else.
That was my first thought until he ran GPU-Z. GPU-Z is really good at catching scam cards with a modified BIOS.
@@MoultrieGeek Thank you for taking the time to share your thoughts! I honestly do greatly appreciate your time & input! 😇
You called the mx440 terrible, turns out there is a 12.5 watt version and a 28 watt version. The 28 watt version can achieve 30-40fps on bg3 1080p low, with undervolting.
It's basically a gtx 1640.
Love your content mate, glad you are back to regular uploads, keep up the excellent work
0:47 I wouldn't say rarest. What about the GeForce GTX 340?
since when was nvidia rebranding chrysler LA block engines?
I didn't even know this existed.
Got a GT 1030 low profile for my Hp ThinClient to get more modern codec support for web and streaming as well as 4K60 display out.
The built-in CPU is still kinda slow but the GT 1030 greatly helps with stuff.
Have been using this for some time as an HTPC, but currently it's collecting dust.
I have lag in the Nvidia control panel under manage 3D settings on a (full tdp) mobile 3070, pretty sure I had the same issue with a desktop 1060 (even on fresh install, DDU, NVCleaninstall, etc), and have seen other people complain about the same thing. As bad as this card is, I do not think the NVC lagging is its fault; the software is really really bad
Good video otherwise tho, don't mean to be negative
nvidia control panel has sucked for years which is why the nvidia app is better
Same issue with a desktop GTX 1660 Ti.
yup, have had this with every nvidia card ive ever tried, which is at least 5 at this point
In 2008 I didn't have the budget for an 8800 GT, so I bought an Ati HD 3870 with GDDR4 and it wasn't bad at 1280x1024. I played through Crysis on it without complaint.
apparently there is a GDDR5 version of the 1010 and it's faster than the DDR4 version of the 1030 lmao
Looks to be OEM only from my findings. All commercial releases are DDR4. But then again there is virtually 0 rock solid info on these
Actually im diggin the box art big time. What a find! Awesome
Could you fix your discord link?
love the videos keep it up, but what is that ram configuration
It always makes me chuckle hearing you talk in £s, as a fellow Brit, it is nice to hear someone talk in "real" money, without me having to convert.
Great videos, I immediately subbed because of the accent cheers!!!
Thanks to getting next to no sleep after my night shift, energy drinks and dwindling sanity, I'm finally awake to catch my manz at upload time once again 🙏
Great review! Thank you for the effort!
I got a 1030 ddr4 as an upgrade after a 710 ddr3, for a Fuji Futro S920 build, and it was much better, and the 40W power supply was still stable. 😄 Crysis? The demo ran on both, under Win7 64bit.
I got a Futro S940 as an upgrade, and I use my old laptop's power supply, so I will get a gddr5 1030, eventually...
And everything was, is, and will be passive, since there are 1030's with passive coolers. 🙂
ironically Dapz, as in that Dapz, was the one that first got his hands on one.
yeah i was reading the title and was thinking "this gpu sounds familiar to me" and i googled it and got reminded he existed
I literally have one of these sitting in my garage, had no clue what gpu it was til the video came out so thank ya
Nvidia had the chance to make the 2020 and didn't 😅
I was _really_ hoping you'd pull the cooler off so we could all stare in awe at the presumably _absolutely tiny_ sliver of silicon under it, but alas no such luck.
Now I need a Rtx 2010, 3010 and 4010
Given they drop the X for these cards, the RT branding would seem fitting. The 3050 6GB is effectively the RT3030, and the 2050 probably better deserves the name RT3020 given it is a cut-down mobile 3050.
what a beast! i sent some GPU companies messages about them needing more mascots and not just random-ass black boring boxes, we want monsters
and to be fair every single nvidia GPU i've had including a 4080S lagged horribly in the driver screen.
Compatible with Windows 7 32-bit. Like all the best things.
This reminds me of a video from _Dawid Does Tech Stuff_, where he completely removed the cooler of a GT1030 to see if the card would still work, and the card actually never managed to overheat, even after some overclocking! XD
Given the fact that this GT1010 is even worse than the 1030, I'm sure you can remove the cooler and the card won't even notice something is different 😂
Would be a great GPU for retro gaming, or a HTPC
Retro gaming was going to be an entire section, but unfortunately with no 32bit support or the ability to run its own adapter, it’s kinda redundant. Same with the HTPC side as it’s too expensive to be better than the alternatives
@@BudgetBuildsOfficial I found one of these in a green bin but left it as there was a 980 and took that instead, if I could time travel I'd have taken this and eBayed it
There are only a few cards that should have GDDR4... The Radeon 1950xt, HD 3850/70 and 2600xt
Oh also... Remember the weird R7 450 you did a video on? I think I found a tweak that'll help! Go into the driver configuration where you select "gaming" or "compute" mode. Pick "compute". Then load up MSI afterburner and max the power limit to whatever it'll give. Then try running some benchmarks again!
I actually remember hearing the news about the GT1010 around 2021-ish. I think the original point of it was to be a cut down GT1030 (duh), but to be incredibly cheap in a time when the chip shortage was in full swing. If that’s the case, the card probably didn’t have a lot of development time just to have SOMETHING out. Since I think even GT1030s were being scalped.
I’m guessing they didn’t actually see a reason to release the GT1010 en masse once the chip shortage and GPU prices calmed down, so they didn’t make the GT1010 in large quantities and didn’t sell them outside of Southeast Asia for some reason.
During the pandemic, every GPU was being scalped. Even the GT210. 😂 With the cost of producing wafers and creating chips, manufacturers don't like throwing away bad batches of chips if they can actually be made to work. Disable the bits that don't work and try to sell them as budget or OEM cards, at least they get some money back.
This graphics card is crazy. The fact that modern day integrated graphics like the 760M or 780M by AMD beat this thing shows how insanely beefy they've become throughout the years compared to before. It would be troubling if we somehow got an RTX 3040 or heck, an RTX 5020.
that's because this gpu is horribly obsolete, mostly due to the abysmal performance of gddr4
Where's the GT1020 available at for that _mid-range_ option?
I mean these days it's a bad sign if any modern graphics card CAN'T run the OG Crysis :(
I hope one day you manage to find the GDDR5 one, it looks a lot cooler.
Power management mode set to optimal performance does actually very little for performance as it wont clock the card higher then before but in idle will eat much more energy as the gpu will never clocks down when not in use. Its a pretty bad setting and best left on optimal power.
depends.
I remember on, I think it was my last card, that if I put it on optimal/power saving some games would not really run too well, because the system would always try and clock down the GPU.
The game only used like 10% of my GPU for 60ish FPS, but as soon as the system clocked it down the FPS dropped to like 10FPS because some more advanced GPU tricks get disabled. (There's a quick clock-logging sketch just below this thread if you want to watch that behaviour yourself.)
*won't
*than
*clock
*It's
@@GrainGrown Lovely people who are not capable of adding to the conversation but only go on about the form.
@@mealot7613 You're welcome, kiddo.
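For anyone wanting to actually watch the clock-down behaviour debated above, the quickest way is to log clocks and load once a second while a game is running and then alt-tabbed out. Below is a minimal sketch under assumptions - it needs the nvidia-ml-py (pynvml) package and a working NVIDIA driver, and simply polls standard NVML counters on the first GPU in the system.

```python
# Poll core/memory clocks and GPU load once a second, so you can see the card
# drop to idle clocks (or refuse to, depending on the power management mode).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)  # MHz
        mem = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_MEM)        # MHz
        load = pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu                   # percent
        print(f"core {core:4d} MHz | mem {mem:4d} MHz | load {load:3d}%")
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```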
This must cost like... dozens of cents per year in power.
I know you are the ‘Budget’ build channel but could I recommend the Corsair 680x case to you? They aren’t cheap but not that expensive used on eBay at around £90-100. The reason I’m recommending it is that it has a hinged glass side panel, so you can open it in 2 seconds flat, rather than tackling four thumb screws every time you want to change some hardware.
The vibe I get from this card is an updated version of an older card with some newer hardware and bios and drivers for modern support
5:53 i love this 3D icon thing, it's really cool, how do you do it?
Check out our channel artist @chunkyboinoodles
I bought a 2020 from China, sadly it gave my computer a virus
Huh
skill issues-_-
I heard of the 1010 years ago and only now do I see this beast in action 😂
GT1030 DDR4 users rejoice!!!
About that rare video card, I realized I should not regret the GTX 1050 2GB I bought. Thank you for this encouragement. Well, I can play more games, and even Crysis, using the GTX 1050 2GB than with the GT 1010... What a trip for that GT 1010 to not run Crysis, but overall, it's consoling that it can run solitaire, minesweeper and even Purble Place. :D
Manufactured ewaste. At least it is too uncommon to trick manny people
It's *literally* produced to *AVOID* ewaste. The failed 1030 GPUs are binned and resold.
If they didn't do this, they'd just throw the parts out.
It's not useless just because it's not a GaMiNg card.
Manny = male nanny, BTW.
@@tim3172 but they could still pair it with more vram. You save the silicon but waste the ram and PCB.
I understand why low grade silicon is sold. The scummy practice Nvidia is still doing is holding the silicon back further to incentivise you to get the higher tier card
Just in case anyone was serious about wondering about the question at 2:15, HDMI has a 5V rail with just enough power to feed a basic circuit like that
I mean, even on my RTX 2070 Super I get pop-in on GTA V and BeamNG, it's most noticeable on those two than anything else which I don't get. My 6GB 1060 did the same thing.
I liked that n64 looking gpu turning animation
Thank you for the unbiased review! It's nice to have a serious look at something like this
I miss when video cards had those ugly ass CGI boxes and decals. Sure we get the anime waifu ones and the Black Myth Wukong special edition ones but those are actually good and cost a premium. I just want the old school GT/GTX and ATI cards from like 2006~2011 stuff.
It was just a perfect chance to insert the “TENTEN” Japanese English contest on tv meme😭
9:24 "Red Dead Redemption 2 was a complete nightmare"
You could say it was an "Undead Nightmare" :D
This card is ideal in just one scenario:
When your office PC has a busted iGPU, or its output(s) are busted, or it wasn't equipped with one at all.
In the case of AMD, the GPUs without a video encoder are laptop GPUs that they decided to sell for desktop (they count on the integrated GPU having a video encoder). I wouldn't be surprised if that were also the case for some GPUs that they didn't sell for laptops and they try to sell in another market.
A 1010?! I'm simply amazed you managed to get hold of an object that only ever existed in rumors ;; 😂
Would be nice to see an RTX 5010 with component video out.... It'd be like a 2050 in theory.
Still happy with my gt1050 125w, only 2gb, overclocked to the bones for at least 4 years now. It just refuses to die.
And interestingly it still runs games decently with the i3 7100 and vsync enabled at 60Hz 1080p... Like Metro or Skyrim. And 16GB of decent RAM sticks with timings tweaked.
IT refuses to die!!!
Love the video man :)
True to form you have classically displayed the card, as you have done in the past, near a bird bath in your garden. Perhaps it would have been more appropriate to display this one on the rim of a toilet?
It exists and was discontinued in 2021. It was meant to be more of a display adapter. As for GDDR, I wanted to see whether GDDR5 was the very first GDDR, but, according to wikipedia and similar sites, GDDR goes back to the very first DDR RAM.
if you just need more displays for something
@@manitoba-op4jx Or, you know... you bought Any non-APU Ryzen system, any F-series Intel CPU, or any Intel CPU from before they started including GPUs.
Given the GT 1010 name, I was expecting this thing to essentially be a desktop version of the GT920M I had in a Toshiba laptop I'd bought back in 2015, but it actually seems to perform a good amount better, not that it was a high bar to clear, because the best my GT920m could do in GTA V was in the 40 fps range at 1366x768 normal settings
I unironically needed something like this - but on a pci-e x1. The use case - sporadic use on a home NAS.
The lag in the nvidia control panel showcased at 5:18 is my complete experience with nvidia cards, one of them being a 1050 4gb in a laptop I own.
That issue with the HDMI-VGA adapter, I've found those work better with older, "dumb" VGA monitors, the ones with no OSD.
6:40 To put the pricing into perspective: here at the end of October 2024 I just bought a good working 1080 for $50 locally in California. (To be clear that is still a good value, I had to pay $60 for a 1070 soon after 😂). But what a crazy card. I usually just keep a 750Ti as a GPU around to test computers with. Or a 1050 if I happen to have one.
I read the title and even before watching the video, I googled the damn thing and laughed so hard I fell out of my chair. This is the biggest joke ever, rivaling even the legendary FX5200 and the GT710
18:52 but there is evidence that G-Sync works. See, GT cards didn't get G-Sync support until Nvidia added it for the GT 1030 (and 1010 when it came out) in 2019. I've fully tested my LP GT 1030 GDDR5 with G-Sync and let me tell you, Halo 2 cartographer runs like a dream in 3440x1440p120-144hz, all thanks to Displayport 1.4a. But yeah GT series never had NVENC at all. Look up GPU capabilities on the W!k!.
Haven't watched the rest of the video, but 2:54
Is this a sticker placed over the box?
Yes
WOW WHAT A SURPRISE!!!! I thought it would be an amazing GPU. Had a GTX 980Ti and it was a good GPU. The GT1010 is only missing the X and Ti but is, on the other hand, 30 better
I didn't know this model existed until you reviewed it 😂
According to Techpowerup's GPU database there is a GDDR5 version of the GT 1010 so maybe we should expect a follow up video on the GT 1010?
There's a chance you're only going to find those in China or maybe some other Asian territories. I remember there was a card that was specifically made for internet cafes in that region. It's probably been done more than once.
@@cyphaborg6598 GTX 1060 5GB was made specifically for iCafe's in that region yeah
whats good my favorite mate can i get a hi?
Hello
I can't wait to see if it is just a way to add HDMI ports to your PC.
Depends on needs.