To be fair, I don't think it's unfair to expect _not_ to have to dedicate an SSD's worth of VRAM to a GPU just because developers want to take the Crysis entitlement up another notch.
@@ZeroHourProductions407 Can you put that in English? Oh, and if you're suggesting that VRAM is expensive, you're wrong. And if you're suggesting that video game graphics shouldn't improve over time, you're an anomaly. Are you trying to suggest that PC games shouldn't be able to produce console levels of fidelity? And by the way, there is no SSD that is 16 gigabytes, unless you want to call a thumb drive an SSD.
Geordie, I'll learn to trust you when you learn forgery is a crime, punishable by blah blah. I'm just kidding. Nvidia has stuck us with more or less 4-8 GB of VRAM for the past 4 years (excepting the flagships, which cost as much as actual ships) and it's embarrassing. Look at Intel's $350 Arc card with 16 GB of VRAM. VRAM is not expensive, but I see tons of simpers in comment sections saying "oh, Nvidia is just making smart business decisions by ripping off their customers". Yeah, well, those smart decisions caused me (a 20-year Nvidia fanboy) to jump ship. When I have to replace my current GPU (hopefully in the year 2030 😅) I'm going to be looking at Intel's Battlemage.
I can, situationally. Like I was fine with a low frame rate in Elden Ring before I upgraded, because I knew that it's capped at 60fps anyway, and I wasn't going to lose any fights I was remotely prepared for by running at only 45fps on my old card. Seeing that it was steady, and wouldn't dip on me when some giant enemy swooped down out of the sky, was more important. For those twitchy FPS titles, of course, I was looking for 100fps or better. Very fortunately, COD:MW, Cold War, and Doom (2016) were all incredible performers on even that old low-profile GPU.
I always felt that the 3GB VRAM buffer, together with the cut-down GPU, made the GTX 1060 3GB a bad deal, especially considering that the RX 480 8GB had an MSRP just 30 USD higher, and it performs a lot better these days, even relative to the GTX 1060 6GB.
@@burtiq Nvidia's mindshare was very strong back then (and still is, though these days it feels like more people are willing to consider AMD). On one hand, going with what the people around you recommend makes sense and is often an OK choice, but it can backfire if those people aren't well informed themselves and/or care more about you being on "their team" than about you getting good value for money.
@@burtiq I bought my old 1060 in 2017 because, even though I wanted an RX 580, I couldn't get one for less than £330 due to miners, while I could and did buy a 6GB 1060 for £180. Nothing to do with fanboyism; you just can't justify paying that price difference.
It's had a pretty long run, but nothing lasts forever. The performance is just borderline acceptable in the reviewed games. I caved about a month ago and picked up a cheap RX6600 to tide me over. (Very happy with it, tbh) Hopefully the situation in the sub-$300 area improves in the coming years.
My first ever dedicated GPU! Had a Palit Dual model, which has kind of mediocre cooling. Served me well until I bought an RX 6600. Donated it to my cousin to upgrade his ancient R7 250X, and oh boy, is he happy. He mainly plays esports titles and some older MMORPGs, and it serves him well too.
That's the thing, though: the card doesn't lose functionality as it ages, it just stops getting new functionality. If you think about how much software this card runs, it represents untold thousands of games and apps. I have an ancient 7.1 sound card in my computer to output to my amps; it sounds muscular compared to the more clinical modern devices and it's a huge upgrade on the onboard audio chip. My point, you ask? I literally pulled it out of someone's e-waste for £0.00. Its function is fixed, and that's fine by me, so it'll run until it dies (3.5 years so far). We throw stuff out too soon far too often.
Palit Dual's cooling is indeed meh. I undervolted my GPU and now it's okay-ish. But without the undervolt, the temps were often in the 80s, causing the GPU to throttle.
I used one (Zotac single fan) for 3 years, and all it cost me was some thermal paste and elbow grease (bought it used, so I re-pasted it then, and again a day before I sold it); I got exactly what I paid for it. Best value card ever, and it didn't even use much electricity.
Went from a 7-year-old G4560 + 1050 2GB build to a 5600X + 3060 12GB build and it's such a massive difference. The old build can still handle esports titles well, but not without tinkering with a bunch of settings.
Currently I'm stuck with an R5 2600 and GTX 1050 while waiting for new GPUs to come out. I have probably maxed out what this setup can do with system optimizations + overclocking and... it works (just don't expect it to even launch anything new).
@@RotcodFox Without a doubt. Had an i3-6100 combined with a GTX 750 Ti and the CPU was the bottleneck. And this was at least 3-4 years ago. Replacing the CPU with an i5-6400T was a major step up with regard to consistency of performance.
Great content as always. It just occurred to me. Maybe you could make a video comparing modern low settings to old ultra settings. That'd be interesting. It's infuriating how modern low settings can look much worse than older games set to ultra, yet require more VRAM and processing power to deliver that inferior experience while the older titles run much better.
We already know the result between most remasters and originals (because why would they do a remaster unless it's a cashgrab), but something like Portal 1 vs Portal 2, Just Cause 2 vs 3, Trine 2 vs 4, Doom 2016 vs Eternal
I just wanted to point out that reapplying thermal paste did not help the card in Halo. It was the drivers. Actually, your temperatures jumped up 10°C after reapplying the paste, likely because the drivers were finally letting the card get its workout. Its running so much better while being 10°C hotter shows it wasn't the paste, and it probably would've been better to leave that part out so newcomers don't think slapping on paste will fix their driver issues. Besides, 58-59°C is VERY cool for a GPU. Not sure why thermal paste was even a thought.
I had a 3 GB EVGA 1060 as one of my very first cards. It eventually found its way into a Christmas PC present for my niece and nephew a couple of years ago.
Same thing's going to happen with the 3060 Ti and 3070/3070 Ti versus the 6700 XT and 6800/6800 XT. Once the Nvidia cards start getting older, the performance gap will increase and favour the larger-VRAM Radeon cards; we're already seeing the 6800 beat the 3070 in RT.
I suspect this card could live its best life as an encoder for a home media center OR as an emulation GPU for older console games. Most old consoles had very little VRAM to begin with, and if you stick with 1080p or lower, you may have a great deal of fun.
Jesus, you are really on the ball; every time I think of a video you should make, you are already hard at work. Never stop making these awesome videos. PS: I really want to see you do a review of the old Quadro K6000 or the Maxwell Quadros, since those had a massive amount of VRAM for their time, and even by today's standards, and I really want to see how they do in recent games.
Didn't actually expect that card to run anything modern; count me surprised. The end result does look like an oil painting or a slightly melted wax sculpture, but it runs, so that's a definite win.
Passed on a 1060 3GB to another YouTuber (Mikes Unboxing) to see if 3GB could still kick it today. Yep, still does the trick (with some graphical compromises).
I upgraded from this exact model of 1060 in December to an Arc A770!! It honestly still performed pretty well; I was even able to get Star Citizen to 40 fps on low.😂 Very surprising to see the Gigabyte model when the video started, thought Random Gaming stole mine🤣
I really don't get the criticism of FSR for being inferior to DLSS. FSR is saving a lot of cards out there, giving their owners the chance to enjoy even the latest games to an acceptable degree. It's an "anti planned obsolescence" technology and we should be grateful to have it.
I finally got hold of some games that have FSR and I thought that 'Quality' and 'Balanced' looked close enough to native to use it without issue. Sure, it's not as good as DLSS and sometimes XeSS can have the edge. But FSR 3.0 is up next and promises to improve on a lot of the issues. At the end of the day, it works on Intel/AMD/Nvidia and costs us not a penny to use for extra fps. Some game devs implement it far better than others, seems to be the problem.
True. I mean, if you are budget constrained, you are likely to have a small monitor where the image is not that bad, unless you put it on ultra performance mode.
I still use this exact card and am pretty pleased. I don't play very many new AAA games, so my opinion is likely very skewed compared to other players on PC, but I'm okay with running expansive games with all settings on performance or low. If you're like me and mainly play games released before 2019, this card is highly recommended for the price, but it is barely cutting it in 2023 and will soon be near unusable.
This video came out just after I built my new PC with a 4070 Ti, and although the upgrade from my old 1060 3GB was a night and day difference, I only started to feel like my GPU was in need of an upgrade in the past year. Up until now, the card held up pretty damn well.
The upgrade from my GTX 660 to the 1060 3GB at the end of 2018 was huge; got it for 120 EUR. Soon the low VRAM started to irritate me, so I bought a 1070 in 2019 and used it until a few months ago. Whole different story!
Some updates ago, I watched a 1060 3GB and a 7970 3GB running Hogwarts Legacy. The GeForce had good frames at 1080p quality FSR, but the textures were often completely missing, leaving just a hole in their place. The 7970 didn't have that problem, though anything above 720p balanced FSR was too much for it. I tried it on my R9 380 4GB and was pleasantly surprised: while it couldn't hold 60fps in its dreams, it was still pretty stable gameplay at around 40-50 (can't remember now), with all the textures looking fine at 1080p quality FSR. It was a good gaming experience. So I'd really like to see that comparison with a 1050 Ti; they might just be too close in many situations. If not in frame rate, then in stability and/or overall looks.
I'm rocking the Asus TUF FX505DD laptop with a Ryzen 5 3550H and a GTX 1050 3GB in it, and to be fair, the 3GB VRAM limitation doesn't affect me at all, since I don't own any games from past 2021, with the exception of Sons of the Forest, and even that runs absolutely okay.
I have one of these cards, but last year I replaced it with an RX 6600 because newer titles, mainly Cyberpunk and Escape from Tarkov, struggled on it. Cyberpunk just ran hard against the VRAM limit and then some, so minor stutters accompanied most long drives, even if the average framerate was about 40 otherwise. Tarkov stuttered less for the most part, but it's too competitive to live with even those struggles, especially since the VRAM kept me from raising LOD to max. That has the odd side effect that any magnifying optic (even ones with optional magnification that are set to 1x) WILL set LOD to max on the fly, causing the card to suddenly need to load in a bunch of stuff that wasn't getting rendered before, freezing the game for a split second. Right when you want to shoot someone. Strangely enough, the VRAM never bottlenecked the framerate and GPU usage was always glued to 100%; it just put a hard limit on how much could be loaded in at any point in time. It still overclocked nicely and was well cooled.
With spare parts I paired an RX 5700 XT with the same CPU, and I shit you not, in games like Cyberpunk, The Outer Worlds and Spider-Man, the card is maxed out.
Also, the GT 440 3GiB had an impressive amount of memory for its time, especially for a card five generations before the GTX 1060 and much lower-end than it.
I'm still trying to scrape by with this thing, even after a new build a couple of months back, but by the end of the year I'm going to have to say goodbye to this card that got me through so much in the past three years. Thinking about its older brother, the 2060 Super.
I miss the good old days when you could use 700-series cards as your main GPU. I had a 780 Ti for about 2 months as my only card, after selling my 2060 Super for basically what I bought it for and preordering another GPU at only a little over MSRP. It served me well, but since the 700 series isn't supported anymore, you can only use it for older games, even though it could still run them (although, due to the 3GB of VRAM, at medium/low 1080p), and for new games probably with a lot of settings on low.
@@tilburg8683 What surprises me more is that until 2020, with a GTX 750, you could play any new game without any problems and did not need 6-8GB of video memory to play at a comfortable FPS.
Just get a 6700 XT. The 7700 and 7800 XT are apparently barely going to be an upgrade, and that's the main reason AMD is not releasing them. They already released the real 7800 and 7800 XT, but lied and called them the 7900 XTX and 7900 XT.
Still rocking the 1060 6gb, only real problems seem to be driver related. I want to upgrade but I bought a premade last time when they were about the same price as the GPU alone and I know I'll probably need to upgrade my PSU which is a large part of the cost of a new system reusing some of my parts. I think I'm stuck until something goes out, I'm not a heavy gamer so I still have some older game options.
Costco has had some screaming deals on prebuilts, off and on, since the end of 2022, some even with a 3060 12GB for $800 all-in. So it may be worth keeping tabs on that, in case they list a deal that undercuts a build or doesn't cost much more than an upgrade.
What driver issues are you having? There is a unique Vulkan driver bug which kills Yuzu on the 900 series, but I'm fairly certain the 10 series is not affected; they had a similar bug, but it was fixed.
I never did understand the 3GB choice for this card. All I can figure is that there were a bunch of 512 MB memory chips floating around for really cheap.
@@FacialVomitTurtleFights I think some of the 970s had the full 4GB; at least mine never showed 3.5 in any games, nor did it slow down when going over 3.5GB of usage. Pretty pleased with it, and it definitely gained itself a place on my shelf even after its retirement. Can't say the same for my 4690K; that poor thing didn't age so well.
@@FacialVomitTurtleFights I don't think that can be done. The 970 uses the same chip as the 980 and the same physical memory bus layout, with 256 data pins, but they split the bus into a 224-bit bus and a 32-bit bus to artificially make it slower, and there's chip-select logic so the two cannot run at the same time. If you wanted to reuse the chip from the 1060 and split its 192-bit bus into two chip-select buses 96 bits wide, one carrying 3GB and another 1.5GB, for a total of 4.5GB and some cost saving, it would also necessarily become horribly slow. The 970 just about works out OK, since the last half-a-gig ends up effectively idle, just storing the desktop window manager, and it's still faster to have that slow half-gig than to fetch data via PCI Express. As far as games are concerned, you're missing maybe 200MB of fast VRAM compared to a full 4GB card, or they might not even have a metric to detect the difference. I wouldn't anticipate seeing this trick done ever again, especially now that they can just lock down performance because the firmware is hardware-signed.

I think the reason they made the 1060 3GB is that Nvidia's problem is they sometimes release something REALLY good and then have a hard time competing against themselves, with people plainly not upgrading for 3-4 generations. So they try to release cards with deliberately not enough VRAM, even if they have to scam consumers into buying them by reusing a product name and not differentiating them well enough. If a card is too slow, you just dial down the settings. If you have a DX12/Vulkan title that needs a base amount of VRAM that you don't have, well, that's game over. Planned obsolescence in the truest sense of the word.
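The split-bus arithmetic above can be sanity-checked with a quick sketch. The numbers below are the commonly cited GTX 970 figures (7 GT/s GDDR5, a 224-bit fast partition plus a 32-bit slow one), assumed here for illustration:

```python
def bandwidth_gbs(bus_bits: int, data_rate_gtps: float) -> float:
    # Peak memory bandwidth (GB/s) = bus width in bytes * transfer rate (GT/s)
    return bus_bits / 8 * data_rate_gtps

full_256 = bandwidth_gbs(256, 7.0)  # a hypothetical full 4GB card: 224.0 GB/s
fast_224 = bandwidth_gbs(224, 7.0)  # the 970's 3.5GB partition:    196.0 GB/s
slow_32  = bandwidth_gbs(32, 7.0)   # the 970's 0.5GB partition:     28.0 GB/s

print(full_256, fast_224, slow_32)
```

The slow partition manages only an eighth of the full-bus figure, which is why parking the desktop compositor there (rather than game assets) was the tolerable outcome described above, and why a hypothetical 96-bit split on the 1060 would have hurt far more.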
Back when I was building my first ever PC (I am still using it), the 1060 3GB version was, thanks to discounts, cheaper than the GTX 1050 Ti, which was my only other option for the budget. So I am using the GTX 1060 3GB version with an i5 9400F. It ran pretty well, at least for what it cost me (the total build cost around 45k INR, which, including all taxes, should be around 450 USD).
Tough choice, but a good discount. The 1050 Ti is pretty slow, and the 1060 3GB is too VRAM starved. I decided to sidestep the choice and get the 970, which has plenty of VRAM but its own gremlins: driver quality has been suffering from unique bugs lately (Yuzu is broken), and it doesn't do variable refresh displays. Got it for 80€ used in 2019. I'll be replacing it now; it's been becoming less reliable in spite of a repaste/repad, and then the fan controller failed, but it's nearly 10 years old, so, fine I guess? Aiming for 160€ for the 2060 Super, which will take some hunting, but if I get too frustrated I can grab the standard 2060 for around 130€. A 3060 12GB is the dream, but well outside my capability.
A lot of unnecessary panicking by YouTubers 😁 Games are first and foremost made for consoles, i.e. the PS5. The PS5 has 16 GB of unified RAM/VRAM to run games at 4K. Therefore, if you plan to play at 1080p low, 4GB of VRAM (or even 3, as in this case) should be enough, especially if you have 32 GB of DDR4 RAM to help it, as DDR4 is cheap.
I just replaced my GTX 970 Strix with an RX 7900 XT Hellhound. In the end, the memory was the thing holding it back the most... I can't imagine having only 3 ☠️
My first PC build was at the end of my undergrad and used this card with a Ryzen 3 1200, and it was a midrange dream until Shadow of the Tomb Raider stuttered like a madman. It was a great time, though.
According to some benchmark pages, an RX 6500 XT would be just as fast, and considering the negative reviews when it launched, you might be able to get one for less money. And even if not, it probably uses less power and it's a lot newer.
I have the GTX 1050 3GB version in a laptop and I can say it is very good; it surprises me almost every day with its performance. I played Uncharted, Hogwarts Legacy, Death Stranding, newer and older Assassin's Creeds, RDR2 and a lot of other less demanding/older games. I play at 1080p medium and usually lock the newer games to 30 fps via RivaTuner; I hate stuttering/frame drops from 30 fps. But with overclocking and undervolting both the GPU and CPU, the games run very well, no dips below 30 or stuttering. So I have to say 3GB of VRAM still holds up, unless the game is very badly optimized. But then you just have to wait for patches and play it then.
Oh, and a tip for playing at 30 fps: turn off in-game vsync and the fps lock, turn on half refresh rate vsync in the Nvidia control panel (if you have a 60Hz display), and lock the game to 30 fps via RivaTuner.
This is why PC gaming is at its worst as well: it's people like you who say "games are so unoptimized, omg, I can't even run them well on my 8 YEAR OLD PC!!" The answer is no, 3GB is definitely NOT holding up; even 4GB of VRAM is dead and irrelevant in 2023, so I don't know what you are smoking if you think 3GB still holds up in 2023, especially when even a non triple-A title like Halo Infinite has already shut out people with only 3GB of VRAM.
I was playing Halo Infinite on a 1060 3GB since the game's release. It was a struggle on graphics, even running on LOW, but I ended up working my way up to Platinum 5 in ranked slayer, so I guess that speaks to its playability. The card had a lot of issues loading larger maps (think BTB), but I stuck with ranked slayer most of the time anyway. The nail in the coffin was when 343 released Season 3 of Infinite, where they cut off people with less than 4GB of VRAM.
I upgraded from this to a 3080 FTW3 that I got for $350. It was a night and day difference (obviously). The 1060 made a decent card for my GF's "getting into PC gaming" PC that I put together, though.
I honestly think it's on the developers to optimize games to use less VRAM. I have played demanding games on laptops before without losing textures. It's ridiculous to require any more than 1GB of RAM.
They can't fit those textures in 1GB, and PC hardware doesn't support the most efficient texture compression algorithms either. Watch Dogs 2 was the game that brought 3GB up as a maximum texture requirement for me... I still remember my dorm roommate finishing Need for Speed: The Run at totally unplayable fps for most of it 😂. We just had fun with what we had. And back in the day, in our primary school, we only had one PC capable of running Need for Speed: Underground, and it overheated and turned off after a while. We had fun, because to us FPS didn't matter; we were happy just to be able to run those games...
My textures looked like that on a GTX 1650 on low settings, and it was because of the high texture package that installs with the game. I removed the texture pack and it looked a lot better. I think you need 6 GB to run those textures, and you have to choose not to install them, otherwise you get them anyway.
This card saved me during the crypto boom, basically the only card at msrp. The 3GB VRAM saved it from a hard life of slave mining. It has served me well, now it's retiring in my Plex machine.
@@peterpan408 Both. During 2017 it was pretty much the only card at MSRP, since all the GTX cards and RX 470-580s got bought up by miners. During the second boom this card sold pretty cheap on the second-hand market, since the DAG size increase shut miners out, and people sold theirs to move to the used cards that miners were offloading.
A buddy of mine bought these in bulk from a miner for $25 USD each. At that price they're perfect for slapping in a cheap Optiplex or older custom built PC. Getting 1080p low out of a card that costs less than a nice dinner is a bargain in 2023.
Well, I have a GTX 970 with 4GB of VRAM, although the card can only effectively use 3.5GB, and I am actually quite pleased (got the card very cheap). I can actually play many games on medium graphics settings, such as the "new" CoD MWII. But I guess the GTX 970 simply has more performance (overclocked) than the 1060, and of course a bit more VRAM.
@@stefanpavicic6277 I don't know about the GTX 980M, but with the 970 there was a design issue where the last 0.5 GB of VRAM was slower than the other 3.5 GB, so if you used over 3.5 GB, the card would slow to a crawl because it tried to use slower memory that couldn't keep up. There was a lawsuit, and people who bought a 970 were given a small refund.
@@firstnamelastname-oy7es Whoa, that is quite interesting. I didn't know that, but coming from Nvidia it doesn't surprise me (pardon my English).
@@stefanpavicic6277 Well-behaved software is supposed to leave some VRAM over for the DWM (desktop window manager) and other processes. All VRAM allocations by a single process are usually limited to about 81% of the full VRAM present, as noticed by CUDA users, though depending on the API used there may not be an actual limit applied, or it can be 90% instead of 81%; it's not well documented. Anyway, this is why the GTX 970 trick, where the memory ended up partitioned into a main 3.5GB and a slow 512MB, was this pain-free in practice.
I had the GTX 1060 3GB in a Dell prebuilt that I upgraded from (I know I shouldn't have bought a prebuilt), but it's good for older games like The Settlers 3 and AvP2, even CS:GO. After that I saved up for a custom rig from PC Specialist; now I'm running a Ryzen 5 2600, 16GB of RAM and a Palit single-fan 2060. I only play at 1080p, so I am happy with my current setup. PS: sorry to have rambled so much.
I have a GTX 1060 3GB. I spent around 90 Canadian dollars on it. Great for boomer shooters and big budget games from several years ago, like Gears of War 4 and Call of Duty WWII. My main PC has an AMD Radeon RX 6600.
I was rocking a 2060 3GB GPU (in a laptop) for quite a while before my most recent upgrade. I never ran into a game I wasn't capable of running. VRAM is important, but definitely not as big of a deal as people say.
It depends, but your scenario is a laptop. Even some DX11 games during the PS4 era like Dark Souls 3, the VRAM usage was more than 4 GB at 1080p Ultra. Shadow of the Tomb Raider at 1080p Ultra was more than 6 GB VRAM. 3 GB VRAM was more than good enough for the early era of DX11 and overkill with DX 10 and older. It really depends...
This is exactly what will happen to the 4060 and maybe even the 4070 cards, sooner rather than later. The GPU power is enough to run games at lower settings somewhat okayish, but the missing VRAM leads to all kinds of problems.
The 4070 should really be fine for a long while; 12GB should be enough, not ideal, but enough, especially for 1440p. We won't see another big spike in VRAM needs until the PS6 comes out, which is many years away still. The reason all the drama over VRAM has sprung up is that consoles just upgraded, and this happens every time consoles upgrade. Current-gen consoles have basically 16GB of shared video/system RAM, so 12GB really should be fine for a while. The general rule of thumb is to never have a GPU with massively less VRAM available to it than the bare minimum line set by consoles.
I still have a 3GB 1060 kicking around. It was OK when it was new, but time has not been kind to it. It's great for older titles, but as you said, games these days aren't made for low-VRAM configs, and the sacrifices required to get playable fps are ridiculous. If I want to play Minecraft, I'll load Minecraft, but when it's hard to tell the difference between Minecraft and something like Hogwarts, it's too much for me.
I mean, did you watch the video? The sacrifices were minimal and still ended with OK performance in most of the games. The only issues were Dead Space needing FSR, which looked fine but had stutters, and Star Wars, which needed FSR but looked decent and ran well. It's coming to the end, sure, but you can still play various new titles and have them look OK. It's not like he turned all the games into 240p.
Still using an Asus GTX 1060 3GB in my secondary PC for light gaming. Still pretty decent for competitive games like Valorant, but AAA games will struggle a lot unless you play at a lower resolution and settings.
I really love the untextured trees in that Halo game there... I actually wish they would make a cartoony Halo game with nonexistent textures like that.
I first read the title as "When you have 32GB of RAM in 2023..." and was confused and worried. Surely I hadn't fallen that far behind the curve on that front yet?
Still using my R9 290 4GB from 2015, and videos like this only reaffirm that I've no reason to upgrade, other than maybe for VR. Me back in 2015 thought my system would be dead in 7 years, lmao. Love my "old" PC.
I didn't know that this video was about the 1060. It's a surprise, given that I have a 1060. However, the one that I have is the Superclocked version, which has 6GB, instead of the 3. For what it is, it works like a dream, even to this day. I can run games at medium and high to reach around 45-60fps (depending on the game).
The 1060 6GB is the original card and very nice. SC or not SC is a tiny difference, no matter, though EVGA is always pleasant and fairly reliable, so a good choice. The 3GB is Nvidia's cost-cutting scam released quite a while later; it's missing so much capability that it should have had a more distinctive name. You find many people not paying attention, or not understanding the difference, and ending up with a much inferior product. DX11-heavy testing and not enough warning from the press are also to blame. Nvidia's scammy naming goes on ever more brazenly: the 3060 (12GB) got silently supplanted by the 3060 8GB, which is missing a bunch of hardware and loses 20% performance, making it compete with the original 2060 (6GB) and not always winning, and leaving it well short of the 2060 Super, at a measly 30€ discount from the real 3060. The full 12GB version of the 3060 is very nice, though! Sometimes there are nice surprises: the 2060 Super is not a 2060, it's effectively a more power-frugal 2070. Very good. I will not complain about over-delivering.
Hey, I've still got this card! Bought it in 2017 when money was very tight, and it's served me well. Looking at the more mainstream RDNA3 cards and the RTX 4060, then I'll think about changing.
As an owner of a GTX 1060 3GB, I can say I've been absolutely satisfied with it until now. Especially if you are willing to use mods or tinker with config files for some games, it can produce very decent results. It did so well that I decided to hold on to it and transferred it to the new PC I bought last year (Ryzen 7900X), since GPU prices were still stupidly high in my country at that point. That said, it is on its last legs and I have already ordered a new GPU, but I did play and enjoy Hogwarts Legacy and the RE4 remake just fine with it (with some tinkering necessary).
Hey, I've been watching your channel for a while now. I'm giving my buddy my 3060 and saving up for an RX 7900 XT, so I'm going back to this card for a bit. Thanks for the vid!
I knew that 3GB Nvidia cards were in trouble when I tried to play Fallout 4 with the high-resolution texture pack on a GTX 780 Classified. It worked okay outside, but once I got inside a building there was a fog to hide the low draw distance. It looked like arse!
This is my exact GTX 1060. It's been a great card for the past few years, but I'm looking to finally upgrade to something with a bit more power and VRAM. This card will still be used in a home server when I upgrade; it's been good to me.
@@milosstojanovic4623 Wrong. There are many titles that can run at 1080p60 with medium-high settings; it's the poorly optimized triple-A games that give the impression that capable hardware isn't enough anymore.
@@alexg9601 Yeah, for sure, buddy. Try any of the games from 2021, 2022 or this year on high settings and see if they run at 60fps. That card is barely a mid-range card, even lower than mid-range. With the exception of some indie titles, AAA titles can't run at 60fps even on medium without reducing some settings even lower, and with aggressive FSR.
@@milosstojanovic4623 FSR set to quality is way more than enough, and I tell you this because I have an RX 580 myself :) It's just that most PC gamers are elitist AF and can't stand the idea of someone not having an RTX 4090 with a 4K 360Hz monitor and getting 1000 fps in everything. The RX 580 is still fine and very capable, not "barely" mid-range; save that for cards like the 1660 or 1650.
Still running this card in 2024! Honestly, because I'm only playing older games at the moment, I'm well off. However, I do want to play Cyberpunk for instance, and I'll need to get a new card someday even if I have barely any money left most of the time. I don't want to play Cyberpunk with bad settings. I don't want to do that to myself.
My first GPU in a PC I built from scratch. God, do I regret that purchase. I moved to an 8GB card within 1.5 years and stayed there until I picked up the 3080 Ti a few months ago.
It's good nvidia learned not to cheap out on the RAM on new cards... OH WAIT 😂
@@ZeroHourProductions407 8GB was midrange in 2017.
Your point, and blaming devs in general, is stupid.
Geordie, I'll learn to trust you when you learn forgery is a crime, punishable by blah blah.
I'm just kidding. Nvidia has stuck us with more or less 4-8 GB VRAM for the past 4 years (excepting the flagships, which cost as much as actual ships) and it's embarrassing.
Look at Intel's $350 Arc card with 16 GB of VRAM. VRAM is not expensive, but I see tons of simpers in comment sections saying "oh, Nvidia is just making smart business decisions by ripping off their customers."
Yeah, well, those smart decisions caused me (a 20-year Nvidia fanboy) to jump ship. When I have to replace my current GPU (hopefully in the year 2030 😅) I'm going to be looking at Intel's Battlemage.
@@Boogie_the_cat heyyy, look, when I built my first PC those were actually options lol: 16GB, 32GB and 64GB SSDs
5:08 Frametime is always more important than fps IMO. I can tolerate low but stable fps but not stutters.
Same.
I can, situationally. Like I was fine with a low frame rate in Elden Ring before I upgraded, because I knew that it's capped at 60fps anyway, and I wasn't going to lose any fights I was remotely prepared for by running at only 45fps on my old card. Seeing that it was steady, and wouldn't dip on me when some giant enemy swooped down out of the sky, was more important.
For those twitchy FPS titles, of course, I was looking for 100fps or better. Very fortunately, COD:MW, Cold War, and Doom (2016) were all incredible performers on even that old low-profile GPU.
Totally agree. A stable 30 fps is better than an unstable 30-45 fps.
I think 30 FPS just feels too laggy, but I agree that prioritising consistent frametime is more important.
As a virtual reality player, I gotta agree
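The "stable frametimes beat raw fps" point made in this thread is exactly what the common "1% low" metric captures. A minimal sketch, with made-up frametime numbers purely for illustration, of how average fps and 1% lows are typically computed from a frametime trace:

```python
# Hypothetical frametime traces (milliseconds per frame).
# "steady" averages roughly 33 ms (~30 fps) with almost no variance;
# "spiky" averages faster overall but stutters hard on 5% of frames.
steady = [33.3] * 99 + [34.0]
spiky = [20.0] * 95 + [120.0] * 5

def avg_fps(frametimes_ms):
    # Average fps over the whole trace: frames rendered / total time.
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    # 1% low: the fps implied by the slowest 1% of frames.
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:n]) / n)

print(avg_fps(steady), one_percent_low_fps(steady))
print(avg_fps(spiky), one_percent_low_fps(spiky))
```

The spiky trace wins on average fps (40 vs ~30) but its 1% low collapses to single digits, which is the stutter the commenters above say they can't tolerate, while the steady 30 fps trace has lows right next to its average.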
I always felt that the 3GB VRAM buffer, together with the cut-down GPU, made the GTX 1060 3GB a bad deal, especially considering that the RX 480 8GB had an MSRP of just 30 USD more and performs a lot better these days relative to even the GTX 1060 6GB.
If you were young and foolish like me, you bought the 1060 because there were A TON of nvidia fanboys to skew your decision.
@@burtiq Nvidia's mindshare was very strong back then (and still is though these days it feels like more people are willing to consider AMD). On one hand going with what people around you recommend makes sense and is often an ok choice but it can backfire if those people aren't well informed themselves and/or care more about you being on "their team" rather than you getting good value for money.
@@burtiq
I bought my old 1060 in 2017 because even though I wanted an RX 580, I couldn't get one for less than £330 due to miners, while I could (and did) buy a 6GB 1060 for £180. Nothing to do with fanboyism; you just can't justify paying that price difference.
@@darthwiizius I bought my 1060 when they were $20 apart, purely because of fanboyism.
And the post was about the 480, not the 580.
Wasn't this card considered a great value back then? 🤣
Me with my 4GB of vram feeling superior
😂
*With a 1050ti*
Yeah, as I have a 290X on my 2nd setup.
Yes, me with my GTX 750TI 2GB 🤣💀
Me here chillin with gt 730
It's had a pretty long run, but nothing lasts forever. The performance is just borderline acceptable in the reviewed games. I caved about a month ago and picked up a cheap RX6600 to tide me over. (Very happy with it, tbh) Hopefully the situation in the sub-$300 area improves in the coming years.
The 6600 is the new budget king
My first ever dedicated GPU! Had a Palit Dual model, which had kinda mediocre cooling. Served me well until I bought an RX 6600, then got donated to my cousin to upgrade his ancient R7 250X, and oh boy, is he happy. He mainly plays esports titles and some older MMORPGs, and it serves him well too.
That's the thing though: the card doesn't lose functionality as it ages, it just stops getting new functionality. If you think about how much software this card runs, it represents untold thousands of games and apps. I have an ancient 7.1 sound card in my computer to output to my amps; it sounds muscular compared to the more clinical modern devices, and it's a huge upgrade on the onboard audio chip. My point, you ask? I literally pulled it out of someone's e-waste for £0.00. Its function is fixed and that's fine by me, so it'll run until it dies (3.5 years so far). We too often throw out stuff too soon.
The Palit Dual's cooling is indeed meh. I undervolted my GPU and now it's okay-ish, but without the undervolt the temps were often in the 80s, causing the GPU to throttle.
*cries in R5 graphics*
The RX 6600 is such a great value card right now. Gonna get one in a few weeks. The upgrade is going to be soo good lol.
Dual-fan coolers have sucked for about 5 years now. Triple-fan coolers run way quieter at lower RPM than dual-fan ones.
Had the 6GB version; pretty good for the time, and the resale value was nice when I switched to a 1070 at almost no cost.
I used one (Zotac single fan) for 3 years and all it cost me was some thermal paste and elbow grease (bought it used so re-pasted it then again a day before I sold it), got exactly what I paid for it. Best value card ever, didn't even use much electric.
There are several versions of this card, there's one "good" 6 gb card and a "bad" one, same with the 3gb variant, and there's even a 5gb version.
Went from a 7-year-old G4560 + 1050 2GB build to a 5600X + 3060 12GB build and it's such a massive difference. The old build can still handle esports titles well, but not without tinkering with a bunch of settings.
I imagine that the g4560 was also a massive bottleneck for the 1050
Currently I'm stuck with an R5 2600 and GTX 1050 while waiting for new GPUs to come out. I have probably maxed out what this setup can do with system optimizations + overclocking and... it works (just don't expect it to even launch something new).
@@checkyboxbruh that's not even that bad, I have an i7-4700 and GTX 750 Ti 😭, my R5 5600 and 6750 XT are coming tomorrow tho 🎉
@@RotcodFox Without a doubt. Had an i3-6100 combined with a GTX 750 Ti and the CPU was the bottleneck, and this was at least 3-4 years ago. Replaced the CPU with an i5-6400T and it was a major step up with regard to consistency of performance.
@@obi-wankenobi8023 How is your experience?
Great content as always. It just occurred to me: maybe you could make a video comparing modern low settings to old ultra settings. That'd be interesting. It's infuriating how modern low settings can look much worse than older games set to ultra, yet require more VRAM and processing power to deliver that inferior experience, while the older titles run much better.
We already know the result between most remasters and originals (because why would they do a remaster unless it's a cash grab), but something like Portal 1 vs Portal 2, Just Cause 2 vs 3, Trine 2 vs 4, Doom 2016 vs Eternal.
@@RadioactiveBlueberry Portal 2 is a very beautiful game, and the highest settings can be run on the new RDNA2 iGPUs, so clearly the performance hit isn't very high.
I just wanted to point out that reapplying thermal paste did not help the card in Halo; it was the drivers. Actually, your temperatures jumped up 10°C after reapplying paste, likely because the drivers were actually letting the card get its workout. But it running so much better while being 10°C hotter shows it wasn't the paste, and it probably would've been better to leave that out so newcomers don't think slapping on paste will fix their driver issues. Besides, 58-59°C is VERY cool for a GPU; not sure why thermal paste was even a thought.
I actually had this card until a few days ago, when I upgraded to a used 5700 XT! I must say it served me well despite how much everyone hated it.
still use it, and I just ordered a used 3070 ti today, gonna be a hell of an upgrade
I had a 3 GB EVGA 1060 as one of my very first cards. It eventually found its way into a Christmas PC present for my niece and nephew a couple of years ago.
same, had a 3GB Zotac
Same thing's going to happen with the 3060 Ti and 3070/3070 Ti versus the 6700 XT and 6800/6800 XT.
Once the Nvidia cards start getting older, the performance gap will increase and favour the larger-VRAM Radeon cards; we're already seeing the 6800 beat the 3070 in RT.
@@yahyasajid5113 comparing a 3070 to a 3gb 1060 shows you have no clue what you’re even talking about
@@dylanzachary683 not sure which comment you're reading, clearly I drew no comparison of the 3070 to the 1060
I was comparing it directly to the 6800
@@yahyasajid5113 The thing is though, once those cards get older at least the Nvidia cards will still work
I suspect this card could live its best life as an encoder for a home media center OR as an emulation GPU for older console games. Most old consoles had very little VRAM to begin with, and if you stick with 1080p or lower, you may have a great deal of fun.
Jesus, you are really on the ball; every time I think of a video you should make, you are already hard at work. Never stop making these awesome videos. PS: I really want to see you review the old Quadro K6000 or the Maxwell Quadros, since those had a massive amount of VRAM for their time, and even by today's standards, and I really want to see how they do in recent games.
i'll take a look :)
Recently I got a Tesla M40 at home, but a workstation conversion is needed.
Didn't actually expect that card to run anything modern; count me surprised. Although the end result does look like an oil painting or a slightly melted wax sculpture, it runs, so that's a definitive win.
Big fucking facts. I just bought a 7900 XT 20GB for $800 and tbh I don't need it 😂 I was surprised, wow.
It can run GTA V and BF5 great, but kinda sucks at BF2042.
Passed on a 1060 3GB to another UA-camr (Mikes Unboxing) to see if 3GB could still kick it today. Yep, still does the trick (with some graphical compromises).
I upgraded from this exact model of 1060 in December to an Arc A770!! Honestly it still performed pretty well; I was even able to get Star Citizen to 40 fps on low. 😂 Very surprising to see the Gigabyte model when the video started, thought Random Gaming stole mine 🤣
How is the A770 working out? I am thinking of getting one.
The most impressive thing in your comment was the fact that you willingly bought an Arc GPU.
@@NoThisIsntMyChannel Arc GPUs aren't that bad, they're just not polished for older games.
Nice which one did you get?
I really don't get the criticism against FSR and why it's considered inferior to DLSS. FSR is saving a lot of cards out there, giving their owners the chance to enjoy even the latest games to an acceptable degree.
It's an "anti planned obsolescence" technology and we should be grateful to have it.
I finally got hold of some games that have FSR and I thought that 'Quality' and 'Balanced' looked close enough to native to use it without issue. Sure, it's not as good as DLSS and sometimes XeSS can have the edge. But FSR 3.0 is up next and promises to improve on a lot of the issues. At the end of the day, it works on Intel/AMD/Nvidia and costs us not a penny to use for extra fps. Some game devs implement it far better than others, seems to be the problem.
Fsr definitely more friendly to the pocket
True, I mean if you are budget-constrained you likely have a small monitor, where the image is not that bad unless you put it on Ultra Performance mode.
I traded this exact card for some tree. It served me well, and that was some damn good weed for it.
😂
Still rocking 1060 6gb i7 laptop 🔥
Love the way you make videos for your viewers and not for views or anything else. Keep up the good work, mate ❤
thanks :)
I upgraded from a 2GB 750 Ti to a 3070 and it's already considered low-VRAM lmao
haha, I'm going from the 3GB 1060 to a 3070 Ti, and I'm so curious how long it will hold up; the 1060 lasted 6 years and is still going strong
3:13 this pop-in jeeeez
I still use this exact card and am pretty pleased. I don't play very many new AAA games, so my opinion is likely very skewed compared to other PC players, but I'm okay with running expansive games with all settings on performance or low. If you're like me and mainly play games released before 2019, this card is highly recommended for the price, but it is barely cutting it in 2023 and will soon be near unusable.
This video came out just after I built my new PC with a 4070 Ti, and although the upgrade from my old 1060 3GB was a night-and-day difference, I only started to feel like my GPU needed an upgrade in the past year. Up until then, the card held up pretty damn well.
The upgrade from my GTX 660 to the 1060 3GB at the end of 2018 was huge; got it for 120 EUR. Soon the low VRAM started to irritate me, so I bought a 1070 in 2019 and used it until a few months ago. Whole different story!
I had the 1060 3gb for such a long time. It's good to see that it can still run games!
A few updates ago, I watched a 1060 3GB and a 7970 3GB running Hogwarts Legacy. The GeForce had good frames at 1080p FSR Quality, but the textures were often completely missing, leaving just a hole in their place. The 7970, though, didn't have that problem, as anything above 720p FSR Balanced was too much for it. I tried it on my R9 380 4GB and was pleasantly surprised: while it couldn't hold 60fps in its dreams, it was still pretty stable gameplay at around 40-50 (can't remember now), with all the textures looking fine at 1080p FSR Quality. It was a good gaming experience. So I'd really like to see that comparison with a 1050 Ti; they might just be too close in many situations, if not in frame rate, then in stability and/or overall looks.
I'm rocking the Asus TUF FX505DD laptop with a Ryzen 5 3550H and a GTX 1050 3GB in it, and to be fair, the 3GB VRAM limitation doesn't affect me at all, since I don't own any games from past 2021, with the exception of Sons of the Forest, and even that runs absolutely okay.
I have one of these cards, but last year I replaced it with an RX 6600 because newer titles, mainly Cyberpunk and Escape from Tarkov, struggled on it.
Cyberpunk just ran hard against the VRAM limit and then some, so minor stutters accompanied most long drives, even if the average framerate was about 40 otherwise.
And Tarkov stuttered less for the most part, but it's too competitive to even live with those struggles, especially since the VRAM kept me from raising LOD to max. That matters because any magnifying optic (even ones with optional magnification set to 1x) WILL set LOD to max on the fly, causing the card to suddenly load in a bunch of stuff that wasn't getting rendered before, freezing the game for a split second. Right when you want to shoot someone.
Strangely enough, the VRAM never bottlenecked it and the GPU usage was always glued to 100%; it just put a hard limit on how much could be loaded in at any point in time. It still overclocked nicely and was well cooled.
Currently still using this card; it does me well. My CPU (i5-4670) is holding me back more in some of the more modern games I play.
With spare parts I strapped an RX 5700 XT to the same CPU, and I shit you not, in games like Cyberpunk, The Outer Worlds and Spider-Man, the card is maxed out.
Bruh, I remember when 2gb was considered high-end
I remember when 512mb cards were over $300.
In the 2005-2012 era probably.
@@NikosM112 thereabouts; the first 2GB cards were released in 2008.
I still use a 2GB GTX 960
I remember 2012, when I bought my HD 7770 with 1GB, I felt 2GB was overkill :)))
The GT 440 3GiB also had an impressive amount of memory for its time, especially for a low-end card from five generations before the GTX 1060 that sits lower in the stack.
Time flies. Nowadays the 1060 is a low-end GPU.
To be fair, its life as a mid-range card lasted like 4-5 years, when GPUs like the GTX 960 went from mid to low in 1-2 years :(
@UnjustifiedRecs the 6GB version cost over $300 in 2016, pretty much midrange.
I would have said it was a low-end card at its time, but this card is still my daily driver and it still plays COD somehow.
1060 3gb here. I have yet to find a game worth the upgrade.
Im still playing on mx110 which is 2gb vram and i am happy
I'm still trying to scrape by with this thing even after a new build a couple of months back, but by the end of the year I'm going to have to say goodbye to this card that got me through so much in the past three years. Thinking about its older brother, the 2060 Super.
I had a 2GB 760 up until recently; power was never its issue, but holy crap, that VRAM really became a problem in the last year or two.
I miss the good old days when you could use 700-series cards as your main GPU. I had a 780 Ti for about 2 months as my only card, after selling my 2060 Super for basically what I bought it for and preordering another GPU at only a little over MSRP. It served me well, but since the 700 series isn't supported anymore, you can only use it for older games (although, due to the 3GB of VRAM, at medium/low 1080p), and for new games, probably a lot of settings on low.
@@tilburg8683 What surprises me more is that until 2020, with a GTX 750, you could play any new game without problems and didn't need 6-8GB of video memory for a comfortable FPS.
That's my config today! A 12400F paired with my glorious GTX 1060 3GB. I plan to upgrade to an RX 7700 XT in a few months.
Just get a 6700 XT. The 7700 XT and 7800 XT are apparently barely going to be an upgrade, and that's the main reason AMD hasn't released them. They already released the 7800 and 7800 XT, but lied and called them the 7900 XTX and 7900 XT.
Nice! I still have the 12400f too. It’s incredible
Get an RX 6800: 16GB vs 12GB on the 7700 XT. They should be even in performance; the only difference would be power usage.
@@noticing33 yeah, it's my second choice... I'm waiting for a sale.
Still rocking the 1060 6GB; the only real problems seem to be driver-related. I want to upgrade, but I bought a premade last time, when they were about the same price as the GPU alone, and I know I'll probably need to upgrade my PSU, which is a large part of the cost of a new system reusing some of my parts. I think I'm stuck until something goes out; I'm not a heavy gamer, so I still have some older game options.
Costco has had some screaming deals on prebuilts, off and on, since the end of 2022, some even with a 3060 12GB for $800 all-in. So it may be worth keeping tabs on that, in case they list a deal that undercuts a build or doesn't cost much more than an upgrade.
What driver issues are you having? There is a unique Vulkan driver bug which kills Yuzu on the 900 series, but I'm fairly certain the 10 series is not affected; they had a similar bug but fixed it.
It's amazing to see how much more my card, the GTX 1060 with 6GB, can do compared to the 3GB version.
I never did understand the 3GB choice for this card. All I can figure is that there were a bunch of 512 MB memory chips floating around for really cheap.
Seems like Nvidia should have reused the GTX 970 memory setup or something, 3.5GB (+0.5?) or whatever haha
@@FacialVomitTurtleFights I think some of the 970s had full 4gb, at least mine never showed 3.5 on any games nor did it slow down when going over 3.5gb usage. Pretty pleased with it and definitely gainded itself a place on my shelf even after it's retirement, can't say the same for my 4690k, that poor thing didn't age so well
@@joey_f4ke238 They had 4GB of memory, but the last 0.5GB was slower than the first 3.5GB.
Google something like "970 vram 3.5gb?" to find out more.
@@FacialVomitTurtleFights That cannot be done, I don't think. The 970 uses the same chip as the 980 and the same physical memory bus layout, with 256 data pins, but they split the bus into a 224-bit bus and a 32-bit bus to artificially make it slower, and there's chip-select logic so the two cannot run at the same time. If you wanted to reuse the chip from the 1060 and split its 192-bit bus into two chip-select buses 96 bits wide, one carrying 3GB and the other 1.5GB, for a total of 4.5GB and some cost saving, it would also necessarily become horribly slow. The 970 just about works out OK, since the last half-a-gig becomes effectively idle, just storing the desktop window manager, and it's still faster to have that slow half-gig than to fetch data via PCI Express. As far as games are concerned, you're missing maybe 200MB of fast VRAM compared to a full 4GB card, or they might not even have a metric to detect the difference. I wouldn't anticipate seeing this trick done again, especially now that they can just lock down performance because the firmware is hardware-signed.
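For anyone curious about the numbers behind the split-bus discussion above, the bandwidth gap between the 970's two partitions is simple arithmetic. A rough sketch, assuming the usual ~7 Gbps effective GDDR5 data rate for that card:

```python
# Peak memory bandwidth = (bus width in bytes) * effective per-pin data rate.
GBPS = 7.0  # assumed effective GDDR5 data rate, Gbit/s per pin

def bandwidth_gb_s(bus_bits, gbps=GBPS):
    # bus_bits / 8 gives bytes transferred per cycle of the effective rate.
    return bus_bits / 8 * gbps

fast = bandwidth_gb_s(224)  # the 3.5 GB partition: 196 GB/s
slow = bandwidth_gb_s(32)   # the 0.5 GB partition: 28 GB/s
print(fast, slow)
```

So the slow half-gig runs at a seventh of the main partition's bandwidth, which is why, as the commenter says, it mostly just parks rarely touched data, while still beating a round trip over PCI Express.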
I think the reason they made the 1060 3GB is that Nvidia's problem is they sometimes release something REALLY good and then have a hard time competing against themselves, with people just not upgrading for 3-4 generations. So they release cards with deliberately not enough VRAM, even if they have to scam consumers into buying them by reusing a product name and not differentiating them well enough. If a card is too slow, you just dial down the settings; but if you have a DX12/Vulkan title that needs a base amount of VRAM you don't have, well, that's game over. Planned obsolescence in the truest sense of the word.
Back in 2016 I bought a 6GB version of a GTX 1060 because even at that time, seven years ago, 3GB were... discouraging, to say the least.
Thanx for another great video , i always look forward to your content , cheers .
I bought the $87 2GB and it seems to work fairly well.
Back when I was building my first ever PC (I am still using it), the 1060 3GB version was, due to discounts, cheaper than the GTX 1050 Ti, which had been my only option for the budget.
So I am using the GTX 1060 3GB version with an i5 9400F.
It ran pretty well, at least for what it cost me (the total build cost around 45k INR, which, including all taxes, should be around $450).
Tough choice, but a good discount. The 1050 Ti is pretty slow, and the 1060 3GB is too VRAM-starved. I decided to sidestep the choice and get the 970, which has plenty of VRAM but other gremlins: driver quality has been suffering from unique bugs lately (Yuzu is broken), and it doesn't do variable-refresh displays. Got it for 80€ used in 2019. I'll be replacing it now; it's been becoming less reliable in spite of a repaste/repad, and then the fan controller failed, but it's nearly 10 years old, so, fine I guess? Aiming for 160€ for a 2060 Super, which will take some hunting, but if I get too frustrated I can grab the standard 2060 for around 130€. A 3060 12GB is the dream but well outside my capability.
I remember back when a 1060 3GB was your main system GPU, doesn't seem like long ago but it was. I think I first started watching in the G3258 days.
The hardware requirements of these new games are inexcusable, considering the graphical fidelity of those games... or lack thereof.
Yeah, especially on the raster graphics side.
Lots of unnecessary panicking by youtubers 😁 Games are first and foremost made for consoles, i.e. the PS5, which has 16 GB of unified RAM and VRAM to run games at 4K. Therefore, if you plan to play at 1080p low, 4GB of VRAM (or even 3, as in this case) would be enough, especially if you have 32 GB of DDR4 RAM to help it, as DDR4 is cheap.
Would like to see how RX590 or Radeon VII would do in 2023. They had 8gigs of VRAM too if I'm not wrong
I just replaced my GTX 970 Strix with an RX 7900 XT Hellhound.
In the end, the memory was the thing holding it back the most.
I can't imagine having only 3 ☠️
And now developers are like "yo, we need you to install an nvme in your gpu so you can run our game at 480p"
That's EA for you...
Wtf what games are these?
My first PC build was at the end of my undergrad and used this card with a Ryzen 3 1200; it was a midrange dream until Shadow of the Tomb Raider stuttered like a madman. It was a great time though.
According to some benchmark pages, an RX 6500 XT would be just as fast, but considering the negative reviews when it launched, you might be able to get one for less money. And even if not, it probably uses less power and it's a lot newer.
Man, this R9 290X I got about a decade ago is still kicking, and I don't know how, but I'm not going to question it.
I have a GTX 1050 3GB version in a laptop, and I can say it is very good; it surprises me almost every day with its performance. I've played Uncharted, Hogwarts Legacy, Death Stranding, newer and older Assassin's Creeds, RDR2 and a lot of other less demanding/older games. I play at 1080p medium and usually lock the newer games to 30 fps via RivaTuner, since I hate stuttering/frame drops from 30 fps. With overclocking and undervolting both the GPU and CPU, the games run very well, no dips below 30 or stuttering. So I have to say 3GB of VRAM still holds up, unless the game is very badly optimized; but then you just have to wait for patches and play it then.
Oh, and a tip for playing at 30 fps: turn off in-game vsync and the fps lock, turn on half-refresh-rate vsync in the Nvidia control panel (if you have a 60Hz display), and lock the game to 30 fps via RivaTuner.
This is why PC gaming is at its worst, because it's people like you who say "games are so unoptimized, omg, I can't even run them well on my 8 YEAR OLD PC!!" The answer is no, 3GB is definitely NOT holding up; even 4GB of VRAM is dead and irrelevant in 2023, so idk what you are smoking if you think 3GB still holds up in 2023, especially when even a non-triple-A title like Halo Infinite already shut out people with only 3GB of VRAM.
@@Eleganttf2 Cry more.
@@wyterabitt2149 you sure do
It will be interesting to see how it handles Unreal Engine 5 games.
Yeah I don’t think it’ll go well haha
I was playing Halo Infinite on a 1060 3GB since the game's release. It was a struggle on graphics, even running on LOW, but I ended up working my way up to Platinum 5 in ranked slayer, so I guess that speaks to its playability. The card had a lot of issues loading larger maps (think BTB), but I stuck with ranked slayer most of the time anyway.
The nail in the coffin was when 343 released Season 3 of Infinite, where they cut off people with less than 4GB of VRAM.
I still have a GTX 950 that I bought in 2015 or 2016. It has acceptable performance for 720p, but the 2GB of VRAM gets filled in almost every game.
"You guys have 3GB VRAM?" - A 1050 owner.
I had this card. It was much faster than the 1050 Ti, but VRAM was already an issue in 2018.
It's sad that my 760 has 4GB but a 1060 has 3GB.
I upgraded from this to a 3080 FTW3 that I got for $350. It was a night-and-day difference (obviously).
The 1060 made a decent card for my GF’s “getting into PC gaming” PC I put together though
I honestly think it's on the developers to optimize the games to use less VRAM.
I have played demanding games on laptops before without losing textures. It's ridiculous to require any more than 1GB of ram.
They can't fit those textures in 1GB, and PC hardware doesn't support the most efficient texture compression algorithms either. Watch Dogs 2 was a game that brought the maximum texture requirement up to 3GB for me... I still remember my dorm roommate finishing Need for Speed: The Run at totally unplayable fps for most of it 😂. We just had fun with what we had. And back in the day, in our primary school, we only had one PC capable of running Need for Speed: Underground... and it overheated and turned off after a while. We had fun, because to us FPS didn't matter; we were happy just to be able to run those games.
My textures looked like that on a GTX 1650 on low settings, and it was because of the high-resolution texture package that installs with the game. I removed the texture pack and it looked a lot better. I think you need 6GB to run those textures, and you have to choose not to install them, otherwise you get them anyway.
This card saved me during the crypto boom; it was basically the only card at MSRP. The 3GB of VRAM saved it from a hard life of slave mining. It has served me well; now it's retiring in my Plex machine.
Do you mean the first crypto-boom 😂
@@peterpan408 Both.
During 2017 it was pretty much the only card at MSRP, since all the GTX cards and RX 470-580s got bought up by miners.
During the 2nd boom this card sold pretty cheap on the 2nd-hand market, since the DAG size increased for miners, and people sold their cards to move to the used cards that miners were selling.
A buddy of mine bought these in bulk from a miner for $25 USD each. At that price they're perfect for slapping in a cheap Optiplex or older custom built PC. Getting 1080p low out of a card that costs less than a nice dinner is a bargain in 2023.
This guy did the first Resident Evil 4 test after the game got cracked. Guess he's a pirate; welcome to the gang.
Well, I have a GTX 970 with 4GB of VRAM, although the card can only effectively use 3.5GB, and I am actually quite pleased (got the card very cheap). I can play many games on medium graphics settings, such as the "new" CoD MWII. But I guess the GTX 970 simply has more performance (overclocked) than the 1060, and of course a bit more VRAM.
Yeah the 970 is still holding up quite well
I have gtx 980m with 4gb and it only uses 3.5gb , is there any reason for that?
@@stefanpavicic6277 I don't know about the GTX 980M, but with the 970 there was a manufacturer issue where the last 0.5GB of VRAM was slower than the other 3.5GB, so if you used over 3.5GB, the card would slow to a crawl, because it tried to use slower memory that couldn't keep up.
There was a lawsuit, and people who bought a 970 were given a small refund.
@@firstnamelastname-oy7es whoa, that is quite interesting. I didn't know that, but coming from Nvidia it didn't surprise me (pardon my English).
@@stefanpavicic6277 Well-behaved software is supposed to leave some memory for the DWM (Desktop Window Manager) and other processes.
All VRAM allocations by a single process are usually limited to about 81% of the full VRAM present, as noticed by CUDA users, though depending on the API used there may not be an actual limit applied, or it can be 90% instead of 81%; it's not well documented.
Anyway, this is why the GTX 970 trick, where the memory ended up partitioned into a main 3.5GB and a slow 512MB, was this pain-free in practice.
Had the GTX 1060 3GB in a Dell prebuilt that I upgraded from (I know, I shouldn't have bought a prebuilt), but it's good for older games like The Settlers 3 and AvP2, even CS:GO. After that I saved up for a custom rig from PC Specialist; now I'm running a Ryzen 5 2600, 16GB of RAM and a Palit single-fan 2060. I only play at 1080p, so I am happy with my current setup. PS: sorry to have rambled so much.
I have a GTX 1060 3GB. I spent around 90 Canadian dollars on it. Great for boomer shooters and big budget games from several years ago, like Gears of War 4 and Call of Duty WWII. My main PC has an AMD Radeon RX 6600.
In the Halo Infinite clip, take a look at the GPU power value. No wonder there's such a big improvement.
GTX 970 owners be like, “I got 4GB….no, wait” 🤔
I would love to see Kerbal Space Program 2 in your benchmark list, as it is literally the most graphically demanding game.
I was rocking a 2060 3GB GPU (in a laptop) for quite a while before my most recent upgrade, and I never ran into a game I wasn't capable of running. VRAM is important, but definitely not as big of a deal as people say.
I agree
Just be cautious with the settings.
It is a big deal. 4gb vram is the minimum you'd want even for modern esports titles.
@@NikosM112 not really. 3GB runs most games perfectly fine as long as you're not maxing out settings lol
It depends, but your scenario is a laptop. Even in some DX11 games from the PS4 era, like Dark Souls 3, VRAM usage was more than 4 GB at 1080p Ultra.
Shadow of the Tomb Raider at 1080p Ultra was more than 6 GB of VRAM.
3 GB of VRAM was more than good enough for the early DX11 era and overkill for DX10 and older.
It really depends...
@@takehirolol5962 That's why you shouldn't be an idiot and crank everything to ultra.
This is exactly what will happen to the 4060, and maybe even the 4070 cards, sooner rather than later: the GPU power is enough to run games at lower settings somewhat okayish, but the missing VRAM leads to all kinds of problems.
4070 should really be fine for a long while, 12GB should be enough, not ideal, but enough, especially for 1440p.
We won't see another big spike in vram needs until the PS6 comes out which is many years away still. The reason all the drama over vram is sprung up is because consoles just upgraded and this happens every time consoles upgrade.
Current-gen consoles have basically 16GB of shared video/system RAM, so 12GB really should be fine for a while. The general rule of thumb is to never have a GPU with massively less VRAM than the bare-minimum line set by consoles.
Exact reason why I went for the 3060 12GB.
I still have a 3GB 1060 kicking around. It was OK when it was new, but time has not been kind to it. It's great for older titles, but as you said, games these days aren't made for low-VRAM configs, and the sacrifices required to get playable fps are ridiculous. If I want to play Minecraft, I'll load Minecraft, but when it's hard to tell the difference between Minecraft and something like Hogwarts, it's too much for me.
Me too - EVGA. Keeping it for old times' sake.
I mean, did you watch the video? The sacrifices were minimal and still ended with OK performance in most of the games. The only issues were Dead Space needing FSR, which looked fine but had stutters, and Star Wars, which needed FSR but looked decent and ran well.
It's coming to the end, sure, but you can still play various new titles and have them look OK.
It's not like he turned all the games into 240p.
Still using an Asus GTX 1060 3 GB in my secondary PC for light gaming. Still pretty decent for competitive games like Valorant, but AAA games will struggle a lot unless you play at a lower resolution and settings.
Been running a 1050 for almost 5 years; it does decently for how old it is.
Do not overclock it; it can cause some serious problems, especially with this much usage time.
@@edi12312 Yeah, PCIe x16 slots can only handle about 75 watts.
My friend just gave me a great deal on a 6gb 1060 for my home media server. Sold it to me for $25! Looking forward to learning linux with that build.
I really love the untextured trees in that Halo game there... I actually wish they would make a cartoony Halo game with nonexistent textures like that.
Nintendo 64 Halo
Going from Intel integrated graphics to the GTX 1660 Ti that I still use today made a huge difference in my life.
I went from one of these to a 1080Ti and it was possibly the most mindblowingly noticeable upgrade I have ever made, haha.
I miss my R9 280 😭 It overclocked very well; just a shame it has no DX12.
I first read the title as "When you have 32GB of RAM in 2023..." and was confused and worried. Surely I hadn't fallen that far behind the curve on that front yet?
Still using my 4GB R9 290 from 2015, and videos like this only reaffirm that I have no reason to upgrade, other than maybe for VR.
Back in 2015 I thought my system would be dead within 7 years lmao, love my "old" PC.
Bought one for my friend. They've been enjoying it. It's a decent deal.
I didn't know that this video was about the 1060. It's a surprise, given that I have a 1060. However, the one that I have is the Superclocked version, which has 6GB, instead of the 3. For what it is, it works like a dream, even to this day. I can run games at medium and high to reach around 45-60fps (depending on the game).
The 1060 6GB is the original card and very nice. SC or not SC is a tiny difference, no matter, though EVGA is always just pleasant and fairly reliable, so good choice. The 3GB is Nvidia's cost-cutting scam released quite a bit later; it's missing so much capability that it should have had a more distinctive name. You find many people not paying attention to, or not understanding, the difference and ending up with a much inferior product. DX11-heavy testing and not enough warning from the press are also to blame.
NV's scammy naming goes on even more brazenly. The 3060 (12GB) got silently supplanted by the 3060 8GB, which is missing a bunch of hardware and loses 20% performance, making it compete with the original 2060 (6GB) and not always winning, and falling well short of the 2060 Super, at a measly €30 discount from the real 3060. The full 12GB version of the 3060 is very nice, though!
Sometimes there are nice surprises. The 2060 Super is not a 2060; it's effectively a more power-frugal 2070. Very good. I will not complain about over-delivering.
Hey, I've still got this card! Bought it in 2017 when money was very tight, and it's served me well. I'm waiting for the more mainstream RDNA 3 cards and the RTX 4060, then I'll look at changing.
Why do old games (early 2010s) run great with 3GB and look beautiful enough, but modern games run badly and look so freaking ugly...
As an owner of a GTX 1060 3GB, I can say I've been absolutely satisfied with it until now. Especially if you are willing to use mods or tinker with config files for some games, it can produce very decent results.
It did so well that I decided to hold on to it and transferred it to the new PC I bought last year (Ryzen 7900X), since GPU prices were still stupidly high in my country at that point.
That said, it is on its last legs and I have already ordered a new GPU, but I did play and enjoy Hogwarts Legacy and the RE4 remake just fine with it (with some tinkering necessary).
I really can't believe that 4x the VRAM I was using just over a year ago isn't enough for some modern games when I've only just upgraded 😭😭😭
Hey, been watching your channel for a while now. I'm giving my buddy my 3060 and saving up for an RX 7900 XT, so I'm going back to this card for a bit. Thanks for the vid!
I knew that 3GB Nvidia cards were in trouble when I tried to play Fallout 4 with the high-resolution texture pack on a GTX 780 Classified. It worked okay outside, but once I got inside a building, there was a fog to hide the low draw distance. It looked like arse!
This is my exact GTX 1060. It's been a great card for the past few years, but I'm looking to finally upgrade to something with a bit more power and VRAM. This card will still be used in a home server when I upgrade, but it's been good to me.
I would like to see a second part of this video, targeting native 720p @60 fps and 1080p @30 fps gameplay.
Would love to see a comparison with the RX 580 8GB to see how the bigger VRAM buffer helps Polaris (if at all).
In most games, the high texture option is enough to make low settings visually acceptable, so yes, 8GB helps.
It's a weak card for newer games; it can't run games at 60fps, more like barely 30fps in every game. He already tested it.
@@milosstojanovic4623 Wrong, there are many titles that can run at 1080p60 with medium-high settings, it's the poorly optimized triple A games that are giving the impression of capable hardware not being enough anymore
@@alexg9601 yeah for sure buddy. Try any of the games from 2021 2022 and from this year on high settings to run at 60fps. That card is barely mid range card, even lower than mid range. With exception of some indi titles AAA titles cant run at 60fps even on medium without reducing some settings even lower and with high FSR.
@@milosstojanovic4623 FSR set to Quality is way more than enough, and I tell you this because I have an RX 580 myself :) It's just that most PC gamers are elitist AF and can't stand the idea of someone not having an RTX 4090 with a 4K 360Hz monitor and getting 1000 fps in everything. The RX 580 is still fine and very capable, not "barely" mid-range; save that for cards like a 1660 or 1650.
Still running this card in 2024! Honestly, because I'm only playing older games at the moment, I'm well off. However, I do want to play Cyberpunk for instance, and I'll need to get a new card someday even if I have barely any money left most of the time. I don't want to play Cyberpunk with bad settings. I don't want to do that to myself.
Proud owner of a GTX 1050 Ti with 4GB of VRAM paired with a 1080p monitor. More than enough, just lower your expectations for in-game graphics.. ;-)
My first GPU in a PC I built from scratch. God, do I regret that purchase. I moved up to 8GB within 1.5 years and stayed there until I picked up a 3080 Ti a few months ago.