Yep, that's what I thought about the 2048SP previously (not this brand though). But then I realized that a 120 watt max is pretty sweet (compared to the 200+ watts of a standard RX 580). So I'm accepting the bitter truth.
Weird idea for these kind of cards... You could try rendering a video with them, to see if the extra memory can maybe somehow make it a little faster? 😅
It's crazy that my 2nd-gen RTX 3060 has more VRAM than most new cards lol. I thought I was just getting in on the ground floor with my single-fan, Amazon-special GPU, but it's actually been a workhorse, and since it's so ridiculous to get 16GB of VRAM or higher on a GPU, I feel extremely fortunate lol.
My 8GB 580, purchased last year for $120 delivered, was the single best purchase I've made in years. It seems everything else I buy disappoints. From cars to Blu-ray players, a win is hard to come by these days...
@@happybuggy1582 It won't for me. I don't play new games. I stay at least 3 years behind releases so I actually get to play finished games. This PC is used to play backlogged games that span all the way back to SNES games. Currently playing Fallout: New Vegas for the first time, in preparation for Starfield in 4 years 😂
As soon as I saw the beginning of the video, I immediately thought "the VRAM is not the bottleneck". I had a Strix RX 480 (basically the same as your 580) and used it up until a few months ago. A very good card that lasted a lot longer than it should have. It now lives in my daughter's Minecraft PC. Great video btw!
Do you think this card is suitable for deep learning? I mean, if you accept longer training times, 16GB of memory for less than $250 (I didn't find any mention of the price in the video) might be acceptable for broke computer scientists. The only issue is that AMD cards are much, much slower than NVIDIA's, because most DL libraries are built around CUDA...
Dawid, you gotta understand (and a lot of gamers have this misconception, which Nvidia takes advantage of) that resolution is not the only thing that affects VRAM usage. Texture settings are typically the settings that affect VRAM usage, but even then there are a lot of games that use VRAM for textures "smartly". In a lot of modern games, the texture setting simply affects how fast the textures are streamed in, not how big the textures are, RE4R being the best example (annoying how they put a GB size next to the setting despite that not being what it does). But that's not all: any RT settings and specific detail settings will all affect VRAM usage; it's not just "pump to 4K to use VRAM". It hurt me to see you playing games at 4K with medium settings, like what's the point of testing the VRAM if you don't change the settings that actually scale with more VRAM!
@@rustler08 TLDR: 4K resolution is not the only way to increase VRAM usage. Normal settings do that too. Turning a game up to 4K but keeping the settings low will not use too much VRAM.
I get you're trying to be clever, but not enough to prove him wrong. It ultimately varies based on the methods game developers used to make the game. Doom Eternal and RDR2 pre-cache everything into VRAM to prevent stutters. Tiling is used with DX12 and, as you say, is a smart way to use VRAM; it does, however, require some advanced yet troublesome prediction or scheduling to prevent stuttering. There are many games that simply pick textures based on resolution, so Dawid is completely fine in his video. The game is not going to load higher resolution textures for a lower screen resolution; there is zero point. On the flip side, a game might default to 2K textures or higher when set to 4K resolution.
The added memory is likely for crypto mining. The RX 580 was a mining powerhouse for a while, but only the 8GB version, because mining uses a crap ton of VRAM. 4GB just doesn't cut it. So 16GB would probably have a pretty big effect on your hash rate.
I had this card back in 2018-19 on a 9400F and it was really good for the price. Bought it on eBay for $200 at the time. Gigabyte, I think. I could play MEA on high on a 4K 55" LG TV. Old tech should get better as software improves, pretty neat.
I love that people are resurrecting old tech with upgrades. In my opinion, manufacturers should make things like RAM upgradable, and CPUs usable on any platform regardless of mobile or desktop. High-end mobile CPUs used to be a killer option for passively cooled home theater PCs, etc., and if you could get a high-end mobile GPU on a card, you were set for the best experience.
You should totally test Stable Diffusion, people over there are crying for more VRAM. Would love to see if this card can run SDXL where new cards at 8GB/11GB are struggling. If an RX 580 16GB can run SDXL perfectly fine, then I'm certainly getting one.
I doubt Polaris is compatible with SD at all. It might run with a custom build, but probably only at 0.5 it/s or something useless. I have a 16 GB Radeon Instinct MI50 (modded to Pro VII), and it can hardly hit 2 it/s with the AMD build. You really need an Nvidia card for SD, or one of the RX 7000 cards. The cheapest option is the CMP 40HX at ~$80, which is the mining version of the RTX 2060S/2070; it can hit 4~5 it/s using single-precision (its half-precision is crippled), or about 75% of an RTX 2060S. The value-performance option is the RTX 2080 Ti with the 22 GB mod, but make sure you buy your own VRAM chips and find a reputable technician to do it.
Dawid, you should try playing the 7 days to die game. If you have a save with a lot going on, that old game uses a lot of VRAM (10gb is not even enough at 1440p). You can use that to compare GPU VRAM.
Unless you're running out of VRAM to hold even reduced textures, adding more VRAM doesn't generally increase performance UNLESS you also add a wider bus. However, adding more VRAM can allow for farther LOD fade and higher textures throughout.
It'd be interesting to see how such a card works in something like DaVinci Resolve. Comparing it to, say, a 570, which is what it's based on, would be a fairer test.
I actually love the 2048 cards, but that's mostly because, when looking for most performance possible with fewest dollars spent, I don't think anything beats it. Where else are you gonna get a $45-$80 video card that is that powerful? While certainly a little mock-worthy, I think the card really shows how capable it still is, and is a clear win for absolute budget gaming PC builds!
Memory is used for generative AI. The problem is that many open source projects target Nvidia, and only little by little can we use AMD GPUs. This could be a great card, not for games, but for other applications. But... the software is not there yet.
@@jomeyqmalone Some may, for budget stuff. Rendering works too. Also, video editing with e.g. DaVinci Resolve can benefit from this large framebuffer. I mean, the next competitor at that price in terms of VRAM is probably the Intel A770, which is quite a gap though.
I have an MSI Radeon RX 580 GAMING X 8GB card. It has served me well all these years. It is retiring at the end of this week when I build my new computer. I find it odd though, I don't know what I did, but I was getting 60 fps on High at 1080p in Cyberpunk 2077. Secret: the "Ultra" settings in a lot of games, if not all, are really not that great. You might get a 1 or 2% increase in visual fidelity. I'd prefer the smoother, increased FPS at High.
The CPU core allocation/usage and memory allocation stats on screen while playing are something I've always wanted for my son's PC. We've been running an FX 8350 processor with a 1660 Super. Recently, unlike what Geek Squad tells everyone, we actually had a motherboard failure, although it could be that the processor gave out. Either way, I picked up a Ryzen 5 5600 as an open-box deal that sorta lent itself nicely to the newer RAM and the newer motherboard. We have a Corsair liquid cooler that I hope will mitigate the heat of a slight overclock so that I can get the 5600 up to 5600X standards.
You could have reprogrammed the BIOS for a more aggressive fan curve (keeping it ~65C) and bumped the core clock speeds and even improved memory timings. That would have been an interesting experiment. I've always wondered if tightening memory latency helped gameplay. One other thing you could do by making your own BIOS is to bump the memory speeds up as well (until you start seeing ECC errors in HWInfo64).
I have a 4gb version of the 750ti from some internet sale, and it was actually pretty decent back in the day with the extra vram. It overclocked very well! This made me think of it.
Just love the humour and the way you take nothing seriously about tech. If I want serious results, I'll watch Gamers Nexus. But your channel is funny and makes me feel better. Also, I love your Anna, she's so cool, I wish we saw her more often. Cheers!
I am actually surprised how well the 5700 XT is doing right now in Starfield. Being a minimum spec, I have been tinkering with it and I can get some pretty high settings and still hovering around the 60fps without FSR. Quite surprising. Before when there would be something like a GTX650 minimum, the game would have to be run on low settings to get 60fps.
Solid choice for anyone needing a huge VRAM buffer on a budget. Linus Torvalds comes to mind. Back in 2020, his personal workstation included a 3970X Threadripper and an RX 580. Dude doesn't need much GPU power, apparently.
My biggest takeaway is that video memory isn't as important as the upgrade panic made people think. Newer games seem to ask for more but don't really do anything with it.
The 2048SP BIOS was developed so these refurbishing factories could use anything from RX 470 to RX 580 chips to make "new cards" for the mining farms. Reused GPUs and memory chips were put on new boards; GPU power wasn't very important to mining, whereas memory performance was where they got more hash rate. It's a nice cheap GPU, but there's a high failure rate with these cards, as they were heavily used and abused for so long in the mining era, from the early days till the end of GPU mining; the RX 4xx/5xx series were the most popular choice.
My understanding is that modern Nvidia BIOSes are loaded up with additional security, so it's not possible on 20-series or newer to edit a BIOS yourself and have the card accept it. You can flash a low-end card with a factory BIOS from a card with more aggressive power targets and it will take, but if there isn't an Nvidia-authorized 48gb 3090 vBIOS, it's not going to work.
@@laszlozsurka8991 What is this, a low-effort troll at Nvidia? I'm not a fan of any company, not even pro-Intel, but I expected better arguments than this.
Dawid, you did not buy a 16GB RX 580 from AliExpress. You stole it and ran and ran like Forrest Gump until you were on the other side of the world, far enough away that you got away with it, you magnificent bastard.
Interesting video, but it's a bit disappointing that you didn't test Ratchet and Clank; that game is a real VRAM hog with textures set to ultra. Anyway, thanks for your work.
Sand the edge of the chip that makes the paste thin out the most, aka the HIGH edge. It will level out the pressure and let the chip reach 90% activity. Then you can use software to AI-train its memory and power controllers to assist in memory acquisition, instead of just a big portfolio they never fill.
10:31 - It's interesting that 100% GPU usage is reported here when the framerate is being limited due to VRAM capacity, and it looks like it's not using all 8GB for some reason
I also have a Strix 580. It's actually insane how good this card is; I didn't even realize how much better its base clock is compared to the other 580s.
The vram chips were sanded down because they came from a batch that performed too low or had too many errors. Samsung or a third party would sell these on discounted to other parties, so they can do more testing (hopefully) and reclaim the good ones. It's really common and not necessarily as sketchy as it looks - but they're probably low spec.
Simply a mining card. 2048 SPs and conservative clocks to save on electricity bills for the same result. 16 GB of VRAM was useful when you mined ETH.
He's used it for testing some stuff before, but the problem is that Tarkov is still in very active development. A run now is different than it was before, and will be different from the future. Plus the game has some whacky usage on GPUs.
Still own an RX 590; it's now in my 2nd PC (my 'backup' PC). I upgraded from that RX 590 to an RX 6750 XT in my main PC. *I still LOVE that ASRock RX 590 PG!!!* Would be funny to have 16GB, or even just 12GB, on that RX 590!
All jokes aside: it is mind-blowing that a card from 2016 (yeah, that's right) performs like that in 2023. Polaris was/is AMD's masterpiece.
AMD does that a lot, it's not a unique case of amazing performance down the road.
Three words.
7970 Ghz Edition.
so does GeForce 10 series GPUs.
Hey guys, sorry for interrupting, but I'd like to ask if the graphics card you mention would be a good card for beginners? I'd love to play stuff like Hell Let Loose; even on medium that would be awesome. I'm looking for a graphics card that won't break the bank.
I was always kinda confused why people love the RX 580 but shit on the 1060 6GB. In my market they're the same price, and the 580 uses a lot more power while not being much more powerful (just 2GB of extra VRAM, but let's be honest, 6GB is completely enough for the 1060 class of GPU).
@@jonathank5841 Depends on how big your bank is, where you live, and whether you're willing to go second hand or not.
The 16 GB version only has 192 GB/s of memory bandwidth. If you could push that a little higher (a normal 580 has 256 GB/s), it would net you some significant additional performance. Polaris 10 allows for adjusting memory timings and BIOS mods, btw.
polaris is dead bc of new DirectX
@@donciutino7490 What is it? the 12.2?
@@donciutino7490 is it 12.1?
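For anyone wondering where those bandwidth numbers come from, here's the back-of-the-envelope math. This is a sketch assuming the commonly listed Polaris 10 specs (256-bit bus; 8 Gbps GDDR5 on a stock 580 vs the 6 Gbps modules reported on this 16 GB card):

```python
# Peak GDDR5 bandwidth = bus width (in bytes) x effective per-pin data rate.
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

stock_rx580 = gddr5_bandwidth_gbs(256, 8.0)  # 256.0 GB/s
this_16gb = gddr5_bandwidth_gbs(256, 6.0)    # 192.0 GB/s
print(stock_rx580, this_16gb)
```

So the 25% bandwidth cut falls straight out of the slower modules; same bus, slower pins.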
For some reason they used 6 Gbps memory modules instead of the 8 Gbps modules the original RX 580 had. This probably made the performance significantly worse.
The memory also runs 500 Mhz slower. Everything is nerfed on this card.
The RX580 2048SP is called RX 570 elsewhere in the world.
The normal 2048 has 8gb. Double the amount of the normal 570.
@@mijingles8864 There is an 8GB variant of the RX 570.
@@mijingles8864 Some of the RX 570s actually have 8GB of VRAM.
no, it is a 2048SP
570 is a different thing
We shall call it 570X then. Or even better, 570 Ti.
The 2048SP thing got me intrigued. Turns out, it's literally just the RX 570 with a small 40 MHz factory overclock on the max boost. The RX 570 is the same die as the 580 and the same architecture as the 590 (which was a die shrink).
I wish Dawid tested ML stuff; theoretically (if you ignore AMD shenanigans) this GPU could be amazing bang for buck for Stable Diffusion/LLMs. I mean, if tinygrad succeeds :D
RX 590 is actually a different die. It is made on 12nm instead of 14nm.
@@MenkoDany Only if you wanna tinker. gfx803 hasn't been officially supported by ROCm for a while. Thanks to AMD's drivers, ML speed on Windows is blatantly bad. On Linux you'll have to find a way to get ROCm going for these specific cards; then you could actually get halfway decent performance for the price. But it's not really worth the hassle if you can just buy a used 3060 12GB, or a more recent 12GB Radeon card, for not much more.
@@pfizerpricehike9747 Trust me, I know. During the early LLaMA days I desperately wanted more ram (4090 is not enough), I was this --->
@@pfizerpricehike9747 Idk about pricing in your region, but in Russia that 580 16GB costs half the price the cheapest used 3060s usually go for, while providing 30% more VRAM. For applications that need VRAM that badly, the card seems like a good deal.
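To put the "2048SP" in numbers: a rough sketch of peak FP32 throughput from SP count and boost clock, using the reference clocks (assumed here; the 2048SP's 1284 MHz is the RX 570's 1244 MHz plus that 40 MHz bump, and real cards vary):

```python
def fp32_tflops(stream_processors: int, boost_clock_mhz: float) -> float:
    # 2 FLOPs per SP per clock (one fused multiply-add)
    return stream_processors * 2 * boost_clock_mhz / 1e6

rx580_2048sp = fp32_tflops(2048, 1284)  # ~5.26 TFLOPS
rx580_full = fp32_tflops(2304, 1340)    # ~6.17 TFLOPS
print(round(rx580_2048sp, 2), round(rx580_full, 2))
```

So on paper the 2048SP gives up roughly 15% of a full 580's compute, before the slower memory is even considered.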
I bought my RX 580 8GB in 2017 or 2018 and it was great for 1080p gaming. I just upgraded it a month ago to a Radeon RX 6650 XT. I still recommend the RX 580 if you're on a budget; you can still game with that GPU.
I have an RX 580 8gb and it runs at least some modern games decently well
I just upgraded from an RX 470 just last week to a 7800XT. That thing can still game, especially with FSR enabled. That feature is such a lifesaver!
Yes, I bought my RX 580 for 45€ 1 year ago. I only upgraded to a GTX 1080 2 months ago because it was in a PC I got for 20€ used (well, it was free but I needed a 20€ wifi card).
It has an i7-6850K on an MSI X99A Gaming Pro Carbon in an old Lian Li PC-Z70.
I also forgot to mention the 7800 XT I got was sold to me for.....wait for it.....89 USD. Yes, 89 bucks. A friend I build PCs for gave me a free 1000W ASUS ROG Strix PSU and tried to sell me his 7800 XT for 409 USD, hoping I'd bite. I told him I was using a Seasonic M12II 525W PSU, so I was limited on upgrades, and I was considering the 6650 XT since it's good on a 500W PSU. When I declined, he just sold it to me at 89 USD. I've been gaming since then, replayed RE4 Remake, and everything is at max settings on my 1080p 27" 144Hz curved monitor.
@@itpugil Holy $hit, that's cheap!
My guess is that the 16gb may help on productivity tasks? Very fun video. I like that when you crank things to unusable, the 16gb is technically 3x better
Actually it's 4 and a half times better
Rather than doing the ultra preset or 4K, you could have increased only the texture setting. It wouldn't have hurt performance and would still have used most of the memory the game could use anyway.
Increasing textures reduces performance too..
@@12Rosen Not as much as everything on ultra though.
@@12Rosen only when bandwidth is quite low
@@12Rosen Not nearly as much as increasing everything to Ultra. Especially not with 580's memory bandwidth, which can handle high res textures just fine.
@@12Rosen No, it doesn't. It was tested so many times. Anisotropic filtering doesn't have any performance cost either. This has been true since like 2002 or 2003.
I still like how some graphics cards, like the 3D Wildcat back in the day (which was a CAD-oriented card), had upgradable RAM slots on the card. Then again, these cards were also not cheap, as I think they ran a few thousand dollars each at the time. That being said, I think we are venturing into rather absurd territory where graphics cards have more RAM than some systems. I used to joke (back when 64GB iPods were around) that it was sad if your iPod had more storage space than your computer.
Unfortunately, the Wildcat was a severely flawed card in many ways.
Hey Dawid, I also have one of these scalpel RX 580s. For overclocking with MSI Afterburner, it's all about the power limit settings. They're running at like 50% power capacity, so slide that bad boy up and it will hold the frequencies.
Great video. Very entertaining. My son and I are currently in the process of upgrading an old sff business machine just to see how it plays games at 1080p. Low power low profile. Thnx again.
This memory could be very useful for loading larger AI models for the same reasons that the 12GB 3060 is the budget secret weapon for AI enthusiasts.
preach
Yup!! I just learned about the 3060 myself... just picked up 3 for my AI rig. Really not bad performance for the price!!!
@GreedIsBad Slow, but no memory limitation at a cheap price. Maybe we gotta wait for Stable Diffusion on CPUs; Ryzen 7000 supports bf16 but not fp16.
You are not running any AI on AMD hardware, and not because of tensor cores, but because of awful software support and a lack of driver updates. ROCm is no longer updated for these cards.
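For a rough sense of why the VRAM number dominates this whole thread, here's a back-of-the-envelope estimate of what it takes just to load a model's weights. The 1.2 overhead factor for buffers/activations is a loose assumption, not a measured figure:

```python
def model_vram_gb(params_billions: float, bytes_per_param: float,
                  overhead: float = 1.2) -> float:
    # Weights-only footprint times a rough overhead factor.
    return params_billions * bytes_per_param * overhead

# A 13B model (e.g. the Vicuna-13B mentioned above) at fp16 needs ~31 GB,
# too big even for this 16 GB card; 4-bit quantization brings it to ~8 GB.
fp16_13b = model_vram_gb(13, 2.0)
int4_13b = model_vram_gb(13, 0.5)
print(round(fp16_13b, 1), round(int4_13b, 1))
```

Which is why the 12 GB 3060 (and a hypothetical 16 GB 580, software permitting) looks attractive: capacity, not speed, is the first wall you hit.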
The first concerning thing for me, at the 2:15 mark: the mention of PhysX. That's always been an Nvidia thing; I don't even remember what the last game to carry the "PhysX" branding was. But this 16GB version might be useful in the same way the 3060 12GB was/is: not the best for gaming, but a lot of video/animation programs take up a lot of VRAM.
Dawid never disappoints
dawiddoesneverdissapoint
Except with EPYC CPU.
God bless him
he's using AI generated images. that's very disappointing
@@targz__Its true unfortunately 🥲
Why your channel and videos haven't popped up and been recommended for me in so long is infuriating. This is literally the first video of yours I've ever seen, and I have no idea why. You are freaking hilarious, and I enjoyed this video so much. Already liked and subscribed.
I'm still running an RX 580. Told myself I'd replace it when it wasn't good enough, and it's still doing everything I need it to do.
Same, got one inside a 'puter I built back in 2013; the CPU is bottlenecking it. I rarely game on it as I have a better PC for that.
Davvid is 😎 🫘
Same here, I watch a ton of tech videos and never saw Dawid in my recommend. I found him through Toasty Bros and immediately subbed.
The RX 580 8GB was a beast, probably one of my favorite GPUs of all time. It did 4K on my TV back when I had my HTPC setup, played Wildlands at respectable frame rates, and handled my ultrawide in games too. Bought it for $100, sold it for $300.
Given it was basically a 480 it remained relevant for a shocking amount of time
Yeah, when I heard about this version of the card, I just asked myself "why? and WHY the cut-down version?". 8GB is really the sweet spot for the RX 580; more but slower memory will not help you in any case. The RX 580 is such a good card: released in 2017 and still relevant for 1080p gaming in 2023, and you can undervolt most of the cards so they're super efficient for this architecture.
I just finally upgraded from the RX 580 - the thing was a champ. I mean, it still ran everything I ever threw at it well enough.
Jumped up to the 7600. This new card runs so cold for me, I dunno how I'm going to heat my office this winter.
My guess would be Ethereum leftovers. Mining ETH required a lot of RAM but wasn't very intense on the core. My understanding was that it did benefit significantly from faster memory, but maybe they were chasing some power savings, or maybe this was just what was available, cheap, and worked.
@@BronzedTube its kinda crazy I have a 580 and I was planning to upgrade to a 7600 too 💀💀
@@BronzedTube Why, though? You could have literally spent like $40 more for a brand new, on sale 6700 XT or 6750 XT that would have crushed the 7600. Or, even a used, under warranty 3070.
@@degnartsE Please don't. Just spend the small amount of extra money on a 6750XT. Skip eating out like once or twice, you'll have a vastly superior card.
It's a very weird variant, because I think it's made for ML (Stable Diffusion, Vicuna-13B...). BUT, as far as I know, these cards are compatible only with a very outdated version of ROCm (AMD's crippled version of CUDA), so the performance will be pretty low these days.
We need someone to test this card with ROCm to see if it does anything, though.
The 580 was my first gpu, it sure was a huge upgrade from integrated graphics😅
@@wildcat002 I got a 4K monitor and still use my 1080 Ti for it. Some games still run fine at 4K with it.
@hawky2k215 Performance-wise, yes. Power-draw-wise, not really: the 580 drew a whopping 100W less than the 390X. If you used it for around 4-5 years at 4h/day, you saved around $100 on the electricity bill alone (depending on the kWh price). It also probably reduced the load on the CPU... If you bought your 580 8GB at the time for $200, you effectively halved the cost. No money wasted there if you ask me, since it's still a very solid card by today's standards and can be resold much more easily, at a higher price, than a 390X.
My second gpu. Bought it as an upgrade to GTX 550 Ti. Got it right before the GPU prices went insane in 2020
Dude I love your videos! Thank you for doing your job!
I clicked faster than linus being exposed by gamer nexus
Wow this is an original comment
So you clicked after several months of investigation?
Faster than 44.25 minutes
Doesn't seem like they've properly posted a response on their YouTube yet
NAWWW 💀💀💀
I absolutely love Dawid’s Drama Free content. Great video!
It doesn't matter what the video is about, it could be an intricate look into a cable and you manage to make it entertaining! Well done my guy!
You can use a polarization filter. That will make the information on the chips much more visible.
That often won't help in these cases. The cards are constructed from found working parts pulled from dead cards. To conceal the sourcing of the chips they sand off the model/serial numbers
I love Dawid's videos because he doesn't just tell you why things are a bad idea; he gets the hardware and shows you, and it speaks for itself.
I knew you'd buy one of these when I saw them the other week.
Nice one brother.
I feel like it's not even for gaming at that point. Probably for mining or machine learning that needs "Just Vram"
It sure is!
Memory is cheap now (who knows if they even use new chips), and this is a way to stand out in a market saturated with millions of other 580 cards.
But it's pointless, just like those 4GB GT 710 cards.
Always happy when the best TechTuber around releases a new video. I seem to always go out of my way to ensure I get to enjoy the video ASAP.
Now imagine 2 of them, in Crossfire
I remember seeing graphics cards in Maplin back when dad would go in to buy some electronics components... and that gpu box gave me nostalgic feelings towards that...
I love how, if you stop to look closely at the AI-generated images at the beginning, they're pretty much 90% squiggly lines that don't mean anything, like my drawings of a "mad scientist laboratory" when I was 4, except it's all colored in and shaded real well.
You know, if the circumcised 580 can do that well with 16GB of VRAM, imagine what the regular version would do with the same memory🤔🤔
Thank you for the awesome content 🫂
1410 MHz, undervolted by 20, runs stable with a 16 percent increase to the power budget.
I’m waiting for the 16GB GT710 review.
they did it because they could not because they should
watch Jurassic park for the reference🤣🤣🤣🤣🤣
Yep, that's what I thought about the 2048SP previously (not this brand though).
But then I realized that a 120-watt max is pretty sweet (compared to the 200+ watts of a standard RX 580).
So I'm accepting the bitter truth.
Weird idea for these kinds of cards... you could try rendering a video with them, to see if the extra memory somehow makes it a little faster? 😅
Yeah, my first thought is that this is for cryptomining, or rendering or something. I can't imagine why else this FrankenGPU would exist.
It's crazy that my 2nd-gen RTX 3060 has more VRAM than most new cards lol. I thought I was just getting in on the ground floor with my single-fan, Amazon-special GPU, but it's actually been a workhorse, and since it's so ridiculous to get 16GB of VRAM or higher on a GPU, I feel extremely fortunate lol.
It will last
This intro has better cinematics than some games 😂
I love the comical ride along on the way to making your point. Good stuff!
I wonder how this card would do for Stable Diffusion rendering, since the more video memory it has, the bigger the images it can generate.
That's what I was asking for too
Quality wise, better go for RX 6600 and beyond
My 8GB 580, purchased last year for $120 delivered, was the single best purchase I've made in years. Seems like everything else I buy disappoints. From cars to Blu-ray players, a win is hard to come by these days...
This one will be obsolete this year😢 good card though
@@happybuggy1582 It won't for me. I don't play new games. I stay at least 3 years behind releases so I actually get to play finished games. This PC is used to play backlogged games that span all the way back to SNES titles. Currently playing Fallout: New Vegas for the first time, in preparation for Starfield in 4 years 😂
WOW, 16gb really brings out the 4k performance. I could only imagine what some games could do with 2 rx580s working together in tandem
The video shows that it's all in vain; the 16 GB is going nowhere.
@@wizardothefool Yeah, the RX 580 at 4K is a massive piece of shit, and dual-GPU gaming is dead
2 rx580? Might as well get 1 RX6700 XT
Bro was mad @AtomicSub
I'd love if he could plop the 16gb memory into the beefcake RX 580.
As soon as I saw the beginning of the video, I immediately thought "the VRAM is not the bottleneck".
I had a Strix RX 480 (basically the same as your 580) and used it up until a few months ago. A very good card that lasted a lot longer than it should have. It now lives in my daughter's Minecraft PC.
Great video btw!
Do you think this card is suitable for deep learning?
I mean, if you accept longer training times, 16GB of memory for less than $250 (I didn't find any mention of the price in the video) might be acceptable for broke computer scientists. The only issue is that AMD cards are much, much slower than Nvidia's, because most DL libraries are built around CUDA...
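To put rough numbers on why a 16GB card matters here: the model's weights alone need roughly parameter-count × bytes-per-parameter of VRAM, before activations, KV cache, or optimizer state. A hypothetical sketch (the model size and precisions are illustrative assumptions, not benchmarks from the video):

```python
# Rough VRAM floor for holding model weights only (ignores activations,
# KV cache, optimizer state). Illustrative numbers, not a benchmark.
def weights_gib(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

fp16_7b = weights_gib(7, 2.0)   # 7B params at fp16: ~13 GiB (needs a 16 GB card)
q4_7b = weights_gib(7, 0.5)     # same model 4-bit quantized: ~3.3 GiB

print(f"7B fp16: {fp16_7b:.1f} GiB, 7B 4-bit: {q4_7b:.1f} GiB")
```

By this estimate, an fp16 7B model wouldn't fit in 8GB but would squeeze into 16GB, which is the niche a card like this could serve if the software support were there.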
Honestly I'd say the best card for something like this would be a RX 5700 XT that would make it a budget 1440p powerhouse.
I'm using a 6600xt for 1440p, even though it is said to be a 1080p card, it is performing quite well
My 5700xt is already a 1440p powerhouse
@@Am_Yeff Not anymore; my RX 5700 XT is struggling in 1440p gaming because of VRAM.
@@EliezYT Mine isn't; it runs DCS in VR great and most things at 144fps. Really still an amazing card.
@@Am_Yeff Are you playing newer titles like The last of us and Cyberpunk? I see the vram usage go pretty high playing 1440p high settings.
Dawid, you gotta understand (a lot of gamers get this wrong, and Nvidia takes advantage of the misconception) that resolution is not the only thing that affects VRAM usage. Texture settings are typically what drives VRAM usage, but even then, a lot of games use VRAM for textures "smartly." In many modern games the texture setting simply affects how fast textures are streamed in, not how big they are, RE4R being the best example. (It's annoying that they put a GB size next to the setting despite that not being what it does.)
But that's not all: any RT settings and various detail settings will also affect VRAM usage; it's not just "pump it to 4K to use VRAM." It hurt to see you playing games at 4K on medium settings. What's the point of testing the VRAM if you don't change the settings that actually scale with more VRAM?
TL;DR
@@rustler08 TLDR: 4K resolution is not the only way to increase VRAM usage. Normal settings do that too. Turning a game up to 4K but keeping the settings low will not use too much VRAM.
I get that you're trying to be clever, but not enough to prove him wrong. It ultimately varies based on how the game's developers built it. Doom Eternal and RDR2 pre-cache everything into VRAM to prevent stutters. Tiling is used with DX12 and, as you say, is a smart way to use VRAM; it does, however, require some advanced yet troublesome prediction or scheduling to prevent stuttering. But there are many games that simply pick textures based on resolution, so Dawid is completely fine in his video. A game is not going to load higher-resolution textures for a lower screen resolution; there's zero point. On the flip side, a game might default to 2K textures or higher when set to 4K resolution.
Shadows also use quite a bit of VRAM
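The point in this thread that textures dominate VRAM over resolution is easy to illustrate with arithmetic: a single 4K render target is small next to even one uncompressed high-res texture. A rough sketch (uncompressed RGBA8 assumed; real engines use BCn compression, so actual numbers are lower):

```python
# Why texture settings, not screen resolution, dominate VRAM usage.
# Assumes uncompressed RGBA8 (4 bytes/pixel); real engines compress textures.
def mib(num_bytes: int) -> float:
    return num_bytes / 1024**2

framebuffer_4k = 3840 * 2160 * 4        # one 4K RGBA8 render target: ~32 MiB
texture = 4096 * 4096 * 4               # one 4096x4096 RGBA8 texture: 64 MiB
texture_with_mips = texture * 4 // 3    # a full mip chain adds about one third

print(f"4K framebuffer: {mib(framebuffer_4k):.0f} MiB")
print(f"one 4K texture with mips: {mib(texture_with_mips):.0f} MiB")
```

A handful of render targets costs a few hundred MiB at most; hundreds of streamed textures are where multi-GB budgets actually go, which is why a 4K-at-medium test barely touches the extra 8GB.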
Aw man for a second I thought this was a GTX 580 and got excited! But, I watched the whole thing anyways because your videos are awesome.
The added memory is likely for crypto mining.
The RX 580 was a mining powerhouse for a while, but only the 8GB version, because mining uses a crap ton of VRAM; 4GB just doesn't cut it.
So 16GB would probably have a pretty big effect on your hash rate.
It's for A.I.
Nope. For one thing, GPU mining pretty much doesn't exist anymore; second, more VRAM doesn't get you a higher hashrate.
@@nihadasadli2642 you are completely wrong on both accounts.
you ALWAYS start my Saturday morning off right with HILARITY!!! THANK YOU DAWID!
It may be circumcised, but it's still 12 inches long.
I had this card back in 2018-19 on a 9400F and it was really good for the price. Bought it on eBay for $200 at the time. Gigabyte, I think. I could play MEA on high on a 4K 55" LG TV. Old tech should get better as software improves; pretty neat.
I love that people are resurrecting old tech with upgrades. In my opinion, manufacturers should make things like RAM upgradable, and CPUs usable on any platform, mobile or desktop. High-end mobile CPUs used to be a killer option for passively cooled home-theater PCs, etc., and if you could also get a high-end mobile GPU on a card, you were set for the best experience.
The Chinese market is doing that. They buy old motherboards, RAM, and GPUs and create a Frankenstein out of them.
You should totally test Stable Diffusion; people over there are crying for more VRAM. Would love to see if this card can run SDXL where new 8GB/11GB cards are struggling. If an RX 580 16GB can run SDXL perfectly fine, then I'm certainly getting one.
I doubt Polaris is compatible with SD at all. It might run with a custom build, but probably only at 0.5 it/s or something useless.
I have a 16 GB Radeon Instinct MI50 (modded to Pro VII), and it can hardly hit 2 it/s with the AMD build.
You really need an Nvidia card for SD, or one of the RX 7000 cards. The cheapest option is the CMP 40HX at ~$80, which is the mining version of the RTX 2060S/2070; it can hit 4~5 it/s using single-precision (its half-precision is crippled), or about 75% of an RTX 2060S. The value-performance option is the RTX 2080 Ti with the 22 GB mod, but make sure you buy your own VRAM chips and find a reputable technician to do it.
@@manaphylv100 Wow cool thanks for the info, certainly going to look into that!
I would be curious to see how it handles generative AI tasks. Some LLMs are massive memory hogs.
Dawid, you should try playing 7 Days to Die. If you have a save with a lot going on, that old game uses a lot of VRAM (10GB is not even enough at 1440p). You can use it to compare GPU VRAM.
Nothing I buy from there ever arrives lol
ha ha
weird lol
Unless you're running out of VRAM to hold even reduced textures, adding more VRAM doesn't generally increase performance UNLESS you also add a wider bus.
However, adding more VRAM can allow for farther LOD fade and higher-resolution textures throughout.
It'd be interesting to see how such a card works in something like DaVinci Resolve. Comparing it to, say, a 570 (which is what it's based on) would be a fairer test.
I appreciate that when you are done with a video, YOUR DONE! It’s like BAM! video over! Go home! I love it! Haha 😂
You’re*
This abomination was made for mining. A fair few of these are in the wild; considering the price difference versus other 16GB cards, it made a lot of sense.
Correct. 2048 SP was chosen because it used less energy for similar yield.
I actually love the 2048 cards, but that's mostly because, when looking for most performance possible with fewest dollars spent, I don't think anything beats it. Where else are you gonna get a $45-$80 video card that is that powerful?
While certainly a little mock-worthy, I think the card really shows how capable it still is, and is a clear win for absolute budget gaming PC builds!
Memory is used for generative AI.
The problem is that many open-source projects target Nvidia; little by little, AMD GPUs are becoming usable.
This could be a great card, not for games, but for other applications.
But... the software is not there yet.
You can run Stable Diffusion on it.
Are a lot of people actually looking for 7 year old AMD GPUs for this kind of work though?
@@jomeyqmalone Some may, for budget stuff. Rendering works too, and video editing with e.g. DaVinci Resolve can benefit from this large framebuffer. I mean, the next competitor at that VRAM-for-the-money level is probably the Intel A770, which is quite a gap though.
I have an MSI Radeon RX 580 GAMING X 8GB card. It has served me well all these years. It is retiring at the end of this week when I build my new computer. I find it odd, though; I don't know what I did, but I was getting 60 fps on High at 1080p in Cyberpunk 2077. Secret: the "Ultra" settings in a lot of games, if not all, are really not that great. You might get a 1 or 2% increase in visual fidelity. I would prefer the smoother, higher FPS at High.
Video starts @1:30
The CPU core allocation/usage and memory allocation stats on screen while playing are something I've always wanted on my son's PC. We've been running an FX 8350 processor with a 1660 Super. Recently, despite what the Geek Squad tells everyone, we actually had a motherboard failure, although it could be that the processor gave out. Either way, I picked up a Ryzen 5 5600 as an open-box deal, which lent itself nicely to newer RAM and a newer motherboard. We have a Corsair liquid cooler that I hope will mitigate the heat of a slight overclock, so I can get the 5600 up to 5600X-plus standards.
what have they done to my precious 580...
lmao nice joke. nobody uses a GPU that bad nowadays
@@MrTefe shut 😔😔😔😔
@@MrTefe A lot of people do. I bought one last week; not for my main rig, but I still bought it.
@@MrTefe Surprise surprise, it's close to the 🐐 1060
And Nvidia is still going to try to sell us 8GB cards.
This video was showing that 16GB was pointless on this model of GPU.
You could have reprogrammed the BIOS for a more aggressive fan curve (keeping it ~65C) and bumped the core clock speeds and even improved memory timings. That would have been an interesting experiment. I've always wondered if tightening memory latency helped gameplay.
One other thing you could do by making your own BIOS is to bump the memory speeds up as well (until you start seeing ECC errors in HWInfo64).
I have a 4GB version of the 750 Ti from some internet sale, and the extra VRAM actually made it pretty decent back in the day. It overclocked very well! This made me think of it.
Just love the humour and the way you take nothing about tech too seriously. If I want serious results, I'll watch Gamers Nexus. But your channel is funny and makes me feel better. Also, I love your Anna, she's so cool; I wish we saw her more often. Cheers!
are you learning with lenode?🤣🤣🤣
@@raven4k998 LENOOOOOODE.
Dawid testing 2018 games wondering why they don't use more than 8GB of VRAM.
Day 53 of Ahoy there
I am actually surprised how well the 5700 XT is doing right now in Starfield. Being at minimum spec, I've been tinkering with it, and I can get some pretty high settings while still hovering around 60fps without FSR. Quite surprising. Before, when something like a GTX 650 was the minimum, the game would have to run on low settings to get 60fps.
Dawid, Hollywood is searching for you
Solid choice for anyone needing a huge VRAM buffer on a budget. Linus Torvalds comes to mind. Back in 2020, his personal workstation included a 3970X Threadripper and an RX 580. Dude doesn't need much GPU power, apparently.
I would love to test the card for some mining. See if there’s any difference whatsoever being able to use all 16 GB for crypto mining.
Comment at time index 5:28: did you toggle the BIOS setting "Above 4G Decoding" to "Enabled"? If you didn't, it would explain a few things...
My biggest takeaway is that video memory isn't as important as the upgrade panic made people think. Newer games seem to ask for more but don't really do anything with it.
Unless you're super competitive and need a top-of-the-line setup, there's really no reason to rush an upgrade.
The 2048SP BIOS was developed so these refurbishing factories could use anything from RX 470 to RX 580 chips to make "new cards" for the mining farms. Reused GPUs and memory chips went onto new boards; raw GPU power wasn't very important to mining, as memory performance was where they got more hashrate. It's a nice cheap GPU, but there's a high failure rate with these cards, as they were heavily used and abused for so long in the mining era, from the early days until the end of GPU mining; the RX 4xx/5xx series were the most popular choice.
Amazing! I'm waiting for modded 48GB RTX 3090s, which would basically be Quadro A6000s for cheap.
My understanding is that modern Nvidia BIOSes are loaded up with additional security, so it's not possible on 20-series or newer to edit a BIOS yourself and have the card accept it. You can flash a low-end card with a factory BIOS from a card with more aggressive power targets and it will take, but if there isn't an Nvidia-authorized 48gb 3090 vBIOS, it's not going to work.
There is a 48 GB 3090, it's called the 3090 CEO edition but unfortunately it didn't launch.
@@laszlozsurka8991 what is this, a low-effort troll at nvidia? I'm not any company fans, not even pro-intel, but I expected better arguments than this.
@@baoquoc3710 ??? This isn't a troll. Look it up; a 48 GB 3090 CEO edition was reported back then, but it didn't launch.
please do more aliexpress videos, that's the only place where I can get cheaper prices where I live
1:03 You sure are, buddy! Good job! You'll be a real boy soon!
I just don't know what I would do without a weekly dose of Dawid. Please keep this going! And thanks!
Dawid, you did not buy a 16gb rx 580 from aliexpress. You stole it and ran and ran like Forrest Gump until you were on the other side of the world until you got far enough away that you got away with it, you magnificent bastard.
😂😂👍👍
This video just shows how good of a card the rx580 is nowadays. Another timeless piece of silicon!
Interesting video, but it's a bit disappointing that you didn't test Ratchet & Clank; that game is a real VRAM hog with textures set to ultra. Anyway, thanks for your work.
Dawid is the first YouTuber I know to reuse clips for a sponsorship.
Sand the edge of the chip that makes the paste thin out the most (aka the HIGH edge); it will level out the pressure and allow the chip to get 90% activity. Then you can use software to AI-train its memory and power controllers to assist in memory acquisition, instead of just a big portfolio they never fill.
Glad to see this channel growing.
Funny thing: there's also an RX 590 GME, which is basically an RX 580 with slightly higher clocks, making it a slightly slower 590 and a slightly faster 580.
10:31 - It's interesting that 100% GPU usage is reported here when the framerate is being limited due to VRAM capacity, and it looks like it's not using all 8GB for some reason
"Circumsized Version" LMFAO - earned yourself a sub ;)
I love this sort of product, so original
I also have a Strix 580; it's actually insane how good this card is. I didn't even realize how much better its base clock is compared to other 580s.
from jie shuo ?
The vram chips were sanded down because they came from a batch that performed too low or had too many errors. Samsung or a third party would sell these on discounted to other parties, so they can do more testing (hopefully) and reclaim the good ones. It's really common and not necessarily as sketchy as it looks - but they're probably low spec.
Simply a mining card. 2048 SP and conservative clocks in order to save on electricity bills for the same result; 16 GB of VRAM is useful when you mine ETH.
Hey mate, love the content. Is there any chance you can add Escape from Tarkov to your regular games you benchmark with?
He's used it for testing some stuff before, but the problem is that Tarkov is still in very active development. A run now is different than it was before, and will be different from the future. Plus the game has some whacky usage on GPUs.
Still own an RX 590; it's now in my 2nd PC (my "backup" PC). I upgraded from that RX 590 to an RX 6750 XT in my main PC.
*I still LOVE that ASRock RX 590 PG!!!* Would be funny to have 16GB, or even just 12GB, on that RX 590!
I’m choking on my coffee about 3 seconds in. Wasn’t ready for that crap😂
Huh! Now I want to fire up my trusty Vega 64 Frontier Edition (16GB HBM2) and see what it takes to fully utilize the VRAM.