To be fair, the memory bandwidth of that 12CU RDNA2 card is pretty much comparable to what 12 CU APUs (Radeon 680M) have available. The 6400 version with 128GB/s is outright generous in that regard if you view it this way.
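For anyone who wants to check that comparison, peak memory bandwidth is just bus width times per-pin data rate. A quick sketch in Python (the per-pin data rates are the commonly quoted specs for these parts, so treat them as assumptions):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_mtps: int) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (transfers per second)."""
    return bus_width_bits / 8 * data_rate_mtps / 1000

# RX 6300: 32-bit bus, GDDR6 at ~16 Gbps per pin
print(bandwidth_gbs(32, 16000))    # 64.0 GB/s

# Radeon 680M iGPU: shares dual-channel (128-bit) DDR5-4800 with the CPU
print(bandwidth_gbs(128, 4800))    # 76.8 GB/s

# RX 6400: 64-bit bus, same GDDR6, matching the 128 GB/s figure above
print(bandwidth_gbs(64, 16000))    # 128.0 GB/s
```

So a 12 CU chip fed by a 32-bit GDDR6 bus really does sit in the same bandwidth class as a 12 CU APU on dual-channel DDR5, which is the point being made.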
12:20 just in case you didn't know, the view distance setting in Fortnite has no effect on enemy players, they render equally far on Low and Epic. it only affects world LOD and object (weapons/items) distance
While I could see a use for the RX 6400, I never saw a use for the RX 6500 XT, and now, the RX 6300. At least with the 6400, you can drop it into an old workstation PC and have a quick low-end gaming PC without the need for supplemental power. But for the 6500 XT, just get a 6600, and for the 6300, just get a 6400. There is absolutely no reason for these two cards to exist.
The low cost and power consumption of the rx 6400 is also nice for broke people living in a hot climate without an aircon. Though I can get more performance for the same-ish cost, I ended up choosing it over the rx 580 or 590 for this reason.
@@DoubtingThomas333 Given how small the price difference is between the RX 6500 XT and the RX 6600, just save the extra $50 or so. It might take a bit longer, but at least it's a card that will perform well for a while; the 6500 is basically DOA.
The 6300 is an OEM card, right? Surely the reason for it to exist is that it was cheaper to manufacture and fit in the PC case. And as for the 6500 XT, the reason for it to exist was clearly that our standards were so low at the time of release that AMD thought they might be able to market it. Literally a period piece in the form of a GPU.
@@mangolover1 The RX 6600 launched with an MSRP of $330, the 6600 XT at $400, and the 6650 XT at $430, so their prices were not cheaper, while the RTX 3060 Ti/4060 Ti have a $400 MSRP and the 4060 is $300.
Hey there! Since you were talking about the specs, I noticed that the 6300 shares a similar compute unit count with the 680M while of course running the same architecture. Went ahead and tested the few games I own that showed up in the video on a Ryzen 7 7735HS in my Minisforum Neptune HX77G.

Cyberpunk 2077
- 1080p Low, FSR Quality: avg 44 FPS, 1% 29, 0.1% 21
- 1080p Medium, FSR Performance: avg 47 FPS, 1% 31, 0.1% 23

Ratchet & Clank: Rift Apart
- 1080p Very Low, FSR Ultra Performance: avg 49 FPS, 1% 33, 0.1% 2

Quick note on Ratchet & Clank: while the 0.1% lows were in the single digits, it didn't feel bad, and they did jump to 15. Another strange thing: if MSI Afterburner is reading the APU's VRAM usage correctly, it settled around 8.5GB for most of the intro mission in the parade. Even tested CS2; unfortunately my MSI Afterburner overlay wasn't showing up and the in-game FPS counter isn't very detailed, but at medium settings, 1080p, FSR Balanced, it stayed around 120 FPS and dipped to 60 when near smoke, near NPCs, or in detailed areas such as the reactors of Nuke. And before anyone questions whether my 6600M was running instead, I monitored the usage of both GPUs and never saw the 6600M go above 1%. Just tested CS2 on my 6600M and found out that compared to it, the 680M is incredibly stuttery. Definitely running on it, though.
I wonder if there's some crazy SBC out there that would pair this GPU with something like an N200 CPU and 8-16GB of system RAM. That would be a super low-power SBC, and it would probably be about as cheap as you could go for modern architectures with DDR5 and dedicated graphics.
Believe it or not, it basically IS integrated graphics, or at least it has the same number of compute units and the same architecture as the 680M. But yeah, it's definitely odd looking at it from a gaming standpoint, since iGPUs are becoming faster; it's why Nvidia stopped making display adapters. This thing is really only useful for adding display output to a system with no other way to get it, or for connecting extra displays if the current ports are already used up.
I mean, this card would be really neat inside a retro XP/Vista/7 machine, as it is low profile and has low power consumption. No doubt it would play every game from that era at 60 FPS, no problem. But I doubt anyone would bother making custom drivers to get it to run on old OSes.
Even with the best stuff you can get for Windows XP (a 3rd-gen Intel i7), you would at best get PCIe 3.0, which would make this card perform even worse. Not really worth it, especially considering the driver difficulty.
I think we are pretty spoiled with gaming. Frame rate, resolution, quality modes, they are all so much better than what used to be acceptable in PC gaming.
Yeah dude. Ever since more realistic graphics have started to be made, we can't accept 1080p anymore. We've gone from shimmering which we were fine with to "it's too blurry". We've gone from "this is such a good game!" to "ugh, I can't get 60 FPS. What a doze".
We've gone to that because if one game can run at max graphics with really high FPS and resolution, and another can hardly hit 60 (Doom Eternal vs Hogwarts Legacy), then even with a few years between releases, the performance is just not acceptable anymore with what we have available.
I remember 10 years ago that, if a GPU in a $600-700 PC could do 1080p60 in all games at, say, medium to high settings, that would be considered great value. Today, if a GPU can only do 60 FPS at 1440p, it is considered “low end”.
We are spoiled with the hardware and the innovation of the tech industry as a whole. Gaming has been lagging behind solely because of corporate greed, not because we don't have the capabilities.
11:47 I like how the color of the cars changes instantly. Is it some kind of 6300 "special" feature? 😅 Or is it meant to be like this? Anyway, nice video, keep it up!
Minor nitpick: RE4 Remake has a less GPU-heavy reconstruction technique, interlaced rendering, which is around 25% faster than FSR 2 on its max performance setting. At 100% render scale the image quality doesn't take a huge hit in movement (at least at 1080p). However, since it's a reconstruction technique that renders half the pixels one frame and half the next, I assume it'd look worse and worse at lower resolutions. It's still better image quality and performance than FSR 2, which I think is poorly implemented or something, because it doesn't seem to scale performance properly the lower you go. Also, even if it's pretty meh in performance, I'm honestly impressed it's only using 25W max under full load.
Never thought I'd see a 32-bit memory bus on something other than the GeForce 6200LE, but here we are. Thanks to GDDR memory and clock speeds being 8 times higher, the 32-bit bus club has come a long way in memory bandwidth and performance.
Interesting video. It might make for a nice display adapter that will likely have longer support than, say, a GT 1030, and no messing around with Mini DisplayPort like some of the Quadros. It does go to show it has enough horsepower to mess around on when not working, though. :)
@@BonusCrook As I mentioned in my comment gaming is not the only reason why GPUs exist. HTPC, workstations running OpenCL workloads etc. And yes, driver support does matter with any PC hardware. Not every GPU is useless just because it can't play Alan Wake 2.
If you are interested, Sparkle *finally* released a single-slot, low-profile A310 with almost the exact same specs, but with an extra 2GB of VRAM and a 64-bit bus. It does have 2 miniDP ports, but they were nice enough to throw in an HDMI port so you're not entirely screwed. It is about $40 more though, so it's not as much of a "mess around/slap it in and forget it" price that it should be. However, it's Arc, it's very low end, and thus there are zero reviews of the thing easily available in the wild. There aren't even any 3DMark scores for the thing. I'm assuming that anyone buying one is going for the media server build since Arc is amazing at it. Frankly, if I didn't have other priorities in life and money, I'd give it a whirl and leave it in a media server after laughing for a good hour.
@@EbonySaints Cool. Good to see Intel getting involved in the various sectors. The only issue is that, from reviews online, there seems to be the implication that Resizable BAR is required, and most people using something that low end will be upgrading old PCs or off-lease office PCs that often have a very locked-down BIOS where setting such things might not even be an option.
Now I’m interested. I’ve seen people put those low-profile 6400s in their old half-height OptiPlexes with 2nd-4th gen Core series CPUs, and I’m curious to see if it actually becomes unplayable or if the CPU holds them back first.
TBH I'm quite impressed that the lowest-end card of that generation is this good. I remember when cards that looked exactly like that were GT 210s and 710s, and those couldn't come anywhere close to this performance. We really have come a long way since then.
It may seem like e-waste, but it will hold up better than some older GPUs at the same level of performance... sadly it's scarce, with no driver support, not enough PCIe lanes, and not even 4GB of VRAM :(
Geez. Even the equivalent of the HD 6450 is now a capable gaming GPU. Not for new AAA games, but who plays that garbage anyway? Edit: Yeah, in passmark, it's faster than an R9 280X! :O 6545 points to 6226!
Wasn't there an AMD feature that allows you to go past your VRAM limit and use system RAM, for old cards like these? Not saying it'd make a huge difference, but it would be interesting to see it in action with something as limited as this.
Iceberg actually featured this tech on the Vega 64 a little while back. It only works with a select few HBM cards, but it would be interesting to see something like that work for limited-VRAM GPUs.
@nathan4746 I have an old HD 6670 1GB GDDR5 and it had this feature, although the performance wasn't that great considering it was pulling from DDR3 system RAM.
That 2GB of VRAM is just too much of a bottleneck in this day and age. It's so easy to get integrated graphics that you can assign more VRAM to. My old Xeon Workstation at work, for which I basically just wanted a card that would handle two monitors and not choke on 1080p video, has an RX 560 with 4GB of VRAM.
Sorry about that, pretty certain that was me who outbid you. Currently using it for a low-power live TV recorder/encoder along with an N100 CPU. Do need to try gaming on it. I know a 780M can mop the floor with it, but it'd be interesting to see FPS per watt on it.
Amazing. Honestly didn't think you could just buy one of these. It basically is a glorified display adapter! Come to think of it - if only this thing had an analogue port like VGA or D-sub, it could be somewhat interesting...
Even if it had, you wouldn't have needed to buy it solely for that reason. What is wrong with using an (active) adapter from e.g. HDMI or DisplayPort to VGA or D-sub? You can use those with any (i)GPU.
Would be fun but otherwise pretty useless, since at best it might be a 6400 with a 32-bit memory bus and only 2GB of VRAM. It also doesn't help that it's locked down, unless there's an easy way to bypass that.
I found an RX 640 in a local store and I haven't seen much information about that one. I think it's based on RDNA and wonder if it is a rebrand of some kind.
If it had more PCIe lanes and 4GB of VRAM it might be as good as my 1050 Ti, or even better! Sadly there's no driver support or FSR3; otherwise it would not be a bad choice, and a really interesting mini card XD
I have good news! You just described the RX 6400, minus the extra PCIe lanes... 4GB of VRAM, officially supported by AMD's drivers, FSR3 support, and many models are half-height like this one.
By the way, you don’t even need to touch the Game Pass app to play games on Game Pass. Unlike on Steam, Game Pass games don’t require a launcher, because Windows itself is the launcher. Windows does all the actual heavy lifting, including handling the licensing, installing, and updating, so install the game directly and launch it from the Start menu. May Allah (S.W.T.) guide you and bestow upon you His Blessings; Ameen.
This is actually pretty powerful for what it is. That's amazing that it can outperform the Xbox 360 so well. This could totally make a super budget gaming pc, if not for the fact that double the money would get you way more than double the performance on the used market.
This was probably developed for thin clients, but the chips available were too few to release it widely. AMD pads their portfolio ahead of tape-out based on estimated die defects; if the dies come out better than expected, the lowest card on the totem pole is sometimes nonviable without scrapping the next-highest tier in the range.
I mean, that's pretty dang good. I remember when Crysis was out and what used to be a legitimate mid-tier card would grind along at 2 FPS at 1080p if you were lucky, while browning out your house. How far we've come: this little thing can run Avatar at 10 FPS.
The reason these cards are so lacking is that they're probably just bare-bones business-class desktop accelerators, mostly for 2D stuff, where an APU wasn't an option. Or maybe it offers multi-monitor support. Integrated graphics are probably in the same weight class. 🤔 If you buy surplus servers and want a card that can help transcode video for a streaming server, this kind of thing sometimes works. It usually won't require added power input, comes in low profile, and often this level of card is passively cooled because servers have decent airflow. Otherwise these cards are pretty useless.
I use a Navi 24 RX 6500 XT for AFMF on my Vega 10 16GB MI25/WX9100. The 6500 rarely sees more than 20% load doing this. IMO, this RX 6300 *is* an 'AFMF "Feature Add" card'. Also, Navi 24 is Gen4x4, so it's readily adapted to a spare M.2 slot, even in a mini-PC or 'server'. If this lil RX 6300 becomes common as eWaste in coming years, it will be a neat lil 'enthusiast gadget'.
To be fair, the only page on Dell that markets them clearly says they are "Business solutions" and that they are low-profile cards designed/made for two OptiPlex models. That is why they are only PCIe x4, and part of why the specs are so low. They can't really pull any more power than what the mobo provides, so unless they supplement it with SATA power, it's probably 100% PCIe power. That's about the best you can do for one of those without rigging up your own power supply, and then you are still limited by Dell's notoriously bad mobo design/poor specs. There actually was a market for these kinds of cards during/around the pandemic: people were buying those cheap OptiPlexes but weren't able to re-pin a PSU (or didn't even know they could) and just wanted something plug and play. Dell probably just bought up as many of the lowest-binned dies as they could and slapped these things together as cheaply as possible. That's on the card partner, not AMD, especially in this case, since it was probably Dell's idea. And when new, they were still a good bit cheaper than the 1650 that came out around that time. As another person mentioned, this is far from the first time it's been done.

It was actually Dell, back in 2003-ish, that got me back into building PCs... almost out of spite at first. I wanted one for Battlefield 1942. My last build was a 133MHz Pentium 1, and I was gifted a Dell laptop and didn't really keep up in between. Pentium 4s had just come out, so I "built" one using their PC builder thing on their site. I had the choice of 3 P4s besides the base chip for that PC model (something-plex). They let me pay an extra $350 for the best one they offered. It took about 7 days after the short return window to realize that the CPU bus on the mobo in that model could not let any Pentium 4 chip run any faster than the base model chip that would have been free. It wouldn't have been able to cool a faster one either: it had one fan in the back with a hood that went over the CPU "heat sink".
Standard boards would not fit into the case either, nor would the PSU work with them. I then realized that the "ATI 9800" was OEM and only had maybe ½-⅓ of the "graphics pipelines" of the retail card, along with a dinky cooler just like this one (20 years ago!) that blew its hot air onto the CPU, throttling it constantly. I tried to see if I could get a mobo from them, and that's when I learned about Dell's "award-winning support", where a guy I could barely understand would just type a BUNCH for like 40 seconds, then tell me how much RAM I had, the first 4 times I asked him if I could get a new mobo. Then he told me they don't recommend getting a new motherboard due to cooling or something. I hung up, got online, and built a MUCH better PC with a cool case for $300 less than I paid for the Dell (the Dell was $1300), then went and built another that was just as good if not better for $500, because I enjoyed it so much. It's been a bit of an addiction ever since. I limit it to once every 3 years, but gladly build for others anytime I can! Apparently Dell is still doing the same thing via Alienware. "Dell! Encouraging people to build their own PCs for over 2 decades!" LOL. At least they learned not to let a card be called something it isn't anymore.

My point is that this is likely a relic from the "shortage", or maybe Dell finding a way to take advantage of low-binned overstock after the bubble suddenly burst with the crypto crash. Either way, since that is the only place it is marketed at all, and it is only marketed as a graphics boost for an office PC, to me that doesn't sound like a dirty secret at all. If they, or even just Dell, had claimed it was for gaming, that would be VERY different. I'll have to give it to Dell on this one, oddly enough, especially if it was during the shortage, as they would have sold A LOT had they marketed it that way. $ end /rant\;
Clearly a video card for Dell/HP computers without an iGPU. Functional, but slow/small. It could have been the $50 GPU, I suppose, although they would have likely tried for $100 just for fun 😮
Just ordered an RX 6300 for my server to replace the pretty poor GT710. Mostly for Linux compatibility overall. £60 gamble on eBay but worth a shot given that basic display outputs like the GT710 go for similar money second hand.
"2GB framebuffer"..... No. That's VRAM, video RAM, which holds textures among other things. The framebuffer size is just resolution x color bit depth x double or triple buffering.
From a dev/OEM standpoint, the 6300 had room for growth, but the need did not outweigh it at that series' baseline. With MOSFET upgrades and a better cooling solution (from us integrators) it supported 4, 8 and 12 GB memory options, and it could also run in a CrossFire array, with tandem pairing only, without further debugging across quad-lane usage.
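The framebuffer-vs-VRAM distinction from a couple of comments up can be made concrete with the formula that comment gives (a quick sketch, assuming a 32-bit colour format; real swap chains vary by format and buffer count):

```python
def framebuffer_bytes(width: int, height: int, bytes_per_pixel: int, buffers: int) -> int:
    """Swap-chain size: resolution x colour depth x number of buffers."""
    return width * height * bytes_per_pixel * buffers

# Triple-buffered 1080p at 32-bit colour (4 bytes per pixel)
size = framebuffer_bytes(1920, 1080, 4, 3)
print(round(size / 2**20, 1))  # ~23.7 MiB, a tiny slice of the 2GB of VRAM
```

Which is exactly why calling the whole 2GB a "framebuffer" undersells what that memory is actually holding.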
I've been looking into very low-end GPUs to pair with an old Ivy Bridge laptop CPU and motherboard for a small emulation machine. The top contender is still the GT 1030, which is almost 10 years old and still costs the same as at launch. 60 pounds for this might not be that bad considering the bleak state of 10+ year old hardware on the very low end; at least it's a relatively new product.
I'm surprised AMD and Nvidia still make these kinds of GPUs. If you are only playing older/DX11 games, buy a 1060 or 1070, or even a 980 like me, and if power consumption is an issue, get a 1050 Ti.
While these numbers aren't amazing, on the budget side of things (especially if you really only play indie games or games from 6 or more years ago), this GPU for 70 euros or 70 US dollars could be good for those who have an older system, or rather an SFF system, that they would like to game on, but who don't want to spend more than a hundred on a GPU. After all, your only options today would be this or the RX 6400, and we know the price for that would be a bit over a $100 budget. Fun fact: Dell is selling this GPU on their website for over $200! If form factor isn't an issue, you could buy the RX 6600 (non-XT) for that price! Very classy, Dell.
I didn't find any story or information in this video about why this trash even exists. It's probably made from defective chips that can't be used for the low-profile RX 6400.
What was the point of testing a $60 card like this with so many high-end games? Talk about trying to get blood out of a stone for clickbait. For the money and specs, it did damn well, and it seems to exist not as a gaming card but for office and server PCs!
Uh, seriously, this feels like clickbait, calling it a dirty secret when it isn't. It is a card produced specifically for Dell PCs; it was not intended for retail gamer use, it was for pre-built PCs that were very low-end "gaming", or what I call basic-use PCs. If this is what it takes for your channel to get views on videos, you might as well quit and step aside. A better title is "The obscure RX 6300, only found in Dell PCs". There, fixed the clickbait title for you.
Fun fact! Not only is there an RX 6300, there's also an RX 5300, a 5300 XT, a 5500 non-XT, and a 5600 non-XT! They're OEM cards that came with Dell PCs.
Makes me wonder what happened to RX 5400 then, and WHY 5300 has 3GB and 6300 does not lol
@@RuanEmanuell I almost forgot to mention the 5500 non-XT, so I had to edit my comment. Lmao. Yeah, that's a damn good question! What happened to the 5400? 🤷
I have another odd one from Dell, the RX 640. Based on the Polaris architecture.
Gt 1010
@@nephron9924 I totally forgot about that one
It makes sense that this is an OEM-only GPU. Basically a display adapter card, a more modern version of the GT 710.
GT 705*
Why do these OEMs even care, when integrated graphics have come such a long way? There is no need for such a card in a modern machine, especially when it doesn't even support encoding and AV1, which current iGPUs support.
I'm kinda happy we have a more modern equivalent to the GT 1030 and 710. It has all the modern capabilities like GDDR6 and PCIe 4.0, but it's slow and cheap enough to be a good choice for a basic video adapter that still (presumably) has updated drivers.
With the 3050 6GB not needing external power, it's hard to justify this card's existence.
@@Tofuey Only if Nvidia prices it below $100; otherwise it's even harder to justify the 3050 6GB's existence. But even $150 is not looking likely.
I doubt the 3050 6gb will sell for less than $150 since regular 3050’s hardly sell for less than $200 unless it’s the very weak variants. Not saying the 6400 is really worth it since many still go for $150 also, but would be nice to see an actual display adapter sell for display adapter prices.
@@DLTX1007 Also because the RTX 3050 is supposed to just be a budget gaming card (and low profile?) while the RX 6300 is good for being a video adapter and, I guess, maybe old games.
@@onlyhereformoney175 The 3050 6GB is NOT the same animal as the 3050 8GB. It will be far slower and has less memory bandwidth. Also, a 6500 XT can be had for roughly $130. Still not a good price, but the 6600 is roughly in the 3050 8GB's price crosshairs, and the 6600 thoroughly beats it anyway.
Talking about that, the RX 6300's specs are frighteningly similar to those of the 680M.
Just with the bus width of a single stick of ddr4, I'm impressed by how good these display adapters have gotten!
I'm impressed with how impressed you are with this abomination.
They cut it down past quark level lol
Funnily enough, 680M has more L2 cache and higher TGP and thus performs better. Hell, I'm pretty sure my laptop's 660M does better than this despite literally being the same chip as 680M with half the cores disabled.
Edit: it does about the same in Cyberpunk at 1080p Medium with FSR Performance, but worse at 1080p Low with FSR Balanced, probably because the iGPU was hogging all the memory bandwidth and wattage from the CPU (I tested on battery, and my laptop has a 25W power limit when used on battery). This is with DDR5-4800 RAM, too, so I'm pretty sure there's a good chance 660M would beat RX 6300 in optimal conditions, despite, and I feel like I should reiterate, literally only having half the cores.
I’ve actually got a 7735hs with a 680m in it. Haven’t got a lot of the games iceberg has but I can test ratchet and clank and cyberpunk while also throwing in cs2. I’ll try to get back to ya with the results if I remember.
“Display adapter” might be a more appropriate label.
… but for less than $100, alright I guess.
🤣
Gt710 : over my dead body
Can you actually explain why, though? I bet you can't. "Display adapter" is supposed to mean "can do everything but play games". This can play games much better than an Xbox 360. What word would you use to describe GPUs that actually can't game if you apply "Display adapter" to a GPU that CAN game?
@@awesomeferret I actually said it's alright for under $100, if you just read a little further.
I recently found an RX 5300 in the trash. It had a shorted cap, torn caps off the RAM, and passives ripped off the PCIe connector... I was deathly sick and stuck at home, so I turned it into a project to resurrect... Interestingly, it performed much stronger than I expected for something meant to be lower on the stack... It also has a massive cooler for what I was expecting.
Would you care to share some numbers, friend?
If you've still got the card, it would be interesting to answer the question that literally nobody is asking: just what is the 'king' of still-supported 2GB GPUs? Get a cheap 2GB 960 or 1050, overclock them, and see if they can beat the 6300, even if it's only when using PCIe 3.0, or perhaps just in older games, say 2015 and before. Think of the views!
At least on 3DMark Fire Strike and Time Spy my GTX 960 2GB beats the RX 6300 by a bit
My GTX 960 graphics score on Fire Strike: 7970 and Time Spy: 2359
@@Pasi123 with triple the power draw probably
@@abyrenggasaputra8064 Well the RX 6300 is a 32W card while the GTX 960 is 120W, so more like 4 times the power draw
I have a 970 kicking around and it is more powerful than a 1050, but 2gb of vram is limiting. Would rather overclock a 1050 with 4 gb of VRAM.
If you're really getting a 1050 nowadays, what's the point of not getting the Ti 4GB?
Concerning Cyberpunk, I suggest switching your benchmark pass to include Dogtown (the new area added in Phantom Liberty). I noticed it is way more taxing on hardware than anything in the base game map.
This channel is kind of like a cat, growing so fast!
A well deserved growth in my opinion! Needs more subscribers than this
@@mtaslim9778 Definitely. I started watching when he had like 5k subs, and ever since, almost every time he posts, I eat while watching the video!
Deserved. Well produced videos with an entertaining host.
This channel is feeding my desire to experiment with weird older PC parts, I love it. And the general production quality is fantastic.
AMD have all sorts of GPUs they've sneakily released since the dawn of RDNA.
They've done gamers REAL dirty by releasing a slot-power-only 7500 level RDNA3 die with 8GB of GDDR6...but only releasing it as a Radeon Pro.
Yup, AMD are just as bad as nvidia. People shouldn't treat them as underdogs anymore.
I wouldn't even be surprised if it actually has a proper media encoder just to rub salt in the wound.
I spit out my drink when I saw your search typing "what is the growth on my p......"!
To be honest, I find the performance of this card truly impressive - considering the specs, that is.
i totally agree
32-bit? I don't think I've ever seen a 32-bit memory bus before. Even back in the late 90s, the TNT2 had 64-bit on its lowest SKU
The ATI Rage XL that was often used on server boards for basic VGA out is 32-bit more often than not. I've got one in an HP thin client and it is S L O W. You can't even pan around in Roller Coaster Tycoon smoothly, it just doesn't have the power. There are also plenty of Geforce 8400 GS/210s out there with 32-bit buses.
From a quick Google, there seems to be an HP variant too with a different cooler. No idea if the clocks are the same as the Dell variant or not.
Also hilariously, some of the first results are results from web stores from my country selling the HP variant for like 4k NOK new, aka €400...
Both seem to be made for adding extra display outputs to OEM systems, the OptiPlex 7010 SFF in the case of the Dell one. Probably why they're so cut down.
At this point, why won't they just re-re-re-release Polaris? Preferably with a 4-bit bus, 512MB of VRAM and 1 CU clocked at 350MHz?
Probably because whatever card you're speccing out here is a pile of shit
RDNA2, the most geographically locked GPU generation ever made.
what?
@@BonusCrook What? High end RDNA2 was not available for 2 years in most regions.
Yes, criticism of the RX 7800 XT compared to RDNA2 is valid in the US and UK, but not for my Latin American region and others where there was no stock of high-end RDNA2.
@@takehirolol5962 all regions had it bad, they were basically being shipped to miners everywhere
@@BonusCrook Not the US and UK...the only regions that sell and matter...
@@takehirolol5962 Literally all GPUs here in the US were either unavailable or absurdly above MSRP at the same time. Even old gen cards were being sold at what is now mid range RDNA2 prices.
To be fair, the memory bandwidth of that 12CU RDNA2 card is pretty much comparable to what 12 CU APUs (Radeon 680M) have available. The 6400 version with 128GB/s is outright generous in that regard if you view it this way.
Maybe add CS 2 in the benchmarks
12:20 just in case you didn't know, the view distance setting in Fortnite has no effect on enemy players, they render equally far on Low and Epic. it only affects world LOD and object (weapons/items) distance
While I could see a use for the RX 6400, I never saw a use for the RX 6500 XT, and now, the RX 6300. At least with the 6400, you can drop it into an old workstation PC and have a quick low-end gaming PC without the need for supplemental power. But for the 6500 XT, just get a 6600, and for the 6300, just get a 6400. There is absolutely no reason for these two cards to exist.
Because they're cheaper than the cards above them...
The low cost and power consumption of the rx 6400 is also nice for broke people living in a hot climate without an aircon. Though I can get more performance for the same-ish cost, I ended up choosing it over the rx 580 or 590 for this reason.
@@DoubtingThomas333 Given how small the price difference is between the RX 6500 XT and the RX 6600, just save the extra $50ish. It might take a bit longer, but at least it's a card that will perform well for a while; the 6500 is basically DOA.
6300 is an OEM card right? Surely the reason for it to exist is it was cheaper to manufacture and fit in the PC case.
And as far as the 6500xt, the reason for it to exist was clearly because our standards were so low at time of release that AMD thought they might be able to market it. Literally a period piece in the form of a gpu.
@@stratuvarious8547 I can get a new 6600 for less than a 6500xt. There seems to be more variety in skus for it, so the spread in pricing is wider.
"The RX 6300 performs similarly to a gtx 1080ti in Alan Wake 2!" 👀😁
In its defence, the 1080Ti at least looks better 😁
That's because Alan Wake 2 is unoptimized, requiring mesh shaders to work.
I'm surprised how well it actually ran. I was expecting it to not run modern games due to ram restrictions alone.
32 bit? Damn! And people talk shit about the 128 bit bus in the 4060 and 4060ti!
yet not about the 6650 XT and 7600, hmmm i smell a double standard
@@BonusCrook those were cheaper than nvidia's competing gpu lmao
@@mangolover1 wtf no they werent they were ~$300
@@BonusCrook The 6650 XT dropped to under $200 I believe, and the RX 7600 was $60 less than the 4060
@@mangolover1 The RX 6600 launched with an MSRP of $330, the 6600 XT at $400, and the 6650 XT at $430, so their prices were not cheaper, while the RTX 3060 Ti/4060 Ti have a $400 MSRP and the 4060 is $300
Can't wait to see a showdown of the 8700G vs the RX 6300! As both have 12 compute units!
Hey there! Since you were talking about the specs, I noticed that the 6300 shares a similar compute unit count with the 680M while of course running the same architecture. Went ahead and tested the few games I own that showed up in the video on a Ryzen 7 7735HS in my Minisforum Neptune HX77G.
Cyberpunk 2077
1080p Low, FSR Quality
Avg FPS: 44
1%: 29
0.1%: 21
1080p Medium, FSR Performance
Avg FPS: 47
1%: 31
0.1%: 23
Ratchet & Clank: Rift Apart
1080p Very Low, FSR Ultra Performance
Avg: 49
1%: 33
0.1%: 2
Quick note on ratchet and clank, while the .1% was in the single digits it didn’t feel bad and it did jump to 15. Another strange thing, if msi afterburner is reading the apu’s vram usage correctly, it settled around 8.5gb for most of the intro mission in the parade.
Even tested cs2, unfortunately my msi afterburner overlay wasn’t showing up and the in-game fps counter isn’t very detailed but at medium settings 1080p fsr balanced it stayed around 120fps and dipped to 60 when near smoke, near npcs, or in detailed areas such as the reactors of Nuke.
And before anyone questions whether my 6600M was running instead, I monitored both GPUs' usage and never saw the 6600M go above 1%.
Just tested cs2 on my 6600m and found out that on the 680m it is incredibly stuttery. Definitely running on it though
Gaming Display Adapter™ I bet this thing would have an epic boss battle against the gt 1030 GDDR5. Like Godzilla vs. King kong.
I wonder if there's some crazy SBC out there that would pair this GPU with something like an N200 CPU and 8-16GB of system RAM. That would be a super low-power SBC and it would probably be about as cheap as you could go for modern architectures with DDR5 and dedicated graphics.
I think it's made for home theatre computers or basic office machines, like the GeForce GT 1030. But given that AMD APUs have a GPU in them, this seems odd.
Believe it or not, it IS integrated graphics, or at least it has the same amount of compute units and the same architecture as the 680M. But yeah, it's definitely odd looking at it from a gaming standpoint since iGPUs are starting to become faster; it's why Nvidia stopped making display adapters. This thing is really only useful for adding display output to a system with no other way to do so, or connecting extra displays if the current ports are already used up.
I mean, this card would be really neat inside a retro XP/Vista/7 machine as it is low profile and has low power consumption. No doubt it would play every game from that era at 60FPS no problem. But I doubt anyone would bother making custom drivers to get it to run on old OSes.
did you try it?
Maybe Linux with wine would be better.
The GPU is PCIe x4 gen 4 which would be really bad at gen 2 or gen 1 speeds
Don’t think drivers would work either
Even with the best stuff you can get for Windows XP - which is a 3rd gen Intel i7 processor - you would, at best, get PCIe 3.0, which would make this card perform even worse. Not really worth it, especially considering driver difficulty.
I think we are pretty spoiled with gaming. Frame rates, resolutions, quality modes, they are all so much better than what used to be acceptable in PC gaming.
even low settings look really good these days
Yeah dude. Ever since more realistic graphics have started to be made, we can't accept 1080p anymore. We've gone from shimmering which we were fine with to "it's too blurry". We've gone from "this is such a good game!" to "ugh, I can't get 60 FPS. What a doze".
We've gone to that because if you have one game that can run at max graphics, really high FPS and resolution, and then another that can hardly hit 60 (Doom Eternal vs Hogwarts Legacy), even with a few years between releases, the performance is just not acceptable given what we have available
I remember 10 years ago that, if a GPU in a $600-700 PC could do 1080p60 in all games at say medium to high settings, that would be considered great value. Today, if a GPU can only do 60 FPS at 1440p, it is considered as “low end”.
We are spoiled with the hardware and the innovation of the tech industry as a whole. Gaming has been lagging behind solely because of corporate greed, not because we don't have the capabilities.
11:47 I like it how the color of the cars changes instantly. Is it some kind of a 6300 "special" feature?😅 Or is it meant to be like this?
Anyways, nice video, keep it up!
Minor nitpick: RE4 Remake has a less GPU-heavy reconstruction technique, interlaced rendering, which is around 25% faster than FSR 2 on the max performance setting. At 100% render scale, image quality doesn't take a huge hit in movement (at least at 1080p). However, since it's a reconstruction technique that renders half the pixels on one frame and the other half on the next, I assume it'd look worse and worse at lower resolutions. It's still better image quality and performance than FSR 2, which I think is poorly implemented or something, because it doesn't seem to scale performance properly the lower you go.
Also even if it's pretty meh in performance, I'm honestly impressed it's only using max 25W under maximum load.
Never thought I'd see a 32-bit memory bus on something other than the Geforce 6200LE, but here we are.
Thanks to GDDR memory and clock speeds being 8 times higher the 32-bit bus club has come a long way in memory bandwidth and performance.
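That growth is easy to put numbers on with the bus-width × transfer-rate arithmetic. A rough sketch (the clock figures here are ballpark assumptions from memory, not verified spec-sheet numbers):

```python
# Rough memory-bandwidth math for the 32-bit bus club.
# bandwidth (GB/s) = (bus width in bits / 8) * effective transfer rate (GT/s)

def bandwidth_gbps(bus_bits: int, transfer_gtps: float) -> float:
    """Peak bytes per second across the memory bus, in GB/s."""
    return bus_bits / 8 * transfer_gtps

# Assumed, ballpark figures for illustration only.
cards = [
    ("GeForce 6200 LE (32-bit DDR, ~0.5 GT/s)", 32, 0.5),
    ("RX 6300 (32-bit GDDR6, ~14 GT/s)",        32, 14.0),
    ("RX 6400 (64-bit GDDR6, ~16 GT/s)",        64, 16.0),
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gbps(bus, rate):.0f} GB/s")
```

Even on the same 32-bit bus, GDDR6 at ~14 GT/s moves roughly 28× the data of old ~0.5 GT/s DDR, which is why these tiny-bus cards are usable at all.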
8:26 you can hear the smile on his face when he sees something above cinematic performance out of this thing
That ratched and clank footage could almost be a good thing if they intended a "pixel art" art style.
"what is this growth on my p-"😂😂😂 1:18
Oh no, how embarrassing, I didn't realise the camera was still rolling 😳
"P" for PC 😂
@@IcebergTech 🤓🤓💀💀💀
My man will make Techpowerup update their website just because of this video. "The Radeon RX 6300 is a graphics card by AMD, that was never released."
Or funnier than that: "The RX 6300 is one of the GPUs ever released"
Haven't had channel notifications since Xmas, thought the vids had stopped
Interesting video. Might make for a nice display adapter that will likely have longer support than, say, a GT 1030, and no messing around with Mini DisplayPort like some of the Quadros. It does go to show it has enough horsepower to mess around on when not working, though. :)
So this piece of eWaste is good because...... it's new? I seriously don't see what you are trying to say.
@@BonusCrook As I mentioned in my comment gaming is not the only reason why GPUs exist. HTPC, workstations running OpenCL workloads etc. And yes, driver support does matter with any PC hardware. Not every GPU is useless just because it can't play Alan Wake 2.
If you are interested, Sparkle *finally* released a single-slot, low-profile A310 with almost the exact same specs, but with an extra 2GB of VRAM and a 64-bit bus. It does have 2 miniDP ports, but they were nice enough to throw in an HDMI port so you're not entirely screwed. It is about $40 more though, so it's not as much of a "mess around/slap it in and forget it" price that it should be.
However, it's Arc, it's very low end, and thus there are zero reviews of the thing easily available in the wild. There aren't even any 3DMark scores for the thing. I'm assuming that anyone buying one is going for the media server build since Arc is amazing at it. Frankly, if I didn't have other priorities in life and money, I'd give it a whirl and leave it in a media server after laughing for a good hour.
@@EbonySaints the asrock a380 LP is 1.5 slots and way better
@@EbonySaints Cool. Good to see Intel getting involved in the various sectors. The only issue is that reviews online seem to imply Resizable BAR is required, and most people using something that low-end will be upgrading old PCs or off-lease office PCs that often have a very locked-down BIOS where setting such things might not even be an option.
4:34 i kinda wanna see what would happen if you used this GPU in an ancient PC, I need to see the pain.
Now I’m interested.
I've seen people put those low-profile 6400's in their old half-height OptiPlexes with 2nd-4th gen Core series CPUs, and I'm curious to see if it actually becomes unplayable or if the CPU holds them back first.
It would probably start to crumble and try to kill itself as its meager 8 GB/s of PCIe bandwidth is ripped in half to only 4.
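That halving checks out on paper: each PCIe generation doubles the per-lane transfer rate, and Gen3/Gen4 share the same 128b/130b line encoding. A quick sketch of usable link bandwidth for an x4 card (line-encoding overhead only, ignoring protocol overhead):

```python
# PCIe link bandwidth sketch for a Gen4 x4 card dropped into older slots.
# Gen1/Gen2 use 8b/10b encoding; Gen3/Gen4 use 128b/130b.

def pcie_gbps(gen: int, lanes: int) -> float:
    gt_per_lane = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0}[gen]  # GT/s per lane
    encoding = 0.8 if gen <= 2 else 128 / 130              # usable fraction
    return gt_per_lane * encoding / 8 * lanes              # GB/s

for gen in (4, 3, 2, 1):
    print(f"Gen{gen} x4: {pcie_gbps(gen, 4):.2f} GB/s")
```

So a Gen4 x4 link is roughly 7.9 GB/s, a Gen3 slot cuts that to roughly 3.9, and a Gen2 machine leaves the card with about 2, which is why these x4 cards fall apart so badly in old systems.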
tbh i'm quite impressed that the lowest end card of that generation is this good
i remember when cards that looked exactly like that were GT 210's and 710's
and those were not able to reach even close to such performance
we really have gone a long way since then
AMD and NVIDIA could adjust their general GPU prices to consumer friendly levels... but instead they decide to release E-Waste such as the RX 6300...
it may seem like e-waste, but it will hold up better than some older GPUs at the same level of performance... sadly it's scarce, with no driver support, barely any PCIe lanes, and not even 4GB of VRAM :(
the 5300 with re4 shouldn't crash at those settings. It shouldn't actually exceed the vram limit but dear god will it have some streaming issues.
Geez. Even the equivalent of the HD 6450 is now a capable gaming GPU. Not for new AAA games, but who plays that garbage anyway? Edit: Yeah, in passmark, it's faster than an R9 280X! :O 6545 points to 6226!
wasn't there an AMD feature that allows you to go past your VRAM limit and use system RAM, for old cards like these? not saying it'd make a huge difference, but it would be interesting to see it in action with something as limited as this
HBM only
Iceberg actually featured this tech on the Vega 64 a little while back. Only works with a select few HBM vram cards but it would be interesting to see something like that work for limited vram gpus.
@nathan4746 I have an old HD 6670 1gb gddr5 and it had this feature, although the performance wasn't that great considering it was taking from ddr3 system ram
I am also a "Redacted" adult who enjoys techpowerup and countless gpu comparisions and tests from the past and FUTURE!!!!111
That 2GB of VRAM is just too much of a bottleneck in this day and age. It's so easy to get integrated graphics that you can assign more VRAM to. My old Xeon Workstation at work, for which I basically just wanted a card that would handle two monitors and not choke on 1080p video, has an RX 560 with 4GB of VRAM.
That Alan Wake with PS1 aesthetics is kinda of a vibe ngl 🔥
I was *so* close to getting one of these, but got outbid later on in it and decided it wasn't worth it after around the ~$50 mark.
Sorry about that. Pretty certain that was me who outbid you.
Currently using it for a low-power live TV recorder/encoder, along with an N100 CPU.
Do need to try gaming on it. I know a 780M can mop the floor with it, but it'd be interesting to see FPS per watt on it.
@@Vekstar It's fine, would have been cool but also wasn't a necessary thing for me to have. Pretty cool to hear what you're using it for, though!
Amazing. Honestly didn't think you could just buy one of these. It basically is a glorified display adapter!
Come to think of it - if only this thing had an analogue port like VGA or D-sub, it could be somewhat interesting...
Even if it had, you wouldn't have needed to buy it solely for that reason. What is wrong with using an (active) adapter from e.g. HDMI or DisplayPort to VGA or D-sub? You can use those with any (i)GPU.
it only needs 6 more GB of VRAM and it'd be a very budget-friendly card, good job AMD
I'm pretty sure resident evil 4 got updated to where you no longer crash to the desktop if you run out of vram now.
I just can't believe that CP2077 is more than 3 years old!
I see you HDMI 2.1. Man this could have been a really interesting card.
Low-key wanted you turn on rAyTrAcINg just for more laughs
if i didn't know it was because of the GPU i'd have suspected a lot of the footage had big old CENSORED mosaics
i mean , this is cooler than i expected
better never released than becoming e-waste.
even the integrated graphics are better than that.
This GPU is awesome for 24 watts, but I think a big OC would help it a little bit
No amount of overclocking is going to fix 2GB of vram and a 32-bit bus 💀. This thing is manufactured eWaste.
@@BonusCrook It's probably for those Dell office or HP builds. When the iGPU can't support many monitors well: add this.
Would be fun but otherwise pretty useless since at best it might be a 6400 with a 32bit memory bus and only 2gb of vram. Also doesn’t help that it’s locked down unless there’s an easy way to bypass it.
Now get RX 6600M, the desktop version one
i dont think its worth benchmarking it, its almost exactly the same as a rx 6600
This card would have been amazing 12 years ago 😂
I found an RX 640 in a local store and I haven't seen much information about that one. I think it's based on RDNA and wonder if it's a rebrand of some kind
its polaris, look at techpowerup
if it had more PCIe lanes and 4GB of VRAM it might be as good as my 1050 Ti or even better! Sadly there's no driver support or FSR3; otherwise it would be not a bad choice and a really interesting mini card XD
'if it was a different card it would be good'
-UA-cam commenter 2024
I have good news! You just described the rx 6400 without the extra PCIe lanes…
4gb of vram, officially supported with AMD’s drivers, fsr3 support, and many models are half height like this one.
Why does the RX 6300 look the exact same as the random GPU in the Dell Vostro from 2007 😭😭
Standard Dell...
@@IcebergTech Is that why the chromebook bezels are so big?
By the way, you don't even need to touch the Game Pass app to play games on Game Pass.
Unlike on Steam, Game Pass games don't require a launcher because Windows itself is the launcher.
Windows does all the actual heavy lifting, including handling the licensing, installing, and updating, so install the game directly and launch from the Start menu.
May Allah (S.W.T.) guide you and bestow upon you His Blessings; Ameen.
This is actually pretty powerful for what it is. That's amazing that it can outperform the Xbox 360 so well. This could totally make a super budget gaming pc, if not for the fact that double the money would get you way more than double the performance on the used market.
second ig
oh wow that ratchet and clank footage was brutal.
Damn, I did NOT expect you to make a vid on this GPU! Like, never, because not even one youtuber has made a review of it
useless card
this is like the gt1010 of AMD world 💀
"it's not THAT bad like Intel iGPU"
meanwhile a 4-year old Intel iGPU: 13:23
It scores higher than this GPU does.
This was probably developed for thin clients, but the chips available were too few to release it. AMD pads their portfolio ahead of tape-out based on estimated die defects. If the dies turn out better than expected, the lowest card on the totem pole is sometimes nonviable without scrapping the next highest tier in the range.
I mean, that's pretty dang good. I remember when Crysis was out and what used to be a legitimate mid-tier card would grind at 2fps, if you were lucky, at 1080p while browning out your house. How far we've come that this little thing can run Avatar at 10 fps
This is the worst logic I have ever seen someone use to justify something being good.
Wow, this card is much more disappointing compared to the 5300, which honestly just feels like a 5500 but with 3GB
The reason these cards are so lacking is that they're probably just some bare bones business class desktop accelerator, mostly for 2D stuff, where an APU wasn't an option. Or maybe it offers multi-monitor support.
Integrated graphics are probably punching at the same weight class. 🤔
If you buy surplus servers, and want a card that can help transcode video for a streaming server, this kind of thing sometimes works. It usually won't require added power input, comes in low profile, and often this level of card is passively cooled because servers have decent airflow.
Otherwise these cards are pretty useless.
What is the growth on my p
You win.
I about died laughing on this one. 🤣
I guess it was AMD's answer to NVidia's GTX 1630. Both cards don't really have a right to exist and can probably be outperformed by an APU.
please stop benchmarking fortnite and benchmark CS2 instead
i fu*king hate fortnite ewwwww
Nice videocard for the money.
Dawid found an RX 5300, and you found an RX 6300, wtf, what's next? An RX 7300?
Let's hope not!
32bit bus… What an abomination.
if u use the MorePowerTool u might be able to increase the TDP and the clocks
I always dance to that loud music
I use a Navi 24 RX 6500 XT for AFMF on my Vega 10 16GB MI25/WX9100. The 6500 rarely sees more than 20% load doing this.
IMO, this RX 6300 *is* an 'AFMF "Feature Add" card'.
Also, Navi 24 is Gen4x4, so it's readily adapted to a spare M.2 slot, even in a mini-PC or 'server'.
If this lil RX 6300 becomes common as eWaste in coming years, it will be a neat lil 'enthusiast gadget'.
To be fair, the only page on Dell that markets them clearly says they are "Business solutions" and that they are low-profile cards designed/made for two OptiPlex models. That is why they are only x4 PCIe, and part of why they're so low spec. They can't really pull any more power than what the mobo provides, so unless they supplement it with SATA power, it's probably 100% PCIe-powered. It's about the best you can do for one of those without rigging up your own power supply. Then you are still limited by Dell's notoriously bad mobo design/poor specs.
There was a market for those kinds of cards, actually. During/around the pandemic, people were buying those cheap OptiPlexes but weren't able to re-pin a PSU, or didn't even know they could, and just wanted something plug and play. Dell probably just bought up as many of the lowest-binned dies as they could and slapped these things together as cheaply as possible. That's on the card partner, not AMD, especially in this case, since it was probably Dell's idea. It looks like, new, they were still a good bit cheaper than the 1650 when it came out around that time.
As another person mentioned, this is far from the first time it's been done. It was actually Dell, back in 2003ish, that got me back into building PCs... almost out of spite at first. I wanted one for BF1943. My last build was a 133MHz Pentium 1, I was gifted a Dell laptop, and I didn't really keep up in between. Pentium 4s had just come out, so I "built" one using their PC builder thing on their site. I had the choice of 3 P4s besides the base chip for that PC model (something-plex). They let me pay an extra $350 for the best one they offered. Then I added an ATI 9800 at retail price! It took about 7 days after the short return window to realize that the CPU bus on the mobo in that model would not let any Pentium 4 chip run any faster than the base model chip that would have been free. It wouldn't have been able to cool it if it had, either. It had one fan in the back with a hood that went over the CPU "heat sink". Standard boards would not fit in the case either, nor would the PSU work with them. I then realized that the "ATI 9800" was OEM and only had maybe ½-⅓ of the "graphics pipelines" of the retail card, along with a dinky cooler just like this one (20 years ago) that blew its hot air onto the CPU, throttling it constantly. I tried to see if I could get a mobo from them, and that's when I learned about Dell's "award-winning support", where a guy I could barely understand would just type a BUNCH for like 40 seconds, then tell me how much RAM I had, the first 4 times I asked him if I could get a new mobo. Then he told me they don't recommend getting a new motherboard due to cooling or something. I hung up, got online, and built a MUCH better PC with a cool case for $300 less than I paid them for the Dell (the Dell was $1300), then went and built another that was just as good, if not better, for $500 because I enjoyed it so much. Been a bit of an addiction ever since. I regulate it to once every 3 years, but gladly build for others anytime I can!
Apparently Dell is still doing the same thing via Alienware. "Dell! Encouraging people to build their own PCs for over 2 decades!" LOL
At least they learned not to let them claim it's a card it isn't anymore. My point is that this is likely a relic from the "shortage", or maybe Dell finding a way to take advantage of low-binned overstock after the sudden bubble burst from the crypto crash. Either way, since that is the only place it is marketed at all, and it is only marketed as a graphics boost for an office PC, to me that doesn't sound like a dirty secret at all. If they, or even just Dell, had claimed it was for gaming, that would be VERY different. I will have to give it to Dell on this one, oddly enough. Especially if it was during the shortage, as they would have sold A LOT had they marketed it that way.
$ end /commrant\;
Clearly a Video Card for Dell/HP computers without iGPU..
Functional, but slow/small.
It could have been the $50 GPU I suppose. Although they would have likely tried for $100 just for fun 😮
Just ordered an RX 6300 for my server to replace the pretty poor GT710. Mostly for Linux compatibility overall. £60 gamble on eBay but worth a shot given that basic display outputs like the GT710 go for similar money second hand.
"2GB framebuffer"..... No. That's VRAM, video RAM, which holds textures among other things. The framebuffer is the size of resolution x color bit depth x double or triple buffering.
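To put a number on that distinction, here's a back-of-envelope sketch (1080p, 32-bit color, and triple buffering are the assumptions):

```python
# Framebuffer size vs. VRAM: the framebuffer itself is tiny.
# framebuffer bytes = width * height * bytes per pixel * buffer count

def framebuffer_bytes(width: int, height: int, bytes_per_px: int, buffers: int) -> int:
    return width * height * bytes_per_px * buffers

# 1080p at 32-bit color (4 bytes/pixel) with triple buffering (assumed).
size = framebuffer_bytes(1920, 1080, 4, 3)
print(f"{size / 2**20:.1f} MiB")  # ~23.7 MiB -- a sliver of the 2 GB of VRAM
```

So even triple-buffered 1080p needs well under 25 MiB; the rest of the 2 GB goes to textures, geometry, and other assets, which is exactly where this card runs out.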
The 6300, from a dev/OEM standpoint, had room for growth, but the need did not outweigh it in that series' baseline. It supported, with MOSFET upgrades and a better cooling solution (from us integrators), 4, 8 & 12 GB memory options, and could also sit in a CrossFire array with tandem pairing only, without further debugging across quad-lane usage.
I've been looking into very low-end GPUs to pair with an old Ivy Bridge laptop CPU and motherboard for a small emulation machine. The top contender is still the GT 1030, which is almost 10 years old and still costs the same as at launch. 60 pounds for this might not be that bad considering the bleak state of 10+ year old hardware at the very low end; at least it's a relatively new product.
HOW big is that memory bus, well I guess putting only ONE memory chip on a card is really cheap.
I'm surprised AMD and nVidia still make these kinds of gpu's. If you are only playing older/dx11 games buy a 1060 or 1070, even a 980 like me, and if power consumption is an issue get a 1050 ti.
While these numbers aren't amazing, on the budget side of things (especially if you really only play indie games or just games from 6 or more years ago,) this GPU for 70 Euros or 70 US Dollars could be good for those who have an older system, or rather, a SFF system that would like to game, but don't want to spend more than a hundred on their GPU. After all, your only options today would be this or the RX 6400, and we know that the price for that would be a bit over a $100 budget for a GPU such as this.
Fun fact: Dell is selling this GPU on their website for over $200! If form factor isn't an issue, you could buy the RX 6600 (non-XT) for that price! Very classy Dell.
This comment is the definition of complacence.
I didn't find any story or information in this video about why this trash even exists. It's probably defective chips that can't be used for the low-profile RX 6400
please try fluid motion frames on it =))))))))
What was the point of testing a $60 card like this with so many high-end games? Talk about trying to get blood out of a stone for clickbait. For the money and spec, it did damn well, and it seems to exist not as a gaming card but for office and server PCs!!!!!
uh, seriously, this feels like clickbait, calling it a dirty secret when it isn't. It is a card produced specifically for Dell PCs; it was not intended for retail gamer use, it was for pre-built PCs that were very low-end "gaming", or what I call a basic-use PC. if this is what it takes for your channel to get views on videos, you might as well quit and step aside. a better title is "the obscure RX 6300 only found in Dell PCs". there, fixed the clickbait title for you.
A 64-bit bus is enough for a display adapter, not for a GPU, but 32-bit... wondering if it performs better at Netflix than the Nvidia 1030...