@Linus Berglund if I could have got the a750 for £190 three months ago I'd have got it rather than the rx6600. At under £200 I think it's really good value
@@immortaljellyfish1051 Still a good price, since the RX 6600 is good value. By the time the A750's drivers fully mature, expect Intel to have something much better at that same ~190 price, although that might also mean further price drops for the A750. Currently in my country the 6600 costs 12-13k while the A750 is 15k-18k, so I'd rather pick the 6600.
@pauloazuela8488 You also raise an interesting point about Intel fully maturing. I think the A750 might be more powerful than the RX 6600, but the drivers and Intel's lack of experience are holding it back somehow. It will be interesting to see if the next Intel GPU achieves its full potential and becomes the better option. The Intel A380 with 6 GB of VRAM seems massively underwhelming, though on paper it should be far better.
I can see Intel being in a good place with their GPUs after another couple of years of driver development. I suspect the A750 is a fair bit more powerful than the RX 7600 in most cases, if the technical specs are anything to go by. Obviously the drivers are still immature and they have a lot of catching up to do, but considering the good steps they've already made, I don't see why they can't get there and be competitive.
Making this comment before finishing the video, FYI. Pay attention to the wattage; I'm seeing some differences. Not sure how accurate the readings are without a meter between the wall and the power supply, but it's interesting.
Here in Canada, the A750 is $340 while the cheapest RX 7600 is $360; on the Nvidia side, the cheapest RTX 3060 is $375-380. So the RX 7600 is $20 more than the A750, and the RTX 3060 is $15-20 more than the RX 7600, or $35-40 more than the A750. That's brand new before tax (with 13% sales tax included, the three cards come to roughly $384.20-$429.40). On the used market, an RTX 3060 Ti or perhaps even an RTX 3070 would be in that price range on the Nvidia side, while on the AMD side an RX 6700 XT would be possible, with the RX 6800 being close.
I noticed that in most games the Arc GPU was at between 80% and 90% usage while the AMD GPU was between 95% and 100%, and between the two cards there was a 10 to 15 frame difference. So I wonder if the Arc driver is still holding back 10% or more (game depending), which would quite likely account for the difference in frames! So I think the hardware is as good as its NVIDIA and AMD equivalents; it just needs driver work to get it up to 100% usage, and then it needs to be retested, because I really think that 10% to 15% usage difference is holding it back at the moment.
The legend has it that Intel Arc cards are still improving with driver updates. It was a nice try by Intel, but the performance is still quite variable in different titles.
It's a first gen "beta" product really. The good thing is all the work they are putting into the drivers for this card should benefit them with future GPU releases.
The Intel Arc cards look so nice design-wise, I love it! :D I wonder what we'll get in the next generation; maybe Intel can close the gap a little bit more.
The Arc founders edition cards are beautiful. I would pick one up for the looks alone but unfortunately they are poorly built; glue is widely used throughout its construction making it difficult to work on.
@@Swiss4.2 I have the A770 Limited Edition card as well as Acer's Predator Bifrost A770. The AIB partner models seem to be better constructed than the reference design, but the reference design also occupies a smaller footprint (closer to a 980Ti), making it a perfect candidate for my ITX travel rig. I wasn't ever expecting to get an Intel card...that is, until I got the reference card a week after it launched. I made sure to get it from a place with a good return policy, just in case. Instead of hating it, though, I saw great potential in the early results and decided to ride out the turbulence, and so far it's paid off rather well.
A750 put up a decent fight. Over here, A750 is about 70€ cheaper than the 7600 (275€ vs 345€). Looking at the results, they seem to be about equal in price/performance.
Have you guys not gotten the price cuts for the a750 yet? I managed to buy mine for 2300SEK (about 190 pounds) in Sweden just a week ago. That is crazy value!
I wonder how Skyrim runs on those two GPUs. The A750 is a pretty interesting GPU. If Intel keeps improving the drivers, it might end up as a good GPU, but right now I wouldn't use it as a main GPU. As a backup or testing GPU, though, I would like to get one.
@@shepardpolska yeah, multi-GPU is dead. Though I should've been more coherent: I meant that I want to see Skyrim running on the A750 vs the RX 7600.
About a 10 watt power consumption advantage to the AMD card at idle and while gaming: 5 watts idle and mid-150s gaming versus 15 watts and mid-160s for the Intel card. Fixing the driver issues could close this gap, and will likely bring good performance improvements in the future if they keep developing drivers for this card.
While I'm cheering for Intel to do better (never thought I would write that sentence), I simply can't recommend these cards. It's possible that Intel abandons the dGPU market in the near future, meaning these drivers may never be improved. AMD, on the other hand, seems very committed to supporting their products, so I think it's just a no-brainer to go with the 7600. I also need to mention that, unlike the RTX 4060 (Ti) and the A750, the RX 7600 is actually a product that makes sense. The 4060 is limited by its very weird hardware configuration, and the A750 has very weird drivers.
I really like your video style. It has a budget-friendly outlook and compares hardware at face value. It's always nice to see a mid-range comparison, especially with AMD and the newer Intel cards, which are often dismissed in favour of an Nvidia card that usually costs more.
I like to play older titles, and the lack of hardware DX9 support from Intel is a big turnoff. I have a laptop with a Core i5 with Xe graphics, and most old DX9 titles run like crap on it. Meanwhile, the Radeon graphics in an old Ryzen 4000 run them flawlessly.
Can't find any at that price anymore. And anyway, ignoring RT, at whatever price the A750 sits at, AMD's 6000 series has a more consistent performer and overall better value.
Instead of that, buy the 8 GB A770, if you're OK with 8 GB cards and Intel in general. In a lot of places you can get it for the same price as the 7600, maybe even cheaper, and it's 15% faster than the A750.
The RX 6700 10GB is so close to the same price (in the US) and offers better performance while using roughly the same amount of power as these two cards. That would be my pick.
I noticed Mafia: Definitive Edition worked better using DXVK with Intel Arc. The FPS wasn't much higher, but it got rid of that stuttering most of the time (it stuttered only for a while after launching the game).
That's definitely the drivers. Intel VRAM use > AMD VRAM use > Nvidia VRAM use is how it generally goes: AMD uses 0.5-1 GB more VRAM than Nvidia, and Intel uses 1-1.5 GB more. This is why smart Intel buyers get the 16 GB A770; you really want that extra memory just in case.
Just curious, did you have Resizable BAR on? It's recommended with the Arc cards, just asking. And the Intel GPUs do really well at 1440p; maybe it's the bus.
Is the Arc A750 compatible with my PC? And if so, is there any drawback, or will there be any bottleneck issue? Specifications: Motherboard: ASRock X370 Gaming X; Processor: Ryzen 7 1700; PSU: 550W; RAM: 16 GB
Nope. Arc requires you to use Resizable BAR in order for it to even perform decently. The Ryzen 7 1700 doesn't support it so you'd need to upgrade to at least a Zen 2+ processor to be able to use Resizable BAR. You'd also need to check if your motherboard supports it too. It would be called "Above 4G Decoding" in your BIOS.
I don't care about the FPS when they can achieve over 60 FPS. However, the ARC graphics card does look a lot better in graphical sharpness, and color depth & saturation.
@@KilzKnight In the video, the Arc A750 looks better in color depth and sharpness than the RX 7600. Having said that, I just installed the Acer Predator Bifrost Arc A770 in my system to play with, pulling out the Zotac RTX 3060 OC. Let me just say that FPS is well above 60 in Halo Infinite and 120 in Halo MCC, but there are still a lot of micro-stutters happening. In Unigine Heaven's benchmark it got 5341 points with an average of 212 FPS. Here's hoping future drivers resolve this issue and make this a great card to have. Update: I downloaded the latest drivers from Intel's website and installed their app and drivers, and disabled the Predator Bifrost app. Graphics quality went way up while returning the same Heaven benchmark score, and the micro-stuttering was also eliminated. I'm saying this card is a potential winner for the price and performance.
@@KilzKnight I had an A770 to play with over the weekend. With the latest drivers, the graphics were on par with an RTX 3060 XC. I should know, since I have another PC with the 3060 in it and was able to compare them side-by-side. I also have an RTX 3070 Ti, an RTX 4070, an RX 6700 XT, and an RX 6950 XT to compare against. However, the drivers for the A770 are still not ready for primetime, since they caused the card to run really hot during a Heaven benchmark run (89 deg C). The micro-stuttering was gone with the latest driver though, so there is that. When the Intel cards and drivers mature, I will revisit the A770 16GB.
I have heard with 7000 series Ryzen you can do passthrough with RDNA 2 and newer GPUs without any performance drop, so I wonder if anything similar will be possible with Intel’s integrated GPU (it could be a way to save power when not gaming).
@@MirelRC Crossfire is about making use of multiple GPUs at once. This is about using either one: a dedicated GPU passed through to the integrated GPU's connectors, or only the integrated GPU being used. It was mentioned in a Linus Tech Tips video about an AMD tech upgrade for one of his employees, where the guy needed more HDMI connectors, so he could use the integrated GPU's output for his monitor without any issue and the one on the dedicated GPU for his VR headset. I don't know anything more about it than that, which is why I'm so curious about this topic, lol.
I think the most amazing thing is the VRAM usage. Intel is getting crushed in this respect. If they can continue to optimize their hardware then I do think they have a great chance of taking on the Green and Red head on.
Can you review the best graphics card that doesn't need an extra power connector? I'm curious, because nowadays these are almost nonexistent. Can you please? AMD and Nvidia.
One thing no one has noticed is that the Arc A750 has richer colours and is more true to life, so if I only had the choice between those two cards, well, Arc A750 all the time.
That argument is nonsense. No GPU today has any advantage in this kind of thing. Both AMD and Nvidia have a full set of on-the-fly color/brightness/saturation/etc. controls that you can set up per game and program, if you want. Nvidia even has some integrated shader modifiers if you like to tool around with those.
At around 2:22 when you benchmarked Cyberpunk, the A750 used 1 GB - 1.5 GB more. Does that have anything to do with the Intel drivers or the game? I'm just curious, because I'd think it should be similar (I might be wrong). It's worse in The Witcher 3.
@n n I don't get many driver issues with my RX 580; the latest drivers are incredibly stable. I haven't experienced Intel drivers yet, but in time I will purchase an Intel Arc graphics card. As for Nvidia, the drivers are good, although when I had my GTX 570 it was crashing constantly, which is why I went for AMD next.
The question is "Which one is better?", not "Which one is good?". Even if you like both, you can't just say they're equally good, you have to choose one, even if it's by a very small margin. Always choosing the middle ground won't get you anywhere in life and posting crap like this also wastes the time of people researching which one to choose
There’s kinda no point going with the 7600 right now; there’s still the 6650 XT and the 6700 non-XT, which are cheaper and perform basically identically, with the added bonus of the 6700 having 2 more GB of VRAM.
I would not read too much into the .1% Low Percentiles of either card in very CPU intensive games such as The Witcher 3 Remastered and Cyberpunk 2077. You'd have to average out a number of runs for reliable results. Regardless I think both cards do very well.
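To see why single-run 0.1% lows are so noisy, here is a minimal sketch (with made-up frame times, not data from the video) of how "1% low" and "0.1% low" FPS figures are typically derived from per-frame times:

```python
# Sketch: deriving "1% low" / "0.1% low" FPS from per-frame times.
# The frame-time numbers below are hypothetical, for illustration only.

def low_percentile_fps(frame_times_ms, fraction):
    """Average FPS over the slowest `fraction` of frames."""
    slowest = sorted(frame_times_ms, reverse=True)
    n = max(1, int(len(slowest) * fraction))  # at least one frame
    avg_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_ms

# A 1000-frame run: mostly ~60 fps (16.7 ms) with ten ~30 fps spikes (33.3 ms).
run = [16.7] * 990 + [33.3] * 10

print(round(low_percentile_fps(run, 0.01), 1))   # 1% low: based on 10 frames
print(round(low_percentile_fps(run, 0.001), 1))  # 0.1% low: based on just 1 frame
```

Note that in a 1000-frame run the 0.1% bucket is literally one frame, so a single stray stutter swings the figure wildly; that's why averaging (or pooling frame times across) several runs gives a more trustworthy number.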
The thing with ray tracing is, if I want performance similar to non-RT games, I have to buy something way above my pay grade (actually the most expensive cards). With mid-range and low-end cards it doesn't matter what brand you use, it still sucks. Yes, Nvidia has some advantage, but it's still under 60 FPS, so I get to choose between like 30 and 40 FPS? I would choose not to enable it at all and not bother with RT until a few more generations.
I'd really appreciate it if someone out there using an Intel Arc GPU with an Intel motherboard and CPU could test whether DOSBox works for you, and say which Arc driver you're using. DOSBox doesn't work with an Intel Arc GPU on an AMD motherboard and CPU, so I'm curious to see if Intel mobos and CPUs have the same problem. Thanks in advance!!!
Has anyone noticed how much less VRAM the RX 7600 uses? Basically, even if the FPS were identical, the AMD card would be the safer bet because you won't run out of VRAM as fast as on the Arc A750.
You've also got to remember that Moore's Law Is Dead leaked how the Arc cards may have some major issues at the hardware level, meaning there's a possibility the drivers never improve to the point that Arc cards are competitive (barring a major price reduction), because drivers can't fix bad hardware.
Given that the RX 7600 sits in a very mature architecture and driver environment, I'm insanely impressed the Intel Arc A750 is doing so well with such good power consumption.
Performance-wise they will likely be the same. As for cooling, the Pulse is just very slightly worse (less than 1c) than the Hellhound, so I would imagine the Fighter being worse as the Hellhound has a bigger heatsink.
The Arc is just better in ray-tracing titles. For the price it will be the best RT card for some time as drivers improve, but new cards will also arrive to challenge it 15-20 months from now.
Looks like the Radeon's more mature drivers definitely make a difference, performing better in most cases than the A750 at about the same wattage.
Yeah more established Radeon drivers help massively I think
I suspect the ARC card actually has a higher potential than the AMD card, but Intel still has a couple of years of driver maturity before their cards are at their best
Different hardware. Drivers are only one thing.
@@ihateeveryone8161 i dont think that the a750 is capable to close the gap to the rx 7600, not even with the best drivers. it still has potential and older games definetley have a lot of headroom, but the card in general is weaker.
It's actually a bit strange with the wattage because I have an A750 and daily drive it with a 1440p monitor and I'm always up in the 190w range so for me to see this video and see that the power draw is only between 140-170w most of the time (except in RT Minecraft) makes me think that something is wrong with his card and it makes it underperform.
I find it interesting to see the A750 not being utilized fully in a fair number of games you tested. I hope they keep improving the drivers as the hardware seems to be capable of outperforming the 7600 sometimes..
RDR2 is especially interesting. The A750 isn't being fully utilized again, yet CPU load is much higher than with the 7600 even though the fps on the A750 system are lower.
Yeah I hope it ages like a fine wine
I'm guessing the Intel driver is more CPU intensive because of poor optimisation.
@@Le_Grand_Rigatoni yeah, it's likely there's more CPU overhead than there should be, resulting in worse performance
@@Le_Grand_Rigatoni I don't know that it's a CPU bottleneck. Arc shows low utilization in most titles that Intel haven't specifically gone out of their way to optimize for it. I had an A770 for a while and it was very common to see it well below 90% usage and chugging in games where I know for a fact that my CPU can push hundreds of frames on other cards. Yakuza: Like a Dragon is one example that sticks in my mind, where the A770 regularly dipped below 60fps and utilization went way down for no apparent reason. You could spin the camera around and go from 100fps to 50fps in the process, with utilization plummeting into the 60% sort of range and changing the settings making little difference to that. Intel would have to be doing something SERIOUSLY wrong to be encountering a CPU bottleneck in such a light game, where other cards can achieve hundreds of frames per second (if you turn the settings down to create a CPU-bound scenario). The drivers are just all over the place in general and Intel seem to be putting out fires on a per-game basis, which isn't great if you step outside the very most popular titles.
@@RandomGaminginHD”keeps aging like fine wine”
The Arc cards have come a long way in terms of drivers but Intel really needs to get the VRAM usage under control. The 7600 was using 1 GB less on average than the Arc card was in your testing.
And Nvidia uses like 1 GB less than AMD too lol
That's memory compression technology. AMD and Nvidia each use a different one than Intel. And it's technology the game makers have to program into the game itself. Think of the S3 Metal memory compression technology: it gave games more textures than Glide did, and later all cards used it.
@@erickalvarez6486 it depends on how much VRAM the card has
The RX 7600 is not a bad card, but the prices may fluctuate at the beginning now, for the same performance I would go for the 6700 as it's more available and possibly at a lower price
And 2 extra gigs of vram on top. Not a bad choice.
Bottom line: the RX 7600 needs to be in the sub-$200 price range. The A750 is already in the $199 range. A win for Intel, but the RX 7600 is not bad for what it is; it seems like decent price-to-performance value. Unlike the 4060 Ti 8 GB. If anything, it should just be the 4060 Ti 16 GB at $499, which is terrible value! The RTX 4060 at $299 is not going to be great either. Nvidia? What are you doing? Do you guys even care anymore?
The power consumption on the 7600 would have me argue it's a worse card than the competing 6000 series cards. It seems to be the smallest die possible, clocked way higher than it should be.
@@spoots1234 A desperate move, it seems. Personally, I own a Gigabyte 1060 which has a GP104 GPU, the one found in 1070 and 1080 models. I'm unable to unlock the core voltage, but just feeding it more power is enough to make my computer turn my desk into a sauna. The Pascal architecture was one hell of a step forward.
@@spoots1234 the 6000 cards only report board power, not the full consumption of the card, IIRC
I think comparison videos are so much more useful than your usual single-card showdown, even though your calming voice style would really suit the less "competitive" kind of content
With Arc, you really need to push beyond 1080p; they simply don't do proportionally anywhere near as well there as they do at 1440p and 4K. I'm sure you'd find the performance deltas against the 7600 collapse, particularly at 4K. Sure, 4K is on the ragged edge of useful, but it also shows how effective downscaling would be (render at 1440p, display at 1080p, etc., which is a very common option in a lot of games).
Yeah, scaling gets weird. I use my ASRock A750 at 4K, and it semi-regularly beats my 6700 XT in another system with certain games. Compatibility improved a lot in the past four months I've had it as well, which I appreciate.
These cards could handle 1440p fine in most games (as long as the VRAM is not limiting), I'd love to see a test at that resolution.
With XeSS, 4K is actually really playable.
I pretty much hit 60 FPS at 4K in any game with XeSS on the A770 16 GB
I managed to get my a750 for 2300sek (about 190 gbp) just a week ago (brand spanking new). There were recent US price cuts down to 199 dollars. That is insane value.
You can get an RX 6600 for 180 USD, it's not that insane
I'm impressed that the 750 did as well as it did considering the immature drivers. The 7600 is shaping up to be a good choice for 1080p gaming if RT isn't needed. Great video Steve, just what I wanted to see.
The Arc not being kept at 100% usage at some of the games definitely means the card still has more performance up its sleeves, it just needs to be unlocked with better drivers in the future.
Yeah definitely
The store I'm working in has Arc A750s priced a bit lower than RTX 3050s (about ~$10-$15 lower).
But still, people usually pick the RTX 3050 over the Arc A750. Most come up with arguments like "Intel is new to the dedicated GPU market, so they must really suck at it," which is sad, considering how it destroys the RTX 3050 performance-wise. Why can't people just give this little card a chance to shine?
It's kinda the same with Radeon; people just like to buy Nvidia instead of even checking whether other products are competitive.
Because people who buy lower tier cards need their money to stretch, and hence they’re less likely to take risks with unfamiliar products. And the people with the money to take risks… they’ll just buy a 4090.
Cuz they are retardet mass brainwashed by marketing.
It's the same reason people buy iPhones despite there being much better phones for the same price.
So disgusting that you want to encourage others to have to PAY to be beta testers, just so you can get a lower price. Do you even listen to yourself!? Nvidia drivers are indeed superior.
@@Wobbothe3rd People are acting like AMD drivers crash every second time they try to boot.
I wish when a price cut is announced, it would actually have an effect internationally. The price here in the Netherlands for the A750 hasn't budged while you can get them for the new MSRP at microcentre in the USA.
And here in my country the Arc A750 is like $60 cheaper than the RX 7600
Usually, it takes longer for the price to drop in Europe. It's really sad :/ I remember when every youtuber said that the GPU prices started to drop, but it actually took a few months to drop in Europe.
@@MirelRC and it never even dropped that far! Some never even hit msrp
@@deldarel yeah. Most RTX 30 dropped a bit in price.
Very interesting. Although I recently switched from Nvidia 1060 6Gb to an AMD 6600XT 8gb, I do hope Intel will make some room for itself on the GPU market
I love seeing a new player in the gpu race. It seems that Intel is more future focused with an emphasis on rt. It will be interesting to see how the next couple generations of Intel cards shape up.
for the minecraft benchmark; would you be able to check if the arc card has the upscaling option selected, and if the radeon one does as well? i don't think that radeon cards are able to use minecraft's upscaling for ray tracing so it'd be interesting to see if the arc card can
It can't, the upscaling is DLSS and only for Nvidia cards
Not a bad performance from either card. I'd probably go with the AMD card unless I wanted ray tracing, though tracking the Intel card further may be a good option to see if drivers improve further still in the future.
You’re not going to get any playable performance in ray tracing with any GPU in this price range. Even the 4070ti can’t ray trace since it runs out of VRAM. Outside of the 4080/4090 ray tracing is staying off anyways so it’s pointless to compare.
Did u have reBar enabled? Can you compare results with enabled vs disabled?
Dude I've watched your videos for years now. I've been a brokie whilst studying etc. Basically had to watch all the tech come and go and settle with dell/hp systems, starting out swapping ram, installing single slot gpus etc. I've just ordered all the parts to finally make a build and I think you'd be happy with it. I've kept mid range with the i5 13400f and the arc a750 to begin with, which will both be upgradeable as I've gone for a top end b760 mobo. 990 pro ssd. 2x16 6000 ram (double up in the future) noctua coolers, fractal design pop mini etc. Basically all the foundations are BIS but the cpu and gpu will allow me to tinker about in the years to follow. So happy to finally be joining the master race. Amazing to think I'd end up using your benchmarks to make the final decision.
Well done! But there's no FSR/XeSS upscaling test. Next time maybe you could add some comparisons with upscaling on, such as Cyberpunk 2077 and Bright Memory: Infinite, which support DLSS/FSR/XeSS. Call of Duty: Modern Warfare II, Crime Boss: Rockay City, Diablo 4, and Death Stranding: Director's Cut also support all three upscaling techs.
The rx 6600 has dropped to £190 in the uk, now £30-£40 cheaper than the arc 750. That's where the value is at. Its also better with older systems that cannot enable resizable bar
There was a recent price cut for the A750 in the US that has now made its way to Europe. I got mine brand new for just 2300 SEK (190 GBP) in Sweden.
I am pretty satisfied with that purchase :)
@Linus Berglund if I could have got the a750 for £190 three months ago I'd have got it rather than the rx6600. At under £200 I think it's really good value
@@immortaljellyfish1051 Still good price since rx 6600 is a good value because by the time a750 gets fully matured expect intel got something way better than it at the same price as 190 although that might mean more price drop for a750. Currently in my country 6600 cost 12-13k while a750 is 15k-18k, so I'd rather pick the 6600
@pauloazuela8488 you also raise an interesting point on the fully matured part on intel. I think the a750 might be more powerful than the rx6600 however it's the drivers and lack of intel knowledge holding it back somehow. It will be interesting if the next intel gpu achieves its full potential and is the better option. The jntel a380 with 6gb of vram seems massively underwhelming and on paper should be far better
Nice test, but I would have liked to see 1440p data too, though that would have been twice the work.
I don't know how it is in other countries, but in my case the RX7600/RTX3060 cost $60 more than the ARC A770.
I would take the RX 6700 over both of them. The first time I saw that card was on your channel and it has become a phenomenal value!
I can see Intel being in a good place with their GPU's after another couple of years of driver development. I suspect the A750 is a fair bit more powerful than the RX7600 in most cases if the technical specs are correct. Obviously the drivers are still immature and they have a lot of catching up to do but considering the good steps they've already made I don't see why they can't get there and be competitive.
Making this comment before finishing the video FYI. Pay attention to the wattage. Seeing some differences. Not sure how accurate the readings are without reading between the wall and the power supply but interesting.
Considering how long Intel's competitors have done GPUs, they did a fine job with arc
This video made me much more proud of being AMD-Only
Here in Canada, the A750 is $340 while the cheapest RX 7600 is $360; on the Nvidia side, the cheapest RTX 3060 is $375-380. So the RX 7600 is $20 more than the A750, with the RTX 3060 $15-20 more than the RX 7600, making it $35-40 more than the A750. That's brand new before tax (with 13% sales tax included, the three cards would come to like $384.20-$429.40). On the used market an RTX 3060 Ti or perhaps even an RTX 3070 would be in that price range on the Nvidia side, while on the AMD side an RX 6700 XT would be possible, with the RX 6800 being close.
So, is the A750 still not a better buy?
I noticed that in most games the Arc GPU was at between 80% and 90% usage while the AMD GPU was at between 95% and 100%, and between the two cards there was a 10 to 15 frame difference. So I wonder if the Arc GPU driver is still holding back 10% or more (game depending), which would likely account for the difference in frames! I think the hardware is as good as its NVIDIA and AMD equivalents, it just needs driver work to get it up to 100% usage. Then it needs to be retested, because I really think that 10% to 15% usage difference is holding it back atm.
I've got a 7800x3d and 7900xtx and just watch your videos for fun. You have excellent content and a unique perspective, keep up the amazing work mate.
epeen alert
The A580 is $149.99 in USA!
You do astounding work, please don't stop!
The legend has it that Intel Arc cards are still improving with driver updates. It was a nice try by Intel, but the performance is still quite variable in different titles.
Yeah still a work in progress
It's a first gen "beta" product really. The good thing is all the work they are putting into the drivers for this card should benefit them with future GPU releases.
The Intel Arc cards look so nice design-wise, I love it! :D
I wonder what we get in the next generation, maybe Intel can close the gap a little bit more.
Yeah I hope so :)
The Arc founders edition cards are beautiful. I would pick one up for the looks alone but unfortunately they are poorly built; glue is widely used throughout its construction making it difficult to work on.
@@Swiss4.2 I have the A770 Limited Edition card as well as Acer's Predator Bifrost A770.
The AIB partner models seem to be better constructed than the reference design, but the reference design also occupies a smaller footprint (closer to a 980Ti), making it a perfect candidate for my ITX travel rig.
I wasn't ever expecting to get an Intel card...that is, until I got the reference card a week after it launched. I made sure to get it from a place with a good return policy, just in case.
Instead of hating it, though, I saw great potential in the early results and decided to ride out the turbulence, and so far it's paid off rather well.
A750 put up a decent fight. Over here, A750 is about 70€ cheaper than the 7600 (275€ vs 345€). Looking at the results, they seem to be about equal in price/performance.
Have you guys not gotten the price cuts for the a750 yet? I managed to buy mine for 2300SEK (about 190 pounds) in Sweden just a week ago. That is crazy value!
I wonder how Skyrim runs on those 2 GPUs. The A750 is a pretty interesting GPU. If Intel keeps improving the drivers, it might end up as a good GPU, but right now I wouldn't use it as a main GPU. As a backup or testing GPU, though, I would like to get one.
I don't believe Intel has a GPU linking tech at all like that. Why would they when Crossfire/SLI is not supported anymore due to lack of usefulness.
@@shepardpolska They are talking about the two different GPUs, the a750 and 7600. Not using two gpus linked together in SLI/Crossfire.
@@JustSomeDinosaurPerson ah, yeah I totally misread that.
@@shepardpolska yeah, multi GPU is dead. Though, I should've been more coherent. I meant that I want to see Skyrim running on the A750 vs the RX 7600.
@@MirelRC It was my bad, I misread what you wrote
About a 10 watt power consumption advantage for the AMD card at idle and gaming: 5 watts idle and mid-150s gaming vs 15 watts and mid-160s for the Intel card. Fixing the driver issues could bring down this gap, and will likely give good performance improvements in the future if they keep developing them for this card.
Sometimes I swear you read my mind. I was searching vids and boom. Posted 4hrs ago…. Keep up the good work and say Hi to Dave for me!
I'm impressed by the picture fidelity of the A750.
@Kharos There isn't any, this dude doesn't know what he's saying
Nice, great video as always!
Thanks :)
Very objective analysis. Keep up the good work.
Good stuff dude!
I picked up an A750 Challenger for $149 USD and couldn’t be happier.
Nice to see you’re putting that newly returned arc 750 to good use! Haha
Absolutely!
As usual GOOD work and excellent content!
While I'm cheering for Intel to do better (never thought I would write that sentence) I simply can't recommend these cards.
It may be possible that Intel abandons the dGPU market in the near future, meaning that these drivers may never be improved.
AMD on the other hand seems very committed to support their products, so I think it's just a no brainer to go with the 7600.
I also need to mention that unlike the RTX 4060 (ti) and the A750, the RX 7600 is actually a product that makes sense.
The 4060 is limited by its very weird hardware configuration and the A750 has very weird drivers.
Remind us what the power draw of the ARC GPU is again please?
Same here in our country, pricing of the A750 is around the 7600's. It should be equal to or just above the 6600.
I really like your video style. It has a budget-friendly outlook and compares hardware at face value. It's always nice to see a mid-range comparison, especially with AMD and the newer Intel cards, which are often dismissed in favour of Nvidia cards that are usually more expensive.
In the UK on Overclockers you can get the 8GB A770 for the same price as the 7600 :)
If they drop the price of the 7600 by $30 - $50, it'll be worth it.
They don't want to lower it that much because the 7500 will be $199
Buy used bruh
@@OpsMasterWoods no thanks bruh
@@thuglyfe709 imagine
Did you mean Arc? Judging by this comparison a750 should be cheaper.
Interesting the A750 seems to use both more RAM and/or vRAM in some games
I like to play older titles and the lack of hardware DX9 support by Intel is a big turnoff. I have a laptop with a Core i5 with Xe graphics and most old DX9 titles run like crap. Meanwhile, the Radeon graphics in an old Ryzen 4000 run them flawlessly.
I’m waiting to see a versus of the 6700 XT and 7600
Why are the A750's fan fins like that? They almost look warped
a750 was only $200 USD recently. A massive discount compared to the 7600
At sub-$200 you can get a 6700/XT used
Can't find any at that price anymore. And anyway ignoring RT, at any particular price the A750 is in, AMD's 6000 series has a more consistent performer and overall better value.
@@nikosensei1258 Yeah, used. The a750 was $200 USD new at the time.
What're the specs of your testing rig? I want to know how much that can impact the gaming.
Considering Arc cards overclock well, it would be interesting to see how they compare when A750 is OCed
The RX 7600 overclocks even better; if you compare on TechPowerUp, an A750 overclock gives a 6% improvement while the RX 7600 gets 10%
Instead of that, buy the 8GB A770, if you're OK with 8GB cards and Intel in general. In a lot of places you can get it for the same price as the 7600, maybe even cheaper. It's 15% faster than the A750.
Have you done any testing recently on whether DXVK still improves perf on the Arc cards in DX11 and older?
I think Intel themselves have added DXVK (or some other translation layer) to their drivers for improving pre-DX12 performance.
@@AndersHass it's only whitelisted for some games like Counter-Strike, you need to do it manually in other games
@@KARLOSPCgame ah I see. I guess they test it for specific games before having it on by default.
The RX 6700 10GB is close to the same price (in the US) and offers better performance using roughly the same amount of power as these two cards.
That would be my pick.
I noticed Mafia: Definitive Edition worked better using DXVK with Intel Arc. FPS wasn't that much higher, but it got rid of that stuttering most of the time (it stuttered only for a while after launching the game).
Yeah I tried DXVK in another video and it works great :)
A game suggestion I would love to see in future benchmarks is 7 Days To Die.
This game can be very demanding on my laptop 1650.
For the averages of Forza Horizon 5, did you use the in-game benchmark or the in-game footage displayed ?
Minecraft really hates AMD 😅
Thanks for including Minecraft bro. It's hard to find Minecraft benchmarks for specific GPUs
Curiously the ARC has significantly higher system memory usage in most games here compared to the 7600. Was resizable bar enabled?
It's the drivers; just like Nvidia's, there's a ton of system overhead and poor optimization.
That's definitely the drivers.
Intel vram use > AMD vram use > Nvidia vram use is how it generally goes. AMD uses 0.5-1gb more vram than Nvidia, and Intel uses 1-1.5gb more. This is why smart Intel buyers get the 16gb A770, because you kind of really want to have that extra memory just in case.
never been too fast
Interesting that in games like Minecraft the Vram allocation on the Intel card is 1gb higher than the 7600 at the same settings!
Just curious, did you have Resizable BAR on? It's recommended with the Arc cards, just asking. And the Intel GPUs do really well at 1440p, maybe it's the bus.
Is the Arc A750 compatible with my PC? And if so, is there any drawback or will there be any bottleneck issue?
Specifications:
Motherboard: AsRock x370 gaming x
Processor: Ryzen 7 1700
PSU: 550w
RAM: 16 GB
Nope.
Arc requires you to use Resizable BAR in order for it to even perform decently. The Ryzen 7 1700 doesn't support it so you'd need to upgrade to at least a Zen 2+ processor to be able to use Resizable BAR.
You'd also need to check if your motherboard supports it too. It would be called "Above 4G Decoding" in your BIOS.
I don't care about the FPS when they can achieve over 60 FPS. However, the ARC graphics card does look a lot better in graphical sharpness, and color depth & saturation.
bruh what are you saying, in some games the a750 won't reach 60 fps, and the image they both produce is the same
@@KilzKnight In the video, the Arc A750 looks better in color depth and sharpness than the RX 7600. Having said that, I just installed the ACER Predator Bifrost Arc A770 in my system to play with. I pulled out the Zotac RTX 3060 OC. Let me just say that FPS is well above 60 in Halo Infinite, and 120 in Halo MCC, but there are still a lot of micro-stutters happening. In Unigine Heaven's benchmark, it got 5341 points with an average 212 FPS. Here's hoping future drivers resolve this issue and makes this a great card to have.
Update: I downloaded the latest drivers from Intel's website and installed their app and drivers. I disabled the Predator Bifrost app. Graphics quality went way up while returning the same Heaven benchmarks. Micro-stuttering also was eliminated. I'm saying this card is a potential winner for the price and performance.
@@l.i.archer5379 They do not look better dude.. I am happy that you like your A750 but stop saying those things
@@KilzKnight I had an A770 to play with over the weekend. With the latest drivers, the graphics were on par with an RTX 3060 XC. I should know since I have another PC with the 3060 in it and was able to compare them side-by-side. I also have an RTX 3070 Ti, an RTX 4070, an RX 6700 XT, and an RX 6950 XT to compare to. However, the drivers for the A770 are still not ready for primetime since they caused the card to run really hot during a Heaven benchmark run (89 deg C). Micro-stuttering was gone with the latest driver though, so there is that. When the Intel cards and drivers mature, then I will revisit the A770 16GB.
Which driver version did you test the RX 7600 with? I have a lot of crashes with the official driver.
did you DDU before installing the card?
Hey neighbour, is it also 30+ C over there in the UK? 🔥
Will you possibly benchmark Ark: Survival Evolved? Got an RTX 3070 and a Ryzen 5 5600, and on the Genesis 2 map I barely get 60fps. Very demanding
I have heard with 7000 series Ryzen you can do passthrough with RDNA 2 and newer GPUs without any performance drop, so I wonder if anything similar will be possible with Intel’s integrated GPU (it could be a way to save power when not gaming).
What kind of passthrough? Like CrossfireX?
@@MirelRC crossfire is about making use of multiple GPUs at once. This is using either one (so a dedicated GPU is passed through the connectors for the integrated GPU, or only the integrated one is used).
It was mentioned in a Linus Tech Tips video about an AMD tech upgrade for one of his employees, where the guy needed more HDMI connectors so he could use the one for the integrated GPU without any issue for his monitor and the one on the dedicated GPU for his VR headset. I don't know anything else about it than that, which is why I am so curious about this topic, lol.
@@AndersHass that's cool. I should start searching more about this topic. Thank you for the info mate.
@@MirelRC good luck finding things, I haven't been able to find anything other than that LTT video, lol
@@AndersHass I have only found something related to Linux VMs.
I hope Intel releases more cards.
Did you actually give the price? I skipped through but the intro and description say "similar price" but come on give us a number up front.
I think the most amazing thing is the VRAM usage. Intel is getting crushed in this respect. If they can continue to optimize their hardware then I do think they have a great chance of taking on the Green and Red head on.
Can you review the best graphics card that doesn't need an extra power connector? I'm curious because nowadays these are almost nonexistent. Can you please? AMD and Nvidia
One thing no one has noticed is the Arc A750 has richer colours and is more accurate to life, so if I only had the choice between those two cards, well, Arc A750 all the time.
That argument is nonsense. No GPU today has any advantage in this kind of thing.
Both AMD and Nvidia have a full set of on-the-fly color/brightness/saturation/etc controls that you can set up for each game and program, if you want. Nvidia even has some integrated shader modifiers if you like to tool around with those.
The ARC cards look so clean, ngl
where I live the a750 is like rx 6600 rn. The A770 is like the 7600
I think for hogwarts it’s better to test in medium because of the Vram of both the cards, the lows will be better in both cards
At around 2:22 when you benchmarked Cyberpunk, the A750 used 1GB - 1.5GB more. Does that have anything to do with the Intel drivers or the game? I'm simply curious because I would think it should be similar (I might be wrong). It's worse in The Witcher 3.
Likely the drivers.
Both offer excellent performance at their respective price points.
@n n I don't get many drivers issues with my RX580, the latest are incredibly stable, I've not experienced Intel drivers yet but in time I will purchase an intel Arc graphics card, as for Nvidia the drivers are good although when I had my GTX570 it was crashing constantly which is why I went for AMD next.
@n n AMD drivers often suck. Throwing stones from a glass house.
@n n You're about 4 months behind on this narrative.
The question is "Which one is better?", not "Which one is good?". Even if you like both, you can't just say they're equally good, you have to choose one, even if it's by a very small margin. Always choosing the middle ground won't get you anywhere in life and posting crap like this also wastes the time of people researching which one to choose
These comparative videos are much more informative/interesting than looking at one GPU in a vacuum.
Would say the a770 8gb is same price as the a750 atm in the UK about 250.
There’s kinda no point going with the 7600 right now; there’s still the 6650 XT and the 6700 non-XT, which are cheaper and perform basically identically, with the added bonus of the 6700 having 2 more GB of VRAM.
Can u compare the cards again? I saw the Gamers Nexus video and the A750 is on par with the 7600 XT in some games while costing less than the RX 6600
It would be nice to see this test at 1440p
I would not read too much into the .1% Low Percentiles of either card in very CPU intensive games such as The Witcher 3 Remastered and Cyberpunk 2077. You'd have to average out a number of runs for reliable results. Regardless I think both cards do very well.
Only testing in 1080p ?
Testing in 1440p and beyond favours the Arc card.
The thing with ray tracing is, if I want similar performance to non-RT games I have to buy something way above my pay grade (actually the most expensive cards). With mid-range and low-end cards it doesn't matter what brand you use, it still sucks... yes, Nvidia has some advantage, but it is still under 60 fps. So if I get to choose between like 30 and 40 fps, I would choose not to enable it at all and not bother with RT until a few more generations.
Kind of a moot comparison. I guess you just had it lying around and wanted to squeeze a bit more views out of it before you sell it on
Wondering why Radeon uses significantly less VRAM in The Witcher 3
When the arc cards were new, it was just a poor choice. Now it's totally a buy for the right price.
BTFO, got it, thanks for the comparisons 🙂
I'd really appreciate it if someone out there using an Intel Arc GPU with an Intel motherboard and CPU could test and see if DOSBox is working for you, and what Arc driver you are using. DOSBox doesn't work with an Intel Arc GPU on an AMD motherboard and CPU, so I'm curious to see if Intel mobos and CPUs have the same problem. Thanks in advance!!!
Anyone noticed how much less VRAM the RX 7600 uses? Basically, even if FPS were identical, the AMD card would be a safer bet because you won't run out of VRAM as fast as on the Arc A750
You also have to remember that Moore's Law Is Dead leaked how the Arc cards may have some major issues at the hardware level, meaning there's a possibility that drivers may never improve to the point that Arc cards are competitive (unless there's a major price reduction), because drivers can't fix bad hardware
Should I buy the 3060 or the 7600?
Given that the RX 7600 is on a very mature architecture with a mature driver environment, I'm insanely impressed the Intel Arc A750 is doing so well with such good power consumption
I wonder how the PowerColor Fighter model fares against the Sapphire Pulse model
Performance-wise they will likely be the same. As for cooling, the Pulse is just very slightly worse (less than 1c) than the Hellhound, so I would imagine the Fighter being worse as the Hellhound has a bigger heatsink.
The Arc is just better in ray tracing titles. For the price it will be the best RT card for some time, as drivers will improve, but new cards will also arrive to challenge it 15-20 months from now.