Just bought an ARC A770 for $280 and installed it into a system that will mostly be used to teach myself Resolve and OBS. So far, so good. I suspect that driver support will keep getting better because Intel has the resources to maintain such a focus.
That’s what I’m hoping. They’ve made huge strides in the past year+ since launch. Hope they can continue the improvements until and after their next gen comes out!
Our youngest son had a hand-me-down RX 5700 XT in his rig. We gave him a £300 GPU budget. Personally I thought he would grab a 6700 XT, but he picked up the Sparkle Arc 770. I am so impressed with how far the drivers have come; he just chops about in Minecraft and the Unreal store building his own game and it seems flawless. I know I am only speaking on behalf of what a 5-year-old wants and does, but it has me wanting to try one in my server PC.
I swear, if Intel keeps going I'll seriously consider them for my next GPU. Nvidia has become too expensive and AMD isn't as value-oriented as they once were.
AMD is great in that its cards perform very well for the price + the generous amounts of VRAM. However, it needs to get on Nvidia's ass when it comes to proper competition for DLDSR and DLSS.
AMD seems content to just slightly undercut nvidia at every sub-top-tier price point whereas Intel tried very hard to go right for the throat (they just haven't quite succeeded)
I am about to build my first PC around the Arc 770 16GB paired with an i7-13700
@@klevesmith nice GPU! Why Intel though, if you don't mind me asking? With such a powerful CPU, you could easily go RTX 4070 Super or even Ti Super. That CPU is a lot of power for just an Arc card. If you're adamant about Arc, you may be better off with an i3-13100 in that case for 1080p gaming. i7-13700 would never see its full potential with that card, so it may be wasted $$.
@@mikepawlikguitar you're exactly right. Especially with me running AutoCAD 3D and SOLIDWORKS
A lot of content creators and production users like the 16GB of VRAM on a wide memory bus. The 770 does cater to a specific market that AMD and Nvidia have ignored. Intel's driver development is also helping with their iGPU drivers. Future looks positive for Intel keeping up on their ARC line.
I have the exact same version of the A770 you have and I love it; sniped it for $279 USD new for my secondary system. Intel has come a very, very long way in terms of drivers, and the instability seems to be fixed for the most part. In terms of performance the card is still all over the place, sometimes losing to a 3060, sometimes mopping the floor with a 4060 Ti. Either way I think this card will age like fine wine, and over the next 4-5 months it'll get more consistent
The higher the resolution, the better the A770 runs compared to the 4060 and 7600. In many cases stepping up from 1080p to 1440p is only a few fps slower, which is a sign that driver optimization is still lacking.
@@lorsheckmolseh3345 yup, that's the A770's superior hardware showing its true colors. It has better specs on paper than an RTX 3070 Ti and has the potential to outperform one as the drivers get better
@@Reaper_ZL1 Yep. I've seen a few people say why would you buy Intel cards right now. It's simple, you're not buying them for RIGHT NOW, you're buying them for the future when VRAM is at a premium and you'll have to pay $600+ for a card of this nature instead of the $250-$300 you can get it for now. Intel definitely future proofed this release and I'm heavily leaning towards upgrading to an A770 from a 1050 ti. It's a massive jump in performance but looking like it's the best price-to-performance card right now with some futureproofing as well.
@@_gr1nchh I just bought one for my 2nd PC build
I've got the A750, it hasn't let me down!
I just recently switched to an Intel Limited Edition A770 16GB after my 3080 died. Even though the framerates weren't crazy fast, it runs butter smooth, and personally I like the way Intel renders the colors in COD.
I believe that many gamers buying the Arc cards do so to support the new platform and be there to see it evolve and grow, not to max out performance. Gaming heroes, in a way!
That's me! I never run anything on ultra, and I have built about 15 PCs for my family/friends. Excited to see how the Ryzen 7500F does with the A770
The only Arc cards I would buy are the A380 and A310, simply because of their form factor: basically better entry-level options
I got my A770 for around 250 here in Japan, and that 16GB is worth every penny, especially for future games that need more than 8GB. For competitive games you don't need that much, since most of those are optimized for 8GB or lower specs. Since AAA and AA single-player games will be aimed at PS5/XBSX RAM, 16GB is definitely the way to go if you want to use more geometry and texture data for those sweet visuals. Also, for those starting to use Blender to make 3D stuff, or just DaVinci Resolve, it's quite fast for what it is. So if you can spot one under 300 bucks in the wild, I'd say it's a steal.
I just ordered a Sparkle Titan A770 for $280 on Newegg. I'm excited for 1440p 144Hz on a Ryzen 7500F with 6000MHz RAM, and to see how it does
Such an exotic 3rd contender card. I've seen a few at my microcenter here. Keep up the good work as always. Always love the die breakdowns.
I got the Sparkle ORC as a media card and it is a video editing monster.
It all depends on the price. In my country A750 is €250 and A770 is €399. Performance uplift is like 5% at most.
I just bought the A770 Sparkle and waiting for delivery. Hoping the experience is good. 🤞🏻
Hey man, how is your experience so far?
@@brahimsaad6287 wonderful. Drivers are very mature. A770 performance falls between a 4060 and 4060 Ti, especially the 16GB variant. If you can get it for $300 or less, it's well worth your money.
The A770 is constantly on sale for 250-260, so it's better price-to-performance than the 4060
Not in the EU unfortunately; it's been rock solid at €399 for over a year now. The A750 is €250 or so.
@IntelArcTesting that's sad to hear. In the US, Alchemist cards are getting very cheap, probably because nobody really wants to buy them
@@ProceuTech I got one for 250 on the used market, luckily, but new doesn't make sense to buy IMO.
EU here as well, on the fence about buying an A770 despite the potential hassle. I've been through the journey from 1st-gen Ryzen to today's chips, and whilst interesting, I'm not at all sure I really want to, or can, invest as much time as I did on workarounds and learning what works and what doesn't when it comes to graphics cards. I believed AMD's promise of a long-lasting platform and also saw the cost benefits over time: my entire family/friends switched to Ryzen thanks to me spearheading it and spending time doing my own research via many online sources. It ended up paying off, but you don't remotely get a similar synergy or ROI with a graphics card. And yeah, here the prices for the A770 have been between 350-400 EUR lately; that simply kills the deal when you can get a 6700/6750 XT for the very same price. If the A770 was around 250-270 EUR it'd still be a questionable risk (resale value will be low anyway), but that could lure me into getting one.
Try looking at buying from German websites. Here in my local Saturn the A770 goes for 299, and sometimes 269 on discount, and the A750 for 199.
Works perfectly with emulators, at least the Arc A750 in my experience
Have you enabled the Resizable BAR function? I heard that if it is disabled, the card loses a lot of performance
I had issues 3 months ago with the Arc A770 16GB in my PC, and had to swap my RTX 3070 Ti back into it. This Sunday, I actually decided to fix my PC by reinstalling the Arc A770 and resetting my Windows 10 22H2, with a clean install of the Intel drivers. Reinstalled Halo MCC and Infinite, and FPS maxes out at my monitor's 165 Hz refresh rate in both titles with smooth, very playable graphics. I'm really loving my ASRock Phantom Gaming card paired with a Ryzen 7 5800X and 64 GB of 3200 MT/s CL16 RAM now. The latest Heaven Benchmark tonight also yielded a 361 FPS max and 209 FPS average framerate, and temperatures never went above 74°C. Now, I'm thinking of upgrading to a 5800X3D CPU for this PC.
The A770 has a 256-bit bus (560 GB/s) and 16 GB, which should make it a good bit faster than the cut-down 4060 or 7600, which are connected by only 128-bit buses and x8 PCIe. Its ray-tracing capabilities are outstanding in its class. The high power draw can easily be reduced to around 140W by undervolting, capping the max clock at 2100MHz, and using Optimus (video out over the motherboard), which costs around 15-20% performance. The driver improvements over 1 year have been massive, and there is still a good portion of extra performance to expect from further optimization.
While I agree 100%, the A770 and 4060/7600 are in completely different performance classes. I should have included the specifications in this video instead of pushing them to a separate video, because I could have made this much clearer.
@@ProceuTech, yes, but this was only true before the newest driver versions. The "PC Games Hardware" channel on YouTube ran several tests a few days ago that clearly show the A770 dominating the 4060/7600. No wonder: the higher the resolution of meshes, textures, and screen, the higher the risk of cache misses on those models.
@@ProceuTech ua-cam.com/video/ZkTYGTsbBvI/v-deo.html
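For anyone who wants to sanity-check the bandwidth numbers in this thread: peak memory bandwidth is just bus width times per-pin data rate. A minimal Python sketch; the per-pin rates are assumptions based on the cards' published GDDR6 specs:

```python
# Peak GDDR6 bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits per byte.
# Assumed per-pin rates: 17.5 Gbps (A770 16GB), 17 Gbps (RTX 4060), 18 Gbps (RX 7600).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

print(f"A770 16GB (256-bit): {bandwidth_gb_s(256, 17.5):.0f} GB/s")  # 560 GB/s
print(f"RTX 4060  (128-bit): {bandwidth_gb_s(128, 17.0):.0f} GB/s")  # 272 GB/s
print(f"RX 7600   (128-bit): {bandwidth_gb_s(128, 18.0):.0f} GB/s")  # 288 GB/s
```

So the A770 has roughly double the raw memory bandwidth of either 128-bit card, which is consistent with it losing less performance when stepping up to 1440p.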
Intel GPUs can expose flaws in game devs' code: they reveal weak 1% lows even when the highs reach 300 fps. Crysis is a better-made game in its remastered version than the original, because those devs built their maps in one pass with their tools, whereas the average dev patches a map over multiple passes in the texture editor rather than redoing the whole map from scratch, which results in fps hits in certain spots on the map that show up in GPU rendering. Devs are getting super lazy lately, whether from time constraints or low pay.
Apex Legends is also available in DX12. Borderlands 3 is a UE4 game, disable texture streaming and the stuttering will go away.
BL3 can also use TAAU for upscaling
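For anyone wanting to try the texture-streaming fix mentioned above: UE4 games generally expose it as a console variable that can be forced through a config override. A rough sketch of the usual approach; the exact Engine.ini location is an assumption that varies per game, and some titles ignore these overrides:

```ini
; Engine.ini override -- the per-game path varies, commonly something like
; %LOCALAPPDATA%\<GameName>\Saved\Config\WindowsNoEditor\Engine.ini
[SystemSettings]
r.TextureStreaming=0 ; 0 = load textures fully up front instead of streaming them in
```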
It's a shame that in Brazil a card like this costs 431 euros; here our politicians rob us blind. But I opted for an Arc 770 because I believe in Intel more than AMD
You are not using XeSS. You should be more knowledgeable in the area of electronics, my friend… XeSS is literally a blessing to this already powerful GPU.
Edit: You know about XeSS, but you didn't show the differences with XeSS. These are only native settings.
For an honest comparison, native resolution is better
New driver released today 1/10/24
I have been using an Arc A770 that I got used during the tail end of the pandemic. When it comes to really old games, I have some subjective observations. In City of Heroes: Freedom, the A770 had serious issues. Running the 'Ultimate' preset (the game wants to default to this) rendered this old game beautifully, but had a very distinct stutter whenever rendering large fights, crowds, and fights with lots of particle effects. On Recommended settings the previous issues go away, but the game looks somewhat 'flat' in comparison. In the Cryptic Studios games Champions Online and Star Trek Online, there are some issues with the game's default settings. In the former, anti-aliasing is turned to the max, which causes harsh black lines to appear on characters and objects; in the latter, bloom had to be manually adjusted in space scenes, as planets were brighter than stars and almost painfully bright on the monitor I am using. Again, adjusting these specific settings fixes the issues. I am sorry I don't have hard FPS data for these titles. I've paired it with an i7-11700F machine with 32 GB of 3200MHz RAM. Does it matter? I dunno. 🤷♂
If you play VR, do not buy this
Your standards are a bit weird; what is an acceptable frame rate? Seems like anything under 200 fps at 1440p is not playable in your opinion
I agree; he said he cringed at the fact that the card ran Crysis at 87 fps at 4K. Like, wtf?
"The card ran acceptible at 1080p. We got 250 fps"
Roflmao.
I got my hands on the Asia-exclusive A770 Photon in white; one of the best-looking cards I've seen in person.
Saw one on eBay a few weeks ago here in the US; interesting screen on it
Fully working with Pico Connect in VR; love this GPU now 😁
Proceu Tech: The problem with this i5-13600K is that, first, the CPU's P-cores only hold those turbo clocks for a few seconds and then clock down, and second, the same goes for the E-cores. This i5 is a bottleneck for many new mid-tier-and-above GPU models. So, to truly test the potential of this Intel Arc A770 16GB you would have to use a Ryzen 7 5800X3D (which is faster than this i5-13600K as well as the i7s and i9s) or a Ryzen 7 7800X3D (which is faster than the i9-14900K). You can go look it up; certain reviewers have already proved that (not that YouTube reviewers are always legit, but in certain general-knowledge scenarios they could not hold the truth back even if they wanted to).
While this behavior is true out of the box, I've uncapped the power limit and set the tau to over 300 seconds. In games, and I can prove this by showing either Intel's XTU or MSI Afterburner, the chip is able to stick to 5.5GHz on P-cores 0-3 and 5.4GHz on the rest. All of the E-cores were also locked at 4.0 GHz. I do agree that there were CPU-bound instances in this video; I just wonder how much of it is due to the Arc cards and how much is due to the bottleneck
@@ProceuTech Cool to everything you just said... but bro, 230 watts at stock clocks with no OCing, and unlocking the time limit only increases the power consumption to 290 watts?!
A Nissan Skyline GT-R 35 can get 2500 HP and beat every car on the planet... while eating an abnormal amount of fuel, breaking down every 100 km, and needing its parts replaced every week... you get my drift.
The 5800X3D and 7800X3D at their stock clocks, with their extremely high stock efficiency, are the fastest CPUs on planet Earth, and they draw very little power doing it. Only stock clocks with PBO enabled count, because that is the most realistic scenario for every gamer.
These two AMDs are faster, even at stock limited power levels, than your 300-watt i5. That's why you should use either one of them for every GPU test: that way you get the fewest CPU-bound scenarios, if any at all.
5800X3D = max peak of 111 watts
7800X3D = max peak of 82 watts
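For context on the power-limit argument above: Intel chips boost to a short-term limit (PL2) until a rolling average of package power fills a budget governed by the sustained limit (PL1) and the time constant tau, which is what raising tau to 300+ seconds changes. A toy Python model under assumed, approximately-stock i5-13600K values; real firmware uses an exponentially weighted moving average, which this only sketches:

```python
# Toy model of Intel's PL1/PL2/tau turbo power budget.
# Assumed approximately-stock i5-13600K limits: PL1 = 125 W, PL2 = 181 W, tau = 56 s.
PL1_W, PL2_W, TAU_S = 125.0, 181.0, 56.0

def update_avg_power(avg_w: float, instant_w: float, dt_s: float) -> float:
    """Advance the rolling average of package power (simplified EWMA step)."""
    alpha = dt_s / TAU_S
    return avg_w + alpha * (instant_w - avg_w)

def allowed_power(avg_w: float) -> float:
    """While the average stays below PL1 the chip may draw up to PL2; after
    that it throttles back to the sustained limit. Raising TAU_S (e.g. to
    300 s) or lifting PL1 keeps turbo clocks engaged longer."""
    return PL2_W if avg_w < PL1_W else PL1_W
```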
I'm rooting for Intel, but their next GPU either needs to perform a lot better or be a lot cheaper before I consider buying an Arc card
I average 144 to 158 fps at 1080p on COD MW3, running Corsair Vengeance DDR5-6400 set at 5400 on an MSI B760 Gaming Plus WiFi motherboard with an i7-12700KF, not overclocked.
Only two games shown are part of my gaming niche. A card like this with 16GB of VRAM is money wasted for competitive games; most of those games are optimized as hell for a GPU with 8GB of VRAM and don't need more than that!
And the performance of this card in these single-player games leaves something to be desired; Cyberbug is shit... although its performance in Red Dead Redemption was somewhat satisfactory.
Anyway, the card looks good, but the performance is a little low for the price!
Couldn’t agree more!
@@ProceuTech but I'm guessing by the time Battlemage comes out, drivers might be on par with AMD's, which still aren't the best drivers.... I just ordered a $280 A770 Titan 16GB and am hoping they will pull through for all of us!
The Arc 750 is a much better deal, because the performance difference is under 10% and the price difference is 50%
But in a year or two 8GB won't be enough, and I like to upgrade on a 3+ year cycle, so I went with the 16GB A770 Titan, $280 on sale
@@Miley_00 It will be fine for 1080p, and the Arc 770 simply isn't that much stronger, so 16GB will be enough; but the card's performance will again be too shitty
I really like my A770 LE.
Sadly, I just put it up for sale last evening due to an issue in my favorite title caused by an Intel firmware update.
Intel's support seems lacking at this point.
Not sure if they factored in the amount of resources they'd need to address issues.
The standard answer for every issue is asking you to submit an SSU report... typically followed by them closing the case without any attempt at a proper resolution.
Great video! Thanks for sharing. So the A770 is a great card for 720p ultra, 1080p medium to high, or 1440p low to medium, depending on the game of course. No thanks. Lol
What about Skyrim?
I'm curious: where are you finding a 4060 cheaper? Everywhere I look, the 4060 is the same price for the 8GB and hundreds more for anything above 8GB
I got this card and unfortunately my PC won't boot. I have an i7-9700F and it won't POST. I don't have onboard graphics, so I made sure Resizable BAR was on before I swapped GPUs. I made sure to uninstall the Nvidia drivers before swapping in my A770.
When the computer turns on I only see a few white dots on the screen. I thought my GPU was dead, so I put it in my son's computer. He has an i5-10400 and it works fine.
I know Intel recommends 10th gen, but some people have made an Arc GPU work with an 8th gen as long as Resizable BAR is on.
Any help would be appreciated.
Thinking about buying one, and I wanted to ask if it gets bottlenecked by an i3-12100F, and if so, to what extent?
It definitely will, but only in incredibly CPU-demanding tasks. In gaming it should be fine, and you shouldn't notice any performance degradation.
Does this card pair well with the 5600X?
I have an A380 as a dedicated video encoder
What is the name of the game in the background?
Black Ops Cold War, Firebase-Z zombie map!
Had me wondering too. Looks great... I haven't played any COD since the OG Black Ops, and recently got back on MW3 for Zombies. But definitely going to try that one.
good video!
Not running with AMD CPUs?
A770 or 1080 Ti?
CAUTION: IF YOU WANT TO USE AN A750/A770 TO PLAY ZELDA TOTK OR OTHER SWITCH GAMES WITH YUZU/RYUJINX, STAY AWAY. IT WILL CRASH AND THERE IS NO FIX!
My guy thinks 140fps is barely a high refresh rate 😂 3:49
Buy Intel if you're crying about high prices and think competitive buying will make prices drop
DOSBox is used by some older games from GOG, oof
Yo I forgot to hit the like button and subscribe, my bad.
You mentioned the A750's price and performance 50 times in this video but never provided any evidence
Watch, your, tone, Boy
Unfortunately that’s what this comment section seems to want 😭
Too many graphs, too little gameplay footage
I have an A750, and the fact that it doesn't run DOSBox is a huge turn-off. There is no reason in 2024 why you cannot run DOS games; Intel made a huge blunder there. Welp, back to my 5700 XT till I upgrade to something else.
What is with these benchmark settings? It basically makes them completely useless and not comparable to any other trusted benchmarks.
boycott Intel