Batman Arkham Knight is because the game checks if the GPU is Intel and refuses to run if it is. If you spoof the GPU vendor using DXVK or patch the check out of the Batman exe it will work perfectly. The only way Intel could fix it on their end is if they added an automatic GPU vendor spoof for this specific game.
That is crazy! That kind of coding practically prevents anyone from entering discrete GPU market. "Sorry, if its not AMD or Nvidia then you are SOL". Yeah those two are our only true options for gamers but that may not be the case in the future. As we see here, Intel is really trying and if given the chance then it works really well and sometimes punching above its price range. No matter what GPU or iGPU is in the machine the game should at least try to run if the said graphics chip supports the correct DirectX version. If it runs poorly or crashes then so be it, either the drivers need improving or the gpu is just not enough.
I don't care how you do it, testing that many games is going to take a lot of effort to run, gather data, and then parse all the data into something meaningful. Well done!
I admire the quality of work done but a real scientist wouldn't have bothered testing a clusterfu*k from a company that is about to fail due to completely alienating the enterprise market with poor efficiency, frequent errors and silicon degradation. A company that is using a clean sheet design from 1995 to design their CPUs is gonna fail eventually. Finally caught up to them with 13th and 14th gen. Intel still thinks it is 2001 and they can wow people with nonsensical clockspeeds. Doesn't matter the architecture, 6GHz is purely for advertising and they've tuned your i9 for failure to meet those advertising targets. It isn't just the i9s. Everything in the range will have the same issue; it will just take longer for an i3 to fail.
That's WAY better than most people probably think. Many of these games are also titles like Star Field who are known to cause issues, and AMD & Nvidia cards aren't always flawless either.
Starfield had a broken DX12 implimentation, it ran best on AMD because the Radeon driver team literally had months of privelleged access to bulwark their drivers against the Bethesda sloppy code. Literally Glitched API by the billions per minute, its a testament to Nvidia driver stack that it was pre-emptively hardened enough against such chicanery to somewhat run the games on launch without having sponser access. ARC drivers, being so young; didnt stand a chance.
Yeah, this was pretty eye opening to me. I’ve become so caught up in top tier NVIDIA cards over the years, seeing how many games the 770 can run at 60FPS good settings is a good grounding
Yeah people made it sound like half of your games will be unplayable. The games that were actually unplayable were like 3 of which i dont give a shit about
@@notchipotle That's one of the reason I purposedly choose the job that I hate so that I can enjoy my hobby in my free time (turned out that's also a bad idea, it's soul crushing waking up to do something you hate everyday, and you are likely to suck at it as well)
I would advocate to find a balance, but the sheer variety of jobs makes finding that difficult. So, as long as you're satisfied before you go to bed, then I'm proud of you. ❤
For us it was a worthy waste of money and time but our 6 year old will be using his 6750xt for the foreseeable future due this gpu demanding settings to be changed as often as he flips game choice. A new warning label is in need, not for mid gamers or kids
As an owner of a 16GB A770, the only thing I can say, is that I honestly forget that I'm using a Intel GPU, I obviously dont go testing every single game I own, but what I do/play daily, it just works.
@@boenkstah He uses the latest driver which sometimes fix problem but also creates another problem, this happens on both Nvidia and AMD GPU too sometimes
Can't say "it just works" like my nvidia or amd gpus 😅 When I got my gpu this Tuesday I had to reinstall the WHQL driver three times in order to get the Arc Panel! 😂 Installed the latest driver yesterday that froze my system. Had to enter safe mode and remove the drivers.. I still have HDR issues when booting up, the display is all gray and I need to turn off/on HDR several times till it finally works 🫠 Oh and also E-Arc sound drops when playing the first thing with audio, solution is to mute - unmute after 5-10 seconds. Besides all this GPU acceleration doesn't work in browsers or launchers. The screen will flicker between picture - black screen - picture etc till I pause the content and disable gpu-acceleration. This is just three days of ownership.. I would NOT recommend Arc to anyone that doesn't have patience and can do basic troubleshooting. Edit: Arc 770
This is honestly so positive from intel. Drivers are hard, and AMD/ Nvidia have years and years of experience on intel. Want to see more competition, even if it is at the low/ mid end
Going to ignore the fact that Intel has been making GPU drivers longer than AMD? They know how GPUs work; this is just a dedicated GPU but most of these issues have been in their iGPU drivers for over a decade, hence why many people who want emulator mini-PCs go with AMD.
@@zxZethxz I don't understand why people are so wiling to just excuse this for intel, they aren't some newbie to the computer game. Most of the newer demanding games could barely run or it wasn't playable, and it's only getting worse for new releases. That's ridiculously concerning for intel.
@zxZethxz they aren't newbies for integrated graphics- but discreet GPU's are a different ball game. That's why I give them props for this. Obviously they've had years with integrated and cpus
@@frankytanky5076 Because people are fed up with the Nvidia and AMD rip off. More competition is likely to reduce prices. Additionally, Intel, imo handled it well. They owned up to it and improved upon their drivers.
Just FYI, In the Witcher 3 I get shadow flickering on RX 6800 XT too, before that I had 3060 ti and I also think I had this issue sometimes. These issues started to crop up with the Next Gen version of the game.
I don’t get that at all on my 5700xt or 7900xt… I wonder why. I have settings maxed out for 1440p and I don’t think I have anything special installed or changed in the settings
The Witcher 3 Wild hunt is just a good Dx11 game engine that tried to get revamped to DX12 & have raytracing added for the enhanced version, but still has too many old calls from the DX11 protocol. The issue is that DX12 doesn't have the simple compliers for a lot of the work done in DX11 & has need for recoding by hand at low-level for proper support.
Such a bummer about that game. It could have had a really damn cool campaign following up on the previous games. And a few months after release when they actually finiahed making it, it was honestly a really good game. And all the portal stuff could have been soooooo coooool if they had really fleshed it out in a more Master Chief Collection way. God especially if they could have allowed a lot more mix and match between eras. I played for a few months and was really liking it. Such a shame they botched it so badly. The concepts were all there. The execution just wasn't.
I have 300 hours in BF 2042. It's still a fun game. 4kd gamer. People who suck at fps and don't have battle sense think the game is bad. 😂 I only play infantry too, no vehicles. I also have 2000 hours on BF V
If battlemage is half the improvement people are expecting, then we may have a real competitor in next gen. Considering how fresh Intel are in this space, the fact they only have these issues is kinda crazy.
@@ree6487 Some of the best CPUs for gaming on a budget are from Intel like the i5 12400f and the i5 12600kf, guess they're doing the same strategy for their GPUs too
but they need to get some good board partners. The Asrock and Sparkle cards are ugly AF and arent exactly rigid / dont really support their own weight as well as the reference and Acer Cards. (Owner of an Acer Predator Bifrost A750 OC)
I'm an A770 owner. My main gripe is an ongoing driver auto update bug, meaning I have to use DDU Uninstaller regularly to purge the thing to get the auto updater to work. The improvement in titles like DOTA 2 since purchase is huge. It was the cheapest 16gig card on the market at time of purchase. I was a bit bummed out when I heard how bad Starfield was on it but less bummed out when mates told me I wasn't missing much anyway. Did have some issues with Dual monitor initially, but only using a HDMI converter. It runs f124 with good frames @ 1080p, had no issues with Helldivers 2 or Fallen Order. HAppy enough with my purchase, very pleased with Intel's efforts to improve the product and will consider Battlemage when it drops, providing Intel can iron out the compatibility issues.
@@bryanwages3518 yeah same, but I DDU it every now and then, seems to work for an update or so then reverts uselessness. Not a deal breaker but annoying to say the least.
I've been very happily using my Bifrost A770 for about 6 months now and haven't really had issues. I have enjoyed the setup and yeah, no notes. Glad someone is dedicating the time to review a ton of games on it. Edit: I will say it wasn't always sunny skies especially since I use Linux but haven't had issues for a while.
If it releases with a similar stability and compatibility as current arc driver.. it would certainly be a success as things will get better with time. Fingers crossed that Intel can compete.
They have dropped high-end, so it's all down to pricing low-end. Personally not holding my breath. Finally picked up an A770 open box for $150. At that price it's OK. Not the $250+ they were going for. 16GB was $300.
@@farmeunit Intel graphics cards are alright as long as you know that it's going to work well with the games that you want to play, and if you feel comfortable with maybe having to deal with some issues in the future which might require a bit of tinkering to fix. For someone who is less tech savvy and who doesn't want to mess around, AMD or Nvidia will be a better choice for sure. Some people will tell you that Nvidia is better than AMD for driver support and compatibility, but it's not like Nvidia never has issues, and there's a good chance that you can get a better value from AMD in your price range. Any issues that you do get with AMD (or Nvidia) are likely to be extremely minor compared to the type of issues that people are still getting with Intel cards.
@@syncmonism I have all makes and models. Just pointing out that the value proposition isn't that great compared to others. I had a 7800XT in that box but it's running Plex so thought I would give it a shot since I do game on it when my daughter is on mine for some reason or girlfriend is teaching a class. It's just not been that great, I my opinion. I do think people need budget options but honestly, my 5700XT performed better in the games I have played on the card. Which is equivalent to a 6600/XT.
Great seeing you have Sim City 4 in your game library, The main culprit for crashing is multi core left enabled. You should be fine (change the resolution as needed via the launch parameters) in the launch parameters in steam you could add -- CustomResolution:enabled -r1920x1080x32 -intro:off -CPUcount:1 -d:DirectX -CPUPriority:high -f
Finally, Steve's infected him. And not only that, Tim immediately had to climb new heights increasing the number of games to the hundreds. 😂 But on a more serious note, I really appreciate this update on Arc's driver situation. Kudos to Intel - hopefully, they manage to deliver a competitive day one experience in the future as well. 😊
The year is 2055. A group of kids is egging one another on to explore the derelict old property just known only as “HUB”. As they explore deeper into the house, they hear sounds, muttering, an electrical hum. They force open a door against a pile of old graphics cards and monitors, and see a man in a tattered blue hoodie, his eyes wild, his beard long and unkempt. “Must benchmark…playable quality…crashes” he mumbles to himself. It is Tim.
Will have to give this a try, current solution is to run game in Vulkan, but performace can go from 300fps to 0 seemingly at random, I think due to shader caching
Not so impressive when you consider they farmed it out to a few dozen unpaid 'interns'. Literal slavery, just like any job in this industry. Anyone with a sense of ethics has stopped buying games in 2024. The slaves are gonna have to decide whether they prefer income or freedom. Anyone with any sense would be re-training for a career as a medical assistant and that isn't saying much.
What a Herculean task of running so many games for the days set. Great job! I have a A770 16gb and didn't run into issue with running Left4Dead but I was using an Intel 12900k as well.
Considering I wrote a macro in '97 to convert 100, 000 text files to *.doc format while I went to lunch, I bet they did the same here. Let AI do the testing and to hell with the fact it isn't yet ready for primetime.
Finally, the 3rd generation of intel Xe graphics (first generation being DG1 and DG2) after 2 years since being released, the drivers are decent. If only the 10 years of intel integrated graphics driver development that Xe is based on could have contributed litterally anything to make the drivers work.
Integrated Vs Dedicated graphics are entirely different beasts. There was a graphics developer on MLID's podcast last year who summed up the biggest difference in driver goals pretty clearly: with dedicated graphics, you're looking to offload as much as possible to the GPU, because it's probably not going to reach its limit even in demanding games; with integrated graphics, you're looking to only send the tasks that absolutely have to be GPU to the hardware, because it's so weak that any amount of workload is going to be coming up to hardware limits very quickly. AMD Strix rebalances this problem in a big way, with the most powerful Strix Halo chips aiming for graphics performance equivalent to a 4070, so the above probably comes with a caveat of "only applies to driver development up to maybe 2025"
The driver's being "decent" for such a new product line is actually incredible. Shout out to the Intel engineers busting their asses trying to catch up to Nvidia's and AMD's two decade headstart.
@@Razzbow A770 has 16GB of VRAM though, if you want a better price comparison, use the A750, which is 97% of the performance of the A770, just with half the VRAM.
Now we need Steve to test all his games, 3 runs each, on all GPUs released 2016-2024, at 720p, 1080p, 1440p and 4K, and with a range of processors to show where CPU bottlenekcing kicks in for each config.
1 game tested: fortnite In all seriousness, with Steve pretty much only plays competitive multiplayer games (or even shooters?), so his list would probably be comparatively tiny
So a few days ago, I was looking at your older B650 videos, and about 30 minutes later you upload a new B650 video. Now I'm looking at A770 videos and again you just uploaded a new A770 video. How...
I still doubt they were all on steam, though. For example, Far Cry 2 and Far Cry 3 are a whole new battle to get working. And for me, it was literally impossible on steam to play FC3. And FC2 needed HEAVY 3rd party patching so it worked properly. Can't imagine a universe where he didn't have ANY issues with those games on Steam. But that's the game's/Ubisoft's problem, not the card, just to be clear.
It has gotten so much better since launch. Actually pretty usable in new and old games, although at the price the A770 16GB is hard to recommend but the A750 and A580 are great value cards for budget systems. Thanks for these tests! Edit: You mention horizon forbidden west being playable at 60+ fps but do note that’s only because of the 7800x3d. I tested this game with a 5600 at launch and could barely get above 45fps with very low settings and XeSS performance with low gpu usage. Any game with decima has this issue. Death stranding and horizon zero dawn also. Not a cpu bottleneck since my Nvidia card gets significantly higher fps with same cpu.
Interesting, I wonder if the game keeps swapping assets from VRAM out and back in, hitting the 3D-VCache if you have it, and falling back to system RAM otherwise (because the CPU's cache isn't big enough). Maybe Nvidia's drivers prevent this somehow.
Horizon forbidden west at launch was a complete mess tho, they patched it after a few days but at first it was giving my friend's 3090ti something like 90-95°
"no graphical issues" sometimes you can't even tell if those are gpu related, i just finished Horizon Zero Dawn, in dlc zone in Dam area there was a floating black box of textures... is it the game, is it the radeon GPU? (probably not, since it's PS game), but it's kinda random sometimes
Had a similar issue with Beam NG Drive with the AFMF2 preview driver. Black sand in any map that used the sand material. Restarting the game fixed the issue though, which is strange.
You and other UA-camrs should really do benchmarks of these GPUs for creative apps such as After Effects, Blender, Maya, 3ds Max, Houdini, Cinema 4d and Unreal Engine 5
Achievement Unlocked: Hey Tim! All behold the great Tim for he is now our fearless leader! Hey Tim: "A steam achievement when you have played ALL of your steam library at least once. Only .0001% of all steam users hold that badge of honor."
I built an AMD/A770 system about 9 months ago. I don't any newer games except maybe Borderlands. I have not had any problems so far with my setup, but I also don't have the vast library you do. I will definately bookmark this video so I can refer to it before I purchase any games. I went with the ARC A770 mostly because I want to support having a 3rd company out there making video cards, but I was also aware of the driver problems I would probably encounter. I had an old GeForce 3060 as a backup just in case.
I was an early investor in Arc A770. It plays games well but it absolutely crushes with video rendering and processing! Competes with 4000 series RTXs.
That's a lotta games! Really appreciating these more creative/out-of-the-box ideas the channel has been making lately. Maybe it's the quiet before the storm for new hardware releases, but whatever the reason these have been eye-opening and gave me what feels like a lot more perspective. Thanks as always!
I would have dabbled with Arc knowing it's limitations if they hadn't realistically locked it to only systems with reBAR. I ran a lot of older hardware when their cards made sense for me to buy. I'm excited to see what Battlemage is like though. Having a 3rd option is amazing.
@@alexturnbackthearmy1907 Isn't it something close to 30% performance lost* without reBAR. I'm talking at the time a H81 board running a 2014 haswell CPU and PCIE 2.0. I work in IT and try to keep stuff out of land fills. :) It was perfectly suitable with a GTX 1060 and kept playing the games I wanted at 60fps until around 2022. Monster hunter, Elden Ring,etc. I've finally upgraded the board but picked up a used RTX 3060 12GB for cheap.
@@SinisterPuppy Nowhere near as much hit JUST from re-bar. It would be like 80-85% without rebar, but nowhere near 30, there must be some other funny buisness going on.
@@alexturnbackthearmy1907 I forgot a word. It's 30% perf lost with rebar off. There's a pretty good techpowerup article on it. Only reason I know is I really wanted to give Arc a try.
IDK about Tim but Todd literally made the game he wanted in grade school. Elon Musk's cybertruck is just something he drew in kindergarten. The most wealthy people on the planet are just children who never grew up.
Gotta respect HU for such extensive testing of Arc, that's quite the commitment. I personally got fed up with that Nvidia tax (having owned probably 10'ish Nvidia cards in the past) and went with an Intel Arc A750 as an "experimental adventure" as I don't play games as actively as before for sales price on singles day or whatever it was, 269€ down from 299€ at that time I believe when finally upgrading from a 1070 Ti after holding onto it for years hoping for that "performance vs cost" to come down to reasonable levels which never happened. At the time I picked up the A750 even a card like RTX 3060 costed quite a lot more and A750 already then performed similarly or (mostly) better. It's been suprisingly easy to make that switch, I expected more struggles tbh but if anything this card just made me excited for upcoming Battlemage which I might pick up the top SKU this time as Intel has earned my trust to at least give me an experience I can totally live with as far as technical issues (or the lack of). Thanks to driver updates the card has only started performing better and better over time so that was a nice "bonus" to being an early adopter, feels like you start getting more for your money. For Battlemage B870 (
Jeez. Talk about delivering and walking the walk. I own Arc A770 Limited Edition. The only issue I had was with Starfield game. I waited until it got patched and then its performance got decimated with Nvidia performance patch. I also had suspiciously low 1% lows in old games like original Need for Speed Most Wanted. I do suspect that those 1% lows on old games would be better on Nvidia GPU. Though, all old games were still playable.
Have you ever came across DXVK, a wrapper that allows a DX8/9/10/11 games to be played using Vulkan? It would be nice to see how well Intel Arc performs in an older titles using Vulkan.
DXVK is actually what the Intel driver largely based on for DX11 and below. A couple years ago they suddenly had a massive performance boost from a driver update and that was actually because they ditched their in-house driver and just switched to DXVK (without giving credit)
@@mar2ck_ Then it's doing a bad job with DXVK wrapper if using DXVK manually give better performance and fixes games that normally don't run like Arkham Knight.
DXVK fixes almost all issues with GTA4, runs better than it has ever done on any hardware using DX. So if they have DXVK in the driver they don't know how to use it properly.
@@blitzwing1Intel took all the credit for the "performance breakthrough", it was pretty disappointing to see. The only reason we know what happened was one of the DXVK devs (Joshie) went digging through the driver binary and found all the DXVK variable names.
I deeply appreciate such enormous work to test compatibility of new GPUs with the older titles. As someone who likes to replay hundreds of older games after upgrading a monitor I think this area is underrepresented on most tech reviewing channel and I would absolutely love to see more content like this exploring combinations of modern tech and settings (High refresh rate, High fps, ultrawide resolutions and top tier GPUs) with old but still great games.
I bought an ARC A770 a few months back to play around with and for the most part it's great. All of the games I ran played perfectly fine except for one. Which was the Half Life remake Black Mesa. It starts and runs fine, giving good fps without a hitch. But certain textures were missing and blacked out. I could live with that while playing. The annoying part was the flashlight only seemed to shine off of static objects and npc's. It wouldn't shine off of any of the walls or light up hallways. So you weren't technically lighting up your surroundings just objects. Which made it a pain to traverse dark hallways. I've been wanting to play again for a while since they revamped the routing of familiar levels after the release of Definitive Edition. So I just swapped my RX 5700XT back into my system for now. But, after reading a recent patch-notes news on Steam it seems Crowbar Collective has acquired an ARC card for testing. So it looks like they'll be working on ARC compatability more. Which will be nice.
@@blipblap614 Interesting. I haven't had crashes but the game is missing some textures which would leave a black base. One of the missing textures being the scientists eye's. So they look like they have black voids while talking to you. The flashlight not working properly for me annoyed me enough to stop playing. It was only lighting up objects and npc's. It wouldn't light up hallways and surroundings.
You need to change the command line options on Black Mesa. Add "-force_vendor_id 0x10DE -force_device_id 0x1180 -dxlevel 95", launch the game, then exit the game. Then remove the "-dxlevel 95" line.
You might be willing to live with it but that's objectively an 'unplayable' result. Both Nvidia and AMD have made major architectural changes over the past couple of decades but they don't have that issue. The problems with ARC are hard coded. Compromises to the memory subsystem make compatibility with older games a dicey affair. It is basically a DX12 only GPU without patching. Battlemage won't have this issue but intel has already screwed the pooch. In the midst of an AI boom, nobody is thinking about Battlemage for datacenters. And few gamers are willing to take chances with an $800 purchase.
@@Lurch-Bot I said I was willing to deal with a couple of textures missing. Because they didn't prevent the game from working. What made the game unplayable for me was the one and only light source you have to help you get through dark spots in the game wasn't working as a light source. It only showed that it was working when you were looking at random static objects and npc's. It wasn't casting light onto walls nor was it illumnating dark hallways. That's what made it unplayable for me. I also never bought my ARC card as a full on replacement to my 5700XT. I wanted to play around with something different for a while so I bought an ARC card to mess about with. Since I've been wanting to play Black Mesa for a while I just switched back to my 5700XT. I also highly doubt Battlemage is going to be $800. If it were Intel is going to have a very hard time trying to sell them. They're in no position to pull an Nvidia and charge big money for low-tier garbage.
Thanks for this review, Tim. I'm in process of a new build and I did it...I bought the Sparkle A770. This is my first foray into Intel Arc territory. Price was $15-$20 more than RX 7600 (which I've been using for the past year). I'm skilled enough to work thru any problems I encounter, so the 16GB of VRAM was the clincher. Plus, I wanted to try something new. I'm not worried. Will see how things develop.
I've been using the A770LE for about 18 months. I've had zero issues (but I play very few games). My RTX3070 in my older machine seems a bit faster, but I prefer my ARC. Why? It's not from the company I've grown to hate.
It's funny when you realize the ARC dropped when the 30-series was the current lineup, and the A770 16gb had more VRAM than every single GPU from NVIDIA other than the 3090/3090 Ti. In a $350 card.
@@rustler08 Not sure what games require more than 8GB, but what I play doesn't. It's just nice to know the VRAM is there if I ever have the need for it. I'll be in line for the next ARC card.
@@Raintiger88 At 4K native, there's quite a few games that would like more than 8GB VRAM. One of my main games (Assetto Corsa) is one of them. It still runs great on my 4060 Ti though, and the Arc A770 LE performs only slightly less in it. Both are great cards for the game.
Excellent video. Had an A770 16GB for about 12 months (maybe less, time is weird) and I'd say I agree with pretty much everything you've said. I'm happy with my purchase and haven't had any big issues along the way. At the time I bought it, I was looking for a 16GB card, and the A770 was the cheapest available by about $400 (Australia).
I've been using my A770 at 1440p while gaming and streaming and it manages to stay at high 100+ fps the whole time so I'm very happy with the card. Definitely worth the price compared to any AMD or Nvidia equivalent for me.
I appreciate the effort. The sad thing is that a 6750 XT or 6700 XT is still available for $300 in the USA, and almost everything will run fine on them. I have one to compare to my A770 LE. It's just sad when the 6700 XT works fine in games that have problems on the A770. No Man's Sky has lots of issues. If you increase the render resolution to something over 130% at 1080p, the game crashes and will not start up again until you edit the config files to reduce the render resolution. It also takes much longer to initially load saves, in addition to the constant in-game stuttering. Arc has a whole lot of issues that may never be fixed. Power is one, but Sleep also doesn't work properly - games artifact or stutter when the system comes back from Sleep. There's plenty of monitor/black screen issues with different combinations of cabling and displays (all of my Arc cards showed no display output until I loaded the drivers via RDP sessions into the PCs). There's also the firmware updates as part of driver update issue. Intel shows no signs that they'll decouple the firmware updates from driver updates, so every driver update you run the risk of bricking the card. But hey, at least now they warn you they're doing firmware updates during the driver install process.
Hello, RX 6700 XT user here. No Man's Sky stutters like hell when entering/exiting any planets, stations or freighters. Very, very long loading times are an issue especially after driver updates, along with massive shader compilation stutters that can pause the game for ~200ms at the worst case. I use an R5 3600, so that might be an issue, but the game runs on Switch just fine, it should be able to run fine on R5 3600 as well.
Great video, thanks again! I had been happily running my ASRock Challenger OC A770 16GB and figured performance was fine, even though having mayor CPU overhead on my Ryzen 5 5600 in Horizon Zero Dawn making it barely playable (around 45 fps in many scenes on 1080p) but now it stopped working as it should and have sent it back for repair (after 2 months of use), hoping after repair or replacement it will run better!
Dang... much respect. That's a freaking lot of games! Usually I skip the ads section of every video out there, but out of respect I watched it for this one. Keep up the fantastic work, HWUB!
I contacted intel about the crashes with Avatar and they advised to disable XMP, which did stop the crashing. But months later, I'm playing it with XMP and it was loading fine, performance is very inconsistent though.
Just the fact it takes that much a performance hit without reBAR is why ARC is already a failure and Battlemage won't save 'em. Nobody is planning to buy Battlemage for AI. They're in the midst of a GPU boom with a brand they've given a black eye by jumping the gun. The A310 is a GPU for literally nobody. The people who would want a GPU of that class for gaming won't buy it because their platform doesn't support reBAR. Better off buying a GTX 980 than an A770. Or, for that matter, an RX6600 which isn't brought to its knees by running without reBAR on PCIe 3.0. The fact you can get it with 16GB VRAM is just a pointless waste of VRAM. They want it to look like it will be a solid 1440p GPU moving forward but it really won't. Sort of like having 16GB on a 7600XT but even more sad, pathetic and pointless. Intel is a budget GPU manufacturer with a product that is a dog that won't hunt. Not with the systems the people in that market range acrtually have. ARC is a knee-jerk reaction to the temporary shortage that is already long past. With an AI boom brewing, intel has an advantage over the rest in that, nobody wants their GPUs for AI. So they might stick around for budget gaming. But then they need to rethink their Battlemage launch. They can't go toe to toe with the best gaming GPUs. They need to focus on being an even cheaper alternative to AMD. AMD is actually who is going to win the AI boom because Nvidia is cocky and to obsessed with per unit performance to realize that AMD is going to win on efficiency, just like they were the best performers in the last mining boom. I literally sold a 3080 to buy a pair of 5700XTs, got the same hashrate and only burned 80% of the power. Similarly, data centers don't care at all about per unit performance, only overall efficiency and reliability, which is where current AMD products shine. While board repair experts are saying that 50% of 4090s will fail, I can't find anything like that with AMD. Their GPUs are solid performers. Also, AMD didn't need special hardware to do RT or anything like that which makes Nvidia second best. They can't do RT or AI without spending billions of dollars on hardware development. And CP 2077 is the one and only game where Nvidia has an RT advantage and that is solely because CDPR is a whore for Nvidia. Bought off to make Cyberpunk a 40 series showcase. They artificially tanked performance even on prior gen Nvidia GPUs. Psycho RT still looks the same but will no longer run satisfactorily on an RTX2070 Super, as it once did. Also, they didn't fix Cyberpunk. They made it worse. Pandering to a bunch of spoiled children is not a comeback story. They deleted so much content just because they couldn't fix their crappy game engine. I have never seen a worse Keanu Reeves performance and, in time, I bet he'll be suing CDPR for ruining what little that was left of his career. He sold out so his crappy Arch motorcycles could get some advertising. John Stamos would have been a better Silverhand and would have contributed some real music to the project, rather than being a brain dead heroin addict just trying to advertise his pretentious motorcycles.I was actually somewhat intrigued when Reeves was on Leno's car show 4 or 5 years ago but the whole thing with CP2077 has made me see those bikes for what they really are - a joke.
@@Lurch-Bot Dude A770 does better job on 4k than 4070ti. You have no clue about the INTEL architecture or REBAR. REBAR equals using 2 silicon's instead of 1. What you see as FPS is only half and no waiting clean FPS where all are good. And yes as soon as the Battlemage launches I'll be on the long queue to get it with those that understands the performance and architecture.
@@paulboyce8537 In the games I tested, my A770 LE performs either on par or slightly worse (depending on the game) than my 4060 Ti. That's with a 5800X3D.
@@HazewinDog With INTEL CPU you get better results. But that's not all. AMD/Nvidia are standalone cards where CPU matters very little. 1 silicone. INTEL instead is different architecture where it splits the tasks via REBAR. 2 silicone's. You have "two engines" but you only see the performance of one. There is no waiting and all the FPS matters. AMD/Nvidia there is lot of queuing where half the FPS is wasted. If you compare FPS alone you don't see the performance from the ARC. 4k I find if there is 40FPS from the ARC it is very playable. 4070ti for example 120FPS can stutter and not very good experience. Further proof is RT. A770 you can often use RT with mid to high settings on 4k. You can't say the same for the competition even if you double the "FPS". Two different architectures and we test using same tests that worked when there was only AMD/Nvidia with similar architectures.
I wish this video was released last month when I was buying my first gpu. This is the only video that actually tested other games than the usual esports games that I don’t play. Thanks for taking the time as this is very helpful.
Btw - there is a much murkier distinction between "game problem" and "intel problem". The fact that it works on Radeon doesn't necessarily mean it's not a problem of the game. It can just mean that AMD already made a workaround of the game problem. If you also tested on nvidia, you would probably find some games, that work ok on nvidia and not on amd (and vice versa), because one team made a workaround. Or if you would test all of these games on amd, you would also probably find a problem or two that intel GPUs don't have.
Was going to make the same point. I get _why_ it's done, but I hate it because not only does it shift blame from the game dev to that IHV, but shipping these driver workarounds cost time, money and in many cases probably performance too, PLUS the underlying issue is less likely to get fixed.
Okay, but that's kind of the point of drivers, isn't it? Why do you think Nvidia releases constant driver updates with what they call 'game-ready drivers'?
I would LOVE to see the same tests with nvidia and amd in the same ball park of performance. Maybe that could lead to the bigger hardware comparison im youtube if you guys decide to correlate the data in a fourth video that talks about general game stabilty between brands.
I am so glad someone finally asked and answered the appropriate question regarding a gpu: "will it play the damn game?" instead of lazily throwing fps charts at the screen and making a big deal out of a 4% performance uplift. THANK YOU. After using my Arc A770 for a couple months I have yet to find a game in my library that doesn't come up on my monitor. In fact, left for dead 2 works great! ...However, notice I said "come up on my monitor", that's because Intel hasn't given an update regarding VR game compatibility for almost 2 years now, and for that reason alone my GTX 970 will be staying in my PC for the time being. Assuming you find this important I'd politely ask either of you, Tim or Steve to make mention of this in your next intel arc coverage as intel needs more pressure to make these cards true competitors.
Ive been using an A750 for several months now and its been great. Im really happy with it. No issues yet but I admit I dont play terribly common games.
I'll be honest. I have zero interest in these Intel GPUs. But, I clicked to support the hard work that went into doing this and to support the channel. You guys are legends. Cheers!
Most games they will have already tested and even optimized for. It's older titles where the many game crashes and crazy rendering artifacts really kick in... (DX7/8 and earlier) and it is obscure titles where the terrible performance really kicks in...
hats off to both Tim for this insane task and also Intel for actually delivering on the promise of fixing one of the most broken hardware ever released
This is honestly really good given how new Intel is to the big GPU game. Here's to hoping Intel & Radeon will duke it out in the low & mid-tier segment, revitalizing that market and granting consumers high quality, low cost graphics cards.
I was considering buying the A580 to replace my 1060 6GB. It's more powerful and has 2 more GB of VRAM on a wider bus. I can buy it for 180 Euro which is muuuuch less than any comparable AMD or Nvidia card. Seems like a solid short-term upgrade.
If you are willing to buy used, you can get way better offers for that money if you look around. One example that comes to mind is the 5700xt. EDIT: For the same money you also find Titan XP's and 1080ti's, whilst both older, even faster than the 5700xt depending on the game, plus they offer more Vram.
Yeah sounds like a good upgrade. I used an a750 for about two months and I has a partially worst experience than with AMD (regarding drivers). So all things considered I was quite happy with it
Can't imagine how incredibly much work this whole project was. downloading, installing playing over 200 games, tweaking their settings, recording, editing, making graphs, even testing some of them with another GPU. This must have taken so long to make and the quality is still amazing! Incredible job! As for the question at the end, I'll be staying with the team green for the peace of mind that stuff works and many other little things I care about. But it's very fascinating to ACTUALLY thoroughly see what you can expect from an Intel GPU.
As someone who just built his PC with a FE Arc A770 and not knowi 6 whether I made a good choice, You guys came in and saved my Day like a Superman playing Starman theme song. Absolute Madlads ❤😢.
IIRC Starfield is a game side thing and GTA V is a weird MSAA incompatibility, i think Intel even said that and is entirely on the game dev side. Could argue GPU arch side as well. Maybe some other artifacts like that are due to MSAA as well, i wonder. Like Dirt Rally. Insane work Tim, and the amount of well working games is also impressive, given the arch seemingly emulates some commands it doesn't have at hardware level.
personally, i do get that people want Intel GPU to compete, but i can't get people hyping them so much that they will just come and compete with nvidia in 2nd or 3rd gen the thing i mostly look at is that A770 is competing with RX 6650XT based on price alone, but if we compare silicon, 406mm^2 N6 vs 237mm^2 N7 and newer vram, in silicon department A770 competes with RX 7800XT on almost level footing, i think 7800XT has slightly less silicon (200mm^2 N5 + 146mm^2 N6 on MCD)
@@SenZjo some people really do think that since "radeon left high end, then intel will come to compete" ofc choice is great, but i doubt Intel will be competive before Druid, and then first signs of drivers overhead should come into play, at this point those drivers are made for arc and arc alone
There is no need to worry too much about how much silicon Intel is using. That is their problem. We should be concerned with price to performance. And I guess to a lesser extent, power
@@deviouslaw my point is under utilization, if you compare those 3 on tech power up, arc dominates in Shading Units, TMUs, ROPs, Execution units, even L2 cache, but there is nothing about L3 so maybe that compensates... on those specs alone, this GPU should be comparable to 7900GRE or 4070TI or so
Anecdotally, I have tested my Arc A770 with shaders from ShaderToy, which if your not familiar, have a lot of wonderful graphical effects that don't normally render using geometry. Specifically, shaders in ShaderToy lean heavily on the compute side of the GPU, not on the rasterization or texture mapping. In those benchmarks, my A770 smokes even the RTX 4060 between 60 and 100%. So what I'm saying is the silicon is definitely there, but I think standard gaming with rasterization struggles to achieve high GPU occupancy. I definitely think there is more runway for future Intel GPU's and drivers to compete with AMD and even NVIDIA, but it will take time for sure.
Really appreciate an in depth look like this and it certainly seems like Intel has come a long way drivers wise. It'll be interesting to see how many of the issues are architectural due to the issues with Alchemist itself as opposed to just driver issues. I feel like the continued Starfield problems have to be beyond just drivers as that had been singled out multiple times in their driver updates.
Man the design of the limited edition of the card is so nice, its prob not the best for thermals but i wish i had one just to have on the shelf, sadly its discontinued and people are selling em for 2-3x the price which isnt worth it at all.
Batman Arkham Knight is because the game checks if the GPU is Intel and refuses to run if it is. If you spoof the GPU vendor using DXVK or patch the check out of the Batman exe it will work perfectly.
The only way Intel could fix it on their end is if they added an automatic GPU vendor spoof for this specific game.
That's crazy
That's kind of hilarious
It's a game from 2015, way before discrete Intel graphics was a thing. Sounds like they were trying to ban the old Intel iGPUs.
That is just stupid that games add that.
That is crazy! That kind of coding practically prevents anyone from entering discrete GPU market. "Sorry, if its not AMD or Nvidia then you are SOL". Yeah those two are our only true options for gamers but that may not be the case in the future. As we see here, Intel is really trying and if given the chance then it works really well and sometimes punching above its price range.
No matter what GPU or iGPU is in the machine the game should at least try to run if the said graphics chip supports the correct DirectX version. If it runs poorly or crashes then so be it, either the drivers need improving or the gpu is just not enough.
I don't care how you do it, testing that many games is going to take a lot of effort to run, gather data, and then parse all the data into something meaningful. Well done!
in my case, since i dont get real devices, they rely on my experience to decide which range the performance was, and that experience took years
little secret here, real reviewers got any hardware months prior the product release date, ranging 3-6 months, maybe longer
so wiser not to tell them this that, they already done all those
I admire the quality of work done but a real scientist wouldn't have bothered testing a clusterfu*k from a company that is about to fail due to completely alienating the enterprise market with poor efficiency, frequent errors and silicon degradation. A company that is using a clean sheet design from 1995 to design their CPUs is gonna fail eventually. Finally caught up to them with 13th and 14th gen. Intel still thinks it is 2001 and they can wow people with nonsensical clockspeeds. Doesn't matter the architecture, 6GHz is purely for advertising and they've tuned your i9 for failure to meet those advertising targets. It isn't just the i9s. Everything in the range will have the same issue; it will just take longer for an i3 to fail.
It's called a job, lol.
One thing about Arc, the cards just look so CLEAN.
And they don't take up every PCIe slot on your motherboard.
@@nickdibart I mean, neither does the 6650xt it was being compared against.
Hoping battlemage has a similar clean 2 slot look too.
The Intel plastic ones. ASRock Challenger has a better cooler
The rgb ties in nicely with everything.
That's WAY better than most people probably think.
Many of these games are also titles like Star Field who are known to cause issues, and AMD & Nvidia cards aren't always flawless either.
Starfield had a broken DX12 implimentation, it ran best on AMD because the Radeon driver team literally had months of privelleged access to bulwark their drivers against the Bethesda sloppy code. Literally Glitched API by the billions per minute, its a testament to Nvidia driver stack that it was pre-emptively hardened enough against such chicanery to somewhat run the games on launch without having sponser access. ARC drivers, being so young; didnt stand a chance.
Yeah, this was pretty eye opening to me. I’ve become so caught up in top tier NVIDIA cards over the years, seeing how many games the 770 can run at 60FPS good settings is a good grounding
@@anasevi9456 I remember Nvidia doing the exact same thing back when I had my Radeon 9600 pro - Yes, it's infuriating.
Yeah people made it sound like half of your games will be unplayable. The games that were actually unplayable were like 3 of which i dont give a shit about
That’s dedication to the craft, I can respect the effort put into testing all of these games
That's fun for me lol. If I got paid for testing games I would test 1000+ games without breaking a sweat
@@Dionyzos trust me you won't enjoy it lol. my job is literally my hobby, but after years of doing the same thing, I'm sick of it
@@notchipotle That's one of the reason I purposedly choose the job that I hate so that I can enjoy my hobby in my free time (turned out that's also a bad idea, it's soul crushing waking up to do something you hate everyday, and you are likely to suck at it as well)
I would advocate to find a balance, but the sheer variety of jobs makes finding that difficult. So, as long as you're satisfied before you go to bed, then I'm proud of you. ❤
For us it was a worthy waste of money and time but our 6 year old will be using his 6750xt for the foreseeable future due this gpu demanding settings to be changed as often as he flips game choice. A new warning label is in need, not for mid gamers or kids
As an owner of a 16GB A770, the only thing I can say, is that I honestly forget that I'm using a Intel GPU, I obviously dont go testing every single game I own, but what I do/play daily, it just works.
same here, I don't even get why tim had trouble with l4d2 and sim city 4 considering i play both of those occasionally without issues
I also played Sim City 4 on an a750 and a380 with no issues
lol @ "it just works" ...nice 🤣
@@boenkstah He uses the latest driver which sometimes fix problem but also creates another problem, this happens on both Nvidia and AMD GPU too sometimes
Can't say "it just works" like my nvidia or amd gpus 😅
When I got my gpu this Tuesday I had to reinstall the WHQL driver three times in order to get the Arc Panel! 😂
Installed the latest driver yesterday that froze my system. Had to enter safe mode and remove the drivers..
I still have HDR issues when booting up, the display is all gray and I need to turn off/on HDR several times till it finally works 🫠
Oh and also E-Arc sound drops when playing the first thing with audio, solution is to mute - unmute after 5-10 seconds.
Besides all this GPU acceleration doesn't work in browsers or launchers. The screen will flicker between picture - black screen - picture etc till I pause the content and disable gpu-acceleration.
This is just three days of ownership.. I would NOT recommend Arc to anyone that doesn't have patience and can do basic troubleshooting.
Edit: Arc 770
Holy smokes, great work. This is insane!!!! I hope the devs see this! Thanks so much for the analysis! I love Arc content 😅
This is honestly so positive from intel. Drivers are hard, and AMD/ Nvidia have years and years of experience on intel. Want to see more competition, even if it is at the low/ mid end
Going to ignore the fact that Intel has been making GPU drivers longer than AMD? They know how GPUs work; this is just a dedicated GPU but most of these issues have been in their iGPU drivers for over a decade, hence why many people who want emulator mini-PCs go with AMD.
But Intel is better at Emulation though. @@zxZethxz
@@zxZethxz I don't understand why people are so wiling to just excuse this for intel, they aren't some newbie to the computer game. Most of the newer demanding games could barely run or it wasn't playable, and it's only getting worse for new releases. That's ridiculously concerning for intel.
@zxZethxz they aren't newbies for integrated graphics- but discreet GPU's are a different ball game. That's why I give them props for this. Obviously they've had years with integrated and cpus
@@frankytanky5076 Because people are fed up with the Nvidia and AMD rip off.
More competition is likely to reduce prices.
Additionally, Intel, imo handled it well. They owned up to it and improved upon their drivers.
That game compatibility list is sooo helpful. Great presentation
Pure copium. You shouldn't need a game compatibility list for a GPU in 2024.
@@Lurch-Bot you will need it because Intel arc is new compared to Nvidia and amd
@@Lurch-Bot such hate, wow
Just FYI, In the Witcher 3 I get shadow flickering on RX 6800 XT too, before that I had 3060 ti and I also think I had this issue sometimes. These issues started to crop up with the Next Gen version of the game.
yep had the same on my 3070 especially in Toussaint
Same on 4090 / 7900 XTX
I don’t get that at all on my 5700xt or 7900xt… I wonder why. I have settings maxed out for 1440p and I don’t think I have anything special installed or changed in the settings
Change your AA settings generally off lowest setting. See if it helps.
The Witcher 3 Wild hunt is just a good Dx11 game engine that tried to get revamped to DX12 & have raytracing added for the enhanced version, but still has too many old calls from the DX11 protocol. The issue is that DX12 doesn't have the simple compliers for a lot of the work done in DX11 & has need for recoding by hand at low-level for proper support.
"...Battlefield 2042, which I couldn't test properly due to the complete lack of anyone playing this game."
LMAO That's savage, Tim.
Such a bummer about that game. It could have had a really damn cool campaign following up on the previous games. And a few months after release when they actually finiahed making it, it was honestly a really good game. And all the portal stuff could have been soooooo coooool if they had really fleshed it out in a more Master Chief Collection way. God especially if they could have allowed a lot more mix and match between eras.
I played for a few months and was really liking it. Such a shame they botched it so badly. The concepts were all there. The execution just wasn't.
They didn't even test Crysis.
He’s not wrong lol.
I have 300 hours in BF 2042. It's still a fun game. 4kd gamer. People who suck at fps and don't have battle sense think the game is bad. 😂 I only play infantry too, no vehicles.
I also have 2000 hours on BF V
@@pvtcmyers87 He's wrong, not the most played BF but 20K players xd
Lazy excuse.
You have no idea how valuable this video is to the Arc Community. We constantly are asked "Is Arc safe to buy yet?" and "Does run with Intel Arc?"
If battlemage is half the improvement people are expecting, then we may have a real competitor in next gen. Considering how fresh Intel are in this space, the fact they only have these issues is kinda crazy.
They're gonna disappoint like their cpus 😊
💯👍👏@@ree6487
@@ree6487 Some of the best CPUs for gaming on a budget are from Intel like the i5 12400f and the i5 12600kf, guess they're doing the same strategy for their GPUs too
It seems like they need to improve on pricing, which is the big elephant in the room.
but they need to get some good board partners. The Asrock and Sparkle cards are ugly AF and arent exactly rigid / dont really support their own weight as well as the reference and Acer Cards. (Owner of an Acer Predator Bifrost A750 OC)
I'm an A770 owner. My main gripe is an ongoing driver auto update bug, meaning I have to use DDU Uninstaller regularly to purge the thing to get the auto updater to work. The improvement in titles like DOTA 2 since purchase is huge. It was the cheapest 16gig card on the market at time of purchase. I was a bit bummed out when I heard how bad Starfield was on it but less bummed out when mates told me I wasn't missing much anyway. Did have some issues with Dual monitor initially, but only using a HDMI converter. It runs f124 with good frames @ 1080p, had no issues with Helldivers 2 or Fallen Order. HAppy enough with my purchase, very pleased with Intel's efforts to improve the product and will consider Battlemage when it drops, providing Intel can iron out the compatibility issues.
Same bro same. I just used the intel command center to update the drivers due to the arc control being broken.
I have also been happy with A770 at 1440 paired with a 7600. Great budget build to get to modern hardware and runs everything I want it to.
Starfield is good actually
OMG I tought this bug happened more on the mobile ARCs. I HATE I IT
@@bryanwages3518 yeah same, but I DDU it every now and then, seems to work for an update or so then reverts uselessness. Not a deal breaker but annoying to say the least.
For someone who plays mostly old games at high settings this is a gem
I though I am the only ones who barely play new games, newest games I often played is released on 2019 and it is indie title
I've been very happily using my Bifrost A770 for about 6 months now and haven't really had issues. I have enjoyed the setup and yeah, no notes. Glad someone is dedicating the time to review a ton of games on it.
Edit: I will say it wasn't always sunny skies especially since I use Linux but haven't had issues for a while.
good news for battlemage as far as im concerned
If it releases with a similar stability and compatibility as current arc driver.. it would certainly be a success as things will get better with time. Fingers crossed that Intel can compete.
They have dropped high-end, so it's all down to pricing low-end. Personally not holding my breath. Finally picked up an A770 open box for $150. At that price it's OK. Not the $250+ they were going for. 16GB was $300.
@@farmeunit Intel graphics cards are alright as long as you know that it's going to work well with the games that you want to play, and if you feel comfortable with maybe having to deal with some issues in the future which might require a bit of tinkering to fix. For someone who is less tech savvy and who doesn't want to mess around, AMD or Nvidia will be a better choice for sure.
Some people will tell you that Nvidia is better than AMD for driver support and compatibility, but it's not like Nvidia never has issues, and there's a good chance that you can get a better value from AMD in your price range. Any issues that you do get with AMD (or Nvidia) are likely to be extremely minor compared to the type of issues that people are still getting with Intel cards.
@@syncmonism I have all makes and models. Just pointing out that the value proposition isn't that great compared to others. I had a 7800XT in that box but it's running Plex so thought I would give it a shot since I do game on it when my daughter is on mine for some reason or girlfriend is teaching a class. It's just not been that great, I my opinion. I do think people need budget options but honestly, my 5700XT performed better in the games I have played on the card. Which is equivalent to a 6600/XT.
Battlemage needs good drivers at launch. We'll see if Intel can deliver.
Great seeing you have Sim City 4 in your game library, The main culprit for crashing is multi core left enabled. You should be fine (change the resolution as needed via the launch parameters) in the launch parameters in steam you could add -- CustomResolution:enabled -r1920x1080x32 -intro:off -CPUcount:1 -d:DirectX -CPUPriority:high -f
Finally, Steve's infected him. And not only that, Tim immediately had to climb new heights increasing the number of games to the hundreds. 😂
But on a more serious note, I really appreciate this update on Arc's driver situation. Kudos to Intel - hopefully, they manage to deliver a competitive day one experience in the future as well. 😊
We need more Steves Burke in this industry.
They should work on making stable CPUs instead of
@@Dave102693 big oof 😅
The year is 2055. A group of kids is egging one another on to explore the derelict old property just known only as “HUB”. As they explore deeper into the house, they hear sounds, muttering, an electrical hum. They force open a door against a pile of old graphics cards and monitors, and see a man in a tattered blue hoodie, his eyes wild, his beard long and unkempt. “Must benchmark…playable quality…crashes” he mumbles to himself. It is Tim.
Tim the enchanter.
ok
12:34 Left 4 Dead 2 will crash while loading into a match, ONLY if you have shaders set to high or very high. Set it to MEDIUM and the game will work.
Yeah, sure. Makes total sense. All this says is devs don't know how to engineer games and intel doesn't know how to engineer gaming
GPUs.
@@Lurch-Bot All this says that you don't know how hardware and drivers development works
Will have to give this a try, current solution is to run game in Vulkan, but performace can go from 300fps to 0 seemingly at random, I think due to shader caching
holy fuck 250 games testing is crazy… kudos to you bro
Not so impressive when you consider they farmed it out to a few dozen unpaid 'interns'. Literal slavery, just like any job in this industry. Anyone with a sense of ethics has stopped buying games in 2024. The slaves are gonna have to decide whether they prefer income or freedom. Anyone with any sense would be re-training for a career as a medical assistant and that isn't saying much.
What a Herculean task of running so many games for the days set. Great job!
I have a A770 16gb and didn't run into issue with running Left4Dead but I was using an Intel 12900k as well.
With every driver update there a bigger performance gap to AMD CPU. ARC is made to pair with high end INTEL CPU's. And yet the test was with AMD CPU.
Considering I wrote a macro in '97 to convert 100, 000 text files to *.doc format while I went to lunch, I bet they did the same here. Let AI do the testing and to hell with the fact it isn't yet ready for primetime.
Great work there - probably the most useful piece of content around ARC GPUs released so far.
Finally, the 3rd generation of intel Xe graphics (first generation being DG1 and DG2) after 2 years since being released, the drivers are decent. If only the 10 years of intel integrated graphics driver development that Xe is based on could have contributed litterally anything to make the drivers work.
?? DG2 is alchemist
Integrated Vs Dedicated graphics are entirely different beasts. There was a graphics developer on MLID's podcast last year who summed up the biggest difference in driver goals pretty clearly: with dedicated graphics, you're looking to offload as much as possible to the GPU, because it's probably not going to reach its limit even in demanding games; with integrated graphics, you're looking to only send the tasks that absolutely have to be GPU to the hardware, because it's so weak that any amount of workload is going to be coming up to hardware limits very quickly. AMD Strix rebalances this problem in a big way, with the most powerful Strix Halo chips aiming for graphics performance equivalent to a 4070, so the above probably comes with a caveat of "only applies to driver development up to maybe 2025"
Yeah... and now the 6650 is cheaper than A770...
The driver's being "decent" for such a new product line is actually incredible.
Shout out to the Intel engineers busting their asses trying to catch up to Nvidia's and AMD's two decade headstart.
@@Razzbow A770 has 16GB of VRAM though, if you want a better price comparison, use the A750, which is 97% of the performance of the A770, just with half the VRAM.
Insane amount of work. ty for your efforts
Now we need Steve to test all his games, 3 runs each, on all GPUs released 2016-2024, at 720p, 1080p, 1440p and 4K, and with a range of processors to show where CPU bottlenekcing kicks in for each config.
And make him use the 5800X3D as the CPU for the test bench!
@@RavenZahadoom Duh! Of course...
Stop it before he sees this, you're going to break Steve and then we will only have one left.
@@deviouslaw Well, we can't have cloning, so we need to develop a brainwashing program to create more Steves.
For the greater good.
1 game tested: fortnite
In all seriousness, with Steve pretty much only plays competitive multiplayer games (or even shooters?), so his list would probably be comparatively tiny
So a few days ago, I was looking at your older B650 videos, and about 30 minutes later you upload a new B650 video. Now I'm looking at A770 videos and again you just uploaded a new A770 video. How...
Coincidence 😅 Out of millions of people watching, coincidences like this are bound to happen from time to time. 🤷♂
@@Chasm9 Not really, google tracks your views for making recommendations.
@@Chasm9 I think it's far more likely that Steve was looking into his crystal ram stick and predicting your needs
He is always watching
@@magnusnilsson9792 google not telling people to upload a specific video because someone is watching an old video.
8:47
😂Battlefield 2042 “I couldn’t test properly due to the complete lack of anyone playing this game.”😂
250 games.. you guys are awesome. loved the detail, explanations, recommendations, all of it, thank you so much :)
Quite a strong showing by Intel. I was surprised by how well older games performed, as I expected that to be the weak aspect to using one.
I still doubt they were all on steam, though. For example, Far Cry 2 and Far Cry 3 are a whole new battle to get working. And for me, it was literally impossible on steam to play FC3. And FC2 needed HEAVY 3rd party patching so it worked properly. Can't imagine a universe where he didn't have ANY issues with those games on Steam. But that's the game's/Ubisoft's problem, not the card, just to be clear.
It has gotten so much better since launch. Actually pretty usable in new and old games, although at the price the A770 16GB is hard to recommend but the A750 and A580 are great value cards for budget systems. Thanks for these tests!
Edit:
You mention horizon forbidden west being playable at 60+ fps but do note that’s only because of the 7800x3d. I tested this game with a 5600 at launch and could barely get above 45fps with very low settings and XeSS performance with low gpu usage. Any game with decima has this issue. Death stranding and horizon zero dawn also. Not a cpu bottleneck since my Nvidia card gets significantly higher fps with same cpu.
Interesting, I wonder if the game keeps swapping assets from VRAM out and back in, hitting the 3D-VCache if you have it, and falling back to system RAM otherwise (because the CPU's cache isn't big enough). Maybe Nvidia's drivers prevent this somehow.
Sound like CPU limited
Horizon forbidden west at launch was a complete mess tho, they patched it after a few days but at first it was giving my friend's 3090ti something like 90-95°
Intel GPUs have the highest CPU overhead and can be quite taxing for older CPU
I'd like to see a driver overhead comparrison between Nvidia, Intel and AMD, because many games can be brute forced on ARC with a powerful CPU.
"no graphical issues" sometimes you can't even tell if those are gpu related, i just finished Horizon Zero Dawn, in dlc zone in Dam area there was a floating black box of textures... is it the game, is it the radeon GPU? (probably not, since it's PS game), but it's kinda random sometimes
Had a similar issue with Beam NG Drive with the AFMF2 preview driver. Black sand in any map that used the sand material. Restarting the game fixed the issue though, which is strange.
Intel Arc FTW... AMD should be worried.... time is ticking for Radeon
5:15 - 87% of the games passed, that's extremely impressive for a 1st gen GPU.
First and last for them when it comes to desktop, unfortunately.
@@beachslap7359 Says who? I looked it up and all I found were rumors. Battlemage should be releasing in a couple of months still.
@@beachslap7359 What? Battlemage is quite close to a desktop release.
@@Hyperus board partners haven't even been briefed on the specs yet, so I doubt that.
@@Hyperus”quite close” 🤦♀️
You and other UA-camrs should really do benchmarks of these GPUs for creative apps such as After Effects, Blender, Maya, 3ds Max, Houdini, Cinema 4d and Unreal Engine 5
Achievement Unlocked: Hey Tim!
All behold the great Tim for he is now our fearless leader!
Hey Tim: "A steam achievement when you have played ALL of your steam library at least once.
Only .0001% of all steam users hold that badge of honor."
I built an AMD/A770 system about 9 months ago. I don't any newer games except maybe Borderlands. I have not had any problems so far with my setup, but I also don't have the vast library you do. I will definately bookmark this video so I can refer to it before I purchase any games. I went with the ARC A770 mostly because I want to support having a 3rd company out there making video cards, but I was also aware of the driver problems I would probably encounter. I had an old GeForce 3060 as a backup just in case.
Translation: I wanted to play ALL my games for 1 hour...
Now we wait for Steve's Version of that.
Steve's version: Testing all Ampere, Ada, Blackwell, RDNA2, 3 and 4 cards in 500 games.
@@nipa5961And there will still be that one commenter saying, "could you test [game]?"
I was an early investor in Arc A770. It plays games well but it absolutely crushes with video rendering and processing! Competes with 4000 series RTXs.
Really nice work on this.
That's a lotta games!
Really appreciating these more creative/out-of-the-box ideas the channel has been making lately. Maybe it's the quiet before the storm for new hardware releases, but whatever the reason these have been eye-opening and gave me what feels like a lot more perspective. Thanks as always!
I would have dabbled with Arc knowing it's limitations if they hadn't realistically locked it to only systems with reBAR. I ran a lot of older hardware when their cards made sense for me to buy. I'm excited to see what Battlemage is like though. Having a 3rd option is amazing.
Its not locked to re-bar systems only, and A LOT of that older hardware can receive re-bar just with bios update.
@@alexturnbackthearmy1907 Isn't it something close to 30% performance lost* without reBAR. I'm talking at the time a H81 board running a 2014 haswell CPU and PCIE 2.0. I work in IT and try to keep stuff out of land fills. :)
It was perfectly suitable with a GTX 1060 and kept playing the games I wanted at 60fps until around 2022. Monster hunter, Elden Ring,etc. I've finally upgraded the board but picked up a used RTX 3060 12GB for cheap.
@@SinisterPuppy Nowhere near as much hit JUST from re-bar. It would be like 80-85% without rebar, but nowhere near 30, there must be some other funny buisness going on.
@@alexturnbackthearmy1907 I forgot a word. It's 30% perf lost with rebar off. There's a pretty good techpowerup article on it. Only reason I know is I really wanted to give Arc a try.
@@SinisterPuppy Makes more sense, but 30% loss still seems too much, could it be one of older articles?
Impressive! Both the ARCs performance and testing all those games!
Makes me wonder ? How many of Tim’s old games he said, dang ! I have to go back and play this again 😊
IDK about Tim but Todd literally made the game he wanted in grade school. Elon Musk's cybertruck is just something he drew in kindergarten. The most wealthy people on the planet are just children who never grew up.
This is really great to see. Very cool to know that Arc is really improving over time since launch.
Upvote for the sheer amount of work it took to make this video.
Gotta respect HU for such extensive testing of Arc, that's quite the commitment. I personally got fed up with that Nvidia tax (having owned probably 10'ish Nvidia cards in the past) and went with an Intel Arc A750 as an "experimental adventure" as I don't play games as actively as before for sales price on singles day or whatever it was, 269€ down from 299€ at that time I believe when finally upgrading from a 1070 Ti after holding onto it for years hoping for that "performance vs cost" to come down to reasonable levels which never happened. At the time I picked up the A750 even a card like RTX 3060 costed quite a lot more and A750 already then performed similarly or (mostly) better. It's been suprisingly easy to make that switch, I expected more struggles tbh but if anything this card just made me excited for upcoming Battlemage which I might pick up the top SKU this time as Intel has earned my trust to at least give me an experience I can totally live with as far as technical issues (or the lack of). Thanks to driver updates the card has only started performing better and better over time so that was a nice "bonus" to being an early adopter, feels like you start getting more for your money.
For Battlemage B870 (
Jeez. Talk about delivering and walking the walk.
I own Arc A770 Limited Edition. The only issue I had was with Starfield game. I waited until it got patched and then its performance got decimated with Nvidia performance patch. I also had suspiciously low 1% lows in old games like original Need for Speed Most Wanted. I do suspect that those 1% lows on old games would be better on Nvidia GPU. Though, all old games were still playable.
Thanks for the video, my confidence for purchasing ARC is now way up. No more hesitation buying new gen ARCs.
Have you ever came across DXVK, a wrapper that allows a DX8/9/10/11 games to be played using Vulkan? It would be nice to see how well Intel Arc performs in an older titles using Vulkan.
DXVK is actually what the Intel driver largely based on for DX11 and below.
A couple years ago they suddenly had a massive performance boost from a driver update and that was actually because they ditched their in-house driver and just switched to DXVK (without giving credit)
I think the GTA4 issues likely would have been solved by using DXVK here.
@@mar2ck_ Then it's doing a bad job with DXVK wrapper if using DXVK manually give better performance and fixes games that normally don't run like Arkham Knight.
DXVK fixes almost all issues with GTA4, runs better than it has ever done on any hardware using DX. So if they have DXVK in the driver they don't know how to use it properly.
@@blitzwing1Intel took all the credit for the "performance breakthrough", it was pretty disappointing to see.
The only reason we know what happened was one of the DXVK devs (Joshie) went digging through the driver binary and found all the DXVK variable names.
I deeply appreciate such enormous work to test compatibility of new GPUs with the older titles. As someone who likes to replay hundreds of older games after upgrading a monitor I think this area is underrepresented on most tech reviewing channel and I would absolutely love to see more content like this exploring combinations of modern tech and settings (High refresh rate, High fps, ultrawide resolutions and top tier GPUs) with old but still great games.
I bought an ARC A770 a few months back to play around with and for the most part it's great. All of the games I ran played perfectly fine except for one. Which was the Half Life remake Black Mesa. It starts and runs fine, giving good fps without a hitch. But certain textures were missing and blacked out. I could live with that while playing.
The annoying part was the flashlight only seemed to shine off of static objects and npc's. It wouldn't shine off of any of the walls or light up hallways. So you weren't technically lighting up your surroundings just objects. Which made it a pain to traverse dark hallways. I've been wanting to play again for a while since they revamped the routing of familiar levels after the release of Definitive Edition. So I just swapped my RX 5700XT back into my system for now.
But, after reading a recent patch-notes news on Steam it seems Crowbar Collective has acquired an ARC card for testing. So it looks like they'll be working on ARC compatability more. Which will be nice.
I'm ten hours into Black Mesa, no graphical issues but occasional crashes. A770, GE-Proton8.
@@blipblap614 Interesting. I haven't had crashes but the game is missing some textures which would leave a black base. One of the missing textures being the scientists eye's. So they look like they have black voids while talking to you. The flashlight not working properly for me annoyed me enough to stop playing. It was only lighting up objects and npc's. It wouldn't light up hallways and surroundings.
You need to change the command line options on Black Mesa. Add "-force_vendor_id 0x10DE -force_device_id 0x1180 -dxlevel 95", launch the game, then exit the game. Then remove the "-dxlevel 95" line.
You might be willing to live with it but that's objectively an 'unplayable' result. Both Nvidia and AMD have made major architectural changes over the past couple of decades but they don't have that issue.
The problems with ARC are hard coded. Compromises to the memory subsystem make compatibility with older games a dicey affair. It is basically a DX12 only GPU without patching. Battlemage won't have this issue but intel has already screwed the pooch. In the midst of an AI boom, nobody is thinking about Battlemage for datacenters. And few gamers are willing to take chances with an $800 purchase.
@@Lurch-Bot I said I was willing to deal with a couple of textures missing. Because they didn't prevent the game from working.
What made the game unplayable for me was the one and only light source you have to help you get through dark spots in the game wasn't working as a light source. It only showed that it was working when you were looking at random static objects and npc's. It wasn't casting light onto walls nor was it illumnating dark hallways. That's what made it unplayable for me.
I also never bought my ARC card as a full on replacement to my 5700XT. I wanted to play around with something different for a while so I bought an ARC card to mess about with. Since I've been wanting to play Black Mesa for a while I just switched back to my 5700XT.
I also highly doubt Battlemage is going to be $800. If it were Intel is going to have a very hard time trying to sell them. They're in no position to pull an Nvidia and charge big money for low-tier garbage.
Dude you are the messiah of Computer gaming hardwares !
It's very promising for next gen Battlegame GPUs.
Thanks for this review, Tim. I'm in process of a new build and I did it...I bought the Sparkle A770. This is my first foray into Intel Arc territory. Price was $15-$20 more than RX 7600 (which I've been using for the past year). I'm skilled enough to work thru any problems I encounter, so the 16GB of VRAM was the clincher. Plus, I wanted to try something new. I'm not worried. Will see how things develop.
I've been using the A770LE for about 18 months. I've had zero issues (but I play very few games). My RTX3070 in my older machine seems a bit faster, but I prefer my ARC. Why? It's not from the company I've grown to hate.
It's funny when you realize the ARC dropped when the 30-series was the current lineup, and the A770 16gb had more VRAM than every single GPU from NVIDIA other than the 3090/3090 Ti. In a $350 card.
@@rustler08 Not sure what games require more than 8GB, but what I play doesn't. It's just nice to know the VRAM is there if I ever have the need for it. I'll be in line for the next ARC card.
@@Raintiger88 At 4K native, there's quite a few games that would like more than 8GB VRAM. One of my main games (Assetto Corsa) is one of them. It still runs great on my 4060 Ti though, and the Arc A770 LE performs only slightly less in it. Both are great cards for the game.
@@HazewinDog Ah, that explains why I've not noticed anything using is. I pretty much play at 1440p
@@Raintiger88 I believe Sony games are the worst offenders, along with Hogwarts Legacy, with consuming much VRAM
I love how thorough you guys are with everything you do, but holy shit take some well deserved off time too
Excellent video. Had an A770 16GB for about 12 months (maybe less, time is weird) and I'd say I agree with pretty much everything you've said.
I'm happy with my purchase and haven't had any big issues along the way. At the time I bought it, I was looking for a 16GB card, and the A770 was the cheapest available by about $400 (Australia).
I've been using my A770 at 1440p while gaming and streaming and it manages to stay at high 100+ fps the whole time so I'm very happy with the card. Definitely worth the price compared to any AMD or Nvidia equivalent for me.
I appreciate the effort. The sad thing is that a 6750 XT or 6700 XT is still available for $300 in the USA, and almost everything will run fine on them. I have one to compare to my A770 LE. It's just sad when the 6700 XT works fine in games that have problems on the A770.
No Man's Sky has lots of issues. If you increase the render resolution to something over 130% at 1080p, the game crashes and will not start up again until you edit the config files to reduce the render resolution. It also takes much longer to initially load saves, in addition to the constant in-game stuttering.
Arc has a whole lot of issues that may never be fixed. Power is one, but Sleep also doesn't work properly - games artifact or stutter when the system comes back from Sleep. There's plenty of monitor/black screen issues with different combinations of cabling and displays (all of my Arc cards showed no display output until I loaded the drivers via RDP sessions into the PCs). There's also the firmware updates as part of driver update issue. Intel shows no signs that they'll decouple the firmware updates from driver updates, so every driver update you run the risk of bricking the card. But hey, at least now they warn you they're doing firmware updates during the driver install process.
To be fair an A770 can be had for $220-240 on sale.
A lot of the teething issues will be ironed out with battle mage
Hello, RX 6700 XT user here. No Man's Sky stutters like hell when entering/exiting any planets, stations or freighters. Very, very long loading times are an issue especially after driver updates, along with massive shader compilation stutters that can pause the game for ~200ms at the worst case.
I use an R5 3600, so that might be an issue, but the game runs on Switch just fine, it should be able to run fine on R5 3600 as well.
Great video, thanks again! I had been happily running my ASRock Challenger OC A770 16GB and figured performance was fine, even though having mayor CPU overhead on my Ryzen 5 5600 in Horizon Zero Dawn making it barely playable (around 45 fps in many scenes on 1080p) but now it stopped working as it should and have sent it back for repair (after 2 months of use), hoping after repair or replacement it will run better!
That vram issue in GTA IV is not exclusive to Intel, I've had that issue before on my RTX 2070
Dang... much respect. That's a freaking lot of games! Usually I skip the ads section of every video out there, but out of respect I watched it for this one. Keep up the fantastic work, HWUB!
I’m actually so keen to see intel come out swinging. This duopoly needs to end.
just don't make it a monopoly lol... we're already nearly there
Whoa! Mamoth undertaking. Thank you for your hard work!
I contacted intel about the crashes with Avatar and they advised to disable XMP, which did stop the crashing. But months later, I'm playing it with XMP and it was loading fine, performance is very inconsistent though.
I never though xmp would cause game crash,
Very much looking forward to Intel's Battlemage GPUs! Thanks for all the hard work you did on this video.
Remember to enable Resizable Bar (REBAR) before using any Arc GPU.
And use INTEL CPU to gain performance as there is widening gap.
Just the fact it takes that much a performance hit without reBAR is why ARC is already a failure and Battlemage won't save 'em. Nobody is planning to buy Battlemage for AI. They're in the midst of a GPU boom with a brand they've given a black eye by jumping the gun. The A310 is a GPU for literally nobody. The people who would want a GPU of that class for gaming won't buy it because their platform doesn't support reBAR. Better off buying a GTX 980 than an A770. Or, for that matter, an RX6600 which isn't brought to its knees by running without reBAR on PCIe 3.0. The fact you can get it with 16GB VRAM is just a pointless waste of VRAM. They want it to look like it will be a solid 1440p GPU moving forward but it really won't. Sort of like having 16GB on a 7600XT but even more sad, pathetic and pointless.
Intel is a budget GPU manufacturer with a product that is a dog that won't hunt. Not with the systems the people in that market range acrtually have. ARC is a knee-jerk reaction to the temporary shortage that is already long past.
With an AI boom brewing, intel has an advantage over the rest in that, nobody wants their GPUs for AI. So they might stick around for budget gaming. But then they need to rethink their Battlemage launch. They can't go toe to toe with the best gaming GPUs. They need to focus on being an even cheaper alternative to AMD. AMD is actually who is going to win the AI boom because Nvidia is cocky and to obsessed with per unit performance to realize that AMD is going to win on efficiency, just like they were the best performers in the last mining boom. I literally sold a 3080 to buy a pair of 5700XTs, got the same hashrate and only burned 80% of the power. Similarly, data centers don't care at all about per unit performance, only overall efficiency and reliability, which is where current AMD products shine. While board repair experts are saying that 50% of 4090s will fail, I can't find anything like that with AMD. Their GPUs are solid performers.
Also, AMD didn't need special hardware to do RT or anything like that which makes Nvidia second best. They can't do RT or AI without spending billions of dollars on hardware development. And CP 2077 is the one and only game where Nvidia has an RT advantage and that is solely because CDPR is a whore for Nvidia. Bought off to make Cyberpunk a 40 series showcase. They artificially tanked performance even on prior gen Nvidia GPUs. Psycho RT still looks the same but will no longer run satisfactorily on an RTX2070 Super, as it once did.
Also, they didn't fix Cyberpunk. They made it worse. Pandering to a bunch of spoiled children is not a comeback story. They deleted so much content just because they couldn't fix their crappy game engine. I have never seen a worse Keanu Reeves performance and, in time, I bet he'll be suing CDPR for ruining what little that was left of his career. He sold out so his crappy Arch motorcycles could get some advertising. John Stamos would have been a better Silverhand and would have contributed some real music to the project, rather than being a brain dead heroin addict just trying to advertise his pretentious motorcycles.I was actually somewhat intrigued when Reeves was on Leno's car show 4 or 5 years ago but the whole thing with CP2077 has made me see those bikes for what they really are - a joke.
@@Lurch-Bot Dude A770 does better job on 4k than 4070ti. You have no clue about the INTEL architecture or REBAR. REBAR equals using 2 silicon's instead of 1. What you see as FPS is only half and no waiting clean FPS where all are good. And yes as soon as the Battlemage launches I'll be on the long queue to get it with those that understands the performance and architecture.
@@paulboyce8537 In the games I tested, my A770 LE performs either on par or slightly worse (depending on the game) than my 4060 Ti. That's with a 5800X3D.
@@HazewinDog With INTEL CPU you get better results. But that's not all. AMD/Nvidia are standalone cards where CPU matters very little. 1 silicone. INTEL instead is different architecture where it splits the tasks via REBAR. 2 silicone's. You have "two engines" but you only see the performance of one. There is no waiting and all the FPS matters. AMD/Nvidia there is lot of queuing where half the FPS is wasted. If you compare FPS alone you don't see the performance from the ARC. 4k I find if there is 40FPS from the ARC it is very playable. 4070ti for example 120FPS can stutter and not very good experience. Further proof is RT. A770 you can often use RT with mid to high settings on 4k. You can't say the same for the competition even if you double the "FPS". Two different architectures and we test using same tests that worked when there was only AMD/Nvidia with similar architectures.
I wish this video was released last month when I was buying my first gpu. This is the only video that actually tested other games than the usual esports games that I don’t play. Thanks for taking the time as this is very helpful.
Btw - there is a much murkier distinction between "game problem" and "intel problem". The fact that it works on Radeon doesn't necessarily mean it's not a problem of the game. It can just mean that AMD already made a workaround of the game problem. If you also tested on nvidia, you would probably find some games, that work ok on nvidia and not on amd (and vice versa), because one team made a workaround. Or if you would test all of these games on amd, you would also probably find a problem or two that intel GPUs don't have.
Was going to make the same point. I get _why_ it's done, but I hate it because not only does it shift blame from the game dev to that IHV, but shipping these driver workarounds cost time, money and in many cases probably performance too, PLUS the underlying issue is less likely to get fixed.
Okay, but that's kind of the point of drivers, isn't it? Why do you think Nvidia releases constant driver updates with what they call 'game-ready drivers'?
@@HazewinDog No, the purpose of drivers is not to work around application bugs.
@@cacheman game problem and game bug aren't the same thing to me
I would LOVE to see the same tests with nvidia and amd in the same ball park of performance. Maybe that could lead to the bigger hardware comparison im youtube if you guys decide to correlate the data in a fourth video that talks about general game stabilty between brands.
Damn you test stuffs fast too, Tim. And I've come to watch your video fast as always!
I am so glad someone finally asked and answered the appropriate question regarding a gpu: "will it play the damn game?" instead of lazily throwing fps charts at the screen and making a big deal out of a 4% performance uplift. THANK YOU. After using my Arc A770 for a couple months I have yet to find a game in my library that doesn't come up on my monitor. In fact, left for dead 2 works great!
...However, notice I said "come up on my monitor", that's because Intel hasn't given an update regarding VR game compatibility for almost 2 years now, and for that reason alone my GTX 970 will be staying in my PC for the time being. Assuming you find this important I'd politely ask either of you, Tim or Steve to make mention of this in your next intel arc coverage as intel needs more pressure to make these cards true competitors.
Ive been using an A750 for several months now and its been great. Im really happy with it. No issues yet but I admit I dont play terribly common games.
That's even better, if it also works with non-common games.
I'll be honest. I have zero interest in these Intel GPUs. But, I clicked to support the hard work that went into doing this and to support the channel. You guys are legends. Cheers!
This is the correct mindset, to pay 500 bucks for the future RTX5030 with 2GB VRam and a 64-bit bus.
I really hope someone from Intel is watching, seems super valuable.
Most games they will have already tested and even optimized for. It's older titles where the many game crashes and crazy rendering artifacts really kick in... (DX7/8 and earlier) and it is obscure titles where the terrible performance really kicks in...
Thanks for your hard work, Tim!
Dude, this videos like military grade. Well done. THANK YOU! For real.
hats off to both Tim for this insane task and also Intel for actually delivering on the promise of fixing one of the most broken hardware ever released
This is honestly really good given how new Intel is to the big GPU game. Here's to hoping Intel & Radeon will duke it out in the low & mid-tier segment, revitalizing that market and granting consumers high quality, low cost graphics cards.
Don't forget about snapdragon they might enter the dgpu space
I can't say I was waiting for this because nobody expected this. Thanks!
Hope intel gets successful. We need more competition. Great content Tim.
Wow! Massive testing effort. Thank you.
I was considering buying the A580 to replace my 1060 6GB. It's more powerful and has 2 more GB of VRAM on a wider bus. I can buy it for 180 Euro which is muuuuch less than any comparable AMD or Nvidia card. Seems like a solid short-term upgrade.
If you are willing to buy used, you can get way better offers for that money if you look around. One example that comes to mind is the 5700xt. EDIT: For the same money you also find Titan XP's and 1080ti's, whilst both older, even faster than the 5700xt depending on the game, plus they offer more Vram.
TBF i bought a 2070 for 200€ which runs anything i want with no problem at 1440p. DLSS works wonders at this resolution
Yeah sounds like a good upgrade. I used an a750 for about two months and I has a partially worst experience than with AMD (regarding drivers). So all things considered I was quite happy with it
but does your system support rebar?
As long as your motherboard will do resizable bar, go for it!
Can't imagine how incredibly much work this whole project was. downloading, installing playing over 200 games, tweaking their settings, recording, editing, making graphs, even testing some of them with another GPU. This must have taken so long to make and the quality is still amazing! Incredible job!
As for the question at the end, I'll be staying with the team green for the peace of mind that stuff works and many other little things I care about. But it's very fascinating to ACTUALLY thoroughly see what you can expect from an Intel GPU.
As someone who just built his PC with a FE Arc A770 and not knowi 6 whether I made a good choice,
You guys came in and saved my Day like a Superman playing Starman theme song. Absolute Madlads ❤😢.
This is why we love you guys at Hardware Unboxed.
You can fix Bioshock Infinite's texture bug in 2 minutes. The game can't detect the vram size and it won't load the textures.
I remember having vram issues on gta4 which make the game look way worse than it supposed to be on a freaking pc with Nvidia gpu
I got L4D2 working on my A770 LE edition.
Just slap command -vulkan into launch options and been good. Recently did a run thru with few mates on it.
IIRC Starfield is a game side thing and GTA V is a weird MSAA incompatibility, i think Intel even said that and is entirely on the game dev side. Could argue GPU arch side as well.
Maybe some other artifacts like that are due to MSAA as well, i wonder. Like Dirt Rally.
Insane work Tim, and the amount of well working games is also impressive, given the arch seemingly emulates some commands it doesn't have at hardware level.
I'd love to see this kind of exhaustive testing on popular emulators using all 3 GPUs including performance and compatibility.
personally, i do get that people want Intel GPU to compete, but i can't get people hyping them so much that they will just come and compete with nvidia in 2nd or 3rd gen
the thing i mostly look at is that A770 is competing with RX 6650XT based on price alone, but if we compare silicon, 406mm^2 N6 vs 237mm^2 N7 and newer vram, in silicon department A770 competes with RX 7800XT on almost level footing, i think 7800XT has slightly less silicon (200mm^2 N5 + 146mm^2 N6 on MCD)
i think people just want more choice in what graphic card they want..., AMD and nVidia have a duopoly.., wich is bad for the consumer...
@@SenZjo some people really do think that since "radeon left high end, then intel will come to compete"
ofc choice is great, but i doubt Intel will be competive before Druid, and then first signs of drivers overhead should come into play, at this point those drivers are made for arc and arc alone
There is no need to worry too much about how much silicon Intel is using. That is their problem. We should be concerned with price to performance. And I guess to a lesser extent, power
@@deviouslaw my point is under utilization, if you compare those 3 on tech power up, arc dominates in Shading Units, TMUs, ROPs, Execution units, even L2 cache, but there is nothing about L3 so maybe that compensates...
on those specs alone, this GPU should be comparable to 7900GRE or 4070TI or so
Anecdotally, I have tested my Arc A770 with shaders from ShaderToy, which if your not familiar, have a lot of wonderful graphical effects that don't normally render using geometry. Specifically, shaders in ShaderToy lean heavily on the compute side of the GPU, not on the rasterization or texture mapping. In those benchmarks, my A770 smokes even the RTX 4060 between 60 and 100%.
So what I'm saying is the silicon is definitely there, but I think standard gaming with rasterization struggles to achieve high GPU occupancy. I definitely think there is more runway for future Intel GPU's and drivers to compete with AMD and even NVIDIA, but it will take time for sure.
Really appreciate an in depth look like this and it certainly seems like Intel has come a long way drivers wise. It'll be interesting to see how many of the issues are architectural due to the issues with Alchemist itself as opposed to just driver issues. I feel like the continued Starfield problems have to be beyond just drivers as that had been singled out multiple times in their driver updates.
Huge work! saved Intel some time :D
This is all looking great considering Battlemage is coming and I for one have high hopes for the next step of Intel's GPUs
Not being able to play Starfield is just a good feature.
Starfield isn't even a finished game it was rushed out in a beta
Man the design of the limited edition of the card is so nice, its prob not the best for thermals but i wish i had one just to have on the shelf, sadly its discontinued and people are selling em for 2-3x the price which isnt worth it at all.
How long did this take
a few minutes at least
I want to know how much room the installed games took up (if he says this later in the video I'm sorry for jumping the shark)
At least 3
Probably 2 full weeks of work, 80+ hours
Thank you so much for this video! I will send anyone not certain about buying Arc gpus here.