Watch This Next! The Future is pretty fked ua-cam.com/video/bc-lTgB8Ff8/v-deo.html
Check out the Arc cards I used:
As a retail affiliate, I am paid a percentage of qualifying purchases at the expense of the retailer.
💎 Arc A770 16GB - newegg.io/nc644e2d2a
💎 Arc A750 - newegg.io/ncccf517f6
💎 Arc A580 - newegg.io/ncd1ad6c1a
💎 Arc A380 - newegg.io/nc61a41f4
Pardon me, but all of those games are old, and some are VERY old. GTA San Andreas is very popular again because it offers a really great experience even on mobile devices, so Intel has provided good drivers for it even though it runs on DirectX 9.0c.
I got an A750 LE for $284 AUD, an absolute bargain, and I'm super happy with my card, but as you went through that list of games not supported well, it occurred to me that I luckily don't play any of those games. :D Thank Odin!
I switched from Nvidia to an Intel A770 16GB and love it. Note that I only use DX12, and only for Flight Simulator 2020. Ready to purchase the upcoming Intel Battlemage GPU.
I got an Arc card at Christmas 2022 and it was OK, but when I got a new PC and used the A770 LE it was so much better, and the stability improved a lot. When I first got it I had artifacts when playing video (but not in games); I haven't had any artifacts in many months.
It's been about a year since I built a PC strictly for the A750. Now it's July 2024 and I wouldn't dream of pulling that card out of that machine. It's an 11th gen i5, and with the A750 it runs very well. I use it mainly for testing my Steam library games.
I originally got Arc because I needed a new GPU to replace my 3060; the games I was playing needed more VRAM than I had available. In the first few months of usage games crashed only rarely, and over successive drivers I noticed increased performance over time. Very happy with my Arc.
I was 100% team green before Intel Arc, but I am now team blue and looking forward to the new releases. My hope for Intel is to release a card that knocks on the door of Nvidia and Radeon and forces them to stop hiking prices so that they become more competitive, as people want and need graphics cards but can't afford $2-3k for a card. I will be buying the next team blue card.
I have had both green and red cards, but Intel stepping into the space has been one of the most exciting things to happen in PC tech since I started building. I am looking forward to their next generation and what it will mean for the market.
For me the only downside to Arc now is the power consumption relative to performance. If Battlemage can deliver more fps and much better efficiency, then I'll view it as a viable alternative option.
For Battlemage to be more power efficient, Intel needs to use 5nm at the very least, and that will make those cards expensive. Gamers will then complain about the high price.
@@arenzricodexd4409 I mean, they do use TSMC's 6nm node for the first-gen Alchemist Arc GPUs, so of course they're gonna use the 5nm-class N4 node for the second-gen Battlemage Arc GPUs
@@curious5887 And that will make Battlemage expensive. If Intel still needs to sell it at cost or even below, Battlemage will be their last discrete GPU.
I was tempted to go for the A750, although the cost went up in my area around the time the RX 7600 came out, making it worse in terms of price to performance. My cousin's GPU died, so he has my RX 7600, which was my guest PC card; I currently have no GPU in that PC and am still tempted to pick one up. Edit: the A750 is now really cheap, less than a 3050 6GB or RX 6600.
I bought my Arc A750 for 170€ and put it in my remodeled computer 9 months ago. I bought it just to test how Intel worked, with a view to perhaps buying a Battlemage, or at least having one among the alternatives... ironically it works so well that I don't know if I'll buy a Battlemage or wait for a Celestial. It was already working very well, because Intel's work was extensive, but I have to admit that in these 9 months it has also been intensive, and the card performs better with each update.

Its big problem is that games mostly do not adopt XeSS the way they do DLSS and FSR (and FSR is honestly worse than XeSS; at least it is an alternative when XeSS is not implemented). So although XeSS is a better solution than FSR for super scaling on any graphics card, let alone XMX mode on cards with XeSS hardware, the reality is that Intel's real limit is still the game developers who have to implement it. Perhaps with the Windows 24H2 version this will change, if games call the Windows libraries and those pick the most suitable solution, but for now the adoption of XeSS is the most serious issue Intel has to tackle. Not only for the sake of Arc, but also for their future Battlemage.
This was a good video, thank you. I've been considering an Arc A580 to swap out my RX 6600, and giving my RX 6600 to our daughter. She's running a Radeon 5700, and her room gets HOT. I run the 6600 and it's cool. My CPU is a 5600X; my daughter is running a 5600. I could get an A580 and give my 6600 to my daughter, and the 5700 can then be sold, I guess.
I have an Arc A770 and I have had an almost flawless experience. The only problems I have had have been with the BIOS, but I got that fixed and now I haven't had any issues for about 2 months. Edit: XeSS has also been a really good upscaling technology, which, if implemented correctly, looks almost as good as DLSS.
I'm still waiting for them to officially support VR. In every other scenario the GPU rocks, but on OpenXR/Vulkan API related things the driver is missing functionality, even though the capability is there.
I would love to continue messing with my A770, but when I got my 7900 XT I upgraded my PSU and it no longer has a 6-pin, sadly. Arc is definitely good for budget gamers and even streamers. Intel had AV1 support before AMD and Nvidia. Strangely, Nvidia was the last to implement it in OBS, and even more strangely, they have the most stability issues and bugs with AV1. Both Intel and AMD seem to have solid implementations there, from some of my own testing.
Totally surreal to see Intel, who were totally anti-consumer with their CPUs for nearly a decade (only 4 cores and hardly any significant performance improvements) until AMD Ryzen launched, now being touted as the pro-consumer alternative for graphics cards. But if Battlemage comes out strong for the mainstream, with price to performance, I'll be trying out team blue.
Hey, they were anti-consumer before that too, even bribing OEM execs not to sell the AMD Athlon, which had an edge on Intel's Pentium. There's a reason AMD didn't make profits despite successful chips that killed Intel's proprietary 64-bit "Itanic".
@@RobBCactive None of these companies are consumer friendly. They only become more tolerable when they lose market share and are trying to win us back. I have no allegiance to any of them.
@@RobBCactive AMD did not make a profit for nearly a decade because of their ATI acquisition. Intel did not play fair during the Pentium 4 era, but their Core architecture was very competitive with AMD's Phenoms. AMD still made decent profit with Phenom and Phenom II, but the issue is that AMD spent a lot of money acquiring ATI. If AMD had been more clever there, they would not have been starved of the money needed to compete with the resurgent Intel back then.
@@arenzricodexd4409 Actually, Intel acted illegally in seriously criminal ways, and there were many legal cases brought against them worldwide. That included actual bribery. Starving companies of profits means they become under-resourced. Intel attempted to fix the "error" of x86 licensing, not only welching on legal contracts to monopolise the PC CPU market, but later trying again at 64-bit. For those who saw the fishy goings-on and wondered, then saw Intel's monetization of the near monopoly, seeing Intel portrayed as competition champions is not credible. Intel AXG was the strategy to defend the lucrative near monopoly in the data center and, as integration continues, to undermine Nvidia's graphics business in client computing, just as Nvidia were frozen out of the chipset business. There's no way Intel planned to lose a lot of money on Alchemist, and the lack of follow-up to a card that arrived late to market in 2022 is part of that.
I'm thinking of buying one for my new PC build, gaming at 1440p, but I'm hesitating between this Arc A770 GPU and the RTX 4070 Super. Can someone help me decide please? Thanks. Or should I keep my 1080 Ti and wait for improvements on the A770?
I would totally wait bro!! I have the A770; it would be a nice upgrade, but the 5xxx series will come out soon and Battlemage is like 6 months away, I bet... I'd wait for Nvidia. I have not been back since my 980 Ti, and I really miss Nvidia after the AMD 5700 XT and the A770.
Intel Arc has a lot of potential. My hope is that more AIB partners, like Sapphire or XFX or Gigabyte, might start working with Intel to offer different versions of Intel's graphics cards.
How many years of support can we expect on Arc cards? I want to build a future-proof PC, but I'm a bit scared that out of nowhere Intel will end driver support. But surely not, as they are relatively new, right???
Pay closer attention to the 1% lows; if these increase, your experience will be smoother. And just looking at the launch drivers vs. the most recent drivers explains why so many games show massive FPS increases: they improved the 1% lows.
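For anyone curious how that metric works: a 1% low is derived from frame times, not from the FPS counter. Here is a minimal sketch with made-up frame times, using one common definition (tools differ in the exact method):

```python
def one_percent_low(frame_times_ms):
    """Average FPS across the worst 1% of frames (one common definition)."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                  # worst 1% of all frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                   # ms per frame -> FPS

# Made-up log: mostly 7 ms frames (~143 FPS) with a few 40 ms hitches.
times = [7.0] * 990 + [40.0] * 10
print(f"average FPS: {1000 * len(times) / sum(times):.0f}")  # ~136
print(f"1% low FPS:  {one_percent_low(times):.0f}")          # 25
```

The average barely moves, but the 1% low exposes the hitches; that is why driver updates that fix stutter show up in this metric first.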
I run an A750 on a Ryzen 7 5700. I usually play single-player games and have had really no issues. I would get some slowdown and some artifacts in the beginning, but they were resolved rather quickly, and having started with PC gaming back in 1987, I know this is how most of the cards on the market started out. ATI didn't just make a card and have it be the best on the market; it took a couple of years with the Rage chips, and if you go back and look, a lot of the first-gen Rage chips were used in servers and not as gaming GPUs. Nvidia hit it out of the park with the TNT line, which is one of the reasons we don't have 3dfx any more, along with a lot of others, but they too had a lot of bumps along the way.
These discussions downplay AMD's hold on the APU-based gaming market, in the same way Intel and AMD are behind in the dedicated GPU market. It also doesn't seem like Intel and Nvidia will catch up on APUs, all of which narrows Intel's reach.
This video aged like milk in just a month. AMD does better in GPU power efficiency, integrated GPUs, and CPU stability. Being in the industry for a long time doesn't mean Intel does it better; they're still greedy AF, and now their CPUs are dying here and there.
They should not only focus on gaming but also on creative and engineering application performance. Being a student, both are really important to me. They've got the pricing and the potential for future performance improvements (they already proved that), which AMD and Nvidia are lacking.
Intel Arc Xe2 Battlemage looks very promising. Xe2 is one microarchitecture that scales from Low Power Gaming (Xe2-LPG) for iGPUs to High Power Gaming (Xe2-HPG) for dGPUs. The reason DX9 games are very popular is that most people are using much older cards. When Battlemage comes out there will be very cheap Alchemist cards available on the used market that people can upgrade to from those older Nvidia cards. Intel used F1 24 to demo the Xe2 iGPU on Lunar Lake: a new game that was not yet released working well on an unreleased GPU. Intel is working hard to get day-1 game-ready drivers for popular titles. A very different start with Battlemage than with Alchemist. I think more people will buy Battlemage compared to Alchemist, by a significant margin.
I bought an A750, and though I had some issues with it, I had a good experience overall and would recommend the A580, A750, or A770 to anybody trying to build a new PC on a budget. It's not really worth spending 800 bucks on an Nvidia card when cheaper options perform better.
I got my A750 in May '23 and all major issues were essentially gone. Sure, I can't get the extra boost from 200 to 300 FPS at 1080p in all games, but I can go to 1440p and keep the same FPS in those games. With new drivers it's only getting better, gaining 10% FPS in a couple of games every month. I got the LE version and it's silent as a mouse, but it does draw some power, especially when overclocked to compete with the 7600 & 4060. In ray tracing & upscaling it's rivaling Nvidia; the A750 actually beats the 4060 at 1440p in some titles, probably due to its 256-bit bus.
We do need someone to compete and help Nvidia push real-time rendering forward. Nvidia have definitely been leading the way in pushing what's possible: physics-based real-time lighting; upscaling to greatly improve the performance budget developers have to work with and to fix the image quality problems caused by TAA and FXAA (a bigger performance-budget increase than when we went from MSAA to TAA, more than mesh shaders added, more than screen-space effects added, more than mipmaps added); the Streamline SDK to make adding new features simple and quick for devs; RTX HDR to improve HDR in PC games; RTX IO to improve asset streaming and loading times; upcoming neural texture compression to improve texture quality while reducing texture size (allowing less bandwidth to be spent on textures); and RTX Remix to give modders tools to update older games.

People like to claim Nvidia gave up on gaming, but the truth is they are making more money than ever from the gaming market. They are the company not skipping out on the high end, and the only company adding new tools and tech to increase fidelity and reduce dev times. Intel is doing a decent job trying to make low-end/mid-range cards with modern features. Nintendo is using Nvidia and will have its feature set on the next Switch. Sony is adding an NPU and using a bespoke upscaler for the PS5 Pro. It's not Nvidia that gave up on gaming, and it's not their fault that others refuse to try to compete. It's companies saying: we will match them in a single metric, be two generations behind in everything else, and sell our cards for 10 percent less.
I've owned an Arc A770 16GB since late 2022. I'm still really bummed by the following:
- Inferior OpenGL support compared to Nvidia
- Discontinued non-VR stereoscopic 3D in the driver (it was present at launch and is not present now)
- *Still no VR support!*
The A770 got the relevant driver updates in January 2023. The A770 is a bit of an odd card: at 1080p you could say it is behind the competition, but we are talking high FPS, and in reality it makes very little difference (point 1). At 1440p the drop-off is a lot smaller than the competition's, pulling level if not ahead on FPS against comparable cards. At 4K it is well ahead, where it shines as the entry-level card with 16GB of VRAM and a 256-bit bus. I use mine at 4K and it works well at mid and high settings, and it can also handle RT most of the time.

Peter let the cat out of the bag and most missed it: Arc paired with an Intel CPU gives better performance. Arc almost doubles its performance with an Intel CPU via ReBAR/E-cores, but you do not see this performance in the FPS count (point 2). This is why the experience is often a lot better than the competition's with double the FPS count. None of the sponsored channels are willing to say this; GN sort of did, letting Peter demonstrate how Arc works. The speed of rendering, where Arc shines as a productivity card, gives you a good hint of where the performance actually is; hence Intel offering a tool for pairing. Also need to point out that Intel XeSS XMX is different from XeSS DP4A; this is often missed, or mixed up on purpose to show worse results. It also overclocks really easily for a good performance boost: 2786MHz at 1.16V at 271W; base is 2100MHz (2400MHz boost) at 225W.

Point 1: Some games are FPS-capped, so for example at 1080p, if the game is capped at 200 FPS, you often get around 100 FPS with Arc. Peter pointed to the pairing with the CPU, and here you double the performance: it would still show 100 FPS, but with no waiting. In reality that 100 FPS performance equals the 200 FPS performance from the competition. At 1440p the cap allows Arc to put up better numbers, and at 4K you see even less of a drop versus the competition. I believe this is mostly about the capped games, and the higher the resolution, the less restriction there is for Arc.

Point 2: This means every frame counts and there is no stutter. 100 FPS is 100 FPS, not like 200 FPS from the competition where the extra 100 FPS is there to paper over badly timed and missed frames with long frame times. A clean 100 FPS is often better than an imperfect 200 FPS.

Lastly, I would like to point out Intel Application Optimization (APO). This is upcoming performance that will further boost the Arc and Intel CPU pairing by 10-50% depending on the application and game. It will be relevant for Alchemist, more so for Battlemage, and standard with Celestial. So the A770 is just going to get better and better with time.
I got the A770 Titan for $280 and I really think the NO STUTTER part is 100% true. I get slightly lower FPS than my 6700 XT, but there is close to 0 stutter at 1440p! I also agree it will get better with age, like cheese!
@@Miley_00 The 6700 XT (12GB VRAM, 192-bit bus, 230W) loses on specs by a fair bit. At 1440p it might not make a huge difference yet, but it couldn't do 4K. On Arc I find that as soon as you are over 40 FPS, most things are very playable. You can't say the same about the competition.
Developers have some control, but not much. In the end, full GPU optimization still relies on the GPU vendor. It is not the same as low-level optimization on consoles.
Alright, an equivalent would be something like an i5-10400F + RTX 3060 Ti or Arc A770, and I can assure you, if those are the minimum spec, the lowest games will look is maybe AC Origins at very high with ray tracing enabled.
I think Intel should keep pushing. Going from absolutely nothing to an OK GPU is strong; if they double their efforts, they can be competitive, and they can benefit significantly once they get there. Mobile GPUs will benefit and the stock will recover. Also, they've invested far too much to just abandon it.
The problem was the hype. They said it was going to be comparable to the 4070 for a fraction of the price. The reality is it competes about as well as a slab of tofu competes with Ruth's Chris for satisfying your hunger. They should have just said they were releasing a low-end video card. Then, instead of disappointing, its slow improvement would have been a very positive story.
They did not say 4070, as that card didn't even exist on the market at the time of the A770's release. It was supposed to be 3070 performance, launching in the middle of the pandemic scalping to drive down prices with millions of new cards. What we got was typically 3060 performance (much worse in older games, to the point of being broken), broken and buggy software, and a card arriving not just after the market crashed but so late it launched the same day as the 4090. It's up to around 3060 Ti performance and the software actually works now, which is pretty good, but older games may still see issues.
I have been saying this for the past three years. We have no choice!! Nvidia no longer cares about the gaming market. We only have AMD and Intel. So Intel needs to put more emphasis on their gaming graphics. There is no way either AMD or Intel will catch Nvidia in AI. So the gaming market that Nvidia left in the trash is there for the taking. Someone needs to step up and take it.
Nvidia won't drop the GPU market because their philosophy is to be the best in every sector. It's just that their GPUs' value will be fairly low despite the high price, and they are trying to compensate with questionable features. They are cutting performance again with the 50 series because they don't care, but they will still be manufacturing GPUs for the consumer market.
@@Greenalex89 They only care to stay in the GPU market because leaving it entirely would create exactly the opportunity for Intel and AMD to grow... which they don't want.
Intel has already caught up in the AI segment with their Gaudi 3 chips. They're also significantly cheaper than H100s and H200s.
Yeah... no way AMD will ever catch Nvidia. Just like AMD was never going to catch Intel.
It is not that Nvidia did not care about the gaming market; it's that gamers did not want to pay high prices for it. They did not want to accept the reality that GPUs are getting expensive to make. They want their GPUs to be cheap just like a decade ago, while wafer cost alone has since increased more than 4x for the same die size. Intel already said they are not making a profit on Arc even when selling the cards at their original MSRP. This gen, AMD did not even dare to use 5nm on their low end.
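To illustrate the wafer-cost point with purely hypothetical numbers (these are made up, not actual foundry prices), per-die cost scales linearly with wafer price when the die size stays the same:

```python
# Hypothetical wafer economics: a ~4x wafer price increase at the same
# die size means ~4x cost per die, before yield is even considered.
WAFER_OLD_USD = 4_000.0    # assumed older-node wafer price (made up)
WAFER_NEW_USD = 16_000.0   # assumed modern-node wafer price (made up)
DIES_PER_WAFER = 250       # same die size -> same dies per wafer

old_cost = WAFER_OLD_USD / DIES_PER_WAFER
new_cost = WAFER_NEW_USD / DIES_PER_WAFER
print(f"cost per die: ${old_cost:.0f} -> ${new_cost:.0f} "
      f"({new_cost / old_cost:.0f}x)")   # $16 -> $64 (4x)
```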
I adopted Arc at launch and the journey has been interesting and fun despite the challenges. I knew there were going to be some rough patches, so I stuck specifically to DX12 titles and used Intel's "report a problem" link whenever I encountered issues. Intel gave me a number of opportunities to work one-on-one with engineers and even do some live screen shares so they could observe my experiences live. All in all, I am EXTREMELY happy with the A770s that we have here and look forward to Battlemage when it launches. Ultimately, what attracts me to these products is that, as a 32+ year custom PC builder, it is so refreshing to have a genuinely new product to work with versus the very old and stale Radeon and GeForce brands. Not to mention that the price creep from those two brands is beyond the pale of comprehension, driven largely by shareholder greed. Based on what I have experienced so far with Arc, I see no personal reason to buy another Radeon or GeForce product for the foreseeable future. I just can't justify their ridiculous prices for an occasional pastime.
If they make a white Battlemage that rivals a 4070 Ti/4080, I might actually explode 💀.
I was super impressed by Intel's (modern) first try, but I got a barely used Nitro+ 6600 XT for 140 bucks lol. I'm either going to snag one when they drop or get a 7900 GRE for 1440p.
Nvidia makes nice cards and all, but they charge more and more every generation and gut the specs so that the only performance gains come from software (which costs them nothing extra to make). They've been neutering the VRAM, bus widths, speeds, and so on, while at times charging basically double what AMD does. And I don't like the look of most of their cards lmfao
@@cuteAvancer The A770 is already a better experience at 4K than the 4070 Ti, which can't manage it.
I've also had great feedback from Intel support.
@@paulboyce8537 lol no
@@paulboyce8537 Fake news. It's a good 1080p card that can sometimes do 1440p depending on the game. At 4K my A770 16GB LE struggled to reach 60 fps in most modern titles.
Intel Arc is very important to the future of mainstream PC gaming. Nvidia have shown that they are no longer interested in the low and mid-range, only in the $1000-and-up cards. In fact, they're not even particularly interested in GPUs at all at the moment because of AI.
AMD are incapable of competing at the high end, so they have a virtual monopoly in the sub-$500 market.
Steam hardware surveys seem to suggest that for the vast majority of gamers sub-$400 is the sweet spot, so if Intel can be truly competitive here, they could get a reasonable slice of that market.
Looking forward to Battlemage but even more so Celestial & Druid. 🤞🏻
AMD has a monopoly only in Reddit comments; Steam surveys show they don't even make up 10% of all GPUs.
@@InfTlr The point is that AMD doesn't have a market-share monopoly, only a "de facto" monopoly. They have the monopoly on "non-Nvidia graphics cards"; that's why they can get away with ridiculous prices (not as ridiculous as Nvidia's, but still ridiculous), because for budget ballers they're the only real option. Market share is not really in the equation.
they attacked mid-range because they're a problem. get it right
A former anti-consumer monopolist like Intel is now our only hope in the GPU market.
Things went downhill hard in the past few years.
To be fair, Intel kept AMD alive. Yeah, most likely because they were afraid of becoming a monopoly. But still.
Intel is calling out AMD and Nvidia for being too greedy and boring. That's why they attacked their mid-range.
As an AMD convert from Intel, honestly, we need them. Nvidia, in their dominance, has treated gamers, the core audience responsible for that dominance, like absolute trash.
I got an Arc card when I built my computer earlier this year. The only frame-drop issues I've had were when messing with shader settings in Minecraft Java and when spamming particles in Dungeon Defenders 9 (though it was still playable). Overall I like my purchase of the A770 and would be inclined to recommend it to other people who need a new graphics card.
As an AMD guy, I've been praying for Intel to succeed, and as an XTX owner, I've been hoping to be able to switch to Battlemage if performance is comparable. Because ultimately, I'm pro-competition first and foremost.
Arc had many advantages, like AV1 encoding support in programs like Handbrake right out of the gate. AV1 is an absolute game changer. Nvidia only added broader support for it in May 2024, despite the 40 series having had at least one AV1 encoder since its release in 2022; until then, they had only enabled AV1 support for OBS.
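For reference, here is one way to reach an Arc card's AV1 hardware encoder from the command line via ffmpeg's Quick Sync (QSV) path, wrapped in Python. This is a sketch, assuming an ffmpeg build with QSV support; the filenames are placeholders:

```python
import subprocess

# Transcode a clip to AV1 on Intel Arc via the av1_qsv hardware encoder.
# Assumes ffmpeg was built with QSV enabled; filenames are placeholders.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",       # decode on the GPU where possible
    "-i", "input.mp4",
    "-c:v", "av1_qsv",       # Intel hardware AV1 encoder
    "-preset", "medium",
    "-b:v", "4M",            # target video bitrate
    "-c:a", "copy",          # pass audio through untouched
    "output.mkv",
]
subprocess.run(cmd, check=True)
```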
I actually bought an A770 LE; I've had it since launch month or so. The improvements have been very welcome, and it's not far away from an ideal situation, imho. Once the DX11/DX12 rework is rolled out, as was hinted at in Gamers Nexus' 2024 revisit, I have a feeling the A770 will be one of the best modern fine-wine GPUs out there.
I already bought 6 cards: 2x Sparkle A310 Eco (single-slot AV1 encoder), 2x A380, 2x A770. I did my part!
I have bought 2, the A380 and A750, and am getting ready to buy an A580 soon.
why 6?
Minted?
Have an A770 in my gaming/general-purpose rig (to be replaced with a B980), planning on one A60 Pro for my workstation, and either another A60 or a mid-range Pro Battlemage card.
Yeah why did you buy 6 cards?
I'd like to see Intel become the third big player in the GPU market.
I use a Sparkle A770 for VR and I gotta say I really enjoy the performance. Seeing that I play VRChat, that 16GB of VRAM comes in handy big time.
Also have an Arc A770 (ASRock) in my main rig and really hoping it gets native VR support soon, as that 16GB of VRAM would be nice in VRChat. Having to use a 3060 Ti in a different build for now.
I'm rooting for Intel
why
The gaming community clearly has a short memory.
@@GrainGrown Rooting for Intel to succeed in GPUs is different from rooting for Intel to succeed in CPUs.
@@lucazani2730 With Apple as a new player I root for them all. :)
I think their most expensive mistake would be the self-destructing 13th and 14th gen CPUs.
I have a 13th gen Intel CPU, and it's alive and kicking, no self-destruction by any means.
@@artmanrom Mostly the 13700, 13900, 14700, and 14900 and their variants are dying in servers and crashing in heavy workloads, and even in games, according to various outlets. Search online and you'll find the news.
@@artmanrom They work fine but are basically receiving too much voltage, slowly degrading the CPU bit by bit. Literally.
It is true that you wouldn't feel it for the first maybe 2 to 5 years if you are a light user, say 2-hour gaming sessions per day and some hours on the internet.
But the heavier the use, the faster the degradation goes.
No reason to panic, though, as Intel has a fix to save those CPUs.
Just upgraded to an Acer BiFrost Arc A770 16GB from my Vega 64 3 days ago. I was honestly worried that I was making a terrible mistake, but it's currently £230 at Currys which is an amazing deal for a GPU with 16GB of VRAM.
Obviously haven't spent much time with it, but so far I'm very, very pleased with it. I'm addicted to the Resident Evil 4 remake and it runs like a dream on it. You can crank the textures all the way up which makes the game use almost 15GB of VRAM and there's no issue.
I have had stutter in Overwatch 2 after switching to it. It's not common and doesn't last long; it's probably around 4 seconds every 3 matches or so. I play it casually and not that often so it's not something I mind too much, but that'd be a deal breaker for a lot of people. I think esports titles should be the #1 priority honestly, as those titles are always popular and the players tend to be the most vocal about performance issues.
I'm just happy the card is far more stable than I thought it'd be. No random BSODs or black screens and great performance in the titles I've played. I would have felt sad going for the RX 7600 or RTX 4060 over this as those cards simply have much less "GPU" for your buck if you know what I mean. The A770 having 16GB of VRAM on a 256 bit bus and plenty of cores for ray tracing and AI upscaling just make it feel so premium despite it being so cheap. It should age beautifully.
I just think it manages to put AMD in a very rough spot at least. Yes, the RX 7600 is faster at 1080p in the vast majority of games without ray tracing. As soon as you start going above 1080p and enabling ray tracing, the A770 shows how much more well-rounded of a GPU it is. XeSS needs more adoption, but it certainly looks cleaner than FSR does in the games I've used it in.
Intel have managed to hit a sweet spot between the features Nvidia offers and delivering raw performance. AMD selling the 7600 for £30-£40 less than the 4060, with no real match for the fancy features it offers, doesn't feel compelling. Intel doesn't need to be selling a GPU with 16GB of VRAM and a 256-bit bus for around £250 given the features it has, but it sure feels like AMD does.
The Arc Control window was unresponsive for months until I found a post on Reddit that identified the issue as ASUS's AI noise cancellation technology. I disabled those audio devices, and everything worked normally.
I have the A750 from Intel. The card has a few issues. Idle power draw is still insanely high, even after enabling native power management in the BIOS per Intel's guidelines (the Windows-side setting is sketched after this comment).
However, the GPU has been very good in the games I play; most are DX12 or just aren't that intense.
I also do a fair amount of video transcoding, and here it excels in Handbrake; I often get 20:1 compression ratios when transcoding.
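On the idle-power point above: the Windows half of Intel's guidance is setting PCIe Link State Power Management to maximum savings. A rough sketch using powercfg's built-in aliases, assuming the BIOS side is already configured; run from an elevated prompt:

```python
import subprocess

# Set "PCI Express -> Link State Power Management" to Maximum power
# savings (index 2) on the active plan for both AC and battery, then
# re-apply the plan so the change takes effect.
for flag in ("/setacvalueindex", "/setdcvalueindex"):
    subprocess.run(
        ["powercfg", flag, "SCHEME_CURRENT", "SUB_PCIEXPRESS", "ASPM", "2"],
        check=True,
    )
subprocess.run(["powercfg", "/setactive", "SCHEME_CURRENT"], check=True)
```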
I have the ASRock A770 you were showing in the video and it's been great in the 6 months I've had it. One of the main reasons I bought it was for video editing/recording, and it's very good at this. It also does well in games at 1440p. I'd actually call the drivers pretty well optimized now: I mostly play niche JRPGs and haven't had any driver issues with them (for example, Scarlet Nexus and Persona 4 Golden work great, and those games, especially Scarlet Nexus, aren't anywhere near mainstream titles). I know I'll be recommending Arc GPUs to others.
I have the A770 and it has only gotten better with time. I really only play World of Warships, and I had issues at the start; now it is solid. I figured it wouldn't take Intel long to work things out. I'm looking forward to the next level of Intel GPUs!
My guess is that corporate greed came back to bite them, because the higher-ups didn't let the developers and engineers do what had to be done. You just can't afford to make this kind of mistake when entering a market that's been controlled by two companies for decades. Seeing just how well their cards perform now, they had a ton of potential to be major competitors, but that all went down the drain with a hundred billion dollars and one train wreck of a launch.
Personally, I don't see Intel pushing into the discrete GPU market just for the sake of it; I think their real goal is datacenters ("AI", or better said, "crunching numbers and datasets").
They just needed someone to test their products and sustain some of their investments.
Just think about it: their cards are DirectX-agnostic; you can use the cores however you want.
*Thank you!* 🙏🏼 🤗
I own the VERY A770 you had in front of you during the video (I'm a dyed-in-the-wool *ASRock* fanboi and _I don't care who knows it!_ 😊), but I have yet to install it because I'm working on 4 different builds simultaneously (all with ASRock 600 series motherboards 😁) and my focus has been on speccing them all out, grabbing the individual components whenever they go on sale, and then paying my credit card balance back down (so I can grab more Stuff™ 😆).
Soon (hopefully during July) I'll be completing this first (A770) build (the case I'm using for it should be arriving next week, along with an AIO and some extra case fans), but in the meantime I'll be lying in wait for the *Prime Day* sales so I can snatch up an (ASRock) AMD GPU (probably a 7800 XT or 7900 GRE) for one of the other builds. But I *WILL* be snatching up Intel's _very first_ Xe2 (a.k.a. Battlemage 🧙🏼♂) offering when they drop later this year. If at all possible (and I'm sure it will be), it's going to be an _ASRock_ product.
I like your style and manner of presentation, and I _really_ appreciate that you've been doing this series of videos. 👍🏼
Please keep up the great work!
Stay safe & be well. 🖖🏼
Thanks Mike! Honestly, it's always really nice to see a comment from you down here 😊
Thanks for being awesome and stay safe too dude!
I bought the A770 at launch knowing that the drivers would not be optimal for at least a year after release. I have been building PCs for a very long time (35+ years), so I have seen the early days of all the GPU makers, and Intel started out like AMD and Nvidia, with teething issues; no difference. But they kept at it, and now the card is starting to see the kind of performance everyone has come to expect from Intel. Today's PC builders are a lot different from when I began. Back then, computing was a hobbyist culture where things didn't just work out of the box; it took time and a ton of effort to get a computer running. Those days are long gone, which in a way is a real shame.😊
I do not miss those days, when it took hours if not days of troubleshooting to figure out which part was faulty. It was nice, however, to be able to get a replacement part from the store right away.
@@Hardwaregeekx Then you must remember the feeling you got when, after all the troubleshooting, you finally found a solution to the problem. Plug and Play took all of that away, as well as the skill set necessary to find the solution in the first place.
@@pyrielrising4338 The only thing I miss is choices. There used to be so many more choices. Even if I never intended to use a Cyrix CPU or an S3 video card, it was nice to know there were options out there. And it is sad that Abit is gone.
I don't miss the days when hours could be lost trying to get sound cards and peripherals to work; things are much better now.
@@roasthunter No more Sound Blaster, no more Aureal. It's all just integrated Realtek now. It is kind of nice not having to worry about it, but it is scary how the industry consolidated.
If Intel has it dialed in by the time I build my next PC or need an upgrade, I will be converting.
me too!
There are a lot of possibilities for my new PC: AMD CPU and Intel GPU, both Intel, Intel CPU and AMD GPU, or even a Snapdragon SoC with an Intel or AMD GPU.
When I had a 6700 XT I preferred XeSS for upscaling instead of FSR. Now, on a 7900 XTX, I don't use upscalers, but if I needed a card in that price segment, Intel Arc would be a serious competitor, especially for rendering and in terms of raw hardware/potential.
I'm mostly impressed by how far they've come in such a short time with their driver support, and I'm still curious how Battlemage will perform and at what price.
Now I'm curious whether DX9 games would perform better on a Linux-based OS with Proton.
To be honest, I'm very happy with my A750 ORC OC.
I play Battlefield 2042 at 1440p, medium to high, at 60-80 fps.
CS2 is now also stable at 1440p at 120 fps, medium/high.
Games like Age of Empires 4 etc. run just amazingly. So I'm happy as a user, especially for the 220 euros I got it for. In lots of games it beats the RX 7600, which costs 310+ euros.
In Time Spy (3DMark) I get 12000+ points. The GPU is powerful and the drivers are hit and miss, but they generally release updates once or twice per month with fairly nice improvements. So, nothing to complain about.
It's kind of funny that over time it's actually getting harder to find Intel Arc cards now. Like, I wanted to buy a low-profile A380 just for AV1 encoding and I can't find one from my usual sources, which is really weird, and kind of nice.
Always been an ATI/AMD fan, and they went through the same process. (I used to mod ATI/AMD drivers to push them for more performance and make them work across their whole range of cards, "all in one package", before they did it themselves.) It took team red two decades (same for team green, by the way). If anything, I'm pleased to see what Intel is doing, and the timeline they are doing it in.
I got a new computer at work. Needed a video card because I use 3 monitors. I don't game on it, but I need good video for opening and navigating large blueprint documents. I got the A750 when it first came out, because it was cheap and powerful. I was terrified of it for a while because of all the issues people were having; I never had any problems with it. I then got an A770 for home. That one ended up in a box, replaced by a 7800 XT.
I bought one for my sister's rig. She mainly plays DX12 titles. It works pretty much all the time, with small glitches here and there. She games on a 1080p ultrawide @60Hz. The A770 doesn't even sweat.
I bought my ASRock Phantom Gaming Arc A770 16GB OC card in August 2023 for a product review and it's been a good choice since day one, so much so that I decided to build a balanced Intel build around it a few months ago, and I'm glad I did.
Have an A770; Arc drivers are in a much better place now. Still have issues with Arkham Knight and Kingdoms of Amalur Re-Reckoning, but that's about it out of the slate of games I usually play. Everything else, from Fallout to Assassin's Creed, generally plays well.
Thank you for the info. I play those two games quite often, so I'd better wait before I go Intel for my GPU.
I'm glad for people who just want to test things out with these cards, but I just want things to work. Oh, and what I'm missing in this video is any mention of power consumption; as I see it, they are not that efficient. I sure hope they will be in the future.
I got an A750 in March ($200 new) to do my first PC build in over 10 years. I run Linux and mainly Blender. The only issues I had were when Blender updates needed an updated driver; I reported it to Intel and they got a new driver out in a few days, sometimes in hours. It was a huge step up from a Dell desktop + GTX 1650.
I don't do much gaming on it, just 0 A.D., Warzone 2100, and OpenTTD. Most gaming I do is on PlayStation.
I use an Arc A770 and it has served me well. I do both gaming and editing. For gaming it's pretty good, with no problems in the games I play (Skyrim, TF2, Madden, Fortnite, Cities: Skylines, Cookie Clicker, Poker Night at the Inventory, Civ 6, Sims 4, and Minecraft). My only issues come with editing, where it seems Intel has not worked out the issues with GPU acceleration, giving constant artifacts when exporting that way. The workaround is easy, just using the CPU for exporting, but that means I can't use fancy editing techniques like motion transitions, making my edits sometimes look cheaper.
I bought a Sparkle A770 Titan about 2 months ago and have been running it on Linux. It's performing fine, with one issue that may well turn into a deal breaker for gaming. There have been mixed reviews from people saying the HDMI ports suffer from random small blackouts, but that it seemingly did not affect the DPs. My card has this issue on every single port, HDMI or DP. Playing a game where death or quick reaction time matters, a random blackout of up to 3 seconds is enough to ruin everything for you. The blackouts are not extremely common, but they do occur a handful of times a day. This is not safe to game on, for me personally.
I had blackouts when playing OW2 on my old GTX 1660 6GB; since I switched to my 3060 Ti OC, none of those blackouts have occurred.
I'm running an A380 currently and have had zero problems. I play mostly single-player offline games.
I love my Arc A770, and it's installed in an Intel NUC 12 Extreme i9. Performant and silent. Driver updates come very regularly.
Just put together a rig with the A580. Will be benchmarking tomorrow.
Just curious to know for sure: is the game 'RoboCop: Rogue City' optimized for Intel Arc? Because there is an Intel Arc splash screen as the game is launching.
Intel was the only big company that could still come out with such a good performer. If they didn't have iGPU experience, it would've been a waste from the start.
Pulled the trigger on a used Arc A770 Acer Bifrost, and so far so good.
I have been using an A750 for about 6 months. If it helps people, I paired it with a 12600KF processor. In these 6 months I played Black Desert Online, Path of Exile, Dota 2, PUBG, GTA San Andreas, Hitman 2, Frostpunk, Albion Online, and Ghost of Tsushima completely without any problems and with ultra settings.
Pardon me, but all of those games are old, and some are VERY old. GTA San Andreas is very popular again because it offers a really great experience even on mobile devices, so Intel has provided good drivers for it even though it runs on DirectX 9.0c.
@@artmanrom Yes, the drivers are mature enough, I have no problems. It gave me much more than its price.
I got an A750 LE for $284 AUD, an absolute bargain, and I'm super happy with my card. But as you went through that list of games not supported well, it occurred to me that I luckily don't play any of those games. :D Thank Odin!
I switched from Nvidia to an Intel A770 16GB and love it.
Note that I only use DX12, and only for Flight Simulator 2020.
Ready to purchase the upcoming Intel Battlemage GPU.
Looking at their 13th and 14th gen issues, Arc might not have been their most expensive mistake.
I got an Arc card at Christmas 2022 and it was OK, but when I got a new PC and used the A770 LE it was so much better, and the stability is much better. When I first got it I had artifacts when playing video, but not in games; I have not gotten any artifacts in many months.
It's been about a year since I built a PC strictly for the A750. Now it's July 2024, and I wouldn't dream of pulling that card out of that machine. It's an 11th gen i5, and with the A750 it runs very well. I use it mainly for testing my Steam library games.
I'm watching this video on an Arc A380 LP.
I originally got Arc as I needed a new GPU to replace my 3060, because the games I was playing needed more VRAM than I had available. In the first few months of usage games crashed rarely, and over successive drivers I noticed increased performance over time. Very happy with my Arc.
I have used a Sparkle A770 16GB with an AMD CPU for around 2 months, and so far it works great on Windows and Linux with zero issues.
I was team green 100% before the Intel Arc, but I am now team blue and looking forward to the new releases. My hope for Intel is to release a card that knocks on the door of Nvidia and Radeon and forces them to stop hiking prices, so that they become more competitive, as people want and need graphics cards but can't afford $2-3k for a card. I will be buying the next team blue card.
You forgot that Intel, before AMD's Zen, had really overpriced CPUs.
I have had both green and red cards but Intel stepping into the space has been one of the most exciting things to happen in the pc tech space since I started building. I am looking forward to their next generation and what it will mean for the market.
For me the only downside to ARC now is the power consumption compared to performance. If Battlemage can deliver more fps and much better efficiency, then I'll view it as a viable alternative option.
Well, Intel is betting on efficiency for their Battlemage GPUs, so no need to worry.
For Battlemage to be more power efficient, Intel needs to use 5nm at the very least, and that will make those BM cards expensive. Gamers will then complain about the price.
@@arenzricodexd4409 I mean, they did use TSMC's 6nm node for the first-gen Alchemist Arc GPUs, so of course they're going to use the 5nm-class N4 node for the second-gen Battlemage Arc GPUs.
@@curious5887 And that will make Battlemage expensive. If Intel still needs to sell BM at cost or even below it, then BM will be their last discrete GPU.
Intel's biggest mistake was Itanium.
I was tempted to go for an A750, although the cost went up in my area around the time the RX 7600 came out, making it worse in terms of price to performance. My cousin's GPU died, so he has my RX 7600, which was my guest PC card; I currently have no GPU in that PC and am still tempted to pick one up.
Edit: the A750 is now really cheap, less than a 3050 6GB or an RX 6600.
I bought my Arc A750 for €170 and put it in my rebuilt computer 9 months ago. I bought it just to test how Intel's cards worked, with a view to perhaps buying Battlemage, or at least having it among the alternatives... ironically it works so well that I don't know if I'll buy a Battlemage or wait for a Celestial. It was already working very well, because Intel's work had been extensive, but I have to admit that in these 9 months it has also been intensive, and the card performs better with each update. Its big problem is that games mostly do not use XeSS the way they do DLSS and FSR (and although FSR is honestly worse than XeSS, at least it is an alternative when XeSS is not implemented).
So although XeSS is a better solution than FSR for super scaling on any graphics card, let alone XMX mode on cards with XeSS hardware, the reality is that Intel's real limit is still the game developers who have to implement it. Perhaps with the Windows 24H2 version this will change, as games call up the Windows libraries and those use the most successful solution, but for now the adoption of XeSS is the most serious issue that Intel has to tackle. Not only for the sake of Arc, but also for their future Battlemage.
This was a good video, thank you. I've been considering an ARC A580 to swap out my RX-6600, and give my RX-6600 to our daughter. She's running a Radeon 5700, and her room gets HOT. I run the 6600 and it's cool. My CPU is a 5600X, daughter is running a 5600. I could get an A580 and give my 6600 to my daughter, the 5700 can then be sold I guess.
I have an Arc A770 and I have had an almost flawless experience.
The only problems I have had have been with the BIOS, but I got it fixed and now I haven't had any issues for about 2 months.
Edit: XeSS has also been a really good upscaling technology which, if implemented correctly, looks almost as good as DLSS.
I'm still waiting for them to officially support VR. In any other scenario, the GPU rocks, but on OpenXR/Vulkan API related things, the driver has missing instructions, but the capability is there.
I have an Acer Predator Bifrost A770 16GB. Amazing card. Handles Helldivers 2 and War Thunder like a champ
I would love to continue messing with my A770, but when I got my 7900 XT I upgraded my PSU and it no longer has a 6-pin, sadly. Arc is definitely good for budget gamers and even streamers. Intel had AV1 support before AMD and Nvidia. Strangely, Nvidia was the last to implement it into OBS, and even more strangely they have the most stability issues and bugs with AV1. Both Intel and AMD seem to have solid implementations there, from some of my own testing.
You're underselling just how much of a budget banger the Arc cards are, especially if you do streaming or video encoding.
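(As a concrete, hedged illustration of the encoding point: with an FFmpeg build that includes Intel's Quick Sync support, an Arc card can transcode to AV1 in hardware via the av1_qsv encoder. The file names and bitrate below are placeholders, not recommendations.)

```python
import subprocess

# Sketch: hardware AV1 transcode on an Arc GPU through FFmpeg's Quick Sync
# encoder. Assumes an FFmpeg build with QSV/VPL support; input/output names
# and the bitrate are illustrative only.
subprocess.run([
    "ffmpeg", "-y",
    "-i", "input.mp4",
    "-c:v", "av1_qsv",  # Intel hardware AV1 encoder
    "-b:v", "6M",       # target bitrate; tune for your streaming target
    "-c:a", "copy",     # pass audio through untouched
    "output.mkv",
], check=True)
```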
Totally surreal to see Intel, who used to be totally anti-consumer with their CPUs for nearly a decade (with only 4 cores and hardly any significant performance improvements) until AMD Ryzen launched, be touted as the pro-consumer alternative for graphics cards now. But if Battlemage comes out strong for the mainstream, with price to performance, I'll be trying out team blue.
Hey, they were anti-consumer before that, even bribing OEM execs not to sell the AMD Athlon, which had an edge on Intel's Pentium.
There's a reason AMD didn't make profits despite successful chips that killed Intel's proprietary 64-bit Itanic.
@@RobBCactive None of these companies are consumer friendly. They only become more tolerable when they lose market share and are trying to win us back. I have no allegiance to any of them.
@@RobBCactive AMD did not make a profit for nearly a decade because of their ATI acquisition. Intel did not play fair during the Pentium 4 era, but their Core architecture was very competitive with AMD's Phenoms. AMD still got decent profit with Phenom and Phenom II, but the issue is AMD spent a lot of money acquiring ATI. If AMD had been more clever in this regard, they would not have been starved of money to compete with the resurgent Intel back then.
@@arenzricodexd4409 Actually Intel acted illegally in seriously criminal ways, and there were many legal cases brought worldwide against them. That included actual bribery.
Starving companies of profits means they become under-resourced.
Intel attempted to fix the "error" of x86 licensing, not only welching on legal contracts to monopolise the PC CPU market, but later trying again at 64-bit.
For those who saw the fishy goings-on and wondered, then saw Intel's monetization of the near-monopoly, seeing Intel portrayed as a competition champion is not credible.
Intel AXG was the strategy to defend the lucrative near-monopoly in the data center and, as integration continues, undermine Nvidia's graphics business in client computing. Just as Nvidia was frozen out of the chipset business.
No way Intel planned to lose a lot of money on Alchemist, and the lack of follow-up to a card that was already late to market in 2022 is part of that.
I’m thinking of buying one for my new PC build, gaming at 1440p, but I'm hesitating between this Arc A770 GPU and the RTX 4070 Super. Can someone help me decide, please? Thanks. Or should I keep my 1080 Ti and wait for improvements on the A770?
I would totally wait, bro!! I have the A770; it would be a nice upgrade, but the 50 series will come out soon, and Battlemage will be like 6 months away, I bet... I'd wait for Nvidia. I have not been back since my 980 Ti, and I really miss Nvidia after the AMD 5700 XT and A770.
Intel Arc has a lot of potential. My hope is that more AIB partners, like Sapphire or XFX or Gigabyte, might start working with Intel to offer different versions of Intel's graphics cards.
I've had the a770 16gb since launch and I really like it. Sure there were and still are some issues but I am happy to have it to play with.
How many years of support can we expect on Arc cards? I want to build a future-proof PC, but I'm a bit scared that out of nowhere Intel will end driver support. But surely not, as they are relatively new, right???
Pay closer attention to the 1% lows; if these increase, it means your experience will be smoother.
And just looking at the launch drivers vs. the most recent drivers explains why so many games have massive FPS increases, e.g. they improved the 1% lows.
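(For anyone unfamiliar with the metric, here is a minimal sketch of one common way 1% lows are computed from a frametime capture, e.g. a PresentMon or CapFrameX export. The sample numbers are made up, and note that some tools use a percentile cutoff instead of averaging the worst 1%.)

```python
# One common definition: average the slowest 1% of frames, convert to FPS.
def one_percent_low(frametimes_ms):
    worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
    n = max(1, len(worst) // 100)                # worst 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 950 smooth frames at ~60 FPS plus 50 hitches at ~30 FPS:
frametimes = [16.7] * 950 + [33.3] * 50
print(f"1% low: {one_percent_low(frametimes):.1f} FPS")  # ~30 FPS
```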
Hi, could you put this card up against a Quadro? It could be a great alternative for a content creator.
I run an A750 on a Ryzen 7 5700. I usually play single-player games and have had really no issues. I would get some slowdown and some artifacts in the beginning, but they were resolved rather quickly, and having started with PC gaming back in 1987, this is how most of the cards on the market started out. ATI didn't just make a card that was instantly the best on the market; it took a couple of years with the Rage chips, and if you go back and look, a lot of the first-gen Rage chips were used in servers and not as gaming GPUs. Nvidia hit it out of the park with the TNT line, which is one of the reasons we don't have 3dfx anymore, along with a lot of others, but they too had a lot of bumps along the way.
With these talks we downplay AMD's hold on the APU-based gaming market. Just as Intel and AMD are behind in the dedicated GPU market, it doesn't seem like Intel and Nvidia will catch up in APUs either, which is all narrowing Intel's reach.
This video aged like milk in just a month. AMD does better in GPU power efficiency, integrated GPUs, and CPU stability. Being in the industry for a long time doesn't mean Intel does it better; they're still greedy AF, and now their CPUs are dying here and there.
They should not only focus on gaming, but also on creative and engineering application performance; as a student, both are really important to me. They've got the pricing and the potential for future performance improvements (they've already proved that), which AMD and Nvidia are lacking.
We need these Arc dGPUs in the laptop space as well.
I would have bought one last week for a new PC I'm building if they were close to the competition in power consumption.
Running an A770 on an old X99 Xeon E5-2695 v3, and I love it.
Intel Arc Xe2 Battlemage looks very promising. Xe2 is one microarchitecture that scales from Low Power Gaming (Xe2-LPG) for iGPUs to High Power Gaming (Xe2-HPG) for dGPUs. The reason DX9 games are very popular is that most people are using much older cards. When Battlemage comes out there will be very cheap Alchemist cards available on the used market that people can upgrade to from those older Nvidia cards. Intel used F1 2024 to demo the Xe2 iGPU on Lunar Lake: a new game that was not yet released, working well on an unreleased GPU. Intel is working hard to get day-1 game-ready drivers for popular titles. A very different start for Battlemage than for Alchemist. I think more people will buy Battlemage compared to Alchemist by a significant margin.
Arrow Lake plus Battlemage should be a great combination.
My first gaming PC had an Arc A770 LE. I'd say it's well worth the money.
I bought an A750, and though I had some issues with it, I had a good experience overall. I would recommend the A580, A750, or A770 to anybody trying to build a new PC on a budget; it's not really worth spending 800 bucks on an Nvidia card when cheaper options perform better.
I personally would never go with an Intel GPU, but I hope they figure it out so they push NVIDIA and AMD to make some crazy stuff in the future.
I got my A750 in May '23 and all major issues were essentially gone. Sure, I can't get the extra boost from 200 to 300 FPS at 1080p in all games, but I can go to 1440p and keep the same FPS in those games. With new drivers it's only getting better, gaining 10% FPS in a couple of games every month.
I got the LE version and it's quiet as a mouse, but it does draw some power, especially when overclocked to compete with the 7600 & 4060.
In ray tracing & upscaling it's rivaling Nvidia, and the A750 actually beats the 4060 at 1440p in some titles, probably due to its 256-bit bus.
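(The bus-width point checks out on paper. A quick back-of-the-envelope using the commonly published spec numbers; double-check against your exact board, since memory speeds vary by model:)

```python
# Peak memory bandwidth = bus width (bits) / 8 * effective data rate (Gbps).
def bandwidth_gb_s(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(f"Arc A750 (256-bit @ 16 Gbps): {bandwidth_gb_s(256, 16):.0f} GB/s")  # 512
print(f"RTX 4060 (128-bit @ 17 Gbps): {bandwidth_gb_s(128, 17):.0f} GB/s")  # 272
```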
We do need someone to compete with Nvidia and help push real-time rendering forward. Nvidia has definitely been leading the way in pushing what's possible: physics-based real-time lighting; upscaling to greatly improve the performance budget developers have to work with and to fix the image-quality problems caused by TAA and FXAA (that performance-budget increase is bigger than when we went from MSAA to TAA, more than mesh shaders added, more than screen-space effects added, more than mipmaps added); the Streamline SDK to make adding new features simple and quick for devs; RTX HDR to improve HDR in PC games; RTX IO to improve asset streaming and loading times; upcoming neural texture compression to improve texture quality while reducing texture size (allowing less bandwidth to be spent on textures); and RTX Remix to give modders tools to update older games.
People like to claim Nvidia gave up on gaming, but the truth is they are making more money than ever from the gaming market. They are the company not skipping out on the high end, and the only company adding new tools and tech to increase fidelity and reduce dev times. Intel is doing a decent job trying to make low-end/mid-range cards with modern features. Nintendo is using Nvidia and will have its feature set on the next Switch. Sony is adding an NPU and using a bespoke upscaler for the PS5 Pro. It's not Nvidia that gave up on gaming, and it's not their fault that others refuse to try to compete. It's companies saying: we will match them in a single metric, be 2 generations behind in everything else, and sell our cards for 10 percent less.
I've owned an Arc A770 16GB since late 2022. I'm still really bummed by the following:
Inferior OpenGL support compared to Nvidia
Discontinued non-VR stereoscopic 3D in the driver (it was present at launch and is not present now)
*Still no VR support!*
You should say "paid promotion" if you're sponsored.
The A770 got relevant driver updates in January 2023. The A770 is a bit of an odd card: at 1080p you could say it is behind the competition, but we are talking high FPS, and in reality it makes very little difference (point 1). At 1440p the drop-off is a lot less than the competition's, pulling level if not ahead in FPS against comparable cards. At 4K it is well ahead, where it shines as the entry-level card with 16GB VRAM and a 256-bit bus. I use mine at 4K and it works well at medium and high settings, and it can also handle RT most of the time.
Peter let the cat out of the bag and most missed it: Arc paired with an Intel CPU gives better performance. Arc almost doubles its performance with an Intel CPU via ReBAR/E-cores, but you do not see this performance in the FPS count (point 2). This is why the experience is often a lot better than the competition's with double the FPS count. None of the sponsored channels are willing to say this. GN sort of did, letting Peter demonstrate how Arc works. The speed of rendering, where Arc shines as a productivity card, gives you a good hint of where the performance actually is. Hence Intel offering a tool for pairing.
Also need to point out that Intel XeSS XMX is different from XeSS DP4A. Often missed, or mixed up on purpose to give worse results.
It overclocks really easily for a good performance boost.
2786 MHz at 1.16 V at 271 W.
Base is 2100 MHz (2400 MHz boost) at 225 W.
Point 1:
Some games are FPS-capped, so for example at 1080p, if the game is capped at 200 FPS, you often get around 100 FPS with Arc. Peter pointed out the pairing with the CPU, and there you double the performance: it would show 100 FPS but with no waiting. In reality, that 100 FPS performance equals the 200 FPS performance from the competition. At 1440p the cap allows Arc to post better numbers, and at 4K you see even less of a drop versus the competition. I believe this is mostly about capped games, and the higher the resolution, the less restriction there is for Arc.
Point 2: This means every frame counts and there is no stutter. 100 FPS is 100 FPS, not like 200 FPS from the competition where the extra 100 FPS is there to paper over badly timed and missed frames with long waits. A good 100 FPS is often better than an imperfect 200 FPS.
Lastly I would like to point out Intel® Application Optimization (APO). This is upcoming performance that will further boost the Arc and Intel CPU pairing by 10-50%, depending on the application and game. It will be relevant for Alchemist, but more so for Battlemage, and standard with Celestial. So the A770 is just going to get better and better with time.
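(Since ReBAR matters so much for Arc, a quick, hedged way to sanity-check it on Linux is to scan lspci's verbose output, as sketched below. The exact wording varies across pciutils/kernel versions, and the capability lines may only appear when run as root; on Windows, Intel's driver software can report ReBAR status directly.)

```python
import subprocess

# Sketch: look for Resizable BAR capability lines in lspci's verbose output.
# Output format differs between pciutils/kernel versions, so treat a match
# as a hint to inspect the full output, not as a definitive answer.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
for line in out.splitlines():
    if "Resizable BAR" in line or "current size" in line:
        print(line.strip())
```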
I got the A770 Titan for $280, and I really think the NO STUTTER part is 100% true. I get slightly lower FPS than my 6700 XT, but there is close to zero stutter at 1440p! I also agree it will get better with age, like cheese!
@@Miley_00 The 6700 XT, with 10GB VRAM, a 160-bit bus, and 220W, loses on specs by a fair bit. At 1440p it might not make a huge difference yet, but it couldn't do 4K. On Arc I find that as soon as you are over 40 FPS, most things are very playable. You can't say the same about the competition.
I'm prolly gonna finish my all-AMD build, but when Battlemage settles on the market, I'll probably go Intel for my upgrade, at this rate.
Intel cards are beautiful. But the power consumption while gaming on the Arc A580 and A750 is super horrible.
The A770 isn't much better. lol
Gotta break that Nvidia chokehold honestly, getting that Sparkle A380 for an ITX build with sober optimism
They'd better pull a rabbit out of the hat with Battlemage and their next-gen CPUs, given how issue-plagued the 13th and 14th gen i9s have been.
The HUGE CAVEAT with the Intel Arc cards is that only DirectX 12 games are properly supported; games on earlier APIs cannot run properly on these cards.
A mist effect in Wolfenstein: The New Order tanked my A730M. What a garbage occurrence.
I think DX12/Vulkan optimization is more on the developers' side, not so much on the hardware vendors', unlike older DX versions.
Developers have some control, but it is not much. In the end, full GPU optimization is still reliant on the GPU vendor. It is not the same as console low-level optimization.
Check reviews from a few years back:
Intel GPUs were barely able to get 40 FPS in CP2077.
They were able to reach 60 FPS after the XeSS update.
My A770 is at least better than I expected, though not by much. It just needs the drivers to mature.
I still think the A770 has more potential.
Alright, an equivalent would be something like an i5-10400F + RTX 3060 Ti or Arc A770, and I can assure you, if those are the minimum spec, the lowest that games will look is maybe AC Origins at very high with ray tracing enabled.
I think Intel should keep pushing; going from absolutely nothing to an OK GPU is strong. If they double their efforts, they can be competitive.
They can benefit significantly once they get there. Mobile GPUs will benefit, and the stock will recover.
Also, they've invested far too much to just abandon it.
I can't wait for Battlemage.
The problem was the hype. They said it was going to be comparable to the 4070 for a fraction of the price. The reality is it competes about as much as a slab of tofu competes with Ruth's Chris for satisfying your hunger. They should have just said they were releasing a low-end video card. Instead of disappointing as it slowly got better, that would have been a very positive spin.
They did not say 4070, as that card didn't even exist on the market at the time of the A770's release. It was supposed to be 3070-level performance, launching in the middle of the pandemic scalping to drive down prices with millions of new cards. What we got was typically 3060-level performance (much worse in older games, to the point of being broken), broken and buggy software, and a card arriving not just after the market crashed but so late it launched the same day as the 4090. It's up to around 3060 Ti performance now and the software actually works, which is pretty good, but older games may still see issues.