Intel has to cover years of black-magic optimizations done in AMD and NV drivers for every single game. The fact that they're doing it at this speed is really impressive.
I mean it's definitely impressive, but hasn't the software side of things always been what Intel focuses on? Between that and their decades of experience working on integrated graphics, I guess I kind of expected Intel to have almost caught up by now.
Hope they take a bit of an L on the CPU side and focus on stability and efficiency rather than beating AMD on benchmarks.
@@Rushil69420 no one was playing these games on igpus
I realize I'm a bit late to the conversation but has Intel improved things for pre-dx9 games? I play a lot of 90s AAA classics.
@@Rushil69420 simply put NAAAAAAAA, AMD and NV drivers are ahead. Intel had more bugs, and that's why I don't feel confident buying an Intel card.
But I am currently thinking about Intel for my next video card, maybe their next gen, if they level up their drivers and don't cheat.
Things have improved a lot on the Linux side too. In the past couple of years, ARC on Linux has gone from "nightmare to get it working properly" to "just works right out of the box". My main (Linux) desktop currently has an ARC card in it, and I'm happy with it.
I had a lot of problems getting video encoding and decoding working on my A380. That was across Arch and Ubuntu. I'm not saying it's not better, but it's still not quite plug and play.
this is relevant to my interests :D
Depends on the distro, AIUI. Debian stable is still on kernel 6.1 and you really need a newer kernel than that. I mostly care because I want to stay on Debian stable, and Jellyfin supports AV1 encoding on Intel hardware from kernel 6.2 onward.
@@samiraperi467 Yeah, I suppose I should've qualified that with "assuming you're on a recent kernel". I'm currently running Ubuntu 22.04.4 LTS; the .4 point release brings 22.04 up to the 6.5 kernel. FWIW ARC also seemed decent with the 6.2 kernel (22.04.3 LTS) before that.
@@samiraperi467 You could just use the backports kernel, which is 6.6
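For anyone unsure where their install stands, here's a tiny sketch of the version check being discussed; the 6.2 floor is taken from the thread above, and the parsing is just an illustration:

```python
# Minimal sketch: is the running kernel new enough for the Arc AV1
# encode path mentioned above? (6.2+ per the thread; adjust as needed.)
import platform
import re

MIN_KERNEL = (6, 2)

def kernel_at_least(minimum=MIN_KERNEL):
    # platform.release() looks like "6.6.13+bpo-amd64" on Debian backports
    m = re.match(r"(\d+)\.(\d+)", platform.release())
    return m is not None and (int(m.group(1)), int(m.group(2))) >= minimum

print("Kernel OK for Arc AV1 encode:", kernel_at_least())
```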
Old boomer here, was an early adopter of Arc cards. Living on a fixed income and needing a new computer during the pandemic, I bought a Mac Mini M1 due to the cheap price, though I have always built my own too. Finally, after prices settled down, I built a PC with a Ryzen 5 3600G and an Intel A380. I'm not a gamer, and at first I left the integrated graphics on. Intel drivers were really bad, as 4K HDR on YouTube would just white out. Now everything runs smooth. Driver updates are painless. Also, I do some video encoding and the Arc is amazing for that. I finally turned off the integrated graphics in the BIOS. Good job Intel! I was not gonna pay the Nvidia tax, with even a 1650 costing way over $200 a little while back.
Yep, that's why I chose AMD; sadly Intel wasn't an option yet at the time, but it's been nice to see them finally release a GPU lineup and turn it into another very good alternative to the insane prices Nvidia pushes on GPUs that are not as good as an AMD or Intel equivalent on the lower end. The silly argument for Nvidia is raytracing, but the truth is, _NOBODY_ is getting good RT on lower-end SKUs from any company. You want an RT PC? You better be buying the LATEST _and_ GREATEST top-tier SKUs - and that means you're willing or able to spend $3,000 on a PC every year. Meanwhile, I eagerly await Battlemage.
The A770 LE has been one of the best gifts I've ever bought for somebody. Through no doing of my own, it keeps on giving. My brother uses it in an SFF editing rig with an i7 13700 (non-K). This card was the best value dual-slot card at the time, and now it's just a great GPU in general. The goals initially were to have a solid editing rig that could play a few games here and there, and now the Xbox has lost its spot on his desk.
Does it work well for his everyday gaming and PC usage? Old games and new games? Asking for a friend, because the A750 has a great price.
@@ThunderTheBlackShadowKitty yes
Love it when Wendell is hyper-active. More videos, yes!
Get him and Patrick from STH and we get pandemonium in the server room, in a good way... 😊
I am absolutely a fan of my A770. I'm running it with a Ryzen 5700 on Fedora 40 as of this comment and it's just wonderful. Everything supported out of the box, stable and fast! I'd dare even say an AMD CPU and an Intel GPU is a dark horse these days. No, it's not the top of the line performance monster, but bang for your buck, this just can't be beat.
I could see myself getting an AMD GPU in the future if the price was right, but if I need a new card in the next year, I'll strongly be considering Battlemage. The level of support and quality Intel has given on Linux is setting the bar way high.
Can't wait for Battlemage
The more competition in the GPU space, the better.
@@DragunBreathOnly if people will actually _buy_ the competition.
@@benjaminoechsli1941 lots of us did, I know I did and I can't wait for battlemage
Hopefully they don't neglect to include a healermage too, or at least a supportmage
I'll be buying into Arc when Battlemage drops.
I hope they'll have some sensible choices, but irrespective of what the competition offers, I'm ready to give Intel Arc a try.
I've been running the A770 for over a year now and I love this card.
I find it amazing how well these things compete with their contemporaries, being a completely novel line of GPUs.
I am running an A770 with an i7-11700F and so far it's performed pretty well. I do run games at 1440p, using an MSI 120 Hz 1440p monitor. My biggest issue is no sound over DisplayPort when watching YouTube. So, I plugged my speakers back into the motherboard audio. Fixed.
Hardware can be amazing, but it won't matter if the software isn't efficient... super cool to see how many gains they've made over the past 2 years.
I just picked up an a770 for $229....I had to take the chance at that price.
I have the A380 as my host gpu for dual 1440p monitors and it's working with no issues.
In the graph at around 9:30, I believe you used outdated results for ARC 770 (from Feb 2023), which could explain why there is so small a difference between that card and A580.
How are these doing on Linux now? Wayland support? Game performance? AV1 encoding?
Should be fine. Intel has always had the best Linux drivers and has contributed to the kernel for decades, far longer than any of the other big corps.
Obviously make sure you use a bleeding-edge kernel.
Runs pretty well; the only thing I can't do is run some newer games like Starfield.
Arch-based Hyprland user here, Wayland is great! Gaming performance is a bit behind the 3070 Ti I had before, but solidly playable (2K 165Hz for reference).
It's been a wild ride, from figuring out firmware updating to moving away from D3Don12. The SR-IOV stuff has always been in the driver, I'm curious what more recent stuff you've seen that makes you think it might go public. I'll stay tuned.
For my part, I've been in it since Odyssey in SF. I was there when it launched, right behind Raja. And I'm still on their Beta program with an A770 as my daily. I can give nothing but the greatest praise to the Intel folks for responding to issues and working them out since launch, it really gives you an appreciation for all the reproduction they have to try and do.
4x of A770 could be a beast for AI. 64GB of VRAM for less than the price of a 4090. Too bad the support isn't there.
Would love a vid from L1 reviewing Intel's AI performance, if it runs at all.
hopefully many more will buy it and developers will have no choice but to support it
Interesting, but for now 2x 3090 is more than capable for about the same price.
@@userblame632 I use my single A770 16GB on Arch and it's pretty solid for Llama3. OneAPI is stupid easy to configure.
@@disco.volante Consider the power usage of 3090s vs intel though.
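For the curious, getting a model onto an Arc card from Python is fairly tame these days. A minimal sketch, assuming PyTorch plus the intel-extension-for-pytorch package is installed (the tiny conv layer is just a stand-in for a real model):

```python
# Sketch: run a forward pass on an Arc GPU via PyTorch's "xpu" device.
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401  (registers "xpu")

device = "xpu" if torch.xpu.is_available() else "cpu"
model = torch.nn.Conv2d(3, 8, kernel_size=3).to(device)  # stand-in model
x = torch.randn(1, 3, 224, 224, device=device)

with torch.no_grad():
    y = model(x)
print(f"ran on {device}, output shape {tuple(y.shape)}")
```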
There needs to be a shirt with "Gaming.... For science!" for these occasions!
I picked up the A770 16 GB and yanked it out of my computer in June of 2024; it just never really worked very well. In a lot of games you had to turn your settings down so far that it was disappointing for a card with 16 GB of VRAM. I wish I had put the money I saved on it towards a better graphics card. Ended up picking up the AMD 7900 XTX; I was worried about making the same mistake twice.
I'd love for you to test these on Linux, since I'm considering going Intel GPU for my next build and I like to use Fedora. Good video tho :3
Arch-based user here, it's a pretty smooth experience. Obviously the A770 is about 10-15% behind my old 3070 Ti at 2K, but most of what I play handles great. Currently on a Helldivers 2 kick and it handles it like a champ.
As noted in another thread, my experiences with ARC on Linux have been positive. You do need to be on a reasonably recent kernel, 6.2 or later, or you may be in for a bad time. Any currently supported version of Fedora should be fine since it's pretty bleeding edge. Ubuntu 22.04.3 and 22.04.4 (with the updated HWE kernels) have worked for me.
This is a pretty impressive uplift from drivers alone
People still seem to forget the insane amount of silicon Arc GPUs have.
Yes, they have moved mountains like Wendell rightfully states.
But! That's in comparison to the abysmal state the software stack was in when it launched. And it's still painfully weak when compared to the sheer amount of silicon an A770/750/580 has.
If I had paid €400 when the A770 launched, I would've literally thrown it out of the window.
I actually gave up on the Challenger A580 about a month ago, pulled it and sold it. The machine it was in is used for watching video streams, Discord, and occasional light 1080p gaming. I added the A580 primarily for Quick Sync (so I could offload some encoding jobs from my power-hog gaming machine) and maybe turn up some detail when gaming. It never got along well with Windows Update on my machine, but two or three months in I was getting daily driver update failures from the Intel updater, and when I finally did more than just dismiss the notifications I found two instances of the updater, along with a non-existent Intel Bluetooth adapter in Device Manager (?). I removed the card, DDU'd the drivers, cleaned up Device Manager and reinstalled. It took less than a week to start giving me the update failures again. The updater had again installed itself twice, and Device Manager was again showing the phantom Intel BT adapter. Personally I could live with minor bugs in some games, but the Arc Control update software in my experience was crap.
Ehh, I got that issue on my MSI Claw with the driver updater, but it's super easy to find the newest Arc drivers and install them nowadays.
@@PixelatedWolf2077 I'm actually sorry I sold it.. I've recently moved to Ubuntu on my desktop, and I hear the experience in Linux is better than what's on Windows. Might have to get another one and try again..
@cameronfrye5514 Fair enough! Well hopefully you can enjoy whatever you have now until ya feel the need to re-buy Intel in the future. ^^
I would buy an ARC GPU immediately if they get proper SR-IOV support.
Yeah igpu SR-IOV for a server build would be pretty compelling.
Wendell, How is the A770 doing with Topaz Video AI 5? I know that Arc GPU's were unusable with Video AI about a year ago. Has this been fixed? Thanks!
10:50 was it with Intel XeSS XMX or XeSS DP4A? Two different code paths; XeSS XMX is Intel-specific. The difference is huge.
If it was on an Intel card, I don't know if you can use anything beyond the XMX version.
So with intel making enterprise VDI solutions... will we see any of that support trickle down to the desktop cards?
Aside from gaming, the media engine included in the Arc cards is very nice. I replaced my Quadro P400 with a Sparkle A310 single-slot card in my Jellyfin/Plex server and I'm very satisfied. Even the PCIe passthrough and the driver installation worked like a charm.
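If you want to exercise that media engine from a script, here's a minimal sketch using FFmpeg's QSV AV1 encoder; filenames and the quality value are placeholders, and it assumes an FFmpeg build with QSV support plus the Intel media driver:

```python
# Sketch: hardware AV1 transcode on an Arc media engine via FFmpeg QSV.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-hwaccel", "qsv",          # decode on the GPU too, where supported
    "-i", "input.mkv",          # placeholder input file
    "-c:v", "av1_qsv",          # Arc's hardware AV1 encoder
    "-global_quality", "28",    # rough ICQ quality target
    "-c:a", "copy",
    "output.mkv",
]
subprocess.run(cmd, check=True)
```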
Just got my new PC with an Asrock A770. Couldn't be happier!
Nice card at a nice price.
This makes me really excited for my AsRock Challenger A770 16gb! It should get here next week. I snagged one yesterday brand new for $196.09, tax and shipping included!
I’m no gamer and love my ARC 770 LE. I even have an ARC 750 LE still sealed in a box as a spare card. The Intel design of the LE cards looks great (to me).
Do you use multiple monitors with the Intel card or not? Would like to know how they run.
It's spaceship grade
@@manuelhernandez2017 No, I just use one 1920x1080 60 Hz monitor.
@@a.j.haverkamp4023 I've read of issues with multi-monitor setups even without gaming... just asking for that reason. I'd like to buy one and try it.
@@manuelhernandez2017 No issues here in both Windows and Linux: A750 with 1x 2560x1440 144 Hz + 1x 5120x1440 200 Hz.
For gaming I typically put that 5120x1440 monitor into 3840x1080 mode, because some games don't offer upscaling, and 3440x1440 and 5120x1440 native are quite a bit too hard to drive for the A750 (in everything that's not some sort of shooter, but I don't play those).
Hello, I have the ASRock Arc A770 16GB and it's always at 90°C while gaming. How is yours running so cool? Even when I set the fans to 100% I'm still getting mid-to-high 80s.
Check your thermal pads/paste, or ask a friend who is experienced with that kind of thing?
Has the elevated idle power draw on intel Arc cards been fixed?
No, it's hardware level and can only be fixed with a hardware change, so Battlemage (hopefully) won't have this problem.
Love my A770 but idle power consumption still sucks despite so many driver upgrades
Multiple monitors? All high refresh rates?
@@joniqst It's a known and published bug that Intel accepts cannot be fixed without a hardware revision.
NGL, I want an Intel card. I could gun for an A750, but I want to see what Battlemage brings to the table; I hope the hardware side has moved as much as the software side.
And I hope it also doesn't cost an arm and a leg either. I'm happy if we can get decent 200 USD cards again that are not 3 generations old.
Can you make a consumer GPU / build comparison for LLM applications?
Honestly, I had an Arc A750 for about six months, and aside from a weird monitor incompatibility (that was the fault of the monitor's firmware), I didn't really have any major issues--or, that is to say, any major issues I couldn't find a workaround for. Once the workarounds were applied, I had damn solid performance for a very reasonable price. And the LE version of the card might be the most beautiful GPU I've ever seen--you'd have to actively try to make it look bad in a build.
I did end up getting rid of it, because I found an amazing deal for an RX 7800 XT, but I'd say my run with the arc cards was very positive. I'm very much looking forward to Battlemage.
Sir... Does the Arc A770 support video AI FPS enhancement software? (Like Flowframes / FrameGUI)
How good is legacy support for Intel Arc? All my old games run great on Nvidia GPUs. If I switch, will my old games run well?
I've been using an Intel Arc A750 for the past eight months now; I picked up an almost unused second-hand card for 170€ a couple of months before that. The idea was to test Intel (I was curious) as an assessment of whether it would be worth getting a Battlemage. And since I started using it at the same time as I renewed the rest of the equipment, I can only say that it is fantastic, and the work on the Intel drivers is simply incredible month by month. The question is whether I will switch to Battlemage or wait for Celestial.
Seeing these improvements over time is great, but what would really help drive a decision to invest in an Arc GPU would be a comparison with contemporary competing solutions in a similar price range.
Excited for what is in store for these cards, even out to Celestial, which might be when I upgrade. I'm perfectly fine with my A750, more so thanks to these driver updates the past handful of months. In a sea and a lake of green and red, I love my little pond of blue.
The 0.1% low changes are insane. Incredible work!
I bought an A770 LE last year and it is mind-bogglingly good for a first-gen product. Driver updates are regular and keep getting better. If you mainly want a productivity GPU that is also becoming pretty good at gaming, and you don't feel like remortgaging your house, Intel has you covered.
I've had a 14600K with an Arc 770 for 6 months. I use it mainly for rendering videos. HandBrake is using Hyper Encode with the CPU, iGPU and the Arc 770, and that does 4K encoding at crazy speeds. OK, an RTX 4090 is still faster, but at what price... In six months I had the following problems with the Arc 770: none.
I am a small company 2D/3D software developer, and my next testing/profiling system will be a 14th/15th Gen Intel CPU with an Intel Battlemage GPU. I am waiting for Q4-2024 and Q1-2025 for hopeful release dates of the processors and video cards. I am really interested in seeing how well Intel GPUs perform with my software. I have high hopes for an all Intel system.
The A770 has been mostly great, but I'm starting to need more juice to run games at the resolution I'm using (3440x1440). BG3 is fine at 60 fps, but games like Hunt: Showdown and Helldivers 2 need more than that. Let's hope they drop Battlemage soon, because I don't want to go back to Nvidia.
I feel that. The A770 has been good to me at 4K res, but if BM can get near 4070-to-4080 performance like speculated, then we're in for a treat if the price is right, like $400.
@@thetheoryguy5544 The cheapest 4070S is about 680€ in my country, and if Battlemage's high end (B770?) gets close to that performance, I'm happy to pay even in the 500-600€ range. LE of course, if possible.
How is this GPU for Adobe After Effects and Adobe Premiere?
I use a Phantom Gaming A770 16GB version. Metro Exodus at 1920x1080, DX12, max settings gives me a stable average of 180-220 FPS (lowest around 120 FPS). (i7 12700KF, with DDR5 6000 MHz RAM.) DX12 uses less GPU than DX11 (roughly 10-15% less).
I'm glad there is a 3rd option out there now. I have 4 of the 8 A770 16GB cards that exist, and they all vary in their max power setting; so far the Sparkle A770 Titan OC is the highest one I have, at 276 W. For me they are a great bang for the buck, since my systems don't need much a lot of the time, but when I need the extra power they work great. The drivers also impress the heck out of me; Intel is backing their project massively, and I don't mind fully supporting something so neat. I have high hopes for these cards and can't wait to see them become more than a "hobbyist" card.
I love my ARC A770 LE. Amazing on Linux and looks fantastic. Intel did a great job on it. Can't wait to see Battlemage and see Intel excel in the GPU space.
11:43 - I use the 6950XT, which is roughly twice as powerful as the A770 as far as I could find, and 'Starfield' is quite doable at 1600p 21:9 and Medium-High settings, even though it's very inconsistent (and that was with the launch version and maybe a few small updates on top of that). - This is not to "brag" or something, but what I want to say is that... while it did manage that, it really COOKED the crap out of the GPU. I was getting nervous seeing the temperatures and hearing the fans going nuts, and that was with intake fans aimed right at it, exhaust fans in the top, and 3 sides of the case being completely mesh, as well as other openings. - That thing is already squeezed for all it's got, and then 'Starfield' again squeezes it as hard as it can, and my card must've reached some record highs. I think it peaked beyond safe points. And though it never shut down or anything, I did get some weird (temporary) visual anomalies that I never got in other games. I just don't know if the game would do that regardless or if it's the engine or whatever, but I was fearing for that card's life, and I had only gotten it recently at that point, having just thrown about 650 euros at it.
I was playing that game like the Matthew McConaughey smoking meme. - I seriously thought there was a risk of sparks and flames shooting out at some point, as even the case itself was hot to the touch and the hot air coming from it was unbearable, and I was also regularly trying to smell if something was burning and checking if anything had melted. - Nothing destructive ultimately happened, which is quite impressive considering the heat, but while I like 'Starfield', also F that game on a technical level. - Good stress test, though... For both video cards and their owners.
Wonder how the drivers are for older titles like Riddick and FEAR
Frame generation is nice, but the image quality of XeSS is so much better and I'm not even using the XMX version. If anything, I feel like FSR is the technology that's lagging behind and desperately needs to improve.
I just got that A580 for a dedicated pinball computer to help my dad out with, but trying to run Linux on it turned out to be a nightmare to get everything to work, including GPUs (including an AMD-one, but I think the Vega ones just don't play nice), even though my personal computer has next to no issues with the vast majority of hardware and games (I mean, I just install stuff and it generally works). - In any case, or this case in particular rather, I followed Intel's instructions for installing the drivers, but it just won't take, with both Zorin and Mint complaining that "the package system is broken", for which in turn there are numerous cases and various solutions about on the web. - Supposedly Arc-drivers work out-of-the-box from Ubuntu 23 onward, but those distros I mentioned aren't on that yet. - I'm not a fan of straight Ubuntu, but I suppose I could ignore that for a system that needs to do almost one thing.
This is nothing against Intel's efforts with the Arc-GPUs, I wholly support it, but those instructions for manual installation are... quite confusing, first of all, and while I'm "savvy", I'm not someone who sits at a computer typing commands all day and goes on auto-pilot, I have stuff to do with a computer, but when they then also don't work... They specifically state that Linux is supported, but evidently not seamlessly. - Again, there's those native drivers, supposedly. hhhuhhhh... I just gotta... install another OS again then...
At 9:35, that lackluster CP2077 performance on the A770 is using the older Feb 2023 driver's performance numbers. He got much better results with the April 2024 drivers just a few minutes earlier, at 5:58.
You don't have to install GeForce Experience. I avoid that shit like the plague.
😅😂😅
I'm not a technical person, I don't understand 0.1% lows etc., but running a 5800X3D with an A770 playing Modern Warfare 2 feels smooth as butter. It's not a super high frame rate, but it just feels incredibly smooth.
Same combination and I have to agree 100%. It is television-broadcast smooth for me as well. Not a single stutter. Not a single in-game crash to date... and I've owned mine since December 2022. Had three driver installers crash, but never the game itself.
7:25 all that looks good, but isn't this a 1440p card?
What I can actually say: using an Arc A380 in my Proxmox server for a Windows VM is crazy good. Low power, great encoding.
I lament to inform you that I was unable to read the white letters on the light blue background of the bar graphs at the out-and-about viewing distance I had with this on my phone. A black outline could help. Also, the far-left text was beyond too small for me to see, even when I made a special effort to look closely.
As a listener viewer, I wasn’t able to discern what the 9 bars were each associated with. I assumed the top 3 were all the new driver results, but upon zooming in, I could discern different dates among those 3. Maybe a little bit of description of the graphs, or fewer at once would be favorable. Maybe…
But don't sweat it, you do you. I would, though, promote enlarging the text and giving a black outline to the white on light blue.
Thanks for the follow-up on Arc. I also lament that I still have not produced my "Intel ARC the Anime" series. But with AI I hope to someday achieve such heights of art.
They released new hardware? Please do a segment on the importance of drivers versus hardware
When they actually put some time into fixing multi-monitor setups, then it will be a product worth considering.
Thanks Wendell! - Love the reviews as always
Industry-wise, Intel GPUs are doing exciting work. Assuming they hang tough and keep up the improvement, they are going to ship Battlemage with a rep that, while the drivers may not be A1, their effort to get to A1 is A1.
I had a mental block against buying Intel discrete GPUs due to bad drivers. I think that's largely done now, so I'd probably be fine picking them up.
It's not quite right to say Intel came from nowhere. In truth their GPU and iGPU hardware and software have been around a long time. But to level up against the high-end players in the market, to this degree, in this kind of time frame, is highly impressive. And I am thankful for it, because a market where midrange cards exceed $500/£500 instead of the older $200/£150 arena is batshit, and something needed to come break that impasse.
I love to see what they're doing, but I feel like Linux is so neglected on Arc; we don't even have a GUI yet.
You don't need a GUI for it.
@@lost-prototype yes, and?
@@jddes Ummmm... Eat at Joes?
I have had virtualization issues with dual monitors on my 11th-gen i7. At first I had to pull an HDMI cable out to get past the POST screen, but it seems a microcode update in Ubuntu fixed the issue randomly one day. Note I tried enabling virtualization on the board, and I was also running bare metal.
I have an Arc A750 and I bought it as a "temporary" GPU (an upgrade for my finally long-in-the-tooth Titan Xp, building a new PC and keeping the old one together) while waiting for the 50 series to buy one of those or a 4090, depending on pricing. I'm very impressed with the A750 for the price. I do still get occasional consistent stutters in some games.
I'm curious if A380 is also affected by the latest drivers.
Intel needs to take advantage of the market that AMD and Nvidia do not want to cover: the midrange and low end. If Intel can come out with a $400 GPU with 16GB of VRAM that is as strong as a 3090, a $300 12GB VRAM card as strong as a 3080, and a sub-$200 8GB card as strong as a 3070, they will sell like hotcakes.
This video drops just as i got the driver update.😁
I have an Intel A750 for my Jellyfin server; it shattered my previous 3050 Ti in terms of transcoding speed and, of course, AV1 encoding and transcoding. Bought it for $200 including shipping, and I have loved it ever since. It was quite a pain to set up the passthrough in Proxmox, and it sadly only has support on Ubuntu and Windows VMs, but for what I paid and what it delivers, it truly is a marvel. If Intel responds with Battlemage being at least at an AMD level of performance and driver support and development, I might switch from Nvidia to Intel.
does it whine at full load running at 100%?
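On the passthrough point: for Linux guests, a quick way to confirm the card actually made it into the VM is to look for a DRM render node before debugging Jellyfin itself. A small sketch (device numbering varies, so it globs):

```python
# Sketch: verify a passed-through Arc GPU is visible inside a Linux guest.
import glob
import os

nodes = glob.glob("/dev/dri/renderD*")
if not nodes:
    raise SystemExit("No DRM render nodes; check passthrough and i915/xe driver")
for node in nodes:
    ok = os.access(node, os.R_OK | os.W_OK)
    print(f"{node}: {'usable' if ok else 'present but no permission'}")
```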
I love my A770, super impressed with its price/performance, and it just keeps getting better.
Fingers crossed that Battlemage isn't too much more expensive than this gen and has a lot more performance! I was waiting to see the 2nd gen of Intel GPUs before jumping on board, but it's very exciting to see the strides Intel is making on their GPUs! The more competition the better for everyone, and hopefully it knocks the green giant off their high horse!
I've had that Starfield half-framerate-after-menus bug on and off throughout Starfield's patch history. On my RTX 3070. So it's not entirely an Intel issue.
Good info. I'm rooting for Intel in the GPU market!
I will laugh my ass off if I end up buying Battlemage and an AMD processor... what a dramatic difference in upgrade for me.
Hope that with Battlemage Intel can put enough pressure on AMD and Nvidia with their mid- and low-end cards to have reasonable VRAM and prices; it would be awesome for the gamer market... Nvidia is supposedly alone in the ultra high end coming fall/winter, but hey, 3 players is better than 2.
I'd be curious how these cards perform rendering video in DaVinci Resolve, AV1 vs MP4 (render time and file size)... Windows vs Linux (since there are versions for both platforms)...
And ideally whether things have improved over driver updates. I'm rooting for Intel Arc and looking to escape the Windows grip. I wonder if anyone else cares for this test?
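Not Resolve itself, but you can get a rough version of exactly this test with FFmpeg's QSV encoders, timing each render and comparing file sizes. A sketch with placeholder filenames and quality settings:

```python
# Sketch: compare AV1 vs H.264 hardware encodes for render time and size.
import os
import subprocess
import time

def encode(codec: str, outfile: str) -> tuple[float, int]:
    start = time.monotonic()
    subprocess.run(
        ["ffmpeg", "-y", "-i", "clip.mov",       # placeholder source clip
         "-c:v", codec, "-global_quality", "28",
         "-an", outfile],
        check=True, capture_output=True,
    )
    return time.monotonic() - start, os.path.getsize(outfile)

for codec, out in [("av1_qsv", "clip_av1.mkv"), ("h264_qsv", "clip_h264.mp4")]:
    secs, size = encode(codec, out)
    print(f"{codec}: {secs:.1f}s, {size / 1e6:.1f} MB")
```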
Wendell what’s the encode/decode performance like? I’m wanting one of these for Plex to ditch Nvidia.
I knew drivers make a difference, but this is pretty amazing.
*Question...*
Any thoughts on MACHINE LEARNING being involved to fix drivers for various games?
I'm very interested in the idea of ML being involved. Maybe initially some non-ML version that's simply automated to try all sorts of combinations of game settings while monitoring FRAME TIMES? Step through every game in the Steam catalogue? Then ML having access to specific driver code locations so it can try to rewrite them? (feedback loop for crashes/worse performance/better performance...) Maybe move on to rewriting game code to be more efficient? Basically there are TONNES of older games. How much can we fix them with what might be eventually LOW HANGING FRUIT in terms of cost as ML gets better?
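The "non-ML first pass" described here is basically a grid search scored on frame times. A toy sketch of that loop; launch_benchmark() is hypothetical and stands in for however you would actually launch the game and capture frame times (PresentMon exports, etc.), simulated with random numbers so the script runs:

```python
# Toy sketch: brute-force game settings, score each combo by mean frame time.
import itertools
import random
import statistics

SETTINGS = {
    "shadows": ["low", "medium", "high"],
    "textures": ["medium", "high"],
    "upscaler": ["off", "xess"],
}

def launch_benchmark(combo: dict) -> list[float]:
    """Hypothetical hook: run the game with `combo`, return frame times (ms).
    Simulated here so the sketch is runnable end to end."""
    return [random.uniform(6.0, 16.0) for _ in range(120)]

best = None
for values in itertools.product(*SETTINGS.values()):
    combo = dict(zip(SETTINGS, values))
    score = statistics.fmean(launch_benchmark(combo))
    if best is None or score < best[0]:
        best = (score, combo)

print(f"best mean frame time: {best[0]:.2f} ms with {best[1]}")
```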
100% hope Intel continues with their GPUs and drivers. I am never an early adopter of PC technology because of the cost of parts, but if a competitive hardware or software option proves itself, I am happy to wholeheartedly embrace it.
One of the biggest problems the Arc GPUs still have is idle power consumption: all of them draw around 40 W just doing nothing on the PC.
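The mitigation people usually point to is PCIe ASPM: enable native ASPM / L1 substates in the BIOS plus OS-level PCIe power management. On Linux you can at least confirm which policy the kernel is using; a read-only sketch:

```python
# Sketch: show the kernel's PCIe ASPM policy (the active one is bracketed).
from pathlib import Path

policy = Path("/sys/module/pcie_aspm/parameters/policy")
if policy.exists():
    print("ASPM policy:", policy.read_text().strip())
else:
    print("ASPM interface not exposed; check BIOS and kernel config")
```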
A770 or RTX 3070? Same price in my country.
Very much looking forward to seeing what Battlemage is capable of! Hopefully they can improve on efficiency. The main thing I'm curious about with Intel ARC is creative, productivity, and LLM tasks.
If you ramp up the fans, you can get the card temperature down real low. It's always running at max power, so it won't matter. 😊
Did the drivers improve AI performance? With the ass games that have been out lately (not into multiplayer shooters), I'm deep-diving AI more than gaming. Would an Intel card outperform the $400-450 used 3080 Tis in Stable Diffusion or LLMs? I know they have a bit more memory, but speed is more important than memory in my usage.
I had the same thing with an Asus AMD 6600 XT. Windows 10: no issues. Windows 11 would put in a generic driver. I spent months trying to fix it.
Intel Arc FTW
Wooooo intel GPUs! SR-IOV baby!!!!!!
How are the Intel GPUs on Linux? Do they even work on Linux?
Bought an ASRock A380 strictly for doing AV1 encoding, and it's been working like a champ, except I just noticed... it's locked at PCIe v1.0 x1 speeds no matter what motherboard I plug it into. It was a rather shocking discovery given it requires a full x16 slot, though it's only spec'd to use x8. The ASRock also requires external PCIe power, though it's not actually needed. Kind of a bummer. Confirmed with a colleague with the same card and different hardware - same thing. Guess I should have figured ASRock would scrape the bottom of the barrel for 'supports but doesn't utilize' specifications. Oh well, none of the A-series cards support SR-IOV and I'd really like that feature, so *crosses fingers* here's to hoping the next-gen Intel cards support this.
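If anyone wants to check their own card for this, the negotiated link is exposed in sysfs on Linux. A sketch; the PCI address is a placeholder (find yours with lspci):

```python
# Sketch: read the negotiated vs. maximum PCIe link of a card on Linux.
from pathlib import Path

dev = Path("/sys/bus/pci/devices/0000:03:00.0")  # placeholder PCI address
if not dev.exists():
    raise SystemExit("Adjust the PCI address; list devices with `lspci`")
for attr in ("current_link_speed", "max_link_speed",
             "current_link_width", "max_link_width"):
    print(attr, "=", (dev / attr).read_text().strip())
```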
I've been running a test pig on AMD / AM4 / B550 with the Ryzen 7 5700X and a Sparkle A770 Titan video card; some of us are on AMD platforms and the user experience could be different.
Wendell, please test using B450/X470 on PCIe 3 with ReBar on for the Intel Arc A580/A770. It should work, since AMD can do what Intel cannot on PCIe 3: AMD can do ReBar for Nvidia or Intel on the X470 chipset. I tested the RTX 3070 on X470 with ReBar and it works.
I have just hated how the general community reacted - mostly driven by YouTubers (MLID etc.) - and continues to react to Intel's efforts in the consumer GPU market. The initial drivers weren't great, but what did we expect? This is a brand new market segment for Intel, and they have put in the hours and done the dirty work of optimizing per game! With all this learned knowledge I am certain they will continue to get better.
Windows auto-updating has been such a pain. My solution is to just download the driver manually and do a clean install; it's been relatively good since.
Was on the fence before but I'm seriously looking forward to Battlemage.
I had that issue with my 6900 XT / 12700K gaming rig. All of a sudden my game crashed and only one monitor worked... then AMD Adrenalin wouldn't work. It was a pain to figure out.
What about on linux? How's proton?
Can't you tell Windows 11 to stop pushing driver updates for a specific device/driver like you can on 10?
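There's no clean per-device toggle in the Settings UI, but the machine-wide group policy that worked on 10 still works on 11. A minimal sketch, assuming Python in an elevated shell; it sets the documented ExcludeWUDriversInQualityUpdate policy value, which blocks all driver updates from Windows Update, not just one device:

```python
# Hedged sketch: apply the "do not include drivers with Windows Updates"
# policy that gpedit exposes, by writing the documented registry value.
# Needs an elevated (administrator) Python; blocks ALL WU driver updates.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0,
                      winreg.REG_DWORD, 1)

print("Policy set; Windows Update should now skip driver updates.")
```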