Since a lot of people are asking, here's the wallpaper link: wallhaven.cc/w/kx5v57
A $250 GPU has more VRAM than my 3080 I hate nvidia 💀
Awesome for entry level users though. Love seeing cards like this
how much did you pay for the 3080?
A 2024 product has more memory than a 2020 product! That is pretty shocking and surprising!
@@davidyusaku In my country a 3050 alone costs more than 3000 bucks; I can't imagine how much a 3080 is priced
@@Tugela60 You're missing the point entirely, on purpose too probably.
Nvidia STILL are delivering very low amounts of vram so you upgrade faster.
My 3070 has 8GB of VRAM, the same amount as the 1080 I had before it, a card from 2016.
And yes, it's a problem: it wasn't a lot when the 3070 launched, and it's kinda pathetic now.
@@kristoffer3000 Adding to that, the 1080 Ti has 11GB of VRAM.
It's going to be wild if Nvidia drops another garbage 8GB card like a 5060 for $300+, whilst Intel's $250 card with 12GB outperforms it.
The PROBLEM is it's not just FPS per DOLLAR anymore... there are still issues with many older games. Will Intel be supporting this at all in a few years, given it's got massive problems as a company? There are other issues as well.
Personally, I can't justify less than 12GB of VRAM but also wouldn't go with Intel currently. I ended up with an RTX4070 because the process of elimination only left me that GPU. I didn't like AMD's FSR. I couldn't go with Intel. I couldn't go less than 12GB and wanted 16GB but the RTX4080 was too expensive. SIGH. (I am happy with the RTX4070 which I modded with Noctua fans but I do find the situation a bit nuts... if I buy any NEW games in the next few years, which I might not, I'd be checking to see if I need more than 12GB and if so I won't buy the game.)
8GB is fine, but the way Nvidia allocates that amount of VRAM... is really weird.
I mean, why does the 3060 have 12GB of VRAM while the 3070 only has 8GB?
It's fixed on the 40 series, but TOO EXPENSIVE.
$499 just to get 16GB of VRAM? Don't be ridiculous. The 4060 and the Ti only differ in performance while having the same features, so why do they sell for that much?
It would be acceptable if the Ti version had more features, but no, it's the same as the 4060.
Even the 7600 XT is $60 more expensive than the 7600.
Idk why Nvidia puts such a huge price gap between cards of the same class.
@@photonboy999 The 4060s also have a crippled PCIe x8 interface. So those on PCIe Gen 3 were really disappointed. The same happened with AMD's 6400 with only 4 lanes.
These artificial restrictions are not necessary.
@@arlynnfolke They wanted to prove a point while cashing in on this problem.
Unfortunately for them, the market prices adjusted from the gaming demand and the 4060 ti 16gb became a popular card for LLM development.
@@arlynnfolke 8 gb ain't fine anymore, not even for 1080p
I definitely wouldn't mind slapping one of these in a spare PC for fun if they live up to the internal benchmarks.
exactly
It won't
Same
@@zDToddy that guy who hates everything
If only it didn't require Resizable BAR... you can patch your BIOS of course to add support even on old CPUs, but it's wonky
Hot take but I don’t understand people’s takes on Intel GPUs.
People: “NVIDIA GREEDY! THEY SELL GROSSLY OVERPRICED PRODUCTS THAT NOBODY CAN AFFORD.”
Also these same people: “WHY DIDN'T INTEL GIVE US HIGH END CARDS REEEEE! THESE NEW CARDS DON'T WORK WITH ANYTHING WE NEED MORE COMPETITION REEEEEEEEEEEee!”
Also the exact same people: *Buy NVIDIA cards.*
That’s exactly what I did but I waited for the 4080 Super and also got a 7800 XT in a second pc. It’s not that we can’t afford high end. The people who buy it can most likely afford it. But that doesn’t mean we wouldn’t prefer a cheaper price. If they could offer great competitive performance and be a substantial amount cheaper it would be amazing.
Also people saying they will buy an AMD card if it does whatever arbitrary thing they want, even though they would never buy the card
Well…. I want to support Intel but I’m not in the market for a lower end gpu. I purchase higher tier cards. Still on a 3080 because I panic bought at retail during Covid. Had I waited I would have definitely picked up a 7900 XTX. Kind of sad to hear rumors of amd not doing super high end on their next gen gpu. Going to wait and see what they offer before shilling to nvidia yet again :(
people are idiots
@Fliim I suggest starting a class action suit against nVidia. Force them in court to sell the 5090 for $250. Your argument can be that you only have $250, but deserve a state of the art product (due to you being able to breathe). Consequently they NEED to reduce the price to suit your wallet.
I might buy one just to support future innovations and competition
ARC discrete is cancelled, Intel is just trying to recover as much cash as it can
@@harperlee6383 That's a big load! lol
@@harperlee6383 Source: [My uncle told me]
@@harperlee6383 Citation please.
Don't do it, mine decided to no longer boot after an update😢
Personally, I like ARC for production workloads in Adobe. Great deal for that. I'm hoping this is as well.
That's what I called out as Intel's strong point when Alchemist launched: they would be great production-based cards with gaming as an extra feature. Production workloads will only get better each launch. Personally buying a B580 just for the workloads you can do on these things.
My dude, have you tried Nvidia for Adobe? I don't like Nvidia's prices but they're AT LEAST 10x faster in Adobe applications. NO CONTEST.
@@lil----lil Not worth it if it's expensive.
@@lil----lil Our editing rigs do run Nvidia right now, but your 10X faster claim is simply not accurate. Check out Tech Notice and his deep dive into it.
@@lil----lil As someone who edits for a living and also consults on IT in film & broadcast, I can absolutely say that 10x is an absolutely wild and inaccurate claim.
Arc Alchemist cannot be beat in its hardware video decode speed, which has a big impact on edit delivery times. Intel also has the only platform capable of decoding Sony's 8K XAVC-H codec in anything close to realtime at this time. While it's true that for creative workflows outside of editing, like VFX and 3D, NVIDIA is far better, Intel really does have a place.
Their drivers in Adobe are quite stable even though games kinda suck.
Even for VFX and compositing, VRAM can often be the limiting factor well before raw compute performance.
I can see this being a great product for smaller companies with smaller requirements. The price is fantastic.
I've used the Arc A380 as part of a remote broadcast system sending 8 HD HEVC video feeds to remote studios before. Definitely something not even an NVIDIA RTX Quadro can do, and definitely not at that price. The Arc series are content creation beasts.
You have to start somewhere no matter what so here's to a decent start.
This is a second generation card
Still a fantastic card with value@@zDToddy
@@zDToddy A baby's second step is no less impressive than the first
@@zDToddy Nvidia is about to release their 24th generation of graphics cards, this is still the start for Intel
@@heatnup very well said
I'll probably be replacing my GTX 1080 with this, thanks Intel. I just hope it's not too cursed using an Intel GPU with an AMD CPU.
If you actually want competition, you can't just upvote a top comment that says "wow yeah we need more competition to bring down prices"; you actually have to purchase the underdog's GPU...
Me upvoting this comment 😐
I'm definitely considering grabbing one of these. I have an older system that needs a new GPU
A750 user here. Intel GPUs aren't faster than the AMD/Nvidia counterparts per se. They're more resistant to framerate drops as you go from 1080p low -> 1080p med -> 1080p ultra, and so forth, relative to an equivalent AMD/Nvidia GPU. The caveat you have to accept is that about 8% of games or so won't work with Intel GPUs (eg Starfield or TLoU1). Good so far.
That's what happens when you don't skimp out on memory bandwidth and capacity.
How about older games pre-DX9? Or OpenGL games?
@@Igor369 OpenGL is just a big nono afaik, but I haven't tested anything on my A770 recently. Everything besides DX12 and Vulkan uses a translation layer (probably DXVK) to convert older API calls (like DX9) directly to Vulkan - it's the same tech the Steam Deck uses to play DirectX titles on Linux. The performance hit is minimal and it actually runs quite well. The only problem is older titles that simply refuse to start on Intel graphics, as the devs would throw an error when "Intel" is detected as the GPU vendor, because they didn't want to deal with people creating support requests for problems while running on Intel HD graphics. You can spoof your vendor with DXVK I believe, so many games can be "fixed" if you just set your vendor manually to AMD/NVIDIA, but Intel can't implement that into their drivers for obvious reasons.
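Replying to the DXVK bit above: for anyone who wants to try the vendor spoof themselves, here's a minimal sketch, assuming the game actually goes through DXVK's DXGI/D3D9 path. Drop a dxvk.conf next to the game's exe (or point the DXVK_CONFIG_FILE environment variable at one) with the IDs you want reported; 10de is NVIDIA's vendor ID and 1002 is AMD's, and the device ID line is just an illustrative example, not a required value:

# dxvk.conf - report a different GPU vendor to picky old games (unofficial workaround sketch)
dxgi.customVendorId = 10de
d3d9.customVendorId = 10de
# optionally also report a specific device id the game recognizes, e.g. an RTX 3070:
# dxgi.customDeviceId = 2484

No guarantee every title falls for it, but it's a cheap thing to test before writing off an older game on Arc.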
@@Igor369 Can't say, haven't tried. Tho prolly the older the game, the worse the compatibility.
I suspect this is why Intel is focusing on 1440p Ultra benchmarks in the announcement. That being said, 1440p monitors are shockingly cheap now.
We just need the drivers to be good
Hardware Unboxed recently dropped an update video on Arc drivers, testing over 200 games. They experienced virtually zero issues. So assuming this new launch has even better drivers... should be pretty decent.
@@braydennturner Performance on new game releases is a big problem for Arc. They definitely need to improve drivers
AI coded! ♥️
A B770 with 24GB of VRAM will be a game changer
There is no way they will make it a 24-gig card.
@@Rentta It seems logical they added more VRAM to the 580 so why not the b770
Maybe B770 20GB 😊@@n.d.n.e1805
B780 maybe
Good for LLMs.
I love the stock looks of Intel and NVIDIA graphics cards.
couldnt care less about looks since its under my desk down by my stinky feet
I hope there is a trickle-up effect that causes a decrease, or at least a halt, to CRAZY GPU prices.
It won't affect high end at all. Nvidia has stated they aren't aiming their high end for gamers for a while now. I've seen so many 4090 cores and vram stripped from their original boards to be utilized for AI/crypto
Gonna outperform everything up to $300. Gonna last longer than anything under $250 cuz everything else has 8GB of VRAM. Unless you want 120 frames at 1440p this is the new GOAT. Copping one next Thursday
120fps at 1440p won't be a problem at all for the 580 with 12GB of RAM
@srobeck77 Depending on what you play. It's not gonna crank modern single player games too much past 90
@@4brotroydavis i dunno man, my 1070 is over 100fps and this is wayyyy faster
@srobeck77 then i stand corrected. My experience is with the a750 at 1080p and i struggle to get 100fps in jedi survivor. Most i get is around 95
@srobeck77 i plan to buy battlemage and a 1440p monitor next weekend. Was gettin a 1440p anyway so hopefully i can grab the gpu too
I love the OEM look of them, I don't like the aftermarket gaming BS and overclocking shit. I like small, cool, quiet PCs: ITX PCs / mini mobile PCs.
Yeah, I'm in the same boat as you. Once I slap in the video card that I want to buy that gives me better performance than an RTX 4070, that computer is going to be hidden away in the corner so I won't see them.
I like the stealth look. Reminds me of the Red Dragon 5700 xt cards. We need more stealth looking cards!
I had to buy a bigger case for my last gpu because of all this 2.5-3 slot junk 😢
@@kieron88ward Video cards are getting thicker and thicker these days...
The software side is very exciting. XeSS FG and Low Latency. As well as AI Playground. Also a big boost in RT performance, even above the 4060. AMD has been lacking in literally every department besides raster performance. Intel are doing a better job at competing on all fronts with Nvidia
That's what disappoints me about AMD. They build all these features and never finish them. FSR 1+2, frame generation... These features are there and never improved -.-
Intel GPUs are also still crap at 3D rendering.
This GPU is below the performance target I generally shop for, but I REALLY REALLY hope it does well. I think this is one of the more interesting segments of the market.
I've been using an Arc A750 and I didn't face any problems. I'm not into games too much, but whatever I play works just like my 3070. I mainly use it for editing and After Effects and it's really optimised for Adobe. Love this card
I disagree. A $250 GPU is definitely 1440p nowadays. We have upscaling, remember? There is no point in running 1080p anymore. 1440p monitors are also dirt cheap nowadays.
bingo!
I'll probably wait for the B7xx series Intel GPUs to see what Intel has to bring to the table. The only thing I will miss is CUDA/OptiX for Blender renders, especially when it comes to animating camera movements. NVIDIA has the upper hand when it comes to performance in Blender Cycles.
And personally, I don't care for looks when it comes to GPUs. I prefer to hide my computer away on the corner or in a home server closet so that it's out of sight.
I think I’m going to pair this with my 5800x3d 1440p ultrawide. What a time to be alive
API support.
What software engineers are going to take the time to include support for an additional architecture that hardly anyone uses? I feel like Intel will need to fund and work directly with developers, like AMD and Nvidia do, just to give their cards a fighting chance.
The drop-in upgrade thing will be more affected by the 8 lanes of PCIe than by the single 8-pin requirement. Most older PCs don't have PCIe Gen 4.
I am happy we finally have some competition in the dGPU market. Can't wait for their B780
Mike, Great Hot Take on the ARC B580 GPU...!
Budget gamers are going to put the B570 into Dell Optiplexes and have a great time
ah hell nah. not when the 580 is ONLY $30 more
The board partners should make a low-profile version of this card, like the Nvidia A2000.
The only thing we have had to discuss is how ungodly expensive Nvidia is. I'm actually considering using this for my dad's PC build.
Have had great luck with two A770/16GB models from Acer and Sparkle. Used both as encoders, only job was to upload live streams to various sites, and AV1 encode to YT. My switching workstation was a beast, running vMix with over 50 inputs, sporting a 7950X and a 20 GB A4500 from Nvidia. That switched the show, the output was sent over 10 Gb fiber to the Intel encoders (running Fedora, and using OBS for upload). Bulletproof. Offloaded some workload from the main machine. Even tried it with an A380, which was decent for $120. I will be buying the B580 at launch, really looking forward to the B770 or B790 though! 24 GB of VRAM for $499 is my sweet spot.
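For anyone wanting to sanity-check the AV1 encode path on Arc before building a workflow around it, here's a rough ffmpeg sketch; it assumes your ffmpeg build has the Quick Sync (QSV) encoders compiled in, and the input file and bitrate are just placeholders:

ffmpeg -hide_banner -encoders | grep qsv
# confirm av1_qsv shows up in that list, then:
ffmpeg -i input.mp4 -c:v av1_qsv -b:v 8M -c:a copy output_av1.mp4

OBS exposes a QSV AV1 encoder option as well, I believe, but the command line is an easy way to confirm the card and driver are actually doing the work.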
A decent gpu at this price that also supports vfio is pretty crazy.
I've got a 2060 that this is looking like a tasty replacement for
$250 for a 12-gig card!
Is awesome
My A770 should cover me for the next 2 years.
fb marketplace it and upgrade
Why not, if the DX 9, 10, 11 drivers are good. If not, why bother? Not worth it for me. I play a lot of older games.
So yes, hopefully it will be an RX 570 moment. Maybe they named them 580 and 570 for this reason.
But probably not.
2:10 Wow... reminds me of the time when I got a relatively new GTX 1060 Aorus Xtreme for 240 bucks!
The prices at that time were really good. I bought it used but with a couple of years of warranty left.
The purchase was insanely good. I could play Doom, Battlefield 1, Fallout 4, Metro Last Light Redux, Star Wars Battlefront 2 etc. on ultra settings at 1080p and still had headroom over 60fps. (Of course upscalers were not a thing back then.)
Would be cool if the Intel GPUs fall directly into this category and really deliver what they promise.
This time the drivers must work. I already thought about buying a used Intel card but the driver issues made me doubt it...
Would also be nice to know if the RTX Remix software can later be used by Intel cards too. Right now only Nvidia cards can use it, sadly.
Together with DLSS, frame generation and RTX Remix, I'm still more drawn to Nvidia cards. I'm also curious how good the AV1 decoding is on the Intel cards.
The A-series GPUs required as much PCIe bandwidth as they could get to be functional, and now they've cut the number of PCIe lanes in half? Did they optimize the driver/software to work with more limited bandwidth?
We will certainly need to see.
6:25 Correction: CS2 isn't DX9, that was only true for its predecessor CS:GO. All Source 2 games use DX11.
Interesting. Thanks for the info.
@ FYI it also supports Vulkan, which is used e.g. on Linux to avoid relying on the translation layer in Proton and use the Linux driver directly. The game also supports Vulkan on Windows, but for most Windows users it will launch in DX11 mode.
I was hoping Intel would come up with something higher end but it looks very unlikely. I am glad they are focusing on the budget market though. I will probably keep my 4080 Super and 7800 XT systems for another year until I can figure out my render farm situation, but if Intel has something good to offer I wouldn't mind getting their highest end graphics card as long as it's an upgrade over my 4080. Which may be in a few years.
Better ray tracing than RDNA3 and the best upscaling on the market. Can't wait to test these new cards out for video and photo editing.
Intel/Raja telling you this is Intel's Polaris -RX- B580 and -RX- B570
Raja was fired 2 years ago.
If the drivers hold up, hell yeah
So far the only thing I'm having an issue with is the card size. Clearly there could be a single fan option geared towards sub 5L ITX builds.
They gave the B580 a chance at least by waiting until it was hard to find a 3060 Ti or 6700 XT new for the same price. Not that I think that was a deliberate strategy. I probably will get one for the collection, since these two are likely Intel's last cards.
Here's to hoping they improve their driver on Linux. It's the only reason why I didn't use it as my 2nd GPU
Hear, hear! I would 100% buy one if they were stable on Linux. Edit: AND CAN RUN HDMI 2.1
I'm crossing my fingers for this!
Counter-Strike 2 is running on DX11
I think a lot of people are going to trash on the performance but I think Battlemage is brilliant... I think the B580 and B570 are just like when AMD released the RX 580/570... Budget GPUs that will be timeless
Blender and other VFX software are my focus. I understand that no card manufacturer is going to reveal ahead of time what their future specs will be, especially not until their competitors have played their hands, but I wish they'd engineer a "leak" to let us know how much VRAM to expect from a B780/790.
You can always use a pin header converter to get the PCIe 8-pin plug. Also there is the upcoming upscaler support in Windows and/or DirectX/Vulkan.
Are the ARC drivers stable for programs such as Solidworks? If so, then this is what I'd use for budget workstations.
We haven't had an issue with programs like Blender, Premiere and Resolve for output but overall we haven't seen any issues with ARC series and OpenGL acceleration in AutoCAD, Sketchup, etc. So I'm GUESSING that would translate well into Solidworks.
@@HardwareCanucks Thank you.
Which of the board partners makes the "Limited Edition"?
I like the plain, sleek appearance.
Its from Intel themselves
Cooler might be made by Cooler Master but the card is sold by Intel directly
If you can't get the reference card then the Acer brand one looks pretty sleek, I'll go with the Sparkle brand though as they've been in the GPU market for a very long time
I am a little curious about the B570. Yes, it only has 10GB of VRAM, but I would tend to think it could handle a lot of games at 1080p quite well compared to the RX 7600 and the RTX 4060, since they only have 8GB of VRAM.
I am massively chuffed for the B570 and B580 as they will be amazing due to the price-to-performance ratio. Fingers crossed that drivers do not let us down.
How does it perform when paired with an AMD CPU? Also, is it good for video editing?
250 dollars is great on paper but the first prices in the EU are already around 320-340 euros, which is the same price as an RX 6750 XT
It's not a big deal. The $250 MSRP is $300 on street. We already have the RTX 3060 12GB and RTX 4060 in this price range.
I think it's great. Competition is the best thing us, consumers, can ask for.
Here's hoping drivers are good.
From the specs it could be also a well-rounded GPU for SFF systems, eg. in an S400 V2 or even S300, Dan A4-SFX etc.
If they do game bundle deals, even more tempting to jump on board.
This is really good for the industry. I'm rooting for Intel.
I really hope this card delivers some amazing bang for buck and shakes up this whole GPU market
The problem with this card is that, of all the cards competing in this performance range, it has the largest die (except the soon to be replaced 7800 XT), and is thus the most expensive to make, and the situation will soon be worse for Intel. When Navi 44/RX 8600 XT comes out, it will have a die in the 130-152 mm^2 range, half the size of the Battlemage die. With a die that small, it will most likely draw only 100-130 watts. The small Navi 44 die will cost much less to make than the Battlemage die, and with lower power draw, the rest of the card will be cheaper to make, too. Thus, if AMD chooses to, they can sell the 8600 XT for less than Battlemage, but since it will have higher performance, I don't think they will.
Selling an expensive-to-make card for a very low price will definitely get some sales for Intel this month and next, but in the long run, they need profit to justify continued investment in GPUs. I think the B570 and B580 cards will be the only Battlemage cards we ever see.
The 7600 was $250 at times. It was a 6650 XT with a new coat of paint, and that was just an overclocked 6600 XT.
I agree, and Intel is the company that has piqued most of my interest this gen. Initial impressions (launch day) were not good with ARC in its first consumer launch (it was better and worse than I expected), but seeing them work away at improving drivers and seeing how ARC has matured, I absolutely think Intel is going to be a serious contender this generation - and I feel this side of the market is where they need to carve out and generate excitement - they can worry about going higher end down the road - having a super solid foundation is what the Dr. ordered. I'm looking forward to the reviews. 🙂
I'll be grabbing it ASAP.
$250 is a tremendously important sector
I hope it's good, I pray it's good
Honestly I was kind of hoping Intel would be releasing their rumored higher end cards but hopefully those are announced later.
I would honestly like to see high midrange
Any chance that the B580 would be a good upgrade for a GTX 1080? I game on a 32" 1440p 144Hz Asus monitor. The price is incredible
I'd sell that GPU while it holds some value. Rather wait for a B770
Don't buy this for 1440p gaming
@@zDToddy The target audience for those battlemages is 1440p games though
They specifically say this is a targeted upgrade for a 1060, so either wait for next gen or don't expect a big uplift.
I liked the A770 16GB but the prices are still too high in the UK for a card that's 2 years old now. While other brands dropped in price, limited stock of Intel Arc meant nobody stocked them bar the most scalping retailers, who are still charging £350 for an A770 LE 16GB version. Aftermarket ones are coming down but they lack quality. I hope this isn't the case with this one, and even so, £250 is doable.
The A750 is the same card with half the RAM
try looking for used cards
Sounds promising if they can actually deliver. It's just unfortunate that it took Intel having their backs against the wall to incentivize them to be this competitive.
12GB of VRAM at a $250 price tag is hype!
How much in CAD? And would you use this in an AM4 or AM5 build?
How do Intel cards perform when streaming on OBS?
Hope Intel succeeds with this. Gamers need a new company to counter Nvidia's monopoly
6:22 DOTA 2 and CS2 use DirectX 11, not 9. They also support Vulkan. League of Legends also switched a long time ago.
I respect intel so much for this, hope they make it.
Sure this could be competitive enough if "non limited edition" SKUs sell at $225, but I doubt this is going to catch AMD off guard if the 8800XT is truly going to be an "RTX 4080 Killer" for $550.
I can't wait to see the 10bit 4:2:2 HEVC benchmarks! If it outperforms the A580 I'll swap out my 3090 for this.
Intel needs to gain market share and consumer confidence. Even if they sell these cards at cost!
I would buy one if Intel brings out decent Linux drivers.
Love the B580 specs (4060+ level but with more VRAM) and price. I've been needing a roughly 4060 level card but the 4060 with only 8 GB VRAM just isn't quite worth the price ($300) to me. B580 isn't perfect, but it crosses the bar for what I consider to be good value, or at least good enough value to buy.
I'm looking at all this as someone who, more than a year and a half ago, bought a used (THEN LESS THAN 2-YEAR-OLD) RX 6700 XT (12 GB) for 220 euros.
Exactly what price should be for 1080p gaming, not some 300-400 dollar 8gb card that can barely do RT even with dlss/fsr and frame gen.
If the performance benefits and increase in VRAM are true, then for people who are still running an RTX 20 series or AMD equivalent GPU, this is a good deal (especially if they don't plan on anything other than 1080p high).
You’ve got the same desktop background I do! 😂
Whats it called ?
They need to have RX 6700 XT performance and good drivers to succeed; only matching the 4060 & 7600 would not be enough
Please tell me this is a 2-slot design! That looks thicker than the A750 that I have.
This $250 GPU will be great for 1080p medium settings for a very long time.
this is 1440p on ultra settings.....
@srobeck77 the A770 barely runs games at 1080p 60, how do you expect this card to even come close to 1440p Ultra?
I wouldn't mind trying Intel if the performance is similar to or better than my 4060 with more VRAM. I'm pretty agnostic about hardware if it works well.
I think Intel will win 20% market share. This will go hard in South America and Asia; in most countries people can't just drop $800 on a new GPU
I shop for cards in this price range because I am 3rd-world poor, I can't afford more than 1080p, and 1080p is what I am looking for. If the B580 can be a decent upgrade over my 6650XT, then I will buy it.
Let's keep in mind: Nvidia and AMD DO NOT include all hardware encoders and decoders on cheaper GPUs.
Intel's A/B 380, 580 etc do.
Therefore it's a win win for budget gamers and content creators.
Keep in mind, the specs on these Intel GPUs would have been high-end shader/CUDA core counts 4 to 5 years ago.
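If you want to check exactly what decode/encode support a given card and driver actually expose rather than trusting the spec sheet, here's a rough sketch for Linux; it assumes VA-API is set up and your ffmpeg build has hardware acceleration enabled, and the exact profile names vary by driver version:

vainfo
# look for the AV1/HEVC entries in the profile list, then:
ffmpeg -hide_banner -hwaccels
ffmpeg -hide_banner -decoders | grep -E 'qsv|av1'

On Windows, an ffmpeg build with QSV support will report the same encoder/decoder lists.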
As someone who doesn't spend more than $800 on a GPU, yeah I hope this decimates Nvidia and AMD on the low end.....
I'm looking for an upgrade for my RX Vega 56 and have been eyeing the 3070. The 4060 is around that performance so I might get this honestly
Move over AMD, Intel is taking your spot in the market
6:23 I mean, for being tech guys I don't know why you chose these colors on the column graph. Can't see anything since it's almost the same as the background color. It's honestly pretty confusing throughout the video. And I'm not even colorblind
This could be Intel's redemption Arc.
These might be nice for some older builds too, but they pretty much will require support for resizable bar which defeats the purpose in older builds that won't support the feature. Still - promising stuff from Intel. Maybe the prices will even drop below MSRP in future B-)
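On the Resizable BAR point above: if you're not sure whether an older board + CPU combo actually exposes it, here's a quick check sketch for Linux, assuming the card shows up as a normal VGA-class device (the exact capability wording can differ between kernel versions):

sudo lspci -vv -d ::0300 | grep -i -A3 'resizable'

If nothing Resizable BAR related shows up even after a BIOS update, these cards are probably the wrong pick for that build; on Windows, utilities like GPU-Z report a Resizable BAR status line.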
A+ for 1440p gamers
Also doesn't help that it's only x8 lanes. PCIe Gen 3 systems will lose out on performance too because of that