Hey! For the testing in this video, I did not have ReBar turned on. So I did a follow up video looking at how much of a difference this single setting made to the airbus GPU's gaming performance. Go check it out here: ua-cam.com/video/4itS-I_Xtlw/v-deo.html
It's still private, lel
@@ftchannel21 20 more minutes. 😃
Yeah in this video it seems to be slightly better than a gtx 960 except in cyberpunk
if you reply ill send you a pokemon card
So no Rebar - which Intel has stated needs to be enabled... Maybe you will try testing in Win95 next..
Honestly, describing the Intel ARC GPU as a very smart drug addict was genius. Never change, Dawid, your commentary never ceases to brighten up murky days.
If this was a YLYL video, I would have lost it multiple times... so funny. lol
His random analogies are the reason I subscribed haha
I have to admit his cheeky comments brighten my day as well 🍻
😂
Yeah, I totally agree with this.
I really want these Intel graphics cards to do well, having another competitor for Nvidia's shenanigans would be good. And these cards don't seem to be far off, with better drivers and optimizations, the second or third generation could be very compelling.
At least pre-COVID, GPUs were meant to get the price drops they had coming, but scalpers ruined it. I still agree tho.
Well, Intel's doing pretty good considering they're just starting out. Think about it: AMD bought ATI, while Intel is starting from scratch, and they're already able to not just game but also do ray tracing to some degree. It's rather impressive when you think about it.
Everyone wants these cards to do well. Only idiot fanboys don't. Competition is good for everyone.
@@crylune I wish they were out sooner, but better late than never🤣
@@raven4k998 That's not a fair comparison, considering Intel hired a person from AMD to design the Intel GPU. It's also not fair to compare Intel or AMD to NVIDIA on ray tracing, because the DXR code is programmed around NVIDIA, requiring very complex drivers to compensate. This won't be the case with games developed on a real DirectX 12 engine like Unreal Engine 5, rather than some jury-rigged DirectX 11 engine with a "DirectX 12 feature" bolted on non-natively. The major purpose of DirectX 12 was to eliminate all this driver nonsense and simplify the process by natively supporting software and hardware RT within the engine code itself. NVIDIA will still have to deal with drivers for every game because the code has to travel outside of the engine's workspace to utilize dedicated cores. AMD and Intel have the advantage of letting developers utilize them easily because the allocated cores are within the same silicon. There's also the fact that AMD's shader cores have become so fast and efficient that software RT like Lumen can be handled without the need for dedicated hardware. The differences between Lumen and dedicated ray tracing have recently been shown to be minor, and it will only get better. NVIDIA still doesn't hold a candle to AMD's pure shader-core performance.
Did you double check you had Resizable BAR on? The Intel GPUs lose a ton of speed if it's not active.
yes he did
@@L4ftyOne You know how?
@@kidShibuya It's a BIOS/UEFI setting on supported hardware. The CPU, motherboard and GPU must all support the feature in order to use Resizable BAR.
Dawid did not mention anything he changed in the bios to enable resizable bar, so it's possible he didn't have it on
@@konrad999999 12th gen *should* have it enabled by default so if he didn't change anything then it should be enabled
I like how Dawid just names the gpu "Intel Airbus" and sticks with it throughout the video 😂
I was just going to comment that 😂😂
It's a thing he does in most videos
His consistency is impeccable
And A380 instead of A770 at the beginning of the video
I’m an avgeek and I approve of his jokes
All the stuttering in these games makes it look like ReBAR is disabled. Arc really likes ReBAR; it's a setting in the BIOS that really needs to be enabled.
Very likely. I've seen tests of this card and it was looking like a competent 1440p card in new titles, so having performance like this at 1080p looks strange.
@@SirSleepwalker intel gpus seem weird. They perform better at higher resolutions for some reason.
@@Derpynewb I saw Digital Foundry's review of the Arc GPUs and ray tracing seems a lot better in their testing, and they did it for both 1080p and 1440p. I guess Intel cards are a little gimmicky right now.
it was on, he clarified it
@@Derpynewb Well, it is from scratch and new. They are well optimized for newer tech.
As a few others have said, Intel's cards really need ReBAR. It's a decent card at the price, if they can sort out the drivers.
Right. Especially if you acknowledge that this is the 1st generation with 1st gen drivers!!!
Wtf is rebar?
@@raptorhacker599 Resizable BAR; basically it lets the CPU access the GPU memory.
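For anyone wanting to confirm ReBAR is actually active (and not just toggled in the BIOS), here's a rough sketch that reads the GPU's BAR sizes from sysfs. It assumes a Linux box and a made-up PCI address, so adjust for your system; on Windows, tools like GPU-Z report the same thing.

```python
# Rough sketch, Linux only: list the PCI BAR sizes for the GPU. With Resizable
# BAR active on a 16 GB A770 you'd expect to see a region of roughly 16 GiB;
# without it, the largest BAR is typically just 256 MiB.
from pathlib import Path

GPU_PCI_ADDR = "0000:03:00.0"  # hypothetical slot; find yours with `lspci | grep -i vga`

resource_file = Path(f"/sys/bus/pci/devices/{GPU_PCI_ADDR}/resource")
for i, line in enumerate(resource_file.read_text().splitlines()):
    start, end, _flags = (int(x, 16) for x in line.split())
    if end > start:  # skip unused (all-zero) regions
        size_gib = (end - start + 1) / 2**30
        print(f"region {i}: {size_gib:.2f} GiB")
```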
G'day Steve, at $699AUD the A770 16GB isn't priced very well down here in Australia as you can get a RX6700XT or 3060Ti for the same $$$
@@raptorhacker599 Steel bars used to reinforce concrete. :P
If the game has an option for Vulkan, try that, as it's better supported than DirectX at the moment. People have been using a DX11 to Vulkan wrapper and getting large performance gains.
yeah dxvk
So Intel arc GPU benefitted most from the work done on Steam deck and Linux gaming, everyone wins in open source
To be honest, you also get performance gains with DirectX to Vulkan translation on other GPUs as well. I think there are still 2-year-old benchmarks showing that you can get around 20% more performance in World of Warcraft, for example... even comparing DX12 with DXVK.
I just run Pop!_OS with Vulkan. I started using it a little more than a year ago when it was barely supported, and now every game supports it and every game is on Proton.
as a linux user it sounds so strange hearing about dxvk on windows XD, some of my friends who run windows did get huge performance gains in games that use old dx versions
Other reviews had Intel somehow more competitive at 1440p. Like 1/2 the performance at 1080, but 3/4 the performance at 1440.
Bottom line, Intel succeeded in sending a mid-tier performing GPU to market at (somewhat) reasonable pricing, something that neither Nvidia nor AMD really does anymore. Some of us are still on the Nvidia 10 series and are just excited to see more variety entering the second-hand market, and competition for price efficiency from here.
The 6600 XT performs about on par without RT and is a lot cheaper, so?
This card costs about the same as a 3060 and performs like one, just with more issues, so I don't find it good value: performance isn't better, price isn't better, and driver quality is worse.
Nothing else to be said, really.
@@ole-martinbroz8590 They do compete a lot better for professional applications. Their encoder is insane, and they seem to perform better in professional rendering than game rendering. They don't have to be the best on their first generation, so long as they have a market. Though I will say, I wasn't aware of just how little difference there was between AMD's low-end stuff. I'm happy with the card I have for the next few years, so I didn't do that much research.
They've already laid off the graphics division I'm afraid.
man you guys claim to know so much and can't even look for 5 seconds at newegg to see even RX 6700XT are now found at $350....
@@WaterZer0 Wait, actually? So they've given up after the first generation? They can't have expected to profit on the first go, that's insane.
I want Intel to keep creating GPUs and increasing support for older games. I might not buy one of their GPUs this generation but I definitely would consider buying their 2nd or 3rd generation if they continue. For a first attempt at a GPU, everyone seems to expect them to be perfect. As a computer engineer myself I just want to say that this is hella impressive that they got the game support and performance that they did for their first attempt! I’m excited to see them continue in this space.
Yes, but rather than being impressive from an engineering standpoint, it needs to be actually much better than AMD/Nvidia cards at the same price point in practical day-to-day use to get reasonable sales, since it doesn't have any brand recognition. I don't see that coming, so let's hope Intel's plan for the 1st gen is to be a sort of beta test, with only the 2nd gen and above being a commercial success.
I really hope these cards continue to get better with driver updates. Nvidia/AMD need serious competition!
You wish, but you won't buy one, will you?
@@franciscocornelio5776 Actually, considering their price vs similar speed and spec cards from AMD and Nvidia, I would buy one if it gets close to its competitors. If I were to buy a GPU, one that's $50 lower than Nvidia but only around 10 fps slower is a really good product.
@@KiotheCloud I only want one for Minecraft
@@SMCwasTaken if you want a graphics card for minecraft you aren't going to need much. just get a low profile card
I checked out my local MicroCenter and they had one Arc GPU, an Arc 750. On the web page showing Intel GPUs (only the one) was this line:
"When it comes to purchasing a GPU, the first decision you need to make is between an NVIDIA graphics card or an AMD graphics card."
tbf, as funny as that is, it's more likely they haven't yet gotten around to updating their site templates than it being meant as a jab. These "blurbs" above product categories (like GPUs, even for specific brands) are part of the template for the product category, and it's not often you get a new contender in a space like GPUs, so this isn't necessarily something you specifically check or that even has "high priority".
Living anywhere near a microcenter is a blessing…
In future laptops you will only need to choose either an Intel CPU/GPU or an AMD CPU/GPU.
@@kobolds638 hopefully nvidia goes the way of voodoo then
The Intel A7XX cards do seem to be more optimized for 1440p. These results probably would have been more interesting at that resolution.
It's because of the bus speed, which is better than even the 3080's I believe, so it does really well at not dropping as much fps at 1440.
I like that Intel has entered the GPU market, because most if not all cards are built with NVIDIA or AMD chips. It looks like they need to sort things out overall, but it's a promising start. Intel also has great packaging: in 2016 or so they used an IR soundboard to play the Intel jingle every time you opened a CPU box, just like those musical birthday cards. I cut them out and put them in unsuspecting places such as a cabinet or by a door; walk into the room and you'd hear the jingle.
Intel has a jingle?
Finally a David video where I actually own the product. So far I've been loving mine. It absolutely crushes for AV1 and video transcoding, and it renders quite well too.
I've been wondering how well it handles video rendering for live streaming. Thank you for your comment!
All the big tech reviewers only review GPUs for their gaming performance. I wish they would cover video rendering too
@@tvHTHtv For streaming it is more than enough. Right now mine is used to encode an 800p 60fps twitch stream, and record the gameplay and my webcam at 1600p and 1080p respectively. It seems to do quite well for that in OBS.
@@tvHTHtv Are you familiar with EposVox? His video, "Intel GPUs are NOT what anyone expected" covers video performance starting at 7:56.
I agree. I used my A770 to stream 1080p60 to do a UA-cam stream test while playing Scum the other day and it worked great. A few friends said the video quality looked excellent. The only problem I had was I think I picked the wrong audio settings, so it was only picking up my mic audio. My fault.
Love how the A770 is the price the top end Nvidia cards should be. Good move there intel.
Yeh from the early 2000s, I miss those days.
@@michaelbrindley4363 - Me too Michael, when the quality of games was great and it didn't cost the earth. Now we've got remakes generally, poor innovation and GPU pricing which is a joke.
But then doesn't that mean the Intel gpu is overpriced? Considering the performance doesn't match the top nvidia gpus
@@psychologicalFudge I think he means that it competes with 3060 and is a much better deal
@auritro3903 I mean, the 3060 is 40%+ more powerful, so idk if it really competes with that. That's not even counting DLSS. Either way, OP did say high-end, so I assumed he meant the 40 series... which are even more powerful.
ARC GPUS are smexy
True
now that's a word I haven't seen in ages
ARC GPUS are smelly.
@@Gatorade69 you're smelly
@@Gatorade69 tbh I'd rather take the smell over the extreme "gAmEr" look a certain little green company uses.
I really hope this architecture takes off and matures well. My next PC might be all Intel!
The fps that you saw in DX9/10 is a result of the card not having native drivers for DX9, 10, and I think 11 too.
Intel focused on creating DX12 and Vulkan drivers first.
You are still able to play those games because they added a translation layer that translates from DX9/10/11 to DX12.
Translating is also somewhat CPU intensive, which can be seen on the graphs.
What I personally would do is use DXVK instead of translating to DX12,
because DXVK seems to be more mature thanks to its wide use on Linux.
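If anyone wants to try the DXVK route on Windows, the usual approach is just dropping the DXVK DLLs next to the game's exe. Here's a minimal sketch of that copy step; both paths are placeholder assumptions, and 32-bit games need the DLLs from the x32 folder instead.

```python
# Minimal sketch of the usual DXVK-on-Windows setup: copy the DXVK DLLs next to
# the game's .exe so the game loads them instead of the system Direct3D DLLs.
# Both paths are placeholders -- point DXVK_DIR at an extracted DXVK release
# (use its x32 folder for 32-bit games) and GAME_DIR at the folder with the exe.
import shutil
from pathlib import Path

DXVK_DIR = Path(r"C:\tools\dxvk\x64")          # hypothetical extract location
GAME_DIR = Path(r"C:\Games\SomeOldDX11Game")   # hypothetical game folder

for dll in ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"):
    src = DXVK_DIR / dll
    if src.exists():
        shutil.copy2(src, GAME_DIR / dll)      # overwrites any earlier copy
        print(f"installed {dll}")

# To undo, delete those DLLs from the game folder and the game falls back to
# the normal Direct3D runtime.
```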
Haha, the first thing I thought of when I saw the naming of the ARC GPUs was Airbus too! If they can optimize the drivers better, then it might be a great buy.
Fr, same. Especially the a380 one.
i saw the ray tracing videos a few months back, it even did as good as the 3060ti in some cases
On the linux drivers there's a few tests where it hit *3080* levels. Currently outliers, but a sign of how far these might stretch with heavy driver optimizing.
@@XiaOmegaX that would make them incredible for blender rendering workloads since they rely so heavily on ray tracing now.
I started watching when you had 5k subs. Now within days you will reach 500k. Congrats!!
I actually got to build a system with an A770, and honestly, in the stability testing I did before handing it off to the customer, it was surprisingly good in the RT department. Optimization was horrible, requiring a ton of tweaking and workarounds, but the results were actually worth it.
Properly dialed in, it was about even with a 3060, maybe a 3060 Ti in some titles, from a straight FPS standpoint. From an RT standpoint though? It was closer to a 3070 when properly dialed in. For a first gen? That's not bad, but you have to like tinkering if you're going to buy this one in its current state. It has a lot of the "teething issues" Radeon had early on, but much better architecture, so it looks promising if they can get the optimization side of it handled.
Random video idea:
Buy one of those screens that they sell on Amazon or other sites, and play games on it solely for a week or longer.
These screens are the sizes made for cars or other devices, after cutting the materials for larger screens. It is my understanding that a screen is made as a larger surface, and then cut to size, so any leftovers can be manufactured as displays, or they are waste. It sounds like it could be a cool video.
Great as always. I look forward to new videos with that Intel also-ran. I hope they stay in the GPU game and offer a real 3rd option.
It was impressive how the box had a blue glow as he opened it!
That wasn't the box.
I'm sure you've read enough comments about DXVK and 'Rebar' at this point... I've been watching a number of videos about these cards, and yes, the driver situation needs help too... But I've seen better, much better, performance from these cards. I'm very happy to see competition to Nvidia/Radeon - which feels really odd to say since it's Intel... LOL.
Oh no, Intel is let down by the performance. They wanted more, but sure, at least they priced it appropriately for the market. Still, this first gen is putting the whole project in tepid waters.
Holy shit, this thing's potential is insane. If they can work out the driver problems... damn, I might grab one myself later.
Yeah... It's actually reasonably priced as well
The great Control thumbnail is back!
I couldn't resist that click bait! 😄
Did you remember to enable reBAR?
The performance is crippled without it.
If ReBAR was not being used, all of these benchmarks mean nothing.
It doesn't look like it was. To be honest, this is a fantastic card with it on, but a surprising number of people don't know it needs to be used
12th gen should be turning on that by default
He said it was enabled in a post on one of these comments
Why is Dawid actually "ignoring" the poor frametimes?
It seems like this is a really great mid-level GPU that just lets you use ray tracing, whereas most mid-level GPUs would not take it as well. Very interesting!
I have the same card running 4K resolution at 45-55 fps. I’ve never seen something so beautiful on my monitor
please do an SLI Arc build - the fun part is finding compatible hardware ;)
also I've heard DXVK (DirectX over Vulkan) development is coming along pretty well on the Arc series
YES! FINALLY! Someone mentions the Airbus naming scheme. Intel has so clearly lifted it from Airbus that I honestly cannot believe nobody has mentioned it yet😁
Ahhh, Dawid's classic Control thumbnail. You know he is doing a GPU test.
Someone should make a dedicated ray tracing card. It would go in the second PCIe slot alongside the GPU and would allow you to max out ray tracing without adding even a single percent of GPU usage, meaning even people with lower-end cards that don't have RT capabilities would get to experience ray tracing. Also, having a dedicated ray tracing card would allow developers to go crazy with ray tracing.
There might be PCI-E limitations regarding bandwidth. RT cores are integrated into the GPU so the bandwidth is high and latency is low.
Like a photonic chip card. Then ray tracing would be instant.
Bad idea. RT cards are already basically doing that because they have dedicated cores for that. So it's not hurting shader performance anyways. All you're doing is adding latency and complication, while reducing bandwidth.
There is one Dawid sentence in this video that pretty much sums up the ray tracing: 'I think it looks different'
I have high hopes for Intel's GPUs. I hope they get them together; we need some more competition in the GPU space.
Haha, had to laugh every time you called it an airbus! :)
Did you use ReBAR btw? HUGE impact. Pretty sure Petersen even said to buy a 3060 if you don't have ReBAR.
So when it runs well, it runs *very* well! Thanks for the review! Great video
"It's like living with a very smart drug addict"
Sir you earned yourself a Like
2:00 Yeah, I know you were attempting to be funny, but that cardboard is saying that YOU are authorized to use the logo (via the sticker) on your computer as long it has that graphics card installed, not that Intel is authorized to use their own logo (because OF COURSE they are authorized to use their own logo).
It's funny that ray tracing is so unimportant that we often really struggle to tell the difference except in games where the rasterised lighting is super basic (Quake rtx, Minecraft, etc) and we have to rely on looking at the frame rate
If you can't tell the difference between Ray Tracing On and Off you really need to check your eyes :)
I love that Ray Tracing was subtitled as rage racing at 0:09 =)
Wait. Did you swap back & forth between the 770 and a 380? I thought you made a verbal oopsie, but the Airbus pic also had 380 on it.
I picked up the A380 for my HTPC and it is an AV1-compressing Blu-ray beast as well. No crushed blacks, and it fully encodes the 1-2 gig AV1 copy in around 10-18 minutes. Going to check out its HDR performance soon.
seems like a lot of this stuff is driver/compatibility related, so it sorta makes sense. Plus at that pricepoint it's definitely an impressive card
Remember PhysX cards? I think they should make RT cards as an option. I think that would be awesome.
You are not the first person to notice the aviation adjacent naming convention of these cards. It makes me want one all by itself.
Makes sense it has ray tracing performance advantages, considering Intel's GPU attempt 12 years ago, Larrabee, was a ray tracing GPU and ran fully ray traced games back then at 60fps at 1080p, which is mind blowing for something so old. Makes sense they would reuse some of that technology in their second production GPU launch.
The only reason I wanna get an ARC A770 is because of the looks
I want it for the Glue!
@@stephanhart9941 i dont get it, can you explain
@@Zionn69 Gamers Nexus, who reviewed one of the models, said the PCB was tacked with so many thermal pads they were acting like glue.
@@Zionn69 check out the Gamers Nexus teardown. There are some questionable assembly methods employed with the reference A770/750, including glue. Not great for maintenance or reassembly.
There's some tape on the backplate
I'm more excited about the card as a dedicated video encoder, I might end up getting one in the future for my Plex server.
What do you think about its price to performance ratio 🤔
I like that the fans vaguely resemble the main intake fan of a turbofan jet engine, like a big airliner would have.
Seeing Dawid struggle with the terrible launcher that is EGS is always funny
I've been leery of considering an Intel GPU for my system, but Dawid's tests make it look like a decent option... especially for the price. Ngl... I don't hate it! Maybe someday when my EVGA FTW3 3060 Ti Ultra kicks the bucket, I'll consider replacing it with an Intel Arc, because NVIDIA flat out said they are going to price gouge, and AMD will probably be jealous of NVIDIA's revenues so they'll start raising prices too.
It's also worth noting that these are the 1st gen of Intel cards... I would like to think they'll improve in the future.
Thank you! Someone finally acknowledging the airbus naming scheme lol.
"That is an odd smell for a graphics card"
You know you're in for a thorough review of this card when you even get a report on its olfactory status 😅
Soon there will be a collaboration between Intel and Airbus, and Airbus will also compete in the GPU scene lol
Do you even ray trace bro?
Tough day, really needed some Dawid snark to bring up my mood.
I did not leave disappointed.
I know you won't do it, but... I would love to see someone finally testing that card on Linux - with all due respect, it might be the same thing as with Tiger Lake (G5/G7 APU), where it runs much better with the Linux Intel drivers than on Windows
*cat eyes stare* pweety pweeseee
I've heard it doesn't have driver support for Ubuntu yet
@@dillonnnnnn Not really, there actually is support.
Kernel 6.0 has the drivers implemented officially by Intel,
BUT
Mesa3D has already updated their packages with the Intel drivers - so really... even if you take, for example, Mint with kernel 5.15, you will still run that GPU without any issue, and if you use Mesa you will have everything updated to the latest no matter what :D
proper neckbeard linux.
@@ahmetrefikeryilmaz4432 mhm, ok xd
GPU: I fear no man
Dawid's nephew: But that thing it scares me.
Despite all the issues this card has, it looks awesome 😄 I'm more of an aesthetics guy than a performance guy 😅
It's too bad if you look at a teardown video and see that it's held together with glue and tape. Literally. Sacrificing practical parts of the design, like it being easily worked on, for aesthetics is a terrible practice for a company. Even if most of your consumers won't tear it apart, there is quite often a need for some to do so.
I agree 100% the card is so pretty
Looks are superficial.
@@MasterN64 Uh, yeah, that solid, cast base plate is just for show. Get real. They taped some wires out of the way and used some adhesive to secure the back cover. It is not "literally" held together with glue and tape. If anything, the card is overengineered.
@@wargamingrefugee9065 Look, guy, every single major manufacturer of cards avoids tape and glue at practically all costs because they are an absolute nightmare to deal with and put back together in a way that looks decent. The first step to taking it apart is to peel off the big sheet of flimsy metal on the back of the card to get access to the screws; it's held there with double-sided tape. A metal sheet that you will -never- get back on in a way that will look decent again. If you want to understand why, just watch the Gamers Nexus teardown of the card and hear why they say so.
As an avgeek and a tech geek, I approve of your jokes about the A380 GPU plane lol
Also, the term "GPU" exists in aviation too, meaning "Ground Power Unit"
"Non-saggy mounter thing" - ahh, you've seen my Tinder profile
dang, this is one of the best ARC reviews I've seen. Kind of makes me want one.
If they can get their drivers sorted out this is going to be a great card at a fraction of the price of most other options.
I love that you always put the power consumption on the screen, it's relevant to me :D
I like how Dawid compares a GPU to a literal airplane
we love gpu software features such as screen recording.
we love trusting people with when our screen can be viewed and recorded remotely.
i wish microcenter would go international, as someone who doesn't live in america, getting pc parts can be a pain in the ass
My god. I always chuckled at every time the "Intel Airbus" is mentioned. 😂
Best micro center ad I've ever heard.
2:47 IKR I have one of those old cards lying around somewhere and my first thought when I saw that arc GPU was, oh wow it looks a lot like that evga card
Great video as usual. We are located in Ontario, and actually getting hands on an A770 was just luck. I had been calling Canada Computers and Memory Express regularly, but neither could confirm when they would have stock. Memory Express took a pre-order for me on an A750 but said they could not do it for an A770. The Canada Computers site showed stock of the A750 but still "Sold Out" for the A770; it wasn't until I actually checked the per-store stock checker, as you displayed, that it showed an A770 at 3 of the Ontario locations. My pre-order at MemEx still stands but they haven't received any stock.
I love how you speak like a dad but you actually know what you're doing
@El Cactuar thanks 😉
I'm studying to work on A380's among other aircraft.
Sad to realize all the computer nerds don't know that when Dawid says A380 he means the airplane, not the card.
Apparently the recent drivers have sorted out the performance super well, so for a card at this price point it's now roughly equivalent to a 3060
Would be good, where games support it, for somebody to design some add-on levels that specifically look very different, not in "quality" but in obviously noticeable appearance, so you could tell if ray tracing really IS turned on or not. Then you wouldn't be relying on the game to tell you, or squinting and trying to figure out what difference it ought to be making.
Also Dawid, everything you ever say, do, or think is amazing and you're a big genius. I totally give a shit about this channel! [Now for my like-heart, heh heh heh!]
My high ass just realized his name is Dawid and not David after watching him for over a year
What's even weirder is how people still make videos about ray tracing when literally less than 1% of users actually use ray tracing
My only use for Ray Tracing is like an occasional play through with it on in Minecraft lol, 100% agree with you
I use ray tracing
I subscribed because of your humor and also your funny voice, nice video sir. Keep it up. The A770 and AMD's latest GPUs will dominate this price category this year and in 2023.
DX12 Battlefield probably stutters because it's compiling shaders. If you had kept playing that map it probably would have evened out after a bit.
That’s what I was thinking too. But after 15 minutes it was still stuttering like that.
GO DAWID!!!!!! love the videos, if you don’t mind me saying
The fact you put a 4090 in there.... well just made me laugh so much😂
Can't believe I managed to get one of these before Dawid did
Great vid as expected Dawid, has it all, funky smells, new naming to the intel Airbus and the stutter zones and seizures. Did you ever see if there was any difference with the "external gpu setting" the drivers were moaning about?
that quake RTX actually ran super good
In Fortnite, some graphics settings don't take effect until you start a new match, RT among them. Change the settings, return to the lobby, start a new match, and then you'll see a difference not only in FPS but also in visuals. I'm sure someone below already commented about it, but I just experienced this on my own. An RX 6650 XT doesn't trace the rays very well; I got about 20 fps.
"runs very smoothly now" game proceeds to have a stroke
I'm sure if you give the drivers for that GPU a few solid revisions, most of that stutter will get worked out...
1:03-1:08 my thoughts exactly 😂😂😂 Airbus will be furious lmao 😂😂😂
“Miiiiicrooceeeenteeer”. Subscribed. Thanks for that lol
yoooo Dawid, you probably won't see this, but I live in the UK and by pre-ordering 2 days early I was able to instantly get an A380
subscribed because of the "Miiiiiiiiicroooo Ceeeeenter." Every video should have singing.