I hope some people stick with these cards so that Intel can improve and MAYBE actually stick around. With how AMD and Nvidia (especially Nvidia, looking at you 4080) have been doing price-wise recently, we really need Intel to improve, stick around, and compete.
@@AngelA-tj9ok CURRENTLY, their graphics department seems to be dead set on bringing reasonably priced stuff to the market. CURRENTLY. Will it stay that way? Likely not forever, but they are the best hope we have right now for a good competitor. Designing and manufacturing GPUs is an expensive endeavor; it's not something a small dog can really get into and succeed at easily.
I personally blame AMD for the terrible prices of GPUs. Nvidia has the performance and trust lead, so they can and will charge a premium. AMD, on the other hand, lack both but are still playing at the same price level as Nvidia. If AMD dropped their prices by two tiers, they could at least compete.
@@mikeymaiku look, I know I'm hoping in one hand and shitting in the other and I know which will fill up faster, but I still want to hope they can make a lasting impact even if it's, especially right now, looking rough for Intel in the GPU market.
Intel Arc GPUs really aren't that bad; with the new drivers they can now run older games just fine, so Arc is really something to consider when building or upgrading a PC. I'm really excited to see where Arc GPUs go. The naming scheme is bad though: is it an Airbus or a GPU? But that's consistent with Nvidia and AMD.
I think there's a contingent of people that are very conflicted over Arc GPUs. Intel is usually the bully on the CPU side, but now the underdog on the GPU side. And when a new player comes into the fight, the MO is going to be to examine it under a microscope and find everything wrong with it that you can. Like Dawid said, '90% of the time, it just works', but that's a boring video, so to drive engagement it 'helps' to point out every single flaw. The downside is that when you're done it's 'It worked mostly, but here's most of a video with me complaining about it,' and the average YouTube user doesn't grasp the nuance and figures 'It must be crap, look at all the things wrong!'
@@WhiteG60 Overall that's what I got out of all the reviews. It's fine, it works. It doesn't quite compete with the top offerings, though even AMD struggles to keep up with NVidia. Though AMD still owns the efficiency crown, not sure against Intel though.
@@WhiteG60 tbf working fine 90% of the time is still way worse than you usually get from even the worst examples of half-baked Radeon/Geforce launch drivers
DXVK-async is the WAY better version of DXVK, especially if you are using a Windows PC. The files always go in the same folder as the game's .exe. I use DXVK-async for EVERY DX9 game I have, and it resolves a lot of issues, including the Borderlands GOTY Enhanced memory leak. In that game I also run the ISLC program to cap the amount of system memory that can be used. Based on this vid there might be some long-term hope for Intel as a GPU manufacturer, which will be good for the market. Thanks for the vid, Big D.
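For anyone who wants to script the file placement described above, here's a minimal Python sketch. The DLL list and the backup convention are assumptions; check the dxvk-async release archive for the exact set your build ships.

```python
import shutil
from pathlib import Path

# DLLs a typical DXVK build ships -- an assumption; check your release's archive.
DXVK_DLLS = ["d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"]

def install_dxvk(dxvk_dir, game_exe_dir, dlls=DXVK_DLLS):
    """Copy DXVK DLLs into the folder containing the game's .exe,
    backing up any originals as <name>.bak first."""
    src, dst = Path(dxvk_dir), Path(game_exe_dir)
    copied = []
    for name in dlls:
        dll = src / name
        if not dll.is_file():
            continue  # this build doesn't ship that DLL; skip it
        target = dst / name
        if target.exists():
            shutil.copy2(target, dst / (name + ".bak"))  # keep the original
        shutil.copy2(dll, target)
        copied.append(name)
    return copied
```

Deleting the copied DLLs (and restoring the .bak files) reverts the game to the native D3D path.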
I sidegraded from the Vega 64 to the A770. I'm about a month in with the card and it's been treating me well. I mainly swapped out of curiosity and haven't been disappointed yet. My biggest complaint so far is that certain games (e.g. Age of Empires IV) assume you're using Intel integrated graphics. Because of this quirk, you'll have to restart during a game's initial setup, because it defaults to the lowest possible settings. This is true even when the game runs fine maxed out.
@@DawidDoesTechStuff Shhh. Don't tell them! 😂 Love your more down-to-Earth content dude. Just a little nag, though, be careful about the distinction between GB and Gb. I see a lot of these mistakes in your videos.
S.T.A.L.K.E.R. Anomaly has had me obsessed. Cool to see you're playing GAMMA, although I think you'd see fewer performance issues with Anomaly; it has fewer mods but keeps all the good stuff while being a bit easier to run on most systems. It does keep things like Ironman mode, and that mode where you warp into a random AI when you die to carry on playing. Can't beat a bit of gopnik Fallout though, I love it.
@@DawidDoesTechStuff I mean the original Stalker was a bit janky on release, and I loved it for it. As long as it has a similar atmosphere. Also obligatory : GET OUT OF HERE STALKER
I tried DXVK in GAMMA. The easiest way to force it to work is to replace the DLL files in your anomaly/bin folder. That way, even if you launch the modpack, the game has no way to run anything else.
Thanks for the suggestion. Although considering the performance improvements from the driver update, I think the drivers may have implemented that fix natively.
@@DawidDoesTechStuff I think that's most likely, but RivaTuner still indicates it's in DirectX 11 mode, which is weird, because when I tried DXVK with my Tesla M40 it did show Vulkan instead. Maybe Riva can't read what it's actually using at the driver level, but who knows. Intel cards are as weird as they get.
@@montagyuu5163 Uh, no. They are not "held together by hopes and hot glue". If anything, they are overengineered. Also, who services a graphics card?!? I literally still use a graphics card plugged into an accelerated graphics port. It's never been "serviced" outside of blowing the dust/pet hair/only-god-knows-whatever-else out of the thing.
@@wargamingrefugee9065 I've had to repaste GPUs, I'm surprised you've never had to do that, it isn't uncommon to need that performed in their first five years of life. The backplate on the A770 / A750 is literally hot glued on btw.
@@BubbaBearsFriend I simply said I thought they looked nice. Not that I would buy one based on that. You assumed that yourself. Not everyone wants tacky looking RGB covered gear though. There is lots of good hardware out there to suit both tastes. No need to settle for looks alone.
Thanks for taking the time to do this, "it just works!" is pretty much all we ask for with hardware. Intel is new to the GPU game, I really hope that developers embrace it and in time it's just as reliable as the AMD and NVIDIA offerings because, at the end of the day, competition and choice benefits everyone.
My A770 is fine. I don't notice the coil whine anymore, and I've only seen about 3 instances of glitched graphics, which were fixed by a restart. (I also used DDU to install the December update... but couldn't use the control software to update, so I'm not happy about that.)
I was gifted a 3080 but if I had to use my own money, I'd probably try an Arc A770. I really want there to be more competition in the graphics card space, especially with the Arc's price point.
I have been using an Arc A770 since its release date, and I agree with everything here. It's a great experience with minor issues that get fixed weeks after you notice them. It has been a great card, and it looks the best too.
I guess it’s growing pains for Intel’s GPUs. I’m still glad they threw their hat in the GPU ring (more competition is great for the consumer). They don’t have to swim in the deep end either (making expensive GPUs); they can carve out a nice market for themselves in the midrange (who knows if Nvidia is going lower than the 4070).
I'm thinking maybe we should do more revisits as the drivers mature and fixes are made? (I'm a new owner of the A770 16 gb) Thanks for the video Dawid!
I had literally just finished recording my Arc review, which has been a month and two weeks coming because of weird issues. I missed the fact that the screen capture didn't have a hotkey, but I did notice there is no driver-based screenshot hotkey either, like you have with Nvidia or AMD. Also, the telemetry has all sorts of info ... but no frame rate; how crazy is that? I did use it as my main gaming card, and in my usual games it just worked. Other than it looking nicer than my 3070 through the tempered glass, I mostly forgot it was in there as well. Anyway, excellent video as always.
In case nobody has already posted it here: the shortcut keys for screen capturing are Alt+F5 to start and Alt+F7 to end the recording by default (beta driver released on December 5th). Used it several times. :)
This video had me physically in pain from laughing so hard. This man is hilarious and informative at the same time without even trying. Great stuff, man! You earned yourself a subscriber.
The A750 has been fairly nice to me so far. 1440p @ 75Hz feels as good as my RTX 3050. S.T.A.L.K.E.R. Anomaly runs stably and smoothly. The one game I've had a problem with is Ready Or Not; it crashes 20 minutes in when playing the DX12 version. Everything else has played well enough for me to feel good about my $280 investment, compared to my $429 3050.
First thing that popped up for me in Google:
Method 1: Capture tab
1. Open the Intel GCC application.
2. Click Home > Capture.
3. Click Start Recording.
4. Click End Recording.
Method 2: Hotkeys
1. Press Ctrl+Alt+F5 (default) to start recording.
2. Press Ctrl+Alt+F7 (default) to end recording.
Note: Hotkeys must be enabled under System.
I believe that installing that driver update must have also installed new microcode to the GPU itself, which obviously stays even if you downgrade to a previous GPU driver
I've had my Arc A770 16GB for a month now, and so far it's been a blast. I dodged the early driver problems, and most games I play run flawlessly. Sometimes DXVK gives better performance, but that's okay. MSI Afterburner seems to have some problems with Arc, but that happens often with new GPU architectures; waiting for an update. The only thing I can really complain about is the dreaded control panel. It's next to useless and bothers me with constant popups about uninteresting things (internet seems to be interrupted, your device has changed after the display was in power-saving mode). And who in the world is still using overlays in the 2020s? Make an app like everyone else, Intel! It's a good-looking card, but disassembling it for cleaning seems complicated.
Hello Dawid, I'm also using an A770 as my main GPU right now, and I can confirm that its behavior is very sporadic; those issues usually get fixed after a reboot. I first got it in a system with a Strix X670E-I and R7 7700X, and I was getting 50-60 fps at 1080p in RDR2 and A Plague Tale: Requiem. Note that even though the BIOS and GPU-Z say I have ReBAR on, Arc Control tells me no. Then I put it in my main system with a Crosshair VIII X570 Dark Hero and R7 5800X3D, with new drivers, and I was getting 150+ fps with RT at 3440x1440 Cinematic settings in War Thunder, and 100+ fps in Monster Hunter Rise: Sunbreak. But it carries Alma with it... Every time I wake the PC after around 6+ hours of not using it, the A770 either outputs a very yellow-tinted screen (fixable after reboot), locks to 60 fps at the driver level (fixable after reboot), messes up my Realtek audio driver (fixable after reboot), messes up the display of already-saved pictures, or crashes any game running on the D3D12 renderer (fixable after reboot; I can no longer play any D3D12 game if I don't). Also, in this X570 system I used a 3080, 2080 Ti, and 6800 XT before, and I'm currently running dual GPUs with the A770 and 2080 Ti. Aside from the 2080 Ti, all of those GPUs should benefit from ReBAR, but the A770 doesn't. Same issue as with the X670E-I board: even though the BIOS and GPU-Z say ReBAR is on, Arc Control refuses to believe it, and performance does tank in games. I've already contacted Intel support about all this (since they are all AMD systems).
Thanks for adding your experience. It does very much have Alma living in it. It’s interesting that in your case restarting it fixes so many of the problems temporarily. The whole rebar issue is very weird considering that it persists in multiple motherboards. Is this with the latest driver?
@@DawidDoesTechStuff Yeah, with the latest drivers and BIOS. When I was using the A770 in the X670E system I contacted Asus, and they released a new BIOS a week later. I flashed it, and still no ReBAR according to Arc Control. And now in my X570 system, with the latest BIOS and Arc drivers, it still says no ReBAR, even though when I was using the 3080/6800 XT they showed ReBAR/Smart Access Memory as enabled.
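On Linux you can cross-check what the driver actually negotiated, independent of what Arc Control claims, by looking at the card's BAR sizes in `lspci -vv`. Below is a rough heuristic sketch; the 256MB legacy-size threshold and the exact output format are assumptions based on typical lspci output, so verify against your own machine.

```python
import re

def rebar_active(lspci_vv: str) -> bool:
    """Heuristic: treat ReBAR as active when the 'BAR 0: current size'
    reported in the Physical Resizable BAR capability block is larger
    than the legacy 256MB window."""
    m = re.search(r"BAR 0: current size: (\d+)(MB|GB)", lspci_vv)
    if not m:
        return False  # no resizable-BAR capability block found
    size_mb = int(m.group(1)) * (1024 if m.group(2) == "GB" else 1)
    return size_mb > 256
```

Feeding it the `lspci -vv -s <gpu-address>` output for the card would then tell you whether the full-VRAM BAR is actually mapped, regardless of what the vendor control panel reports.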
I'm glad I'm not the only one who couldn't get the capture to work, though I tried on the A380. It would start but the screen would stutter like crazy like I had 1 fps 😂
if the best thing you have to say is that you “didn’t even realize it was in the system” at 9:23, then that’s pretty dang good for Intel. At the end of the day, gamers just want to play games on their systems, not play PC Troubleshooting Simulator 2000. If the experience of using it (aside from screen recording and GAMMA) is on par with a 3080, then that’s a win in Intel’s book, in my opinion.
So would you recommend this GPU for someone like me who needs a photo-editing rig only? I use photoshop and lightroom which is CPU intensive 2D pixel pushing, not 3D gaming graphics. Is it stable outside of the gaming? THANKS.
Late reply but nvidia is generally better with anything relating to adobe software like photoshop (pretty sure it's due to the CUDA cores they have). However, with the prices Nvidia charges, the arc a770 or a750 are probably the best you'll get without spending a ton. Pretty sure they'll work better than an AMD gpu for adobe software and other productivity stuff.
Maybe the new driver fixed the way pre-shader generation is handled, and unless you cleanly reinstall the game AND driver (maybe even the system; I don't know where the shader cache is located) it still loads the better generated shaders out of the cache. For example, Tiny Tina's Wonderlands and Borderlands 3 generate a new shader cache after every game update, which on my system took a while since I played both right after release, so nearly every start came with a game update and thus new shader cache generation. Also: I know I tend to unnecessarily capitalize words. It's just German grammar kicking in subconsciously, and I'm too lazy to bother fixing it when posting on the net.
I've had my A770 for a few days now, and today the PC gave the no-VGA-detected beeps. I re-saved the BIOS, which switched CSM to off (Resizable BAR is on), and it cycled for a bit before booting to desktop. When I first installed it, I had to unplug and replug the HDMI and it worked. I'm still within the 15-day return period at the store I bought it from, so I should be able to exchange it for another if issues persist. The only game I've tried on the latest stable drivers (Mar 15, 2023) is Destiny 2, where it works quite well at 1080p.
I tested an A750 in my PC and gave up after 3 weeks. Without DDU, drivers straight break the PC. After DDU, it mostly works, but some games just keep crashing. What killed it for me was that I couldn't use it for work (OpenCL programming), as it would randomly crash during every single test, and take the entire PC down with it. It's embarrassing for such a large company to throw out so poorly validated drivers and software that their hardware becomes a paperweight.
If performance was just a little better (like 3070ti level) I'd heavily consider it. The hardware is there, this was definitely supposed to be a 3070/3080 competitor looking at the specs but the drivers are very immature and possibly just plain inefficient.
8:43 Did the new drivers install new firmware to the card that applied on restart...? Otherwise I am also lost. Could be a Windows update to some APIs, who knows.
Not sure if anyone else has said this, but in the ModOrganizer window for Stalker, under G.A.M.M.A. fixes (or Disabled), there is a setting called "Turn this on if you stutter." My 3070 Ti runs the game at 144fps+, but 300+ mods are susceptible to stutters, I guess? Who knew 🤷‍♀ Idk what deity this setting calls upon, but I'll tell you this... my game no longer randomly stutters.
Dawid, you're funny af. Were you ever a fan of Top Gear UK? I think I see a little bit of JC in the way you deliver the humor and use ridiculous metaphors to describe things. It makes for some top notch content, thank you!
i'm so torn because i love the look of these gpus, they're super duper sleek and not gamer looking, but it isn't worth the upgrade when the 2070 super i have works on a similar level to the a770.
@Dawid Does Tech Stuff perhaps you should do a video on: 1) what a BIOS is and what it does; 2) whether it's firmware, as someone said (a dear child has many names); 3) who has control of it, the user or the manufacturer; 4) if something goes wrong and it doesn't boot up anymore, who's to blame? That card kind of scares me.
I hope that tech channels like this one do "review after 6 months" or something similar for the arc cards. I think you can cut Intel a little slack for this being their first discrete GPU. Would be interesting to evaluate the price:performance after a few months of driver updates and give everyone an idea of what to expect from battlemage next year.
No one's gonna watch a video where you say 'It's fine. It played my games.' is the problem. There's also a lot of nVidia fanboys out there that will shit on anything that isn't nVidia. That looks to be changing slightly with the 4000 series RTX cards and the pricing, but nVidia's been shitting on consumers for years now, only trickling out updates the way Intel did with CPUs from 2010 until 2019. Only once there's competition do they get their shit together again. The bottom line is, for $300, it's a hell of a card if you can live with the teething issues of the drivers maturing in realtime.
We need competition in the GPU market so badly, so I hope Intel improves Arc and doesn't just quit over a bad experience getting Arc going on their first attempt. I bet they didn't sell that many, but maybe they'll use it as a learning experience and fix all the issues in the next model.
8:30 For some reason this is like what happened with my PC (RX 6600 XT) when playing Forza. I booted it up and got 47 fps at 20% utilisation. I restarted my PC and then got 165 fps for no reason, and it's been like that since.
Well, it's quite easy to get the Intel screen recorder to work, and I'm honestly surprised that a tech channel of this caliber isn't able to figure it out. Here's an easy 24-step guide on what to do:
1. First make sure your Arc card has the current graphical update
2. Navigate to the video settings and set it to 720p and 30 frames
3. Press start recording
4. Open up your file explorer and verify it's recording
5. Grab your wallet and keys
6. Exit your home
7. Lock your doors in case some psycho wants to steal your stuff
8. Walk around your vehicle and certify that it has working turn signals and brake lights
9. If you don't have a vehicle, prepare to walk, bike, or take public transport if available
10. Before you travel, map your way to the nearest hardware store
11. Once you've arrived at the local shop, start browsing for tools
12. Look for the biggest hammer the local flavor carries; you may also use a bat or similar tool
13. Purchase the large tool
14. Navigate your way back home with the tool in tow
15. Enter your home and make certain that no psychos have stolen your stuff
16. Once your home is clear, shut off the screen recorder
17. Open file explorer and verify the recording was successful
18. If the recording was not successful, turn off and unplug your computer
19. If you already have a large hammer or similar tool, you can skip steps 5-15
20. Once your computer is off and unplugged, turn the machine so the GPU is upright
21. Prepare your hammer; be sure to keep your hands at the end of the handle to avoid contact with the PC
22. Destroy the GPU
23. Once the GPU is destroyed, plug in and turn on your computer
24. If your computer boots into your current OS with integrated graphics from the CPU, you have completed this guide
Love the mention of RGinHD. Steve (one of the lesser known Steves in the hardware media space) has a top quality channel that offers a different kind of content which is highly engaging and fun nonetheless.
Maybe the newer driver included a BIOS update for the graphics card that patched whatever was preventing it from getting close to its maximum performance, and since it was a BIOS update it would be persistent on the card, which is why the gains stayed on the old driver.
I can't decide between a 3060/3060 Ti, 6700 XT/RX 6750 XT, and the A770... but I think the A770 falls out because of those problems. For me the 6700 XT seems to be the best value, though I could get the 6750 XT for €30 more.
I would guess the cause for the performance increase had to do with a firmware update that might have installed during the driver update. If that were the case, even if you did remove the updated drivers the card would still correct any board compatibility issues that might have been causing the problems in the first place. Just a thought.
If it's using DXVK, then it should be showing Vulkan instead of D3D11. Typically you just need to put the right version in the main folder, converting DX9, 10, or 11; and for DX12 there is vkd3d.
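To spell out the "right version" part, here's a small lookup sketch of which translation-layer DLLs go next to the game's .exe per D3D version. The per-version lists are from memory, so treat them as assumptions and check the DXVK and vkd3d-proton READMEs for the authoritative sets.

```python
# Which translation-layer DLLs to drop next to the game's .exe, by D3D version.
# D3D9/10/11 are handled by DXVK; D3D12 needs vkd3d-proton instead.
DLLS_BY_API = {
    "d3d9":  ["d3d9.dll"],
    "d3d10": ["d3d10core.dll", "dxgi.dll"],
    "d3d11": ["d3d11.dll", "dxgi.dll"],
    "d3d12": ["d3d12.dll", "d3d12core.dll"],  # vkd3d-proton, not DXVK itself
}

def dlls_for(api: str) -> list[str]:
    """Return the DLL set for a given API name, case/whitespace-insensitive."""
    key = api.strip().lower()
    if key not in DLLS_BY_API:
        raise ValueError(f"unknown API {api!r}; expected one of {sorted(DLLS_BY_API)}")
    return DLLS_BY_API[key]
```

So for a DX11 game like Anomaly's default renderer, `dlls_for("d3d11")` gives the two files to place in the bin folder.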
Anyone else remember that old Nickelodeon gameshow Figure It Out? Something about the intonation of Dawid's amazingly well-contained internal meltdown midway through cuz of Arc's arc-ane screen capture process triggered recall lmao
I doubt Resizable BAR would help performance in older games, since they tend not to have the big textures that would need the larger transfers Resizable BAR allows. Plus, games have to be coded with Resizable BAR in mind, right? So having it enabled for a game that's too old to use it might even hurt performance?
Hi Dawid. Could you make a future video of how to build a gaming PC from the ground up with useful information and tips. That would be greatly appreciated and I'm sure a lot of people would find it useful. Cheers.
To be very fair, this is Intel's first fully realised attempt at a graphics card. The architecture itself is fine, so it should improve once they get their drivers fixed.
I couldn't get a 770 but did get a 750 for the lolz. I've had it for about 3 weeks now. I replaced a RX580 in an old ryzen 1700x based system. Played games like Diablo 3, World of Warships, Doom. It games fine. The software and control panel is wonky as hell like Dawid says. They're decent cards for what most people will ever do with them.
@@bhume7535 Correct. I didn't want to replace a 3080 with it so I used that old system for now. Next month I am putting together a 12400 machine and will test it properly there. But even on an older machine that doesn't have SAM/rebar the card is ok.
I bought one and have been using it for a couple weeks now. The new COD, Monster Hunter Stories 2, Rimworld, Bastion, Civ 5, L4D2, Red Dead 2, Hades and even Tarkov all run perfectly fine. Only game I've had an issue with is Cyberpunk and its only that my framerate didn't change between the ARC and my RX 580. I need to test more older games, but so far I've enjoyed my ARC experience.
Try installing Intel Command Center from the Microsoft Store. Not sure if it's supported on Arc GPUs; my integrated Intel graphics doesn't strictly need it, but I installed it anyway, and it has a hotkey for screen capture that works well for me.
My dad won one of the first home video systems available in the UK in the late 1970s, and it was quite highly specced, with a movie-like camera and a very heavy VCR recording unit. The quality from that outdid many later compact cameras.
Stalker wasn’t being buggy, it’s just that Arc’s unique architecture allowed you to fully experience the Monolith’s presence
Oooh! That makes sense. It was all just immersion.
>Stalker was not being buggy
Haha, nice joke bro! xD But Arc has added some Monolith (and/or Brain Scorcher) experience for sure
It's not a bug, it's an anomaly, comrade
"it's not a bug, it's a feature"
Duuuuuuuude, I am a serious Stalker fan, and the X-Ray engine is awful. But beautiful at the same time.
It makes me happy that GSC got some limelight, considering a load of them had to go to war or relocate to Poland from Ukraine.
Stalker 2 soon!!
Dawid being quicker than LTT here
Rich from Digital Foundry beating both
@@oatsaredelicious2521 who cares who did it first lol
LTT probably has a lot more footage to compile since a big chunk of their company went through with the switch. It is good that we are getting Dawids now. And it will be good when we get the LTT report later.
Yeah, when are we going to see Linus's A770 video... I want to know if they got VR fixed.
@@IHaveBadWiFiBro not what I meant, I'm just happy we're getting more content to enjoy.
The Intel card itself I think looks lovely, especially when it’s running. A+ to the design team
AFAIK they have an ex-Nvidia engineer on the team responsible for the Arc GPUs. I want Intel to stay in the GPU market; a third player is needed to stop the stupidly high prices that Nvidia is charging.
I agree. It is a very nice looking card.
I know right? I'm tired of big bulky GPUs looking like it's a sword or something. Just give me something sleek and stylish. It's why I stuck with reference designs these past few generations. Intel did a fantastic job. It's clean, modern, and simple. Easy to pop into any setup.
@@terminusaquo1980 for that to happen you need to actually buy inel gpus
@@pu239 which people likely will if they make it worthwhile
Perhaps the new drivers did some sort of firmware update to the card. That might persist even if the old drivers were re-installed.
@@avz7448 Yeah someone else was saying that too. I feel like it is important to be able to revert drivers though. I’m not a fan of that approach.
@@DawidDoesTechStuff Drivers, yes. Firmware/BIOS is another story. AMD/Nvidia don't bundle BIOS updates with the driver, but that's mostly because firmware/BIOS is the responsibility of the board partners.
Really? Usually drivers don't do that kind of stuff. A firmware update is usually a more delicate process that the user handles himself. Usually, anyway.
Intel's drivers work more like firmware. Once installed, you cannot install an older version as it updates the card's firmware with each update. I'm not sure if they will change this.
That explains a lot.
That does explain it. Thank you for pointing that out. 👍
That is really stupid. I often had to uninstall a newer version and keep an older version running because it didn't cause any problems.
@@an3k I agree. They also need to make it more clear that that is what’s happening.
So basically all installing the older driver does is get you an older version of the software?
Early adopter here. I noticed the performance boost with the newest driver; many more games improved with it. Awesome! I know more updates are coming, and I'm looking forward to seeing the full potential the Arc GPUs will bring.
Can I ask why you bought an Arc and what you paid for it? I'm considering an upgrade from my GTX 1070, but I was looking to buy a 6700 XT for 300 euro.
@@domeberlin4480 Used, I assume? Because there's no way you get a new 6700 XT for 300 euro.
Anyway, I got an Arc A750 for 280 euro with shipping. I'll have it in my hands in the next few days, and I'm excited to see what it does.
Why? The price was decent, since similar-performing cards are more expensive.
Now it seems Intel has fixed many problems and hasn't quit on these cards. So in general you can use it and forget that it's a first-generation GPU from a new competitor in the sector.
It has the potential to improve even more, considering the hardware.
It looks fine as hell.
@@SIPEROTH Could I ask about your opinion on the arc?
Dawid has so many great one liners. Bipolar honeybadger is awesome.
Haha gotta agree with this. How he comes up with them I do not know.
GREAT description of my ex-wife :)
Yeah, that's a gem.
@@AnnaDoes You better keep your eye on him .... he might be hitting the local dispensaries ! 🙂
He said honey 'BADGER'???? It sounded like honey 'RADGET' or something like that, lol. I wouldn't have known if there wasn't a picture of that stripe-less skunk with the sweet Intel tattoo on the screen.
Was using my a770 for about a month as well (just popped my 6800 back in last night). It honestly wasn't bad other than the fact that there was a big jump in performance I wasn't ready for. It worked really well at 1080p and okay at 1440p. I played through the MW2 campaign which I expected a little more performance out of but was more than playable. P5R ran fine despite being an older game though to be fair, it was built and specced for the PS4 so even my steam machine handles that game fine. GoW I did have to run with DXVK and even then, there were notable dips though overall the game was playable. I got through a good chunk of Spiderman Remastered and a few hours of Control both with RT. Witcher 3 Next Gen was fine after the beta intel patch they just released alongside the hotfix CDPR dropped.
Overall it was fun, though I'd still say that unless you're a power user, I wouldn't get one. It's a lot better after one month, but not quite mainstream enough that anyone can grab one yet. If you care about RT, it is worth considering, as I was running Control fine at 1080p render resolution on medium with the top two RT options on, averaging 65-75 fps. The Witcher was running at 80-ish fps with RT ultra on (FSR2 Quality, 1440p), and Spiderman was fine at 80-90 fps with RT on (native 1440p). My 6800 drops like a rock the second I turn on RT in Witcher and Control.
To be fair though, if RT is something anyone cares about there's no reason to get anything other than a 4090. Every card below the 4090 isn't powerful enough for good RT performance.
@@kenpumphrey8384 on 4K, the Rtx 3080 and 3080 ti are becoming really good budget 1440p cards if you can get them for under 600 used
@@kenpumphrey8384 That's a *slightly* different pricebracket
@@hman6159 I've had 3080s for over 2 years. They're not strong enough for RT, imo. If I were looking for a 600ish dollar card now, a new 6800xt would be a better buy than a used 3080.
@@Incommensurabilities Yes it is, but that's the price decent RT performance costs.
I hope some people stick with using these cards so that Intel can improve and MAYBE actually stick around. With how AMD and Nvidia (especially Nvidia, looking at you 4080) have been doing price wise recently, we really need Intel to improve, stick around, and compete.
Yeah, let's let the company that invented price gouging save us from its apprentices. We need a new player. But a genuinely new one.
@@AngelA-tj9ok CURRENTLY, their graphics department seems to be dead set on bringing reasonably priced stuff to the market. CURRENTLY. Will it stay that way? Likely not forever, but they are the best hope we have right now for a good competitor. Designing and manufacturing GPUs is an expensive endeavor. It's not something a small player can really get into and succeed at very easily.
@@DeckDogs4Life give it more than two generations and you will see how fast they drop the gpu department in regards to high performance graphics
I personally blame AMD for the terrible prices of GPUs. Nvidia has the performance and trust lead, so they can and will charge a premium.
AMD, on the other hand, lacks both but still prices at the same level as Nvidia. If AMD dropped their prices by two tiers, they could at least compete.
@@mikeymaiku look, I know I'm hoping in one hand and shitting in the other and I know which will fill up faster, but I still want to hope they can make a lasting impact even if it's, especially right now, looking rough for Intel in the GPU market.
The price and performance for the GPU is really good for their first shot at this. I can’t wait to see their next generation of Arc GPU’s.
Intel Arc GPUs really aren't that bad. With their new drivers they can now run older games just fine, so Arcs are really something to consider when building or upgrading a PC. I'm really excited to see where Arc GPUs go.
The naming scheme is bad though: is it an Airbus or a GPU? But that seems to be consistent with Nvidia and AMD.
Arc is also the name of some server tech from Intel... so yeah.
I think there's a contingent of people that are very conflicted over Arc GPUs. Intel is usually the bully on the CPU side, but now the underdog on the GPU side. That, and when a new player comes into the fight, the MO is going to be to examine it under a microscope and find everything wrong with it that you can. Like Dawid said, '90% of the time, it just works', but that's a boring video, so to drive engagement, it 'helps' to point out every single flaw. The downside is that when you're done it's 'It worked mostly, but here's most of a video with me complaining about it', and the average YouTube user doesn't grasp the nuance and figures 'It must be crap, look at all the things wrong!'
@@WhiteG60 Overall that's what I got out of all the reviews. It's fine, it works.
It doesn't quite compete with the top offerings, though even AMD struggles to keep up with Nvidia. AMD still owns the efficiency crown, though I'm not sure how that holds up against Intel.
@@WhiteG60 tbf working fine 90% of the time is still way worse than you usually get from even the worst examples of half-baked Radeon/Geforce launch drivers
@@helenHTID You mean, the "not integrated" gpu market. Because Intel has had GPUs for a while now. Just not ones you could willingly buy separately.
DXVK-async is the WAY better version of DXVK especially if you are using a Windows PC. The place the files go is always where the .exe file for the game is. I use DXVK-async for EVERY DX9 game I have and it resolves a lot of issues including the Borderlands GOTY Enhanced memory leak issue. In that game I run the ISLC program as well to cap the amount of system memory that can be used. Based on this vid there might be some long-term hope for intel as a GPU manufacturer which will be good for the market. Thanks for the vid Big D.
I've been running an A380 on an older 3600 and it's been working great for me. The frame rate keeps going up with each driver update.
I sidegraded from the Vega 64 to the A770. I’m about a month in with the card and it’s been treating me well. I mainly swapped out of curiosity and haven’t been disappointed yet. My biggest complaint so far is that certain games (ex:Age of Empires IV) assume you’re using intel integrated graphics. Because of this quirk you’ll have to do a restart on initial setup of games because they default to the lowest possible settings. This is true even when the game runs fine maxed out.
yay another vega 64 user spotted, kinda rare these days :D
Thanks!
Thank you! 😁
I've wanted a good review on an A770 for a while and the fact you tested on a 3440 made me wish I could double subscribe :O
You can! Just press the button twice.
@@nevmiku I think that unsubscribes you. 😂
@@DawidDoesTechStuff Shhh. Don't tell them! 😂
Love your more down-to-Earth content dude. Just a little nag, though, be careful about the distinction between GB and Gb. I see a lot of these mistakes in your videos.
I"ve been living with an intel airplane (a380) since August, I don't regret it.
@6:10 through @6:47 is great. Your experience seems eerily similar to trying desperately to run games and some software on various Linux distros.
S.T.A.L.K.E.R. Anomaly has had me obsessed. Cool to see you're playing GAMMA, although I think you'd see fewer performance issues with Anomaly; it has fewer mods but keeps all the good stuff while being a bit easier to run on most systems. It does keep things like Ironman mode, and that mode where you warp into a random AI when you die to carry on playing. Can't beat a bit of gopnik Fallout though, I love it.
Even Anomaly is hard to run, bro, but I had SO much fun.
It’s made me VERY excited for Stalker 2. Hopefully it’s a decent launch.
@@DawidDoesTechStuff I mean the original Stalker was a bit janky on release, and I loved it for it. As long as it has a similar atmosphere.
Also obligatory : GET OUT OF HERE STALKER
I tried DXVK in GAMMA; the easiest way to force it to work is to replace the DLL files in your anomaly/bin folder.
That way, even if you launch the modpack, the game has no way to run anything else.
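For anyone hunting for the exact steps, here's a minimal sketch of that DLL swap. The paths are stand-ins (point `DXVK_DIR` at your extracted DXVK release and `GAME_BIN` at your Anomaly install; the DLL names are the set DXVK normally ships), and the demo scaffolding at the top is just there so the sketch runs anywhere:

```shell
#!/bin/sh
# Sketch: copy DXVK's DLLs over the ones in Anomaly's bin folder so the
# game loads the Vulkan translation layer instead of native D3D.
# DXVK_DIR and GAME_BIN are hypothetical paths -- point them at your own.
DXVK_DIR="${DXVK_DIR:-/tmp/dxvk-demo/x32}"
GAME_BIN="${GAME_BIN:-/tmp/anomaly-demo/bin}"

# Demo scaffolding so the sketch runs anywhere; remove for real use.
mkdir -p "$DXVK_DIR" "$GAME_BIN"
for dll in d3d9.dll d3d11.dll dxgi.dll; do
  echo "dxvk"   > "$DXVK_DIR/$dll"
  echo "native" > "$GAME_BIN/$dll"
done

# The actual swap: back up each original once, then overwrite it.
for dll in d3d9.dll d3d11.dll dxgi.dll; do
  [ -f "$GAME_BIN/$dll" ] && cp "$GAME_BIN/$dll" "$GAME_BIN/$dll.bak"
  cp "$DXVK_DIR/$dll" "$GAME_BIN/$dll"
done
echo "DXVK DLLs installed into $GAME_BIN"
```

Keeping the .bak copies makes it trivial to undo the swap if the modpack misbehaves.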
Thanks for the suggestion. Although considering the performance improvements from the driver update, I think the drivers may have implemented that fix natively.
@@DawidDoesTechStuff you could also try DXVK-async which is an improvement over DXVK
@@DawidDoesTechStuff I think that's most likely, but RivaTuner still indicates it's in DirectX 11 mode, which is weird, because when I tried DXVK with my Tesla M40 it did show Vulkan instead. Maybe Riva can't read what's actually in use at the driver level, but who knows. Intel cards are as weird as they get.
@@Elinzar Intel modified DXVK in such a way that it still reports DX11 or DX9 even though it runs Vulkan.
ye intel cards are as weird as they get
That was the most useful and efficient conclusion segment ever. Tech YouTubers, take note.
I really love the look of the Arc 700 series. Very understated and classy.
They do look nice. It's a shame they're held together by hopes and hot glue, though. Servicing them would be a nightmare.
Cause the "look" of the card is of paramount importance for a graphics card. 😉. Faint praise if that's the main selling point.
@@montagyuu5163 Uh, no. They are not "held together by hopes and hot glue". If anything, they are overengineered. Also, who services a graphics card?!? I literally still use a graphics card plugged into an accelerated graphics port. It's never been "serviced" outside of blowing the dust/pet hair/only-god-knows-whatever-else out of the thing.
@@wargamingrefugee9065 I've had to repaste GPUs, I'm surprised you've never had to do that, it isn't uncommon to need that performed in their first five years of life. The backplate on the A770 / A750 is literally hot glued on btw.
@@BubbaBearsFriend I simply said I thought they looked nice. Not that I would buy one based on that. You assumed that yourself. Not everyone wants tacky looking RGB covered gear though. There is lots of good hardware out there to suit both tastes. No need to settle for looks alone.
That issue with no hotkey for screen capture is a huge one. Most of my YouTube channel relies on pressing my hotkey assignment to record gaming.
Thanks for taking the time to do this, "it just works!" is pretty much all we ask for with hardware. Intel is new to the GPU game, I really hope that developers embrace it and in time it's just as reliable as the AMD and NVIDIA offerings because, at the end of the day, competition and choice benefits everyone.
My A770 is fine. I don't notice the coil whine anymore, and I've only seen about 3 instances of glitching graphics, which were fixed on restart. (I also used DDU to install the December update... but couldn't use the control software to update, so I'm not happy with that.)
Thanks for the update. I actually also had that update issue with the driver. It seems to be a common problem with it.
I was gifted a 3080 but if I had to use my own money, I'd probably try an Arc A770. I really want there to be more competition in the graphics card space, especially with the Arc's price point.
I have been using an Arc A770 since its release date, and I agree with everything here. It is a great experience with minor issues that get fixed weeks after you notice them. It has been a great card, and it looks the best too.
The side panel dropping is a nice subtle reference for the origin of your idea...
I build these for Intel, mostly the pre-production stuff. We had such high hopes for these cards. The Enterprise stuff is pretty nuts though.
I guess it's growing pains for Intel's GPUs. I'm still glad they threw their hat into the GPU ring (more competition is great for the consumer). They don't have to swim in the deep end either (expensive GPUs); they can carve out a nice market for themselves in the midrange (who knows if Nvidia is going lower than the 4070).
Nvidia is going lower than the 4070, but if the leaks are anything to go by, the 4060 is going to suck. 8 GB of VRAM is not really acceptable at this point.
I'm thinking maybe we should do more revisits as the drivers mature and fixes are made? (I'm a new owner of the A770 16 gb) Thanks for the video Dawid!
I literally just finished recording my Arc review, which has been a month and two weeks coming because of weird issues. I missed the fact that the screen capture doesn't have a hotkey, but I did notice there's no driver-based screenshot hotkey either, like you have with Nvidia or AMD. Also, the telemetry has all sorts of info... but no frame rate. How crazy is that? I did use it as my main gaming card, and in my usual games it just worked; other than looking nicer than my 3070 through the tempered glass, I mostly forgot it was in there as well. Anyway, excellent video as always.
Did you have resizable bar enabled? Apparently that can make a very significant difference to performance with the ARC GPUs.
He mentioned he turned it off and on.
It was on the entire time I used it at home. I did however play around with it while testing Stalker.
@@DawidDoesTechStuff I knew you wouldn’t accept anything else but perfection with the settings 😂😂
You should revisit this video with updated drivers, like a yearly update! I bet there is a huge improvement
In case nobody has posted it here already: the shortcut keys for screen capture are Alt+F5 to start and Alt+F7 to end the recording by default (beta driver released on December 5th). I've used it several times. :)
Your channel is like Kitchen Nightmares. I simply cannot stop watching all of the videos
This video had me physically in pain from laughing so hard. This man is hilarious and informative at the same time without even trying. Great stuff, man! You earned yourself a subscriber.
1:39 What is the name of that GPU?
I thought I was the only one letting the warning label stick to the glass panel even 3 years after building the pc.
The A750 has been fairly nice to me so far. 1440p @ 75 Hz feels as good as my RTX 3050. S.T.A.L.K.E.R. Anomaly runs stably and smoothly. The one game I've had a problem with is Ready or Not; it crashes 20 minutes in when playing the DX12 version. Everything else has played well enough for me to feel good about my $280 investment, rather than my $429 3050.
Bro I got a 3050 brand new for 280cad
@@imc00k The A750 is a lot better than an RTX 3050 so im not sure what your point is?
@@IPendragonI I was addressing the price
@@imc00k Ah yeah my bad missed the $429 3050 part.
First thing that popped up for me in google:
Method 1: Capture tab
Open the Intel GCC application.
Click Home > Capture.
Click Start Recording.
Click End Recording.
Method 2: Hotkeys
Press Ctrl+Alt+F5 (default) to start recording.
Press Ctrl+Alt+F7 (default) to end recording.
Note
Hotkeys must be enabled under System.
I believe that installing that driver update must have also installed new microcode to the GPU itself, which obviously stays even if you downgrade to a previous GPU driver
I've had my Arc A770 16GB for a month now, and so far it's been a blast. I dodged the early driver problems, and most games I play run flawlessly. Sometimes DXVK gives better performance, but that's okay.
MSI Afterburner seems to have some problems with Arc, but that happens often with new GPU architectures. Waiting for an update.
The only thing I can really complain about is the dreaded control panel. It is next to useless and bothers me with constant popups about uninteresting things ("Internet seems to be interrupted", "your device has changed after the display was in power-saving mode"). And who in the world is still using overlays in the 2020s? Make an app like everyone else, Intel!
It is a good looking card, but disassembling it for cleaning seems to be complicated.
Some third party should produce a more normal set of components to make teardowns easier.
Now is the best time to do a review again, seeing that Intel has been improving their drivers and coming out with consistent updates.
Hello Dawid, I'm also using an A770 as my main GPU right now, and I can confirm that its behavior is very sporadic; those issues would usually get fixed after a reboot. I first had it in a system with a Strix X670E-I and an R7 7700X, and I was getting 50-60 fps at 1080p in RDR2 and A Plague Tale: Requiem. Note that even though the BIOS and GPU-Z say ReBAR is on, Arc Control tells me no. Then I put it in my main system with a Crosshair VIII X570 Dark Hero and an R7 5800X3D, with new drivers, and I was getting 150+ fps with RT at 3440x1440 Cinematic settings in War Thunder, and 100+ fps in Monster Hunter Rise SB. But it carries Alma with it... Every time I wake the PC after around 6+ hours of not using it, the A770 either outputs a very yellow-tinted screen (fixable after reboot), locks to 60 fps at the driver level (fixable after reboot), messes up my Realtek audio driver (fixable after reboot), messes up the display of already-saved pictures, or crashes any game running on the D3D12 renderer (fixable after reboot; I can no longer play any DX12 game if I don't). Also, in this X570 system I used a 3080, a 2080 Ti, and a 6800 XT before, and I'm currently running dual GPUs with the A770 and the 2080 Ti. Aside from the 2080 Ti, all those GPUs should benefit from ReBAR, but the A770 doesn't. Same issue as with the X670E-I board: even when the BIOS and GPU-Z say ReBAR is on, Arc Control refuses to believe it, and performance does tank in games. I've already contacted Intel support about all of this (since they are all AMD systems).
Thanks for adding your experience. It does very much have Alma living in it. It’s interesting that in your case restarting it fixes so many of the problems temporarily. The whole rebar issue is very weird considering that it persists in multiple motherboards. Is this with the latest driver?
@@DawidDoesTechStuff Yeah, with the latest drivers and BIOS. When I was using the A770 in the X670E system, I contacted Asus and they released a new BIOS a week later; I flashed it, and still no ReBAR according to Arc Control. And now in my X570 system, with the latest BIOS and Arc drivers, it still says no ReBAR, even though when I was using the 3080/6800XT they showed ReBAR/Smart Access Memory as enabled.
I'm glad I'm not the only one who couldn't get the capture to work, though I tried on the A380. It would start but the screen would stutter like crazy like I had 1 fps 😂
It's funny, because it works great on my A380.
Unless, of course, now it doesn't. Just Arc things.
@@ryanspencer6778 😂yep
Love this channel... Dawid sounds like he just discovered Windows Vista, and all the bugs in it at the same time.
If the best thing you have to say is that you "didn't even realize it was in the system" at 9:23, then that's pretty dang good for Intel. At the end of the day, gamers just want to play games on their systems, not play PC Troubleshooting Simulator 2000. If the experience of using it (save for screen recording and GAMMA) is on par with a 3080, then that's a win in Intel's book, in my opinion.
So would you recommend this GPU for someone like me who needs a photo-editing rig only? I use photoshop and lightroom which is CPU intensive 2D pixel pushing, not 3D gaming graphics. Is it stable outside of the gaming? THANKS.
I'm looking for more info on this, I'm a 3ds max user and want to buy this card soon
Late reply but nvidia is generally better with anything relating to adobe software like photoshop (pretty sure it's due to the CUDA cores they have). However, with the prices Nvidia charges, the arc a770 or a750 are probably the best you'll get without spending a ton. Pretty sure they'll work better than an AMD gpu for adobe software and other productivity stuff.
A770 and 3ds max work fine together.
Maybe the new driver fixed the way pre-shader generation is handled, and unless you cleanly install the game AND the driver (maybe even the system; I don't know where the shader cache is located), it still loads the better-generated shaders out of the cache.
Just as an example: Tiny Tina's Wonderlands and Borderlands 3 generate a new shader cache after every game update, which on my system took a bit of time since I played both right after release, so nearly every start came with a game update and thus new shader cache generation.
Also: I know I tend to unnecessarily put capital letters in words. It's just german grammar kicking in subconsciously. I'm too lazy to bother fixing that when posting on the net.
I have had my A770 for a few days now, and today the PC gave the no-VGA-detected beeps. I re-saved the BIOS settings, which changed CSM to off (Resizable BAR is on), and it cycled for a bit before booting to desktop. When I first installed the card, I had to unplug and replug the HDMI and it worked. I am still in the 15-day return period for the store I bought it at, so I should be able to exchange it for another if issues persist. The only game I have tried on the latest stable drivers (Mar 15, 2023) is Destiny 2, where it works quite well at 1080p.
98% gaming and the odd fingernail creation, guess most of the remaining 2% is porn.
G'day Dawid, Anna & Neko,
I hope you all have a Wonderful Christmas plus a Safe & Happy New Year
I tested an A750 in my PC and gave up after 3 weeks. Without DDU, drivers straight break the PC. After DDU, it mostly works, but some games just keep crashing. What killed it for me was that I couldn't use it for work (OpenCL programming), as it would randomly crash during every single test, and take the entire PC down with it. It's embarrassing for such a large company to throw out so poorly validated drivers and software that their hardware becomes a paperweight.
If performance was just a little better (like 3070ti level) I'd heavily consider it. The hardware is there, this was definitely supposed to be a 3070/3080 competitor looking at the specs but the drivers are very immature and possibly just plain inefficient.
I would like to see these cards succeed. We badly need more competition in the GPU market.
8:43 did the new drivers install a new firmware to the card that applied on restart...? Otherwise I am also lost. Could be a windows update to some APIs who knows.
Not too sure if anyone else has said this, but in the ModOrganizer window for Stalker under G.A.M.M.A fixes (or Disabled) there is a setting called "Turn this on if you stutter," my 3070ti runs the game 144fps+ but 300+ mods are susceptible to stutters I guess? Who knew🤷♀ Idk what Deity this setting calls upon but I'll tell you this... My game no longer randomly stutters.
Dawid, you're funny af. Were you ever a fan of Top Gear UK? I think I see a little bit of JC in the way you deliver the humor and use ridiculous metaphors to describe things. It makes for some top notch content, thank you!
i'm so torn because i love the look of these gpus, they're super duper sleek and not gamer looking, but it isn't worth the upgrade when the 2070 super i have works on a similar level to the a770.
Most of the issues are fixed now because there have been constant driver updates, and with XeSS it's better than AMD.
The hotkey problem is not a problem if you're on Windows 10 or above and have the Xbox Game Bar installed; just hit Win+Alt+R.
Press Ctrl+Alt+F5 to start recording.
Press Ctrl+Alt+F7 to end recording.
For DXVK, I think you have residual DXVK DLLs stuck in the system path, which DDU didn't touch (if it did, it would probably brick DirectX).
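If that hypothesis sounds plausible, one quick way to check is to scan every directory on the search path for DXVK's DLL names. This is just a sketch written for a POSIX shell (on Windows you'd scan %PATH% the same way); the DLL names are the set DXVK normally ships, and any hit outside a game's own folder is a likely leftover:

```shell
#!/bin/sh
# Scan each directory on PATH for the DLL names DXVK ships.
# A hit outside a game's own folder is a likely residual copy.
IFS=':'
for dir in $PATH; do
  for dll in d3d9.dll d3d10core.dll d3d11.dll dxgi.dll; do
    [ -f "$dir/$dll" ] && echo "found: $dir/$dll"
  done
done
echo "scan complete"
```

Anything it flags can then be removed by hand, which DDU won't do for you.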
@Dawid Does Tech Stuff Perhaps you should do a video on: 1) what a BIOS is, and what it does; 2) whether it's firmware, like someone said (a dear child has many names); 3) who has control of it: the user or the manufacturer; 4) if something goes wrong and it doesn't boot up anymore, who's to blame?
That card kind of scares me.
Can't wait to hear your Broken Silicon appearance!
I saw the 3080 hybrid that the system usually has in it. Grabbed one of those in July for $799 from EVGA I’m so glad I didn’t wait
RIP EVGA :(
It's a very pretty card, I hope they can become competitive. We need more competition in the gpu market, badly.
Great catch on the glass!!
I hope that tech channels like this one do "review after 6 months" or something similar for the arc cards. I think you can cut Intel a little slack for this being their first discrete GPU. Would be interesting to evaluate the price:performance after a few months of driver updates and give everyone an idea of what to expect from battlemage next year.
Dawid:its fine
Every other youtuber: DONT BUY IT YOU CANT PLAY SHIT
It's fine in that it's a GPU... but I wouldn't spend GPU money on 'fine'.
No one's gonna watch a video where you say 'It's fine. It played my games.' That's the problem. There are also a lot of Nvidia fanboys out there that will shit on anything that isn't Nvidia. That looks to be changing slightly with the RTX 4000 series cards and their pricing, but Nvidia's been shitting on consumers for years now, only trickling out updates the way Intel did with CPUs from 2010 until 2019. Only once there's competition do they get their act together again.
The bottom line is, for $300, it's a hell of a card if you can live with the teething issues of the drivers maturing in realtime.
We need competition in the GPU market so badly, so I hope Intel improves Arc and doesn't just quit over a bad experience with their first attempt. I bet they didn't sell that many of them, but maybe they will use it as a learning experience and fix all the issues in the next model.
Well, you say that, but have you bought one? I thought the same, so I put my money where my mouth is and got one.
@@SIPEROTH No, but I do plan on getting the partner one with crazy fans; it's OK for a low-spec build. I am pulling for them, though.
Will you try linux gaming on the Arc A770?
8:30 For some reason this is like what happened with my PC (RX 6600 XT) when playing Forza. I booted it up and got 47 fps at 20% utilization. I restarted my PC and then got 165 fps for no reason, and it has been like that since.
Omg the Caution sticker.... OCD KILLING ME......
what ram are you using? at 1:21 it only says 32 gb of ddr4
Well, it's quite easy to get the Intel screen recorder to work, and I'm quite honestly surprised that a tech channel of this caliber isn't able to figure it out. Here's an easy 24-step guide on what to do:
1. First make sure your ARC card has the current graphical update
2. Navigate to the video settings and set it to 720p and 30 frames
3. Press start recording
4. Open up your file explorer and verify it's recording
5. Grab your wallet and keys
6. Exit your home
7. Lock your doors in case some psycho wants to steal your stuff
8. Walk around your vehicle and certify that it has working turn signals and brake lights
9. If you don't have a vehicle, prepare to walk, bike, or take public transport if available
10. Before you travel, map your way to the nearest hardware store
11. Once you've arrived at the local shop, start browsing for tools
12. Look for the biggest hammer that the local flavor carries, you may also use a bat or similar tool
13. Purchase the large tool
14. Navigate your way back home with the tool in tow
15. Enter your home and make certain that no psychos have stolen your stuff
16. Once your home is clear, shut off the screen recorder.
17. Open file explorer and verify the recording was successful
18. If the recording is or was not successful, turn off and unplug your computer
19. If you already have a large hammer or similar tool you can skip steps 5-15
20. Once your computer is off and unplugged, turn the machine so the GPU is upright
21. Prepare your hammer, be sure to keep your hands at the end of the handle to avoid contact with the PC
22. Destroy the GPU
23. Once the GPU is destroyed, plug in and turn on your computer
24. If your computer boots into your current OS with internal graphics from the CPU, you have completed this guide
I am probably the only person using an Arc A770 in a 5,1 Mac Pro, and I am happy with it so far. I just want the drivers to keep getting better.
Love the mention of RGinHD. Steve (one of the lesser known Steves in the hardware media space) has a top quality channel that offers a different kind of content which is highly engaging and fun nonetheless.
Maybe the newer driver included a BIOS update for the graphics card that patched whatever was keeping it from getting close to its maximum performance. Since that would be a BIOS update, it would be persistent on the card, and that's why the gains stayed even on the old driver.
I can't decide between a 3060/3060 Ti, a 6700 XT/RX 6750 XT, and an A770... but I think the A770 falls out because of those problems. For me the 6700 XT seems to be the best value, but I could get the 6750 XT for €30 more.
I would guess the cause for the performance increase had to do with a firmware update that might have installed during the driver update. If that were the case, even if you did remove the updated drivers the card would still correct any board compatibility issues that might have been causing the problems in the first place. Just a thought.
From rumors, it's kinda sad Intel is giving up on these cards so shortly after release.
Just awesome video love watching the channel :)
Excited to see more intel gpu content!
If it's using DXVK, then it should be showing Vulkan instead of D3D11... Typically you just need to put the right version in the main folder, converting either DX9, 10, or 11... and for DX12 there is VKD3D.
Anyone else remember that old Nickelodeon gameshow Figure It Out? Something about the intonation of Dawid's amazingly well-contained internal meltdown midway through cuz of Arc's arc-ane screen capture process triggered recall lmao
I doubt Resizable BAR would help performance in older games, since they tend not to have big textures that would need the larger bandwidth Resizable BAR provides. Plus, games have to be coded with Resizable BAR in mind, right? So having it enabled for a game that's too old to use it might slow performance?
love the blue highlights bro
Dawid, have you tried ALT-F4 on the GPU to start screen capture?
Nah, but I did try deleting system32 which helped. 😂
@@DawidDoesTechStuff ive found deleting FAT32 a better remedy for a problematic pc lol
@@DawidDoesTechStuff can't trust anything that isn't system64
Hi Dawid. Could you make a future video of how to build a gaming PC from the ground up with useful information and tips. That would be greatly appreciated and I'm sure a lot of people would find it useful.
Cheers.
To be very fair, this is Intel's first fully realized attempt at a graphics card. The architecture itself is fine, so it should improve once they get their drivers fixed.
Dawid mentioning playing EFT twice, subconsciously building wipe hype
Dawid, I just bought some be quiet! gear for a new build based on your ads, so make sure they know sponsorship works!
I couldn't get a 770, but I did get a 750 for the lolz. I've had it for about 3 weeks now; it replaced an RX 580 in an old Ryzen 1700X-based system. I've played games like Diablo 3, World of Warships, and Doom. It games fine. The software and control panel are wonky as hell, like Dawid says. They're decent cards for what most people will ever do with them.
You put it in a first gen Ryzen system?! Doesn't that mean you're running without ReBar?
@@bhume7535 Correct. I didn't want to replace a 3080 with it so I used that old system for now. Next month I am putting together a 12400 machine and will test it properly there. But even on an older machine that doesn't have SAM/rebar the card is ok.
DXVK on modded games is a pain to get working tbh, it's not just stalker gamma. It's also pretty difficult to get it working with Skyrim.
You always make my day better Dawid ! It cheers me up in sad times I love watching you video's happy Holidays to you and your Family !
I bought one and have been using it for a couple weeks now. The new COD, Monster Hunter Stories 2, RimWorld, Bastion, Civ 5, L4D2, Red Dead 2, Hades, and even Tarkov all run perfectly fine. The only game I've had an issue with is Cyberpunk, and it's only that my framerate didn't change between the Arc and my RX 580.
I need to test more older games, but so far I've enjoyed my ARC experience.
Try installing Intel Command Center from the Microsoft Store; I'm not sure if it's supported on Arc GPUs. My integrated Intel graphics' drivers don't need it, but I have to install it to get a hotkey for screen capture, and it works well for me.
My dad won one of the first home video systems available in the UK in the late 1970's, and it was quite highly specced with a movie like camera and a very heavy recording VCR unit. The quality from that outdid many later compact cameras.
My parents paid 700 for our first VHS player.
@@psiklops71 Our first VHS player paid my parents 700 to turn it off and leave it alone.
I sold my RTX 2060 Super for an A770 LE. It's really good, tbh. I'm using beta drivers for it, but it's a GPU that can really pack power.