I hope intel is taking notes, this is probably the perfect series of videos they need to get feedback and bug fix everything which is good to hear Linus confirming as well too
They are basically super users. Companies uses people like that to make the products better. They listen to that kind of use more cause those users have enough knowledge to report clearly the problems.
Let TP control it, get rid of RS. Possibly the most cringe set of videos I've watched in recent years, was seeing Tom Peterson with his superior engineering qualifications and enthusiasm, and Ryan Shroud being a complete corporate marketing lackey shutting down the most interesting topics. I get that RS has a job to do, I just can't give him any respect for how every Arc promo video he was in, that he's a complete sell-out. Every. Single. Video.
Oh, they ABSOLUTELY are taking notes. Videos like these are some of the most valuable beta testing they can get outside of the lab. The problem with products that go out into the wild is that very few customers actually write back to them with their gripes and complaints. Most who feel irritated enough to say anything do so in a one-star review with "this card doesn't work" as their critique. Usually the only chance to start a dialogue happens when a customer tries to RMA the card, but the vast majority of people simply shrug at the minor inconveniences or replace the device with something else. "I knew what I was getting into buying the first gen" they'll say, not realizing that they're perpetuating the problem by not saying anything. You can tell Intel is listening, because they were in close communication after each episode. One day we'd be hearing complaints from Linus on his show, then by the next show he'd be saying that Intel reached out to him immediately with driver fixes, patches, vr alpha drivers to test out. And paying attention to the feedback is honestly what they have to do to prioritize how they put out the neverending brushfire of problems that come with delivering a GPU to market. They're hopelessly behind the curve in this generation of cards, but it gives me hope that the next release will be much more stable and not have quite so many quality-of-life problems.
@@vectious2237 Yeah - After watching the WAN show I have been wondering whether this was THE video... It sure is subtle, almost as if Luke was Linuses ventriloquistic puppet :D
Realistically, THIS is what it takes for this stuff to improve. For anyone new to a market, getting heavy usage, and direct feedback and support from any of the biggest, best, or smartest companies/people in the field will lead to massive improvements much faster than normal, because the root of issues can be found and fixed faster. The fact that they were working, essentially hand-in-hand with you guys the entire time, is really awesome to see, and gives me hope that they will continue to improve and succeed
Seeing the story of DXVK is truly impressive. It, almost by itself, got Linux from a "Some games work every once in a while" to "If it wasn't for crazy anti cheat you can play about 99% of all games at good performance".
GOG now is getting DRM'd shit too 😡They didn't have the Mafia soundtrack either I had to actually download it and put it in music and audio folder manually. I hate third party clients so damn much.
... it is impressive IF DXVK is working fine ... In my FC38, Doom Eternal does not launch and Crysis, Crysis Wars and Crysis Warhead has resolution bugs .... If the process to get API trace would not be 10 step copy-paste-install-configure journey, it probably would not be such a hassle to provide quick bug reports..
@@sohiearthIs it really that obvious though? If DXVK turns D3D9/10/11 calls into Vulkan API calls, shouldn't it also let the Vulkan calls through directly? It's just dumb to design a program that translates for another API, but then block the calls and not launch the app when they don't need translation--
Laying the way for an almost entirely new type of product line is going to be tough. I'm just glad they're doing it. It's still a hiking trail at this point, but at least it reaches the destination, and it's being smoothed out. Really looking forward to seeing what it turns into.
@@WeicherKeks Yeah, but those iGPUs were basically meant for business use, ie just to have something to drive a couple of monitors. This is the first time they have to design GPU hardware and software for high-performance workloads.
@@WeicherKeks Those integrated GPUs were something that competed on Low End GPUs. At best. NEVER competed with Medium End. With a TRUE Dedicated GPU. It's like the difference between Gas Powered Scooters and getting into Mopeds or Motorcycles. Still not car or truck level, but waaaaaaay more than they used to be.
Hmm considering how Intel has a good relationship with Microsoft's operating systems. Microsoft having chat gpt. I see Intel implementing davinci codex on the GPU kernel. That 10$b from chips and science act is just a registration fee. AI integrated chips are now a strategic resource commodity. What better way than through the demanding GPU enthusiast sector.
I really am rooting for intel to keep at it and compete with nvidia and amd. It’s just disgusting how much nvida is price gouging and amd isn’t helping gamers either.
Well, not their first, first attempt, but first attempt where they have brought to market something at least marginally competitive. On paper anyways. There's a whole webpage dedicated to intels past attempts. I don't think I ever bookmarked it though and I can't be bothered to google it right now tho.
@@Megalomaniakaal aside from arc, xe graphics a couple years ago and the first intel graphics solution, the i740 in 1998, they've never launched dedicated graphics, especially not commercially available ones. it's always been integrated, either with intel extreme graphics and intel gma on the motherboard northbridge chipset, or built into the cpu on everything since. larrabee never officially launched (and it wasn't aimed at home users anyways it seems). so there's really only the i740, intel xe, and now arc as intel dedicated graphics products.
Intel just announced they have discovered "the huge bottleneck" in the ARC drivers and will be releasing new drivers soon which would eliminate this "bottleneck".
@@mythicalducky Exactly. It's kind of like having a super powerful engine, but it's not optimised and therefore lacks a lot of horsepower. Then when optimised, both in software and physically, it's suddenly competitive. Don't buy into a future promise, but honestly? It DOES look promising for Intel. First release and they can be in the same ballpark as some Nvidia cards, that's no small feat.
@@sujimayne It's actually quite believable. On paper the card is strong enough. In practice it is much slower than it should be in many games. For games that already works well it probably won't make much of a difference, but I'm expecting the same thing they already did it with DX9 games by bundling DXVK in the driver. DXVK also works for DX10-DX11 games but for some reasons that's not enabled in Intel's drivers. I have not tested many games, but in specific cases in Final Fantasy XIV (super crowded areas), I got almost double FPS by adding DXVK myself. (Which can be done easily copying a few DLL files in the game's folder)
Man, I'm reminded just how good of a presenter Luke is. It's been nice getting a nice dose of Luke content outside of the WAN Show lately with the challenges and recent Shadow tour/showcase
@@ethanperez4774 Linux, Arc, and AMD. This video is the conclusion to the Arc challenge, they previously did a challenge where they ran Linux on their main PCs and more recently (I've not actually watched it yet) they've done or started to do an AMD GPU challenge where, like the Arc challenge they use AMD GPUs instead of NVidia.
Curiousity and experimentation, on my part. And I was building an El Cheapo Special project for my Dad anyways, so it's not like he was going to be a performance snob trying to hit 30% extra frames to max out a 144hz 4k display. So he got to be my guinea pig. As long as intel doesn't take their ball and go home, they could make a run, here. Even if they're just fighting for 2nd place in the mid/low tier.
That's good to hear. If they can do good with their driver updates, Intel may be the choice for budget users. And we NEED one given how expensive Nvidia and AMD are getting with their GPU this current gen.
It's clear Intel has been taking notes and making strides during this entire process. I honestly think this will need more research after some hiatus. Honestly deserves a part two - probably near end of this year. Excellent jobs - both of you guys.
I have been testing their cards on my channel, in various community requested games. I show REALY gameplay footage, and dont make subjective comments. I want people to see how well this card really runs. I have found that major youtube is immensely missing this (i get it, its a lot of work, i know personaly), but even so. The Arc GPUs are an IMMENSE development in personal computing, arguably a bigger one than ANY other content these channels are covering... yet its mostly ignored.
@@ubermidget2 They barely did the challenge in the first place. they also didnt document it, or explain in depth what games were played, and on which drivers. It was a nice little opinion piece of tech-tainment, but it lacked the objectivity of something kike GN or related channels. heck. i sold my system and bought an arc system JUST so i could make videos myself out of frustration due to the lack of coverage
@Josi Whitlock I wonder, tho, if they hadn't mentioned it on WAN, would it have been as noticeable? I noticed it because i was already aware of it and looking for it...
As an owner of both an A770 and an ASRock Challenger A750 I can say that the latest driver updates really makes them more than just useable. I expect these cards to just keep on improving as time goes on. They're a good buy now!
Went from a GTX 1060 3GB to the A770 LE n the beginning of December paired with an i5-12600k and z690 mobo. Been having a great time past my first week's struggles. Definitely other cards to consider in the price range but I'm having fun and playing more games than ever!
It would be interesting to see how the arc GPU is doing in a year from now could be a good video too to compare how much better it has become. It's good that intel is actually working hard to fix things they are learning a lot and the next GPU line might be a huge competitor already
Dual host is great, and so is the crunchier tech content. Really hope that this becomes something of a model for labs. As much as I hear all the stuff on WAN Show about LTT making what it does for solid business reasons, the actual tech content has always been really good when you decide to damn the the algorithm and get serious.
I really wish that Luke would be featured in more content similar to this with the dual host layout because it really brings out their dynamic that they have
The "spiky" lag might be caused by their use of DXVK for older direct x games, it's probably the vulkan shader compilation, which is an issue that has been addressed heavily for the next version of DXVK
I'd be interested in seeing you both review it again in 3-6-12 months, just to see the growth and fixed they are deploying. It does seem that the majority of their issues is entirely driver/software based, so its highly likely to get fixed sooner then later at the stage they are in.
Will ARC ever get supported and Good as ATi tech on AMD now? Do we need more parties, or you need to work together on projects, we all use a Car, but one of them does the job in a boat, same results ?
You guys should revisit this. I'm very curious about where it's at now. It's time to upgrade and that 16GB on a 256 bit bus for 350ish bucks is very compelling.
The improvement over the course of the 1 month gives me a lot of hope. I'm excited to see you guys try this challenge again with the 2nd gen Arc release. I'm personally waiting for Intel to have their "Zen 2" moment and (and hopefully they make it that far).
Alchemist is at least their 2nd generation. They had DG1 last generation and decades of integrated GPUs. Seems like they've had an opportunity to work on graphics drivers before asking the general public to pay to participate in this massive beta test.
@@drewdane40 For ARC, you would be wrong. DG1 was a development sample that never hit the market, well, not in discrete GPU's anyway. It is basically the Iris tech that Intel has for their integrated GPUs on a single card, so yeah. The other thing that that some of these issues, were unavoidable. Take the elephant in the room that is Direct X 9 support, which has about a 4 to 10 year life span, 3 API revisions and ALOT of quirks from game to game. Intel has been rather honest about the effort of Direct X 9 support, which is to say.... it going to be a lot of work. What makes this worse is that Direct X 12 doesn't have a emulation layer in for 9... neither does 11, or 10. It one of the things that separates it from OpenGL (but to be fair, that has caused a lot of issues for OpenGL, to the point of Vulkan existing because of them) in that new versions of the API are NOT reverse-compatible and continues to be an issue because, games TODAY still release using the API of DX9. I guess that is why they are force to rely on the WINE subproject DXVK driver, which.... look WINE is VERY, VERY good, but you are not replacing an Windows install with it, same goes for DXVK and DX9, Windows or not. A reasonable point that this video is making that is that we, the consumer, have to gauge our expectations.... even if it's a Multinational, Billion Dollar Corporation the size of Intel. The way that I have been looking at Intel jump into the discrete GPU's, they get two generations tops... to figure things out, after that. Either they hit the ground running or this is dud. Right now, I they they are doing ok.
@@DuvJones You are incorrect. DG1 discrete GPUs absolutely were sold to the public through OEM PC manufacturers. It was widely publicized at the time so you'd have to have been trying really hard to have missed that news. You can buy them today on eBay for about a hundred bucks.
@@drewdane40 No, it wasn't. The DG1 never made production, for reasons know only to Intel. There was no way that Intel made them available to the public, the fact that you find these samples to buy on eBay is really irreverent.
@@DuvJones Intel's own Ark page lists DG1 as "launched," not cancelled. If you Google "DG1 cancelled" the top result is an article titled "Intel officially confirms DG1 GPU is now shipping to OEMs, and DG2 has taped out." It appears you're getting your news from an alternate timeline (or orifice.)
This is a complete aside to the subject matter, but, having two presenters sat chatting is a refreshing change to some of the norms of; single talking head, lead and support, lead and many many supports (oh hey engineering project vids didn't see you there) etc. Brings more of the WAN feel (obviously) while keeping the tight production of a regular LTT video. More of this please as and when the opportunity arises
It would be interesting to see another challenge in six to eight months to see how things have progressed. Not necessarily Linus and Luke this time, maybe someone else can suffer 🤣.
I truly think that having Intel has a new entry in the GPU market can only be a nice thing. I don't care if their first GPU is not perfect. If it allows avg users to get more well priced GPUs in the near future, it already is a win for me.
Not for long if everyone keeps buying amd or nvidia regardless. Intel won't pump out arc just to help you get better prices (which won't happen anyway as nvidia and amd don't and won't take Intel seriously since they know most people won't get Intel) if no one buys their GPU. If arc didn't have the DX issues it would have been worth it somewhat
As long as intel keeps making these great improvements, I can see myself switching from team green to team blue in the future when I am ready to upgrade. Seriously rooting for intel.
Really seems like they’re already killing the competition at price/performance, at least in games where there aren’t major issues. If they can work out their remaining driver showstoppers, they could be the go-to for budget builds.
@@dschwartz783 They're losing money in these cards, plus saying "it works fine in the games that work" says a lot about the quality of these cards, lol.
@@danieloberhofer9035 well, at the very least it’ll get people to consider them, a very new player to the market. Even if the prices go up a bit, I suspect that once they wack enough driver bottlenecks, it’ll still be the best budget option. I hope they manage to make this a success. The market can definitely use a third player, as it seems AMD has left the budget market.
Sources told ComputerBase another big driver update is on the way in February that will further improve performance substantially. Hope you guys keep testing Arc 👍
@Steve Sherman The hopium train is strong around these parts. "Buy our GPUs, guys! It doesnt work in most competitive games, it also is not capable of rendering reflections without glitching everything, but i swear in (n+2) weeks it will be close to a 3060ti!"
It amazes and surprises me sometimes, how I prepared repair procedures in advance. I dropped my e-cig (again) and it broke the glass tank. I smoked 2 real cigarettes, that I saved for that contingency. I was preparing to go on a mission to buy a new part. After taking a nap, I remembered, I had put a spare tank in my parts box, exactly for this contingency. Amazes me, that I had planned for exactly this. My e-cig works now.
It would be interesting to dig through old web archives / forum posts about early AMD and nVidia Cards and compare wether they had similar bugs and lacking support for older games etc. It's great to see Intel improving the drivers for their cards regularly and that they are listening to their customers for feedback.
Can't remember the specific card now, but years ago I had an ATi radeon card and the drivers were pretty janky. Inexplicably poor performance in certain games, even instability, then good performance elsewhere. The drivers did slowly improve the card over the two or three years I had it, but the experience was enough to ensure that my next card was an nvidia. I like the commitment we're seeing from Intel on these drivers though and I feel we desperately need a new runner in the GPU race, so I've taken the plunge and ordered an A770.
As someone that was in the game when Nvidia and AMD (ATI at the time) entered the arena, I can say that they definitely had bugs. But they weren't really that similar to what Intel has now. It was a long time ago and things have changed so much that it's really not even a fair comparison. The biggest bugs were mostly due to the growing pains of 3d rendering. But I will say this. Nvidia had a few issues with their drivers on very early cards. But they got the driver's sorted out fairly quickly. Nvidia was (and still is) good at drivers. AMD/ATI drivers have been pretty hit or miss for me. Some generations they were fine. Others they were horrible. And that horribleness lasted for years and years sometimes. AMD has come a long way since then though. As far as older games not working? That was fairly common. But that was mostly due to games being specifically designed for use on a specific card and architecture. So you had to find workarounds or settle for lackluster ports. I do remember those days mostly fondly. Being there at the birth of the modern PC gaming community and then watching what it has become is something I'm glad I experienced. Having said that, I wouldn't want to relive it from a user/gamer perspective. It was a major pain in the ass most of the time.
Probably not software API support problems (since there simply weren't as many back then), but issues with basic things not always working, absolutely.
My A770 works great. First mistake was putting it in my i7 5960k. The performance was very poor due to no rebar. Once I put it into my 10600k the performance almost tripled. Now everything works great at 1440p. Very happy with it.
This has been a great adventure, I so want to Intel to become the third alternative on my next purchase decision. I hope you Linus and Luke do these kind of videos somewhat regularly after new releases, where you try the cards you review for a month or so; Even the other brands and with updated first hand overall user experience.
@@leeroyjenkins0 yeah, new hardware can be difficult. I'd be curious if it's up and running on pop yet. I would imagine it would be fine in Arch by now.
I'm proud of you as well, i hope you do a couple hundred bug reports so in 4 years i too can buy an Intel GPU... Except then one that's actually good and not a broken underperforming mess.
@@MrTurbo_ I have used one since launch with only one issue. It's easy to look in from outside and think it's a super buggy experience. But for many users, it's been awesome including myself. They've come a long way in just the 3-4 months since release. Depends on your games as well.
@@chronometer9931 well, it's not fast, that's for sure, clearly it still got a ton of issues as you can see in the video, and the fact i mainly play indie games and a lot of older games but at really high resolution and framerate i can't imagine helps much for making the expirience any better
One thing I genuinely don't really understand (and maybe Arc buyers can explain it to me), is why you felt comfortable spending a decent amount of money on an Intel Arc A770, when an AMD RX 6650 XT is actually cheaper and better in every possible way (and, before you say it, no, "faster ray tracing" is not a valid reason for this performance tier). I mean, I would totally give Intel my money if they made a good product (we NEED the competition), but this first generation... is really not it. Spending my hard earned money on something that "mostly works... for newer games" is not something that I would ever consider.
Thank you Linus. Been following u guys since 200k subs. Since then I passed high school...got a degree in electrical engg...now halfway through my pg in vlsi design and still watching u guys make tech content. Keep em coming.
I built T1 to be a resilient monster. It effectively has 12 hard drives. Today, the audio amplifier that does the speakers, repeatedly said it was overheating. I confirmed, it was hot. It happened to be physically next to T1, I tried and couldn't find another fan. T1 donated a case fan (120mm) It works now.
I went from a gtx 1060 6gb to the Arc A770. I'd say from my experience it's a card for people who have had a pc for a while and know how to get around bugs and problems not for someone who is starting out building a pc. But performance wise I couldn't be happier. Intel has been dishing out amazing fps updates and they are not leaving it to rot like game devs do with some games now days. Something else I run the Arc A770 in a 9th gen cpu a intel I9 9900k. so even if you don't have a 10th gen cpu your good to go.
Been using A770 for a month. I have had issues, but I went in knowing all these issues. But then again, I don't really game that much so not really affected in Lightroom and Photoshop use.
You guys should run the Arc performance benchmarks again so you can quantify the gains Intel has achieved. Would be really interesting so keep following the improvements of Alchemist until the next gen arrive.
Lots of respect to Intel for rushing thier butts off to push out a GPU as fast as they did to try to ease the shortage. I just hope they can continue to improve and compete with AMD and NVIDIA
I jumped on the Arc train (keeping a 30 series on the side) and honestly its been pretty solid. The only time I had a problem was no driver update notifications and the new dead space remake gets like 5fps. Other than that i'm pretty happy with it.
@@eric-. Not really sorry, mostly games like Destiny 2, City Skylines, Supreme commander, Warzone, Modded Skyrim, Crysis remastered, Hyperlight drifter, Mass effect series, Cyberpunk, etc.
@@eric-. Searching Forza Horizon A770 on UA-cam, there seems to be a few videos showing how the game runs and it seems to be fine. Focus on most recent videos first if possible, the drivers have improved a lot (and still need to be improved a lot...).
I committed to an Arc A770 and I was surprised how it does blender, besides a couple problem games that wasnt too hard to fix like not running BFV in DX12 (runs fine in DX11), remembering to launch BeamNG in vulkan mode (no longer needed, runs fine in DX11 mode) its been quite solid.
Yep, I know most people here are probably focused on the gaming aspect of GPUs, but the productivity suite abilities of these 1st Gen ARC cards is seriously overlooked. In fact, I'd go so far as to say that I'd call it a great card for productivity-focused buyers given its price point!
@@kwizzeh I'm not saying it's a bad thing, I know it's not LTT's focus, I'm just saying the ARC productivity performance has been flying under the radar.
Check out the products featured in our video at the links below! Buy an Intel Arc A770: geni.us/9IhkN5b Purchases made through some store links may provide some compensation to Linus Media Group.
Out of interest, what issues did you guys have in Minecraft Dungeons? I play it quite a bit with my partner and I've found it alright on the A770! Frame pacing is definitely not as smooth though. That said I did notice FPS drops in the menus at one point but I moved over DXVK files to the game directory and it seemed to get rid of that! It may have just been a driver update though! Hope you do a 6 month or 1 year review down the line!
I’ve been putting out intel arc a770 test videos on my channel, showing game play and various settings. Objective game testing, real footage, no opinions. I have been PLEASANTLY surprised by both its performance and phenomenal driver updates. problem is, I make a video and they put out an update the next day. I mean, it’s a good problem to have 😂😂
I'm not in the gpu market right now, but I wanna thank Luke and Linus for the light they shed and any early adopters for going through these pains, so that when I get back into the market I'll have some great options available thanks to the competition heating up!
This feels like the same issues people seemed to have while trying to tweak settings for GuildWars2. Like turning off reflections because, for some reason, there's some reflective texture way below where you are (water under everything). Or users using DXVK to get stabilization, not necessarily for raw performance.
Exactly, I think I will build a new computer just to experience something where real changes are being made vs just taking tons of my money for what (Nvidia's pricing and even amd's)
Fr, all they need to do is pay attention to their own mistakes and keep making refinements, and in a couple years they'll have a solid competitive product line on their hands. The hardware itself seems to be good, it's just the software side holding Arc back.
Yeah, I think it's very good that influential voices in gaming are constantly expressing distaste for Nvidia's pricing. If they bring the prices a little closer to earth, they'll get a lot of money from me when I build my 4090 behemoth.
I swapped my 3060 12GB for an Arc A770 for my recording and editing rig, and from a productivity standpoint it isn't too bad when paired with a 13th gen i7. I will say however that I've run into more than my fair share of annoyances, most of which seem to be software based rather than hardware. I'm willing to give them a chance in the long run but they really need to ramp up the driver team to get some of those basic fixes out...like the UAC prompt for Arc Control for one.
@@jothain av1, h265 10 bit 4:2:2, 16gb of memory, intel hyper encode, actually more powerful gpu - there are some reasons. the a770 isnt exactly a 3060 in blue - its actually quite a step up in die size and transistor count, also in raw teraflop. the alchemist cards probably have a lot of untapped power sitting in them, its just not clear how much of these resources can actually be accessed by removing software bottlenecks. considering this is first gen for intel, theres probably alot of room for improvements.
@@houssamalucad753 Oh yeah. Just remembered that Intel has been quite good with some hardware video stuff even in some of their integrated graphics before. That would make sense in editing then.
I have 2 A750 Intels. One on my security NVR BlueIrirs. It is a beast recording and transcoding HD camera feeds of 8+cameras. It is a beast and barely gets above 5% on GPU resources. With the previous machine which had a intel 6th gen the GPU aspect was capping high 90%. The second one is for my plex server in a docker container on OMV. Runs great on linux side. All the articles I have read, Intel cards are great for production transcoding video editing and obs. Gaming I have not tried it. I only have AMD Cards and Intel GPU cards on my home. Removed Nvidia from my home. I took a chance with Intel and Nvidia better gets their stuff together.
I hope that there is an update to this considering that Intel just announced a bottle neck fix the other day. Not sure how impactful, if any, this will be but still worth mentioning.
That was the best ending, "The future where I tell you about our sponsor." LOL I am glad Intel is listening and making changes to their ARC GPU. I know it's about time the GPU market got some serious competition. Does anyone know if the ARC GPU drivers are working in OBS fully?
The good thing about videos like this is that intel will get all the details from an end user point of view and HOPEFULLY be improved in the next iteration
That glare bug happened a lot with the the HD 615 for the Surface Go. I could play MHR (with small tweaks and eons of time to allow for shaders to compile), but would get a particular ray of sunshine that would blow out the screen.
Arc survival guide: 1) Delete Arc control for stability (untill that gets reworked, so no OCs for you unless you are a power user and know a bit about computers) 2) Expect perfromance of 3070 levels (its effectively there but just with 16GB VRAM) 3) Dont be MLG, Just play your games for fun. 4) You will be fine if your setup is simple Arc is $400 and its fine, Its absolutely fine and it will get better, probably one of the best 1440p cards too tbh cos that extra vram really helps on modern titles, even if you dont use it fully it has more breathing room.
i love that both luke and linus wrote their script....you can tell the difference in other videos when someone else writes the script. they feel so natural here.
Man, the energy of this episode is amazing. It was clear they were reading off a script, as usual, and there's nothing wrong with that, but Luke and Linus were able to vibe off each other and kind of just hang out a la WAN Show, it seemed, and it was so fun to watch.
Thank you for sharing. My pal and I both bought ARC A770s. The games and the programs we are running work great. Hopefully Intel will clear up the issues you identified.
Hello, I currently see reflection glitches on my NVidia GPU (an GTX 1650) on a lot of games, so I quickly made a demo with Vulkan in C and saw that the problem is REALLY weird, sometimes it works really well, sometimes the glitch happens, sometimes it fixes itself after a short or long waiting time, but the conclusion I got was that there was some features of Vulkan having troubles with memory addresses and memory overflow, I think that the glitch actually happens constantly, but that the end user see it occasionally so maybe the GPU tries a background optimization which results in this bug and that the Intel GPUs does this optimization more to compensate for the lack of performance on slow games, so I am not entirely sure that the problem is caused essentially by Intel, maybe another API in the process may cause this issue.
That "DisplayPort connection might be limited" message I get all the time with my RTX 3090 FE. In my experience, it happens when you connect USB-C to a monitor that allows DP over USB-C functionality, but you're only using the USB-C connection for data (aka the connection is hooked up to a USB host bus and not a GPU).
Despite I'm no intel fan on any front. I found amazing the pace that Intel is putting on the arc, and now with battemage closing by, really shows how much attention they are putting to the outside words, and also how they tackle issues so fast. Nothing like this has been seen in either green or red team. I really hope this works for Intel, we need serious competition on the gpu front.
I recently switched from an RX 5700XT to the A770 LE and it's a world of difference. I went from near constant crashes and an almost unusable GPU to something that actually runs smoothly (for the most part). I actually couldn't be happier with the new hardware.
Thanks for some great episodes on Intet Arc. I hope you will follow up in 6 months - 1 year, to see how better things are at that time. I really want Intel Arc to be a success, but because my gaming station also are my work station, I won't buy a GPU like Arc at the moment, because I has to be sure I don't run in to any issues. I do hope they will get better, and they can challenge Nvidia and AMD in a few years, because the GPU marked really need a competitor to Nvidia and AMD.
It's really interesting to see their drivers improve over time. It's a bit like watching the Switch emulators work out all the weird graphics bugs. Turns out there's a lot of crazy shit under the hood we never really think about, even as game developers.
I'm hoping things continue to improve. I'm fine with "okay" performance, I more just want things to work smoothly. One thing that I miss from the Nvidia GPU I had in my previous laptop was the control panel. It was simple and let me choose what opened with what without having to use the Windows settings app. When the GPU works as intended, it works just fine for me. It's just some road bumps getting it up and going. One specific problem I've had is with BTD6 not using either GPU and Windows settings doesn't help fix it. Neither GPU in task manager will have any utilization after opening it, it all seems to happen on the CPU instead, and I don't have any way to troubleshoot it.
Recipe for lasagna: Ingredients 2 olive oil, plus a little for the dish 750g lean beef mince 90g pack prosciutto tomato sauce 200ml hot beef stock a little grated nutmeg 300g pack fresh lasagne sheets white sauce 125g ball mozzarella, torn into thin strips Method STEP 1 To make the meat sauce, heat 2 tbsp olive oil in a frying pan and cook 750g lean beef mince in two batches for about 10 mins until browned all over. STEP 2 Finely chop 4 slices of prosciutto from a 90g pack, then stir through the meat mixture. STEP 3 Pour over 800g passata or half our basic tomato sauce recipe and 200ml hot beef stock. Add a little grated nutmeg, then season. STEP 4 Bring up to the boil, then simmer for 30 mins until the sauce looks rich. STEP 5 Heat oven to 180C/fan/160C/gas 4 and lightly oil an ovenproof dish (about 30 x 20cm). STEP 6 Spoon one third of the meat sauce into the dish, then cover with some fresh lasagne sheets from a 300g pack. Drizzle over roughly 130g ready-made or homemade white sauce. STEP 7 Repeat until you have 3 layers of pasta. Cover with the remaining 390g white sauce, making sure you can’t see any pasta poking through. STEP 8 Scatter 125g torn mozzarella over the top. STEP 9 Arrange the rest of the prosciutto on top. Bake for 45 mins until the top is bubbling and lightly browned.
Linus, the "Display connection might be limited" message might actually be related to your Asus PG42UQ. As I also experienced that when I was testing with a PG42UQ that I decided to return. I didn't get these messages on my two Acer monitors that I'm daily driving. And just wait until you actually try using HDR on that Asus monitor, which has washed out colors on firmware version V32 and V33 due to some tone mapper problems.
I really hope Intel continues to improve on arc and they go for the same price to performance they did with this generation. When it works the price to performance of the A770 looks damn competitive.
I’m surprise that VR wasn’t a consideration for until from the start when making their GPUs. Granted, the close alpha does begin to address it. But it’s one of those things I wish was a consideration for the launch.
I agree. Especially cuz I think people who do buy the intel arc GPUs are gonna be those who are at the very least somewhat enthusiastic about tech, so there's a higher chance than in other cases that they would own a vr device
@@Funky_Brother Well no, because no one in their right mind would spend as much for their gpu as they would for a gaming accessory that can be used in 3 decent games
@@AnEagle you realise that things like the valve index aren't the only VR hesdsets available? There's stuff like the quest series that are much more affordable?
Honestly vr performance is pretty much the only consideration I personally have now, a mid range card runs anything 2d fine, and if you use vorpx you can run most games in vr, I just find it so much more immersive
I currently run an Arc a750 and apart from annoying UAC messages from Arc Control and a weird rendering bug in Minecraft which was fixed by resetting some settings, my experience has been flawless with Arc. Plus I am loving the performance improvements I keep getting from Intel's driver updates!
Thank you both for doing this challenge I know you both suffered a lot for us but could you in the future come back to this again when intel improves the drivers more ? Thanks again.. Awesome stuff !
full vs. limited color range (grey blacks) - I had the exact same issue on an Intel HD620 on an LG C9... it seems to be due to the driver recognizing the TV as a "TV" and forcing HDTV standards (limiting it to 8 bit YCbCr) instead of treating it as a PC monitor - the moment you switch to HDR, it switches to 10bit/RGB
I had the same issue with my very old Intel HD4000 once I connected my laptop to my TV. Upon searching online, I noticed that this bug has been there for so many Intel graphic cards and still hasn’t been properly resolved until now. I willing to bet my money that they still have the same issues on their ARC GPUs
I hope intel is taking notes, this is probably the perfect series of videos they need to get feedback and bug fix everything which is good to hear Linus confirming as well too
surely they must be!
They are basically super users. Companies uses people like that to make the products better. They listen to that kind of use more cause those users have enough knowledge to report clearly the problems.
Let TP control it, get rid of RS. Possibly the most cringe set of videos I've watched in recent years, was seeing Tom Peterson with his superior engineering qualifications and enthusiasm, and Ryan Shroud being a complete corporate marketing lackey shutting down the most interesting topics.
I get that RS has a job to do, I just can't give him any respect for how every Arc promo video he was in, that he's a complete sell-out. Every. Single. Video.
They absolutely are. I really hope they make a dent in the market. Options never hurt. It's also worth mentioning how cheap the A770 can be.
Oh, they ABSOLUTELY are taking notes. Videos like these are some of the most valuable beta testing they can get outside of the lab. The problem with products that go out into the wild is that very few customers actually write back to them with their gripes and complaints. Most who feel irritated enough to say anything do so in a one-star review with "this card doesn't work" as their critique. Usually the only chance to start a dialogue happens when a customer tries to RMA the card, but the vast majority of people simply shrug at the minor inconveniences or replace the device with something else. "I knew what I was getting into buying the first gen" they'll say, not realizing that they're perpetuating the problem by not saying anything.
You can tell Intel is listening, because they were in close communication after each episode. One day we'd be hearing complaints from Linus on his show, then by the next show he'd be saying that Intel reached out to him immediately with driver fixes, patches, vr alpha drivers to test out. And paying attention to the feedback is honestly what they have to do to prioritize how they put out the neverending brushfire of problems that come with delivering a GPU to market. They're hopelessly behind the curve in this generation of cards, but it gives me hope that the next release will be much more stable and not have quite so many quality-of-life problems.
Knowing that Linus wrote the script for him and Luke makes this funny. You can listen to Luke talk and it sounds like something Linus would say.
You can see Linus mouthing along at some points when Luke is speaking, funny lol.
As I understood it Linus only compacted Luke's points so the information would be more dense
@@vectious2237 Yeah - After watching the WAN show I have been wondering whether this was THE video... It sure is subtle, almost as if Luke was Linuses ventriloquistic puppet :D
That explains why the vid flows so well. I like it a lot
And they both suck their major sponsor as usual by making soft "criticism".
Realistically, THIS is what it takes for this stuff to improve. For anyone new to a market, getting heavy usage, and direct feedback and support from any of the biggest, best, or smartest companies/people in the field will lead to massive improvements much faster than normal, because the root of issues can be found and fixed faster. The fact that they were working, essentially hand-in-hand with you guys the entire time, is really awesome to see, and gives me hope that they will continue to improve and succeed
100% my brother
Haven't thought of that. Thanks for the viewpoint
They need to do the same type of challenge but with retro games. DX8 and below. So many good games were made pre-DX9.
@@woopygoman I really don't know a ton of games which didn't use OpenGL but used pre-DX9 and OpenGL is ok in ARC cards.
@@woopygoman LinusTech is not the right channel for that anyway. This is a channel directed for normies.
Seeing the story of DXVK is truly impressive. It, almost by itself, got Linux from a "Some games work every once in a while" to "If it wasn't for crazy anti cheat you can play about 99% of all games at good performance".
GOG now is getting DRM'd shit too 😡They didn't have the Mafia soundtrack either I had to actually download it and put it in music and audio folder manually. I hate third party clients so damn much.
... it is impressive IF DXVK is working fine ... In my FC38, Doom Eternal does not launch and Crysis, Crysis Wars and Crysis Warhead has resolution bugs ....
If the process to get API trace would not be 10 step copy-paste-install-configure journey, it probably would not be such a hassle to provide quick bug reports..
@@HXunbanned Doom Eternal's id Tech 7 supports only Vulkan, of course it won't work with DXVK.
@@sohiearthIs it really that obvious though? If DXVK turns D3D9/10/11 calls into Vulkan API calls, shouldn't it also let the Vulkan calls through directly? It's just dumb to design a program that translates for another API, but then block the calls and not launch the app when they don't need translation--
Laying the way for an almost entirely new type of product line is going to be tough. I'm just glad they're doing it. It's still a hiking trail at this point, but at least it reaches the destination, and it's being smoothed out.
Really looking forward to seeing what it turns into.
Intel has developed integrated GPUs for decades, I would not really call them newcomers.
@@WeicherKeks Yeah, but those iGPUs were basically meant for business use, ie just to have something to drive a couple of monitors. This is the first time they have to design GPU hardware and software for high-performance workloads.
@@WeicherKeks Those integrated GPUs were something that competed on Low End GPUs. At best. NEVER competed with Medium End. With a TRUE Dedicated GPU. It's like the difference between Gas Powered Scooters and getting into Mopeds or Motorcycles. Still not car or truck level, but waaaaaaay more than they used to be.
Hmm considering how Intel has a good relationship with Microsoft's operating systems. Microsoft having chat gpt. I see Intel implementing davinci codex on the GPU kernel. That 10$b from chips and science act is just a registration fee. AI integrated chips are now a strategic resource commodity. What better way than through the demanding GPU enthusiast sector.
I really am rooting for intel to keep at it and compete with nvidia and amd. It’s just disgusting how much nvida is price gouging and amd isn’t helping gamers either.
If you consider the magnitude of what they are trying to accomplish, I'd say it's pretty awesome how fast things are improving on their first attempt.
Well, not their first, first attempt, but first attempt where they have brought to market something at least marginally competitive. On paper anyways.
There's a whole webpage dedicated to intels past attempts. I don't think I ever bookmarked it though and I can't be bothered to google it right now tho.
@@Megalomaniakaal aside from arc, xe graphics a couple years ago and the first intel graphics solution, the i740 in 1998, they've never launched dedicated graphics, especially not commercially available ones. it's always been integrated, either with intel extreme graphics and intel gma on the motherboard northbridge chipset, or built into the cpu on everything since. larrabee never officially launched (and it wasn't aimed at home users anyways it seems). so there's really only the i740, intel xe, and now arc as intel dedicated graphics products.
@@dan_loeb Yes, the things in between those never were publicly released. Larrabee perhaps came the closest.
aagh not again this first attempt after attempting 25 years
They have integrated gpu all along. Ofc there are a lot of difference but not that unimaginable
Intel just announced they have discovered "the huge bottleneck" in the ARC drivers and will be releasing new drivers soon which would eliminate this "bottleneck".
But don't get your hopes up. There is only so much that drivers can do for fundamentally weak hardware.
@@sujimayne that's the thing i don't know if id call arc fundamentally weak hardware wise
It's the bottleneck their engineers' whiskey flows through?
@@mythicalducky Exactly. It's kind of like having a super powerful engine, but it's not optimised and therefore lacks a lot of horsepower. Then when optimised, both in software and physically, it's suddenly competitive. Don't buy into a future promise, but honestly? It DOES look promising for Intel. First release and they can be in the same ballpark as some Nvidia cards, that's no small feat.
@@sujimayne It's actually quite believable. On paper the card is strong enough. In practice it is much slower than it should be in many games. For games that already works well it probably won't make much of a difference, but I'm expecting the same thing they already did it with DX9 games by bundling DXVK in the driver. DXVK also works for DX10-DX11 games but for some reasons that's not enabled in Intel's drivers. I have not tested many games, but in specific cases in Final Fantasy XIV (super crowded areas), I got almost double FPS by adding DXVK myself. (Which can be done easily copying a few DLL files in the game's folder)
FYI: For me, the DisplayPort message in Windows actually comes from a USB C Hub that I have. That might be the case for you guys too.
Yup, same. Got a "cable matters" hub that does that but works flawlessly otherwise.
Same with a usb hub I have that has HDMI out
I get the same message when I plug my laptop into my monitors integrated kvm using the usb c cable as data and an hdmi cable for video.
RTX4090 here, here it is a DELL Monitor. Never had ARK installed bevore. Comes, when a device for charging is connected over usb c of the monitor
@@stili774 oh cmon stop braggin...
Man, I'm reminded just how good of a presenter Luke is. It's been nice getting a nice dose of Luke content outside of the WAN Show lately with the challenges and recent Shadow tour/showcase
The WAN show is everything to me.
What recent challenges?
@@ethanperez4774 Linux, Arc, and AMD. This video is the conclusion to the Arc challenge, they previously did a challenge where they ran Linux on their main PCs and more recently (I've not actually watched it yet) they've done or started to do an AMD GPU challenge where, like the Arc challenge they use AMD GPUs instead of NVidia.
I've been surprisingly happy with my 770. I had plenty of issues early on, but have been quite impressed since the driver roll-out near Christmas.
What led you to buy one?
@@Mumbolian Price, they are cheap som places
Curiousity and experimentation, on my part. And I was building an El Cheapo Special project for my Dad anyways, so it's not like he was going to be a performance snob trying to hit 30% extra frames to max out a 144hz 4k display. So he got to be my guinea pig. As long as intel doesn't take their ball and go home, they could make a run, here. Even if they're just fighting for 2nd place in the mid/low tier.
That's good to hear. If they can do good with their driver updates, Intel may be the choice for budget users. And we NEED one given how expensive Nvidia and AMD are getting with their GPU this current gen.
Really good to hear. We need more updated reviews as drivers continue to get rolled out and make it better.
It's clear Intel has been taking notes and making strides during this entire process. I honestly think this will need more research after some hiatus. Honestly deserves a part two - probably near end of this year. Excellent jobs - both of you guys.
Maybe soon after battlemage releases both see how the new GPU and drivers are
You read my mind - They should re-run the challenge with 6 or 12 months of new driver updates
@@ubermidget2 Literally said what I was gonna say LMAO
I have been testing their cards on my channel, in various community requested games. I show REALY gameplay footage, and dont make subjective comments.
I want people to see how well this card really runs.
I have found that major youtube is immensely missing this (i get it, its a lot of work, i know personaly), but even so.
The Arc GPUs are an IMMENSE development in personal computing, arguably a bigger one than ANY other content these channels are covering... yet its mostly ignored.
@@ubermidget2 They barely did the challenge in the first place. they also didnt document it, or explain in depth what games were played, and on which drivers.
It was a nice little opinion piece of tech-tainment, but it lacked the objectivity of something kike GN or related channels. heck. i sold my system and bought an arc system JUST so i could make videos myself out of frustration due to the lack of coverage
3:50 Linus agrees so much with Luke, he is mouthing along the script with him.
Ahahahahaha. Laughed so hard at this 😅
they mentioned it in WAN and i cant stop looking at it
@Josi Whitlock I wonder, tho, if they hadn't mentioned it on WAN, would it have been as noticeable? I noticed it because i was already aware of it and looking for it...
Don't they use a teleprompter ?
Same at 12:53 :D
As an owner of both an A770 and an ASRock Challenger A750 I can say that the latest driver updates really makes them more than just useable. I expect these cards to just keep on improving as time goes on. They're a good buy now!
AMD is way ahead and nobody buys their gpus. Nobody is gonna buy these either and thats fair.
@@iHadWaterForDinner yeah if a 6650 xt is selling for 300 brand new and a 3060ti is over 400 for 5% more fps its clear what people are buying lol
@@iHadWaterForDinner honestly for friends' budget builds, I'm personally recommending AMD stuff to pretty much everyone that doesn't need CUDA
The 770 is a really good buy
@MichaelLivote , do you play forza horizon? if so, how's the game performance on latest arc drivers?
Went from a GTX 1060 3GB to the A770 LE n the beginning of December paired with an i5-12600k and z690 mobo. Been having a great time past my first week's struggles. Definitely other cards to consider in the price range but I'm having fun and playing more games than ever!
@@desmasic Lemme know how it goes, I have a 3060 Ti and I'm tempted to try arc 770 before my shift to next generation cards by 2024
It would be interesting to see how the arc GPU is doing in a year from now could be a good video too to compare how much better it has become. It's good that intel is actually working hard to fix things they are learning a lot and the next GPU line might be a huge competitor already
To be honest, format wise, this has been one of my favorite episodes in the last few months, really like the dual host setup you've got going here
Linus and Luke are two peas in a pod.
it's all scripted but made to seem like it isn't .... still a good episode..
Dual host is great, and so is the crunchier tech content. Really hope that this becomes something of a model for labs. As much as I hear all the stuff on WAN Show about LTT making what it does for solid business reasons, the actual tech content has always been really good when you decide to damn the the algorithm and get serious.
Especially with Linus and Luke, but I totally see this working out with others
I really wish that Luke would be featured in more content similar to this with the dual host layout because it really brings out their dynamic that they have
The "spiky" lag might be caused by their use of DXVK for older direct x games, it's probably the vulkan shader compilation, which is an issue that has been addressed heavily for the next version of DXVK
this also affects linux as well, also on their more mature laptop igpus. Hopefully they'll also deal with that in the vulkan native side
Can you not just pre-compile the shaders? I do it on all my Linux machines.
@@luisortega8085 vulkan has been updated to do away with shader compilation stutters
@@AndRei-yc3ti huh really? weird. i get stutters a lot when i play a new game... tested it many times with nfs rivals
@Luis Ortega what distro you on? I'm on arch so I get latest updates.
I'd be interested in seeing you both review it again in 3-6-12 months, just to see the growth and fixed they are deploying. It does seem that the majority of their issues is entirely driver/software based, so its highly likely to get fixed sooner then later at the stage they are in.
Will ARC ever get supported and Good as ATi tech on AMD now?
Do we need more parties, or you need to work together on projects, we all use a Car, but one of them does the job in a boat, same results ?
Just the fact that the ARC team is communicating and actually taking criticism to heart and making solutions makes me want to buy one of these.
You guys should revisit this. I'm very curious about where it's at now. It's time to upgrade and that 16GB on a 256 bit bus for 350ish bucks is very compelling.
The improvement over the course of the 1 month gives me a lot of hope. I'm excited to see you guys try this challenge again with the 2nd gen Arc release. I'm personally waiting for Intel to have their "Zen 2" moment and (and hopefully they make it that far).
Alchemist is at least their 2nd generation. They had DG1 last generation and decades of integrated GPUs. Seems like they've had an opportunity to work on graphics drivers before asking the general public to pay to participate in this massive beta test.
@@drewdane40
For ARC, you would be wrong. DG1 was a development sample that never hit the market, well, not in discrete GPU's anyway. It is basically the Iris tech that Intel has for their integrated GPUs on a single card, so yeah.
The other thing that that some of these issues, were unavoidable. Take the elephant in the room that is Direct X 9 support, which has about a 4 to 10 year life span, 3 API revisions and ALOT of quirks from game to game. Intel has been rather honest about the effort of Direct X 9 support, which is to say.... it going to be a lot of work. What makes this worse is that Direct X 12 doesn't have a emulation layer in for 9... neither does 11, or 10. It one of the things that separates it from OpenGL (but to be fair, that has caused a lot of issues for OpenGL, to the point of Vulkan existing because of them) in that new versions of the API are NOT reverse-compatible and continues to be an issue because, games TODAY still release using the API of DX9. I guess that is why they are force to rely on the WINE subproject DXVK driver, which.... look WINE is VERY, VERY good, but you are not replacing an Windows install with it, same goes for DXVK and DX9, Windows or not.
A reasonable point that this video is making that is that we, the consumer, have to gauge our expectations.... even if it's a Multinational, Billion Dollar Corporation the size of Intel.
The way that I have been looking at Intel jump into the discrete GPU's, they get two generations tops... to figure things out, after that. Either they hit the ground running or this is dud. Right now, I they they are doing ok.
@@DuvJones You are incorrect. DG1 discrete GPUs absolutely were sold to the public through OEM PC manufacturers. It was widely publicized at the time so you'd have to have been trying really hard to have missed that news. You can buy them today on eBay for about a hundred bucks.
@@drewdane40
No, it wasn't. The DG1 never made production, for reasons know only to Intel. There was no way that Intel made them available to the public, the fact that you find these samples to buy on eBay is really irreverent.
@@DuvJones Intel's own Ark page lists DG1 as "launched," not cancelled. If you Google "DG1 cancelled" the top result is an article titled "Intel officially confirms DG1 GPU is now shipping to OEMs, and DG2 has taped out." It appears you're getting your news from an alternate timeline (or orifice.)
This is a complete aside to the subject matter, but, having two presenters sat chatting is a refreshing change to some of the norms of; single talking head, lead and support, lead and many many supports (oh hey engineering project vids didn't see you there) etc. Brings more of the WAN feel (obviously) while keeping the tight production of a regular LTT video. More of this please as and when the opportunity arises
yep
It would be interesting to see another challenge in six to eight months to see how things have progressed. Not necessarily Linus and Luke this time, maybe someone else can suffer 🤣.
they should definitely do an update when next gen Arc comes too
I'm glad Intel is listening and making constant improvements. I'm long-time Nvidia user but I really want Arc to succeed.
I truly think that having Intel has a new entry in the GPU market can only be a nice thing. I don't care if their first GPU is not perfect. If it allows avg users to get more well priced GPUs in the near future, it already is a win for me.
Not for long if everyone keeps buying amd or nvidia regardless. Intel won't pump out arc just to help you get better prices (which won't happen anyway as nvidia and amd don't and won't take Intel seriously since they know most people won't get Intel) if no one buys their GPU. If arc didn't have the DX issues it would have been worth it somewhat
As long as intel keeps making these great improvements, I can see myself switching from team green to team blue in the future when I am ready to upgrade. Seriously rooting for intel.
Really seems like they’re already killing the competition at price/performance, at least in games where there aren’t major issues. If they can work out their remaining driver showstoppers, they could be the go-to for budget builds.
@@dschwartz783 They're losing money in these cards, plus saying "it works fine in the games that work" says a lot about the quality of these cards, lol.
@@danieloberhofer9035 well, at the very least it’ll get people to consider them, a very new player to the market. Even if the prices go up a bit, I suspect that once they wack enough driver bottlenecks, it’ll still be the best budget option. I hope they manage to make this a success. The market can definitely use a third player, as it seems AMD has left the budget market.
Sources told ComputerBase another big driver update is on the way in February that will further improve performance substantially. Hope you guys keep testing Arc 👍
Link?
@Steve Sherman The hopium train is strong around these parts. "Buy our GPUs, guys! It doesnt work in most competitive games, it also is not capable of rendering reflections without glitching everything, but i swear in (n+2) weeks it will be close to a 3060ti!"
Damn, I thought they forgot about this series.
But they did forget about the $1 computer 😂
@@aayaan1935 they did not, it has been sold
I was thinking that too.. it sounded like they swapped to other GPUs like a month or two ago… and I thought this thing just started lol.
As problematic as the Arc GPUs have been I am glad to see progress. I am rooting for them to do well. The GPU market is in definite need of shakeup.
It amazes and surprises me sometimes, how I prepared repair procedures in advance.
I dropped my e-cig (again) and it broke the glass tank.
I smoked 2 real cigarettes, that I saved for that contingency.
I was preparing to go on a mission to buy a new part.
After taking a nap, I remembered, I had put a spare tank in my parts box, exactly for this contingency.
Amazes me, that I had planned for exactly this.
My e-cig works now.
It would be interesting to dig through old web archives / forum posts about early AMD and nVidia Cards and compare wether they had similar bugs and lacking support for older games etc. It's great to see Intel improving the drivers for their cards regularly and that they are listening to their customers for feedback.
Can't remember the specific card now, but years ago I had an ATi radeon card and the drivers were pretty janky. Inexplicably poor performance in certain games, even instability, then good performance elsewhere. The drivers did slowly improve the card over the two or three years I had it, but the experience was enough to ensure that my next card was an nvidia. I like the commitment we're seeing from Intel on these drivers though and I feel we desperately need a new runner in the GPU race, so I've taken the plunge and ordered an A770.
As someone that was in the game when Nvidia and AMD (ATI at the time) entered the arena, I can say that they definitely had bugs. But they weren't really that similar to what Intel has now. It was a long time ago and things have changed so much that it's really not even a fair comparison. The biggest bugs were mostly due to the growing pains of 3d rendering. But I will say this. Nvidia had a few issues with their drivers on very early cards. But they got the driver's sorted out fairly quickly. Nvidia was (and still is) good at drivers. AMD/ATI drivers have been pretty hit or miss for me. Some generations they were fine. Others they were horrible. And that horribleness lasted for years and years sometimes. AMD has come a long way since then though. As far as older games not working? That was fairly common. But that was mostly due to games being specifically designed for use on a specific card and architecture. So you had to find workarounds or settle for lackluster ports.
I do remember those days mostly fondly. Being there at the birth of the modern PC gaming community and then watching what it has become is something I'm glad I experienced. Having said that, I wouldn't want to relive it from a user/gamer perspective. It was a major pain in the ass most of the time.
Probably not software API support problems (since there simply weren't as many back then), but issues with basic things not always working, absolutely.
every GPU brand back then had different support and even API for a number of games
3:54
Linus also reading the prompter is hilarious lol
Did you pick up on it or watch them call it out on the WAN show?
I’d never have known if it wasn’t for the WAN show
You guys should do this as like a yearly thing every January switch to arc and see how its improved.
I think 3-6 months would be much better lol
@@josephoverstreet5584 Thats far to much torture.
@@memberberries7669 Lmao let’s settle at 6 months July this year and start the year rotation next year In January
My A770 works great. First mistake was putting it in my i7 5960k. The performance was very poor due to no rebar. Once I put it into my 10600k the performance almost tripled. Now everything works great at 1440p. Very happy with it.
Fyi, you can now add ReBarUEFI to your BIOS on X99 and have ReBar work just fine with Intel GPUs.
@@b0ne91 that's interesting, I just looked it up and never knew a hack was out to do this on X99. Thanks.
This has been a great adventure, I so want to Intel to become the third alternative on my next purchase decision.
I hope you Linus and Luke do these kind of videos somewhat regularly after new releases, where you try the cards you review for a month or so; Even the other brands and with updated first hand overall user experience.
Heads up: Arc Challenge pt 2 and 4 are in the GPU playlist but 1 and 3 don't appear to be
Intel removed them
@@mycelia_ow I just watched the first
For Arc, all I would really care about is video encoding performance on Linux. I suspect that it is a solid performer at this.
@@leeroyjenkins0 yeah, new hardware can be difficult. I'd be curious if it's up and running on pop yet. I would imagine it would be fine in Arch by now.
I went from a GTX 1650 and switched to Arc A770 and I'm proud of it damn it
I'm proud of you as well, i hope you do a couple hundred bug reports so in 4 years i too can buy an Intel GPU... Except then one that's actually good and not a broken underperforming mess.
@@MrTurbo_ I have used one since launch with only one issue. It's easy to look in from outside and think it's a super buggy experience. But for many users, it's been awesome including myself. They've come a long way in just the 3-4 months since release. Depends on your games as well.
@@MrTurbo_ Your expectations are way off. I have the a770 and it's a lot better than you seem to think it is
@@chronometer9931 well, it's not fast, that's for sure, clearly it still got a ton of issues as you can see in the video, and the fact i mainly play indie games and a lot of older games but at really high resolution and framerate i can't imagine helps much for making the expirience any better
One thing I genuinely don't really understand (and maybe Arc buyers can explain it to me), is why you felt comfortable spending a decent amount of money on an Intel Arc A770, when an AMD RX 6650 XT is actually cheaper and better in every possible way (and, before you say it, no, "faster ray tracing" is not a valid reason for this performance tier). I mean, I would totally give Intel my money if they made a good product (we NEED the competition), but this first generation... is really not it. Spending my hard earned money on something that "mostly works... for newer games" is not something that I would ever consider.
Thank you Linus. Been following u guys since 200k subs. Since then I passed high school...got a degree in electrical engg...now halfway through my pg in vlsi design and still watching u guys make tech content. Keep em coming.
I built T1 to be a resilient monster. It effectively has 12 hard drives.
Today, the audio amplifier that does the speakers, repeatedly said it was overheating.
I confirmed, it was hot.
It happened to be physically next to T1, I tried and couldn't find another fan.
T1 donated a case fan (120mm)
It works now.
Even after a decade of content on this channel, I still enjoy these two on camera together.
Give it 6 months and run them through some tests again, it will be interesting to see how much things have improved
I went from a gtx 1060 6gb to the Arc A770. I'd say from my experience it's a card for people who have had a pc for a while and know how to get around bugs and problems not for someone who is starting out building a pc. But performance wise I couldn't be happier. Intel has been dishing out amazing fps updates and they are not leaving it to rot like game devs do with some games now days. Something else I run the Arc A770 in a 9th gen cpu a intel I9 9900k. so even if you don't have a 10th gen cpu your good to go.
I really hope Intel doesn't give up on ARC even if some financial indicators are rather bad at the moment. Keep it up!
i suspect by the time they get to their second or third gen arc, its going to be a pretty good option.
I am really happy to see Intel is still interested in the project, it’s basically the only bright spot in the gpu market right now.
lol other than crypto crashing? Or wait the dumpster fires of AMD and Nvidia must be pretty bright. Being big dumpsters. :P
Been using A770 for a month. I have had issues, but I went in knowing all these issues. But then again, I don't really game that much so not really affected in Lightroom and Photoshop use.
That limited displayport functionality notification is not caused by Arc. I get it on my 4090 as well with an ASUS PG42UQ.
Yep I get it with a 3080 Ti and Dell U2720Q too
That's why I went from my 3070ti to a 3060ti I got in 2020 is this displayport madness. I have two HDMI cables to my monitors and no notifications!
You guys should run the Arc performance benchmarks again so you can quantify the gains Intel has achieved. Would be really interesting so keep following the improvements of Alchemist until the next gen arrive.
Lots of respect to Intel for rushing thier butts off to push out a GPU as fast as they did to try to ease the shortage. I just hope they can continue to improve and compete with AMD and NVIDIA
I jumped on the Arc train (keeping a 30 series on the side) and honestly its been pretty solid. The only time I had a problem was no driver update notifications and the new dead space remake gets like 5fps. Other than that i'm pretty happy with it.
do you play forza horizon? if so, how's the game performance on latest arc drivers?
@@eric-. Not really sorry, mostly games like Destiny 2, City Skylines, Supreme commander, Warzone, Modded Skyrim, Crysis remastered, Hyperlight drifter, Mass effect series, Cyberpunk, etc.
@@eric-. Searching Forza Horizon A770 on UA-cam, there seems to be a few videos showing how the game runs and it seems to be fine. Focus on most recent videos first if possible, the drivers have improved a lot (and still need to be improved a lot...).
I committed to an Arc A770 and I was surprised how it does blender, besides a couple problem games that wasnt too hard to fix like not running BFV in DX12 (runs fine in DX11), remembering to launch BeamNG in vulkan mode (no longer needed, runs fine in DX11 mode) its been quite solid.
Yep, I know most people here are probably focused on the gaming aspect of GPUs, but the productivity suite abilities of these 1st Gen ARC cards is seriously overlooked. In fact, I'd go so far as to say that I'd call it a great card for productivity-focused buyers given its price point!
@@UnderageBeerHere Well it ain't a surprise since they do review for gamers in mind first.
but my 3080 doesn't run BFV in DX12 properly either.
@@UnderageBeerHere yeah it's absurdly good for video encoding speed, I've never seen h265 encode at 800fps for a 1080p60 video
@@kwizzeh I'm not saying it's a bad thing, I know it's not LTT's focus, I'm just saying the ARC productivity performance has been flying under the radar.
Check out the products featured in our video at the links below!
Buy an Intel Arc A770: geni.us/9IhkN5b
Purchases made through some store links may provide some compensation to Linus Media Group.
:)
No pinned comment?
Haha, forgot to pin this
Bold move to post an affiliate link for an Intel Arc GPU after ripping into it for 15 minutes lmao
I can't mention enough how happy I am that Intel is using DXVK in this way, I hope this means more funds and work for that project :)
Out of interest, what issues did you guys have in Minecraft Dungeons? I play it quite a bit with my partner and I've found it alright on the A770! Frame pacing is definitely not as smooth though. That said I did notice FPS drops in the menus at one point but I moved over DXVK files to the game directory and it seemed to get rid of that! It may have just been a driver update though! Hope you do a 6 month or 1 year review down the line!
I’ve been putting out intel arc a770 test videos on my channel, showing game play and various settings. Objective game testing, real footage, no opinions.
I have been PLEASANTLY surprised by both its performance and phenomenal driver updates.
problem is, I make a video and they put out an update the next day. I mean, it’s a good problem to have 😂😂
I'm not in the gpu market right now, but I wanna thank Luke and Linus for the light they shed and any early adopters for going through these pains, so that when I get back into the market I'll have some great options available thanks to the competition heating up!
I like this two head, playing off eachother format, and giving a really measured, down to earth perspective.
Great stuff.
This feels like the same issues people seemed to have while trying to tweak settings for GuildWars2. Like turning off reflections because, for some reason, there's some reflective texture way below where you are (water under everything). Or users using DXVK to get stabilization, not necessarily for raw performance.
I am excited for the next generation of cards. Not just for the competition, but for trying something new.
Exactly, I think I will build a new computer just to experience something where real changes are being made vs just taking tons of my money for what (Nvidia's pricing and even amd's)
I really hope they''ll release next gen and contiiinue because the driver development is just amazing
Fr, all they need to do is pay attention to their own mistakes and keep making refinements, and in a couple years they'll have a solid competitive product line on their hands. The hardware itself seems to be good, it's just the software side holding Arc back.
My guesses are they will drop making gpus entirely.
Intel is also working on new Linux drivers for the Xe series iGPUs and up. 2023 might be really interesting for Arc I hope.
The fact they're doing this at all shows how crazy Linus sees the pricing of flagship gpus at the moment. Kudos
Yeah, I think it's very good that influential voices in gaming are constantly expressing distaste for Nvidia's pricing. If they bring the prices a little closer to earth, they'll get a lot of money from me when I build my 4090 behemoth.
It's been a year, should do an ARC Challenge 2
I swapped my 3060 12GB for an Arc A770 for my recording and editing rig, and from a productivity standpoint it isn't too bad when paired with a 13th gen i7. I will say however that I've run into more than my fair share of annoyances, most of which seem to be software based rather than hardware. I'm willing to give them a chance in the long run but they really need to ramp up the driver team to get some of those basic fixes out...like the UAC prompt for Arc Control for one.
Isn't 3060 about the same performance card? Why did you change?
@@jothain for AV1 support probably..
@@jothain av1, h265 10 bit 4:2:2, 16gb of memory, intel hyper encode, actually more powerful gpu - there are some reasons. the a770 isnt exactly a 3060 in blue - its actually quite a step up in die size and transistor count, also in raw teraflop. the alchemist cards probably have a lot of untapped power sitting in them, its just not clear how much of these resources can actually be accessed by removing software bottlenecks. considering this is first gen for intel, theres probably alot of room for improvements.
@@houssamalucad753 Oh yeah. Just remembered that Intel has been quite good with some hardware video stuff even in some of their integrated graphics before. That would make sense in editing then.
@@jothain I ran a 3dmark benchmark and it already close to RTX 3070. 3060, even 3060 TI is nothing compared to A770
I have 2 A750 Intels. One on my security NVR BlueIrirs. It is a beast recording and transcoding HD camera feeds of 8+cameras. It is a beast and barely gets above 5% on GPU resources. With the previous machine which had a intel 6th gen the GPU aspect was capping high 90%. The second one is for my plex server in a docker container on OMV. Runs great on linux side.
All the articles I have read, Intel cards are great for production transcoding video editing and obs. Gaming I have not tried it. I only have AMD Cards and Intel GPU cards on my home. Removed Nvidia from my home. I took a chance with Intel and Nvidia better gets their stuff together.
I hope that there is an update to this considering that Intel just announced a bottle neck fix the other day. Not sure how impactful, if any, this will be but still worth mentioning.
Sounds like something they'd be likely to cover in a TechQuickie
That was the best ending, "The future where I tell you about our sponsor." LOL
I am glad Intel is listening and making changes to their ARC GPU. I know it's about time the GPU market got some serious competition.
Does anyone know if the ARC GPU drivers are working in OBS fully?
The good thing about videos like this is that intel will get all the details from an end user point of view and HOPEFULLY be improved in the next iteration
gives me hope for a second generaton arc. could be my next upgrade if the price is right
Really hope they keep improving, more options are always appreciated
That glare bug happened a lot with the the HD 615 for the Surface Go. I could play MHR (with small tweaks and eons of time to allow for shaders to compile), but would get a particular ray of sunshine that would blow out the screen.
Arc survival guide:
1) Delete Arc control for stability (untill that gets reworked, so no OCs for you unless you are a power user and know a bit about computers)
2) Expect perfromance of 3070 levels (its effectively there but just with 16GB VRAM)
3) Dont be MLG, Just play your games for fun.
4) You will be fine if your setup is simple
Arc is $400 and its fine, Its absolutely fine and it will get better, probably one of the best 1440p cards too tbh cos that extra vram really helps on modern titles, even if you dont use it fully it has more breathing room.
I love the concept behind the scripting in the video, I'm excited to see the improvements you guys can make to the style moving forward
i love that both luke and linus wrote their script....you can tell the difference in other videos when someone else writes the script. they feel so natural here.
pretty sure linus said he wrote luke's part on the last WAN show
Really? Feels really forced and cheesy. Like when luke goes to swear but linus cuts him off. Barf lol
@@tbunreall Right? jfc barf defines it well.
Man, the energy of this episode is amazing. It was clear they were reading off a script, as usual, and there's nothing wrong with that, but Luke and Linus were able to vibe off each other and kind of just hang out a la WAN Show, it seemed, and it was so fun to watch.
i've got intel arc and it hasn't been too bad; i just use intel driver updater
The limited display port message probs refers to the card having display port 2.0 but monitor using 1.4
LOL! Rewatched it now after the WAN-show 😅 3:53 for Linus mouthing Kyle 😂🤣😂
Thank you for sharing. My pal and I both bought ARC A770s. The games and the programs we are running work great. Hopefully Intel will clear up the issues you identified.
Hello, I currently see reflection glitches on my NVidia GPU (an GTX 1650) on a lot of games, so I quickly made a demo with Vulkan in C and saw that the problem is REALLY weird, sometimes it works really well, sometimes the glitch happens, sometimes it fixes itself after a short or long waiting time, but the conclusion I got was that there was some features of Vulkan having troubles with memory addresses and memory overflow, I think that the glitch actually happens constantly, but that the end user see it occasionally so maybe the GPU tries a background optimization which results in this bug and that the Intel GPUs does this optimization more to compensate for the lack of performance on slow games, so I am not entirely sure that the problem is caused essentially by Intel, maybe another API in the process may cause this issue.
I want the Arc + Linux Challenge (mainly bc this is what I want to run and I'm finding shockingly little commentary about it)
That "DisplayPort connection might be limited" message I get all the time with my RTX 3090 FE. In my experience, it happens when you connect USB-C to a monitor that allows DP over USB-C functionality, but you're only using the USB-C connection for data (aka the connection is hooked up to a USB host bus and not a GPU).
Despite I'm no intel fan on any front. I found amazing the pace that Intel is putting on the arc, and now with battemage closing by, really shows how much attention they are putting to the outside words, and also how they tackle issues so fast. Nothing like this has been seen in either green or red team. I really hope this works for Intel, we need serious competition on the gpu front.
I recently switched from an RX 5700XT to the A770 LE and it's a world of difference. I went from near constant crashes and an almost unusable GPU to something that actually runs smoothly (for the most part).
I actually couldn't be happier with the new hardware.
Thanks for some great episodes on Intet Arc. I hope you will follow up in 6 months - 1 year, to see how better things are at that time. I really want Intel Arc to be a success, but because my gaming station also are my work station, I won't buy a GPU like Arc at the moment, because I has to be sure I don't run in to any issues. I do hope they will get better, and they can challenge Nvidia and AMD in a few years, because the GPU marked really need a competitor to Nvidia and AMD.
Running Intel A770 for some time now. Very happy with the performance for the price.
It's really interesting to see their drivers improve over time. It's a bit like watching the Switch emulators work out all the weird graphics bugs. Turns out there's a lot of crazy shit under the hood we never really think about, even as game developers.
I'm hoping things continue to improve. I'm fine with "okay" performance, I more just want things to work smoothly. One thing that I miss from the Nvidia GPU I had in my previous laptop was the control panel. It was simple and let me choose what opened with what without having to use the Windows settings app. When the GPU works as intended, it works just fine for me. It's just some road bumps getting it up and going.
One specific problem I've had is with BTD6 not using either GPU and Windows settings doesn't help fix it. Neither GPU in task manager will have any utilization after opening it, it all seems to happen on the CPU instead, and I don't have any way to troubleshoot it.
this is a certified intel arc moment
I Am Such A Confused Person!
@@commiequiz …
@@kevin65731 he is confused it seems, maybe we need to keep the sharp objects from him so he doesn't hurt himself
@@DodoTA727 no we give him all the sharp objects hehe
@@kevin65731 Heh
It may be a while before I put one of these in a setup, but it is really exciting to see a new gpu contender begin to rise.
Watching linus while luke is talking is quite funny 😂😂
Recipe for lasagna:
Ingredients
2 olive oil, plus a little for the dish
750g lean beef mince
90g pack prosciutto
tomato sauce
200ml hot beef stock
a little grated nutmeg
300g pack fresh lasagne sheets
white sauce
125g ball mozzarella, torn into thin strips
Method
STEP 1
To make the meat sauce, heat 2 tbsp olive oil in a frying pan and cook 750g lean beef mince in two batches for about 10 mins until browned all over.
STEP 2
Finely chop 4 slices of prosciutto from a 90g pack, then stir through the meat mixture.
STEP 3
Pour over 800g passata or half our basic tomato sauce recipe and 200ml hot beef stock. Add a little grated nutmeg, then season.
STEP 4
Bring up to the boil, then simmer for 30 mins until the sauce looks rich.
STEP 5
Heat oven to 180C/fan/160C/gas 4 and lightly oil an ovenproof dish (about 30 x 20cm).
STEP 6
Spoon one third of the meat sauce into the dish, then cover with some fresh lasagne sheets from a 300g pack. Drizzle over roughly 130g ready-made or homemade white sauce.
STEP 7
Repeat until you have 3 layers of pasta. Cover with the remaining 390g white sauce, making sure you can’t see any pasta poking through.
STEP 8
Scatter 125g torn mozzarella over the top.
STEP 9
Arrange the rest of the prosciutto on top. Bake for 45 mins until the top is bubbling and lightly browned.
Linus, the "Display connection might be limited" message might actually be related to your Asus PG42UQ.
As I also experienced that when I was testing with a PG42UQ that I decided to return.
I didn't get these messages on my two Acer monitors that I'm daily driving.
And just wait until you actually try using HDR on that Asus monitor, which has washed out colors on firmware version V32 and V33 due to some tone mapper problems.
I really hope Intel continues to improve on arc and they go for the same price to performance they did with this generation. When it works the price to performance of the A770 looks damn competitive.
I’m surprise that VR wasn’t a consideration for until from the start when making their GPUs. Granted, the close alpha does begin to address it. But it’s one of those things I wish was a consideration for the launch.
The VR userbase is still incredibly small, and I don't think its growing all that fast at the moment.
I agree. Especially cuz I think people who do buy the intel arc GPUs are gonna be those who are at the very least somewhat enthusiastic about tech, so there's a higher chance than in other cases that they would own a vr device
@@Funky_Brother Well no, because no one in their right mind would spend as much for their gpu as they would for a gaming accessory that can be used in 3 decent games
@@AnEagle you realise that things like the valve index aren't the only VR hesdsets available? There's stuff like the quest series that are much more affordable?
Honestly vr performance is pretty much the only consideration I personally have now, a mid range card runs anything 2d fine, and if you use vorpx you can run most games in vr, I just find it so much more immersive
The black values as gray may have been a quantization setting, set it to full.
They know it's that but there's no option to switch that!
I currently run an Arc a750 and apart from annoying UAC messages from Arc Control and a weird rendering bug in Minecraft which was fixed by resetting some settings, my experience has been flawless with Arc. Plus I am loving the performance improvements I keep getting from Intel's driver updates!
Thank you both for doing this challenge I know you both suffered a lot for us but could you in the future come back to this again when intel improves the drivers more ? Thanks again.. Awesome stuff !
full vs. limited color range (grey blacks) - I had the exact same issue on an Intel HD620 on an LG C9... it seems to be due to the driver recognizing the TV as a "TV" and forcing HDTV standards (limiting it to 8 bit YCbCr) instead of treating it as a PC monitor - the moment you switch to HDR, it switches to 10bit/RGB
I had the same issue with my very old Intel HD4000 once I connected my laptop to my TV. Upon searching online, I noticed that this bug has been there for so many Intel graphic cards and still hasn’t been properly resolved until now. I willing to bet my money that they still have the same issues on their ARC GPUs