With DaVinci Resolve... I recently bought an A750, but for timeline playback (and the Blackmagic RAW Speed Test) my A750 is performing extremely poorly. In the Blackmagic RAW Speed Test it's actually performing worse than the CPU, and I cannot play back a 1080p 60FPS video smoothly in the timeline. It may be a separate hardware issue, but my prior RX580 did have smooth playback even with the "Show Every Frame" setting enabled. I believe that my motherboard is limited to PCIe 3 x16, and my CPU is only a Ryzen 3 3200G; both of these may be limiting factors. 32GB RAM, NVMe SSD. Seeing videos such as these that compliment its editing performance baffles me, because it means that I must be doing something wrong. Do you have any ideas that I should check on?
Sorry to hear you're not having the greatest time with the A750 in video editing! I think that the limiting factor may end up being your CPU/motherboard because of the PCIe link (also pretty certain that CPU doesn't have ReBAR support, but I could be mistaken). Definitely check what kind of codec your footage is in and the bitrate of it, but I hope you'll be able to find an answer to this issue soon! (hopefully also one that doesn't cost any money)
@@CHWTT Hello, and thank you for the reply. You were correct, as it was an issue with my motherboard/CPU. I purchased an Intel 12100F and a cheap MSI motherboard and I am receiving expected performance now (12x the BRAW decoding performance). The previous setup did have Resizable BAR support, but I am glad to have found a solution, even if it was one that cost money. Thank you again for the response.
I don't at the moment, but you've just given me the idea to make a video on the way that I have all that stuff set up. Stay tuned, may have a vid out on that soon!
I just am not sure. I can get an A750 at the moment for 230 euro, but an RX 6650 XT will cost me 250, which is a small difference in price, and I'm pretty sure the 6650 XT is in general faster and will have far fewer driver issues.
I'd also consider an RTX 4060. Right now it costs pretty much the same as a 3060 but has way lower power consumption, it is very quiet in daily work, and it's better optimized, so in many games it shows even higher FPS than the 3060 despite its narrow bus. Personally I don't work in any 3D-modelling programs, but in benchmarks like Blender it shows way higher results than the 3060. In video editing they are close; maybe the 12GB of the 3060 is a little more beneficial in DaVinci. Also, hardware support for Quick Sync and the AV1 codec in the A750 gives some good expectations for its productivity in video editing (at least in the future).
If " a750 and rtx 3050 is the same prize..which one is good choice considering cuda cores, driver optimization,future supports etc. I will do graphics disign using photoshop,illustrator,maya,blender,3ds max... And gunnir or intel which one will better for arc a750? Thanks.
I haven't tested the Arc in any of those programs as I don't use them personally, so I can't really speak on whether or not it'll play nicely, although I'd assume it would be fine. The 3050 will perform worse than the A750 in several things, but depending on how those programs work it might play nicer than the Arc. I just can't be certain; I bet there's someone else who's tested it in those programs that can give a better answer! As for Gunnir vs. Intel, this is what I said to another commenter who asked that same question: I'm not really sure as I haven't ever used a Gunnir card, but I can tell you that the Intel LE one that I used here worked very well. Feels well built, temps were fine, was pretty quiet, and I think it looks quite cool.
I'm not entirely certain, but I built a system with a 12900k and a Deepcool AK620 lately. The AK620 seems like its almost enough to keep the 12900k cool at stock, but not quite. I've undervolted mine a fair bit and it still likes to hang out around the mid 90's in Cinebench R23. So something slightly better than the AK620 would probably be best, not entirely sure what that would look like though. The BeQuiet! Dark Rock Pro 4 is rated at a 250W TDP so I think it should tame the 12900k pretty well, but I haven't tested one personally and they can be expensive if a sale doesn't happen to crop up when you're looking to buy one.
I have an Intel Core i5-10400 and an ASUS Prime H510M-E motherboard, with 16GB of RAM and a 650 watt PSU. Will the Intel A750 perform optimally with my PC?
I can't tell whether or not the H510M-E motherboard will support ReBAR. That CPU should be able to support ReBAR, but I'm not sure if your motherboard would have that setting available even with the latest BIOS. If you can find out whether or not it does support ReBAR, the A750 will perform quite well for you. Otherwise, it's not the best choice for your system as the lack of ReBAR won't allow it to really stretch its legs and, in some titles, it will perform pretty terribly.
B-roll wasn't a great representation of the actual experience. It was also pretty heavy footage to deal with (4K30, H.264, 130Mbps), and my CPU was the bottleneck on the timeline as I don't use proxies or anything like that.
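If anyone reading this is fighting similarly heavy footage, proxies are the usual fix for H.264 timelines. Here's a minimal sketch (assuming ffmpeg is installed and on your PATH; the folder names are just placeholders) that batch-converts clips into 720p DNxHR LB proxies, which Resolve and most editors can relink to:

```python
import subprocess
from pathlib import Path

SRC = Path("footage")   # placeholder: folder of heavy 4K H.264 clips
DST = Path("proxies")   # proxies land here with the same base names
DST.mkdir(exist_ok=True)

for clip in SRC.glob("*.mp4"):
    subprocess.run([
        "ffmpeg", "-i", str(clip),
        "-vf", "scale=-2:720",                       # downscale to 720p, keep aspect
        "-c:v", "dnxhd", "-profile:v", "dnxhr_lb",   # DNxHR LB: very cheap to decode
        "-c:a", "pcm_s16le",                         # uncompressed audio scrubs well
        str(DST / (clip.stem + ".mov")),
    ], check=True)
```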
The 12400F supports ReBAR and is a really good performance match for this card, so pairing that CPU with this card would certainly make sense. I'm not 100% certain if that motherboard supports ReBAR, but considering it's an Asus B660 board I would think it likely does. As long as the board you choose supports ReBAR, it should work well with this GPU!
Do you recommend that card as a long-term investment instead of an RX 6650 XT? I've been thinking about buying one, but I'm not sure if it's worth it because of the drivers. I only want to play at 1080p 75FPS in games like Battlefield and Halo.
I haven't tested it in Battlefield, but in Halo (especially with the drivers that came out very shortly after this video was released, which apparently fixed all of the issues in Halo: The Master Chief Collection), I think it would be a good option for you. I've heard from another commenter that it beats out the 6650 XT while costing less, so it might even be the better option of the two. Though, I can't promise anything in the Battlefield games as I've never run them on the card (I assume that it would do pretty well in them, though).
Can you scroll the timeline of Davinci smoothly? 1080p. Because I can scroll almost natively on Apple M2, so smooth and fast, no lags. Was wondering if this is an upgrade. I wanted a dedicated machine just for encoding.
Yeah, I actually just built an editing PC with this card in my latest video! The screen recording at the end of me using it in my new video wasn't the best as I'd just loaded the project and it wasn't fully up to speed, but once the project is loaded this card (when combined with a powerful CPU) is almost able to scroll through my timelines entirely smoothly. Though, my timelines are all 130Mbps 4K30FPS h.264 video so much heavier than 1080p. Pretty certain it could be an upgrade to an M2, but definitely do a bit of research on it!
Well, generally a lot of software just doesn't play nicely with AMD's hardware. The acceleration either doesn't work very efficiently, or it doesn't work at all. There are of course some outliers, and hardware acceleration support on AMD cards is getting better by the day, but it causes them to underperform in productivity software.

With the Intel Arc card I tested, and Nvidia cards, most software can effectively and efficiently use their hardware to accelerate their processes, unlike Radeon, which is why I say that Radeon sucks for productivity. Though, most games love Radeon and do incredibly well on it (especially when you factor in the price!). Hopefully that was at least a little help!

Oh, and just a little example here: I believe that I saw somewhere - can't remember where/when/if this is even accurate, but based on experience I believe it - that the RX 6950 XT (my gaming GPU) performs about like an RTX 3060 in DaVinci Resolve. Having the top-tier Radeon and a mid/low-end Nvidia card from the same generation perform equivalently in productivity definitely demonstrates how inefficient hardware acceleration on Radeon seems to be.
I'm not really sure as I haven't ever used a Gunnir card, but I can tell you that the Intel LE one that I used here worked very well. Feels well built, temps were fine, was pretty quiet, and I think it looks pretty cool.
Saying OpenGL is old because it was released in 1992 is like saying DirectX 12 is old because DirectX was released in 1995... That being said, I believe Minecraft does use a version from 2009. Doom Eternal / Doom 2016 also use OpenGL by default.
so AMD GPU is not recognized in Davinci Resolve? wow... this is the thing with being an AMD customer, there are so many small annoyances that happen for 10+ years and they never fixed it... using AMD is like you're stuck in the past, so much micromanaging and small annoyances for everything... their internal GPU screen footage encoder is pretty crappy too, you have to bump up bitrate to have "acceptable" quality (not even good video quality)
Since the pump block isn't the highest point in the loop, it's actually fine. If the pump was the highest point, the air would collect there and it would be an issue, but in this case it's actually fine. Thanks for your concern though!
@@antonstark9689 he is wrong; by having the lines high, the pump will pull in air as the water level drops due to molecules slowly evaporating through the hoses. The hoses should be at the bottom. In 2 yrs he will find out you were right.
Here not many retailers sell Intel Arc. For the price of an A750 you can get a 6700 10GB, the 8GB version of the A770 costs the same as the A750, and the 16GB version of the A770 costs the same as a 6700 XT. Currently I wouldn't buy any of them, as games are getting more demanding and even 12GB on AMD might not be enough.
8 gigs is enough for 1440p gaming, you don't need more than that. Heck, I was gaming at 1440p with an RX 580 4GB and a Ryzen 2400G, and I was hitting 80+ frames in R6 Siege and WoW retail at 1440p.
The Hitman graphics demo reminds me of Arkham City with ray tracing. I am cheering for Intel. I think Nvidia and even ATI are pretty arrogant, though Nvidia takes the prize for arrogance.
Agreed, I'm really hoping Intel can break into this market because I think they're doing some great work. I'm totally going to be checking out their Battlemage cards when they launch, and man am I excited to see those cards!
How in the world do you not have titles that run lower than DirectX 11? How is that possible, your collection cannot be THAT limited... AC1, Bioshock (1 and 2), Call of Juarez, Company of Heroes, Crysis, Far Cry 2 and 3, Max Payne, STALKER, just... none of that?
I've owned the A770-16 since late December 2022.
I find it to be extremely smooth and stable...not a single in-game crash or black screen on my end.
I also use mine in VR on an HP Reverb G2 for simracing.
Hey, what VR equipment and games do you play? How's the experience? Because from what I heard, Arc is terrible for VR games. Is that true?
Here's my A770 in VR in Assetto Corsa.... ua-cam.com/video/x2i7zOFdrSg/v-deo.htmlsi=fWlQPXuZKrIfiXRQ 7:51
ok but how many different programs have you tried to run on it, like if you for example play Counter Strike and only Counter Strike...well then you are not going to run into problems.
@@Decenium I run it in Assetto Corsa, Assetto Corsa Competizione, Automobilista 2, Live for Speed, RaceRoom Racing Experience and rFactor 2. All run extremely smooth... most are 2012/2013 racing sims.
AMS2 and ACC being the odd balls.
Thank God I watched GAMERS NEXUS to see the teardown of their Arc cards: a bunch of screws, ribbons, plastic brackets, the card glued together, just to change the thermal paste. I'll pass on this card.🤡
Bought an A750 a few months ago for $200 for my GF, and had a 6650XT to compare it to. Surprisingly the A750 outperforms it in most modern games while costing less, only a few titles had issues. XeSS is better than FSR 2, ray tracing performance is better, and the video encoding performance and quality was better too. I am eager for more Intel Arc products, the new IGP in Meteor Lake and Battlemage next year, as I was very impressed with Intel's offering.
Thanks for the insight! Glad your GF is enjoying the A750
This card is amazing
Just bought it to replace my A380
Wow what an upgrade
Especially paired with an i3 10100 for now
Moving it to a 12100 system soon
@TheDivisionAddict Price to performance, duh.
Thanks for this comment.
Got an A750 second hand for ₱7000 ($140). I play at 1080p high settings, no problems so far, smooth and stable above 60 fps, especially in Red Dead Redemption 2 and Genshin Impact😊
As a user of DaVinci Resolve as my main editing software, your review especially made me 100% (from 80%) convinced to get an Arc card for my system. The moment you said your timeline is a 4K timeline and doing great with Arc, I was done. I got this card. I only play two esports titles, CS2 and Dota 2. No problems. Thanks mate, appreciate your testing. Keep up the good work.
I hope the card is doing well for you!
hey, how is cs2 performance on ARC? i've seen some videos showing awful performance
just got this card for hardcore stuff dont tell dont ask wish me luck
love mine. the odd issue with drivers but runs butter smooth now. great value
Another great video! Well done! I like that you brought something different to the table with your views on the card. I am happy that Intel is entering the GPU space and shaking things up with its competitive pricing! More competition is always good for the consumer!
He never did test it on old titles, it fails.
if you do next gen games on low setting, this is it !
@@lucasrem yeah, I would assume that older games had drivers and optimizations tuned for ATI/Radeon, and Nvidia!
Definitely did NOT waste my time watching this, thank you for creating this!
I bought ASRock's A750 some time ago for AV1 work and did quite a bit of testing with it, and I do like to point out that there are a few other pluses and minuses to encoding AV1 with hardware, but for the time saved it truly is the future. I'd recommend everyone looking into AV1 read a bit into the benefits and disadvantages of hardware acceleration for video work/compression/efficiency. =)
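For anyone wondering what hardware AV1 encoding on Arc looks like in practice, here's a rough sketch using ffmpeg's Quick Sync (QSV) path. It assumes an ffmpeg build with QSV/libvpl support and an Arc GPU present; the file names and bitrate are placeholders, not recommendations:

```python
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",       # decode on the GPU too, when the input codec allows it
    "-i", "input.mp4",       # placeholder input file
    "-c:v", "av1_qsv",       # Arc's hardware AV1 encoder exposed through Quick Sync
    "-b:v", "8M",            # placeholder bitrate
    "-c:a", "copy",          # pass the audio through untouched
    "output.mkv",
], check=True)
```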
Hate boosting an older post, but honest question... How does it handle gaming and streaming/encode at the same time?
@@vtnwesley I don't currently stream when playing, other than Discord screenshare. Currently I'm using the A750 as my main GPU for everything and the experience has been good. All I can say is get one on discount/used, or wait for Intel's Battlemage series of GPUs.
Edit: If you can use hardware-accelerated AV1 encoding, it should be a good experience compared to other options. Still many variables to keep in mind.
@@BTA_KeepItFun Are you using ReBAR?
@@CuriousGamer022 Resizable BAR? Yes, it should be enabled👍
@@BTA_KeepItFun Ah OK, thanks for letting me know👌👍. I have Above 4G Decoding, which is the same apparently, so I should be good to go.
Great and thorough video. Intel have fixed the Halo Master Chief Collection as of driver 4953.
Awesome. At least on my internet feed, gaming pc has totally taken over
This is in-line with what I think. A PC that won’t have trouble with productivity demands, and also performs well for gaming
If I wanted to spend more money on a PC, of course it makes sense to get that juicy performance
But ultimately it’s been really confusing when the considerations towards productivity software is so shallowly mentioned and not touched on properly
This video alone gave me much better insights into it
Glad it was helpful for you!
The latest Arc drivers from 10/22 installed on my Arc A770 16GB have boosted my Halo MCC fps in Halo 3 to 120 fps with dips no lower than 72 fps, whereas the previous driver could only give me 80 unstable fps, dipping as low as 21 fps at times.
Glad to hear that MCC performance is getting better!
I've got an RTX 2060 Super and I get a stable 165 fps with no drops (Ryzen 3600) in Halo 3 at 1440p.
Yeah, great card for Horizon on lower settings on less frames ! Keep it cheap !
What a great review. Thank you!
Had an A770 16GB LE from Intel for a month.
Aside from War Thunder and some unique productivity apps, it's a perfectly good card in place of a 3060 Ti. My 3060 Ti died and the A770's increased memory improved my overall experience, minus the artefacts common in War Thunder.
I do have an RX 7900 XTX now but if I didn't have that, I'd definitely say get a 16GB A770 if you can. It's that memory that makes the GPU even better, even when you don't think it would compared to the 8GB A770.
As for what I had (and still have) from when I had that card a month ago, it's similar to the specs in the video:
i9-10900F (Factory Settings)
128GB DDR4 3000MHz - Kingston Fury
Resizable BAR Enabled
Holy crap, someone in the comments who is directly helpful to me lol. So I'm a big War Thunder player, is it capable of running War thunder, or is it one of those starfield type of situations where it's just not optimized for it yet? I'd go out on a limb and say War Thunder is my most demanding game to run, but I want to get something completely future proof, and i'm looking at either the a750 or a770.
@@trickster8764 More than capable if you have the 16GB model: Med-High at 4K, 200 FPS. Not capable enough otherwise, given how it required 10.8GB of VRAM to do that on my setup.
If you're looking for future proofing, no. This is not the best card for something like high-end gaming. For War Thunder it's pretty damn good and in many ways better than the 3060ti for that game (4060ti 16GB probably not.)
An A750 and A770 8GB would be an absolute bad investment though. The VRAM is the sole reason why for those two. The 16GB hits a sweet spot that NVIDIA won't hit cause they're greedy fucks. AMD won't hit because they won't go even lower on pricing for... fairly obvious reasons. Intel's A770 16GB is the only card I can think of that I traded in for at that price range that just has a sweet spot of performance and some future proofing with XeSS and the VRAM to perform more than well enough.
The issue I ran into while playing War Thunder, however, was that every few battles, to every other day (varied depending on driver AND game version updates), you'd have to change a single graphics setting because of heavy artifacting where the rendering engine of Dagor would literally just shit bricks and start bleeding a lot of black through the textures and into the models. It has happened once so far on the 7900 XTX.
I really want these GPUs to take the top spot and create some meaningful competition for Nvidia. I really want to try them, but I can't justify having a couple of games that jank out due to graphical glitching.
After playing GR Breakpoint, I realized that was the first game where graphics truly mattered to me and showed what good visuals can do for a game's environment. If you use the DirectX renderer and compare it to Vulkan, there's no comparison; Vulkan is probably great for some games, but it didn't do GR Breakpoint any justice at all.
If you only do next gen games, on lower setting, this is it.
Old titles, never do this please.
Is 16gb enough ? I wait for higher specs now, one gen ? gen 3 ?
When it comes to RT in FH5, the reason the performance difference is so negligible is because the RT implementation in that game is basically barely there, especially on the high setting that you used where it's exclusively used in the car showroom part of the game with no difference in actual gameplay. To properly test RT you need to go to the ultra or extreme setting which actually engages in the open world, but even then it's only used for car reflections I believe.
What he said! :)
A750 LE here; besides Starfield I haven't had any issues at all. Gives 3060 performance at $250, great bang for buck, not to mention it is frigging solid in terms of build quality and cooling.
Hi bro, what about power consumption ?
I've been using the A750 LE for 5 months in my main rig.
B350 + Ryzen 5600 + 32GB 3800C18
I have less issues than with the 2060S and the RX6600 and I'm not even using it for any production.
It's just stable and quiet, sure some drivers could have been better optimized, but Intel is working on it, which will make it age even better.
It's definitely a 1440p card if you are fine with low-medium settings, and while ray tracing works well on it, you still wouldn't want to turn it on; it's simply not worth it unless you're running at 4K and can still get over 60 FPS.
Latest driver 4887 was not WHQL certified at first, which could have been why it wasn't found.
I just got this same card for $170 new. Great card for that price. Runs better than my 2060 Super so far. Only issue I have so far is the same as you with the driver updates, so I just check the site often.
I bought my A770 about a year ago, at the start of 2023. At that time, in my opinion, it had just passed the point where it worked. Four months ago, when this video was made, it was a lot better, and now we see channels comparing it to the 4060 and above in some cases, and in productivity it's like $1000 cards would give the same. Yet I feel there is +20% still on the table. Not to mention Intel Application Optimization (APO), which is predicted to give a 10-50% jump with Battlemage/14th gen CPUs, but for this to become a thing we are most likely looking at 2025 and beyond with Celestial, though it should boost Alchemist also. I see the A750/A770 being in PCs 10 years from now, still giving enough performance.
I think it looks really good. I prefer a clean design like that.
thanks for the heads up will be considering one of these..
The drivers have got better and better. Should be interesting when they release their new gpu in 2023-2024
I absolutely love my Arc A750!
Switched to a new mic, sorry if there's any weirdness in the audio as I'm still perfecting my recording technique with it.
I don't have a problem with that, and also your videos are awesome! I'm here before this channel blows up!
Random question: am I right in thinking an Intel Arc also uses a lot of power?
It's a 225W TDP card, certainly not the most efficient but not the worst I've seen either. I definitely should have mentioned that in the video, though!
Sounds good to me, thanks for the vid
I'm mostly a gamer, but I found it funny enough that we both swapped between the same GPUs essentially: a reference 6950 XT and an LE A770. I'd like to build another computer around it and actually try my hand at video editing again. I'd used it for about a month last year and it had a lot more teething issues. Glad to see it's come such a long way though.
Oh that is quite funny that we both switched very similar GPUs, haha!
I'll probably be picking up a Battlemage GPU if they actually release them next year. I love the 6800XT I have now... on Linux. It's mostly fine 95-98% of the time on Windows, but some software that uses hardware acceleration causes random black screens that might last 15-30 seconds before the driver detects it has stalled out and restarts itself. This has happened to me almost exclusively with chromium based browsers (Edge/Chrome/Chromium/Brave) and Adobe Lightroom. I swapped back to Firefox full time, and I've had next to zero black screens outside when I do launch anything Chromium based that has hardware acceleration enabled in config or the occasional Lightroom spaz (very rare, less often with each revision).
I swapped back to Intel with the 12600KF to build a midrange PC for my main computer to focus more on upscaling and other modern tech for 2k gameplay rather than just throwing the highest end hardware like I have for years. I would have tried an A770 for this midrange experiment and using the 6800XT in my render machine, but Linux support for Arc cards, while improving at a massive rate, isn't quite there yet.
Ah yeah I can imagine that Arc doesn't play the greatest on Linux. Would be interesting to do tests on that, but sadly I'm not familiar enough with Linux yet to test it correctly and I'd probably get yelled at for something I'd miss by all the commenters haha.
I'm also having black screen problems and flickering on 6700XT. I also had many time outs a few weeks ago. But that seems to have gone away.
It is astonishing how terrible almost all browsers (chromium based) are when it comes to hardware acceleration on Linux.
It does keep Firefox relevant though and I have to admit FF latest has a lot of pretty slick features
7:00 I have seen other reviewers say the same thing about updating; I would hope that by now, 6 months on, they have corrected it with updates.
I found it takes Arc Control around a week before it sees new drivers as being available. I've just assumed it was Intel's way of getting extra testing in before everyone is notified.
I'm so glad you did Satisfactory, as that game is a large part of the reason I'm looking to upgrade. I hadn't heard anything about it not running in DirectX 12; that being said, I don't have a problem with DirectX 11 as long as it runs well, which it seems to.
Almost all performance problems in older games are solved by DXVK: it converts DX9, 10, and 11 to Vulkan and thus significantly improves performance. Try running Halo with DXVK, I'm sure the performance will be what it should be.
I'll have to try that at some point, thanks!
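For anyone wanting to try the DXVK suggestion above on a Windows install, deployment is just dropping the translation DLLs next to the game's executable (and removal is deleting them again). A minimal sketch, assuming you've extracted a DXVK release from its GitHub page; both paths are placeholders:

```python
import shutil
from pathlib import Path

DXVK_X64 = Path(r"C:\tools\dxvk\x64")       # placeholder: extracted DXVK release
GAME_DIR = Path(r"C:\Games\SomeDX11Game")   # placeholder: folder with the game's .exe

# With these DLLs beside the executable, the game loads DXVK's DX-to-Vulkan
# translation layer instead of the system Direct3D runtime.
for dll in ("dxgi.dll", "d3d11.dll", "d3d10core.dll", "d3d9.dll"):
    src = DXVK_X64 / dll
    if src.exists():
        shutil.copy2(src, GAME_DIR / dll)
        print(f"installed {dll}")
```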
@@CHWTT The just-released Arc drivers increased performance in Master Chief Collection by 750%, so now you don't even need DXVK, congrats to you.
You should try it, you are too lazy for that ?
If you run next gen on low settings, ARC is good, keep it cheap !
I really like the a750. My only complaint is the 8GB of vram. Card has been stable for me.
It's called buyer's remorse, since the A770 16GB is only 90 bucks more, you cheapskate :P
Got mine for $179 on Cyber Monday with a free Assassin's Creed game, and I'm really liking the game. I paired it with a budget Acer 1TB PCIe 3x4 NVMe SSD, a 2TB SSD, an R5 7600 with 16 gigs of budget DDR5 4800 on a $74 MSI A620 mATX motherboard, and an MSI curved 27" 1080p 240Hz monitor, and it turned out to be a fantastic budget gaming rig. Happy with it big time.
I daily drive an RX 6800 however I do have an Intel Arc A770 LE that I have tested with a little before and I plan to daily drive for a month to get my full thoughts on it. The only other GPU in the same performance bracket I had to test it against is the Immortal GTX 1080 Ti. In most newer titles, the Arc comes out on top. Which was pretty nice to see because the 1080 Ti is roughly on par with the 3060 in new titles generally. Really love how far Intel has come and when it comes time to upgrade my RX 6800, I might consider Intel's future generation of GPUs. On a side note, the Limited Edition variants of the Arc cards are stunning.
Yeah, being first generation products and all, these cards are promising! Also, I totally agree on the design of the LE cards, they're so professional and sleek looking, and I love that.
I got an A750 and I was most impressed with its video encoding capabilities, I can render videos at 4k without any issues.
I am also using an Arc A750 with Premiere Pro. When rendering, the load only goes to the CPU and iGPU; the Arc GPU sits at just 20% and isn't really being used. What's the problem? I don't know, please tell me.
Just bought one yesterday. So far so good. I would recommend a clean install of the driver, and a clean removal of Nvidia or AMD drivers if you had them before.
BUY THIS CARD!!! drivers updated finally!
It's really great for video editing on Premiere Pro as well.
I just snagged an A750 today because the PC I bought was in desperate need of an upgrade (when I bought it, it had a GTX 970 in it, and shortly after getting it the GPU started crashing after a few hours of use, which was less than ideal), and so far I am impressed. The price point definitely makes the card. I got mine for $210 after tax at Microcenter. My games so far have seen a noticeable uptick in FPS from where they were, which given the older card I expected. I definitely recommend this as an upgrade card if you aren't looking to break the bank or don't fall under the 2 categories that were mentioned here.
I also want to add, I haven’t had any issues running older emulators on the card yet. I’m definitely on the lookout for any hiccups but so far the older games on emulators have run smooth for me
Glad to hear about your good experience with it!
Throw back the carp you snagged and go pay $90 more for the A770 16gb
The Spider-Man game feels like a refresh of Prototype from back in the old days.
I did a build October last year with this card....a lot of people actually laughed when they heard I was using it because it has a horrible reputation on the surface, but after a year of updates this card rocks.
Agreed, this card is awesome!
Well Doc, I built my granddaughter a green DIY fishtank-case PC with a blue Sparkle A580 card in it. I built this PC to look good and do a little of everything. I really did not know what her little family needs these days, since she is a mom, a wife, and a career woman (a nurse) now. The card looked really good in that PC with the strange blueish-green color, and from my testing it did great. I have not had any complaints about the PC. I just wanted something new and modern, and the Arc cards are all that. Just a closing comment about the cards: one reason I think they do well is that the bus size on those cards keeps them from running out of memory. A 256-bit bus on an 8GB card vs. the 128-bit bus of the 8GB RX 6600, just saying. Even a bigger bus than the 192-bit bus of the 12GB RTX 3060.
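The bus-width point is easy to make concrete: effective memory bandwidth is roughly the bus width in bits divided by 8, multiplied by the memory speed in Gbps. A quick sketch using commonly quoted spec figures (treat the exact numbers as approximate):

```python
# bandwidth (GB/s) ≈ bus_width_bits / 8 * memory_speed_gbps
cards = {
    "Arc A750 (8GB)":  (256, 16.0),
    "Arc A770 (16GB)": (256, 17.5),
    "RX 6600 (8GB)":   (128, 14.0),
    "RTX 3060 (12GB)": (192, 15.0),
}
for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
# The Arc cards land in the 512-560 GB/s range vs 224 GB/s for the RX 6600,
# which is the gap the comment above is pointing at.
```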
really appreciate the video
Things are getting interesting with Apple M1/M2 (finally 8+ hours of battery life on a laptop, and impressive gaming performance for an ARM chip) and now Intel Arc GPUs... As an AMD customer (probably for the last time), I get so frustrated that I have to micromanage so much using the AMD Ryzen platform, and I hate that I have to use different programs for doing anything, when there could be just one official AMD tuning software... Here I have in my hands this amazing powerful laptop, but I have to mess with command prompt lines in order to "unlock" the CPU performance mode, and after I stop gaming I have to yet again type a command line to get my laptop back into "cooling" mode, which it doesn't do automatically... It's so much micromanaging and it gets frustrating... That's the AMD experience... Not to mention crappy battery life, also a problem on the Steam Deck, which uses AMD chips.
I think that’s more so because of your laptop bios rather than AMD.
I just ordered one for 180 on that Cyber Monday deal! For something that goes head to head with Nvidia and AMD GPUs that are 70 dollars more expensive, I'll give it a try. There's that Amazon 30-day free returns anyway.
I ran my A770 16GB on a 9th gen Intel i9-9900K because the MSI motherboard had a BIOS update that included ReBAR support. It worked fine.
I would have recommended the RTX 4060 cards, pimp the old CPU !
Not meant to run old titles !
I run starfield on it at 2k ultra, 27% cpu most of the time.. @@lucasrem
@@lucasrem 4060 cards at twice the price and only 8gb of ram, is not a good value. Nvidia gimped that ram as a marketing ploy to make you want to jump into an $800 4080 with 16gb vram
I am using an Arc A310 as my encoder GPU for videos and it works great😄
I bought it and I love it... I got my ASRock A750 for 180 USD!!!
10:57 Umm Actually, OpenGL 1.0 was released in 1992. OpenGL 4.4, the version that modern Minecraft uses, was released in 2013 which is AFTER DirectX 11 (released in 2009).
Lol that's fair enough, it was just meant to be a rough-ish demonstration of how OpenGL isn't the newest
For Minecraft you had VSync on without Optifine or Sodium. The fps stats are not very representative of the "normal" player experience. Fps counts you should expect for this GPU would be between 450-800fps with VSync off and performance optimisation mods, which everyone in the community uses.
Oh wow, I entirely didn't even catch that I accidentally left Vsync on, I could swear I took it off. Good catch there!
As for the performance mods, I considered testing the card with them, but instead opted for a straight vanilla test because even though most people do use performance mods, I still know many who don't, and I believed that the card should be tested in an unmodified scenario. Though, that test isn't good now that the accidental VSync has been pointed out, thanks!
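For anyone re-running a vanilla test like this, VSync can be toggled in the video settings or flipped directly in options.txt before benchmarking. A tiny sketch, assuming a default Windows install path:

```python
from pathlib import Path

# Vanilla Minecraft stores settings in options.txt; flipping enableVsync
# uncaps the frame rate for benchmarking. Path assumes a default install.
opts = Path.home() / "AppData/Roaming/.minecraft/options.txt"
opts.write_text(opts.read_text().replace("enableVsync:true", "enableVsync:false"))
```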
The new driver improves performance in halo by 750% according to them, I wonder how it performs with it now
I'll have to retest that at some point soon! Thanks for the info!
meant to run on DX 9, old Xbox crap;...
this card is for next gen titles on low settings !
Micro Center around me has one of these open box for $174; I'm considering going for it to replace the RX 480 in an older rig. Let's see what I decide at the end of your vid.
Hopefully, that older rig will support resizable bar. The performance hit without ReBar, based on several other YT creators seems to be about 25%. The price is awesome, so I hope it works for you.
If it is an older rig, make sure that it supports ReBAR! Otherwise the Arc's performance will be worse.
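If you're unsure whether an older rig actually has ReBAR active, Intel's Arc Control and tools like GPU-Z report it on Windows; on Linux you can grep lspci for it. A minimal sketch of the Linux-side check (assumes lspci is available and you have privileges to read the full capability list):

```python
import subprocess

out = subprocess.run(["lspci", "-vvv"], capture_output=True, text=True).stdout
for line in out.splitlines():
    # The GPU's entry lists a "Physical Resizable BAR" capability and the
    # current BAR size when ReBAR is active.
    if "Resizable BAR" in line or "current size" in line:
        print(line.strip())
```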
I am using an Intel Arc A750 with Premiere Pro 2023. When outputting/rendering video, it uses the CPU only, which goes to 100% while the GPU sits at 0%. What's the problem? It shows "Hardware Encoding not supported" even though the Arc graphics card is inside. I don't know why, please reply.
I was thinking about getting one of these to use in my 10-year old system, to replace a 2050. Request... Could you test the ARC with Minecraft using Distant Horizons and shaders on Fabric/Iris? I am curious to see how it handles the game in that scenario.
Very good video, I was curious about the card. Thank you. I wonder why video card manufacturers are still cheaping out on RAM? This started back in the video card shortage days, but I have no idea why it continues. All newer cards should have 16GB minimum IMHO.
*I wonder why video card manufacturers are still cheaping out on RAM*
It's easy: to squeeze more money out of consumers. For example, to get 16GB from Nvidia you have to pay for the RTX 4060 Ti, which also has a narrow memory bus and only a little more performance than the RTX 4060. So the most budget-oriented people will have to stick with the RTX 4060 8GB or go for the previous generation 3060, which has its own disadvantages.
Last time I saw you was in a video testing the Core i7 7700 CPU. That was probably the end of September.
Love seeing Satisfactory in performance testing, though it is pretty stuttery so it's not a great metric. I've seen the A750 as low as $180 and think it makes a great option for that price point.
Absolutely love Satisfactory but do agree on the stutteriness of it making especially the percent lows pretty bad metrics. Definitely a good card for the price!
Halo Reach was a problem for me 5 or more months back (and others had reported it as well at the time) on an A770. Sad to see it's still there. Halo 1, 2 and 3 did run fine for me though.
can u test this on rvc okada mr.. thank you
Hi there!
Halo Infinite does not currently support ray tracing in Campaign mode; it only offers it in Multiplayer, and Arc does support it. However, I think there's a mention in Intel's release notes of a texture bug and that they're working on fixing RT in Halo Infinite.
Also, yes I agree, the card is good for pretty much everyone now at its price point. The HUGE Sept/Oct DX11 driver improvements have put the final nail in that old mantra. These days, the drivers are quite polished and offer great performance almost across the board, with a few caveats. Also, if you don't have ReBar, honestly you're going to be on quite an old Intel/AMD platform at this point; installing a modern GPU will have diminished returns anyway as you're going to be quite memory and CPU bottlenecked.
Great vid!
@@raffiefoxmew3691 Ah that would explain why I can't find that option. Thanks!
I want one so bad
I have it and it is a great graphics card (the best for 170€).
Bought an A770 new. It now sits on a shelf, and until Intel accepts that Fallout 4, Skyrim, and now Starfield are absolutely essential CORE games for any PC gamer, it will stay there. The card is decent, if you avoid all Bethesda titles. There's no reason an old GTX 1060 6GB card should slaughter the A770 in every single Bethesda title at 1440p, including Starfield. These cards have potential, but sadly they aren't ready for the show yet.
Interesting; I hadn't realized it, but I didn't end up testing the card in any Bethesda titles, just because I don't really have any that would make good benchmarks. Maybe I'll have to revisit that and see if it still has those issues, thanks for sharing your experience!
They updated the drivers and Halo works great now, 750% performance increase.
DOSBox doesn't work with the Arc cards anymore in Windows 10, since about 3.5 months ago with driver 4676. I really like my Arc A750 a lot, but I have a lot of old games from GOG and Steam that depend on DOSBox working. From what I've read, Intel is dismissive of this problem and may never fix it, so I'll eventually have to switch back to my AMD card; I can't stay on the old Arc driver forever just to keep DOSBox working. If you don't need DOSBox, then the Arc is a great card.
I would have never known about that if you hadn't shared, I really appreciate your input!
@@CHWTT I thought I'd better warn people about this. Why Intel won't fix this is beyond me. I switched back to my AMD 6600XT card now. It's a bit slower than the Intel A750, but DOSBox works like it's supposed to with the AMD cards.
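For anyone else hitting this before a proper fix lands: I can't confirm it helps with this particular driver bug, but the usual first thing to try with DOSBox display problems is switching its renderer in dosbox.conf. A sketch, with option names from vanilla DOSBox 0.74 (double-check against your build):
```
# dosbox.conf - the [sdl] section picks the video output backend.
# If one backend breaks on a given driver, another sometimes still works.
[sdl]
output=surface
# other values to try: overlay, opengl, openglnb, ddraw (Windows only)
```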
With DaVinci Resolve... I recently bought an A750, but for timeline playback (and the Blackmagic RAW Speed Test) my A750 is performing extremely poorly. In the Blackmagic RAW Speed Test it's actually performing worse than the CPU, and I can not play a 1080p 60FPS video back smoothly in the timeline. It may be a separate hardware issue, but my prior RX 580 did have smooth playback even with the "Show Every Frame" setting enabled. I believe my motherboard is limited to PCIe 3 x16, and my CPU is only a Ryzen 3 3200G; both of these may be limiting factors. 32GB RAM, NVMe SSD. Seeing videos such as this one compliment its editing performance baffles me, because it means I must be doing something wrong. Do you have any ideas on what I should check?
Gaming performance top-tier at least.
Sorry to hear you're not having the greatest time with the A750 in video editing! I think the limiting factor may end up being your CPU/motherboard because of the PCIe link (I'm also pretty certain that CPU doesn't have ReBAR support, but I could be mistaken). Definitely check what kind of codec your footage is in and its bitrate, but I hope you'll be able to find an answer to this issue soon! (Hopefully also one that doesn't cost any money.)
@@CHWTT Hello, and thank you for the reply. You were correct, as it was an issue with my motherboard/CPU. I purchased an Intel 12100F and a cheap MSI motherboard, and I am receiving the expected performance now (12x the BRAW decoding performance). The previous setup did have Resizable BAR support, but I am glad to have found a solution, even if it's one that cost money. Thank you again for the response.
Disable Windows Update to avoid it updating software without notice.
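If the goal is specifically to stop Windows Update from swapping GPU drivers behind your back, there's a narrower switch than disabling updates wholesale. A sketch of the usual policy toggle (registry path quoted from memory, so verify it before applying; run from an elevated prompt):
```
:: Equivalent to gpedit's "Do not include drivers with Windows Updates"
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" ^
  /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 1 /f
```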
Seems to be on par with a GTX 1080, with RT...
Certainly a good performer, even in a lot of RT applications
Intel Arc cards are more versatile, not exclusively conceived for gaming. Not just AV1 encoding; H.264, the most compatible codec ever, also encodes great on them.
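As a concrete taste of that versatility: both encoders are exposed through Quick Sync, so a tool like ffmpeg can drive them directly. A minimal sketch, assuming an ffmpeg build compiled with QSV support (the bitrates are just placeholders):
```
# Hardware AV1 encode on Arc via Quick Sync
ffmpeg -i input.mp4 -c:v av1_qsv -b:v 6M -c:a copy out_av1.mp4

# Same idea with the widely compatible H.264 encoder
ffmpeg -i input.mp4 -c:v h264_qsv -b:v 8M -c:a copy out_h264.mp4
```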
Thanks! Hi bro, what about power consumption?
The card drew about 225W under a full load
FH5 RT is only present in the car showroom, not in Freeroam.
Do you have content on that Klipsch setup?
I don't at the moment, but you've just given me the idea to make a video on the way that I have all that stuff set up. Stay tuned, may have a vid out on that soon!
I'm just not sure. I can get an A750 atm for 230 euro, but an RX 6650 XT will cost me 250, which is a small difference in price, and I'm pretty sure the 6650 XT is generally faster and will have far fewer driver issues.
Not too many driver issues now that it's been out for so long. Intel has caught up.
RTX 3060 vs Arc A750, for gaming, editing, and 3D modelling/rendering: which one is better? Plz
I'd also consider an RTX 4060. Right now it costs pretty much the same as the 3060 but has way lower power consumption, is very quiet in daily work, and is way better optimized, so in many games it shows even higher FPS than the 3060 despite its narrow bus. Personally I don't work in any 3D modelling programs, but in benchmarks like Blender it shows way higher results than the 3060.
In video editing they are close. Maybe the 12GB of the 3060 is a little more beneficial in DaVinci. Also, hardware support for Quick Sync and the AV1 codec in the A750 gives some good expectations for its video editing productivity (at least in the future).
If " a750 and rtx 3050 is the same prize..which one is good choice considering cuda cores, driver optimization,future supports etc. I will do graphics disign using photoshop,illustrator,maya,blender,3ds max...
And gunnir or intel which one will better for arc a750?
Thanks.
I haven't tested the Arc in any of those programs as I don't use them personally, so I can't really speak on whether or not it'll play nicely although I'd assume it would be fine. Since I don't know how the Arc behaves in any of those programs, I can't make a super great recommendation. The 3050 will perform worse than the A750 in several things, but depending on how those programs work it might play nicer than the Arc. I just can't be certain, I bet there's someone else who's tested it in those programs that can maybe give a better answer!
As for Gunnir vs. Intel, this is what I said to another commenter who asked that same question:
I'm not really sure as I haven't ever used a Gunnir card, but I can tell you that the Intel LE one that I used here worked very well. Feels well built, temps were fine, was pretty quiet, and I think it looks quite cool.
Forza ray tracing is only in photo mode.
Off-topic question: what cooler works best for an i9 12900K? And yeah, I'm on a budget.
Generally, I just want the best price to performance.
I'm not entirely certain, but I built a system with a 12900K and a Deepcool AK620 lately. The AK620 seems like it's almost enough to keep the 12900K cool at stock, but not quite. I've undervolted mine a fair bit and it still likes to hang out around the mid-90s in Cinebench R23. So something slightly better than the AK620 would probably be best; I'm not entirely sure what that would look like, though. The be quiet! Dark Rock Pro 4 is rated at a 250W TDP, so I think it should tame the 12900K pretty well, but I haven't tested one personally and they can be expensive if a sale doesn't happen to crop up when you're looking to buy one.
alright man thanks
I have an Intel Core i5-10400 and an ASUS Prime H510M-E motherboard, with 16GB of RAM and a 650 watt PSU. Will the Intel A750 perform optimally with my PC?
What is the need? Replacing the old GTX 1080 Ti in it?
A 650W PSU is good for all GPU cards, just not the RTX 4080!
I can't tell whether or not the H510M-E motherboard will support ReBAR. That CPU should be able to support ReBAR, but I'm not sure if your motherboard would have that setting available even with the latest BIOS. If you can find out whether or not it does support ReBAR, the A750 will perform quite well for you. Otherwise, it's not the best choice for your system as the lack of ReBAR won't allow it to really stretch its legs and, in some titles, it will perform pretty terribly.
How is that a good GPU for DaVinci Resolve? It can't even handle the timeline; live video playback seems sooooo stuttery.
The B-roll wasn't a great representation of the actual experience. It was also pretty heavy footage to deal with (4K30, H.264, 130Mbps), and my CPU was the bottleneck on the timeline as I don't use proxies or anything along those lines.
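If anyone reading this does want to tame footage like that, proxies are the usual answer. Resolve can generate them itself, but an external pass is roughly a one-liner too; a sketch with ffmpeg, where the filenames are placeholders and the exact ProRes profile is a matter of taste:
```
# Transcode heavy 4K H.264 into a light 720p ProRes Proxy for editing
ffmpeg -i clip_4k30.mp4 -vf scale=-2:720 \
  -c:v prores_ks -profile:v 0 -c:a pcm_s16le clip_proxy.mov
```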
Yea, they fixed DX11 on Arc and Halo Reach runs around 200 FPS on my A750, modded.
Would the ASUS PRIME B660M-A WIFI D4 motherboard work well with this graphics card alongside an i5 - 12400F?
The 12400F supports ReBAR and is a really good performance match for this card, so pairing that CPU with this card would certainly make sense. I'm not 100% certain if that motherboard supports ReBAR, but considering it's an Asus B660 board I would think it likely does. As long as the board you choose supports ReBAR, it should work well with this GPU!
I have the budget for the 770, but instead I want to go for the 750 for the value for money.
From what I've seen, the 770 is the same as the 750 with more VRAM.
what do you think about this gpu compared to rtx 4060 on video editing in davinci?
I wish I could tell you, but I've not had any experience with a 4060 so I really can't say
Do you recommend that card as a long term investment instead of an RX 6650 XT?
I've been thinking about buying one but I'm not sure if it's worth it because of the drivers
I only want to play games like Battlefield and Halo at 1080p 75FPS.
I haven't tested it in Battlefield, but in Halo (especially with the drivers that came out very shortly after this video was released, which apparently fixed all of the issues in Halo: The Master Chief Collection), I think it would be a good option for you. I've heard from another commenter that it beats out the 6650 XT while costing less, so it might even be the better option of the two. Though, I can't promise anything in the Battlefield games as I've never run them on the card (I assume it would do pretty well in them, though).
@@CHWTT Do you think a 500W 80 Plus Bronze PSU would be enough to play at medium settings?
Can you scroll the DaVinci timeline smoothly at 1080p? Because I can scroll almost natively on an Apple M2: so smooth and fast, no lag. Was wondering if this would be an upgrade. I want a dedicated machine just for encoding.
Yeah, I actually just built an editing PC with this card in my latest video! The screen recording of me using it at the end of my new video wasn't the best, as I'd just loaded the project and it wasn't fully up to speed, but once the project is loaded this card (combined with a powerful CPU) is almost able to scroll through my timelines entirely smoothly. Though, my timelines are all 130Mbps 4K30 H.264 video, so much heavier than 1080p. Pretty certain it could be an upgrade over an M2, but definitely do a bit of research on it!
I'm looking forward to the productivity performance! Gaming is a neat benefit. 6700XT is great for gaming performance, TERRIBLE for anything else.
Agreed. I'm a strong advocate for AMD RX 6000 series GPUs for price to performance in gaming, but they really aren't very good at anything else!
Do you mind saying more on that?
I’ve tried to ask about that before and couldn’t tell what the difference is
Well, generally a lot of software just doesn't play nicely with AMD's hardware. The acceleration either doesn't work very efficiently, or it doesn't work at all. There are of course some outliers, and hardware acceleration support on AMD cards is getting better by the day, but it causes them to underperform in productivity software. With the Intel Arc card I tested, and Nvidia cards, most software can effectively and efficiently use their hardware to accelerate their processes unlike Radeon which is why I say that Radeon sucks for productivity. Though, most games love Radeon and do incredibly well on it (especially when you factor in the price!). Hopefully that was at least a little help!
Oh, and just a little example here: I believe that I saw somewhere - can't remember where/when/if this is even accurate but based on experience I believe it - that the RX 6950 XT (my gaming GPU) performs about like an RTX 3060 in Davinci Resolve. Having the top tier Radeon and a mid/low end Nvidia card from the same generation perform equivalently in productivity definitely demonstrates how inefficient hardware acceleration on Radeon seems to be.
Gunnir or intel which one will be better choice for arc a750?
I'm not really sure as I haven't ever used a Gunnir card, but I can tell you that the Intel LE one that I used here worked very well. Feels well built, temps were fine, was pretty quiet, and I think it looks pretty cool.
Will it do VR with a Quest 2?
I wish I could tell you, but I sadly don't have any VR headset to test it with at the moment
Please, no VR on low settings!
Saying OpenGL is old because it was released in 1992 is like saying DirectX 12 is old because DirectX was released in 1995... That being said, I believe it does use a version from 2009. Doom Eternal / Doom 2016 also use OpenGL by default.
Thanksss
Good video. You should come up with a channel name, brother.
Yeah, I've definitely gotta come up with something better... naming is hard though, am I right? haha
750% increase in performance for Halo now!
Please do a Maya render test.
I'll look into it and maybe run it sometime!
So the AMD GPU is not recognized in DaVinci Resolve? Wow... this is the thing with being an AMD customer: there are so many small annoyances that have persisted for 10+ years and never get fixed. Using AMD is like being stuck in the past, so much micromanaging and small annoyances for everything... their built-in GPU screen recording encoder is pretty crappy too; you have to bump up the bitrate to get "acceptable" quality (not even good video quality).
your AIO is mounted the wrong way
Since the pump block isn't the highest point in the loop, it's actually fine. If the pump were the highest point, the air would collect there and cause an issue, but in this case it doesn't. Thanks for your concern though!
Thank you for lecturing me, it's always nice to learn something new 😊 keep up the vids man!
@@antonstark9689 He is wrong. With the lines high, the pump will pull in air as the coolant level drops due to molecules slowly evaporating through the hoses. The hoses should be at the bottom. In 2 years he will find out you were right.
Oh, so I wasn't out biking on the highway after all :P Thanks buddy! @@Gl0ckb1te
I just need Intel to bring some competition to Nvidia, because AMD is just getting gamers banned.
Why? You'll keep buying Nvidia anyways
why is AMD banning gamers?
@@jordankelly4684 This is the sad reality; it'll just be Intel and AMD fighting over the 10% of niche gamers, with 90% of lemmings buying Ngreedia.
Not many places here sell Intel Arc. For the price of an A750 you can get a 6700 10GB, the 8GB version of the A770 costs the same as the A750, and the 16GB version of the A770 costs the same as a 6700 XT. Currently I wouldn't buy any of them, as games are getting more demanding and even 12GB on AMD might not be enough.
Ah yeah that's fair enough. If the prices are like that where you are, something like the 6700 (XT) would definitely make more sense!
8 gigs is enough for 1440p gaming, you don't need more than that. Heck, I was gaming at 1440p with an RX 580 4GB and a Ryzen 2400G, hitting 80+ frames in R6 Siege and WoW retail at 1440p.
Low settings at 1440p for modern games, maybe medium if the game isn't so new.
bro, the fact that intel DEMOLISHED nvidia in ray tracing (in terms of performance drop) is remarkable
It's good.
The 3050 and up are all great cards.
The Hitman graphics demo reminds me of Arkham City with ray tracing.
I am cheering for Intel. I think Nvidia and even ATI are pretty arrogant, though Nvidia takes the prize for arrogance.
Agreed, I'm really hoping Intel can break into this market because I think they're doing some great work. I'm totally going to be checking out their Battlemage cards when they launch, and man am I excited to see those cards!
How in the world do you not have titles that run lower than DirectX 11? How is that possible, your collection cannot be THAT limited..... AC1, BioShock (1 and 2), Call of Juarez, Company of Heroes, Crysis, Far Cry 2 and 3, Max Payne, STALKER, just.... none of that?
Neat little card, just turn the settings down a little on AAA games and away you go.