Intel Arc GPUs got recent speed boosts due to better drivers and software support. Intel may just be the best value for your money currently.
We would love to see another benchmark of the Arc A770. I heard it has improved significantly after the numerous driver updates. Have a great day, sir!
I can definitely retest it!
Was hoping to see it compared to a 3060 or anything in the same range on the Nvidia side, but the AMD 6600 gave a good idea of the spectrum. Thanks for the testing!
Thanks for tuning in! My streams about GPU tests have more current data if you would like to see it. Also, I'll post a video today with a little summary of all my benchmarks too.
@@ContradictionDesign Will check it out. Thanks friend!
Intel Arc A770 16 GB is awesome 👍👍
Yeah I was not disappointed with it for sure.
The issue is OneAPI. The card is pretty good at ray tracing; Blender does not utilize it very well yet.
Yeah I have fairly frequent crashes with the arc GPU as well. Hoping we get some improvements soon!
This card is becoming very cost-effective, exhibiting superior performance in both gaming and rendering compared to other cards in the same price range
They are really cheap. As long as re-bar can be enabled, they are decent GPUs. I am excited for the next generations
@@ContradictionDesign What does re-bar mean? Would you recommend this for 3D renders and animation in blender?
@@GODMODELAGGER Resizable BAR is a motherboard feature that allows the GPU to request larger packets of data from the CPU at a time. It can significantly speed up many programs, including rendering and gaming. For Intel GPUs, this feature is necessary to fully utilize the speed capabilities of the GPU; their architecture relies on it to manage VRAM efficiently. Nvidia and AMD GPUs can benefit too, but they do not rely on Resizable BAR the way Intel GPUs do.
The kicker is that older motherboards do not have the ability to use re-bar, and most of the time people will try to pair older, low-cost systems with new GPUs, or multiple old GPUs, for rendering. There are few other reasons an old PC is a problem for rendering, so farms on a budget will build cheaper machines. But for Intel discrete GPUs, re-bar is needed, or you lose the percentage of speed that I demonstrate in this video.
Lots to read, but I hope this helps!
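If you're on Linux and want to sanity-check whether re-bar is actually active, here's a rough sketch of a helper script (hypothetical, not something from the video) that parses lspci output and looks for a multi-gigabyte BAR on the GPU:

```python
#!/usr/bin/env python3
"""Rough Linux-only Resizable BAR check (hypothetical helper, a sketch only).

With re-bar enabled, one of the GPU's memory BARs is sized close to the full VRAM
(e.g. 16G on an A770 16GB); without it, you usually only see a 256M region.
"""
import re
import subprocess

# 'lspci -vv' prints one block per PCI device, including its BAR ("Region") sizes.
out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout

for block in out.split("\n\n"):
    # Only look at graphics devices.
    if "VGA compatible controller" not in block and "3D controller" not in block:
        continue
    print(block.splitlines()[0])
    sizes = re.findall(r"\[size=(\d+)([KMG])\]", block)
    print("  BAR sizes:", ", ".join(f"{n}{u}" for n, u in sizes) or "none reported")
    # Heuristic: any multi-gigabyte BAR usually means Resizable BAR is active.
    if any(unit == "G" for _, unit in sizes):
        print("  -> Large BAR found; Resizable BAR looks enabled.")
    else:
        print("  -> No multi-GB BAR; Resizable BAR is probably off or unsupported.")
```

On Windows, tools like GPU-Z report the Resizable BAR status directly, so no script is needed there.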
Thank you man for this vid, you deserve a sub right away. Can you try Blender 4 please?! And that Intel Arc GPU, does it have something like NVLink or what?
Thank you! Intel does not have any linking protocol like Nvlink.
I have tried Blender 4.0, the speeds were similar to 3.6. I will definitely re-test the ARC with newer drivers though on Blender 4.1. They have continued to update drivers, so I need to stay on top of them.
@@ContradictionDesign Thanks man, thank you very much. Can you look at GPU usage? It will go over 10 per 100 (10%). Can you try Unreal Engine ray tracing with it please, to see how it performs in ray tracing? You have to activate ray tracing in the Unreal Engine settings and restart the software. Thanks man.
Hey I will be doing some Unreal Engine stuff eventually, but not for a little while. Have too much to catch up on for a bit.
Are you going to run more tests using the Intel ARC A770 GPU with Blender 4.0?
I certainly will. But I did just run a livestream recently with 4090, 7800 XT, and a770 to test rendering in Blender 4.0. so check that out if you'd like.
But I will test more soon, and especially once Blender 4.1 comes out
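For anyone curious how I time these, it's basically just a headless render plus a stopwatch. A minimal sketch (the .blend filename is only a placeholder for whatever test scene you use):

```python
# Minimal sketch: time a headless Cycles render from Python.
# "benchmark.blend" is a placeholder; point it at your own test scene.
import subprocess
import time

cmd = [
    "blender",
    "-b", "benchmark.blend",  # -b = background mode, no UI
    "-E", "CYCLES",           # force the Cycles render engine
    "-f", "1",                # render frame 1, then exit
]

start = time.perf_counter()
subprocess.run(cmd, check=True)
print(f"Render took {time.perf_counter() - start:.1f} s")
```

Running it a few times and averaging smooths out warm-up differences between driver versions.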
I want to buy a 3060 for gaming and editing, so which would be good? This Intel GPU or the 3060 12GB?
I think the intel Arc A770 16 GB will be a bit faster than the 3060 now with the latest drivers. However, just make sure your PC can enable Re-sizable BAR, otherwise you may not get the full performance from the intel GPU. Plus, AV1 encode is fantastic, so that is another advantage the Intel has over the 3060.
Is an Intel i3-13100F processor with an Intel Arc A750 8GB GPU a good pair for editing, like Blender, AE, Premiere Pro?
These parts will do just fine for Blender, especially if you are just learning. I don't use the other software. But I do know that the A750 has AV1 encoding, so that is nice for recording or streaming. Just maybe check to make sure that AE and Premiere Pro have good support for Arc GPUs.
Sir, I want to buy a budget graphics card for running Blender. Could you suggest one please?
Edit: I have Windows 10
Well, it depends on what exactly your budget is. For $220, I would look at the Intel Arc A750. It has 8 GB VRAM, and it should be nearly as fast as the A770. My testing shows the Intel Arc GPUs to be quite fast in the latest Blender updates.
At around $300, the RTX 3060 is good with 12 GB VRAM.
Just be sure you are able to run your PC with Re-sizable BAR. This is a must for Intel GPUs.
Let me know what your ideal budget is and I can give an exact answer!
Please test with the new drivers. Intel says performance increased by 40%.
Again? How recently?! That is super exciting if it holds true
Create a minotaur for a new Zelda style level in blade & sorcery
I can look into this! Don't know the game yet, but I will research! Thanks!
I have a rtx 2060 6gb, worth upgrading to one of these?
Speed wise the Intel is a bit faster and the additional 10 GB VRAM is nice. So I would say it is probably worth it. I will have to test a 2060 to know. Closest GPU to 2060 that I have is the 2060 super
@@ContradictionDesign it's the 16GB of VRAM that attracted me, and the price. I do a lot of stuff in Daz3D and haven't really figured out how to convert stuff to Blender well yet, but I was always running out of VRAM when I was trying to.
@@dazl12121 ohh I see. Yeah the VRAM is great. The A770 is a great card for the price. Just know, they are still fixing drivers, so there will be a couple bugs here and there. But for the most part, they have fixed them nicely.
Hah. I upgraded from an RTX 2060 6GB (non-Super) to an RX 6600 to an A770. If you do 3D like Blender, it's definitely worth it. With the recent updates, the A770 on Windows beat the RTX 2060 with OptiX for Blender. When I was using the 2060, it made me nervous whenever the VRAM usage went over 5GB on Blender's status bar; now I laugh at VRAM. It's like, bring it on, Blender. But if you do A.I. stuff, Nvidia is probably better. Normal A.I. stuff just doesn't work, and you have to find a special fork that supports Intel. Also, I think the engineers who made the A770 were rich people who never worried about their electricity bill. This thing consumes about 40 watts just idling, and the stupid neon-sign-like LED can't be turned off, further consuming electricity. It's summer, my room has no A/C, and when I turn my computer on, it feels like turning on a heater.
@@typingcat Haha yeah, and the 250-watt TDP doesn't help. The A770 is great for rendering though.
Very interesting.
Can you please test AI performance?(stable diffusion, llama2 etc)
It would be very much helpful
Sure! I will figure out how to test AI and make a comparison with some other GPUs!
@@ContradictionDesign thank you so much.
There are many WebUIs available for Stable Diffusion, the most popular being "Automatic1111". There is also Vlad Diffusion, InvokeAI, etc.
For speech synthesis, you can use Tortoise TTS, Bark TTS, etc.
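A rough starting point for testing Stable Diffusion on the Arc, if it helps (big assumptions: intel_extension_for_pytorch is installed and exposes the "xpu" device, and the model id here is just one example):

```python
# Rough timing sketch for Stable Diffusion on an Arc GPU.
# Assumes intel_extension_for_pytorch is installed and registers torch.xpu.
import time

import torch
import intel_extension_for_pytorch as ipex  # noqa: F401  (registers the "xpu" device)
from diffusers import StableDiffusionPipeline

device = "xpu" if torch.xpu.is_available() else "cpu"
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # example model id
    torch_dtype=torch.float16,
).to(device)

start = time.perf_counter()
image = pipe("a photo of a graphics card on a desk", num_inference_steps=20).images[0]
print(f"{device}: {time.perf_counter() - start:.1f} s for 20 steps")
image.save("test.png")
```

Automatic1111 and the others wrap this kind of loop in a UI, so timings from a bare script like this should roughly line up with what the WebUIs report per image.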
@@nokiaairtel5311 very cool! I will research those. Thanks for the ideas
@@nokiaairtel5311 Don't forget the RVC (Retrieval-based Voice Conversion) WebUI. I would be interested in how much time it takes to train a model.
Ok I will get this on a list of ideas! Thanks
The A750 should give similar performance and will be a much better value-for-money card.
The A750 looks great too. But I really was drawn to the $280 pre-tax price of the a770 with 16 GB VRAM. Otherwise the a750 looks great
Be aware that this is only the Cycles rendering result, with the aid of dedicated ray tracing cores. In terms of viewport performance, Intel's cards absolutely suck, often ranking below the RX 6600. While Arc's rendering time 'seems' good enough, you have to understand that AMD cards don't yet have dedicated RT cores. Compared to Nvidia's cards, which also have dedicated RT cores, Arc doesn't even come close.
So just get Nvidia's cards for the moment. I highly suggest getting a used 3000 series over a 4000 series or Arc if you're buying it for productivity purposes. Maybe with Battlemage and onward they'll get better, but right now, Intel Arc is just not even worth considering.
Intel also has very high power draw for the amount of work it does. But yeah, hopefully the future will be better for Intel cards. Also, Intel GPUs need Resizable BAR enabled to run their best, which is not possible on older machines and is therefore a problem for people trying to get cost-effective rendering hardware.
This card uses oneAPI/SYCL and Embree hardware acceleration for ray tracing, which is Intel's version of RTX.
@@syntaxed2 Yes that is the case. And it makes the intel GPUs very competitive for rendering, since they are so cheap.
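If anyone wants to force Cycles onto the oneAPI backend from a script instead of clicking through Preferences, this is a minimal sketch (assuming Blender 3.6+ with working Arc drivers; run it inside Blender, e.g. with blender --python):

```python
# Minimal sketch: select Intel's oneAPI backend for Cycles via Blender's Python API.
# Run inside Blender (e.g. `blender --python enable_oneapi.py`).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "ONEAPI"   # instead of CUDA / OPTIX / HIP
prefs.get_devices()                    # refresh the detected device list

for dev in prefs.devices:
    dev.use = (dev.type == "ONEAPI")   # enable only the Arc GPU(s)
    print(dev.name, dev.type, "enabled" if dev.use else "disabled")

# Tell the scene to render on the GPU devices selected above.
bpy.context.scene.cycles.device = "GPU"
```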
low price, huge vram, good performance, ... seems like Intel Arc is a good budget option.
Yeah it is very exciting! And it was very quiet for these tests, so you could probably get some better temps too.
It also has a modern thermal design, with one blower-style fan and one regular fan and airflow passing through the heatsink, plus someone's favorite RGB design.
The cooling was fantastic in these tests, especially with a slightly higher fan curve. Even at 75%, the fans are very quiet.
Intel is actually a decent option vs Nvidia cards, and for sure a better option than AMD. Its only downside is much higher power consumption for the performance it delivers, making Nvidia still better priced in the long run.
@@boyinpyjamas that is what I found too. 4090 is about 5x faster for 5x the money, but only 2x the power.
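Just to spell those ratios out (using only the rough relative numbers above, nothing newly measured):

```python
# Back-of-the-envelope comparison using the rough relative figures above
# (A770 = 1.0 baseline; 4090 ~5x speed, ~5x price, ~2x power draw).
speed, price, power = 5.0, 5.0, 2.0

perf_per_dollar = speed / price  # 1.0  -> about the same value per dollar
perf_per_watt = speed / power    # 2.5  -> the 4090 does much more work per watt

print(f"relative perf/$: {perf_per_dollar:.1f}x, relative perf/W: {perf_per_watt:.1f}x")
```

So you pay for the speed up front, and the efficiency gap is where Nvidia still pulls ahead over a long render queue.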
AMD for Blender is never worth it. Any 20-series or newer RTX-capable card wins in performance value; doesn't matter if it's a 2060 or a 2090.
AMD is definitely not the best option currently
AMD SUCKS for Blender. Especially on Linux.
I mean. Basically yeah haha. For their price, they don't keep up
LMAO, AMD cards are really junk for production. They should be ashamed of that, because Intel Arc already destroyed AMD Radeon in its first gen.
Yeah it's pretty crazy how good these Intel GPUs are now. I am very excited to see what the future generations will look like. I bet we'll have a new king if they keep up with it
@@ContradictionDesign More competition would sure be good for us. AMD sucks at production, Nvidia is the best right now but way too expensive, and Intel is the only hope to put pressure on that company so we can finally get cheaper GPUs.
@@runninginthe90s75 Yeah I am very hopeful!
AMD is rubbish if you want to use Blender on Linux. This sucks because Nvidia doesn't work well on Linux, so Linux users have no choice but to switch to AMD. They say that AMD just doesn't have the resources to make good Linux drivers, but that wouldn't have been a problem if they made the driver fully open source.