Games :
Hitman 3 - 0:20 - gvo.deals/TestingGamesHitman3
Red Dead Redemption 2 - 1:15 - gvo.deals/TestingGamesRDR2
Forza Horizon 5 - 2:21 - gvo.deals/TestingGamesForza5
Microsoft Flight Simulator - 3:34 - gvo.deals/TestingGamesMFS20
Spider-Man - 4:28 - gvo.deals/TestingGamesSpiderManPC
PUBG - 5:33 - gvo.deals/TestingGamesPUBG
CYBERPUNK 2077 - 6:32 - gvo.deals/TestingGamesCP2077
God of War - 7:32 - gvo.deals/TestingGamesGoWPC
Horizon Zero Dawn - 8:21 - gvo.deals/TestingGamesH0D
The Witcher 3 - 9:11 - gvo.deals/TestingGamesWitcher
System:
Windows 10 Pro
Core i9-12900K - bit.ly/3H4waEv
MSI MPG Z690 Force - bit.ly/3GVTNi6
32GB DDR5 5600MHz RAM - bit.ly/3BOxlni
CPU Cooler - be quiet! Dark Rock Pro 4 - bit.ly/35G5atV
GeForce RTX 3050 8GB - bit.ly/3AzdMk9
Intel ARC A750 8GB
SSD - 2xSAMSUNG 970 EVO M.2 2280 1TB - bit.ly/2NmWeQe
Power Supply CORSAIR RM850i 850W - bit.ly/3i2VoGI
Not a fair comparison... better to put this up against the RTX 3060 12GB or RX 6600 8GB... the RTX 3050 is a tier lower than those.
Imagine if Intel also keeps working on their drivers; it would be great to have a third player in the video card market.
You should give feedback to Intel.
It is kinda impressive for their first attempt at launching a graphics card. We can only look forward to their newer launches in the future.
Naw, they're not going to do ANY work on their drivers now. 😐
Lol
Hey, hopefully we could see some "AMD Finewine" on Intel. I hope the cards get better as the driver matures.
Remember, AMD & Nvidia had to deal with sh*tty drivers before and it took them years (or even decades) to perfect it. I just want to say give Intel some time to improve, their GPU division already said that Battlemage is at a much better development position atm compared to Alchemists before.
You know this, man. A more competitive market is what's needed. Great comment, man.
Damn man, that's bad news for NVIDIA, but we'll get some competition that brings good GPUs for us.
much higher power consumption though, just like amd
@@xtxo who cares power consumption everything is about fps
@@xtxo bro, this power consumption difference won't even put $5 on your bill even if you run it at full load for 10 hours a day. Stop talking about marginal power consumption or heat.
It will be more expensive than we expected; the low-end A380 is going for like 180 EUR in our shops right now, and that's the low end. I would not buy that at such a price.
@@xtxo approx 200-210w at most under load fam. Actual high-end cards from both Nvidia & AMD right now consume 300 if not more under load. This higher power usage than the 3050 is literally a nothing-burger to even point out.
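For what it's worth, the power-cost claim above checks out with some back-of-the-envelope arithmetic. The ~60 W delta and the 10-hours-a-day figure come from this thread; the $0.15/kWh electricity rate is my own assumption, so adjust for your region:

```python
# Rough monthly cost of the A750's extra power draw vs. the RTX 3050.
extra_watts = 60            # A750 draws roughly 60 W more under load (per the thread)
hours_per_day = 10          # heavy-use scenario from the comment
rate_usd_per_kwh = 0.15     # assumed typical electricity rate, not from the video

extra_kwh_per_month = extra_watts / 1000 * hours_per_day * 30
cost_per_month = extra_kwh_per_month * rate_usd_per_kwh
print(f"{extra_kwh_per_month:.1f} kWh/month -> ${cost_per_month:.2f}/month")
```

Even at double that electricity rate the extra draw stays well under $5 a month, which is the commenter's point.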
The low 0.1 and 1% on A750 is incredible
a very important data
That means the A750 provides smooth gaming, nice.
great job
The 0.1% figure is not that reliable as data, because it can also be caused by the CPU or RAM.
Feels unreal to see Intel graphics playing games with good framerates & graphics. I'm excited to see what Intel brings in the future.
I'm getting 1990s vibes, omg, this thing is back.
Even though it has better frames, they also seem smoother. Good shot for Intel.
@@dapz44 I'm a month late, but it's probably because of the i9 & the DDR5 RAM that's giving the smooth frametimes.
ARC A750: Who's your daddy?
RTX 3050: You sir!
Literally a GTX 760 to the A750 in DX11 or older games: who's your granddaddy?
@@theplayerofus319 just w8 for driver updates
@@theplayerofus319 DX11 is still a thing lol, worst API ever, relying on a single thread
@@caribbaviator7058 and? Still great games that use it, worse or not.
Nope...4090 is the daddy
god I love competition in the mid range, I might just actually go with an intel, love the vram amount they offer
you will spend more energy
@@hugotakayama6963 i dont care, my monthly bill is around 10 dollars at max load of everything, including every component , not a problem for me 👍
@@hugotakayama6963 oh good point, I didn't even notice the energy usage
Are you a professional content creator?
@@peadee2774 depends, I do video editing, but I've had several games that needed more than 8GB of Vram
A750 has good value compared to A770
If compared with the RX 6600, which is better?
A770 also has good value
Cheaper and better performance than 3060
@@Adrninistrator not good enough if compared with rx 6700 xt
@@onnokis5596 amd is bad
@@beataoo for what
For productivity relatively, or driver optimization hmmm
But looks better than rtx series
I hope Intel achieves great things for their second generation of GPU's.
Because of that high energy consumption, perhaps they originally intended to make a higher-end GPU; that's why they doubled the VRAM. Now, instead of selling it in another segment, they positioned it as entry-level.
Great for consumers
I believe the 770 was designed to compete at the 3070 level; due to driver infancy it's now up against the 3060/3060 Ti. If that bet is right, six months from now there'll be no stock. Grabbing one now makes sense.
This card is really optimized for 1440p gaming, it'll compare even better to the 3050 in 1440p.
144Hz monitor: am I a joke to you?
@@verchielGold not the hz but the "p" / 2 K
@@pemainmobile8007 you didn't understand his joke
We already have The "Green Team" and The "Red Team"
Now The "Blue Team" joins the Race
Then red and blue form a rebel alliance to fight the evil green empire! :)
And then, after victory, blue kills red and goes to the dark side...
Happy end!
😂 it's getting real
The power consumption is almost double, but the performance and the price are quite advantageous! And that's while it's still a new card; imagine when it's stable and working perfectly? That's really interesting! Intel did great! 💙
This card has a lot of potential for the very near future.
I'm surprised. If this continues, the margin Battlemage could have would be brutal. It seems I'll be buying Intel. Bye, Nvidia.
Did anyone notice the less processor usage and less RAM usage on the ARC?
And it still somehow gives better performance!
Note how the A750 is 30% faster than the 3050 in DX12 titles but drops to only a 10% advantage in DX11 ones due to software emulation. Still, I can see how it could be a really nice alternative for a budget gamer running all the current competitive titles at 1080p.
Nice vid.
Please test the A750 versus the RX 6600.
I'd like to see this Intel card against RX 6600
Hope Intel will continue to improve their GPUs, which would be good news for customers.
I hope Intel becomes the good guys in the GPU market. Fair prices, in stock, and can compete with Nvidia and Amd.
Please also test Intel GPU with DirectX 9, 10 and 11 games because they might not perform well on Intel GPU
The 3050 is totally toast between the RX 6600 and Arc A750.
RTX 3050 joke edition
He's not even testing an A750; he just renamed GPU1 to "A750" in MSI Afterburner. Look at the clock speed on it: it's running 350MHz faster than an A750, and the A750 doesn't have a boost clock. It's like ShadowSeven's videos; he did the same type of benchmark with an "A750" (which it wasn't), except he made the mistake of not hiding VRAM usage, and it went to over 10 gigs.
One has to remember that the ARC A750 uses 16 PCI Express lanes, while the RTX 3050 uses only 8. That gives the ARC an overwhelming advantage from the get-go.
Please bring more intel arc gpu benchmarks
With this kind of performance from their first gen gpu's, intel can dominate the mid range gpu market in the coming years.
Man, the A750 coming in strong and stealing the 3050's spotlight!... Although, the 3050 at $300 is a steal in and of itself... And it's not like the 3050 had any spotlight to begin with...
Do you recommend buying the windows key from the sponsor? Is that a legit site?
If you live in the EU, it's legal; if you live in the USA, I'm not sure about that. Whether this specific site is legit, I don't know, but reselling licenses from company volume licenses has been legal in the EU for a few years.
Make a video (ARC A750 vs ARC A770)
What dimension did I step into?
_"Meanwhile, in a parallel universe..."_
Please repeat this test. Besides the 3050, could you compare the 1660, Super, and Ti against the A750?
Nice job by Intel
Once there were Red VS Green
Now we have Blue VS Red and Green
Good job Blue with his two top contenders
More power (wattage) consumption on the Arc A750 side;
boy, the electricity bill would be expensive (& yeah, there are more power-hungry cards out there),
but for the price-to-performance ratio compared to the 3050, you get what you pay for (with a small hidden twist)
How does the A750 win in every DirectX 11 title 🤔
What a beauty of a graphics card, I'll buy it.
Can you add Doom or just a few more Vulkan games in the next tests?
The ARC A750 is more powerful but consumes more power ⚡⚡⚡⚡
Wow nice I can buy a 1650 now.
This card would have been the biggest hit ever if it came out 8 months ago when it was supposed to
It doesn't matter; we aren't getting anything new in the budget segment for at least a year (or even two), because RTX 40 will keep costing ridiculous money due to expensive TSMC capacity, and we already see Nvidia doing refreshes of the 3060, 3060 Ti, and 3070 Ti by cutting down the excess stock of 3090 Ampere GPUs they have. Granted, RDNA3 may shoot down ARC, but a budget offering like the RX 7600 or RX 7500 may still be at least half a year away.
In my country the RTX 3050 costs 410 USD; I can't even imagine the prices for the new GPUs coming from Intel.
I'd say this one would be like 330-350 where you live, assuming the street price isn't far off from MSRP.
Thanks for review!
Good first GPU .
Good job Intel
Intel ARC A750 does a good job...only problem is power consumption...
It has double the memory bandwidth, 25% more clock speed, and 40% more shaders, and performs like a higher-tier product; how could it consume less power?
Wow, Intel just got a BIG FAT W, and they are not even fully optimized yet.
Expect some good 50-series cards from Nvidia now that they've got competition.
Hey, did you pirate The Witcher 3, or do you have a mod installed? Because I don't think the maximum texture setting is Halk Ultra HD.
i need EVGA arc
lol EVGA is out of the video card business
no more evga video cards ever
@@Todd_Govard857 only for NVIDIA
@@jayesh969 lol EVGA was Nvidia-only anyway
Off topic, but did anyone else notice how in Hitman the NPCs always sound British no matter what country 47 is in? 😂
A750 is doing a very good job for the first generation. RTX 3050 is a joke, horrible price/performance.
Yo, bro. When are you gonna upload test of 4090?
They need to work on efficiency... the A750 is drawing almost 80% more power.
Was Resizable BAR on with the A750?
never thought i would be rooting for intel
Perfect timing now that NVIDIA is looking for suckers to fill the MINING BOOM trend.
@@procrustes7669 yeah they deserve to fall because of their greediness
Notice how the Intel GPU is able to utilize the CPU more in each of these games
Because it's getting higher fps?.. like any other gpu?
@@AncientED5 lol yeah if it can push more frames then it will demand more compute from cpu
The 3050 with DLSS is still the better choice, because of lower power consumption!!
excellent for the startup
This video is one year old, but many benchmark sites still rate Arc builds 40-50 out of 100!!!!
That’s some great value
Bro we need streaming test with the gpu encoder
God damn, not too shabby for a first attempt at a proper GPU. I vividly recall tolerating Intel Extreme Graphics back in the day; to see them make a graphics chip that doesn’t suck for once is so refreshing.
Wow! It's not bad, Intel! But what about the price?
The thumbnail 😐
$279 I think, vs $300 for the RTX 3050
RTX 3050 is totally pointless for its current price range.
Which has better DLSS-style upscaling and ray tracing, the Intel Arc A750 or the Nvidia RTX 3050?
Intel did a Good job
What about ARC A750 vs RTX 2060 since those are still in stores?
What if Intel made their drivers open source, so that an army of a million independent programmers and thousands of companies could develop the best drivers fast..?
The RTX 3050 is such bad value. The RTX 2060, RX 6600, and A750 are all better and cheaper. The RTX 3050 needs to be $200 at the most.
Not cheaper but yeah more value
when is ARC out?
I think the most interesting thing about these Intel video cards is that they have PCIe 4.0 x16, meaning you can use them with PCIe 3.0 processors while losing almost no performance, which the RTX 3050 can't do because it comes with PCIe 4.0 x8. AMD made the same mistake with the RX 6600 and RX 6600 XT, which are not the most optimal options for an upgrade if you already have a build with a PCIe 3.0 processor.
The video rendering of the two cards should also be tested.
Do arc GPUs support Ray tracing?
A750 nice!
Hmm, the price is just right
Good job intel, but the downside is power consumption
For a first announced product, Intel did it right. Good job.
I have the Arc; it's very powerful in DirectX 12 games, but in games like CS2 that use an older DirectX, it loses quite a few FPS. 😂
Same bro
Intel needs a more TDP-efficient design.
I wonder how shocked the people in the comments section will be when they see this channel test the 4090?
RTX 3050 is way too overpriced. It should be around 149 USD.
Twice the RX 570's power usage; 110 watts was my personal tweak.
The latest GeForce xx50-class cards' power usage doesn't feel mid-range.
Intel ARC GPUs need more optimization due to their power hunger
You should give feedback to Intel.
But the Intel card consumes more energy. It's around 60 watts more than the 3050 so it means more performance, right? Nonetheless it's nice to have a new competitor.
Who needs energy efficiency anyway?
I am fed up with high Nvidia and AMD prices, so I might just go ahead and buy an Intel GPU....
These Intel video cards are insane.
Intel just took a W
ARC A750 vs RTX 3060
I have an i5-12400F; what should I choose? ARC?
Yes, Arc is cheaper nowadays than an RTX 3050.
A new competitor is great for the market... but someone has to buy their GPUs
Do these have dlss?
Now intel has joined in the GPU wars. The card looks promising though
Does it have ray tracing?
Yes, the 7xx has it. And, surprisingly, in some games with ray tracing enabled the 770 even beats the _(comparable in power)_ RTX 3xxx and RX 6xxx models.
Are you glad that Intel has caught up?!
@@darkspawn6214 Without a shade of irony, I'll say I'm happy for Intel. Especially considering their "HD Graphics" was only good for really simple games and didn't compete even with the budget options from the greens and the reds.
Plus, the GPU monopoly gets diluted a bit by a third contender.
Now, if only they put chips like this into desktop CPUs... Then people with barebones and mini PCs wouldn't have to settle for dubious APUs from the reds.
Bruh you can get RX 6650XT for $265 on Newegg right now. Cheaper and more powerful than both
Need to compare it to the 3060
Oh my god, friend!!! How can you compare the RTX 3050 vs. the Arc A750? That's a mismatch; it's like pitting a GTX 1060 against an RTX 3050. Clearly the Arc A750 is meant to compete with the RTX 3060, and if it beats that, then with the RTX 3060 Ti. If it falls a bit short, no problem: driver improvements will fix the performance soon enough.
tbh its more comparable to the rtx 3060 if we are comparing it to msrp prices ..
I think it's a good attempt from Intel, but what about the resolution 🤔🤔
Why is the difference in RDR2 so huge?
ARC A750 VS RX 6600 ?
What about the RX 6600? I think it's the better option in the same class.
It's hard to imagine that Intel, a company that's new to discrete GPUs, can create a GPU that beats the 3050 at the same price. But then again, that's why they call Nvidia the greedy green machine.