These results are pretty impressive, definitely a huge improvement over Intel's previous effort. I hope they continue with their GPUs
In your comparison with the 890M could you try one game with raytracing enabled? I wanna see how close we are to handheld raytracing and Intel has had quite strong raytracing performance before.
Unfortunately not in the 890m comparison. Had to return that laptop a few weeks ago so I'll have to rely on the recordings that I have 😅 but I can do that in the extended gaming test. Any game you would specifically like to see with some RT enabled?
@@Hubwood Alan Wake 2 with RT if possible
I'm impressed with what the Arc is doing with 20-30w. I'm running everything off of solar power these days and have been playing with SBCs and sff builds and it's given me a better perspective on power needs and efficiency. I love my beast of a PC for AAAA games, but I don't need that much power used all the time.
I'm super happy these low power chips are being made, they got me buying more PC parts this way too, lol. BUT if I can save on power then it all works out in the end, right?
Intel just released new GPU drivers and they claim a big performance boost with them, so it's worth installing and retesting
AMD did too, and we got an 8% performance boost for free - checkmate
Lunar Lake power consumption also includes RAM power consumption.
Yepp
It's good to see how single player games run on integrated graphics. Maybe a 14-inch laptop with no discrete GPU is an option nowadays for simple single player games
Great. Lunar Lake has its RAM on the chip, which means that 28 watts also includes RAM power consumption.
@@HDRPC yepp that's true.
And WiFi, not only RAM. It uses about 22W vs 28W for AMD
exactly what i was looking for. Thanks!
Thank you so much
890m vs 140v please!
@@djayjp tomorrow 10am European time.
He has a 780M vs 890M comparison video. It shows the 890M is quite a bit faster at 20W (Z1 Extreme vs HX 370). My guess is that the 890M will beat the 140V at 30W.
@@PatJuhNL The 780M already wins over the 140V because of the wattage: 15W vs 30W, not 60W, hehe. I read on TechPowerUp that the mobile 780M needs only 15 watts
@@PatJuhNL The 890M is not integrated, so we can't compare the 890M with the Arc 140V integrated graphics 😢
Great testing
While the FPS looks great on the Arc 140V, the frametime graph looks quite bad in almost all the games tested here. I didn't notice it at first, but the 140V appears to be noticeably more stuttery: there are a lot more spikes, and the variation in frametimes is clearly wider.
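The spikes described above are exactly what an FPS average hides. Here's a minimal sketch of how average FPS and 1% lows diverge when frametimes are inconsistent; the frametime samples are hypothetical, not values measured in the video:

```python
# Hypothetical frametime traces (milliseconds per frame) - illustrative only.
smooth = [16.7] * 100                # steady ~60 FPS
stuttery = [14.0] * 95 + [70.0] * 5  # similar average FPS, but with spikes

def avg_fps(frametimes_ms):
    # Average FPS = frames rendered / total time
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low(frametimes_ms):
    # FPS implied by the worst 1% of frametimes - a common smoothness metric
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000 * n / sum(worst[:n])

print(f"smooth:   {avg_fps(smooth):.1f} avg FPS, {one_percent_low(smooth):.1f} 1% low")
print(f"stuttery: {avg_fps(stuttery):.1f} avg FPS, {one_percent_low(stuttery):.1f} 1% low")
```

Both traces average close to 60 FPS, but the stuttery one collapses to roughly 14 FPS in its 1% lows, which is what a wider frametime graph translates to in practice.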
Why didn't you test 30W against 30W? The 780m is bottlenecked by the VRAM Bandwidth, raising the TDP from 30W to 60W barely gives any performance benefits.
Sorry, didn't have time to benchmark all wattages, so many things to do on my desk :(
I'll do some 30W Benchmarks for the 890m though. Which is up next.
@@Hubwood The 890m needs 35-45W 😅
@@samserious1337 Well, I can't test the 140V at 35-40W 'cause it's not going to use that much.
You can't argue that a benchmark is unrepresentative only because it is not "100% fine tuned" for one side.
"Why didn't you test 30W vs 30W", "the 890M needs 35-40W" - absolutely brain dead; clearly you don't care about how representative the benchmark is.
@@humanbeing9079 This has nothing to do with *fine tuned* , you compare apples to apples - something even you should understand. You do realize that these chips have a cTDP? And the 890m needs 35-45W because it has 16CUs instead of 12CUs as the 780/880m, thus the higher power requirement - else it would just lower its clockspeed too far which would result in a marginal performance difference. But hey, seems like you don't know any of that.
4:34 Just for reference, the ROG Ally Z1E (non-X) gets 41 FPS in SOTTR High 1080p at 30W, so basically 2 FPS for 30 additional watts wasted on CPU and GPU clocks.
9:16 And 37 FPS at 20W, which makes the Ally perform better at lower watts than the Ally X.
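To make that point concrete, here's a quick FPS-per-watt calculation from the numbers quoted above; the 60W figure (41 + ~2 FPS) is inferred from the comment's estimate, not a measured value:

```python
# FPS-per-watt for the SOTTR High 1080p numbers quoted in the comment.
# The 60W entry (43 FPS = 41 + ~2) is an inference, not a measurement.
results = {
    "Ally Z1E @ 20W": (37, 20),
    "Ally Z1E @ 30W": (41, 30),
    "Ally X   @ 60W": (43, 60),
}
for name, (fps, watts) in results.items():
    print(f"{name}: {fps / watts:.2f} FPS per watt")
```

Efficiency falls from ~1.85 FPS/W at 20W to ~0.72 FPS/W at 60W, which is why the lower-wattage Ally looks better here.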
Overclocked 890M @ 65W (Beelink SER9) vs GTX 1060 or RX 480 at some point? 🙏
Thank you for this video 👏👏👏
Thank you. I enjoy your videos.
Glad you like them!
Intel has so much potential for GPUs. It just depends on their CEO, their research, and packing in their "Intel"ligence
Great comparison, looking forward to the Ryzen AI Max. What version of Windows is it? On Linux the 780M is faster than the Intel 140V
Newest version of windows 11 Home
Have you tried the new driver for Lunar Lake yet? Maybe something has improved. Generally speaking, Battlemage delivers much more consistent performance; there are fewer dropouts between games. Alchemist could perform well depending on the game, but many games ran very poorly. Battlemage has closed quite a few bottlenecks.
This is impressive from Intel.
However, it's an unfair competition between two different classes.
I thought the video would be between the Z1 Extreme and the Ultra 7 258V, as they both run at a max of 30W
That's why I added low power tests in the video as well. Also keep in mind that the Intel wattage also includes 2-3W for the RAM, which is soldered ON the chip. So if both run at 20W according to the OSD, the Intel actually runs at 17-18W.
@Hubwood The Intel has no problem. The AMD AI 370H doesn't start at a 15W TDP like the Intel U7 258V; it's a 28W TDP that can be configured up to a max of 54W.
Of course, that power is for the CPU cores, which doesn't matter in an iGPU test. Great Intel iGPU test, however, it comes out well ahead of the AMD Z1 Extreme scores
You can set the 370 HX to 15W via Universal x86 Tuning Utility. And handhelds will allow 15W as well.
Perhaps the Arc 140V will do better on the Arrow Lake-H CPUs
I think it's still good perf, it's the Vega 890M that's too powerful
My boy delivered it late, yet no complaints, 'cause you don't get free laptop samples, right?
Is it tuned specifically for benchmarks?
The gap seen in the TimeSpy and FireStrike scores can't be confirmed in actual games.
Don't know if it was tuned... But yeah seems to be the drivers I think....
The HX 370 is expensive, and for my use case a 780M mini PC or laptop is fine.
But since Meteor Lake scored above the 780M in benchmarks, I was considering going that way instead, and then I found this video.
(I searched for "core ultra vs 780m")
If the 140V is only at this level, the first-generation Core Ultra probably can't be expected to do much either.
Thank you for the helpful video.
Everyone gangsta until AFMF comes
For me the biggest issue is the lack of trust for driver support from Intel. Both in quality and longevity. They suck at it.
AMD are not even close to Nvidia, but they are still far better than Intel on this.
Also I would like to see old games and how they run on Intel's solution. From what I've heard about Intel support over the last few years, they suck at old games - DX9 and even DX10. As many of the games I play are old, this would make it an instant no-buy.
Other than that, I am very glad that Intel is taking this whole iGPU thing more seriously. It only took them decades. We need good Intel iGPU performance so that we can get more out of AMD.
Thanks for the video.
No, it depends. AMD gives better pure-performance GPU value compared to Nvidia. If you want to use other fun things like ray tracing, sure, Nvidia is better. For ML, CUDA also has better support than ROCm.
@@pham3383 Did you read what I said, the whole thing, or not? If you did, read it again. AMD driver support is far behind Nvidia's, especially in how long they support their GPUs with drivers. I've been using Nvidia's and AMD's/ATI's products since the 90s. What I said is yes, not no. If you don't agree, that's your problem.
Bullshit, Intel drivers are better than AMD drivers; AMD drivers are the worst of the worst
@@tiltdown I'm an Intel user; AMD drivers aren't the worst. And even if they were the worst, that doesn't mean they're bad
ARC driver maturity isn't at where AMD's is, but Intel hardly sucks at drivers, we've had regular updates, support for new games on release (usually), XeSS 1.3 works better than FSR and you can drop the SDK into any game that utilizes a previous version, RT performance is decent and AI features seem to work fine. There's still more features incoming, and let's not forget how long it took AMD to take features from "announced" to "useful." For the amount of ARC users out there it's kind of shocking how far they've come (but it was surprising in the first place they even got a product to market).
AFMF (AMD Fluid Motion Frames) - hold my beer
impressive that intel gpus are very power efficient
We see the Z1 Extreme holding strong vs. new chips like Lunar Lake: cheap, great driver support. It'd be a waste of money to upgrade for the later-coming handhelds ❤ thx AMD
AMD is at twice the watts. Lunar Lake is a God send for PC handhelds.
Watch the second half of the video 😅
Pretty disappointing Lunar Lake performance tbh. But there is still hope for more mature drivers, Meteor Lake needed some time too.
You should not. That power includes memory usage, and it is capped at a lower temperature.
Is the Intel laptop model capped at 75 degrees C?
Didn't think I was gonna say this about Intel, but they've cooked. Quite impressive results from an iGPU that eats half the power. Now we'll have to wait for prices not to be very fucking high, lol
Why is there no iGPU with more than 20W of power?
If I had to guess, I'd say that since you need to pack both GPU and CPU on one chip, there's a limit on the maximum wattage: you can't dissipate more than, let's say, 70W of heat from a single chip in a (thin) laptop. I guess 30W would be doable for the iGPU, leaving 30-40W for the CPU. Or use something like dynamic power shift. But for mobile I don't think we're gonna see much more than that :)
@@Hubwood We have Intel 13th/14th gen, which can eat 120W for short periods.
Intel is at 30W here, and they could just double both the iGPU and the CPU and still be within 60W reach
Comparing 60W to 30W just shows the huge superiority of the Intel solution.
20W vs 20W in the second half. Also take a look at my 890M vs 140V video, released today.
Somehow I always think the Intel number is the Arc GPU wattage only, while the HX 370 number is total CPU+GPU
Why 400MB of VRAM?
In 2-3 years most people won't need discrete graphics cards; even now most people don't need them anymore. Only kids want ultra settings everywhere at 8K resolution with 120 FPS, because playing 2K medium is "for losers" (C)
Thanks. At the same power, AMD doesn't stand a chance
Kinda pointless testing last gen vs current gen.
Should've tested 890M.
1. Not pointless. This is the current handheld king.
2. It's also okay to test a 4090 Vs a 6900Xt
3. 890m will be up next.
You clicked
@@Hubwood What I meant is that it is potentially unfair.
@@dksun ¿?
@@Splarkszter is it unfair to compare an RTX 2080 with an RTX 4080?
Not bad
Holy mother of Jesus, that AMD power draw 🙏🙏🙏 that laptop is seeking help
@@Deathdemon65 that's CPU and GPU combined 60W 😅
@@Hubwood But Lunar Lake's max TDP is 28W; Intel locked it there, so it's efficient.
@@Deathdemon65 Definitely, yes. I also asked them if we'd be allowed to overclock samples; they said that wouldn't have much of an effect on performance anyway. Originally they were designed for 8W, according to them!
So they're not even close to the 890M 😂
Well, actually, in some scenarios it's as fast as the 890M. But it can't keep up beyond 30W... The comparison will be out tomorrow morning.
Yeah, but way better than the previous Iris Xe
$1000USD vs $3000USD
So... the Ultra garbage 7 200 series against AMD from two years ago, and AMD is still better.
Why the dafuq doesn't AMD penetrate the notebook market harder? I'm tired of Intel BS / bad customer service and products as the main and sometimes only option in several famous notebooks
I'd like to see some more wattages from both sides. Oftentimes there are certain workloads and titles that don't really scale in performance with higher TDPs past a certain point, so it can seem pretty disingenuous when that happens. Like if either chip is getting 29 FPS at 30W but 33 at 60W due to poor scaling, it's not really going to be a realistic comparison, because nobody would run it at double the power for a few FPS.
I'd love to test more games at more wattages... but time... no time 🙈
Thanks for adding Fortnite; this channel is the only source that I can trust... Still, the sweet spot for both is 720p 🧐