Update 24th October - Intel has been able to replicate the graphical corruption as seen in Forbidden West (06:14) and says the issue has been fixed with the latest 6044 driver.
Intel's package power figure includes the RAM, so the SoC itself gets 2-4W less than the number suggests. Take that into account and add the RAM's power consumption on the AMD side, and Intel ends up with roughly 20-30% less power available to the SoC. That makes Intel's performance more impressive than it might look at first (rough numbers are sketched below).
That's a very good point!
Finally, someone pointed this out. When comparing Intel versus AMD, reviewers should use total system power draw to get more accurate results. The Intel processor will be very interesting in future handheld devices and would be an intriguing competitor in that market.
Finally, a knowledgeable comment on a benchmark channel. 99.999999% of other channels never realise this and think Intel's iGPU uses the same power as AMD's when testing at a 20W TDP or something.
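To put rough numbers on the point above, here's a minimal sketch in Python of how the comparison shifts once RAM power is accounted for. All of the wattage values are assumptions picked for illustration (an Intel package limit of ~28W with 2-4W of that going to the on-package LPDDR5X, an AMD SoC limit of ~33W with RAM powered separately), not measured figures from the video.

# Rough illustration of why Intel's package power figure understates the gap:
# Lunar Lake's package number includes the on-package LPDDR5X, while the
# Radeon figure does not include system RAM. All wattages are assumed
# example values, not measurements.

intel_package_w = 28.0   # assumed Intel package limit (RAM included)
intel_ram_w     = 3.0    # assumed on-package LPDDR5X draw (the 2-4W range)
amd_soc_w       = 33.0   # assumed AMD SoC limit (RAM powered separately)
amd_ram_w       = 3.0    # assumed external LPDDR5X draw

# Power actually available to the Intel SoC logic (CPU + GPU):
intel_soc_w = intel_package_w - intel_ram_w

deficit = 1 - intel_soc_w / amd_soc_w
print(f"SoC logic: Intel {intel_soc_w:.0f}W vs AMD {amd_soc_w:.0f}W "
      f"({deficit:.0%} less for Intel)")
print(f"SoC + RAM: Intel {intel_package_w:.0f}W vs AMD {amd_soc_w + amd_ram_w:.0f}W")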
Please report any spam you see, it's an ongoing thing at the moment! But also be sure to let us know your thoughts on the Arc 140V performance...
it's been a nightmare on YouTube over the last year
No, I like spam. It's soft and wet and it gets everywhere
Not perfect but the graphics duopoly market really needs a third player so rooting for them!
it'll be interesting to see what Battlemage brings to the desktop market!
Competition is always good. Drives prices down!
I want Intel to recover !!!
Competition is good 👌🏻
Not really
Great review - I'm surprised at how well the Arc 140V handled some games
Thanks!
@@KitGuruTech Only thing missing was the XeSS tests
awesome - very different article - don't often see something like this - that's why I love KitGuru - thank you!
great to see these kinds of things, not many channels do this now
Great work. Intel have a good record on bug-fixing so I have every confidence that they'll fix the issues you found. In time.
One thing I picked out at 1:11 is that the 140V supports full UHBR20. If you still have the laptop, are you able to test that? Fire up a suitably undemanding game at 4k and see if you get super-high fps.
Thanks Quentin, unfortunately I've just boxed it back up! But you're very unlikely to be getting over 240fps in any game at 4k
@@KitGuruTech 40fps would be good !
@@KitGuruTech I see my timing is perfect as usual. :) As for 4k @240 fps, I think you'd be surprised, especially if you go to older games like the original Far Cry. And what about esports titles?
I think this is more than enough of a GPU for a person on the move! Most games would be playable at 1080p with low IQ. Looks OK to me tbh
@@iamahappybunny7640 Well yes, but the point of my query is to test whether or not the GPU can actually do UHBR20.
LOVE the Dr Strange effects from the Intel driver! What a completely random screw up. Surely Intel's quality controllers should have spotted that one 😮
I was only looking at the left side screen and thought it was a special effect in the game!
Looked more interesting than the grey blue screen
At 6:25
As XeSS is hardware accelerated on Intel GPUs, I’d love to see a follow up of FSR vs XeSS on Intel hardware.
@@jamesjessenden7838 XeSS is Intel's own super sampling. Using FSR on both like Dominic did could be seen as a bit of bias towards AMD, I think, but I don't think many people know much about XeSS - it would be a good time to educate them
The crucial thing is XeSS is designed for Intel GPUs and will run better there thanks to using the XMX cores, whereas FSR is open source and runs on any compute hardware
@@KitGuruTech mhmm, and you're testing an Intel GPU, so test XeSS too so people can get an unbiased result.
As mentioned a couple times, this is primarily a performance analysis - I'm testing with FSR because that's what runs on the 890M and makes it an apples-to-apples comparison. I did use XeSS for the Shadow of the Tomb Raider tests, but this isn't an FSR vs XeSS video, I'm really just interested in the raw performance of each GPU
@@KitGuruTech Oops. I didn’t mean to start a whole thing here. FSR only testing is still a perfectly valid test as some games lack DLSS and XeSS support. It’s good to know how it runs.
The reviewer's conundrum: when you spend more time thinking about complaints about how you test than actually testing. I know it well...
If they fix those glitches with the drivers and some engines it looks very promising indeed !
Wish you would do more of the laptop reviews Dominic tbh - you put so much work into them, really much better than the original
Doctor Strange has indeed opened up a portal into the game, and he took both of these iGPUs and blew them up
😂😂
Surprised how well they both run Tomb Raider, that used to be hard work to power well
Crazy to think it's nearly 7 years old!
@@KitGuruTech It's really crazy!
Easily powerful enough for casual gaming now - runs quite well considering how crap iGPUs normally are
Nice looking laptop, going to check out the original review
Hope you enjoy it!
@@KitGuruTech Ya It's really nice!
F1 performance is surprisingly good - then again it's a very well optimised engine for most hardware
Good idea, thanks for doing this, bookmarked it for later after work!
impressed with the performance, even if there is still a lot of driver work to be done - it's always been Intel's issue - glitches
Thanks Dominic, I've asked a few times for this kind of test, appreciate the work! I am hoping Intel have narrowed the performance gap.
Welcome!
I like Intel, I know I shouldn't as they are evil, but I want competition going forward or we are all screwed
Your reviews are always amazing!
Radeon 890M is good, seems much better overall for stability and less artifacting, but Intel seem to be making up ground - their last releases were terrible
Been trying to find a comparison. Looks like Intel handhelds and Intel mini PCs for gaming will be a big deal soon. Better than I would have expected.
would have liked to see XeSS coverage on some level
Thanks, definitely something to look into for next time
An interesting point for that might be to compare at similar levels of image quality rather than resolution scales when possible. XeSS being able to put out a higher-quality image should mean you can use a more intense upscaling factor for a given level of quality, and get a bit more performance out of it that way.
@@KitGuruTech Welcome Dear!
Overall it's a big step forward for Intel, have to give them some credit for that - but it looks very patchy in some game titles
I get such a wave of excitement when I see a KitGuru video notification, did not envision adult life like this 😂
😂 glad we can assist!
why
Glad you are back Dominic missed you 🙏
More to come!
Overtakes - nice one Dom
😂
Intel's pricing is just insanely high for this level of performance. Also it's like half the MT performance of Strix Point
It's not supposed to compete with Strix in MT. That's what Arrow Lake is for. Lunar Lake is an office laptop and gaming handheld chip.
FSR is open source, surprised to see Intel do so badly, is XeSS any good on this iGPU ? any benefits at all?
XeSS will generally look better than FSR for sure
@@KitGuruTech Ya Nice!
Great job Dominic 👌🏻👌🏻👌🏻
thank you, your benchmark showed quite different performance than others I found on YouTube, but then again you had the Radeon iGPU limited to 33W instead of 50W, and I don't think the Arc GPU scales there
Surprised how well it does on Cyberpunk compared to AMD - they have always struggled with this game, even their high end discrete cards
AMD GPUs actually do reasonably well in Cyberpunk, at least when only looking at Rasterisation performance. RT is an entirely different story!
@@KitGuruTech Ya you are Right!
Odd to see the Intel laptop switching between better performance at similar power draw and similar performance at lower power draw depending on the settings selected, and sometimes (like in Tomb Raider) the part of the benchmark run.
Nicely done mr Dominic 💯
Thanks!
Any more laptop reviews coming up? Need a new one for uni but would be great to see some budget options for working and gaming 😅
We'll see what we can do!
Seems about 60-40 in favour of the AMD Radeon 890M - pretty much what I thought. Horizon Dawn is well borked mind you!!!
High preset in Forza Horizon was pretty good! Wouldn't think FSR would work too well on Intel tbh, not sure what you were expecting
FSR runs on any GPU - there's nothing about it that's specific to AMD or that makes it run better on AMD than on Intel or Nvidia
Thanks KitGuru I wanted to know this for my new laptop as I only game a little on the move and value the small size more than anything
Dominic showing Leo how it’s done 🤩
it's a very different piece of content - I'm only interested in gaming here, Leo did a review of the whole laptop
The Radeon's pictures look consistently sharper/clearer. It seems to invest more processing time into image quality.
I am more interested in this for the next MSi handheld, which comes with Lunar Lake & the Arc 140.
The first one, the Claw, was a dud because of the high wattages needed to match the Z1 in the ROG Ally, but this Arc 140 seems to have fixed that and even beats it in some games at 15-20w.
Absolutely, the implications for handhelds are really interesting!
Been looking forward to seeing this...
They had this WEEKS ago how come we're only seeing this now?
Dang, Battlemage looks really promising now, although it might not scale as well when it comes to high-end discrete GPUs.
Intel's power efficiency is top notch but they really need to sort out the driver issues :/
Intel has improved greatly with their integrated graphics. Might pick up the MSI Claw 8 AI now
It is most curious that the Arc 140V consistently uses more memory - differences in Colour Compression, Texture Compression Formats, or Z-Buffer optimisation? Less memory used = less dependence on memory bandwidth, although Lunar Lake does have a Memory-Side cache to reduce DRAM access cost...
Reminds me a lot of when they launched their discrete GPUs - it appears Intel has more hardware capability that's only held back by drivers. I never thought I'd see Intel match or beat AMD's iGPUs at integrated graphics.
It's a driver inefficiency or some other software issue for Spider-Man. Look at the power draw on the Intel side. It drops down to the mid teens and shoots up to the 20s at times. That reminds me a lot of how Alchemist dGPUs perform in some games. The GPU reports 100% utilization, but a very low and fluctuating power draw shows that it's not actually being fully used. Those issues were mostly fixed with driver updates, so this probably will be as well.
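If you want to sanity-check that theory yourself, one option is to log GPU utilisation and GPU power during the run (HWiNFO or a similar tool can export to CSV) and count samples where the driver reports the GPU as busy while power sits well below the limit. A minimal sketch, assuming a hypothetical log.csv with util_pct and power_w columns and an assumed 28W GPU power limit - the file, column names and threshold are illustrative, not from the video.

# Minimal sketch: flag "reports busy, but power-starved" samples in a GPU
# telemetry log. Assumes a hypothetical log.csv with util_pct and power_w
# columns (e.g. exported from HWiNFO); names and the 28W limit are assumptions.
import csv

POWER_LIMIT_W = 28.0   # assumed configured GPU power limit

suspect = 0
total = 0
with open("log.csv", newline="") as f:
    for row in csv.DictReader(f):
        total += 1
        util = float(row["util_pct"])
        power = float(row["power_w"])
        # Driver says ~100% busy, yet the GPU draws well under its limit:
        if util > 95 and power < 0.7 * POWER_LIMIT_W:
            suspect += 1

print(f"{suspect}/{total} samples show a 'busy' GPU at under 70% of the power limit")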
Looks like Intel have done well. They've managed to get a GPU with better performance per watt on mobile and it's running fairly recent games at decent fps. Obviously there are exceptions but it's not bad at all.
Why use FSR on Intel in the games where it can use XeSS XMX?
It's just weird that your results are SO different from mine...
Tomb Raider runs well! I know it's low IQ, but it does look quite good in the video - thought it would look like pants
It holds up very well considering its age
@@KitGuruTech Right!
I think it's worthwhile looking at frame times, and not just average FPS. Historically, Intel Arc can perform OK in average FPS, but the performance consistency can be quite poor.
just for the banter - don't get that kind of dialogue from a US YouTuber!
Intel being in front of AMD here is a milestone for Intel. I hope they keep pushing Arc.
I'm really waiting for MSI Claw 8 with Intel Lunar Lake handheld.
Thanks for this - very informative. I'm not a gamer myself, but am currently tossing up between AMD and Intel for my next Linux laptop. Battery life is the most important thing for me (so long as the laptop is "powerful enough" - i.e. it doesn't have to be bleeding-edge performance).
Very interesting. A lot of people would find this more than acceptable !
Definitely!
Knowing Intel, there will be driver updates to fix the performance issues, but in the meantime would the performance improve significantly with lower settings? Is there a workaround?
Well I already tested Miles Morales using the Very Low preset plus frame generation, and it was still a mile behind the Radeon 890M!
Not using Intel's XeSS is an odd choice. XeSS runs the full XMX path on the Arc 140V, leading to not only faster rendering but superior image quality. FSR looks terrible at low resolutions like those shown here. So it's a very real and important metric to consider when choosing an iGPU.
I hear you, but this content is more about raw performance than anything else. I already had all the data from the Radeon 890M with FSR so it made sense to go with that angle
I know you mentioned Intel allowing up to 8GB of RAM usage. Does AMD allow more?
Good question. Was the AMD product deliberately nobbled to make the match up fairer?
No, both were set to 8GB which was the maximum on both laptops
@@KitGuruTech thanks for clarifying. I saw some posts about AMD APUs allowing almost unlimited RAM to be used as VRAM as long as it's physically present in the system. Maybe it's next-gen stuff.
FSR over XeSS?
just for the purposes of this testing so it's apples-to-apples
Sorry, I'm being really thick here or have missed something in the video. The AMD 890M is the iGPU (Strix Point, I believe), but what was the AMD CPU it was running with? Was it the AMD Ryzen AI 9 HX 370? I'm just trying to understand the base specs being compared.
yeah that's right, the 890M in the Ryzen AI 9 HX 370, I mention that at 01:21
where are the avg fps figures?
Good idea. A lot of people use these !!! 🤘🏻
Definitely!
Go Intel !!
Lunar Lake is really great for small devices
So, my benchmark: does it cope with a modded Skyrim build? Not ENB, but ReShade with Community Shaders and subsurface scattering? That's basically the level I'd be looking at for something I'd be willing to shift to at lower power.
Great video, would have loved to see the 890M without power limits, as it clearly throttles below 2GHz in almost all the games used. Granted, it would be terribly inefficient in comparison, but yeah
thanks, we tested it with the power limits as unlocked as possible when using the ASUS Zenbook S 16, but it can run at higher power in other laptops
@@KitGuruTech yeah this setup is more realistic for actual laptops in the wild
@@JoseMariArceta get the Vivobook S16 Ryzen 370 variant since it has a much higher TDP
Horizon Forbidden West, that's some mess - but generally it's not too bad! Thanks for the article Dominic, really put some effort into that after the lackluster original review from Mat
Thanks - Mat didn't review this one by the way
Really found this interesting. AMD are clearly further ahead with the drivers.
How sensitive are today's GPUs to memory type/speed/amount?
Good question! Don't suppose Kitguru can answer it though.
It's very hard to say for these models and I've only tested the configurations I have, and when memory is integrated you obviously can't do any direct A/B testing
it's a laptop, you'd have to modify the BIOS to tweak or overclock the memory
F1 - wow, Intel dominated that one, surprised to see that. IQ looks a little better to my eye on AMD - but not sure if that's YouTube compression or not
IQ should be the same, they were run at the same settings
Intel look quite good overall, a few bugs they need to sort out, but you always get that depending on the game studio and the companies they work with primarily.
The clocks on the 890m look suspiciously low
TDP setting.
Do any of them come with 7000MT/s+ LPDDR5, or is it the 5200/5400 "standard" speed? We all know AMD's APUs love good memory timings and high clocks to get even more performance.
This Zenbook S 14 has 8533MT/s memory
Damn, I'm so late to this video. It's a shame this is just two iGPUs, with no reference to previous generations for comparison. I mean, even without direct montage or fancy graphs, if you are doing relatively standardised tests, then an average FPS across all tested games for the last 1-2 generations of Intel and AMD, even just as a mention, could have been useful.
In Spider-Man, I think it's a driver issue as well. You can see that Intel shows 1950MHz across all videos, so it's like there's some frequency lock, but obviously that's not how graphics (or most processors, for that matter) work - you can lock the peak frequency, but not the operating one. As we all know, frequency translates to wattage, and in Spider-Man it dropped something like 50% of the wattage, which obviously means it does not run at 1950MHz, no matter what the sensors say; they just list the baked-in frequency, not the real one. Which also explains the performance difference (a rough sanity check of the maths is sketched below).
I wonder what the transistor count is for these iGPUs. Theoretically AMD has more transistors, as it's running at higher wattage yet lower frequencies (at least compared to Intel's wrong baked-in value - after all, on paper the 890M boosts to 2900MHz, good luck with that). But if 1950MHz is Intel's boost value and it's actually running closer to the 1GHz that it's used to running at for its basically GCN/Vega-derived architecture (their Alder Lake graphics boosted to 1400MHz and the first-gen Xe Iris of Tiger Lake boosted to just 1100MHz, so both obviously had much lower operating frequencies), then the reverse could be the case, and Intel has just fed a ton of silicon to their chip.
Which brings me to Strix Halo. Now that, unless they aim it at some $2000+ Mac Pro competition league, would be an interesting thing to see. Looking forward to your review of that one.
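As a rough sanity check of the frequency/wattage argument above: dynamic power scales roughly with frequency times voltage squared, and under DVFS voltage tends to rise with frequency, so power falls much faster than clock speed. A minimal sketch in Python, assuming a crude "power proportional to frequency cubed" model - purely illustrative, it ignores static power, voltage floors and the real V/F curves of either chip.

# Crude DVFS estimate: if dynamic power ~ f * V^2 and V scales roughly
# linearly with f, then P ~ f^3. Given a reported power drop, estimate the
# real effective clock. Purely illustrative; ignores static power and
# voltage floors, so treat the output as a ballpark only.

REPORTED_CLOCK_MHZ = 1950      # the sensor value discussed above
power_drop = 0.5               # "dropped like 50% of the wattage"

effective_clock_mhz = REPORTED_CLOCK_MHZ * (1 - power_drop) ** (1 / 3)
print(f"~{effective_clock_mhz:.0f} MHz effective if power really fell "
      f"{power_drop:.0%} (vs the reported {REPORTED_CLOCK_MHZ} MHz)")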
F1 runs well - well done Intel, shame about Horizon Dawn - that was a disaster
It's got to be a driver issue
But can it run Crysis???
I know FSR sucks in Warhammer 40,000: Space Marine 2 on my Intel Arc A770 - I have to use TAA balanced to get all the weird texture stuff to go away. Maybe try more games that have XeSS, as I'm sure the hardware is more tuned for XeSS than FSR. All I got from the review is how well it runs using FSR, versus hardware that FSR was actually made for - just being honest.
AMD has clearly got the edge
Nah, it's just a matter of drivers. The Intel chip is a lower-wattage part, so in small handhelds with poor cooling the Intel GPU would actually win. That's kind of mind-boggling after the first Arc cards/drivers.
Looks like they finally fixed Arc for Starfield, I see...
yeah noticed that myself, about time
now they just need to fix the game, so disappointing for me
@@boltsandbraces6261 not gonna happen
Who cares about Starfield at this point?
Intel GPU future looks promising
The AMD GPU seems to clock pretty low in some of those games, much lower than the Intel part. CPU bottleneck or something else?
TDP setting.
@@Patrick73787 I mean 1080p, the mobile GPU should clock much higher than this. It's probably TGP more than TDP
it's not bad I guess, for casual gaming - I just don't know that lowering the IQ is really that enjoyable for me. But I suppose for a businessman in a hotel room, it's something at least!
yeah it's definitely for the crowd who are happy with gaming on the go and don't want a 4kg gaming laptop!
Don't businessmen in hotel rooms just look at pawn, or am I the only one?
Intel have had like 3 years to work on the Arc drivers now and we're *still* getting corruption issues?
To be fair to Intel, they aren't the only ones who have issues - all GPU manufacturers have driver issues on architectures that are way more mature
This is why I don't trust new Intel GPUs yet, they're too inconsistent... Eventually it will be ironed out, but it would be nice if it was smooth from the start 😂
Lunar Lake does things differently so not directly comparable to Meteor Lake, while the 890M is basically the same as previous gen, just with more CUs.
It's a game released on Windows in March 2024 and these Lunar Lake chips launched in September 2024. Five weeks after launch there are improvements.
Some interesting results, AMD looks to win most of them, if not all. Much more reliable anyway, by a long shot
good job Dominic - it's actually better than I thought, even at low settings. Horizon Dawn looks like turd however
thanks! yeah Forbidden West has some serious issues right now...
Certainly don't want to be trying Horizon Dawn on my new Intel-powered laptop then 😛
Zero Dawn might be fine, Forbidden West is another story!
AMD is clearly some way ahead with the hardware and drivers ❤
Once again Intel's drivers are making Arc look worse than it should. I really hope they can make some changes in hardware to get Battlemage running games with minimal problems
Intel is competing better than I thought. AMD seems more reliable and less buggy. All in the drivers, baby.
No Red Dead Redemption 2? What is this game selection?
Can it handle other heavy games - the latest Witcher, Genshin Impact, or the latest Chinese game King Kom? Are these playable at full HD settings?
Witcher 3 is playable at high settings at 1080p with XeSS Performance on my A370M, which is similar in performance to the 140V. Genshin runs as well, but I don't remember the settings I used. Medium-ish, I think.
What's the RAM latency of Lunar Lake? No video has that information... Is Intel holding your balls?
Lunar Lake - I guess because the prices are on the Moon.
Unfortunately, laptops with the 890M are the same.
I'll stick with NVIDIA. I use an RTX 4080 with Cyberpunk and it even struggles with that at times. I'm pretty sure it's down to lack of optimisation, but I don't think anybody's going to be gaming on an Intel GPU - well, no serious gamer anyway.
It will be fascinating to see what they do with MediaTek next year
I have been emulating some Windows games on my phone and you would be amazed how well they perform on the Samsung Galaxy Tab S9 Ultra. I genuinely think Intel has some real competition. PC hardware has become much too expensive
Can't beat Nvidia, but their gaming laptops get a bit hot and do use loads of battery. And they're pretty chunky to cope with all the cooling and power usage.
Morning x
Evening