Extreme impatience is the reality of human beings. It's nothing to be ashamed of. Humans don't work like our computers at home, which can pause an action, perform another, and then resume the previous one. Humans don't operate like this. If I have an idea and don't write it down right then and there, it may be gone forever. One phone notification affects our focus for the next 25 minutes, according to studies. While the laptop wakes up, our brain doesn't go into pause mode to wait and then resume. It continues to function, and at a higher level, due to the excitement of the idea's potential or the urgency of a situation that requires us to perform against the clock. It would be great if we could do those things, but people who are very patient aren't usually people who are very bright either. Ideally, being both would be perfect. The person considered the most patient is only patient compared to other humans, who are all biologically extremely impatient. Very little friction is needed (a lot less than most people think) to completely change their behaviour. Don't think a laptop weighing 1.6 kg vs 1.4 kg matters? It matters massively. The difference is that one will cause the majority (85%+) of people to bring their laptop everywhere with them, vs leaving it at home 9 times out of 10. The interesting part is that most of these people won't even recognise this trend and its cause. We don't function how we want to function. We function how we function. Also on this topic, looks are not superficial either. They would be if we were thinking from a theoretical perspective that ignores the reality of human biology. For us, appearance is as much function as traditional function is. Do note that waiting a few extra seconds for the system to boot isn't just waiting. It's experiencing that same set of events every time, knowing that it's unnecessary, since alternative solutions (macOS sleep, for example) work perfectly fine.
Having said that, I completely understand your point. Your point is true and valid, but for you. It applies perfectly to you, and you're correct. But everyone is different and needs different things in different situations too. I don't use a phone case or screen protector because I've never dropped my phone (10+ years). I don't know how to drop it. My sister drops her phone almost daily and definitely needs a case. Some people can sit on their phones and scroll through a bunch of boring content in uni lectures. I can't do that, nor sit there at that slow pace, so I would just leave. I like rollercoaster rides, but I hate going because of the queue. Most people don't like the queue, but I really don't. On the other hand, I can sit there and watch paint dry, or watch a water filter slooowlllyy drip through to fill a jug. I love watching loading progress bars and percentages during long file transfers and am just drawn to staring at them the entire time (sometimes half an hour). I can watch small sections of certain videos (boxing moves, an F1 car taking a corner, 5 seconds of a music video) over and over again for hours, and it feels as if it's only been 30 seconds. I can do all of these things, but being told to press the brake pedal before releasing the e-brake annoys the heck out of me. I know why it's there. It's there for a good reason. For other people. But not for me. So it's still very annoying. I can't see myself ever making that mistake; I can definitely see other people doing so. People are very different, and they change throughout their lives too. On a constructive note: one way to make the hibernate feature work much better would be for the loading time to be displayed as a bold number on a black screen, counting from 1 to 10 as the seconds go by. The loading should be done within 6-7 seconds on that laptop even in the worst case, but it should always count to 10, even if it finishes earlier.
And then show the lock screen after those 10 seconds. That would completely change people's behaviour around the wait time. It's not the waiting; it's the inconsistency, and the distraction the inconsistency causes. Some are more prone to it than others, and neither is better than the other. It has to finish at exactly 10 every time, and count the seconds, not the progress. This would result in significantly greater user satisfaction than a load time of sometimes 5 seconds, sometimes 7, or a progress bar that moves at different rates and then just jumps from 60 to 100. This is why they put timers on long traffic lights. It doesn't make the time go any faster, but it affects human behaviour massively. You can time it, and so you can switch off, as opposed to always being on and ready for the next 2 minutes.
It's not always about impatience. If you're a travelling freelancer, or corporate staff going from meeting to meeting all day, you cannot afford to "wait" for an OS to load. You also cannot "wait" for your computer to shut down at the end of a boardroom meeting. The MacBook's sleep function has always had this advantage over Windows.
No, they are not. 10%+ battery drain in standby mode is a joke, and if they don't come up with a solution in the coming years, x86 laptops will be in trouble 😊😊
Windows laptop prices have been rising, and now we have ultrabooks like this at £1500 in the UK for 16GB/1TB. There are currently no options for a higher tier, but they are expected. Fortunately, they drop in price if you are patient. The same spec X Elite is now £999 with a 15-inch display. If I were buying now and as a non-gamer, the X Elite would be a compelling alternative, saving £500, though Google Drive and my VPN still do not have an ARM version (using temporary workarounds). As usual, in the US, the prices are lower but not inclusive of local taxes. Here, the common sense solution is to wait for the launch prices to fall. Another example in the US is the base M3 MB Pro 14, which was on sale for $1099. This is crazy value for ultrabook users who do not need more than 8GB of RAM.
Nice job, bro. Looking good, tbh. I was worried, but Intel did make some real progress here with LNL. Now we need better drivers and iGPU updates to fully take on AMD APUs. This is good competition. I wonder if any console maker would consider blue again from now on.
This is super interesting! A couple of questions for you folks hanging out in the comments: 1. Is there a chance we get an equivalent to this, using the same architecture but with an Nvidia GPU in addition? 2. What do you think Bazzite support for this thing will look like?
Just bought the 16 GB model since it was on sale for $250 off. Basically handed my G14 down to my wife since she is a web developer. The feel of this thing is amazing. I was thinking about the 32 GB model but couldn't say no to the sale on the 16 GB.
Arkham Knight is not a driver issue; the game purposely refuses to launch on Intel-branded GPUs. It's never going to be fixed unless Rocksteady updates the game. Intel provided a workaround, but I don't think it was incorporated into a driver package (it's a modified game DLL). You could try swapping that DLL manually.
The early reviews of Lunar Lake are certainly showing a lot of promise. It will be interesting to see the higher end Intel Arrow Lake vs AMD Strix Halo.
Arrow Lake isn't going to do very well vs Strix Halo. Arrow Lake-H is only 6P + 8E cores, 14 threads in total, so it can only compete with the 12-core, 24-thread Strix Point, and the Ryzen AI 9 HX 370 should win at the same wattage. As for the GPU, Arrow Lake-H will have a weaker GPU than Lunar Lake, so it's no contest there. Intel will need the 8P + 16E Arrow Lake-HX to compete with the 16 cores and 32 threads of Strix Halo, but that's an inefficient desktop design scaled back from 250 W, while Strix Halo is optimized for efficiency. Arrow Lake-HX will likely require 150 W+ to compete with Strix Halo, which won't go over 120 W including its massive GPU, provided it can compete at all given Halo's 256-bit memory bus advantage. Then there's Arrow Lake-HX's GPU being even weaker than Arrow Lake-H's, so up against Strix Halo it will get demolished and require a dGPU. The only thing Arrow Lake-HX can compete with is Fire Range, which is basically desktop Zen 5 at laptop TDP, and Fire Range will likely win in performance per watt, since Zen 5 is very efficient when TDP-limited. Both will have abysmal battery life, since as desktop derivatives they don't have good idle power consumption. The only things Arrow Lake-HX will win are ST performance and the iGPU, since Fire Range will still have the terrible 2-CU RDNA 2, but again, these chips will come with a dGPU and the iGPU will only be used for watching videos.
@Phawx: Great video, thanks! Can you say something about the glossy finish of the OLED display? I have an older XPS 13 and really like its anti-reflection coating. ASUS, however, does not state that those displays have a particular coating. How annoying would you rate the glossiness of those displays?
Intel is smart waiting on the Claw. This gives Intel time to work out driver kinks before the Claw's release. Super excited. I was thinking about getting this, but I might wait until the next generation, with DDR6 RAM (faster bandwidth for the GPU) and Thunderbolt 5. I think those are the only two things Intel is missing with this chip.
Your testing goes into great detail, but I find it kind of difficult to read the game benchmark graphs! Could you please make the bars less bright/saturated and instead give every processor or device a separate colour, so that they're easier to differentiate? And also maybe show FPS per watt?
Great points! I'll look into how I can differentiate colors when comparing AMD vs Intel going forward. Regarding FPS/watt: I've been working behind the scenes on a comprehensive way of actually testing these against a unit wholesale, meaning including the dynamics of battery characteristics as well (voltage / number of cells in series). But it's still taking time.
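As an illustration of how such a metric could fold battery characteristics in, here is a minimal sketch. All names and numbers below are hypothetical placeholders for illustration, not the methodology from the video:

```python
# Minimal FPS-per-watt sketch. Pack energy is modeled from per-cell voltage and
# series count, to show how battery characteristics could fold into an
# efficiency comparison alongside raw FPS/W.

from dataclasses import dataclass

@dataclass
class GamingRun:
    fps: float            # average frames per second in the benchmark
    package_w: float      # measured package power during the run
    cell_voltage: float   # nominal per-cell voltage (e.g. 3.85 V for Li-ion)
    series_cells: int     # number of cells in series (the "#ofseries")
    capacity_ah: float    # pack capacity in amp-hours

    @property
    def fps_per_watt(self) -> float:
        return self.fps / self.package_w

    @property
    def battery_wh(self) -> float:
        # Nominal pack energy: cell voltage x series count x capacity
        return self.cell_voltage * self.series_cells * self.capacity_ah

# Purely illustrative numbers
run = GamingRun(fps=60, package_w=17, cell_voltage=3.85, series_cells=4, capacity_ah=4.5)
print(f"{run.fps_per_watt:.2f} FPS/W, pack \u2248 {run.battery_wh:.1f} Wh")
```

From there, something like estimated frames-per-charge (FPS/W times pack Wh) could rank devices by gaming endurance rather than raw speed.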
How do you compare the wattages in the gaming benchmarks? Since memory is now on-package, isn't it included in the wattage readouts, while for Meteor Lake it is not? Or did you cap it in another way? It's not a massive difference, but it might mean LNL 28 W == MTL 25 W, since MTL uses a couple of watts for memory (and thus LNL is even better than shown).
I dunno... I expected more from LNL, especially GPU part. It's good that it is comparable and even leading against AMD, but still I'm worried and I'm wondering how Xe2 dGPU will stack up against RDNA4.
Remember, Intel is years behind in driver optimisation, which means they will gain more performance over time, just like the A770 and A750 went to nearly double their overall day-one performance several months later.
This is the best case for Intel: a 3nm vs 4nm node advantage, minimal IO that reduces die size and power consumption on top of the smaller node, 15% faster RAM, and AMD clearly prioritizing CPU performance over GPU in the HX 370. After all that, Intel can at best only match Strix Point in GPU performance above 15 W. With the Z2 Extreme being more GPU-focused and AMD spending more time optimizing it this time around, Intel will only fall further behind. As for dGPUs, RDNA 4 seems very impressive for a midrange GPU from what's been leaked so far: 4080 performance at 240-260 W on the slightly improved 4nm node is great. Battlemage, on the other hand, doesn't look so good, with Intel only reaching their 4070 Ti target in synthetic compute benchmarks and gaming landing around a 4070. If Intel doesn't manage to find some leftover performance, it looks like their best GPU is going to lose to the 7800 XT in raster, and the RDNA 3 architecture that card is based on will soon be two years old.
@@HP4L16 Bro, don't let the tests here fool you. Multiple users have run tests on Intel and found that at 1200p Intel beats Strix by at least 20-30% in reasonably optimised games, including Cyberpunk. It's just that Phawx used 720p, which reduces the load on the GPU; Intel had more juice not being used. As for the dGPU market, I don't expect Intel's next gen to battle anywhere near AMD's or Nvidia's levels. I think it will stick around 5060 level, hoping to gain market share there. One key difference this time is that Battlemage will gain optimisations much faster than Alchemist, due to the new design and how quickly they can optimise games.
I'm imagining a future Lunar Lake based laptop with a dedicated GPU actually providing great temperatures and lower fan noise. Not sure we'd see this type of integration, but it would be nice. If anyone has seen any announcements for Lunar Lake + dGPU, please let me know, as I'd love to keep an eye on it.
There is a "disconnect" between the Alchemist and Battlemage drivers, perhaps due to the drastic architecture change from Alchemist. For example, for older games they have done away with the DX9-to-DX12 emulated-type solution built with Microsoft, and a lot more, which makes things run closer to the metal instead of through intermediaries. We have to wait and see how things turn out at the driver and dev-optimization level. If it turns out okay, the desktop cards may have a smooth landing, with the kinks ironed out via Lunar Lake, and might give tough competition to RDNA 4, given how good Intel has been at RT from the start. If Lunar Lake fixes itself in the driver department, MSI's bet on Intel might make them really rich, given that XeSS gives AI upscaling on the Intel handhelds only, and FSR 4 (AI FSR) can't run on AMD handhelds, since it may need dedicated RT or machine-learning hardware. As we know, RDNA 3.5 is what ships until Strix Halo, so no machine-learning upscaling is available there (the exception being if FSR 4 runs like Intel's DP4a path on pre-RDNA 4 handhelds).
Doesn't look that interesting to me; I thought it would be much stronger than Strix Point, but it isn't. Strix Point is a full 12-core or 10-core part with all cores able to do multithreading and AVX-512, which Lunar Lake doesn't even have. Same driver problems they had with the previous generation, Meteor Lake. Intel should build full cores and work on their drivers, not cut down every new core and call it efficient. I don't want to pay for repackaged CPUs like 13th/14th gen. Don't know why they are ignoring this. Good video as usual from you, but nothing has changed for me with this. Either I want gaming, so Strix Halo (or Strix Point, with or without a dGPU), or I want fully functional cores, which Intel can't do.
See if you can turn off Connected Standby. I'm not at my computer to give you the command, but that's what's wrong with the standby power consumption, and Qualcomm likely disables WiFi in standby. Also, standby is pretty much obsolete with SSDs; coming back from hibernate is almost as fast. And no, we don't have to worry about write wear on the SSD, though I suppose most won't believe that.
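For reference, these are the commonly cited Windows commands for this. `powercfg /a` is a standard utility; the registry override is a community workaround (not an officially supported switch) that has reportedly stopped working on some newer Windows 11 builds, so verify it applies to your system before using it:

```bat
:: List supported sleep states; "Standby (S0 Low Power Idle)" means Modern/Connected Standby
powercfg /a

:: Community workaround to fall back to classic S3 sleep, where firmware allows it
:: (run as Administrator, then reboot; delete the value to revert)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Power" /v PlatformAoAcOverride /t REG_DWORD /d 0
```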
Now to find a way to get LNL to drop to 7500 MT/s memory, so the iGPU comparison is more apples to apples. Also, I think in some of the games the better single-core perf of LNL compared to AMD makes a difference. Anyway, what happened to the RDNA 3 vs RDNA 3.5 deep dive? Have I missed it? I'm glad Intel is competitive on the iGPU front now; it will push AMD a bit...
If you look at Hardware Nexus' results, Lunar Lake absolutely destroys AMD's chips in gaming, and I believe the difference is that he used a more realistic 1200p resolution, which let the GPU show its power.
Hi! How bad would you say the screen's reflections are? I had a MacBook Air M1 and that was acceptable. Would you, or anyone, recommend an anti-glare screen protector?
9742 multi-core in Cinebench R23 is... embarrassing if it's replacing the old H series (which I believe it's not), and only decent if it's replacing the U series. Dropping hyperthreading and cores will end Intel's whole career if this is how all the chips perform. I'm an H-series laptop user, btw; I want high performance, but only up to a level that can be cooled with low fan noise. Truth is, I'm on the brink of going all out with an HX and a GPU on steroids, but I'm hoping processors and GPUs get more efficient before that happens.
The Metro Exodus HX 370 vs LNL result must be an outlier, or there was some sort of bug; the other tested games showed the two chips neck and neck at 17-20 W.
You have to question this. What's the point of having such a nice display on a gaming machine if you can only run games comfortably at 720p or 1080p? At that point, you're simply paying money for the display that doesn't really enhance your gaming experience and chews through more battery than it needs to regularly.
There are a few ways to look at it, but modern games are just too much for these chipsets. Honestly, memory bandwidth is our biggest bottleneck. But you can still play a lot of games.
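To put that bottleneck in numbers, here is a quick sketch of theoretical peak memory bandwidth. The 8533 MT/s / 128-bit figures for Lunar Lake and 7500 MT/s for many Strix Point laptops are commonly reported specs, assumed here rather than taken from the video; effective bandwidth in games is lower:

```python
# Theoretical peak memory bandwidth = transfer rate x bus width in bytes.
def peak_bandwidth_gbps(transfers_mt_s: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s for a given transfer rate (MT/s) and bus width (bits)."""
    return transfers_mt_s * (bus_bits // 8) / 1000.0

lnl = peak_bandwidth_gbps(8533, 128)    # Lunar Lake LPDDR5X-8533, 128-bit bus
strix = peak_bandwidth_gbps(7500, 128)  # LPDDR5X-7500, common in Strix Point laptops
print(f"LNL: {lnl:.1f} GB/s, Strix Point: {strix:.1f} GB/s")  # ~136.5 vs ~120.0
```

Both figures are a fraction of what even a low-end dGPU gets from GDDR6, which is why resolution scaling hits these iGPUs so hard.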
@@kirby21-xz4rxNo, this is not an x86 problem. This is caused by 3rd-party drivers or bad firmware. In this case, it must be 3rd-party drivers from peripherals or utilities that Phawx installed, since other reviews of this laptop have like 1% drain per night. Snapdragon X laptops can also drain 10% or more per night due to bad peripherals. Many such reports are around, but it’s less frequent due to no legacy driver compatibility.
I really wish we could exorcize Apple design from tech. These notebooks choose the wrong priority between form and function. How much better would cooling, peripherals, durability, serviceability, uptime and IO be if those damn things were a mere centimeter thiccer? There is a reason why IBM ThinkPads are legendary.
Fair, OEMs shouldn't try to appeal to Macbook users this much. Granted, the beauty of x86 versus Apple is you don't have to choose stupidly thin/underequipped laptops if don't want to. We've enjoyed traditionally 'chunky' laptops with great IO, cooling and serviceability AND great battery life and not bad portability through monolithic Ryzen (hopefully Arrow Lake-H delivers too). Lunar Lake kicks ass for what it wants to be though.
Is it manufacturer's fault, or just mainstream buyers want that trade-off? I remember how Jobs was always against phones bigger than 4" because of thumb's range, but market showed they don't care about Apple's beliefs, so Apple had to give in and copy Asian android phones... It goes both ways. Mainstream customers decide.
@@kirby21-xz4rx What a load of BS. Just because you don't like it, don't assume others are the same. I still use an almost 8-year-old laptop, a 17.3" ASUS RoG G752, which is big AND heavy. It weighs about 4 kg, and the charger is big and heavy too. I've never had any problems with that. I always prefer having ports and good cooling over a thin slab that forces me to use dongles (which I hate). Now, I realize I'm on the other end of the spectrum in terms of how much tolerance I have for big and heavy laptops. But to say that basically NOBODY would like a slightly bigger and heavier laptop, and that you're concerned (about what? Having an awesome tool?), that's just another level. I'm more concerned about your warped senses in this case.
The top Battlemage GPU better be faster than A770. Otherwise it will be an absolute fail of a launch. Unless, I don't know, they sell them at $150 and they draw 100W. That wouldn't be a fail. But even in this case, people will say that Intel ARC will go nowhere if that's the best they could do in 2 years
CyberBug2077 and Exodus are two games which extremely favor Nvidia, and they have a history of poor performance on almost every AMD GPU. Just in case you guys didn't notice.
Something is wrong with its optimization. More laptops must be tested. My Samsung Galaxy Book4 Pro only drops 2% after 8 hours of sleep and never hibernates. It has Meteor Lake.
In order to increase the frequency (and the transfer rate) you have to lower the voltages, but doing that hurts signal integrity, so you have to put the memory closer to make all of that work. That's why Intel is packaging the RAM with the CPU. Intel also has a better IMC, so they can run more stable RAM at higher frequencies.
Maybe interesting for the power levels and for comparing their performance: Lunar Lake includes the memory wattage, so with quick napkin math, 17 W on Lunar Lake is roughly 15 W anywhere else. This makes the efficiency even more impressive.
@@darlokt51 correct
I wanted to mention this as well. Makes Lunar Lake even more interesting.
Now it would be quite a comparison to test games with XeSS support and see how they compare, and even how differently they look.
It should be noted that the power usage of the memory is included in the package power reading of Lunar Lake, and accounts for 2 W according to Intel. So, compared to previous gen and AMD, the power limits you used are more like 10 W, 15 W and 26 W. Still, great video as always!
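As a worked example of that adjustment (a sketch; the 2 W memory figure is the Intel number quoted above and should be treated as approximate):

```python
# Lunar Lake's package-power reading includes the on-package LPDDR5X, while
# older Intel parts and AMD power memory separately. To compare like-for-like,
# subtract the quoted ~2 W memory draw from LNL's configured limit.

MEMORY_POWER_W = 2.0  # figure attributed to Intel in the comment above

def equivalent_limit(lunar_lake_limit_w: float, mem_w: float = MEMORY_POWER_W) -> float:
    """SoC-only power budget comparable to chips that exclude DRAM power."""
    return lunar_lake_limit_w - mem_w

for limit in (12, 17, 28):
    print(f"LNL {limit} W \u2248 {equivalent_limit(limit):.0f} W on previous-gen/AMD")
# -> 10 W, 15 W and 26 W, matching the numbers in the comment
```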
Thanks, didn't know that!
Really appreciate you pointing out that you don't need the 288v because of the thermal limits.
Looking very good for the Claw 8... they just need to nail down those driver issues. That low-power single-core performance will make it a beast of a handheld for all things emulation.
8 Threads and no AVX512 and you are talking about emulation? LOL
@@rapamune If you don't know what you're talking about, then don't talk about it. Go research how things work.
Don't need those for anything below PS3 @@rapamune
@@rapamune AVX-512 is used in specific workloads. Also, HT (aka SMT) produces a bit more heat than having the cores simply run like normal.
Can you test XeSS on Lunar Lake? I think this is the first time we have hardware-accelerated AI upscaling on a low-wattage chip (outside of Copilot AutoSR). FSR is just not great, and while XeSS can work on AMD, it would be interesting to see how they compare, since they use different code paths, so performance and image quality should be better on Intel.
@@bigworm150 Much better image quality with the new XMX.
Something is wrong with your settings/drivers when other reviewers have seen less than 1% battery drop during 12 hours of standby.
The Tech Chap reported 0% lost overnight.
That single-core Lunar Lake win has not been covered nearly enough. Both top chips from Intel and AMD have 5.1 GHz boost clocks, and if Intel is using less power to reach the same clock AND getting a higher score in Cinebench, that means Intel's Lion Cove IPC is better than Zen 5's IPC. Hands down, no questions asked.
120 vs 110 in single-core at what should be the same clock (AMD's Strix tops out at 5.1 GHz; Intel's Ultra 9 tops out at 5.1 GHz as well, though what was tested here was the Ultra 7, which should top out at 5.0 GHz) indicates Intel wins on IPC by about 10%, theoretically.
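The back-of-the-envelope math above can be made explicit. A minimal sketch, using the scores and clocks quoted in the comment (a rough estimate, not a controlled IPC measurement):

```python
# Rough IPC comparison from single-core scores at (nominally) equal clocks.
# Scores and boost clocks are those quoted in the comment; real IPC depends on
# the workload and on whether both chips actually sustain their rated boost.

def ipc_ratio(score_a: float, clock_a_ghz: float, score_b: float, clock_b_ghz: float) -> float:
    """Return the per-clock performance ratio of chip A over chip B."""
    return (score_a / clock_a_ghz) / (score_b / clock_b_ghz)

# Lion Cove (score ~120 at 5.1 GHz) vs Zen 5 (score ~110 at 5.1 GHz)
ratio = ipc_ratio(120, 5.1, 110, 5.1)
print(f"Estimated IPC advantage: {(ratio - 1) * 100:.1f}%")  # ~9.1%
```

If the tested Ultra 7 actually tops out at 5.0 GHz rather than 5.1, as the comment notes, plugging that in widens the estimated gap to roughly 11%.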
What a kick-ass review! You got me hyped for Intel like I haven't been, well, for too long really. Thanks Phawx, amazing video!
Edit: Btw, have you been drinking from the fountain of youth, my guy? You've been glowing more and more every time a new video comes out. Cheers!
Terrible standby, but this is not Intel's fault; it's a Windows issue, as Linus has painstakingly pointed out and attempted to get Microsoft to fix. Sad to see it will likely just never happen.
Been waiting for this since you announced it on Patreon.
If only they had integrated Thunderbolt 5 on Lunar Lake. Intel Arc Xe2 Battlemage iGPU when on the go, Thunderbolt 5 Xe2 Battlemage eGPU when docked.
Ya, I'm going to wait until 300 series
@@fullerfit93 Unfortunately, it seems Thunderbolt 5 won't be integrated even on Panther Lake. If that's the case, Thunderbolt 5 will lose its advantage over USB4 v2.
@@EnochGitongaKimathi It's not worth it. It currently takes a BUNCH of power; that's why it's only ever seen on Razer's 18-inch laptops.
@@PixelatedWolf2077 I have tried to research and I can't find any credible source of information that Thunderbolt 5 is more challenging to integrate. Where did you get this information that Thunderbolt 5 is difficult to integrate on the SOC because of power issues?
@@EnochGitongaKimathi Think about it. Why would Razer be one of the only brands to do it? On top of that, if it were easy to implement, wouldn't it be in all their laptops?
I believe it can push at least 100 watts, but unlike TB4/USB4 it can go a bit past that limit, and that needs good motherboard work.
It was probably too energy-expensive for a super thin-and-light laptop like what Lunar Lake is aimed at.
Great review, really excited to see Intel hitting home on most points. Still need huge driver improvements, but they're making strides on that side as well!
Awesome stuff. I'm glad Intel is making progress on this; I would love to see this succeed in their Claw 2! For Batman Arkham City and Knight, I accidentally found an alternative fix. This has worked for me on my MSI Claw, so it's worth a shot to see if it works for you. I installed Darksiders because I randomly wanted to try it, and during its install process it installed some older DirectX components and Windows prerequisites. Then, when I randomly tried the Batman games, they suddenly worked! So if you have Darksiders, try installing that and then try Batman again!
Please do Steam Deck OLED vs 258V vs HX 370 at 12-15 W.
Curious to see how far we have actually come efficiency-wise.
That will surely be done when MSI's Claw 8 launches.
5:50 You're the only reviewer that has the guts to truthfully speak about the standby time. Kudos!
💯💯
Yeah, I'm happy he included that in the review too. It's something important that not a lot of reviewers talk about. At least it has been improved, but it still needs some work.
Spike don't care
10% in 12 hours is a joke. Either Microsoft, AMD and Intel do something about this in the next few years, or x86 laptops are doomed.
ONG
Intel has surely delivered with Lunar Lake and strengthen their position in the PC mobile market segment. The only constraint where Lunar Lake needs to do catch with AMD is driver optimization in video games where Arc GPUs continue to loose the race. It's not easy to come up with the new GPU design which is optimized for most of the video games. AMD and Nvidia know that pretty well and in order to achieve that in a quick run a huge driver optimization overhaul is needed. However even without that, Lunar Lake is a great step forward for Intel and their future gens of mobile and as well as desktop CPU SoCs which surely will get inspired by it.
Cool but kind of long review where I am missing one important part: comparisons... Still, thanks for making it, and keep on improving your vids👍
Thank you very much for the hard work ❤🎉
This feels like a new marketing angle. Instead of the usual concentration on performance, we suddenly see a switch to what comes across as a form of underselling, focusing on areas generally considered to be of slightly less importance.
I find the whole talk about the importance of standby more a symptom of extreme impatience, needing instant-on when lifting the lid rather than waiting the few extra seconds for the OS to load from a state where there can be zero loss of battery.
And this is ASUS, a company that hasn't exactly given the public reason to stop being concerned about their whole RMA procedures and attitudes, whilst still pumping out tech to the usual batch of reviewers, who increasingly look as though they have board seats waiting for them in the future.
Too many Spidey senses on alert here.
I can see where some of the marketing could be of genuine practical use but when you are coming from a place of healthy suspicion in the first place, increasing it does little to inspire a purchase, at least not for me.
Man, still bringing up the old RMA shit, really? Give ASUS a chance.
Extreme impatience is the reality of human beings; it's nothing to be ashamed of. Humans don't work like our computers at home, which can just pause an action, perform another, and then resume the previous one. If I have an idea and don't write it down right there and then, it may be gone forever. One phone notification affects our focus for the next 25 minutes, according to studies. While the laptop wakes up, our brain doesn't go into pause mode to wait and then resume; it continues to function, and at a higher level, due to the excitement of the idea's potential or the urgency of a situation which requires us to perform against the clock.

It would be great if we could do those things, but people who are very patient aren't usually people who are very bright either; ideally, being both would be perfect. The person considered the most patient is only patient compared to other humans, who are all biologically extremely impatient. Very little friction is needed (a lot less than most people think) to completely change their behaviour. Don't think a laptop weighing 1.6 kg vs 1.4 kg matters? It matters massively. The difference is that one will cause the majority (85%+) of people to bring their laptop everywhere with them, and the other to leave it at home 9 times out of 10. The interesting part is that most of these people won't even recognise this trend and its cause. We don't function how we want to function; we function how we function.

Also on this topic, looks are not superficial either. They would be if we were thinking from a theoretical perspective that ignores the reality of human biology; for us, appearance is as much function as traditional function is. Do note that waiting a few extra seconds for the system to boot isn't just waiting. It's experiencing that same set of events every time, knowing that it's unnecessary, since things operate perfectly fine with alternative solutions (macOS sleep...).
Having said that, I completely understand your point. It's true and valid, for you. It applies perfectly to you and you're correct, but everyone is different and needs different things in different situations too. I don't use a phone case or screen protector because I've never dropped my phone (10+ years); I don't know how to drop it. My sister drops her phone almost daily and definitely needs a case. Some people can sit on their phones and scroll through a bunch of boring content in uni lectures; I can't do that, nor be there at that slow pace, so I would just leave. I like rollercoaster rides but I hate going because of the queue. Most people don't like the queue, but I really don't. On the other hand, I can sit there and watch paint dry, or watch a water filter slooowlllyy drip water through to fill a jug. I love watching loading progress bars and percentages during long file transfers and am just drawn to staring at them the entire time (sometimes half an hour). I can watch small sections of certain videos (boxing moves, an F1 turn at a corner, 5 seconds of a music video) over and over again for hours, and it feels as if it's only been 30 seconds at most. I can do all of these things, but being told to press the brake pedal before releasing the e-brake annoys the heck out of me. I know why it's there; it's there for a good reason, for other people, but not for me, so it still is very annoying. I can't see myself ever making that mistake, but I can definitely see other people doing so. People are very different, and they change throughout their lives too.

On a constructive note: one way to make the hibernate feature work much better would be to display the loading time as a bold number on a black screen, counting from 1 to 10 as the seconds go by. The loading should be done within 6-7 seconds on that laptop in the worst case, but it should always count to 10 even if it loads sooner, and only then show the lock screen. That would completely change people's behaviour around the wait time. It's not the waiting; it's the inconsistency, and the distraction the inconsistency causes. Some are more prone to it than others, and neither is better than the other. It has to load at exactly 10 every time, counting the seconds, not the progress. This would result in significantly greater user satisfaction than a load time that is sometimes 5 seconds and sometimes 7, or a progress bar that moves at different rates and then just jumps from 60 to 100 sometimes. This is why they put timers on long traffic lights: it doesn't make the time go any faster, but it affects human behaviour massively. You can time it, and so you can switch off, as opposed to always being on and ready at any instant for the next 2 minutes.
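The fixed-countdown idea above can be sketched in a few lines. This is a toy illustration, not real firmware: the function name is hypothetical, and the 10-second figure is just the one proposed in the comment.

```python
import time

def resume_with_fixed_wait(do_resume, fixed_wait_s=10.0):
    """Run the (variable-length) resume work, then pad the wait so the
    total perceived duration is always the same fixed length."""
    start = time.monotonic()
    do_resume()  # the actual resume-from-hibernate work
    elapsed = time.monotonic() - start
    # keep the on-screen counter going until the fixed time has passed
    time.sleep(max(0.0, fixed_wait_s - elapsed))
    return time.monotonic() - start  # total perceived wait
```

With this, a 6.5-second resume and a 4-second resume would both "finish" at the 10-second mark, which is exactly the consistency argument being made: predictability over raw speed.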
It's not always about impatience. If you're a travelling freelancer or corporate staff going from meeting to meeting all day, you cannot afford to "wait" for an OS to load up. You also cannot "wait" for your computer to shut down at the end of a boardroom meeting. The MacBook's sleep function has always had this advantage over Windows.
Intel certainly is back, this looks great. Thanks for the detailed review.
No, they are not. 10%+ battery drain in standby mode is a joke, and if they don't come up with a solution in the next few years, x86 in laptops will be in trouble 😊😊
@@kirby21-xz4rx The Tech Chap's review said he left it overnight in standby mode, and in the morning the laptop still had 100% battery.
Wow, it's awesome~ Thank you for the kind and impressive review.
Windows laptop prices have been rising, and now we have ultrabooks like this at £1500 in the UK for 16GB/1TB. There are currently no options for a higher tier, but they are expected. Fortunately, they drop in price if you are patient. The same spec X Elite is now £999 with a 15-inch display. If I were buying now and as a non-gamer, the X Elite would be a compelling alternative, saving £500, though Google Drive and my VPN still do not have an ARM version (using temporary workarounds).
As usual, in the US, the prices are lower but not inclusive of local taxes. Here, the common sense solution is to wait for the launch prices to fall. Another example in the US is the base M3 MB Pro 14, which was on sale for $1099. This is crazy value for ultrabook users who do not need more than 8GB of RAM.
Seems kind of a mixed bag honestly. I want to see the updated drivers and how it will perform on MSI Claw 8.
Is the NVMe easily replaceable? Thanks in advance. Love your reviews.
Nice job, bro. Looking good, tbh. I was worried, but Intel did make some real progress here with LNL. Now we need better drivers and iGPU updates to fully take on AMD's APUs. This is good competition. I wonder if any console maker would consider blue again from now on.
This is super interesting! Couple of questions for you folks hanging out in the comments:
1. Is there a chance we get an equivalent to this, using the same architecture but with an Nvidia GPU in addition?
2. What will be the Bazzite support for this thing you think?
just use Oculink to connect your dGPU
Just bought the 16GB model since it was on sale for $250 off. Basically handed my G14 down to my wife since she is a web developer. The feel of this thing is amazing.
I was thinking about the 32GB model but couldn't say no to the sale on the 16GB model.
Arkham Knight is not a driver issue; the game purposefully refuses to launch on Intel branded gpus. It's never going to be fixed unless Rocksteady updates the game. Intel provided a workaround, but I don't think it was incorporated into a driver package (it's a modified game dll). You could try swapping that dll manually.
The early reviews of Lunar Lake are certainly showing a lot of promise.
It will be interesting to see the higher end Intel Arrow Lake vs AMD Strix Halo.
Arrow Lake isn't going to do very well vs Strix Halo. Arrow Lake-H is only 6P + 8E cores, 14 threads in total, so it can only compete with the 12-core, 24-thread Strix Point, and the Ryzen HX 370 should win at the same wattage. As for the GPU, Arrow Lake-H will have a weaker GPU than Lunar Lake, so it's no contest there. Intel will need the 8P + 16E core Arrow Lake-HX to compete with the 16 cores and 32 threads of Strix Halo, but that's an inefficient desktop design scaled back from 250W, whereas Strix Halo is optimized for efficiency. Arrow Lake-HX will likely require 150W+ to compete with Strix Halo, which won't go over 120W including its massive GPU, provided it can compete at all when Halo has the 256-bit memory bus advantage. And the Arrow Lake-HX GPU is even weaker than Arrow Lake-H's, so against Strix Halo it will get demolished and require a dGPU. The only thing Arrow Lake-HX can compete with is Fire Range, which is basically desktop Zen 5 at laptop TDP, with Fire Range likely winning in performance per watt since Zen 5 is very efficient when TDP-limited. Both will have abysmal battery life, since as desktop derivatives they don't have good idle power consumption. The only things Arrow Lake-HX will win are ST performance and the iGPU, since Fire Range will still have the terrible 2-CU RDNA2 iGPU, but these chips will come with a dGPU anyway, and the iGPU will only be used for watching videos.
Finally!! After how many years exactly?
@Phawx: Great video, thanks! Can you say something about the glossy finish of the OLED display? I have an older XPS 13 and really like its anti-reflection coating. ASUS, however, does not state that those displays have a particular coating. How annoying would you rate the glossiness of those displays?
Intel is smart waiting on the Claw. This gives Intel time to work out driver kinks before the Claw's release. Super excited.
I was thinking about getting this, but I might wait until the next generation with DDR6 RAM (faster bandwidth for the GPU) and Thunderbolt 5. I think those are the only 2 things Intel is missing with this chip.
Intel laptop is looking competitive again! Asus needs to update the Zenbook Duo with Lunar Lake right away.
Your testing goes into great detail, but I find it kind of difficult to read the game benchmark graphs! Could you please make the bars less bright/saturated and instead give every processor or device a separate colour so that they're easier to differentiate? And also maybe show the FPS per watt?
Great points! I'll look into seeing how I can differentiate colors when comparing AMD vs Intel going forward. Regarding FPS/watt. I've been working behind the scenes on a comprehensive way of actually testing these against a unit wholesale. Meaning including the dynamics of battery characteristics as well (voltage/#ofseries) But it's still taking time
we need to get phawx to 500k subs, come on people! get the subbin
How do you compare the wattages in the gaming benchmarks? Since the memory is now on package, isn't it included in the wattage readouts, while for Meteor Lake it is not? Or did you cap it another way? It's not a massive difference, but it might mean LNL 28W == MTL 25W, since MTL uses a couple of watts for memory (and thus LNL is even better than shown).
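As a rough sketch of the napkin math raised in several comments here (the ~2 W on-package memory figure is the one attributed to Intel in this thread, not something I can independently confirm):

```python
# Assumed on-package LPDDR5X draw, per the ~2 W figure cited in the comments.
MEM_POWER_W = 2.0

def comparable_power(lunar_lake_package_w: float) -> float:
    """Subtract the memory share from a Lunar Lake package-power reading
    so it lines up with chips whose memory power is measured separately
    (e.g. Meteor Lake, Strix Point)."""
    return lunar_lake_package_w - MEM_POWER_W

for tdp in (17, 28):
    print(f"LNL {tdp} W package  ~=  {comparable_power(tdp):.0f} W elsewhere")
```

So a 17 W Lunar Lake run is roughly comparable to a 15 W run on the other chips, which is why people argue the efficiency lead is slightly understated.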
The fact that you can run an AAA game on an integrated graphics chip at a playable frame rate is impressive on its own.
How does it compare to the Zenbook 14 3405 model? Is it worth paying more to buy this one over the 3405?
Can't wait to get this on the GPD win max
Are these devices good enough or should I go for a MacBook Pro? I have used Windows for 20 years.
The Empire strikes Back, GJ Intel
Hi, I am deciding between the Zenbook S16 24GB and this S14 Lunar Lake. Which one should I choose?
Any chance of some Linux testing in the near future?
Super excited about the new upcoming handhelds now. But Oculink is a must.
Damn, okay Intel. Get tha GPU drivers going and let it fly.
This. Since tiger lake we have been asking for driver improvement from Intel. At least they seem on the right track now.
I literally just came here for the sick transition of the intro
I dunno... I expected more from LNL, especially the GPU part. It's good that it is comparable to and even leading against AMD, but I'm still worried, and I'm wondering how the Xe2 dGPU will stack up against RDNA4.
Yeah, considering that the RAM is on the CPU package and they have a node advantage over the AMD AI series, I was also expecting more.
It's also a very low-margin product for Intel. AMD could squeeze them hard with upcoming launches.
Remember, Intel is years behind in driver optimisation, which means they will gain more performance over time, just like the A770 and A750 went to nearly double the overall performance from day 1 to several months later.
This is the best case for Intel: a 3nm vs 4nm node advantage, minimal IO that reduces die size and power consumption on top of the smaller node, 15% faster RAM, and AMD clearly prioritizing CPU performance over GPU in the HX 370. After all that, Intel can only match Strix Point in GPU performance at best above 15W. With Z2 Extreme being more GPU-focused, and AMD spending more time to optimize it this time around, Intel will only fall further behind. As for dGPUs, RDNA4 seems very impressive for a midrange GPU from what's been leaked so far: 4080 performance at 240-260W using the slightly improved 4nm node is great. Battlemage, on the other hand, doesn't look so good, with Intel only reaching their 4070 Ti target in synthetic compute benchmarks and gaming landing only around a 4070. If Intel doesn't manage to find some leftover performance, it looks like their best GPU is going to lose to the 7800 XT in raster, and the RDNA3 architecture that card is based on will soon be 2 years old.
@@HP4L16 Bro, don't let the tests here fool you. Multiple users have done tests on Intel and found that at 1200p Intel beats Strix by at least 20-30% on reasonably optimised games, including Cyberpunk. It's just that Phawx used 720p, which reduces the load on the GPU, so Intel had more juice going unused.
As for the dGPU market, I don't expect Intel's next gen to battle anywhere near AMD's or Nvidia's top levels. I think it will stick around 5060 levels, hoping to gain market share there.
One key difference this time is that Battlemage will gain optimisations multiple times faster than Alchemist did, due to the new design and how fast they can optimise games.
Did you take into account the on-package RAM? If not, maybe you should subtract 2W for a better comparison.
Was the battery life of 15 hours with 120Hz turned on?
Thanks Phawx!
So much for Qualcomm the x86 killer, haha. I hope non-Apple ARM computing gets better soon though; never say no to competition.
sensational review
I'm imagining a Lunar Lake-based laptop with a dedicated GPU in the future being able to actually provide great temperatures and lower fan noise. Not sure we'd see this type of integration, but it would be nice. If anyone has seen any announcements for Lunar Lake + dGPU, please let me know, as I'd love to keep an eye on it.
I suspect it'll be Arrow Lake parts with a discrete GPU. A "gaming" laptop will usually have a higher power envelope and heavier-duty cooling.
The "disconnect" between the Alchemist and Battlemage drivers may be due to the drastic architecture change from Alchemist. For example, for older games they have moved away from DX9 toward a DX12-emulated type of solution with Microsoft, plus a lot more that makes things run closer to the metal instead of through intermediaries. We have to wait and see how this turns out at the driver and dev optimization level. If it turns out OK, the desktop cards may have a smooth landing, with the kinks being ironed out on Lunar Lake, and might give tough competition to RDNA4, given that we know Intel was good at RT in the first place. And if Lunar Lake fixes itself in the driver department, MSI's bet on Intel might make them really rich, given that XeSS AI upscaling is available on the Intel handhelds ONLY, and FSR4 (FSR AI) can't run on AMD handhelds, since it might need special RT cores or machine-learning components. As we know, RDNA 3.5 will be around until Strix Halo, so no machine-learning upscaling is there (the exception being if FSR4 runs via DP4a instructions, like XeSS does, on pre-RDNA4 handhelds).
Doesn't look that interesting to me. I thought it would be so much stronger than Strix Point, but it isn't. Strix Point is a full 12-core or 10-core chip with all cores able to do multithreading and AVX-512, which Lunar Lake doesn't even have. And it has the same driver problems they had with the previous-generation Meteor Lake. Intel should build full cores and work on their drivers, not cut down every new core and call it efficient. I don't want to pay for repackaged CPUs like 13th/14th gen; I don't know why they are ignoring this. Good video as usual from you, but nothing has changed for me with this. Either I want gaming (Strix Halo, or Strix Point with or without a dGPU) or I want fully functional cores, which Intel can't do.
I can't believe how bad the Claw was. But with the new architecture's 1 thread per core, Intel finally looks competitive in the handheld market.
Will the 288V be able to achieve its full power in the Zenbook S16?
I really wonder if you'd get more gaming performance at lower wattage by disabling the 4 efficiency cores. I feel like 8 cores is still too much.
See if you can turn off connected standby; I'm not at my computer to give you the command. That's what's wrong with the standby power consumption, and Qualcomm likely disables WiFi on standby.
Also, standby is pretty much obsolete with SSDs; coming back from hibernate is almost as fast, and no, we don't have to worry about write wear on the SSD, though I suppose most won't believe that.
I'm impressed with what Intel is doing, but I still don't trust them to get iGPU drivers right until I actually see it.
Now to find a way to get LNL to drop to 7500 MT/s memory so the iGPU comparison is more apples to apples. Also, I think in some of the games the better single-core perf of LNL compared to AMD makes a difference. Anyway, what happened to the RDNA3 vs RDNA3.5 deep dive? Have I missed it? I am glad Intel is competitive on the iGPU front now; it will push AMD a bit...
Slick chip, I must say. And slick hair too!!
Can't wait to see this chip in mini-PCs.
Intel's future is bright, Q4 and beyond will be good days for them
They're literally looking to sell off massive portions of the company in order to stay afloat but yeah the future is definitely bright.
So we don't need gaming laptops anymore?
If you look at Hardware Nexus's results, Lunar Lake absolutely destroys the AMD chips in gaming, and I believe the difference is that he used a more realistic 1200p resolution, which let the GPU show its power.
What gaming? Are you joking? All games are unplayable. My laptop from 2019 is way better with its shitty RTX 2060.
@@dantebg100 Shut up fanboy, you keep hating all day long here not appreciating a single thing about this massive improvement by Intel
@@dantebg100 and how's your battery life with that system? I bet it isn't 12+ hours.
Hardware Nexus ? Are you sure it's not Gamers Unboxed ?
A little disappointed in the lack of improvement in performance. I’m hoping the desktop cpu is much more impressive than the current 14th gen cpus.
Hi! How bad would you say the reflections on the screen are? I had a MacBook Air M1 and that was acceptable. Would you or anyone recommend an anti-glare screen protector?
Benchmarks. Love it.
9742 multi-core in Cinebench R23 is... embarrassing if it's replacing the old H series (which I believe it is not), and only decent if it's replacing the U series. Dropping hyperthreading and cutting cores would end Intel's whole career if this is how all the chips perform. I am an H-series laptop user, btw; I want high performance, but only up to a level that can be cooled with low fan noise. Truth is, I am on the brink of going all out with an HX and a GPU on steroids, but I am hoping processors and GPUs get more efficient before that happens.
You should do an S14 with eGPU review. I was hoping they had released the laptop with TB5...
Metro Exodus on the HX 370 vs LNL must be an outlier, or there was some sort of bug; the other tested games showed the two chips neck and neck at 17-20W.
Intel strikes back BIG TIME! Wow! These laptops are Macbook Air killers!
I can't find reviews of the Samsung Galaxy Book 5. Maybe it isn't released yet, but I'd love to see a comparison between these 2.
Lunar Lake laptop and maybe a Arrow Lake PC handheld for gaming and video editing when docked.
Intel making an absolutely fantastic showing here (esp. with efficiency), but is weirdly getting tripped up on graphics drivers wrt gaming smh.
You have to question this. What's the point of having such a nice display on a gaming machine if you can only run games comfortably at 720p or 1080p? At that point, you're simply paying for a display that doesn't really enhance your gaming experience and regularly chews through more battery than it needs to.
There are a few ways to look at it, but modern games are just too much to run on these chipsets. Honestly, memory bandwidth is our biggest bottleneck. But you can still play a lot of games.
Terrible standby time. Microsoft, ASUS and Intel need to work on it.
An x86 problem, not "Microsoft's"; Snapdragon/ARM already proved that.
@@kirby21-xz4rxNo, this is not an x86 problem. This is caused by 3rd-party drivers or bad firmware. In this case, it must be 3rd-party drivers from peripherals or utilities that Phawx installed, since other reviews of this laptop have like 1% drain per night. Snapdragon X laptops can also drain 10% or more per night due to bad peripherals. Many such reports are around, but it’s less frequent due to no legacy driver compatibility.
Hi Phawz!
Battlemage please be good. If they can get 4070-like performance for less than 300 USD I'd make the switch.
TSMC is fighting TSMC, that's what you mean? Not using their own foundries is a slap in the face. Pat should be deeply ashamed.
Yes, and they need to be fair on price too, because this is quite expensive. I mean, almost everything nowadays is quite expensive, sadly.
Test standby in Linux or steamos
Why not use the same games to compare the Z1E vs Strix HX vs Lunar Lake, all 3 together?
compatibility issues
@@kartikpintu If you mean no games exist that can run on all 3 chips, that is a serious issue.
I really wish we could exorcize Apple design from tech. These notebooks choose the wrong priority between form and function. How much better would cooling, peripherals, durability, serviceability, uptime and IO be if those damn things were a mere centimeter thiccer? There is a reason why IBM ThinkPads are legendary.
Fair, OEMs shouldn't try to appeal to MacBook users this much. Granted, the beauty of x86 versus Apple is that you don't have to choose stupidly thin/underequipped laptops if you don't want to. We've traditionally enjoyed 'chunky' laptops with great IO, cooling and serviceability AND great battery life and decent portability through monolithic Ryzen (hopefully Arrow Lake-H delivers too).
Lunar Lake kicks ass for what it wants to be though.
Is it the manufacturers' fault, or do mainstream buyers just want that trade-off? I remember how Jobs was always against phones bigger than 4" because of the thumb's range, but the market showed they don't care about Apple's beliefs, so Apple had to give in and copy Asian Android phones... It goes both ways. Mainstream customers decide.
NOOO absolutely NOT 🚭 I'm convinced you've never even held or used a thick laptop BEFORE because if you have and say this I'm absolutely concerned 😭
@@kirby21-xz4rx What a load of BS. Just because you don't like it, don't assume others are the same.
I still use an almost 8-year-old laptop, a 17.3" ASUS RoG G752, which is big AND heavy. It weighs about 4 kg, and the charger is big and heavy too. I never had any problems with that. I always prefer having ports and good cooling over a thin slab which forces me to use dongles (which I hate).
Now I realize I'm on the other side of the spectrum in regards to how much tolerance I have for big and heavy laptops. But to say that basically NOBODY would like a slightly bigger and heavier laptop, and that you're concerned (about what? Having an awesome tool?), that's just another level. I'm more concerned about your warped senses in this case.
how to say you're a shill without saying you're a shill
lol, I've had Intel fanboys call me a shill when I would dog on Intel and show how much better AMD is. I just call it like it is
@@ThePhawx So how exactly is Intel "back" with an 8-thread CPU? They are literally regressing. But you're not a shill, LOL.
Everyone just has to acknowledge that it is only good because it is fabricated on TSMC's 3nm node and not on Intel's shit fabrication.
Yea great job, the single core performance is a bit higher than Apple M1 from 2019, while using more power.
Real 😂
M1 launched in november 2020. Nice try, fanboy!
@@Winnetou17 ok 2020, 4 years ago
Have they decided to not nuke the 1 core with moronic voltage?
I am wondering if the Battlemage desktop GPUs would be better than my A770.
The top Battlemage GPU had better be faster than the A770, otherwise it will be an absolute fail of a launch. Unless, I don't know, they sell them at $150 drawing 100W; that wouldn't be a fail. But even in that case, people will say Intel Arc will go nowhere if that's the best they could do in 2 years.
Let's hope this translates to Battlemage as well 😊
At last some real competition to the AMD APUs.
Are we sure those chips don't self-destruct?
It's low power, hard to explode LOL
CyberBug 2077 and Exodus are 2 games which extremely favor Nvidia and have a history of poor performance on almost every AMD GPU. Just in case you guys did not notice.
And standby still not fixed? That's just sad.
Something is wrong with its optimization. More laptops must be tested.
My Samsung Galaxy Book 4 Pro only drops 2% after 8 hours of sleep and never hibernates. It has Meteor Lake.
I heard AMD's new laptop CPUs don't have upgradable RAM anymore; is Intel the same?
This one in particular, yes; all Lunar Lake has soldered RAM.
you can upgrade with amd
strix will become available with so-dimms from ces onwards
Thats good to know
@@TheBackyardChemist It's not soldered, it's on the SoC package.
Galaxy book 5 360 looking really interesting
I'm still curious why the HX 370 doesn't use LPDDR5X-8533.
In order to increase the frequency (and the transfer rate) you have to lower the voltages, but doing that hurts signal integrity, so you have to put the memory closer to make it all work. That's why Intel is packaging the RAM with the CPU. Intel also has a better IMC, so they can run RAM more stably at higher frequencies.
AMD never should have given up on HBM.
AMD never give up on HBM lol what r u sayin
Might as well play the Slim Shady song! Guess who's back? 😂