A comparison with a 780M would be interesting. At 15W, 30W, 45W.
Once I get the new Intel iGPU laptop, which should arrive in about a week and a half, I should be able to do a comparison and bring out some data. I might just slap it onto the end of the Intel video in a chart format showing the difference between the 780M, the 890M, and the new Intel offering.
@@TechGuyBeau Noice!
@@mariopenulli1395 Noice indeed
Would love a 900p or 1080p showcase if that'd be possible at some point. Personally I ran games at 900p on the ROG Ally X. It just looked great and improved performance and battery life quite a bit. Compared to 1200p I'm sure it'll boost the FPS quite a bit. This new gen is going to be something else. Too bad it's so expensive @@TechGuyBeau
Breau didn't tell me to buy, but I looked into my wallet anyway.
... I'm never going to financially recover from this.
Consoom
This has been going on for quite a while already. I got myself an ASUS ROG Strix G17 (2022) laptop for travelling, with a 6900HX + 3070 Ti, and often I don't realize that I forgot to turn on the dGPU before playing something. RDNA2+ iGPUs are awesome.
Unfortunately, we're unlikely to see those powerful APUs standalone; most of the time we get Intel, like the 13900H on the ASUS Vivobook Pro. An Intel CPU with the shitty Iris Xe is trash.
@@nhh4977 Are you sure? I see a ton of 8845HS thin-and-lights: Yoga Pro 7, Vivobook S 14, etc. All running the 780M. I expect the 890M to be the norm next year in laptops.
@@nhh4977 You're wrong though. Many makers will start selling the 890M; some already have preorders available, and in 2-3 weeks they'll be in stores. As for Intel, they also showed off Lunar Lake with an iGPU comparable to the 890M; it will hit stores in around a month.
Interested in if or when these will make their way to the Framework laptops, as long as upgradeable RAM is still a thing.
Not before Q1 2025 for sure. SO-DIMM support for these doesn't exist yet, so assuming Framework insists on upgradeable RAM, they won't have Strix Point CPUs until AMD releases these CPUs on a different platform. XMG said the same.
They will probably wait for CAMM2 before they adopt any future hardware, given AMD and Intel require LPDDR5X speeds on their current and future lineups.
What frequency was the RAM running at? 7500MHz? Because I know it drastically increases the FPS.
Very interested in how the wattage fluctuates when you lock it to a performance target (i.e. BG3 at 30 fps cap). I'm not inclined to push handheld wattages to the metal but rather let them use what they need to get a performance target. I think it makes the experience more consistent when you're not bouncing up and down in frame rate and the battery improvement is very noticable.
Can't wait for the Strix handhelds!!!
I believe the 17-20W meta will be strong with them. Even the Ally X can run most games efficiently at 18W. It'll be so good 😁 Super excited myself.
Hi man thanks for the review
I have a question: how does this CPU perform when it's not plugged into the charger? Also, what's the battery life if you game on it unplugged?
It is extremely efficient when not on the charger. AMD has never really had issues with that in the past four or five years; that was more of an Intel issue. However, the new Intel offering, Core Ultra Series 2, will probably rectify this for them as well.
Integrated graphics are starting to show results.
Hoping 24H2 adds some extra FPS.
How the hell does it run so well at 20W? 😱 Like 50W, I get it... but to run at 20W with almost no performance drop? 😯
This is mind-blowing. It will be interesting to see how this chip performs if they make a custom version of it for handhelds. Hopefully they will do that, same as the Z1E: a Z2E. Phawx did a breakdown of it on the Nerd Nest the other day and said it won't be that much more efficient, but after seeing this I'm getting my hopes up! Thanks for sharing this!
screen tearing...
What is very attractive to me is that the temperature of the processor does not exceed 70°C even at 100% load! That number is great for a laptop processor: an extremely powerful and cool-running chip.
Yeah and it’s a thin and light laptop
NGL, this laptop is very tempting over a 14-inch base M3 Max, since it's cheaper and still delivers M3 Pro 12-core-class multicore CPU performance, and the 4060 mobile is good enough for most professionals too. Do you know if the voltage offset on the 370HX can be adjusted in Universal x86 Tuning Utility? If yes, this has some extra efficiency potential.
I live in Turkey and, in my region, this laptop costs 119,000 TRY, or about $3,500 USD equivalent after taxes, which is $900 to $1,000 more expensive than the G14 with an RTX 4090 and 7940HS, open box or brand new. The ProArt P16 has only had a small run and is out of stock now, but the old G14 can still be had for 85-95k TRY. And there is no doubt that the G14 is the superior of the two and still one of the best choices for gaming and productivity in its class. Not putting such a mighty capable SKU in affordable chassis with the option of upgradeable DDR5 5600 MT/s is sacrilegious. Screw ASUS and all it stands for, honestly. I'm done giving them my money, and I've also been telling those who ask for laptop recommendations to steer clear of them as well, ever since GamersNexus' video on ASUS' RMA scam operations came out.
Would be good to see a comparison between this and Lunar Lake.
Already ordered a lunar lake laptop. Hopefully it arrives soon
@@TechGuyBeau Which Lunar Lake? I'm looking at the ASUS S14 with the 258V for the wife.
@@TechGuyBeau I know it's not the same and a different OS, but I'd like to see how these do against the M4 on natively run games.
Also, the title is a little misleading, as this isn't the most powerful iGPU. An M3 Max would kill this iGPU in everything.
@@GlobalWave1 If you're into rumours, I've seen a leak saying the M4 GPU will score 45,000 in OpenGL compared to 28,000 for the 890M and Lunar Lake. If true, that would mean the M4 will smoke them in natively run games but will be comparable in ones running on Rosetta. Also, Apple being Apple, don't expect it to cost less than $2,600.
@@nocapproductions5471 I wonder what the M4 Max will produce. I expect it to be around the most powerful laptop you can currently buy in CPU and GPU.
It is serviceable and playable (for the most part); not a great experience, but acceptable for low-end gaming. TBH, when FSR/FSR FG is used it looks bad because of all the artifacts. But man, that fan noise would drive me NUTS after 10 minutes. Strix Halo seems VERY promising: if it delivers the rumored roughly RTX 4070 (probably RTX 4070 mobile, but still) level of performance, 1440p gaming on mid-to-high settings could really be a thing on a laptop APU.
Funny thing is, and I'm not lying, I'm playing Black Myth on my M2 Max through GPTK2, CrossOver, and CXPatcher, running in Steam, on high settings (not cinematic) at 1440p with FSR and no frame generation.
The game runs great, and I can't even imagine the upcoming M4 Max running that game if it were native.
Just saying, Apple has been hitting those settings since the M2 Max. I know it's Apple, but let's wait for AMD to catch up. 😅
@@GlobalWave1 The M2 Max is a massive chip with a 30-38 core GPU, 400GB/s of memory bandwidth, and 67B transistors overall. Strix Point, for example, is around 34B transistors, so it isn't a good comparison. I'd say the M2 Pro or M3 Pro would be a better comparison. Also, talking about price, your M2 Max machine probably cost several times more than the average Strix Point machine.
@@eliadbu You're right, and a better comparison will be when the Strix Halo devices come out. I think the M4 Max will come close to matching the 4090 mobile in some tasks.
the fan noise LOL wow. nope!
Well, it's on max. You'll notice at 30W there is no fan noise and no perf dip.
use some in ears or headphones duh
@@pillepolle3122 wouldn't it be cheaper to cut my ears?
@@someguy321 cutting your ears won't solve your problem, because you will still be able to hear the fan LMAO
@@pillepolle3122 LMAO 🤣
So the performance is similar to a 4050??
@@m.preacher2829 absolutely not. The 4050 is still much stronger.
Probably more so like a 1660-ish level of performance.
1650
@@PixelatedWolf2077 The 1660 is like a 1070.