They’ll satisfy every mac user LOL, that was the point of every Macbook for years The difference is this years Macs are even better at being macs then all of the previous Macs and that is not a bad thing
@@Jo21 Yes, but these are simply not. There is bunch of people with priorities that this device is tailored for and they are going to be incredibly happy with it. The same way there are people who are going to love Asus G15 and people who will tell you that for their priorities it’s complete trash.
2:54 That's a bold choice to use a 90's brick phone as a pointing device instead of literally _anything_ else. What a delightfully weird repurposing of e-waste, I fully support it.
Isn't it a fair comparison? I've seen a lot of guys "reviewing" or showcasing their M1 Max Macs as equal or better than a desktop! A desktop which they don't have or had a 5 year old one with middling specs! 🤣 I've also gotten comments from people who bought the propaganda and tell me it has RTX 3080 (not mobile) performance! 🤣
It's a no go if you're serious about music production. Rosetta is literal trash to run VSTs on and more than 70% of the 3rd party plugins are just not compatible on M1 which gliches out and have performance issues. Audio interface doesn't shows up some times and if you've to change your choices of VSTs based on machine, it's not a good machine to begin with
@@clickbaitpro You're dead wrong about pretty much all of that. There are very very few plug ins that cannot run in Rosetta, and the hit at 10% is far less than the CPU advantage that the M1 provides. NI are lagging, but NI have traditionally lagged, they're pretty much the worst at adopting to anything new. That said, Kontakt runs in Rosetta etc.
I dunno if/how you're coaching Anthony to host these things, whatever you're doing, keep doing it. He just keeps getting better and better for every video, and he was good enough with a decent margin to begin with!
I'm a little puzzled about Apple's claims. On the CPU side, it makes sense. The M1 is using a much more efficient architecture compared to the x86 that AMD and Intel is using. But that's not how GPU's work, right?
I’m pretty sure that we can’t tell that much since Most apps and engines are not even optimised for ARM architecture much less this SoC… Till now, It was mostly Rosetta 2 which was doing most of the work, however it doesn’t really mean much against other PC’s
It’s also using a more efficient architecture for the GPU. Normally you’ll have the CPU feeding data to the GPU, and the GPU storing in it’s memory. This is why high end GPUs have higher bandwidths, because this is a limiting factor. This new chips don’t need to do this, the memory is unified, and CPU and GPU can share memory directly. This obviously requires massive changes in the application. Right now what we can see, is that the M1 macs have very limited graphics performance because Rosetta can’t use this trick, it emulates the previous architecture by copying data from CPU memory to GPU memory (in this case they’re the same). This essentially halves the throughput, and that’s why performance is so poor.
It's kind of the stepped up version of Smart Access Memory. The M1 series is the first modern implementation of a consumer unified direct access memory architecture. Just like how the most effective/efficient Mining GPU's are really the best memory bus implementations, these unified CPU/GPU/DRAM chips are going to start eating the modular systems lunch as long as they can get a large enough memory pool.
@@ezicarus8216 shhhh, they might realize that the imaginary system reserved memory for the igpu is actually a thing also.... x86 consoles go as far back as the original xbox... which yea it was shared memory for the gpu/cpu
GPUz have way more than RAW performance. For example, OPTIX of nvidia, is way smarter, therefore better for rendering than CUDA on the same card. Thing of games that RT cores, are incoparable to raw performance for raytracing, even if they consume less die space and power consumption
If you want to actually see the promised performance gains: Use it for software development. Build times went from 4 minutes (2019, 16" MBP, max spec) to
I used to respect the guy but i'm not sure what to think about him or LTT at this point. If they don't address this i'm unsubscribing. - ua-cam.com/video/g1EyoTu5AX4/v-deo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind! - ua-cam.com/video/OMgCsvcMIaQ/v-deo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop - ua-cam.com/video/JM27aT9qhZc/v-deo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia.. - ua-cam.com/video/YX9ttJ0coe4/v-deo.html 16" M1 Max MacBook Pro vs. My $6000 PC The list keeps going. These results have been out for a while too so LTT really have no excuses. They didn't use optimized software, they didn't compare laptop performance while on battery, they didn't use readily available GPU benchmarking software thats already proven to place the M1 Max around the 3080 level. They need to explain.
I clicked on the video expecting very good comparisons in diffrent scenarios and such. All i got is dolphin emulator and some random bench data. How disappointing.
These recent reviews with Anthony hosting are so damn high quality that I can't wait for some of the LTT Lab content to drop in the next year. It's gonna be absolutely sick.
One thing to note, you can’t set up the pro max chip with 16GB of memory, so if you don’t NEED 32GB, it’s actually a $600 difference to go from the base M1 pro to the base M1 pro max. The $200 for the chip upgrade itself and $400 for the memory upgrade.
Yeah when I bought my M1 Pro 16 in November I wanted to spring for the max but like you said it quickly became a grand difference in price and I’m really happy with the Pro.
I would not buy a laptop with 16gb of memory. I know it's use-case dependant, but I constantly am running up on 16GB on laptop and desktop. Given usually it's when doing 3D workflows, or container development, but it does feel like even more casual use and gaming workloads are going to be pushing up on 16GB soon enough that the cost of upgrade is worth it to make the computer relevant longer. Like, I'm using 10.6GB of mem right now just to have like 20 tabs of firefox/chrome and spotify open.
@@QuakerAssassin yeah it’s def application specific. I bought my Mac for just working with Lightroom and Photoshop and to work with raw files and it’s amazing for that. But I really don’t render anything or game so it works great for me as far as productivity goes.
This "the answer may surprise you. It sure surprised me" thing is starting to be a distinctive mark of Anthony's videos and I like it. Love the energy!
I thought he was going to say Apple was true to their word and that their marketing accurately reflected their products; that would have been shocking.
Something I think was missing was battery life under load. A key part of Apple’s claims was sustained performance even on battery and at much greater efficiency. So I’m curious how the gaming comparisons would look if you capped framerates to 60 or 30 across machines and compared battery life then. You showed Apple exaggerated how close they were in raw performance, and now I want to know how much they exaggerated on efficiency.
If you have such a workhorse, why use it on battery where it would die in less than 4 hours IF it was at 100% battery? Seems extremely unrealistic scenario.
I’m a video editor , I have used Mac and pc for a long time. Recently built a nice PC and I game too much on it lol so now I’m thinking of getting the M1 Max for portability. Glad to hear it’s a beast at what I need it for. This is definitely not for everyone
It definitely is a beast especially if it has a native support for Apple silicon. If you game unfortunately there isn’t any game that natively supports it yet, if there was then you’d get close to 3080’s performance for far greater efficiency. The biggest advantage of these chips are the performance power you get on the go versus any other laptop on the go. The MacBooks just smoke them there and if you travel a lot getting a MacBook over the others is going to be a no brainer. Just remember that you’d have to sacrifice playing some AAA game titles though but if Apple themselves release some AAA games for the Mac, I’m sure more game devs would see the potential in the Mac and port titles to them. That possibility definitely exists but it’s going to be a gamble.
Interestingly Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented suggesting either serious issues with testing methodology or massive pro Intel bias. Either way an update urgently needed by the Linus team to respond to those observations and recover lost credibility
@@skadi7654 no, they misrepresented data for whatever reason. Others have proven the reality, but although IMO LTT we’re raising an important and valid concern about these laptops, they did it in a very sketchy and either underhand or unprofessional way. See the Max Tech response for more details.
@@Prithvidiamond every laptop you say? You do know that there are laptops with desktop cpu and desktop gpu? I mean they are absolutely huge, are barely able to be transported but they are still laptops and they will be 2 to 3 times more powerful than m1 macs for the same price. It's not a fair comparison but you might want to lower your expectations on Apple claim.
@@Natsukashii1111 Laptop and Portable computer aren't the same. Macbook is a laptop. Some Clevos that you are talking about are "portable" computer whith whom you can do evrything as long as you have a desk and power socket. Without those two it's a bigass brick good for nothing.
I was wondering why the rtx had an arrow head on its bar graph, while the others were normal rectangles. Then I realized the rtx was so much higher than the others that it was being truncated so you could even compare the other bars 😂
Yes, because it makes good sense to compare one of the most expensive desktop configs you can buy to these LAPTOPS. There are not enough eyeballs available in the world to roll for this asinine comparison…
I wouldn’t doubt yourself so quickly. Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented in this video suggesting either serious issues with testing methodology or massive pro Intel bias. An update urgently needed by the Linus team to respond to those observations and recover lost credibility
I'm really tired of mobile parts being called the same name (eg: 3080) as their exponentially more powerful discrete counterparts. They're fundamentally different parts I feel
I mean, they’re up to twice as powerful on desktop, but that’s plenty to mislead consumers. AMD and Apple aren’t doing that, though. Just Nvidia. I take issue with your use of the word “discrete” here - the 3080 laptop GPU is still discrete graphics because it’s not on-die with the CPU. Still, I take your point, and I second it.
@@djsnowpdx That's a fair distinction, is there a category to describe desktop + workstation + server GPUs? The only thing I can think of is 'PCIe GPUs', vs mobile GPUs and iGPUs. There's also the distinction between the specially-made rackmount-only versions, like the A100, which although use PCIe, are not PCIe-socketable, which futher muddies things
@@cristhiantv they’ll probably say some delusional things like ”pro users always have their laptop plugged in anyway so power consumption isn’t an issue”.
@@sqlevolicious you don’t know his life, do u? Also power consumption is important if you’re rendering videos on the go…. But you’re gonna probably reply something telling us how stupid we are just by looking at your comments before… so don’t mind answering, have a good day
Would be interesting if they used TeraFLOPS as a unit of measurement to determine estimated GPU performance. :) Now it's not the best unit to use, but the FLOP can show 32-bit precision calculations per second.
Not only not the best, Teraflops is quite possibly the worst measurement to use, since for every generation and architecture performance per flop can differ so much. The only thing it's good for is marketing number shows (also relative estimated performance withing one gpu family of the same generation, but that's besides the point).
@@ZerograviTea. Wow. I didn't know it was the worst. So, what is the best unit of measurement for GPU performance? GPU bandwidth (GB/s), throughput, or something totally different?
It's no wonder all of the reviews were so glowing when these laptops came out. It's because all of them almost exclusively focus on video editing and the Adobe Suite. "Benchmarking" often times is just video render times, and it's frustrating, as you can clearly see, it doesn't paint a good picture overall. The Zephyrus is what, at least $1k less? And it performs largely the same, at the cost of battery life? I guess efficiency is a good thing, but these laptops are good for only really very specific purposes, and I question whether they entirely deserved the ubiquitous glowing reviews when they dropped.
If you also consider programming, then also the m1 pro and max outshines the competition. Android projects and java projects are significantly faster than even the top-end machines running linux. Python and TensorFlow builds are also faster, although there somehow the m1 pro trains and builds the ML model faster than the m1 max due to some reasons. So in the departments of media creation and programming these laptops are truly top of the class.
Apple's gig has never been good value. I would actually consider buying it for the hardware if not for the OS lock-in. $1k for weight/battery life/build quality? Sure, why not.
@@Lodinn This is why, despite it's many downsides, I still kind of like the MacBook 16in 2019 with the updated keyboard and i7. Boot camp gives it longevity, and being that it runs x86, it runs all modern day apps. Obviously efficiency isn't nearly there, but all the other MacBook perks are, which makes it a rather nice machine. Outclassed for sure by these last few years of laptops by orders of magnitude, but hey, until Razer or Microsoft can get the build quality down as good as Apple has, it's an attractive option.
@@aritradey8334 That's fair! I haven't seen too many benchmarks I guess in the programming world, which I feel is telling when it comes to the reviewer landscape. With that being said, I remember some of the Hardware Unboxed review, and now this one, and they are such a stark contrast to the uniform praise these recieved upon launch. Great machines for sure, especially for those who use the areas they excel at. I guess I'm just rather exhausted at all of the review outlets only reviewing things for videography, simply because that's what they do. Their reviews shouldn't be a general "review" and should be more a "videographer review", so that those who don't watch/read 18 reviews like a lot of us here who do this for fun, don't get the wrong ideas.
I did wonder and reminded me of how Volkswagen optimized their software for specific use cases. I considered M1 briefly for a Linux laptop but then quickly reconsidered - if not else for the keyboard - and went for a Thinkpad Ps. I don't think these Macs are good for generic purpose computers. They are fine for the same task a Chromebook is also good for, or for the special video editing stuff. Seems quite niche, lucky them they can sell it with marketing.
Anthony, your screen-presence has improved so much from your debut. You’ve clearly gotten much more comfortable in front of the camera, and you provide a wonderfully logical insight (pun intended) into the things you present. I know you’re going by a script, but surely you contribute, and you make it yours.
They pretty much had to for this M1 chip anyway. Can't really run widely compatible API's if you're going to do specialised hardware & also claim it slays a top of the line dGPU while using less than half the power. They just don't tell you that the software to actually get the claimed performance isn't widely available (yet).
@@MLWJ1993 Just wait until the community implements OpenGL using Metal, similar to MoltenVK. It's not really "specialized hardware", it's just a graphics API, that's how every GPU works. That's why OpenGL support is still ubiquitous on non-Apple GPUs, even though they're architecturally much more geared towards Dx12 and Vulkan, which are very similar to Metal (in fact, Metal itself is barely anything more than a deliberately incompatible clone of Vulkan because Apple is still Apple). The M1 CPU may be awesome at clearing up decades-long inefficiencies of the x86 architecture, but the GPU world has long progressed way beyond that. Apple has no such advantage there. The only reason they are even remotely competitive in a performance per watt benchmark is TSMC's 5nm node, to which they currently have exclusive access, but from an architectural standpoint they have a lot of catching up to do with both AMD and Nvidia.
@@DeeSnow97 The M1 just sucks for "community anything though" since Apple doesn't really do much of anything to have "the community" fix up their slack. Most of the time they specifically go down the path where they like "the community" to be able to do absolutely nothing. Like doing basic servicing of a device...
Great perspective, appreciate the continued, in-depth coverage on these. I also appreciate what feels like an objective, enthusiastic investigation of the tech, neither a takedown nor blind exaltation, thank you so much for your work!
I would love one day to see Deep Learning Benchmarks as well ... as a DL practitioner, looking forward to the comparison for both CPU and GPU workloads.
@Christos Kokaliaris You can get these notebooks with 500nits, 4k 120Hz displays if you are willing to spend the cash. Personally I use external monitors.
@@MrGeometres if you run stuff on cloud, nothing beats a 900 dollar Macbook Air. You get a wonderful display, great touchpad, nice keyboard. At some point you have to run stuff on cloud if you are doing serious business. It does not makes sense to put thousands dollars to workstations that don't run most of the time and don't scale at all.
Unfortunately, the answer on HDMI 2.1 adapters is currently no, for software reasons. I think if you guys make a video on it that could get Apple’s attention to finish it
Great review but I'm curious about differences between pro and max for development benchmarks i.e. code compilation. This is generally a very large use case for these macbooks.
Depends on what you're compiling, if your stuff can compile on mac and is not using GPU acceleration, then the difference is minimal/non-existent. The efficiency cores on Intel next year will be very interesting, and AMD finally moving to 5nm, though that is supposedly end of year, will be very interesting to see performance jump with that including the new cache stacking. It's great getting past the stagnation. I'm probably upgrading end of next year, will move from laptop (i7 9750H, it's 3 years old now) to PC since moved continents, and things like Rider and VS Code having remote support means I can just have home PC host the stuff (which I do often enough on my NUC if I need to run overnight).
3 роки тому+9
Check Alexander Ziskind youtube channel for many, many development benchmarks done to the M1/Pro/Max machines, most videos are very short and to the point. In general, CPU-bound work sees very little difference between the Pro and Max chips, you end up seeing more differences being caused by the number of cores available on the different versions than in the kind of CPU. In some cases, specially single-threaded ones like some javascript tests, a MBP16 running a maxed out i9 might beat the numbers, but if the workflow is multithreaded the M1 chips do it better. Unless your workflow really needs more than 32GB of RAM a 10 core M1 Pro is probably the "sweet spot" for development at the moment.
My friend is a senior engineer for Apple and he does both iOS and MacOS compiling. He got a Pro for himself and they gave him a Pro for work too because the Max isn't necessary for their developers for the most part. Only certain developers would get allocated a Max but he hasn't heard of any devs getting them.
The lack of Vulcan, Cuda, or OpenCL support on Macs is absolutely killing multi platform compatibility for even professional workloads and games have taken a giant leap backwards.
That is Apple's work, they just remove and destroy a industry-standard like OpenCL and OpenGL / CUDA (they never supported the most powerful GPUs, which are Nvidia). In Linux and Windows, when you get a new standard, they let you use the old one, it does not just get removed, which destroys a lot of software. You can still run 32 Bit Apps on Win and Linux very well and that is how you must do it. Apple is just typically arrogant and does not care about its users. That is the reason why they have not had more than 10% marketshare globally, not once in 44 years the company has existed.
@@nigratruo x86 is stagnant and needs a complete reboot... but noone got the guts for it... Apple did and they now have quite powerful machines that uses little power... perfect? not yet... but way better for what they are meant for and then on top of that they can game decently... but again not perfectly... yet. but the extra power of the M1 chips? especially the pro and the max? well they could (should) be interesting for game devs to tap into
Things i still want to see covered: 1) How much can the USB-C take? 8 hubs fully loaded with all the native monitors going, with also X extra monitors using displayLink, while running a USB connected NAS and 10gb Ethernet dongle 2) EGPU support? If not, what happens if you try it? What if you try to force the Nvidia or AMD drivers with rosetta? 3) Wipe one of the system and use it as a daily driver for a week but this time refusing to install Rosetta. How are the proformance numbers changed without the emulator running or even listening
Thx. This made me re-consider buying a MacBook, after using one for mobile purposes for 4 years. I work as a freelance architect and I fell in love with Twinmotion, a UE based real time renderer. Path tracing is not a thing with Mac and it realy sucks for the price. It might get a metal update, but I need it like now and can’t wait another year. Gonna get a W notebook again I guess. Also the price difference is ridiculous.
Using Apple laptops for UE wouln't make much sense until Epic natively supports them and that will take at least a year (because they first have to release UE5).
I'm using a macbook and honestly, overpriced as hell. You should go with a high end laptop if you're willing to pay the same price for better performance. There's the Razer Blade Stealth 13 for around the same price, it's a thin and light with just better performance most of the time.
My pick would be gaming laptop. For the extra GPU power and cooling for UE. I have Lenovo Legion and when I have to upgrade again I will buy the same brand again. It is a bit clunky but it stays cool all day long with i5 on turbo and nvidia 2060.
@@davide4725 “Thanks TSMC” You sound like the kind of guy who loves bringing up John Lennon’s wife beating tendencies every time someone mentions they like the Beatles lmao
I am also loving the progression for the ARM space. What really excites me isn't the CPU or GPU in these, its the optimizations they made to make ARM that competitive. They're getting asic-like performance for a lot of low-level stuff.
@@davide4725 i find it funny how you called the other guy "kid" while here you are having absolutely no knowledge on RnD, Design, audit, documentation, subcontracting and manufacturing process works in the tech industry. "Thank TSMC" lol. Kid please.
the progression of Anthony and how much better/confident he has become on camera should be an inspiration for everyone to practice confidence in social setting (which is even worse on camera when you're staring into a lens instead of talking to people)
M1 can throw a lot of weight around as a DAW host, especially running Logic and AS-native plugins. It's reportedly less-well-suited to realtime audio tasks (like recording live guitars through software amp sims at low latency in a busy session) but it absolutely pommels at mixing and composing tasks that don't require super-low RTL figures under load. The 32GB Max variant will benefit a serious composer who wants all of the orchestral libraries and soft synths loaded at once, although all that GPU will be drastically underutilized in the same scenario.
I have the M1 Pro Max. First Apple computer I have owned. And I am nothing but impressed... Sure I could find something I don't like about it. But... I could show you a list of complaints with my last laptops that are far worse. How efficient it is does have a lot of value. My last laptop was $2,000 when i purchased it from Lenovo. And I needed a Go Pro for a project. realized the memory was full and it killed my laptop battery before it could get the footage off. Even Chrome would noticeably kill battery life. Having a laptop that is useless without a being plugged in sucks.
Blender 3.1's metal support is very nice. I still don't think it beats out some of the higher end RTX cards, but it still performs very well, even in the alpha stages
Watching Anthony go from absolutely HATING being on camera to being so much more comfortable that he cracks me the eff up with an intro like that! Bravo Anthony! 👏 👏 👏 I almost spit out my coffee lol'ing at that. Great work.
So much of this is really optimization in code, for those of us that lived through the changes from carbon to coco to metal and from Motorola to PPC and then to Intel, one of the things that happened was after a giant change in architecture, over time as software gets updated the macs would get faster. Even Apples OS is still hitting rosetta. The review is still fair, but in a year the results from the same hardware will most likely be significantly different.
Steam drains the battery on my 16” Mac(M1 Max) faster than running Windows on Arm(Parallels) + ECAD(Altium) or Keysight ADS for EM field solving. Yeah… Just having the Steam launcher running, not even with a game going. Oh well, i never intended to game on the Mac anyways since I have a gaming PC… but in terms of work, the Mac can do everything I need it to do in portable form factor, while maintaining all day battery life.
@@LiLBitsDK I was just backing up his point that unoptimized things can run really bad no matter the device. Like in my case, something as trivial as the Steam launcher
Exactly. That's another crazy thing about M1. It will just get faster as we get updates. Normally machines will be slower as they age since software gets more complicated.
I'm interested if the laptops were connected to power. Also interested what the battery percentages would be at the end of the test with all laptops disconnected from power, and how hard the fans blew.
I think it's pretty clear that Macs run much better on battery power than most PCs. At least until the latest Intel and AMD chips are properly put to the test.
@@angeloangibeau5814 I disagree heavly. The point of laptops is portability, but that doesn't mean I will use them unplugged. Battery life is good but not as important as Apple makes it out to be. It's not that important like phones. When I am using my laptop for more than an hour, it's usually on a desk and almost all places I visit with a desk, they have an outlet.
Oh wow I was watching this video and I couldn’t help it but wonder if I’m the only one being shocked with this review! I respect and admire Anthony but I believe the software used for these tests was cherry picked and doesn’t show the full potential of the MacBook Pro! I came down to the comments and I’m shocked not a lot of people talk about this and then I saw someone mentioning a channel Max Tech talking about this review. I went to watch it and indeed I agree with them.
@@truthseeker6804 you don’t sound very objective yourself. If you actually looked around on other channels, the ones that are “pc fanboys”, also did more objective tests that are similar to max tech.
@@solidsn2011 ive watched some videos from other channels and the results were similar as here. can you post an objective video you think differs from here?
@@truthseeker6804 I found this review lacking and leaning, specifically in real world function. Bropants, actually hook up all the monitors, play the games that currently work, put it against more PCs, more price ranges, yadda. I thought they did better on previous M Chip reviews. Meh, I can’t judge. I recently followed this channel because I thought it was actually called “Linux Tech Tips”. *I know.*
I know it’s not the same as a real thorough test, but most benchmarks agree that the M1 (any variant) run virtually identical both plugged and unplugged.
Absolutely, One of the main use cases for a laptop is while unplugged. The first test they should do is fully charged and unplugged performance testing, then while charging, and then when fully charged but plugged in. Large differences in performance can result in various situations, and only testing while plugged (or unplugged) can skew results to the testers desires. I think Apple's claims might be correct IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive.
not on the 14inch. if needed, yes on the 16in. best to stick with the Pro on the 14in, or if needed Max with the 24core GPU. the 32 core is voltage limited in the 14in.
I’m a software dev who also edit in Resolve and do some Blender on my spare time and went for an 10/16c M1 Pro with 32Gb RAM, I don’t regret it : I don’t intent to game on it, Blender Metal support is coming, and ho boy that XCode build time is just fabulous ! That’s not just the extra CPU, faster SSD and mem bandwidth do make a huge difference, easily cut my build times in half. picking an M1 Max was just wasting battery life for me, as the test shown in the video is the best case, the drop in daily workloads is more like 30%
@@StefanUrkel a huge one! Used both on M1 16Gb, very decent performance but swap usage was way too high and memory pressure often in the yellow area. 32Gb is way way better!
So glad I got the Pro. From what I hear its the same story with AI compute (something I personally bought it for), as it uses the Neural Engine there is ZERO improvement moving up the Max. Its a shame they didn't double those cores. It will be interesting to see if the new Mac Minis will differ from the Macbooks or just use the same SoCs, as if they double the Neural Engine I will probably get one. Interestingly AI is where you only get about a doubling in performance going from M1 Pro to 5950X + 3080, so a bigger Neural Engine could really make up a ton of ground there. Whats the battery life, performance and noise level of the Zephyrus when running 3D on battery though? A big reason I got the Pro was the performance to watt and noise level were terrific, even manually pushing the fans up so the SoC never goes far over 80C under heavy load.
I like your tests and I am not an Apple Fanboy, but your results here are very different from most of the other Tech UA-cam channels that have tested these MacBooks
which other tech channel results differ from this? post a real tech channel, not a fanboy channel. before you post make sure that channel does a variety of reviews not only praising apple products.
@@truthseeker6804 All that I've seen actually. Here are some - ua-cam.com/video/g1EyoTu5AX4/v-deo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind! - ua-cam.com/video/OMgCsvcMIaQ/v-deo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop - ua-cam.com/video/JM27aT9qhZc/v-deo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia.. - ua-cam.com/video/YX9ttJ0coe4/v-deo.html 16" M1 Max MacBook Pro vs. My $6000 PC The list keeps going. These results have been out for a while too so LTT really have no excuses. They didn't use optimized software, they didn't compare laptop performance while on battery, they didn't use readily available GPU benchmarking software thats already proven to place the M1 Max around the 3080 level.
@@andremessado7659 so i watched the first video, and the m1 max actually lost to the laptop and desktop, in the chart in export times, but it did well in the timeline playback, thats literally the same as this video in the davinci resolve section at 5:28. in the second video, the gaming laptop totally destroyed the m1 max on power, not on battery. i skipped the third bias max tech apple fanboy channel video. regarding the fourth video, the m1 max lost in all the charts except the 6k braw export, which is interesting because the first link you posted had a faster than the m1 max export speed on the gpu. so in summary from the first, second and fourth video, the m1 max does best in video playback on a video editing timeline, but loses to 3080 or 3090 in video exporting, stabilization, rendering, benchmarks, everything else.
That's the first time I heard that someone is preferring the silver color. I also got a silver one and, looking around online, it seems like I'm way in the minority with that decision.
I agree with what you say: M1 max is literally only for professional video editors, which is a super ultra niche market, for everyone else, it's not worth it.
I think it'd be more accurate to say media professionals and developers in general. It's absolutely fantastic for professional audio production and software development. Silent the vast majority of the time and can easily handle on-location and remote tasks with it's awesome battery life with full power whether plugged in or not. The high-impedance-capable headphone jack and best-sound-in-a-laptop ever doesn't hurt either. I think it's important to compare Apples to Apples here (pun intended). They're not designed for gamers, they are designed for professionals. As an equal Windows and MacOS user, my experience with these has been top-notch. For pros, Apple has hit a home-run here IMHO. Also, I think the power-per-watt here should not be ignored and I don't believe this was mentioned - add that factor to the benchmarks and you'd see some very different results. Energy costs money and affects the environment. And a hot, noisy laptop isn't particularly enjoyable to use day in and day out.
Super niche. Because let's face it, the m1 air can do 4k editing. How many editors need to edit 12 simultaneous 4k streams? Most youtube viewers don't even watch in 4k yet rofl. I really wish it performed better at 3d design.
@@wykananda for audio professionals most of them were good with an older generation macbook with high memory configuration tho. also for non video editing/audio professionals, macos is really really difficult to use. even more so with arm. basic stuff like a volume mixer and any sign of useful window management are absent out of the box. what is the point if you are spending such a premium to get a sub par experience with non video editing/audio professionals.
@@pupperemeritus9189 Hi pupper. I'm not sure I understand your comments. Sadly, the previous Macbook laptop generations were all limited to16gb of ram - so high-memory configs were simply not possible. Moving to the ARM architecture did not change the underlying operating system, MacOS, it simply made the laptop hardware run faster, smoother, quieter, and for much longer on a single battery charge. As for the difficult-to-use / sound control / window management - the latest Windows and MacOS are both more than reasonably user-friendly and well-equipped in all these areas - these OSs have both been around for many years and generations now and it shows. As a multi-OS power-user I could nit-pick plenty at both OSs here and there for sure though. However, in my experience, for the countless newbies that I've trained and continue to help, MacOS has to get the nod for getting productive and comfortable more quickly with less frustration and confusion and less problems over the long haul. Let's face it, both operating systems are DEEP. They're both very capable and stable at this stage but either will take time and effort to learn to get the most out of them. Curiously, my current "go to" Windows-based laptop is a 2015 Macbook Pro running Boot Camp - ironically, it's easily the best Windows laptop I've ever owned - cool, quiet, fast, stable, good battery life, well-built, expandable - and, of course, it runs MacOS like a champ too. I'll likely get another 3-4 good years out of it before I hand it down the line. IMO, the 2015 MBP was the best overall professional laptop ever made for Windows, MacOS, or Linux until now. While I can run the ARM version of Windows on the latest MBP via Parallels and so on, I'll have a new laptop King if-ever/when Microsoft Windows gets fully up to ARM speed and these new killer Macs can boot into it natively.
All I know is on some particular photo editing workloads (specifically running a noise reduction filter in Topaz labs on a 45mp tiff from a Nikon D850, after editing in Lightroom and Photoshop), the M1 in the *Air* absolutely seems to crush my desktop Ryzen 3900x / 3070 combo. I couldn’t believe it, and it made me a little sick 😂 Can’t game on it, and the Ryzen / RTX combo is still an utter beast in other tasks, but for certain tasks these laptops are unbelievable value.
The x86/64 version of Lightroom is horribly bloated and has been in need of a re-write for the last 5+ years, using it was a nightmare on my i7 4800hq laptop as a student... Whereas my M1 Pro (base 14") rips through everything no problem, my 3700x and 2080 Super are fine but yeah LR is still slower than I think it should be...
Yup. Honestly no idea why it took LTT so long to get these videos out. All this information is widely known by now. Seems like a huge miss on their part for being so late to the game on these. If they didn't receive the products in time then sure thats fine but it's also LTT.... Surely they could of worked it out.
As time goes, I start realizing that Max Tech tends to only or mostly show the advantages of M1*. Have to watch other channels to find out for instance about the only 90%ish Adobe RGB (this is bad for Photoshop semi/professional editing) of this screen and the very slow screen response times (35-100 ms) - find Hardware Unboxed for these 2 - ua-cam.com/video/p2xo-hDCgZE/v-deo.html . Or what it is here.
@@ContraVsGigi AdobeRGB? lol, a lot of professionals don't need or want 100% AdobeRGB coverage because they're working in an SRGB or Displayp3 workspace. Non-issue. 90% is actually a very good result for AdobeRGB anyway.
@@DriveCancelDC Personally not a fan of the guy or his channel but i'll give him credit for shitting out a buttload of videos when the M1 Pro and Max Dropped. He was on it from day 1. It's been almost 2 months and Linus is only just putting out a video now? I expected better honestly.
One thing not mentioned when doing the benchmarks, how do all the laptops ( MacBooks and Zephyrus) perform while only on battery. Yes battery usage length is great, but how is the horsepower of the cpu/GPU effected running apps while on battery. I think some surprises might arise.
8:57 To make matters worse for Mario Sunshine, the starting level is the easiest level to run on lower end hardware. So the fact that it hovered around 70s and 60s, that's not looking for the m1 max. However it may just be due to the rendering API being wonky on Macs
Benchmarks in C4D/redshift don't tell the full story. You need to go into redshift's render settings and manually increase bucket size to 256/512, then you'll see a 25%+ improvement in render times.
Anthony got that point right: apple is designing their hardware specifically for video editing, with everything else left behind. For me, the M1 was shiny when nothing was installed on Mac, but after loading programs and keeping stuff running in the background, M1 isn’t that shiny and amazing anymore. Battery life is still great but far from what apple exaggerates. Given the slow rollout of M1 optimised software, M1 optimised workflow won’t be seen for quite a while.
I’d really love a software dev take on this. For my use case fast cpu, good battery life and 64gb of ram are compelling - but are distinctly not video rendering.
Developer here, I wouldn't buy any of these besides the base-level MacBook non-pro. You can literally code on a Raspberry, unless you're compiling something crazy-complex like an entire browser you're not going to feel the difference, so why pay extra for literally nothing? A USB-A port would have been a compelling addition, but oh well.
Other developer here. Never found myself desperate for a usb A port while developing but have definitely found a use for better cpu and ram. Not sure what serious developers are developing on trash hardware tbh.
@@JackiePrime Web, for example. I don't develop on trash hardware because I can afford better equipment, but if I still had my old FX-8320 it wouldn't slow me down in any way. Peripherals are way more important at that point. Also, every single hardware debugger uses USB-A, and even if you just want hobbyist stuff have fun hunting down a USB mini-B (not micro-B) to USB-C cable just because you can't use the included mini-B to A. But it does make sense, if you only develop for iOS (which is literally the only reason I've ever considered buying a Mac) then you won't run into any of those issues, and Xcode being a hot mess does necessitate a faster CPU and more RAM. But there's a lot more to development than just Apple's walled garden, and if you step out of it it's a lot more important to be able to mess with any device you want to.
Also a developer here, gpu on the max is absolutely useless and 64 gb of ram is overkill for my line of work. 32 gb ram and 10 core pro is plenty plan to keep for about 4 to 5 years.
Another Developer here, I have the M1 Max with 64GB, 32GPU and 1TB SSD. While this setup is overkill, first I can afford it and feels good not having to worry about performance while working. On the technical side, running Webstorm, and other ides, multiple node apps, multiple docker containers, electron apps that suck like Slack etc takes a toll on any computer. If you can afford it, specially since software engineering is a well paid job, plus the resell value down the line, why not?
I was just auditioned for an animation job, I was put on a last gen Intel iMac, fired up Blender and put a normal map on one surface in the scene and the GPU almost caught fire and the whole macOS GUI dropped to 0.5fps, I'm not sh1tting you!!!
Normally I completely agree. Seems very skewed and that the apps selected were designed to show this in a poor light. Was it purposefully? Guess time will tell, but I believe this video will not age well. However PC fans will point this this sole video as why the new MacBook Pros suck, despite an overwhelming number of other reviewers showing the performance in a different light, several that are also typical PC reviewers.
@@BootStrapTurnerVideography you mean max tech right? A guy who literally said that M1 max MacBook is just as fast as 5950x desktop with RTX 3090. Yeah that guy is totally not biased at all. I think what Anthony wanted to point out here is that those apple marketing slides for M1 max were very very misleading.
@@bear2507 the illusion is that apple claimed the performance is about the same as an rtx 3080, not just M1 barely beaten the rtx 3060 its not even close to rtx 3080 and i mean mobile rtx gpu, an rtx is a GAMING gpu, so when they made these claim people will think about its performance for gaming obviously, should have compared it to a profesional gpu like quadro instead of being either brave or stupid to compare them to rtx
@@foxley95 yeah, i’ll go tell my research lab to shut down our datacenter with hundreds of 3080s, because some kid on youtube said these gpus are for games only and not generic compute. comments are full of children who have never touched anything ouside minecraft, but have an opinion on everything hahah
honestly anthony is the best and honest reviewer I guess. his dialogues are easy to belive and facts are legitimate. As for "maxtech"review channel they just support apple.
I don't know about you, but I use my laptop either at home, at work, or at coffeeshops. Everywhere there is a chair and a desk, usually there is an outlet.
I think the M1 series performance is excellent, especially for a first generation silicon. They are just so efficient and so fast. I do think software support still needs to catch up a bit, but that's a given. As time moves foward and with newer M2/M3 on the horizon I think we'll see some great things from these chips. The future is promising
I think software support will be very slow to catch up, Rosetta technically already brings over most of the optimizations a dev will make, anything else is libraries support like .Net. Looking at some .Net 6 benchmarks (where MS has stated they've done a ton of performance improvements), M1 should match 4700U about if it's a lot of cache eviction happening, and a little bit behind else. We're at the point where if it is a lower power task, the M1 will be better, x86 just has that minimum instruction set it has to load no matter the task, anything above 25W that advantage no longer exists and then Intel/Ryzen start beating it handily; also helps Apple is on TSMC 5nm, they've been great at investing there and keeping it as an advantage, AMD is only set to go 5nm end of 2022, while Apple might be on 3nm by that time, will be a very interesting year next year in tech.
@@lcarsos That doesn’t matter. It’s one thing to make a phone SOC. It’s an entirely different thing to make a ln SOC that scales up to desktop and prosumer performance while maintaining excellent power management. This is an entirely new beast
What about software compilation, data analysis and heavy crunching like that? Can you 🙏 test compiling Linux or something similar workflow for the 16” review? Pretty please 🥺 It’s a lot more relevant for someone like me
After rewatching this review, I went ahead and bought the base model of the 14 inch m1 pro. I will be doing more cpu than gpu heavy work but I didn't think the 2 extra cores was worth the money
could you guys test war thunder when you do the 16 inch? it supports metal. is usually HEAVILY single core limited. and is quite interesting to test. pro tip. the build in benchmark has different results than using test drive.
Was the Zephyrus M16 tested on battery or connected to power supply? Curious if there’s a performance difference on battery like on my G15, for me the consistency across both looks like it might be the biggest draw of the new Macbooks
It's most likely out of the direct power and with fans on full force (usually there is a high performance mode for these laptops) But this also boils down to the question - what are you more focused on? Aesthetics for other's benefits and tertiary factors or on your work?
Anthony, you tested Rogue Rogue Squadron III on a build from just before MMU support was added to JitArm64 :( That means it's likely running a lot slower than intended.
It doesn't really matter though. You can't even emulate those games at full speed all of the time on the most overkill PC setup money can buy. It just makes very heavy use of pretty much any trick you can pull off with the gamecube's hardware which is just not very comparable to current day hardware at all.
@@MLWJ1993 No? Modern x86_64 CPUs have no problem emulating Rogue Squadraon III. It's just that for any ARM64 chip, the Dolphin devs hadn't implemented accelerated MMU emulation until very recently, which made any ARM64 chip very slow in that game, even the M1.
@@neutronpcxt372 Pretty sure you run into slowdowns in all those games in transitions. It's not unplayable slow, but definitely not full speed everywhere. That's why the forums are full of people asking if what they see is expected behaviour for their overkill hardware or devs answering no hardware currently is capable to run the game "smoothly" when that's a specifically provided requirement.
Tailoring the chip to videomakers is essentially marketing genius, since it'll satisfy all the UA-cam reviewers
They’ll satisfy every mac user LOL, that was the point of every Macbook for years
The difference is this years Macs are even better at being macs then all of the previous Macs and that is not a bad thing
@@dominikhanus9320 well they ben crap since 2016
@@Jo21
Yes, but these are simply not.
There is bunch of people with priorities that this device is tailored for and they are going to be incredibly happy with it.
The same way there are people who are going to love Asus G15 and people who will tell you that for their priorities it’s complete trash.
Lol
@@dominikhanus9320 The problem is the price though. It’s just not a good value.
Man, that single photo of Linus has gotten more mileage than all of the cars I've ever owned combined.
Lol
But not as much mileage than you mom lolololol get rekt kid
@@maddrone7814 Bro how old are you
@@maddrone7814 nice, keeping it old school.
mozly not as old as your mom lolololol get rekt kid
2:54 That's a bold choice to use a 90's brick phone as a pointing device instead of literally _anything_ else. What a delightfully weird repurposing of e-waste, I fully support it.
Therapist: Gangthony isn't real, he can't hurt you.
Gangthony: 'MAXED OUT MACS ARE MAD MACS WITH M1 MAX TO THE MAXXX'
It’s like 2000’s cringe all over again lol
No offense I love LTT and I get it’s ironic
Thugthony
Max max super max max super super max max.
Gangthony > Punk Linus
@@randomrdp3356 Max Super Max Max Super Super Max Max Max
i love that the 5950X testbench is so much faster that it just goes off the charts
i know it's not a fair comparison, it just looks funny
Well, all serious work that can afford a $3000 Macbook can afford, and should use, a $3000 desktop, and can eat the added peripheral cost
It was just to remind fan boys it's still a laptop and not a monstrosity
Isn't it a fair comparison? I've seen a lot of guys "reviewing" or showcasing their M1 Max Macs as equal or better than a desktop! A desktop which they don't have or had a 5 year old one with middling specs! 🤣
I've also gotten comments from people who bought the propaganda and tell me it has RTX 3080 (not mobile) performance! 🤣
I’m just sad that there was no Davinci on Linux AMD performance comparison in the graph. Genuinely interested in how it compares either way.
It is absolutely fair. If i will spend 3000 for a computer, I need all the comparissons
I wish you guys did more testing in audio production, I'd be curious to see how Logic pro and Ableton Live run with lots of VSTS on the M1 Max
THIS!! They need to focus more on the audio production side
Alas, for sound is often forgotten but always essential
It's a no go if you're serious about music production. Rosetta is literal trash to run VSTs on and more than 70% of the 3rd party plugins are just not compatible on M1 which gliches out and have performance issues. Audio interface doesn't shows up some times and if you've to change your choices of VSTs based on machine, it's not a good machine to begin with
@@clickbaitpro How come other videos I've watched people have little to no problem with their VSTs on the new mac?
@@clickbaitpro You're dead wrong about pretty much all of that. There are very very few plug ins that cannot run in Rosetta, and the hit at 10% is far less than the CPU advantage that the M1 provides. NI are lagging, but NI have traditionally lagged, they're pretty much the worst at adopting to anything new. That said, Kontakt runs in Rosetta etc.
I dunno if/how you're coaching Anthony to host these things, whatever you're doing, keep doing it. He just keeps getting better and better for every video, and he was good enough with a decent margin to begin with!
Bro Anthony is the best!!! I just wanna be best friends with him 😂
Plot twist:Anthony is coaching Linus so that he doesn't drop thousands of dollars worth of electronics every show.
i think he is just smart enough to do it naturally
I was originally offput but him but I now love the man and need him to make more videos. He's great !
Anthony is the best ❤️
I'm a little puzzled about Apple's claims. On the CPU side, it makes sense. The M1 is using a much more efficient architecture compared to the x86 that AMD and Intel is using. But that's not how GPU's work, right?
I’m pretty sure that we can’t tell that much since Most apps and engines are not even optimised for ARM architecture much less this SoC…
Till now, It was mostly Rosetta 2 which was doing most of the work, however it doesn’t really mean much against other PC’s
It’s also using a more efficient architecture for the GPU.
Normally you’ll have the CPU feeding data to the GPU, and the GPU storing in it’s memory. This is why high end GPUs have higher bandwidths, because this is a limiting factor.
This new chips don’t need to do this, the memory is unified, and CPU and GPU can share memory directly. This obviously requires massive changes in the application.
Right now what we can see, is that the M1 macs have very limited graphics performance because Rosetta can’t use this trick, it emulates the previous architecture by copying data from CPU memory to GPU memory (in this case they’re the same). This essentially halves the throughput, and that’s why performance is so poor.
It's kind of the stepped up version of Smart Access Memory. The M1 series is the first modern implementation of a consumer unified direct access memory architecture. Just like how the most effective/efficient Mining GPU's are really the best memory bus implementations, these unified CPU/GPU/DRAM chips are going to start eating the modular systems lunch as long as they can get a large enough memory pool.
@@ezicarus8216 shhhh, they might realize that the imaginary system reserved memory for the igpu is actually a thing
also.... x86 consoles go as far back as the original xbox... which yea it was shared memory for the gpu/cpu
GPUz have way more than RAW performance. For example, OPTIX of nvidia, is way smarter, therefore better for rendering than CUDA on the same card. Thing of games that RT cores, are incoparable to raw performance for raytracing, even if they consume less die space and power consumption
If you want to actually see the promised performance gains:
Use it for software development.
Build times went from 4 minutes (2019, 16" MBP, max spec) to
I used to respect the guy but i'm not sure what to think about him or LTT at this point. If they don't address this i'm unsubscribing.
- ua-cam.com/video/g1EyoTu5AX4/v-deo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- ua-cam.com/video/OMgCsvcMIaQ/v-deo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- ua-cam.com/video/JM27aT9qhZc/v-deo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- ua-cam.com/video/YX9ttJ0coe4/v-deo.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too so LTT really have no excuses. They didn't use optimized software, they didn't compare laptop performance while on battery, they didn't use readily available GPU benchmarking software thats already proven to place the M1 Max around the 3080 level. They need to explain.
You guys should really add model training to your benchmark. Find a basic TensorFlow notebook and see how M1 fares against a neural workload.
Please please please!
Especially compared to Nvidia's Tensor cores on 20 and 30 series cards, which are both crazy powerful and have great platform support
I clicked on the video expecting very good comparisons in diffrent scenarios and such. All i got is dolphin emulator and some random bench data.
How disappointing.
@@OG_ALviK check hardware unboxed, LTT is like fast food despite being the biggest tech channel out there
100%, no idea wtf the dolphin review was. Odd at best.
These recent reviews with Anthony hosting are so damn high quality that I can't wait for some of the LTT Lab content to drop in the next year. It's gonna be absolutely sick.
Read this before video started. Then the first 3 seconds hit me…
Yeah
LTT labs content won't be in the form of videos, they'll be mostly print media, articles and posts on their website. He said so himself.
@@vijeykumar7429 no. He said most of it. Can’t imagine them spending so much money without making use of the information in videos.
this guys voice alone is 10000x better than linus IMO.
One thing to note, you can’t set up the pro max chip with 16GB of memory, so if you don’t NEED 32GB, it’s actually a $600 difference to go from the base M1 pro to the base M1 pro max. The $200 for the chip upgrade itself and $400 for the memory upgrade.
Nope, it is currently not possible to buy an M1 Max Macbook Pro 14 with 16GB unified memory. 32GB is the lowest option on Apple's site.
Yeah when I bought my M1 Pro 16 in November I wanted to spring for the max but like you said it quickly became a grand difference in price and I’m really happy with the Pro.
@@stewardappiagyei6982 ... that's what he said...
I would not buy a laptop with 16gb of memory. I know it's use-case dependant, but I constantly am running up on 16GB on laptop and desktop. Given usually it's when doing 3D workflows, or container development, but it does feel like even more casual use and gaming workloads are going to be pushing up on 16GB soon enough that the cost of upgrade is worth it to make the computer relevant longer. Like, I'm using 10.6GB of mem right now just to have like 20 tabs of firefox/chrome and spotify open.
@@QuakerAssassin yeah it’s def application specific. I bought my Mac for just working with Lightroom and Photoshop and to work with raw files and it’s amazing for that. But I really don’t render anything or game so it works great for me as far as productivity goes.
This "the answer may surprise you. It sure surprised me" thing is starting to be a distinctive mark of Anthony's videos and I like it. Love the energy!
To the max. Haha ditto!
same here, I enjoy Anthony's vids
ok
I thought he was going to say Apple was true to their word and that their marketing accurately reflected their products; that would have been shocking.
@@dmt1994 Yeah, same
Something I think was missing was battery life under load. A key part of Apple’s claims was sustained performance even on battery and at much greater efficiency. So I’m curious how the gaming comparisons would look if you capped framerates to 60 or 30 across machines and compared battery life then. You showed Apple exaggerated how close they were in raw performance, and now I want to know how much they exaggerated on efficiency.
Well, to actually make a comparison, the PC Laptops would need to deliver full performance on battery, which they can't.
@@andreasbuder4417 it would need a laptop that cost as much as the macs. The zephyr is was half the cost of the macs here
If you have such a workhorse, why use it on battery where it would die in less than 4 hours IF it was at 100% battery? Seems extremely unrealistic scenario.
@@Natsukashii1111 It is the same prize? at least in Denmark
wait, my mistake, it is the same prize as the low-end M1 pro 14 inch (2,600 dollars), therefore cheaper than the M1 Max 14 inch (5,000 dollars)
I’m a video editor , I have used Mac and pc for a long time. Recently built a nice PC and I game too much on it lol so now I’m thinking of getting the M1 Max for portability. Glad to hear it’s a beast at what I need it for. This is definitely not for everyone
It definitely is a beast especially if it has a native support for Apple silicon. If you game unfortunately there isn’t any game that natively supports it yet, if there was then you’d get close to 3080’s performance for far greater efficiency. The biggest advantage of these chips are the performance power you get on the go versus any other laptop on the go. The MacBooks just smoke them there and if you travel a lot getting a MacBook over the others is going to be a no brainer. Just remember that you’d have to sacrifice playing some AAA game titles though but if Apple themselves release some AAA games for the Mac, I’m sure more game devs would see the potential in the Mac and port titles to them. That possibility definitely exists but it’s going to be a gamble.
@@almuel you should watch the video.......... it is Rtx 2060 rather than 3080
Interestingly Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented suggesting either serious issues with testing methodology or massive pro Intel bias. Either way an update urgently needed by the Linus team to respond to those observations and recover lost credibility
@@skadi7654 no, they misrepresented data for whatever reason. Others have proven the reality, but although IMO LTT we’re raising an important and valid concern about these laptops, they did it in a very sketchy and either underhand or unprofessional way. See the Max Tech response for more details.
Same situation, 3080 gaming desktop but wanted a m1 for portability. You getting the m1 pro max or m1 pro ?
In laptop comparisons I believe having separate benchmarks for plugged AND unplugged scenarios would shine more light on Apple claims.
This, the mac slaughters every laptop on battery lol!
@@Prithvidiamond every laptop you say? You do know that there are laptops with desktop cpu and desktop gpu? I mean they are absolutely huge, are barely able to be transported but they are still laptops and they will be 2 to 3 times more powerful than m1 macs for the same price.
It's not a fair comparison but you might want to lower your expectations on Apple claim.
@@Natsukashii1111 "on battery"
yeah but why would you do something resource intensive on battery....
@@Natsukashii1111 Laptop and Portable computer aren't the same.
Macbook is a laptop. Some Clevos that you are talking about are "portable" computer whith whom you can do evrything as long as you have a desk and power socket. Without those two it's a bigass brick good for nothing.
Can we talk about the B-roll camera shots? Seems like they’re trying some new techniques here and I love it!
They're so clean!
@@charredolive Unlike Linus' humor. Lol ... :)
All jokes aside, LTT quality has up ticked in the last few months. :) I like this new style a lot.
I was wondering why the rtx had an arrow head on its bar graph, while the others were normal rectangles. Then I realized the rtx was so much higher than the others that it was being truncated so you could even compare the other bars 😂
^this
Yes, because it makes good sense to compare one of the most expensive desktop configs you can buy to these LAPTOPS. There are not enough eyeballs available in the world to roll for this asinine comparison…
@@ryanw8664 whatever it takes for them to make the mac look like a piece of trash. Honestly what trashy review
@@HuyTran-sb2ql malding comment
I wouldn’t doubt yourself so quickly. Max Tech did a response video to this revealing a surprising and very concerning set of anomalies in the data presented in this video suggesting either serious issues with testing methodology or massive pro Intel bias. An update urgently needed by the Linus team to respond to those observations and recover lost credibility
I'm really tired of mobile parts being called the same name (eg: 3080) as their exponentially more powerful discrete counterparts. They're fundamentally different parts I feel
I mean, they’re up to twice as powerful on desktop, but that’s plenty to mislead consumers. AMD and Apple aren’t doing that, though. Just Nvidia.
I take issue with your use of the word “discrete” here - the 3080 laptop GPU is still discrete graphics because it’s not on-die with the CPU. Still, I take your point, and I second it.
Technically the 3060 is different, has more cuda cores than the desktop variant and that's why they are actually comparable
That's their intention
@@djsnowpdx That's a fair distinction, is there a category to describe desktop + workstation + server GPUs? The only thing I can think of is 'PCIe GPUs', vs mobile GPUs and iGPUs. There's also the distinction between the specially-made rackmount-only versions, like the A100, which although use PCIe, are not PCIe-socketable, which futher muddies things
@@gustavrsh Probably right, might just be an upselling tactic
I’d be interested in the power consumption comparison during these tests
What nvidia doesn’t want you to hear.
I dont think you will ever see this in this channel.. the other machines would look like crap
@@cristhiantv they’ll probably say some delusional things like ”pro users always have their laptop plugged in anyway so power consumption isn’t an issue”.
Power consumption effects nothing in your life and costs next to nothing extra, unless you live in a shithole without reliable power.
@@sqlevolicious you don’t know his life, do u? Also power consumption is important if you’re rendering videos on the go…. But you’re gonna probably reply something telling us how stupid we are just by looking at your comments before… so don’t mind answering, have a good day
Anthony was the best decision LTT has done recently. Congrats to the both of you!
this test is pure crap. They should be sued by Apple for misinformation and lies.
@@nnnnnn3647 wut? They can't show results which they measured?
Or are you gonna say they should have only used software that works better on macs?
@@damara2268 They should use software that people really use./
@@nnnnnn3647 ok
@@nnnnnn3647 cope and seethe
would've been interesting to also include a G15, seems like a fair competitor (3070 and a pretty good ryzen chip, and about 7 hours of battery life)
I mean this comparison is fair because the Zephyrus is cheaper. So it actually isn't fair to the Zephyrus if anything.
Would be interesting if they used TeraFLOPS as a unit of measurement to determine estimated GPU performance. :) Now it's not the best unit to use, but the FLOP can show 32-bit precision calculations per second.
Not only not the best, Teraflops is quite possibly the worst measurement to use, since for every generation and architecture performance per flop can differ so much.
The only thing it's good for is marketing number shows (also relative estimated performance withing one gpu family of the same generation, but that's besides the point).
Lenovo Legion would be better because it has a MUX switch
@@ZerograviTea. Wow. I didn't know it was the worst. So, what is the best unit of measurement for GPU performance? GPU bandwidth (GB/s), throughput, or something totally different?
I really like the confidence Anthony grew over the times standing in front of the camera :)
The professional presentation and eloquent voice of my favorite Linus media group personalities makes this review very entertaining and informative!
It's no wonder all of the reviews were so glowing when these laptops came out. It's because all of them almost exclusively focus on video editing and the Adobe Suite. "Benchmarking" often times is just video render times, and it's frustrating, as you can clearly see, it doesn't paint a good picture overall. The Zephyrus is what, at least $1k less? And it performs largely the same, at the cost of battery life? I guess efficiency is a good thing, but these laptops are good for only really very specific purposes, and I question whether they entirely deserved the ubiquitous glowing reviews when they dropped.
If you also consider programming, then also the m1 pro and max outshines the competition. Android projects and java projects are significantly faster than even the top-end machines running linux. Python and TensorFlow builds are also faster, although there somehow the m1 pro trains and builds the ML model faster than the m1 max due to some reasons. So in the departments of media creation and programming these laptops are truly top of the class.
Apple's gig has never been good value. I would actually consider buying it for the hardware if not for the OS lock-in. $1k for weight/battery life/build quality? Sure, why not.
@@Lodinn This is why, despite it's many downsides, I still kind of like the MacBook 16in 2019 with the updated keyboard and i7. Boot camp gives it longevity, and being that it runs x86, it runs all modern day apps. Obviously efficiency isn't nearly there, but all the other MacBook perks are, which makes it a rather nice machine. Outclassed for sure by these last few years of laptops by orders of magnitude, but hey, until Razer or Microsoft can get the build quality down as good as Apple has, it's an attractive option.
@@aritradey8334 That's fair! I haven't seen too many benchmarks I guess in the programming world, which I feel is telling when it comes to the reviewer landscape. With that being said, I remember some of the Hardware Unboxed review, and now this one, and they are such a stark contrast to the uniform praise these recieved upon launch. Great machines for sure, especially for those who use the areas they excel at. I guess I'm just rather exhausted at all of the review outlets only reviewing things for videography, simply because that's what they do. Their reviews shouldn't be a general "review" and should be more a "videographer review", so that those who don't watch/read 18 reviews like a lot of us here who do this for fun, don't get the wrong ideas.
I did wonder and reminded me of how Volkswagen optimized their software for specific use cases. I considered M1 briefly for a Linux laptop but then quickly reconsidered - if not else for the keyboard - and went for a Thinkpad Ps. I don't think these Macs are good for generic purpose computers. They are fine for the same task a Chromebook is also good for, or for the special video editing stuff. Seems quite niche, lucky them they can sell it with marketing.
So what Macs did you get?
"M1 max"
Yeah I know you got M1 Macs, but what model
"M1 Max..."
**flips desk**
lol!
Anthony, your screen-presence has improved so much from your debut. You’ve clearly gotten much more comfortable in front of the camera, and you provide a wonderfully logical insight (pun intended) into the things you present. I know you’re going by a script, but surely you contribute, and you make it yours.
Apple's deprecation of OpenGL support is nasty.
They pretty much had to for this M1 chip anyway. Can't really run widely compatible API's if you're going to do specialised hardware & also claim it slays a top of the line dGPU while using less than half the power. They just don't tell you that the software to actually get the claimed performance isn't widely available (yet).
@@MLWJ1993 Just wait until the community implements OpenGL using Metal, similar to MoltenVK. It's not really "specialized hardware", it's just a graphics API, that's how every GPU works. That's why OpenGL support is still ubiquitous on non-Apple GPUs, even though they're architecturally much more geared towards Dx12 and Vulkan, which are very similar to Metal (in fact, Metal itself is barely anything more than a deliberately incompatible clone of Vulkan because Apple is still Apple).
The M1 CPU may be awesome at clearing up decades-long inefficiencies of the x86 architecture, but the GPU world has long progressed way beyond that. Apple has no such advantage there. The only reason they are even remotely competitive in a performance per watt benchmark is TSMC's 5nm node, to which they currently have exclusive access, but from an architectural standpoint they have a lot of catching up to do with both AMD and Nvidia.
@@DeeSnow97 well, Apple couldn’t “just wait.” They had a product they were ready to sell.
@@djsnowpdx lol, what a horrible take, Apple could have just kept on supporting OpenGL and not sold an incomplete product
@@DeeSnow97 The M1 just sucks for "community anything though" since Apple doesn't really do much of anything to have "the community" fix up their slack. Most of the time they specifically go down the path where they like "the community" to be able to do absolutely nothing. Like doing basic servicing of a device...
Great perspective, appreciate the continued, in-depth coverage on these. I also appreciate what feels like an objective, enthusiastic investigation of the tech, neither a takedown nor blind exaltation, thank you so much for your work!
I would love one day to see Deep Learning Benchmarks as well ... as a DL practitioner, looking forward to the comparison for both CPU and GPU workloads.
I know! The code to get a simple run of MNIST going is just a couple blocks of copy paste.
Get a workstation grade laptop. (Dell Precision / Thinkpad P-series)
@Christos Kokaliaris You can get these notebooks with 500nits, 4k 120Hz displays if you are willing to spend the cash. Personally I use external monitors.
@@MrGeometres if you run stuff on cloud, nothing beats a 900 dollar Macbook Air. You get a wonderful display, great touchpad, nice keyboard. At some point you have to run stuff on cloud if you are doing serious business. It does not makes sense to put thousands dollars to workstations that don't run most of the time and don't scale at all.
Unfortunately, the answer on HDMI 2.1 adapters is currently no, for software reasons. I think if you guys make a video on it that could get Apple’s attention to finish it
sure, cause they're apple's favorite reviewers.
yes, because Apple is well known to take into consideration what people outside Apple are saying /s
Great review but I'm curious about differences between pro and max for development benchmarks i.e. code compilation. This is generally a very large use case for these macbooks.
They use the same CPU, so while the extra bandwidth (and cache?) may make a difference, it's unlikely to be a huge one.
Depends on what you're compiling, if your stuff can compile on mac and is not using GPU acceleration, then the difference is minimal/non-existent.
The efficiency cores on Intel next year will be very interesting, and AMD finally moving to 5nm, though that is supposedly end of year, will be very interesting to see performance jump with that including the new cache stacking. It's great getting past the stagnation.
I'm probably upgrading end of next year, will move from laptop (i7 9750H, it's 3 years old now) to PC since moved continents, and things like Rider and VS Code having remote support means I can just have home PC host the stuff (which I do often enough on my NUC if I need to run overnight).
Check Alexander Ziskind youtube channel for many, many development benchmarks done to the M1/Pro/Max machines, most videos are very short and to the point.
In general, CPU-bound work sees very little difference between the Pro and Max chips, you end up seeing more differences being caused by the number of cores available on the different versions than in the kind of CPU. In some cases, specially single-threaded ones like some javascript tests, a MBP16 running a maxed out i9 might beat the numbers, but if the workflow is multithreaded the M1 chips do it better.
Unless your workflow really needs more than 32GB of RAM a 10 core M1 Pro is probably the "sweet spot" for development at the moment.
My friend is a senior engineer for Apple and he does both iOS and MacOS compiling. He got a Pro for himself and they gave him a Pro for work too because the Max isn't necessary for their developers for the most part. Only certain developers would get allocated a Max but he hasn't heard of any devs getting them.
It would have been nice to see the 2020 Intel MacBook Pro's included in these graphs.
The lack of Vulcan, Cuda, or OpenCL support on Macs is absolutely killing multi platform compatibility for even professional workloads and games have taken a giant leap backwards.
That is Apple's work, they just remove and destroy a industry-standard like OpenCL and OpenGL / CUDA (they never supported the most powerful GPUs, which are Nvidia). In Linux and Windows, when you get a new standard, they let you use the old one, it does not just get removed, which destroys a lot of software. You can still run 32 Bit Apps on Win and Linux very well and that is how you must do it. Apple is just typically arrogant and does not care about its users. That is the reason why they have not had more than 10% marketshare globally, not once in 44 years the company has existed.
@@nigratruo x86 is stagnant and needs a complete reboot... but noone got the guts for it... Apple did and they now have quite powerful machines that uses little power... perfect? not yet... but way better for what they are meant for and then on top of that they can game decently... but again not perfectly... yet. but the extra power of the M1 chips? especially the pro and the max? well they could (should) be interesting for game devs to tap into
I love the way that Mario Sunshine is used as a benchmark here lmao
Only thing that Macs can run
Things i still want to see covered:
1) How much can the USB-C take? 8 hubs fully loaded with all the native monitors going, with also X extra monitors using displayLink, while running a USB connected NAS and 10gb Ethernet dongle
2) EGPU support? If not, what happens if you try it? What if you try to force the Nvidia or AMD drivers with rosetta?
3) Wipe one of the system and use it as a daily driver for a week but this time refusing to install Rosetta. How are the proformance numbers changed without the emulator running or even listening
Thx. This made me re-consider buying a MacBook, after using one for mobile purposes for 4 years. I work as a freelance architect and I fell in love with Twinmotion, a UE based real time renderer. Path tracing is not a thing with Mac and it realy sucks for the price. It might get a metal update, but I need it like now and can’t wait another year. Gonna get a W notebook again I guess. Also the price difference is ridiculous.
Using Apple laptops for UE wouln't make much sense until Epic natively supports them and that will take at least a year (because they first have to release UE5).
I'm using a macbook and honestly, overpriced as hell. You should go with a high end laptop if you're willing to pay the same price for better performance. There's the Razer Blade Stealth 13 for around the same price, it's a thin and light with just better performance most of the time.
Lenovo has some amazing options in they're Legion range
@@Lius525 not gonna happen I think
My pick would be gaming laptop. For the extra GPU power and cooling for UE. I have Lenovo Legion and when I have to upgrade again I will buy the same brand again. It is a bit clunky but it stays cool all day long with i5 on turbo and nvidia 2060.
I am happy Apple is making great arm processors, and I’m also happy Anthony did the review for this episode again. Keep up the great work guys.
Apple makes nothing, thank TSMC
@@davide4725 “Thanks TSMC”
You sound like the kind of guy who loves bringing up John Lennon’s wife beating tendencies every time someone mentions they like the Beatles lmao
@@parkerdavis7859 Cute assumptions kid. Good bye now...
I am also loving the progression for the ARM space. What really excites me isn't the CPU or GPU in these, its the optimizations they made to make ARM that competitive. They're getting asic-like performance for a lot of low-level stuff.
@@davide4725 i find it funny how you called the other guy "kid" while here you are having absolutely no knowledge on RnD, Design, audit, documentation, subcontracting and manufacturing process works in the tech industry.
"Thank TSMC" lol. Kid please.
the progression of Anthony and how much better/confident he has become on camera should be an inspiration for everyone to practice confidence in social setting (which is even worse on camera when you're staring into a lens instead of talking to people)
Gotta love how you included a 500W desktop system in all the benchmarks where the Macs otherwise dominated ;-)
check 6:01
M1 can throw a lot of weight around as a DAW host, especially running Logic and AS-native plugins. It's reportedly less-well-suited to realtime audio tasks (like recording live guitars through software amp sims at low latency in a busy session) but it absolutely pommels at mixing and composing tasks that don't require super-low RTL figures under load. The 32GB Max variant will benefit a serious composer who wants all of the orchestral libraries and soft synths loaded at once, although all that GPU will be drastically underutilized in the same scenario.
The actual video to the sponsor sequence transitions are always smooth af ngl
I have the M1 Pro Max. First Apple computer I have owned. And I am nothing but impressed... Sure I could find something I don't like about it. But... I could show you a list of complaints with my last laptops that are far worse. How efficient it is does have a lot of value. My last laptop was $2,000 when i purchased it from Lenovo. And I needed a Go Pro for a project. realized the memory was full and it killed my laptop battery before it could get the footage off. Even Chrome would noticeably kill battery life. Having a laptop that is useless without a being plugged in sucks.
Anthony, you are nailing reviews recently! Your voice acting/narration is SO professional :) great stuff mate
Anthony = The full truth with no bs. Getting better and better every time.
Blender 3.0 now run natively on M1, so that could be a nice comparación.
Blender 3.1's metal support is very nice. I still don't think it beats out some of the higher end RTX cards, but it still performs very well, even in the alpha stages
Watching Anthony go from absolutely HATING being on camera to being so much more comfortable that he cracks me the eff up with an intro like that! Bravo Anthony! 👏 👏 👏 I almost spit out my coffee lol'ing at that. Great work.
Anthony is a wonderful personality and knows how to mix humour and information supremely well. Love his Mac content!
Cant wait to come back to this video 10 years from now.
Whats up future me :)
I love you
Anthony is such a smooth talker
So much of this is really optimization in code, for those of us that lived through the changes from carbon to coco to metal and from Motorola to PPC and then to Intel, one of the things that happened was after a giant change in architecture, over time as software gets updated the macs would get faster. Even Apples OS is still hitting rosetta. The review is still fair, but in a year the results from the same hardware will most likely be significantly different.
Steam drains the battery on my 16” Mac(M1 Max) faster than running Windows on Arm(Parallels) + ECAD(Altium) or Keysight ADS for EM field solving. Yeah… Just having the Steam launcher running, not even with a game going.
Oh well, i never intended to game on the Mac anyways since I have a gaming PC… but in terms of work, the Mac can do everything I need it to do in portable form factor, while maintaining all day battery life.
@@Cat-kp7rl but you CAN game on a mac, but I am surprised what I can push out of my OG 13" M1 Air
@@LiLBitsDK I was just backing up his point that unoptimized things can run really bad no matter the device. Like in my case, something as trivial as the Steam launcher
Exactly. That's another crazy thing about M1. It will just get faster as we get updates. Normally machines will be slower as they age since software gets more complicated.
@@LiLBitsDK Yep. I was running Diablo 3 wuthout issues. It heats my laptop from the sam e year like crazy
I'm interested if the laptops were connected to power. Also interested what the battery percentages would be at the end of the test with all laptops disconnected from power, and how hard the fans blew.
I think it's pretty clear that Macs run much better on battery power than most PCs. At least until the latest Intel and AMD chips are properly put to the test.
That is probably how apple got theirs to look so god in their comparisons... they unplugged the PCs...
@@petereriksson6760 lol u right. so I guess Apple actually makes laptops...while PCs are designed be lost in house fires..got it.
@@petereriksson6760 that's exactly what they've done and there's nothing wrong with that because laptops are meant to be use unplugged!
@@angeloangibeau5814 I disagree heavly. The point of laptops is portability, but that doesn't mean I will use them unplugged.
Battery life is good but not as important as Apple makes it out to be. It's not that important like phones.
When I am using my laptop for more than an hour, it's usually on a desk and almost all places I visit with a desk, they have an outlet.
The cinematography in this video is amazing.
Oh wow I was watching this video and I couldn’t help it but wonder if I’m the only one being shocked with this review! I respect and admire Anthony but I believe the software used for these tests was cherry picked and doesn’t show the full potential of the MacBook Pro! I came down to the comments and I’m shocked not a lot of people talk about this and then I saw someone mentioning a channel Max Tech talking about this review. I went to watch it and indeed I agree with them.
max tech? hahaha, that apple fanboy channel, just look through his content.
@@truthseeker6804 you don’t sound very objective yourself. If you actually looked around on other channels, the ones that are “pc fanboys”, also did more objective tests that are similar to max tech.
@@solidsn2011 ive watched some videos from other channels and the results were similar as here. can you post an objective video you think differs from here?
@@truthseeker6804 I found this review lacking and leaning, specifically in real world function. Bropants, actually hook up all the monitors, play the games that currently work, put it against more PCs, more price ranges, yadda. I thought they did better on previous M Chip reviews.
Meh, I can’t judge. I recently followed this channel because I thought it was actually called “Linux Tech Tips”.
*I know.*
I'm so glad these new macbooks have proper thermals rather than the "bluetooth heatsink" of models prior...
I know right. It honestly looks like the inside of my gaming laptop in there.
I think the focus with these computers is probably performance while being unplugged, which is something I really wish they had tested.
I know it’s not the same as a real thorough test, but most benchmarks agree that the M1 (any variant) run virtually identical both plugged and unplugged.
@@AlejandroLZuvic yeah, but intel laptops are not
You clearly didnt see the video until the end, lol
Absolutely, One of the main use cases for a laptop is while unplugged. The first test they should do is fully charged and unplugged performance testing, then while charging, and then when fully charged but plugged in. Large differences in performance can result in various situations, and only testing while plugged (or unplugged) can skew results to the testers desires. I think Apple's claims might be correct IF the laptop they were comparing against did poorly while unplugged, so Apple's results would look more impressive.
at around 90° C as show by Anthony's testing, i don't see unplugged rendering being viable.
not on the 14inch. if needed, yes on the 16in. best to stick with the Pro on the 14in, or if needed Max with the 24core GPU. the 32 core is voltage limited in the 14in.
I’m a software dev who also edit in Resolve and do some Blender on my spare time and went for an 10/16c M1 Pro with 32Gb RAM, I don’t regret it :
I don’t intent to game on it, Blender Metal support is coming, and ho boy that XCode build time is just fabulous ! That’s not just the extra CPU, faster SSD and mem bandwidth do make a huge difference, easily cut my build times in half.
picking an M1 Max was just wasting battery life for me, as the test shown in the video is the best case, the drop in daily workloads is more like 30%
how much difference do you think 32gb vs 16gb ram on that M1 Pro makes for Blender and Resolve? using 14" or 16"?
@@StefanUrkel a huge one! Used both on M1 16Gb, very decent performance but swap usage was way too high and memory pressure often in the yellow area.
32Gb is way way better!
So glad I got the Pro. From what I hear its the same story with AI compute (something I personally bought it for), as it uses the Neural Engine there is ZERO improvement moving up the Max. Its a shame they didn't double those cores. It will be interesting to see if the new Mac Minis will differ from the Macbooks or just use the same SoCs, as if they double the Neural Engine I will probably get one. Interestingly AI is where you only get about a doubling in performance going from M1 Pro to 5950X + 3080, so a bigger Neural Engine could really make up a ton of ground there.
Whats the battery life, performance and noise level of the Zephyrus when running 3D on battery though? A big reason I got the Pro was the performance to watt and noise level were terrific, even manually pushing the fans up so the SoC never goes far over 80C under heavy load.
What's hilarious is that if you read reviews of the Zephyrus it's constantly referred to as over priced and under powered 😂
Anthony, with that last ad transition one can only conclude that you have achieved your final form as a true artist. A poet!
0 dislikes! Great vid LTT👍👍👍
Starting to hate this joke. UA-cam should make the dislikes public again.
I like your tests and I am not an Apple Fanboy, but your results here are very different from most of the other Tech UA-cam channels that have tested these MacBooks
Careful, he might delete this comment.
which other tech channel results differ from this? post a real tech channel, not a fanboy channel. before you post make sure that channel does a variety of reviews not only praising apple products.
@@truthseeker6804 How about Matthew Moniz, he did an video and he is not biased
@@truthseeker6804 All that I've seen actually. Here are some
- ua-cam.com/video/g1EyoTu5AX4/v-deo.html M1 MAX - 32 CORE BENCHMARKS v RTX 3080 & 3090 Will Blow Your Mind!
- ua-cam.com/video/OMgCsvcMIaQ/v-deo.html M1 Max 32 Core GPU v 165W RTX 3080 Laptop
- ua-cam.com/video/JM27aT9qhZc/v-deo.html 16" MacBook Pro vs RTX 3080 Razer Blade - SORRY Nvidia..
- ua-cam.com/video/YX9ttJ0coe4/v-deo.html 16" M1 Max MacBook Pro vs. My $6000 PC
The list keeps going. These results have been out for a while too so LTT really have no excuses. They didn't use optimized software, they didn't compare laptop performance while on battery, they didn't use readily available GPU benchmarking software thats already proven to place the M1 Max around the 3080 level.
@@andremessado7659 so i watched the first video, and the m1 max actually lost to the laptop and desktop, in the chart in export times, but it did well in the timeline playback, thats literally the same as this video in the davinci resolve section at 5:28.
in the second video, the gaming laptop totally destroyed the m1 max on power, not on battery.
i skipped the third bias max tech apple fanboy channel video.
regarding the fourth video, the m1 max lost in all the charts except the 6k braw export, which is interesting because the first link you posted had a faster than the m1 max export speed on the gpu.
so in summary from the first, second and fourth video, the m1 max does best in video playback on a video editing timeline, but loses to 3080 or 3090 in video exporting, stabilization, rendering, benchmarks, everything else.
Honestly, i love every review Anthony does. His voice is like butter on a subwoofer
That's the first time I heard that someone is preferring the silver color. I also got a silver one and, looking around online, it seems like I'm way in the minority with that decision.
Loved the reference, in the intro, to the open air unboxings Linus made many years ago :))
[Looks at video][Slaps it]: This bad boy can include more than 3 ads!
Love sponsors.
I agree with what you say: M1 max is literally only for professional video editors, which is a super ultra niche market, for everyone else, it's not worth it.
I think it'd be more accurate to say media professionals and developers in general. It's absolutely fantastic for professional audio production and software development. Silent the vast majority of the time and can easily handle on-location and remote tasks with it's awesome battery life with full power whether plugged in or not. The high-impedance-capable headphone jack and best-sound-in-a-laptop ever doesn't hurt either. I think it's important to compare Apples to Apples here (pun intended). They're not designed for gamers, they are designed for professionals. As an equal Windows and MacOS user, my experience with these has been top-notch. For pros, Apple has hit a home-run here IMHO. Also, I think the power-per-watt here should not be ignored and I don't believe this was mentioned - add that factor to the benchmarks and you'd see some very different results. Energy costs money and affects the environment. And a hot, noisy laptop isn't particularly enjoyable to use day in and day out.
Super niche. Because let's face it, the m1 air can do 4k editing. How many editors need to edit 12 simultaneous 4k streams? Most youtube viewers don't even watch in 4k yet rofl. I really wish it performed better at 3d design.
@@wykananda for audio professionals most of them were good with an older generation macbook with high memory configuration tho. also for non video editing/audio professionals, macos is really really difficult to use. even more so with arm. basic stuff like a volume mixer and any sign of useful window management are absent out of the box. what is the point if you are spending such a premium to get a sub par experience with non video editing/audio professionals.
@@pupperemeritus9189 Hi pupper. I'm not sure I understand your comments. Sadly, the previous Macbook laptop generations were all limited to16gb of ram - so high-memory configs were simply not possible. Moving to the ARM architecture did not change the underlying operating system, MacOS, it simply made the laptop hardware run faster, smoother, quieter, and for much longer on a single battery charge. As for the difficult-to-use / sound control / window management - the latest Windows and MacOS are both more than reasonably user-friendly and well-equipped in all these areas - these OSs have both been around for many years and generations now and it shows. As a multi-OS power-user I could nit-pick plenty at both OSs here and there for sure though. However, in my experience, for the countless newbies that I've trained and continue to help, MacOS has to get the nod for getting productive and comfortable more quickly with less frustration and confusion and less problems over the long haul. Let's face it, both operating systems are DEEP. They're both very capable and stable at this stage but either will take time and effort to learn to get the most out of them. Curiously, my current "go to" Windows-based laptop is a 2015 Macbook Pro running Boot Camp - ironically, it's easily the best Windows laptop I've ever owned - cool, quiet, fast, stable, good battery life, well-built, expandable - and, of course, it runs MacOS like a champ too. I'll likely get another 3-4 good years out of it before I hand it down the line. IMO, the 2015 MBP was the best overall professional laptop ever made for Windows, MacOS, or Linux until now. While I can run the ARM version of Windows on the latest MBP via Parallels and so on, I'll have a new laptop King if-ever/when Microsoft Windows gets fully up to ARM speed and these new killer Macs can boot into it natively.
@@wykananda i appreciate your patient reply
All I know is on some particular photo editing workloads (specifically running a noise reduction filter in Topaz labs on a 45mp tiff from a Nikon D850, after editing in Lightroom and Photoshop), the M1 in the *Air* absolutely seems to crush my desktop Ryzen 3900x / 3070 combo. I couldn’t believe it, and it made me a little sick 😂
Can’t game on it, and the Ryzen / RTX combo is still an utter beast in other tasks, but for certain tasks these laptops are unbelievable value.
The x86/64 version of Lightroom is horribly bloated and has been in need of a re-write for the last 5+ years, using it was a nightmare on my i7 4800hq laptop as a student... Whereas my M1 Pro (base 14") rips through everything no problem, my 3700x and 2080 Super are fine but yeah LR is still slower than I think it should be...
Am I the only one who notices Anthony is nothing short from a professional at this moment? I mean his confidence is on another level with this video.
Max Tech already showed weeks ago that the 14” throttles the M1 Max chip compared to the 16” due to much smaller cooling.
Yup. Honestly no idea why it took LTT so long to get these videos out. All this information is widely known by now. Seems like a huge miss on their part for being so late to the game on these. If they didn't receive the products in time then sure thats fine but it's also LTT.... Surely they could of worked it out.
As time goes, I start realizing that Max Tech tends to only or mostly show the advantages of M1*. Have to watch other channels to find out for instance about the only 90%ish Adobe RGB (this is bad for Photoshop semi/professional editing) of this screen and the very slow screen response times (35-100 ms) - find Hardware Unboxed for these 2 - ua-cam.com/video/p2xo-hDCgZE/v-deo.html . Or what it is here.
Max Tech is an annoying surface level dweeb that only posts reviews to get clicks.
@@ContraVsGigi AdobeRGB? lol, a lot of professionals don't need or want 100% AdobeRGB coverage because they're working in an SRGB or Displayp3 workspace. Non-issue. 90% is actually a very good result for AdobeRGB anyway.
@@DriveCancelDC Personally not a fan of the guy or his channel but i'll give him credit for shitting out a buttload of videos when the M1 Pro and Max Dropped. He was on it from day 1. It's been almost 2 months and Linus is only just putting out a video now? I expected better honestly.
One thing not mentioned when doing the benchmarks, how do all the laptops ( MacBooks and Zephyrus) perform while only on battery. Yes battery usage length is great, but how is the horsepower of the cpu/GPU effected running apps while on battery. I think some surprises might arise.
8:57 To make matters worse for Mario Sunshine, the starting level is the easiest level to run on lower end hardware. So the fact that it hovered around 70s and 60s, that's not looking for the m1 max. However it may just be due to the rendering API being wonky on Macs
I am playing zelda on this thing and it runs great. but man, comparing apple to windows on games, it's like comparing mercedes and a toyota lmao.
Benchmarks in C4D/redshift don't tell the full story. You need to go into redshift's render settings and manually increase bucket size to 256/512, then you'll see a 25%+ improvement in render times.
Interesting! Thanks for the info!
I bought the 16in m1 max 32gb. It's a beast, huge improvements from the lacking macbooks from years prior.
Anthony got that point right: apple is designing their hardware specifically for video editing, with everything else left behind. For me, the M1 was shiny when nothing was installed on Mac, but after loading programs and keeping stuff running in the background, M1 isn’t that shiny and amazing anymore. Battery life is still great but far from what apple exaggerates. Given the slow rollout of M1 optimised software, M1 optimised workflow won’t be seen for quite a while.
I’d really love a software dev take on this. For my use case fast cpu, good battery life and 64gb of ram are compelling - but are distinctly not video rendering.
Developer here, I wouldn't buy any of these besides the base-level MacBook non-pro. You can literally code on a Raspberry, unless you're compiling something crazy-complex like an entire browser you're not going to feel the difference, so why pay extra for literally nothing? A USB-A port would have been a compelling addition, but oh well.
Other developer here. Never found myself desperate for a usb A port while developing but have definitely found a use for better cpu and ram. Not sure what serious developers are developing on trash hardware tbh.
@@JackiePrime Web, for example. I don't develop on trash hardware because I can afford better equipment, but if I still had my old FX-8320 it wouldn't slow me down in any way. Peripherals are way more important at that point.
Also, every single hardware debugger uses USB-A, and even if you just want hobbyist stuff have fun hunting down a USB mini-B (not micro-B) to USB-C cable just because you can't use the included mini-B to A.
But it does make sense, if you only develop for iOS (which is literally the only reason I've ever considered buying a Mac) then you won't run into any of those issues, and Xcode being a hot mess does necessitate a faster CPU and more RAM. But there's a lot more to development than just Apple's walled garden, and if you step out of it it's a lot more important to be able to mess with any device you want to.
Also a developer here, gpu on the max is absolutely useless and 64 gb of ram is overkill for my line of work. 32 gb ram and 10 core pro is plenty plan to keep for about 4 to 5 years.
Another Developer here, I have the M1 Max with 64GB, 32GPU and 1TB SSD. While this setup is overkill, first I can afford it and feels good not having to worry about performance while working. On the technical side, running Webstorm, and other ides, multiple node apps, multiple docker containers, electron apps that suck like Slack etc takes a toll on any computer. If you can afford it, specially since software engineering is a well paid job, plus the resell value down the line, why not?
I was just auditioned for an animation job, I was put on a last gen Intel iMac, fired up Blender and put a normal map on one surface in the scene and the GPU almost caught fire and the whole macOS GUI dropped to 0.5fps, I'm not sh1tting you!!!
I honestly feel like we're witnessing the makings of a legend. This guy's reviews are legit!
Anthony’s an OG at LTT for forever - great to see him getting screen time and having a bit more confidence presenting.
Seriously I think he's my fav LTT presenter now, even beating out Linus. lol
this test is pure crap. They should be sued by Apple for misinformation and lies.
Normally I completely agree. Seems very skewed and that the apps selected were designed to show this in a poor light. Was it purposefully? Guess time will tell, but I believe this video will not age well. However PC fans will point this this sole video as why the new MacBook Pros suck, despite an overwhelming number of other reviewers showing the performance in a different light, several that are also typical PC reviewers.
@@BootStrapTurnerVideography you mean max tech right? A guy who literally said that M1 max MacBook is just as fast as 5950x desktop with RTX 3090.
Yeah that guy is totally not biased at all.
I think what Anthony wanted to point out here is that those apple marketing slides for M1 max were very very misleading.
Hope they get eGPUs up and running for the M1 chip soon. Imagine the possibilities.
God I love this guy as a linus replacement when he's gone/too exhausted to do videos/building his sound setup..
So much for faster than a 3060, lol.
The media engine seems to be giving the GPU the illusion of more performance than it's actually capable of.
Yeah, it's all smoke and mirrors. People that actually believed these claims..I mean..first time? 🤣
@@shibbychingching4845 I mean no where did I hear apple say it was a gaming machine. The illusion is what you keep telling yourself.
@@bear2507 the illusion is that apple claimed the performance is about the same as an rtx 3080, not just M1 barely beaten the rtx 3060 its not even close to rtx 3080 and i mean mobile rtx gpu, an rtx is a GAMING gpu, so when they made these claim people will think about its performance for gaming obviously, should have compared it to a profesional gpu like quadro instead of being either brave or stupid to compare them to rtx
@@foxley95 m1 don't have the capability to beat 1% of quadro GPU in 3d task
@@foxley95 yeah, i’ll go tell my research lab to shut down our datacenter with hundreds of 3080s, because some kid on youtube said these gpus are for games only and not generic compute. comments are full of children who have never touched anything ouside minecraft, but have an opinion on everything hahah
honestly anthony is the best and honest reviewer I guess. his dialogues are easy to belive and facts are legitimate. As for "maxtech"review channel they just support apple.
I love how you had to change the graphs to accomodate the intel/nvidia notebook
Hey Anthony, WoW is one of the few M1-native games out there. Would love to see the resolution cranked so we can get comparisons with PC GPUs
Good review. It’s too bad the intel PC laptops tank hard when unplugged. At that point it might as well not even be a laptop.
facts!
I don't know about you, but I use my laptop either at home, at work, or at coffeeshops. Everywhere there is a chair and a desk, usually there is an outlet.
I hope this man has his own office. Love his reporting style
I think the M1 series performance is excellent, especially for a first generation silicon. They are just so efficient and so fast. I do think software support still needs to catch up a bit, but that's a given. As time moves foward and with newer M2/M3 on the horizon I think we'll see some great things from these chips. The future is promising
I think software support will be very slow to catch up, Rosetta technically already brings over most of the optimizations a dev will make, anything else is libraries support like .Net. Looking at some .Net 6 benchmarks (where MS has stated they've done a ton of performance improvements), M1 should match 4700U about if it's a lot of cache eviction happening, and a little bit behind else.
We're at the point where if it is a lower power task, the M1 will be better, x86 just has that minimum instruction set it has to load no matter the task, anything above 25W that advantage no longer exists and then Intel/Ryzen start beating it handily; also helps Apple is on TSMC 5nm, they've been great at investing there and keeping it as an advantage, AMD is only set to go 5nm end of 2022, while Apple might be on 3nm by that time, will be a very interesting year next year in tech.
@@ezicarus8216 K.
@@lcarsos That doesn’t matter. It’s one thing to make a phone SOC. It’s an entirely different thing to make a ln SOC that scales up to desktop and prosumer performance while maintaining excellent power management. This is an entirely new beast
No one who is gaming will pick m1 max over 3080
@@ezicarus8216 apple is better for a lot of stuff but gaming definitely not
Andy's on screen acting has improved so much. Proud of you bro
What about software compilation, data analysis and heavy crunching like that? Can you 🙏 test compiling Linux or something similar workflow for the 16” review? Pretty please 🥺 It’s a lot more relevant for someone like me
Andy is such a great addition to the team for these videos.
The opening sequence just blew me away man...
After rewatching this review, I went ahead and bought the base model of the 14 inch m1 pro. I will be doing more cpu than gpu heavy work but I didn't think the 2 extra cores was worth the money
could you guys test war thunder when you do the 16 inch? it supports metal. is usually HEAVILY single core limited. and is quite interesting to test. pro tip. the build in benchmark has different results than using test drive.
you guys really need to do code compilation tests, that is honestly all that I'm interested in.
Was the Zephyrus M16 tested on battery or connected to power supply? Curious if there’s a performance difference on battery like on my G15, for me the consistency across both looks like it might be the biggest draw of the new Macbooks
It's most likely out of the direct power and with fans on full force (usually there is a high performance mode for these laptops)
But this also boils down to the question - what are you more focused on? Aesthetics for other's benefits and tertiary factors or on your work?
Anthony, you tested Rogue Rogue Squadron III on a build from just before MMU support was added to JitArm64 :(
That means it's likely running a lot slower than intended.
It doesn't really matter though. You can't even emulate those games at full speed all of the time on the most overkill PC setup money can buy. It just makes very heavy use of pretty much any trick you can pull off with the gamecube's hardware which is just not very comparable to current day hardware at all.
@@MLWJ1993 No? Modern x86_64 CPUs have no problem emulating Rogue Squadraon III.
It's just that for any ARM64 chip, the Dolphin devs hadn't implemented accelerated MMU emulation until very recently, which made any ARM64 chip very slow in that game, even the M1.
@@neutronpcxt372 Pretty sure you run into slowdowns in all those games in transitions. It's not unplayable slow, but definitely not full speed everywhere. That's why the forums are full of people asking if what they see is expected behaviour for their overkill hardware or devs answering no hardware currently is capable to run the game "smoothly" when that's a specifically provided requirement.
Absolutely MAD MAX intro! And that was the smoothest and most hilarious segue ever to a sponsor! Well done haha