The ARM chip race is getting wild… Apple M4 unveiled
- Published May 19, 2024
- Apple just released their latest M4 chip. Let's take a look at its performance specs and AI neural engine features. Compare the M4 to Snapdragon and find out why big tech companies are competing to build the best ARM chips.
#computerscience #ai #thecodereport
💬 Chat with Me on Discord
/ discord
🔗 Resources
Apple M4 www.apple.com/newsroom/2024/0...
CPU vs GPU • CPU vs GPU vs TPU vs D...
CUDA in 100 Seconds • Nvidia CUDA in 100 Sec...
🔥 Get More Content - Upgrade to PRO
Upgrade at fireship.io/pro
Use code YT25 for 25% off PRO access
🎨 My Editor Settings
- Atom One Dark
- vscode-icons
- Fira Code Font
🔖 Topics Covered
- Fastest PC chips
- ARM vs x86
- RISC vs CISC
- What is the best chip architecture for AI
- Running AI models on device
- What is the M4 chip?
- Is x86 dead? - Science & Technology
3:02 why tf this guy is holding his phone upside down
cause hes me
That clip is definitely ai generated
Its AI, look at the fingers
AI generated 😂
He thinks different
Where does this dude get his stock footage? 😂
Tenor, Giphy and more. Probably easy to find if you just search their expression on Google
YouTube
meta-llama
AI is cheaper than stock footage.
And who decided to make stock footage of a couple of guys in suits fighting each other with gardening tools?
people during cold war: The arms race will be wild
the ARMs race now
expectation : America and the gang vs Russia and the gang
reality : American vs another American while Taiwan supporting everyone
The hot wars
It's like that translator joke with Les armes
Apple has basically had the lead for so many years, i doubt they will lose it
@@Lenfer-hp3ic Taiwan is literally just 80 miles from China, so without America, Taiwan is f'kd in a few hours.
M4 on iPadOS is like installing thruster engines on a horse with no legs
Our new racing horse can achieve speeds of up to Mach 4!**
**if it had legs and could actually survive running at those speeds
38 TOPS, ray-tracing, but still can't run macOS fucking lol
@@civilroman still don't have file manager with file system access!
What even are you gonna do with all that power? Makeshift oven? AI generate all your art?
That cracked me up real good.
Apple car charging was hilarious.
It's not an ARMs race, it's just hotwheels with a hot charge, USB 4.XXL
it wasnt
@@Moli05 dude, Apple is nothing, ever since jobs died.
Now it's just another brand, take it easy. 🤡🤡
@@Moli05
Apparently you’ve never seen an apple magic mouse
😂
singularity blaster turbo pro max, i knew it all along
😢
It's the future
Honestly, I would buy it just because of the name (I'm broke)
woah there buddy you forgot the X
I'm waiting for the Super V2
Idk what to say but I had tacos for dinner and it was a banger. Destroyed my toilet in the process but it was worth it in at the end of the day.
nice
no pain no game reversed
Deep , definitely an interesting topic
I had quesadillas + a burrito-style salad and I am currently in the same process you were in
Are you lactose intolerant or something? I don't get people who say that tacos make them poop
It can run a LLM on the laptop but you still can't replace the battery.
Those two are irrelevant
you can, what you smoke?
..,,::;;!!!????@@gigamoment
@@weiss588 shhhh, he's trying to make the lulz.
I can run an LLM on my S20 FE, that isn't special, people have run LLMs on hacked Switches.
I legit thought they made apple version of the M4 rifle
now i really wanna see a rifle made by Apple
@@planetrenox featuring the thinnest scope bezels apple has ever made!
it's not far-fetched, Samsung makes fridges, tanks, etc.
its an M4 rifle pointed at the snapdragon
I'm waiting until the Apple M16 comes out.
Ipad Hardware: 😎
Ipad OS: 🤡
How bad is it?
@@PapiDey-dv4gw it's bad out here man
@@PapiDey-dv4gw IpadOS is like a nerfed MacOS. It's better than iOS but worse than MacOS.
Apple refuses to put macOS on it, I guess because they would have to kill iPadOS. And the lack of supported apps makes you ask the question: why not buy a MacBook instead?
@@wlockuz4467 My guess is MacOS is not on iPad because MacOS allows apps outside of App Store, so there's less revenue.
@@wlockuz4467 they don't put MacOS because people wouldn't buy MacBooks anymore...
The car got me 😂
should've stayed out of the road
@@iverssonkaufert7619 lmao. looks like the car got him again.
timestamp?
@@subhashgottumukkala 0:11
Can't wait to buy myself the new Macbook Singularity Blaster Turbo Pro Max (with extra storage of course)
is going to have 4 GB of unified memory and each 4GB is going to cost $400.
@@segiraldovi Sounds like a deal to meeeee (this message is sponsored by apple)
I already prepreordered mine.
But I already sold my kidneys and liver
Sounds like super soaker water
The Grindr reference and Apple car were wild. 10/10
Why is he starting to sound more like a person and not an AI
because AI gets better
Because he's using AI
He did previously mention that his AI sounds more human than him, so...
😅😅😅😅😅😅😅
AI's getting better :)
The M4 seems to be the fastest CPU chip on the market for running AI. You'd still want a dedicated GPU if you can get one. While the M4 gets near 40 TOPS, the 4090 gets around 83 TFLOPS (trillions of floating point operations per second) at around 32 bit precision. I think the M4 is rumored to be at 8-bit precision for their current TOPS quote. No confirmation on the 8-bit rumor though, so it's good to be skeptical of that.
I mean it's quite obvious that Nvidia's chips are go to for AI related stuff, but it's still impressive what performance M chips can achieve with how little power they need. Sadly, they are OS exclusive.
@@mikadofxx9030 They aren't. Apple has specifically made sure that third-party OSs can run and can do so safely and without sabotaging the system's security model
The drivers and some other important bits of code being private is of course a big obstacle, but it's one that can be, and has been, defeated, as the Asahi Linux team has proven.
To add, Apple themselves encouraged Microsoft to port Windows to Apple silicon... why? the answer is that macOS makes them practically no money. Apple is a hardware company.
@@U20E0, sources? Note, links may not work. So, just name the sources.
@@CarlBach-ol9zb Apple's documentation on the boot process as well as documents from the Asahi Linux project
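The raw-throughput comparison in this thread is just arithmetic. A minimal sketch, where all figures are assumptions taken from the comments above (38 TOPS at a rumored INT8 for the M4's NPU, ~83 TFLOPS FP32 for an RTX 4090), not vendor-verified numbers:

```python
# Back-of-envelope using the thread's figures (assumptions, not vendor data).
M4_OPS = 38e12        # M4 NPU, ops per second (rumored INT8 precision)
RTX4090_OPS = 83e12   # RTX 4090, floating point ops per second (FP32)

ratio = RTX4090_OPS / M4_OPS
# ~2.2x more raw ops/sec for the 4090, and those ops are 32-bit floats
# rather than 8-bit integers, so the gap in comparable work is larger
# than the headline numbers alone suggest.
print(f"{ratio:.1f}x")
```

Since the precisions differ, raw ops/sec actually undersells the 4090's lead, which is why the thread's skepticism about the 8-bit rumor matters.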
1:13 that's actually a myth. While it was true several years ago that x86 was CISC and ARM was RISC, and that was one of the reasons the latter was more energy efficient, nowadays the latest and most advanced iterations of x86 and ARM have blurred the line between CISC and RISC. x86 processors now use a decoder to translate their complex CISC instructions into a RISC-like form internally, and ARM has introduced a lot of CISC-like extensions over the years. The actual reason x86 is less efficient [citation needed] is that it needs to support a lot of 30+ year old legacy code, though Intel has recently proposed a specification called x86-S which aims to remove a lot of legacy cruft from their processors while still maintaining reasonable backwards compatibility.
Wrong! It's exactly this CISC decoder unit which consumes a lot of energy.
Furthermore, there was the famous IA64 experiment, which failed.
Seen even more broadly, you have to distinguish between software that runs for companies versus for private consumers. They have different development life cycles and budgets. There is a reason COBOL exists to this day. I don't know of any large migration project after Y2K. Usually software stalls in further development if its code is 1) unreadable or 2) its test coverage is bad or low.
@@javastream5015 the decoder is only about 5% of die area in modern CPU cores. Most transistors and energy are spent on branch prediction, handling out-of-order execution, and cache.
@@javastream5015 OK, I get the point about CISC-to-RISC decoding adding overhead, but why bring up Itanium?
Yeah, it failed. But so did all the other times Intel tried to "replace" their x86-based cores.
From what I understand, x86S is about removing/deprecating old 16- and 32-bit instructions. In other words, making x86 more focused on x64 and beyond. That is not an attempt to replace the current architecture, just a revision.
@@MichaelDeHaven Back to x86S: yes, it can be done, because Apple did the same with the M1. Apple removed support for the 32-bit API even one OS version before. 👉A lot of old games (like Civilization 4) stopped working.
To run old software, emulators are needed. Then you have stupid companies like Parallels who didn't want to build an x86 emulator into Parallels Desktop so that old x86 (and x64) virtual machines could still be used. ☹️However, as I have read now, Windows on ARM supports emulation of 32-bit x86 software (on 64-bit ARM). 👍
👉Long story short: it could work technically. But you would make a lot of owners of legacy applications angry without proper emulation support. Just look at what Apple did.
@@MichaelDeHaven x86S isn't really about deprecating instructions (as that could be devastating for compatibility with older software - Intel's proposal would require changes only from operating systems), but about deprecating unused features. For example, x86 chips first boot in 16-bit mode, then they can be switched to 32-bit and then finally to 64-bit, adding a lot of unnecessary complexity. There's also the entire real and protected mode ordeal, together with the unused execution rings 1 and 2, and finally some things about the segmentation model.
As for the entire RISC and CISC problem on x86, in my opinion, the decoder and all the execution units (ALU, FPU, ...) still have to be prepared to deal with a lot of instructions. x86 contains more than 186 opcodes for just the ADD instruction, having the option to provide an immediate value or an address from memory, each of those for 8-, 16-, 32- and 64-bit values. Adding a MOV instruction just before that and supporting only the registers would have caused a huge reduction in the opcode count, but it's a little too late for that. It shows that x86 was designed to be programmed directly in assembly and to offload more complexity to the hardware rather than the software, which is just not how we use computers today, with many abstractions above the hardware.
Intel's official documentation for x86 in its current form is exactly 5060 portrait Letter pages full of really well-constructed and well-written text (one of the best pieces of documentation I've ever read), and yet it's still incredibly big. Sometimes less is more.
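The decoder argument in this thread can be illustrated with a toy model: a modern "x86" front end expands complex CISC instructions into simpler RISC-like micro-ops before execution. The instruction names and micro-op splits below are purely illustrative inventions, not real x86 encodings:

```python
# Toy sketch of a CISC-to-micro-op decoder. Everything in this table is
# illustrative; real x86 decoding is vastly more complex.
DECODE_TABLE = {
    # complex instruction -> internal RISC-like micro-ops
    "ADD reg, [mem]": ["LOAD tmp, [mem]", "ADD reg, tmp"],
    "PUSH reg":       ["SUB sp, 8", "STORE [sp], reg"],
    "MOV reg, reg2":  ["MOV reg, reg2"],  # simple ops map 1:1
}

def decode(instruction):
    """Expand one CISC-style instruction into its micro-op sequence."""
    return DECODE_TABLE[instruction]

for insn in DECODE_TABLE:
    print(insn, "->", " ; ".join(decode(insn)))
```

The point both sides of the thread agree on is that this translation layer exists; they disagree only on how much energy and die area it costs.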
This is just moving from one proprietary architecture to another. WHERE MY RISC-V SQUAD AT?
They're not here because they're having issues compiling their web browser for RISC-V
well at least there won't be monopoly in this arms race
@@TheDanielLivingston I laughed entirely too much at your comment
Well I guess I'll have to throw away my M2 MacBook Pro I bought a year ago /s
Yeah Apple’s gotta stop
O B S O L E T E 😂
You could throw it in my bin though
@@veenmikki27 Nah this is the fault of the person throwing it out
From my experience old Macs (and almost every higher end computer and phone for that part) work perfectly fine multiple years into the future
@@IDKisReal2401 True. I would have been watching this on my 2017 MBP if it weren't for the fact that I accidentally dropped and broke it a few weeks ago. And that's considering those models were criticised for keyboard, battery, and fan problems. Still sucks a bit that they announced the M4 when I just bought an M3 MBA as a replacement.
Why is the dude at 3:03 using his phone upside down?
Some old phones were built like that. I have one at home
Remember the time when Samsung phones had the charger port and headphone jack on top? Because I do.
@@Favour.A.Emmason-pv1mk yeah but where is the camera and why are the speakers on top then lol
it's ai generated stock videos
@@Favour.A.Emmason-pv1mk i think i would prefer that
Impressive breakdown of the technology war between major players in the ARM chip market. I find it particularly fascinating that despite the CPU advancements, x86 still holds strong in infrastructure due to its age and versatility.
"x86" CPUs have not been "x86" internally for a long time, it's just a layer on the outside. You should listen to CPU architect Jim Keller who says that x86 is not really a substantial disadvantage.
Case in point, AMD is going to release the Zen 5-based Strix Point and Strix Point Halo, which to me seem targeted at Apple's "M" SoCs.
@@swdev245 yeah, it's not the architecture, it's just Intel's and AMD's disregard of mobile chips. Now they're scrambling to deliver much more efficient chips with Strix and Arrow Lake.
Was it an AI generated comment?
@@BernardoLeon yeah looks like it, click on his profile and see his comment history
@@BernardoLeon Which comment do you mean? I wrote mine with natural stupidity ;)
Arm out there playing 4D chess selling shovel blueprints
1:45 actually Microsoft was trying to push ARM / equivalent battery-efficient stuff before this, but nobody (Qualcomm) took it seriously
Until Apple silicon
This. Everyone forgot the first Windows tablet (Windows XP Tablet PC Edition, anyone?), and shat bricks when the iPad launched. Same with ARM (Windows RT, anyone?): no one wanted to hop on the ARM train back then. Some tech needs Apple's hype train.
I have a windows on snapdragon laptop and its great. I do work for Qualcomm as a software engineer though, so I may be biased.
Common Microsoft L
i like your profile picture
@@TheDeveloperGuy I heard that Windows RT sucked ass and maybe that's why nobody really wanted to jump ship. It also had no x86 compatibility layer (and the existing software library is the main selling point of Windows these days anyway). Without Rosetta functioning as well as it did, I don't think the jump would have been such a no-brainer for many people as it was.
ARM was introduced by Acorn Computers, eventually renamed Acorn RISC Machines, as a processor for their desktop computers.
They found out it was extremely power efficient in its original design. It also happened to outperform products based on the 68000 series at the same clock speeds. And I mean it ran circles around them. Prior to RISC architectures, CPUs measured performance in clocks per instruction, not instructions per clock. This was the whole motivation for RISC. RISC architectures introduced the first CPUs that executed 1 instruction per clock. This was a massive improvement over legacy CISC CPUs.
underrated comment
bump
Also, it kept running without the power supply for a while.
@@brunesi the first ARM processor, when showcased to the BBC, was even unplugged from power. The current leaking back from the monitor connected to the motherboard was enough to run it 😂
@@brunesi not without power supply. But it was able to run from input diode pin leakage current. Still impressive though.
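The clocks-per-instruction vs instructions-per-clock point above is simple arithmetic. A sketch with made-up numbers (two hypothetical 8 MHz cores, one averaging 4 clocks per instruction, one sustaining 1 instruction per clock):

```python
# Illustrative only: the 8 MHz clock and the CPI values are invented to
# show why CPI=1 was such a leap, not measurements of real chips.
def mips(clock_hz, cpi):
    """Millions of instructions per second = clock rate / clocks-per-instruction."""
    return clock_hz / cpi / 1e6

cisc = mips(8e6, 4)   # older CISC-style core: 4 clocks per instruction
risc = mips(8e6, 1)   # early RISC-style core: 1 instruction per clock
print(f"CISC: {cisc:.0f} MIPS, RISC: {risc:.0f} MIPS at the same clock")
```

At equal clock speed, the CPI=1 core does 4x the instruction throughput, which is the "ran circles around them" effect described above.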
2:58
I didn't know I needed this piece of stock footage in my life. Thank you
THE DRAGON FROM ULTIMA ONLINE! subscribed
~flashbacks of farming gazers with tamed dragons...or of stealing peoples shit from their bags with a thief. Or fighting them, disarming them and then stealing the weapon from their bag. Still one of the funniest things I remember doing in a game.
Vas flam
@@hmthatsniceiguess2828 Or being chased by people wanting to kill you only to provoke nearby dragons and wisps to attack and chase them. It really was a great game
@@johannesandersson9477_”Cor Por”_
The standard greeting in UO before 99% of the population moved to Trammel.
im just here for the UO dragons
1:30 the grindr joke was peak
I was hoping someone said something. Genius!
I didn't get it, what was the joke? The app?
@@lucasjritter I don't think there was one. Just the fact that Grindr exists is hilarious apparently 🤔
@@BrysonThill gay people are really funny. Thats probably why theyre so happy
@@lucasjritter The very fact that Grinder was the featured app was the joke. As a gay man, this joke gave me all the good feels.
Love these videos. Wish all news was delivered with this short format, high quality, dry and clever humour!
Thanks for this channel dude, it’s super valuable to stay up to date on all this stuff which is moving so fast
Surprised you skipped a llama 3 code report. But thank you for this!
why get a measly 38 trillion when you can get 45 TRILLION
I had to check that figure. Amazing! It only takes an electron 3 billionths of a second to do one orbit. How can they do 10,000 operations in the time it takes an electron to do an orbit?
Awesome....the idea of the 'The Code Report' is actually fantastic...keep up with the work Fireship...❤❤
The Ultima Online Dragon was perfect. Thank you for the throwback :D
1:30 boyyy lmaoooo
I hope to shadow jesus that Qualcomm doesn't screw it up
not qualcomm, but windows with their shit os
@@weiss588 I don't care. If not Windows, what will you do with the CPU? Either they do it with Windows, or at least with Android and Linux.
there is always Linux to save the day, but Windows has the corporate user OS, so I think there isn't going to be a real difference.
@@omarjimenezromero3463 True. Corporate users don't care about ARM chips though. All they need is performance, not efficiency.
Fireship video on my bday, thank you :D
Please what was that grindr jumpscare i'm dying- 😭
Congrats on 3M subscribers 🎉
When can we start using quantum dots for computing
subbed after watching this vid...perfect blend of humor and infotainment!
Thank you for the update on the *Singularity Blaster Turbo Pro Max,* I've been eagerly awaiting its release
3:04 I was physically hurt by this
We early today boys! Imagine a future M420 chip able to mine 21 bitcoins per nanosecond while also generating 1000 LLMs/second
Ah yes, making bitcoin worthless, fun.
Why llms per second just go with nano like the other one or go even lower, picosecond
All that computing power and the first thing you think about is bitcoin
@@ctrl_x1770 the first thing he thinks about is probably milf videos
Haha love this comment 😂
idk how you came up with the UO dragon for transitions, but it's a classy touch nonetheless
I'm not sure if I enjoyed the comedic value more or the actual information, but it's a perfect combo👌
Siri should get a rename, just distance themselves from that entirely.
Got to say, there’s no flex like running dolphin Mixtral on my M1 Max locally with better response times than the web interface of ChatGPT. Their chips are absolutely nuts and if the M4 is focussed on AI workloads I dread to think what we’ll be able to do on any device in three years.
At what point is your reduced instruction set called on to do all the instructions anyway???
if use outer dynamic as speaker, you can hear better. very common in my village
Shout out to the Ultima dragon reference
yea, seeing that was a blast from the past!
Hope they haven't included any unpatchable security vulnerabilities this time round. I just love security flaws baked into the silicon
this would be a banger as a reel, sometimes I love to share these on Instagram and I can't
2:53 just a note, Apple measures TOPS differently than Qualcomm, so the M4 from what I've heard is close to 70 TOPS when converted to Qualcomm's format
That's correct. Most people don't understand this. Qualcomm's 45 TOPS number is based on INT4 precision. That's not considered accurate enough for most use cases. Apple's numbers for the M4 are based on INT8. In the past, they've used numbers based on FP16. The point being, TOPS is a meaningless metric unless you know exactly what you are comparing. Either way, Qualcomm's X Elite NPU is clearly LESS powerful than the M4's NPU.
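A hedged sketch of the normalization being described. The rule of thumb that halving precision roughly doubles throughput on the same silicon is an assumption, as are the 45-at-INT4 and 38-at-INT8 figures, which come from this thread rather than from vendor spec sheets:

```python
# Rule-of-thumb rescaling only; real NPUs don't scale exactly linearly
# with precision, so treat the outputs as rough comparisons.
def normalize_tops(tops, quoted_bits, target_bits):
    """Rescale a TOPS figure from one precision to another."""
    return tops * quoted_bits / target_bits

# Qualcomm's 45 TOPS quoted at INT4, expressed at INT8:
print(normalize_tops(45, 4, 8))   # 22.5
# Apple's 38 TOPS quoted at INT8 is unchanged at INT8:
print(normalize_tops(38, 8, 8))   # 38.0
```

By the same rule, 38 TOPS at INT8 would be ~76 at INT4, roughly matching the "close to 70 TOPS in Qualcomm's format" claim above.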
1:03 NOT THAT DUDE SPECIFICALLY 😭😭😭
On device AI is great. Today I tried some Whisper audio dictation on my MacBook and it worked great while on a similarly priced Windows laptop it took more than 10 times as long and was unusable. Very soon having a good AI/Neural chip will become essential in almost every device and Apple are definitely ahead at the moment.
Are the newer ones released already or you using a 2023 model?
Seems Google takes the middle ground and submits it to the cloud, while taking a little longer, it comes back on the first try as well...
Classic video take as usual, cheers! My take: the M4-on-iPad launch establishes the base-level high-performance option. In June we will see M4 Pro and M4 Ultra offerings that exceed Snapdragon's TOPS and, given it's a developer conference, expanded software support for key AI frameworks that finally challenges the CUDA numbers people get with Nvidia. Of course, Nvidia is shortly to announce its new RTX 5090, and then the numbers will be challenged again.
singularity turbo max at the end cracked me up! 😂😂😂
ARM started out as RISC but is no longer RISC - has nearly as many instructions as x86.
this hardware thing is getting wild ngl
I was going to say, “now if only software would evolve so much”, but with AI like these, who needs software?
fr fr
@@TheScottShepard isn't AI itself software?
yea seems like we haven't hit the peak of the curve yet 😀
Project Lavender, just another advancement we can all learn from.
Best exit to your video in a while and tbh you do great exits, so that's saying something...."Mac turbo pro blaster...thingimagigy....'... awesome!
Congrats on 3M subscribers bro🖤
Bro that ultima online dragon
A real OG 🫡
wake up honey, another upload this week
Did not expect to see an Ultima Online dragon!
FYI there is no correlation between RISC and fewer transistors per instruction. E.g. Apple's M1 high-performance cores were more complex than the competitors from Intel or AMD
ARM does design CPUs and also licenses out the architecture. Most companies use ARM's off-the-shelf designs, whereas Apple and now Qualcomm design their own CPUs using the ARM ISA.
RISC-V is also another x86 killer :p
Glad that someone mentioned risc-v lol
Bro who gives a FAAK if your gadget has Ai in the thing ?
Just love your humor and wit, ma man!
Even though the new pencil is really expensive, its new features are bangers for artists. I haven't bought an iPad yet, but I will probably be buying an M4 iPad for the nano-texture screen along with Procreate and the new Apple Pencil Pro, as it's amazing for animation with the new Procreate Dreams
Apple debuting their new processor on an iPad is like Lamborghini unveiling their newest Supercar in Los Angeles traffic
Buhu dude
The ipad is a great device. Try using it once, you will see
@@annoswet1576 while the iPad may be a good device, Apple is the worst and most scummy company on the planet, so no doubt the M4 debuted on the iPad because it isn't as good as they say it is, as usual
@@annoswet1576no 😂
Also, do not forget that x86 is inefficient because there are a lot of legacy processes that need to be there for backwards compatibility, while ARM doesn't have any of that baggage. The new Intel and AMD processors are starting to ditch those old instruction sets and are becoming faster and more efficient than ARM, at least in tests. The only real downfall of x86 is that only Intel and AMD can make it.
Most old, unused instructions are just emulated in microcode; nobody will notice that some software written for the i486 30 years ago runs 10x slower. They only add complexity to the decoder, which uses only about 5% of a core's transistors anyway.
No they aren't; x86's problems have nothing to do with the old instruction set or backwards compatibility. So it would be cool if you stopped spreading lies, because you are clearly incompetent on the matter.
@@randomrandom7208he isn't wrong though
@@randomrandom7208 r u telling tat to urself
Best summary of the big tech race to date. period.
My only question is what does that mean for games? Or will it be fine as long as the OSes run properly?
Make a video on how to choose a language for different purposes
he did
People were saying the video is not about AI. Well ...
The red dragon from ultima online was brilliant
Congrats on 3M subs🎉
We need more TOPS!!!
says every bottom ever
Ghz?
I sleep
TOPS?
NOW THAT'S WHAT WE ARE WAITING FOR BABY.
Really tops those charts
I’ll see myself out
She ghz on my tops till' i M4
I wish I was topsed
GHz development stopped 15-20 years ago.
If only applicable to NPU operations per second then ok meh...
Epic ending. Kudos!
3:00 holding phone upside down do be an interesting choice
There are so many factual inaccuracies in this video
Ha! But you are not going to tell us 🤣😂
which ones?
@@chiluco2000 On my first watch I caught these errors. (1) The quoted article claiming x86 is dead is 90% wrong; only the last paragraph is even remotely correct. (2) The structure of x86 is not based on transistor count but on the number of instructions the CISC chip can use to perform its micro-operations. (3) ARM was poorly described in the video: he based the argument on instruction sets (RISC) and then went on to describe it by transistor count. That is not really the reason ARM is considered better for mobile devices and devices with simple, specialized functionality.
I still don't understand why AI has to run natively. Most people will use simple prompts, all they need is a mobile connection.
maybe speed difference
privacy matters, like govts fine tuning LLMs on top secret military data, you get the point
because you don't want to send your private and company data to FAANG and NSA ?
Because AI is a lot more than ChatGPT. For example, removing unwanted objects from pictures is a relatively "mainstream" AI task. There are services that do it online, but most require a subscription or a purchase of credits.
Being able to run AI locally means you don't have to pay for things like that.
Speed, cost, privacy. For simple tasks a high-performing fine-tuned small model running locally will be sufficient. Apple developing both their own hardware and software can achieve this very effectively.
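The speed part of the trade-off above, as back-of-envelope arithmetic. Every number here is invented for illustration (a small local model at 25 tokens/s versus a faster cloud model behind a 300 ms network round trip):

```python
# Invented figures: 50-token reply, local model at 25 tok/s, cloud model
# at 100 tok/s with a 0.3 s round trip. Real latencies vary widely.
def local_latency_s(tokens, tokens_per_s):
    return tokens / tokens_per_s

def cloud_latency_s(tokens, tokens_per_s, network_rtt_s):
    return network_rtt_s + tokens / tokens_per_s

tokens = 50
print(f"local: {local_latency_s(tokens, 25):.1f}s")        # 2.0s
print(f"cloud: {cloud_latency_s(tokens, 100, 0.3):.1f}s")  # 0.8s
```

With these numbers the cloud still wins on speed, which is why the privacy and cost arguments in this thread carry most of the weight; local only wins on latency for small, fast models or flaky connections.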
0:57 wow it’s a brand new chip! What is it called?
*Dead inside*
3:03 omg stock footage holding the phone upside down
u can tell bc the charging port is on the top
Plot twist, most people don't even need AI. Instinct works just well enough.
Angolan here 🔥🇦🇴
fish here
ok?
@@lucassilvas1 😂
@@soejrd24978 made my day 😭
1:30 Mama kudos for showing that, for clocking those performance issues.
I agree, Grindr does drain the battery fast!
The AI cringe race is fierce.
The death of x86 has begun.
👍
:(
" Hell, It's 'bout time. "
If the 100+ TOPS is true, then I don't think so
We wouldn't need ARM SoCs if Intel made good energy-efficient SoCs
LLVM should add more simplicity, features, optimizations, and support for more and more ARM CPUs
Hahaha this is the most hilarious tech video I’ve watched so far in my life! Super! Keep them coming. I’ve subscribed
No Thunderbolt 5, no AV1 encode, 38 instead of 45 tops (irrelevant to me but still). Seems like my M1 MaxBook will live another generation.
Bu... but is has AI!!! DON'T YOU LIKE AI!? 😂
Wow, just looked it up and it really doesn't have av1 encode... Kinda wack.
@@Velocifero Yeah, the M2 not having hardware decode for AV1 was an L, the M3 not having AV1 encode was an even bigger L, but the M4 STILL not having AV1 encode? What is Apple even doing anymore
for studio users the M2 Ultra was 31 TOPS and nobody cared or used it. The M4 Ultra can be anything, because it will get its own die, people say, but if it is 2x as before, then look for 76 TOPS. The key will be whether Apple has it doing anything useful. Frankly, I doubt it will be critical on macOS, but I suspect it will be on the iPhone OS, because that is where some of the opportunities are obvious, like replacing Siri.
"Fastest AI chip in the PC market" at 38 TOPS of what we assume is INT8, meanwhile the 4090 is over here doing 660 TFLOPS of FP8.
Unless you run your 4090 in the cloud to access its compute power remotely your point is invalid.
also, afaik there's no API to interact with it, so other than Apple, no one else can run code on it.
Now compare power consumption.
it also uses 660 watts of power geg
@@2pingu937 Cloud GPUs like the H200 can do 4000 TFLOPS of FP8
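The power-consumption rebuttal in this thread can be sketched as perf-per-watt, with every figure treated as an assumption: the ~10 W draw for the M4 NPU is a pure guess, the 660 TFLOPS and 660 W figures come from the comments above, and the precisions differ, so this is apples to oranges in more ways than one:

```python
# All inputs are guesses or comment-sourced claims, not measurements.
def ops_per_watt(tera_ops, watts):
    return tera_ops / watts

print(f"M4 NPU (guessed 10 W): {ops_per_watt(38, 10):.1f} TOPS/W")
print(f"4090 (per thread):     {ops_per_watt(660, 660):.1f} TFLOPS/W")
```

Under these assumptions the efficiency gap favors the M4 even though the 4090's absolute throughput is far higher, which is the whole laptop-vs-desktop argument in one division.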
Best channel ever along with the humour it provides
So, will there ever be an M3 iPad? Should I wait for the M4 MacBook?
The iPad Pro is the best product for watching Netflix and YouTube.
I’m using one right now
Apple is getting desperate...
Why? What should run on those ARM chips?
Compare them with the support for M1 and you can see the winner quickly.
Based on? If you don’t mind explaining your take
One of the best channel so far, I'm calling it
"Dead Inside" Got me XD