Please also compare performance per watt, and performance when the laptops are unplugged. Then you'll see the benefit of Apple Silicon in a laptop.
"Apple Silicon" is an SoC. An AMD SoC using HBM3 shared between the upcoming Zen5 and RDNA3 GPU would destroy anything Apple makes, albeit with higher power consumption and cost.
Apple is impressive and PC people shouldn't hate. Apple is helping to raise their own as well as PC performance, keeping everyone pushing to keep pace. I am a lifelong PC user, and although I can't see myself using a Mac for anything, that doesn't mean I can't admire what they have done and how far they have come with their own silicon.
Apple is hurting our right to repair and tinker with our hardware. The new Macs are a joke; Apple took away our right to upgrade. You can't make your own choice of RAM, drive, etc. in your computer. Is the PC world going to allow Apple's tactics?
There is a bit more to it than the base GPU comparison, the built-in media encoders for graphics work, for instance. However, for gaming, even I had to buy a PC after 14 years because Apple never encouraged developers to write for the OS. For laptops with general-purpose usage, which is pretty much the future, x86 is dead in the water right now. You can't get the performance without the cooling, while the M-series chip can. The sad fact is that x86 technology is 50 years old now, the same as the T-72 tank. You can upgrade the platform and call it a T-95, but the deficiencies in the design for modern usage are still at the core of the structure. The M-series chip is getting more efficient, while the x86 architecture requires massive third-party power and cooling developments to advance.
Yeah, it's no contest for performance per watt in laptops. Love my M1 MBA. However, the M2 is actually less efficient than the M1: they bumped the clock speeds higher but the performance gain was not enough to offset the increase in power consumption. Next year, M3 will take the performance-per-watt crown.
"There is a bit more in it than the base GPU comparison, the inbuilt graphic encoders for graphics design for instance." - You must be mistaking media engine performance for GPU performance. The built-in media engines in the Apple silicon are mostly what handle photo/video editing workflows not the GPU so it doesn't count. Also, ARM is almost as old as x86 - it's only 4 years younger. The problem with x86 lies in the ISA itself (the CISC ISA) and is the very reason the RISC ISA (which ARM, powerPC, RISC-V uses) was created in the first place.
Is there a reason you used Wild Life as the GPU benchmark? Maybe getting Time Spy or Fire Strike Ultra running on a Mac is impossible. I ask because at these GPU levels such a lightweight benchmark will quickly become CPU-bound. Hopefully you ran them at the 8K custom setting.
PC heads never factor that in. I know that at the end of the day I will get the performance I need out of this Mac Pro, and even more so when the M3 Extreme comes out, at a fraction of the electricity and AC costs needed to keep a PC cool. I said AC to make a point to the PC heads out there. Apple is on the right path and PC is not. Oh, and did I mention soundproofing for the PC?
I mean, it's an APU that sits on an absolutely massive die. AMD and Intel APUs are far smaller because they don't put as much stuff on the die. AMD can and has made APUs that perform extremely well; look at the PS5 and Xbox Series S and X. Both of those are APUs as well and don't need a dGPU.
That's still the biggest problem: even if it were priced competitively, there are still way too few games to play. The selection of games on Windows is just unmatched.
If games are your priority, then you are not the target group. I'm a Mac user and fan and have to admit to myself that Macs are just not good at 3D rendering and games.
@@Solunexxx you're clearly not talking from experience like I am! I have an iMac i9 with an AMD RX 5700 XT... it's a very capable machine, don't get me wrong, but for 3D rendering and path tracing nothing comes near an RTX GPU from Nvidia.
7:52 “Who did Apple build this for?” I've used PCs and Macs, and there are definitely pros and cons of each to consider. A PC can be much cheaper, and on a component level you get much more bang for the buck. But the components add up to a system, which is what we interact with. This is where Apple's real value comes in: user experience, which gets even better going from a system level to an ecosystem level. For me, this is what's so attractive about Apple. Their pricing is shameless and maddening, no doubt. And if all you will do on your computer is gaming, by all means get a PC. But if you do anything else at all, the buttery smooth performance of Apple's OS beats the lagging performance and clunky design of Windows, with half the noise and half the power needed. It's the day-to-day performance that Apple gets so right. And for that, I will probably assume the position and overpay once more.
It should be noted that in video editing I've seen the M2 Ultra perform as well as if not faster than a 13900K + 4090 combo, and the $1000 less 60-core version doesn't have a slower media engine which means the export times will be very similar if not exactly the same as the 76-core M2 Ultra, which I don't think can be said for an RTX 4080 vs. 4090. And in GFXBench 4K Aztec Ruins High offscreen the M2 Ultra has the same fps as a 4080 on DX12.
Obviously people should use whatever works best for their workflow (I think video editing is squarely in the target for the Mac Pro), but if a 13900K/4090 combo performs comparably, it's also worth noting that it will probably run you 1/2-1/4 the cost (depending on memory and storage options). Looking on the GFXBench site, it lists 298fps offscreen for the M2 Ultra and 425fps for the 4090, but I don't think that matters much either way (no one's buying a Mac Pro for gaming). For Geekbench 6's GPU Compute, the highest M2 Ultra Metal numbers score around 200K, while it looks like the avg 4090 scores around 330K w/ OpenCL. Of course if you're using anything CUDA-based, non-Nvidia isn't an option anyway. It's also worth noting that 4090 is far from the top-end option. An A6000 Ada has another +10% TFLOPS and for ML, PCIe H100s will double both the memory bandwidth and FP16 again (and another 8X w/ sparsity, or 16X for FP8!). Also, most workstations will support 4X cards, but based on the state of their PyTorch Metal port, I doubt even Apple is using Apple Silicon internally for ML training...
@@lhl A 13900K + 64GB DDR5-6400 + RTX 4090 + high-end motherboard + 2TB Samsung SSD, without a case, costs $3400, while the Mac Studio with 64GB RAM and a 2TB SSD costs $4400-$5400. Hardly a steal, but not 4x the cost. GFXBench does list the M2 Ultra at 330.9fps, and I didn't compare that with the 4090, I compared it with the 4080, as the video itself also did. I didn't mention the GFXBench score for gaming performance, I mentioned it for rasterization performance, which has more uses than just gaming. And yeah, if your application only supports CUDA and nothing else, you're not considering anything other than Nvidia, and in Blender Nvidia absolutely demolishes Apple thanks to using 3-4x the power and having strong hardware ray tracing. Apple has done a good job improving performance in Blender on the M2 series, anywhere from 75% to 2x faster or more despite staying on 5nm, but Nvidia didn't stand still with the 4090 over the 3090. If on 3nm Apple adopts some form of hardware ray tracing acceleration within their GPU cores, which they're speculated to do, and gives a nice boost to the GPU core count, I think they can double their performance in Blender again easily, close 50 to maybe even 70% of the gap, and only be 40% behind a 400W card. And I'm still talking about the 2-die Ultra variant; a rumored 4-die variant consisting of 4 Max dies will probably come sooner or later. I think Apple will be squeezing a lot more performance out of their architectures, and this is only the beginning.
@@utubekullanicisi The Mac Pro starts at $7K, so that's the 2X baseline. Clicking through, it looks like +$1.6K for 192GB RAM and +$2.2K for 8TB storage, so you get up to $10.8K. On the PC side, a 192GB DDR5 RAM kit will be +$400, and an 8TB SSD will be +$800 (although on the PC side you also have the option of saving some money with 2 x 4TB PCIe 4.0 M.2 SSDs for +$500). I suppose that ends up being only 2.5X or so; the upgrade pricing is more reasonable than previous Mac Pros (since there's not much scaling on the memory side with the new models). The 4X Max "Extreme" was rumored for the M1 but never released, and is rumored again for the M2, but I'm doubtful it'll be released. The market for that is just too small. TSMC N3 is fully booked for the A17, and from what I've heard, the yield remains quite bad. That will likely move to N3E, but we're probably going to have to wait for N3P and N3X for higher-performance parts. Apple has a strong team and I don't doubt they can execute well on whatever management wants, but that's mainly power/perf, and I just don't see the company focused on the high end like Nvidia, AMD, or even Intel are. Apple is moving their mobile chips (where they make their money) up; the others are basically moving their server/enterprise products (where they make *their* money) down.
I went with the Intel/4090, and for the video editing I do it's much better than the Mac Ultra a friend has. I use DaVinci. Once you start adding effects that are GPU-powered, like noise reduction, magic mask, depth mapping, various blurs, etc., the encoders can't save the Mac's weaker GPU from being crushed by the 4090. If you're just doing simple editing you won't notice any difference whatsoever, and both the Mac and the PC will export at like 300fps. Also, the 4080 and 4090 both have the exact same dual video encoders; the 4070 and below only get one.
Great video, I'm a Mac! Glad you included visual graphics and examples. Yep, the target audience for these high-end Macs is essentially businesses, though not entirely, since some buyers are hobbyists who have the money for it. This sums it up at 8:23. There are definitely PC counterparts to this.
I recently bought my first Mac ever: an M2 Pro Mini with 16GB RAM and a 512GB SSD. I only bought it for editing RAW photos using Adobe Lightroom. The cost was around $1,200. I am thinking that with that amount I could get a faster PC? Lightroom uses mostly CPU performance rather than GPU. What would be an Intel or AMD equivalent of the M2 Pro? I am willing to return the Mac if I can't get the same performance from a PC for less. THANKS 🙏🏻
@@STeroidsnicca Performance is more important to me, I don't buy a processor to save electricity. In addition, from what I've seen and read, M2 Pro processors and above are already less efficient than M1.
I was ready to buy one. I have been a Mac user for over 20 years and am currently focusing on 3D work. I would love to go into the PC world but have no idea what to get that would be faster. Please advise; I would be happy to pay you for your time. Looking for something faster than that 24-core / 76-core. Thank you.
Depends on your type of 3D work. For example, in Cinema 4D, you have the Cinebench R23 benchmark. The Intel i9-13900K and Ryzen 9 7950x both score 37k-38k while the M2 Ultra is only at 28k. You can get even much higher performance in that App if you step up to Threadripper PRO or Intel's new Xeon W-2400/3400 CPUs with much higher core counts. But now you are clearly into workstation territory and workstation money. To get an idea of performance and cost, I would recommend checking out Puget Systems (www.pugetsystems.com/solutions/3d-design-workstations/).
@@ImaMac-PC Thank you for the information and link. I am looking to get away from Apple at this point, and I am sure the workstations in your link will be a lot faster and a better solution. Thank you.
The M2 Ultra already reaches 295W; you know what happens to an SoC beyond that... the processor will melt. The SoC approach is very limited and has no future, especially where upgrades are needed.
9:27 I really think that you are wrong. Apple Silicon has a lot of potential, so I think it will take time but it will be possible to have a good gaming experience on Mac.
Apple Silicon hardware is capable. The hardware is there (except ray tracing). You can get a good gaming experience on a Mac on a few select titles. Apple Silicon doesn't need any more time. It needs games. To get the games, it needs funding. It needs a commitment by Apple. Why doesn't Apple commit?
@@ImaMac-PC There is a lot to catch up on, so yes, it needs time for games to be developed natively for the Mac. They can't do whatever they want and pay every game developer. It needs to be attractive (profitable) for the developers, which is slowly happening, little by little, with a growing community and easier ways to bring their games over from other platforms.
I'm looking for a machine to edit video in DaVinci, DaVinci Fusion, maybe Adobe, plus audio production. The Mac is an audio powerhouse, these are excellent for video editing, and it looks good enough for DaVinci Fusion. And it's quiet and efficient while doing it? Sign me up! If you want a gaming machine, you can buy a PC or a console and get an infinitely better experience. Hell, get a PS5/XSX and a Steam Deck/Ally for less than a Mac.
I agree. I directly asked for a computer that is good for work and play. But maybe we should ask for a computer that is good for primarily play. With fewer features, it may cost as much as a Windows gaming PC and perform the same. Edit: I'm doing it. I'm requesting them. It's just a request.
Except that making their Game Porting Toolkit available for free to developers to see how easy it would be to make DirectX 12 games run on the Mac isn’t nothing. Why does Apple have to buy a studio to prove they’re serious about gaming?
Porting a game over to the Mac takes time and money away from other work. ZeniMax even came out and said they won't support Apple silicon when it comes to Elder Scrolls Online. There is also the fact that more people use PC, Xbox, PlayStation, and Nintendo. So yeah, Apple has to prove it's worth the time and cost.
"Mac Studio" .... it's not for gaming. It's designed for professionals to do content creation for a living and those people can make the money back in a matter of weeks or a few months.
Hello. I recently commented on another of your videos concerning how various GPUs perform with AI noise reduction (adobe). The M2max gets less than half - close to 1/3 the performance of an RTX4080 and even less than that as compared to an RX7900XT. I don't know about gaming but for compute, they're really not performant chips at all. Actually we have an RX6700XT in the lab which gets slightly better AI noise reduction performance than the M2 Max. They're really nothing special for our purposes.
A bit of an unfair comparison, Ian. As it stands (although I'm not sure about the latest version of Adobe), the Adobe apps don't use the neural engine but rather the CPU for AI noise reduction (as does DxO), due to a bug in the later versions of macOS. What I can tell you is that the AI NR in DxO, when it worked with the neural engine (pre-Ventura), was MUCH faster on Apple silicon. Apple are terrible at fixing bugs, though; they've known about this one for months.
The sad reality is that game developers are like old web developers: they have no clue or refuse to do anything efficiently. And why is everything about building a PC based on gaming?
The average person who buys a Mac is just sick of being screwed by substandard PC builds, I think. My one PC is near unusable because every time Windows rolls out another update I can't use streaming services like YouTube smoothly any more. That's on an HP desktop.
For an average computer user, I have converted many a PC user to entry level Macs (that were on sale) for that very reason...and it has greatly helped me as I no longer get calls for Tech Support!
I enjoy this kind of video, thanks. I'd be interested in a comparison with the mobile GPU scores from this gen, as they are much lower wattage and generally more directly comparable. Strange to see the 13700K comparison with the M2 Ultra, which runs laps around my 13900K for code compilation. Each specific use case will see different pros and cons, I guess.
Extremely confusing and non-linear comparison. M2 direct comparisons of Pro / Max / Ultra versus the highest-end Intel, AMD, and NVIDIA would be more relevant than meandering stock footage. Even price comparisons are convoluted, factoring a strange mix of legacy systems with and without GPUs. Pick a strategy - affordable mid-tier and/or top-tier system compute, graphics, encoding, neural, usage, and competency per price point.
I think this comparison is not taking into account actual performance numbers in the actual applications Mac users would buy these machines for. If you consider software optimization, accelerator hardware for media such as ProRes, and efficiency, you aren't getting close to the Mac's actual performance. Windows hardware should be praised for the massive increase in value, but they are throwing wattage at the problem to bridge the gap.
They are living with the burden of being backward compatible too. I can take a new NVIDIA GPU and new CPU and run a game from 15yrs ago. Macs can't do that. Apple broke free from that legacy compatibility.
And what about for PC? NVIDIA CUDA accelerated workflows like 3D modelling with blender optix and hardware accelerated raytracing or autodesk maya with arnold or vray, or even AI workflows (like tensorflow) powered by NVIDIA tensor cores. You guys just keep praising Apple and lowballing PC when the macs are only good at a few select workflows and perform horrible with others. Those doing any 3d CAD/CAM work won't even look twice at a mac.
I will say you are completely right in those areas, however I do think Apple is going to have some major agency now to start addressing the needs of the people in those demographics. I think PC hardware has always been good, but Windows/Software has always been the limiting factor. However, I don't think anyone can really deny the sheer practicality that Apple has introduced with having hardware that is power efficient and powerful at the same time. I think their route is going to prove very useful for closing the performance gap.
Uhm... I 'Liked' because you finished making a conclusion to your argument, which I liked; NOT because you told me to 'like & subscribe' the moment I clicked it 🤬
You've done something wrong here. A 7900 XTX is not 42% faster than a 7900 XT. Not even close. Edit: The average FPS in Wild Life Extreme for a 7900 XT is 281 fps... so why did you get 240 fps, which actually puts it in the bottom 5% of all 7900 XT results?
Imagine a starship that goes to another planet. Which chips will be used for this, Apple M chips or Intel? The starship will need to save tons of energy, but some tasks will demand lots of computation power, and it must control heat so there are no accidents. Which one will you pick, Apple M chips or Intel?
As an Intel Mac gamer, I couldn't agree with you more. Apple will never care about gaming on the Mac. And I love how Apple threw Intel Macs under the bus: they didn't give Nvidia their developer license back so Nvidia could keep supporting Intel Macs, and they stopped the Mac Pro 5,1 from installing Boot Camp on macOS Mojave. However, some developers are still making games for Intel Macs, and my Intel Mac still runs the latest games without problems. I'm playing Evil West with no issues.
You are making a sophomoric comparison. I've built PC gaming machines for years; the last one was a Threadripper 5975WX 24-core, 64GB RAM, Nvidia 4080, SSDs, total cost about $4K. Big case, lots of fan noise. I'd much rather have the M2 Ultra: a tiny case that fits under a monitor, no fan noise when maxed out, runs cool, the better option, and I don't game anyway, so that's irrelevant. There's more to the choice than just specs, which aren't that different anyway; most people would not even notice the differences.
@ImaMac-PC Not really. I have a Mac mini M2 Pro, 12-core, 32GB, 1TB. It gets hot, sitting around 80C to 100C when I'm using it for my workflow. It's still quiet with the fan at 5K RPM, but it has a lot of jank for my simulations. The Studio Max would at least run cool for almost the same price as the Mini Pro, and the 30-core GPU vs 19 in the Mini Pro would help with the jank. The base Mini would not work well for me at all.
I have a Mac mini with an M1 chip and 16 GB of RAM. At work I have a high-end i7 laptop with 32 GB of RAM. My Mac mini opens all apps and websites instantaneously. My work laptop always lags for a few milliseconds, or seconds, before opening any application. So I can completely and absolutely confirm your video is nonsense. You are comparing apples to oranges, my friend. I am not an Apple fanboy, only a normal user who uses his computers for work and school. Performance-wise Apple is way ahead of anyone. And this $1,500 Mac mini that I got two years ago is the best computer I ever owned, better than any gaming PC I built in the past.
The new Mac should always perform better than the old Mac. This video addresses new Mac vs new PC. You should try a new PC to compare opening apps and websites.
And don't forget that the SoC also has cores dedicated to video encoding, the neural engine, etc. So when editing video it is not just the GPU and CPU working; it has dedicated DSPs for these kinds of jobs, and they make them very fast. How fast? Well, never "instantaneous", but fast enough that you will not have time to make yourself an "instant" coffee.
Don't forget the Nvidia graphics cards also have dedicated video encoders/decoders, many, many more neural (tensor) cores than the Mac, and dedicated ray tracing cores that can be used as well. Also, DaVinci and Premiere will use the Intel integrated GPU's built-in video encoders/decoders in conjunction with the Nvidia GPU if needed. They BOTH have dedicated encoders/decoders; that is just a Mac sales pitch to confuse you.
Global gaming content revenue by company, 2021:
- Tencent: $32.5 billion
- Apple: $14.8 billion
- Alphabet: $12.4 billion
- Sony: $10.2 billion
- NetEase: $9.7 billion
- Activision Blizzard: $8.8 billion
- Nintendo: $8.1 billion
- Electronic Arts: $6.5 billion
- Sea Ltd: $4.3 billion
I am confused about your comments on Apple's commitment to gaming. They have simply chosen a sector that works for them, to the tune of about $14.8 billion, and it has grown significantly since 2021. I just looked this up... If you look at the makers of PC video cards, their focus, where their investment is, does not look like gaming is their business either [machine learning and A.I.]. And then there are all the online gaming resources, like Steam... Gaming is clearly not the GPU/PC thing one could be led to believe. Or is it?
Thanks for the data sharing. Yes, Apple does make money from Gaming as you show. However, that is overwhelmingly from iOS devices. I was focused on AAA gaming for a desktop computer. Titles like Death Stranding and Resident Evil Village. Apple will show off an example or two every event. But they are not making any commitment to bring AAA gaming to the Mac. I would love to play my STEAM library on a Mac. To play Resident Evil Village, I can't use my STEAM version. I have to purchase it again through the Apple App store. I'm not buying the game separately for each platform. Also, NVIDIA and AMD are chasing A.I. as they see a larger revenue source from businesses and it is potentially more lucrative than gaming.
Apple products are fine and dandy for video and photo editing, but that can be done on x86 machines as well. The big money-making is in engineering work, i.e. CAD, and good luck running Dassault or Siemens CAD software, i.e. the big boys in the industry, on an ARM Apple machine. I had a refresher class in CAD not long ago; a young woman asked if her Apple could run the software, and everybody started to laugh and said: yeah, if you want to play with tablets then Autodesk Fusion is a nice toy, but if you want to work in this industry then you should throw your Apple toy in the trash ;P
@@ImaMac-PC Is it possible for you to incorporate an Apple M Chip CPU/GPU Hierarchy Chart into future videos that pairs each Apple M Chip to their CPU/GPU Desktop Equivalent to make it easier for mac owners to understand what type of relative performance they'd be getting or have? Also would an M2 Max Mac Studio Base Model be decent enough for 3D rendering/modeling/animation for game development and CGI work for the mac platform? Thank You!
It has nothing to do with Apple hardware. There are good CAD and BIM programs for the Mac. It is just that Autodesk targets primarily Windows and their code base is heavily intertwined with Windows. If the software does not exist for a hardware platform, it does not matter how good that platform is.
@@dsblue1977 Autodesk is on the Mac as well. The heavyweights of engineering software aren't, though, and it does not matter whether that has to do with the Apple hardware or not. If it doesn't work, it doesn't work, regardless of the reason; that is how companies reason.
If esports titles can get the same performance on an M2 Ultra, or even an upcoming M3 Ultra with hardware ray tracing cores on par with or surpassing a 4090/5090 😂 while maintaining the same power consumption as the M2 Ultra (SoC, SiP), then that, combined with a LOT MORE AAA titles like CP77, Gotham Knights, Hogwarts, RE4 remake, SF6, BF2042, DQ11S, FF16, FN UE5.2+, Apex, RL, R6 Siege, etc., would help save esports teams a ton on their electricity bills 🤠🫰🏻
@@evacody1249 But high refresh rates on a high-end GPU, like 360+ FPS, consume tons of power, even more than a 4K HDR TV. This is why Apple made its own chip: save power, save energy. Don't underestimate its performance per watt and every small improvement; if game developers help put hundreds of thousands of Apple silicon Macs into use running 24/7, that would save a lot on electricity bills.
It is not very fair to compare apples with pears. It's not just graphics performance that matters. For example, access to memory is many times more efficient with Apple. The CPU performance is also very different. Separately, the power consumption is simply unmatched.
After 14 years of using Macs, our studio just switched to PC. The performance of the 13900K and 4090 is mind-boggling, and it costs half as much. We're so amazed by the PC.
Why'd they switch?
As long as you're not locked into Apple specific programs (e.g. Final Cut Pro or Logic Pro) the performance of PC hardware is outpacing Apple development.
In my area, the electricity cost is very high; the M2 Ultra actually saves me $1,000 per year compared to my 13900K + 4090. My upgrade cycle is 2 years, so $2,000 in electricity savings plus the resale value of the M2 Ultra machine is good enough for me.
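A back-of-envelope check on the kind of saving described above; every figure in the sketch (hours of use, electricity rate, average wattages) is an illustrative assumption, not data from the comment:

```python
# Rough annual electricity cost difference. All inputs are assumptions chosen
# for illustration; plug in your own hours, rate, and measured wattages.
HOURS_PER_DAY = 8       # assumed hours under load
DAYS_PER_YEAR = 300     # assumed working days
RATE_PER_KWH = 0.50     # assumed $/kWh in a high-cost region

PC_WATTS = 550          # assumed average draw, 13900K + 4090 under load
MAC_WATTS = 150         # assumed average draw, M2 Ultra under the same load

def annual_cost(watts: float) -> float:
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * DAYS_PER_YEAR
    return kwh_per_year * RATE_PER_KWH

print(f"Estimated yearly saving: ${annual_cost(PC_WATTS) - annual_cost(MAC_WATTS):,.0f}")
# ~$480/yr with these assumptions; longer hours or pricier power push it higher.
```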
Honestly, working on Windows feels mediocre compared to working on a Mac. I do both.
@@luismigueloteromolinari5406 How? Specifically? I've worked on Windows for about two weeks now. I'm just using Adobe products. I don't even see Windows doing anything. I'm even using the Studio Display, so I have the same 5K experience. Different mouse and keyboard. But for my post-production workflow the OS has zero impact.
Apple achieves similar performance to these high-end dedicated Nvidia GPUs with their integrated chips at almost a quarter of the power. They don't lose performance when unplugged. That in itself is an engineering marvel.
Can you imagine, if the current Apple M processors were to use the same wattage (over 500W), the amount of performance they would wield? Pretty remarkable stuff, what Apple has been doing.
@@lnvincible It would be crazy if they could up the voltage and really push the chips, but from what I have read and understand - and please don't hold me to it, as I'm not an expert - you cannot just increase the voltage/clock/power usage on these ARM chips without some severe sacrifices, issues, or instability. I believe they are designed and engineered to work within a very narrow clock-speed range. As the memory, GPU, etc. are all on the same die, they all need to work nicely within the power envelope and clock speed assigned.
Their desktop GPUs, yes, but I'm curious about the laptop GPUs, which are considerably more efficient than the desktop parts. I'm curious how a 45W RTX 4060 or 4070 does up against an M2 Pro or Max in a laptop.
Edit: Going by NotebookCheck's measurements, the M2 Pro and Max are about 35% more efficient than a laptop 4060 or 4070.
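Performance per watt is just the benchmark score divided by measured package power. A minimal sketch, with placeholder scores and wattages rather than NotebookCheck's actual numbers:

```python
# Performance per watt = benchmark score / measured power. The scores and
# power figures below are placeholders, not real measurements.
def perf_per_watt(score: float, watts: float) -> float:
    return score / watts

m2_max    = perf_per_watt(score=9000, watts=45)   # assumed M2 Max GPU score / package power
rtx_4060m = perf_per_watt(score=6700, watts=45)   # assumed 45W laptop RTX 4060 score

print(f"M2 Max   : {m2_max:.0f} pts/W")
print(f"RTX 4060M: {rtx_4060m:.0f} pts/W")
print(f"M2 Max is {m2_max / rtx_4060m - 1:.0%} more efficient under these assumptions")
```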
@@crestofhonor2349 And in the case of an unplugged machine you get a third to half of the performance from the Nvidia chip, or if it's running at full performance you get at best 2-3 hours out of a very noisy and hot machine.
Yeah, let's pay $3,999+ so that you get less performance than a PC and save some electricity cost in the long term.
The M2 Max's 38-core GPU is only slightly faster than the Series X, so it makes sense. Now that Apple has started to support AAA games, I hope they will include RT support as part of the hardware in the future. That is also going to help people who develop 3D animation or games on platforms like UE5 or Blender.
But I still don't know if Apple can compete in an offline ray tracing workload the way OptiX can.
Doesn't Apple already do RT? I'm not sure if it's hardware or software, though.
@@DouglasHewitt No, they don't.
And there you have it... Apple finally confirmed the death of professional Macs for the foreseeable future. The new Mac Pro was barely relevant the moment it was released, and being unable to add a new GPU to the PCIe slots is a freakin' joke. Some pros use PCIe slots for niche hardware, but practically all professionals and consumers use PCIe to improve their graphics every few years. Yep, pay 3000 USD extra over the Mac Studio for largely useless PCIe slots. System on a chip... yeah, so when any part of that chip is outdated, the entire system is outdated. Well, here I am with a $4000 M2 Max, but only because I'm a slave to Logic Pro and music production.
If it wasn't for Logic Pro and Final Cut, many people would have moved on from Macs long ago. The Mac Studio will likely be useful until Apple decides to cancel support, which looks to be about every 7 years (5 years of macOS versions + 2 years of security updates on the last supported macOS version).
Benchmarks are one thing, but in actual real-world performance in many apps like Lightroom, video editing, etc., the Studio would handily beat the GPUs and CPUs you listed as better than it. Except in Blender using OptiX, where Nvidia will dominate due to having hardware RT silicon. Secondly, Apple IS working with game developers, with more and more AAA games coming.
Apple's H.264/HEVC/ProRes media encode/decode blocks are extremely impressive. I've seen even the M1 Ultra get very close to a 4090 in video export times, and with the M2 Ultra a single video export can now be parallelized across the video encoders on the two dies, which should make it even faster. It should also be noted that the M2 Ultra GPU probably has about as much L2 + SLC cache available to it as the 4090, and the CPU has many times the available bandwidth of a 13900K thanks to sharing the same 1024-bit LPDDR5 memory bus with the GPU (and therefore the GPU has access to up to 192GB of video memory).
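The bandwidth figure follows directly from bus width and transfer rate; a quick check (the 13900K line assumes a typical dual-channel DDR5-5600 setup):

```python
# Peak bandwidth = (bus width in bytes) * (transfers per second).
def peak_gb_s(bus_bits: int, mega_transfers: int) -> float:
    return (bus_bits / 8) * (mega_transfers * 1e6) / 1e9

m2_ultra = peak_gb_s(1024, 6400)  # 1024-bit LPDDR5-6400 unified memory
i9_dual  = peak_gb_s(128, 5600)   # assumed dual-channel DDR5-5600 on a 13900K

print(f"M2 Ultra : ~{m2_ultra:.0f} GB/s")  # ~819 GB/s (Apple quotes 800 GB/s)
print(f"13900K   : ~{i9_dual:.0f} GB/s")   # ~90 GB/s
```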
If Apple is working with game developers, then where are the AAA games? It's been three years. They only show 1 or 2 AAA games every event. Is that pace fast enough to catch up on all the AAA games?
How long will the real-world performance of the old guard stand when AI development takes on the legacy providers of software? A little like how EVs are disrupting the vehicle market now, AI development will start to provide prosumers with alternative software at a pace that the legacy applications will struggle to evolve with. That's because their business models are baked into the purchasing process, and their position in the market has yet to be really tested.
Now, with the way prosumers are developing, the market will change, and Apple has seen this coming, hence the flip to their own silicon to capitalise on the new software that will be shaping and driving that market. When that happens you can forget how good, or established, companies like Adobe are, because the new market will marginalise them to a smaller niche, unless they evolve their "light" software for the new prosumer: a group of people who will want the hardware and software to do everything to a pretty high standard, as they will be solo or nimble partnership providers of services through the web. It's already happening; this channel itself is an example.
Thank you! Benchmarks aren’t the whole story.
Bro, the Mac is essentially just a huge ARM processor. Great for efficiency and battery, but under a real-world video game load it falls apart. They are not equipped with enough cooling. The frametime graph is all over the place. It plays like crap for a $3K machine. You could build a PC and buy a PS5 for the same price, and both would have better fps.
The ability to build and upgrade a pc is why I only use a PC. IDC how fast an m2 is if I can't upgrade or tinker with it I don't want it.
I hear you, brother. I remember growing up, heading down to Radio Shack for a reset button for my Commodore 64.
While a top of the line PC beats it in raw perf, the M2 Ultra is crazy more power efficient.
300W ?
@@uribak9144 lol 3090 alone eats more than the whole M2 Ultra SoC
Lol, no one cares about power consumption in a PC/desktop form factor.
@@ayaanbari6711 lol more power consumption means more environmental pollution + spending or wasting more money.
@@hasansahin7965 Not quite; it depends on the task. A task that finishes much quicker can consume less energy even if the GPU as a whole draws more power. If the job is done quickly enough, the 4090 might be the better choice per watt precisely because it finished that much sooner. It's not "one GPU draws less power, therefore it's better for the job": if the job takes significantly longer, the more efficient GPU might end up pulling more from the wall overall.
And yes, power consumption is less important in a desktop than in a laptop, because there's no battery life to worry about. This is why the M1 and M2 Ultra do not exist in their laptops.
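A tiny worked example of that energy-per-task point; the wattages and task times below are made up purely to show the arithmetic:

```python
# Energy per task = average power * time. A faster, higher-wattage GPU can
# finish with less total energy. Numbers are illustrative only.
def energy_wh(avg_watts: float, minutes: float) -> float:
    return avg_watts * minutes / 60

fast_gpu      = energy_wh(avg_watts=450, minutes=10)  # assumed 4090-class: 75 Wh
efficient_gpu = energy_wh(avg_watts=120, minutes=45)  # assumed efficient SoC: 90 Wh

print(f"Fast GPU      : {fast_gpu:.0f} Wh")
print(f"Efficient GPU : {efficient_gpu:.0f} Wh")
```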
The elephant in the room that you missed is the M2 Ultra's performance in running LLM (large language model) AIs. This almost makes me want to buy a Mac.
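Part of why the M2 Ultra is interesting for LLMs is its memory bandwidth and capacity. A rough, bandwidth-bound upper-bound estimate, assuming decoding reads every weight once per token and ignoring compute limits, KV-cache traffic, and runtime overhead:

```python
# Autoregressive decoding is usually memory-bandwidth bound: each generated
# token reads roughly every weight once, so
#   tokens/s <= usable_bandwidth / model_size_in_bytes.
def max_tokens_per_s(bandwidth_gb_s: float, params_b: float, bytes_per_param: float) -> float:
    model_bytes = params_b * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# 70B-parameter model at ~4-bit quantization (~0.5 bytes/param) on 800 GB/s:
print(f"~{max_tokens_per_s(800, 70, 0.5):.0f} tokens/s upper bound")  # ~23 tokens/s
```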
You're comparing apples with oranges. At what resolution did you test, and what CPU did you pair with the AMD and Nvidia GPUs?
It was Apple that started this gaming comparison. Did you forget their ad?
There is another area where Apple Silicon performs a lot better than Intel and AMD CPUs and Nvidia and AMD graphics cards: it draws far less electricity. For a business that runs a lot of computers, this can make a significant difference in the electricity bill. With Apple Silicon you get incredible performance with less energy.
And for businesses it is cheaper to buy PCs in bulk than a Mac Studio, which won't run all the programs they need.
This is great, exactly the comparisons I was looking for to get a general idea of performance. Certainly the pricing on the Macs is not ideal; still, it should be noted that you can get an M2 Max in a 14-inch laptop. That's like walking around with a 6700 XT or a PS5 in a very thin and light enclosure with great battery life, which no PC can do. I think this should excite game developers about what's possible as the market of Apple silicon Macs grows. I would think it would be possible to optimize for Metal and Apple silicon in a console-like way not possible on PCs, and scale across the platform, so games just scale resolution and settings (Low, Med, High) across M1/M2, Pro, Max, Ultra.
I wonder how newer games like No Man's Sky and Death Stranding will compare on a list like this, and whether it will be more or less favorable to the Mac than the Wild Life Extreme benchmark.
Macs are streets ahead when it comes to laptops and efficiency. No contest. Only AMD could possibly offer an alternative but they don't take laptop supply seriously at all
From a laptop perspective, Apple is closer while using much less power. I love all day battery life. For me, Apple is the clear winner in laptops. I generally don't recommend the M2 Max in the 14" MBP since it is power limited and will not deliver "Max" sustained performance like in the 16" MBP. I can't get excited for gaming on a Mac for two reasons:
1. Cost - Macs capable of higher gaming performance (think 60-series of GPUs and higher) are priced like workstations and I will never buy a workstation to game
2. Game availability - Getting a couple of "old" AAA games ported over every year is not good enough and demonstrates Apple is not serious. They could easily fund the porting of many games and yet they don't. They only do just enough to keep people and investors excited.
If Macs could game like high end PCs, I would never buy a PC again. I own an expensive Mac and PC (7950x3D, 4090, etc), and I like my Mac more in pretty much every way. When you work on an Apple machine, it feels like a luxury experience; everything works well together without having to do much work. I also appreciate my PC, but I had to do a lot of work to get the most out of a lot of my programs and games, and Windows still feels corporate and clunky.
@@ImaMac-PC agreed. If Apple were serious (like Valve) they could've made the toolkit available to end users instead of limiting it via EULA to game developers. This would also ensure that Apple puts in time and resources into improving the entire ecosystem of Wine + Vulkan/DX to Metal on Mac. Just like how Valve commits resources to Proton development.
I run No Man's Sky on a Mac Pro 6,1 with a Radeon Vega 56, and it runs like butter.
Amazing explanation! I have a 5900X and a 3080 Ti, so I'm glad I have better price-to-performance than a current-gen M2 Ultra, although the M2 Ultra is geared towards business and servers.
Thanks!
As a programmer, the thing that matters most to me is running TensorFlow and PyTorch. For that, the M2 Ultra is a far better deal than forking over $20K for an Nvidia A100 80GB. The target for the M2 Ultra is video professionals and ML programmers. On my M2 Max, I can train on 300x300 images with batch sizes of 256; my RTX 2060 can't.
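A minimal sketch of what that looks like in PyTorch with the MPS backend on Apple silicon; the toy model is illustrative, only the batch shape matches the comment:

```python
import torch
import torch.nn as nn

# Use Apple's Metal Performance Shaders backend when available.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# Toy model; any real network would go here.
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
).to(device)

# A 256 x 3 x 300 x 300 float32 batch is ~276 MB at the input alone; with
# activations and gradients it fits in unified memory where a 6-8 GB
# discrete card runs out.
batch = torch.randn(256, 3, 300, 300, device=device)
model(batch).sum().backward()
```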
Try a 4090. It's much better for ML workloads, and the mobile version will run you less than a Mac.
Dude, if you want PyTorch and TensorFlow, just go for an RTX 4090.
Nvidia professional GPU cards used to be what you used in SolidWorks for huge mechanical 3D models.
Now an Nvidia professional GPU is best for something you need to run 24/7 for a week or a month with accuracy. 😅😅😅
Like solving a large system of differential equations (where the previous value affects the next value). GeForce cards have already caught up on the compute side; the huge price you pay is for stability.
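The "previous value affects the next value" part is why these runs are long: the time loop of an explicit solver is inherently sequential. A toy forward-Euler sketch as illustration (the equation dy/dt = -y is arbitrary):

```python
# Each step needs the result of the previous one, so you cannot parallelize
# across time steps; only the work inside a single step parallelizes.
def euler(f, y0: float, dt: float, steps: int) -> float:
    y = y0
    for _ in range(steps):
        y = y + dt * f(y)
    return y

print(euler(lambda y: -y, y0=1.0, dt=0.001, steps=10_000))  # ~exp(-10) ~ 4.5e-5
```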
hmmm yes my M2 Ultra happens to be better than a lower midrange 2019 GPU. How surprising.
@floppa9415 No, it's not. The A100 is used as a server GPU and trains 24/7 for weeks. That's why it is expensive.
This guy is comparing tomatoes to apples.
Truth is that the base-level Mac has 8GB of memory, which is half of the 16GB in a PS5. It is going to be impossible to port modern titles with half the memory and make them look as good.
maybe we could get them if they stop charging so damn much for an extra 8gb
You just gained a new subscriber. Linus tech tips also do charts, but half the time they go over my head. Your video was just perfect.
LTT is just overrated. They are full of themselves.
Awesome, thank you! Linus tends to just flash the charts and keeps talking way too fast to digest any of the data in them.
Thanks, I'm a Mac! Are you planning on doing an M3 video, GPU/CPU-wise similar to this one?
To me, what's most interesting about the new Mac Pro is that it seems like they just gave up on the "Pro" part. With the M2 Ultra instead of an "Extreme" (4X die), RAM maxes out at 192GB (soldered). While memory bandwidth is good (800GB/s vs 200-300GB/s on Threadripper Pro 5000 or Xeon W3400), the small maximum amount (that can't be upgraded) and the lack of ECC support basically mean the M2 Ultra is closer to consumer than workstation-class chips on the memory front. The same goes for PCIe lanes. Granted, there's no GPU support, but even for network/storage, the Mac Pro uses switched PCIe and only offers a maximum of 64 PCIe 4.0 lanes. The W3400s have up to 112 PCIe 5.0 lanes (almost quadruple the bandwidth) and the Threadripper Pro 5000 has 120 exposed PCIe 4.0 lanes (I'd expect the upcoming 7000 series to also double the bandwidth with PCIe 5.0 support). Granted, Apple is targeting a very small niche of professionals these days (video/audio), but even then, the limited memory and I/O bandwidth is... not great.
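For the lane-count comparison, aggregate bandwidth is roughly lanes times per-lane throughput, about 1.97 GB/s per PCIe 4.0 lane and 3.94 GB/s per PCIe 5.0 lane, per direction:

```python
# Aggregate PCIe bandwidth = lanes * per-lane throughput (per direction).
PER_LANE_GB_S = {"4.0": 1.97, "5.0": 3.94}

def total_bw(lanes: int, gen: str) -> float:
    return lanes * PER_LANE_GB_S[gen]

print(f"Mac Pro, 64 x PCIe 4.0      : ~{total_bw(64, '4.0'):.0f} GB/s")   # ~126
print(f"Xeon W3400, 112 x PCIe 5.0  : ~{total_bw(112, '5.0'):.0f} GB/s")  # ~441
print(f"TR Pro 5000, 120 x PCIe 4.0 : ~{total_bw(120, '4.0'):.0f} GB/s")  # ~236
```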
It all depends on your needs. Many apps that require higher bandwidth will run faster. Those Pro machines are for prosumers, not the pros you have in mind.
The lack of functional upgradability makes these a joke to anyone working in IT.
I had to argue with our CTO over the amount of RAM in servers.
The cost of upgrades for these is more than for servers.
IT departments have budgets; they don't buy hardware just because it has a logo on it. They buy hardware that gets the job done, even when the users don't like it. Apple is making itself a luxury brand, and no one with a budget is going to buy these.
Big studios, on the other hand, have larger budgets than your typical IT department. Those new shiny Macs are designed for those people instead.
@khoifoto Big studios aren't buying Macs anymore... the primary platform for Unity and Unreal is the PC, especially with Nvidia cards, which Apple has dropped support for.
@@kleanthisgroutides7100 The ones who are still working with ProRes 4444 or RAW, for example.
Gaming is not the target for the Studio. I ran Baldur's Gate 3 on an M1 Max at 4K on ultra settings. It matters whether the software is optimized for the hardware. And the Mac uses less power than just the graphics card in a PC.
It's funny how Apple stopped Intel Macs from running BG3, yet the same game installed under Windows 10 runs with no problem. Now why would Apple be so evil as to do that?
@@macgamer1973 I am sure it is not Apple doing that. Larian probably did not have the time or people to get it to work on Macs with such limited graphics or market. It runs on Apple silicon but would not run on older intel laptops.
I don't think an M2 Ultra could be any faster than an RTX 3090. In fact, if you run some actual UE5 games (optimized for Apple Metal), the M2 Ultra gets a much lower framerate than a 3090.
Even a Mid-range GPU consumes more power than the entire Mac Studio.
Also, power draw is irrelevant... once you normalise across process nodes it's more or less a moot point. Higher clocks, more cores, and more transistors mean more power; again, that's a physics limitation.
Also keep in mind that Cinebench is known not to run as fast on Apple Silicon as it should. Cinebench uses a CPU renderer called Intel Embree, written for Intel AVX2 SIMD, which has a weak port to the ARM NEON SIMD library. Basically, the M2 Ultra's CPU cores aren't stressed properly. In Geekbench 6, the M2 Ultra is in line with the i9-13900K, with the fastest OC'd KS models being ~15% faster.
Geekbench is fake.
@@FBL9982 That's meaningless.
Geekbench was (and still is) Apple-favoured, and has been for YEARS. Remember the news when they claimed the CPU in the iPhone was more powerful than a desktop one? You don't, but I do. Geekbench is a joke.
@@dat_21 Who claimed the CPU in the iPhone was more powerful than a desktop one? Apple? Even if they claimed that, and it seemed to be the case in Geekbench, there was no way to know whether it was true or false until we could run desktop applications on Apple silicon, once the M1 was released. Even in Cinebench, the M1's single-core score was competitive with desktop Zen 3 at the time. And at a point where Intel was stuck on 14nm, I don't think that iPhone chip claim would have been immediately falsifiable. Even if the A-series wasn't more powerful, one of Apple silicon's strongest sides is the memory bandwidth available to the CPU; in some cases, even if another chip has more raw CPU grunt, the M1 will still be faster with fewer or weaker cores because it isn't bottlenecked by memory bandwidth.
@@utubekullanicisi Cinebench is a joke too. Almost nobody renders on a CPU these days, yet somehow it's accepted as a universal benchmark? It's not even open source, so in addition to testing the processor it also tests the quality of their code. It could be a lacklustre port, or a very thoughtful rewrite that is better than the original, and nobody knows, because the source is not open.
Please also compare performance per watt, and performance when the laptops are unplugged. Then you see the benefit of Apple Silicon in a laptop.
"Apple Silicon" is an SoC. An AMD SoC using HBM3 shared between the upcoming Zen5 and RDNA3 GPU would destroy anything Apple makes, albeit with higher power consumption and cost.
That would be some combo.
It's remarkable the level of GPU performance Apple gets per watt of electricity consumed.
And that means what for desktop computers?
It's remarkable how much Apple charges for ordinary storage.
A 256GB SSD in a "Pro" model... that is a joke and an insult.
@@evacody1249 - I suppose it means you can save US$1000 per year in electricity you're not using.
@@BrendonCarr 🙄 And you don't know much about the wattage of the Mac Studio or a PC.
@@evacody1249 When you have a studio with 20-50 of them, it saves a nice buck.
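For anyone who wants to sanity-check that claim, here is the rough math; every input below (power draws, hours, electricity rate) is an assumption for illustration, not a measurement:

# Rough annual electricity cost comparison; all inputs are assumed placeholder values.
pc_watts = 600            # hypothetical average draw for a 13900K + 4090 under load
mac_watts = 200           # hypothetical average draw for a Mac Studio under load
hours_per_year = 8 * 250  # 8-hour workdays, ~250 working days
rate_per_kwh = 0.40       # assumed electricity price in $/kWh (varies a lot by region)

def annual_cost(watts):
    kwh = watts * hours_per_year / 1000
    return kwh * rate_per_kwh

saving = annual_cost(pc_watts) - annual_cost(mac_watts)
print(f"Estimated saving per machine: ${saving:.0f} per year")

Even with these placeholder numbers the saving is a few hundred dollars per machine per year, so a studio with 20-50 of them is looking at thousands annually.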
Even buying workstation level hardware you’ll get better performance. Something you need to mention is the power draw
Apple is impressive, and PC people shouldn't hate. Apple is helping to raise its own performance as well as pushing PC performance to keep pace. I am a lifelong PC user, and although I can't see myself using a Mac for anything, that doesn't mean I can't admire what they have done and how far they have come with their own silicon.
Apple is hurting our right to repair and tinker with our hardware. The new Macs are a joke; Apple took away our right to upgrade. You can't make your own choice of RAM, drive, etc. in your computer. And is the PC world going to allow Apple's tactics?
@@macgamer1973 It's another example of architects ruining everything.
There is a bit more to it than the base GPU comparison; the built-in media encoders for graphics work, for instance. However, for gaming, even I had to buy a PC after 14 years, because Apple never encouraged developers to write for the OS. For laptops with general-purpose usage, which is pretty much the future, x86 is dead in the water right now: you can't get the performance without the cooling, while the M-series chips can. The sad fact is that x86 technology is 50 years old now, the same as the T-72 tank. You can upgrade the platform and call it a T-95, but the deficiencies in the design for modern usage are still at the core of the structure. The M-series chips are getting more efficient, while the x86 architecture requires massive third-party power and cooling developments to advance.
Yea, it's no contest for performance per watt in laptops. Love my M1 MBA. However the M2 is actually less efficient than the M1. They bumped the clock speeds higher but the performance gain was not enough to offset the increase in power consumption. Next year, M3 will take the performance per watt crown.
"There is a bit more in it than the base GPU comparison, the inbuilt graphic encoders for graphics design for instance." - You must be mistaking media engine performance for GPU performance. The built-in media engines in the Apple silicon are mostly what handle photo/video editing workflows not the GPU so it doesn't count. Also, ARM is almost as old as x86 - it's only 4 years younger. The problem with x86 lies in the ISA itself (the CISC ISA) and is the very reason the RISC ISA (which ARM, powerPC, RISC-V uses) was created in the first place.
x86 is about backwards compatibility, something Apple users just don't have.
Is there a reason you used Wild Life as the GPU benchmark? Maybe getting Time Spy or Fire Strike Ultra running on a Mac is impossible. I ask because at these GPU levels such a light benchmark quickly becomes CPU-bound. Hopefully you ran it at the 8K custom setting.
But you have 192GB of unified RAM, so how much of that can be used as VRAM? It's a crazy amount of VRAM.
You should note that Apple silicon Macs get their performance using MUCH less power. A stat with performance points per watt would make that clear.
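As a sketch of what such a stat could look like: the scores below echo the Cinebench R23 numbers quoted elsewhere in this thread, while the wattages are placeholder assumptions rather than measurements:

# Toy performance-per-watt table; the watt figures are assumed, not benchmarked.
systems = {
    "M2 Ultra (example)": {"score": 28000, "watts": 100},
    "13900K (example)":   {"score": 38000, "watts": 250},
}

for name, s in systems.items():
    print(f"{name}: {s['score'] / s['watts']:.0f} points per watt")

The raw score and the points-per-watt figure tell two different stories, which is exactly why both charts are worth showing.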
PC heads never factor that in. I know that at the end of the day I will get the performance I need out of this Mac Pro, and even more so when the M3 Extreme comes out, at a fraction of the electricity and AC cost needed to keep a PC cool. I said AC to make a point to the PC heads out there. Apple is on the right path and the PC is not. Oh, and did I mention soundproofing for the PC?
Remember, guys: this is an APU, and it is great both as a CPU and as a GPU.
I mean, it's an APU that sits on an absolutely massive die. AMD and Intel APUs are far smaller because they don't put as much on the die. AMD can and has made APUs that perform extremely well; look at the PS5 and the Xbox Series S and X. Both of those are APUs as well and don't need a dGPU.
I think it's only a matter of time before Apple turns their attention to gaming on Apple silicon
Maybe, but it won't be through porting PC games. Only when they can have complete control.
And game studios will spend millions of dollars porting their titles for 0.5% of the computer market... wake up.
That's still the biggest problem: even if it were priced competitively, there are still way too few games to play. The selection of games on Windows is just unmatched.
Yes, and Apple is not doing much to increase the number of games to close that gap.
If games are your priority, then you are not the target group. I'm a Mac user and fan and have to admit to myself that Macs are just not good at 3D rendering and games.
@@kunemann The Macs are powerful enough, but there are barely any games.
@@Solunexxx You're clearly not talking from experience like I am! I have an iMac i9 with an AMD RX 5700 XT... it's a very capable machine, don't get me wrong, but for 3D rendering and path tracing nothing comes near an RTX GPU from Nvidia.
@@kunemann Ah, for stuff like that, yes.
Apple is just like a medium-performance PC. 3060-level performance is in fact the performance of the 8-year-old 1080 Ti, so I don't understand Apple's pricing.
When unplugged and on battery, Apple silicon maintains its performance. That is not the case with mobile Nvidia GPUs.
7:52 “Who did Apple build this for?” I’ve used PC’s and Macs, and there are definitely pros and cons of each to consider. PC can be much cheaper, and on a component level you get much more bang for the buck. But the components add up to a system, which is what we interact with. This is where Apple’s real value comes in: user experience, which gets even better going from a system level to an ecosystem level.
For me, this is what’s so attractive about Apple. Their pricing is shameless and maddening, no doubt. And if all you will do on your computer is gaming, by all means get a PC. But if you do anything else at all, the buttery smooth performance of apple OS beats the lagging performance and clunky design of Windows, with half the noise and half the power needed. It’s the day-to-day performance that Apple gets so right. And for that, I will probably assume the position and overpay once more.
It should be noted that in video editing I've seen the M2 Ultra perform as well as if not faster than a 13900K + 4090 combo, and the $1000 less 60-core version doesn't have a slower media engine which means the export times will be very similar if not exactly the same as the 76-core M2 Ultra, which I don't think can be said for an RTX 4080 vs. 4090. And in GFXBench 4K Aztec Ruins High offscreen the M2 Ultra has the same fps as a 4080 on DX12.
Obviously people should use whatever works best for their workflow (I think video editing is squarely in the target for the Mac Pro), but if a 13900K/4090 combo performs comparably, it's also worth noting that it will probably run you 1/2-1/4 the cost (depending on memory and storage options). Looking on the GFXBench site, it lists 298fps offscreen for the M2 Ultra and 425fps for the 4090, but I don't think that matters much either way (no one's buying a Mac Pro for gaming). For Geekbench 6's GPU Compute, the highest M2 Ultra Metal numbers score around 200K, while it looks like the avg 4090 scores around 330K w/ OpenCL. Of course if you're using anything CUDA-based, non-Nvidia isn't an option anyway. It's also worth noting that 4090 is far from the top-end option. An A6000 Ada has another +10% TFLOPS and for ML, PCIe H100s will double both the memory bandwidth and FP16 again (and another 8X w/ sparsity, or 16X for FP8!). Also, most workstations will support 4X cards, but based on the state of their PyTorch Metal port, I doubt even Apple is using Apple Silicon internally for ML training...
@@lhl A 13900K + 64GB DDR5-6400 + RTX 4090 + high-end motherboard + 2TB Samsung SSD without a case costs $3400, while the Mac Studio with 64GB RAM and 2TB SSD costs $4400-$5400. Hardly a steal, but not 4x the cost. GFXBench does list the M2 Ultra with 330.9fps, and I didn't compare that with the 4090; I compared it with the 4080, as the video itself also did. I didn't mention the GFXBench score for gaming performance, I mentioned it for rasterization performance, which has more uses than just gaming. And yeah, if your application only supports CUDA and nothing else, you're not considering anything other than Nvidia, and in Blender Nvidia absolutely demolishes Apple thanks to using 3-4x the power and having strong hardware ray tracing. Apple has done a good job improving Blender performance on the M2 series, anywhere from 75% to 2x faster or more, despite staying on 5nm, but Nvidia didn't stand still with the 4090 over the 3090. But if on 3nm Apple adopts some form of hardware ray tracing acceleration within their GPU cores, which they're speculated to do, and gives a nice boost to the GPU core count, I think they can double their performance in Blender again easily, close 50 to maybe even 70% of the gap, and be only 40% behind a 400W card. And I'm still talking about the 2-die Ultra variant; a rumored 4-die variant consisting of 4 Max dies will probably also come sooner or later. I think Apple will be squeezing a lot more performance out of their architectures, and this is only the beginning.
@@utubekullanicisi The Mac Pro starts at $7K, so that's the 2X baseline. Clicking through, it looks like +$1.6K for 192GB RAM and +$2.2K for 8TB storage, so you get up to $10.8K. On the PC side, a 192GB DDR5 RAM kit will be +$400, and an 8TB SSD will be +$800 (although you have the option of saving some money w/ 2 x 4TB PCIe 4.0 M.2 SSDs for +$500). I suppose that ends up being only 2.5X or so; the upgrade pricing is more reasonable than on previous Mac Pros (since there's not much scaling on the memory side w/ the new models). The 4X Max "Extreme" was rumored for the M1 but never released, and is rumored now for the M2, but I'm also doubtful that it'll be released. The market for that is just too small. TSMC N3 is fully booked for the A17, and from what I've heard, the yield remains quite bad. That will likely move to N3E, but we're probably going to have to wait for N3P and N3X for higher-perf parts. Apple has a strong team and I don't doubt they can execute well on whatever management wants, but that's mainly power/perf, and I just don't see the company focused on the high end like Nvidia, AMD, or even Intel are. Apple is moving their mobile chips (where they make their money) up, while the others are basically moving their server/enterprise products (where they make *their* money) down.
It should be noted that video editing is a small market compared to what the majority of people use computers for.
I went with the Intel/4090, and for the video editing I do it's much better than the Mac Ultra a friend has. I use DaVinci. Once you start adding GPU-powered effects like noise reduction, magic mask, depth mapping, various blurs, etc., the encoders can't save the Mac's weaker GPU from being crushed by the 4090. If you're just doing simple editing you won't notice any difference whatsoever, and both the Mac and the PC will export at something like 300fps.
Also, the 4080 and 4090 both have the exact same dual video encoders. The 4070 and below only get one.
Great video I'm a Mac! Glad you included visual graphics and examples.
Yep, the target audience for these high-end Macs is essentially businesses, though not entirely, of course, since some buyers are hobbyists who have the money for it.
This sums it up at 8:23. There are definitely PC counterparts to this.
Thank you!
But what about video editing performance?
Not everyone is a YouTuber.
I recently bought my first Mac ever: an M2 Pro Mini with 16GB RAM and a 512GB SSD. I only bought it for editing RAW photos in Adobe Lightroom. The cost was around $1,200. I am thinking that with that amount I could get a faster PC? Lightroom mostly uses CPU performance rather than GPU. What would be an Intel or AMD equivalent of the M2 Pro? I am willing to return the Mac if I can't get the same performance from a PC for less. THANKS 🙏🏻
What about the wattage? We should get a saving for that...
Okay, now achieve those performance levels with 1/6 the wattage and while the laptop isn't on AC power...
Apple's tech is more impressive; ARM is way ahead of x64.
Exactly!
What exactly is impressive if x86 processors are still much faster? Pathetic.
@@uribak9144 Apple silicon is way more power efficient.
@@STeroidsnicca
Performance is more important to me; I don't buy a processor to save electricity.
In addition, from what I've seen and read, M2 Pro processors and above are already less efficient than M1.
I was ready to buy one. I have been a Mac user for over 20 years and am currently focusing on 3D work. I would love to move to the PC world but have no idea what to get that would be faster. Please advise; I would be happy to pay you for your time. I'm looking for something faster than that 24-core / 76-core. Thank you.
Or just wait for the Intel 4 process node!
It depends on your type of 3D work. For example, in Cinema 4D you have the Cinebench R23 benchmark. The Intel i9-13900K and Ryzen 9 7950X both score 37k-38k, while the M2 Ultra is only at 28k. You can get even higher performance in that app if you step up to Threadripper PRO or Intel's new Xeon W-2400/3400 CPUs with much higher core counts. But then you are clearly into workstation territory and workstation money. To get an idea of performance and cost, I would recommend checking out Puget Systems (www.pugetsystems.com/solutions/3d-design-workstations/).
@@ImaMac-PC Thank you for the information and the link. I am looking to get away from Apple at this point, and I am sure the workstations at your link will be a lot faster and a better solution. Thank you.
Thanks again for the link. I have someone currently helping me build a system. Truly appreciated.
The technology in the Mac is vastly superior, which shows up in performance per watt. In time, the ARM SoC will make x86 obsolete.
The M2 Ultra already reaches 295W; you know what will happen to the SoC beyond that... the processor will melt. The SoC approach is very limited and has no future, especially when it comes to upgrades.
@@uribak9144 perhaps! Time will tell 😊
Considering the form factor, the way the Mac Studio Ultra only sips power compared to a PC is amazing. I like both PC and Mac.
What good is a fast Apple GPU if you can't run anything on it?
Latest Mac Pro is still using PCIe Gen 4 :)
How can we contact you about a collaboration?
Considering the M2 Ultra is an entire computer packed onto one die, its performance is actually quite good.
9:27 I really think that you are wrong. Apple Silicon has a lot of potential, so I think it will take time but it will be possible to have a good gaming experience on Mac.
Apple Silicon hardware is capable. The hardware is there (except ray tracing). You can get a good gaming experience on a Mac on a few select titles. Apple Silicon doesn't need any more time. It needs games. To get the games, it needs funding. It needs a commitment by Apple. Why doesn't Apple commit?
@@ImaMac-PC There is a lot to catch up on, so yes, it needs time for games to be developed natively for the Mac. They can't do whatever they want and pay every game developer. It needs to be attractive (profitable) for the developers, which it is becoming, little by little, with a growing community and easier ways to bring their games over from other platforms.
Maybe casual gaming, or "old gen" games
I'm looking for a machine to edit video in DaVinci and DaVinci Fusion, maybe Adobe, plus audio production. The Mac is an audio powerhouse, and these are excellent for video editing. It looks good enough for DaVinci Fusion. And it's quiet and efficient while doing it? Sign me up! If you want a gaming machine, you can buy a PC or a console and get an infinitely better experience. Hell, get a PS5/XSX and a Steam Deck/Ally for less than a Mac.
I agree. I directly asked for a computer that is good for work and play. But maybe we should ask for a computer that is good for primarily play. With fewer features, it may cost as much as a Windows gaming PC and perform the same.
Edit: I'm doing it. I'm requesting them. It's just a request.
I got an RTX 4080-based PC and an M1 MBA :D. Really happy with this combo :D.
If only integrated graphics on other chips were comparable to Apple Silicon
Apples to oranges... no Raytracing in M2
That's a good point!
Except that making their Game Porting Toolkit available for free to developers to see how easy it would be to make DirectX 12 games run on the Mac isn’t nothing. Why does Apple have to buy a studio to prove they’re serious about gaming?
Time and money taken away from porting the game over to Mac.
There is also the fact that ZeniMax came out and said they won't support Apple SoCs when it comes to Elder Scrolls Online.
There is also the fact that more people use PC, Xbox, PlayStation, and Nintendo.
So yeah, Apple has to prove it's worth the time and cost.
This is how Microsoft was allowed to buy up Bungie. Steve Jobs was pissed off. Apple bought Logic and Beats, so why not a game studio?
Performance per watt would be interesting.
Someone forgot about the game design WebKit.
This video described it perfectly, I use my Apple electronics for studio work, and I buy Alienware products for gaming.
Double what the level of performance justifies; that tracks.
Buy the binned Ultra Mac Pro and stuff it with a 4090 Ti. With that, you could live stream with your free 60-core GPU while doing the real exporting.
Comparisons I was looking for
What about 1,000 hours of use in terms of power consumption and the price of that?
Half the price is the hardware; the other half is the 🍎 logo.
The Apple Tax gets bigger!
Bro this video is perfect, exactly what I needed. Thanks a lot❤
Glad it helped!
"Mac Studio" .... it's not for gaming. It's designed for professionals to do content creation for a living and those people can make the money back in a matter of weeks or a few months.
Thanks for the review.
Hello. I recently commented on another of your videos about how various GPUs perform with AI noise reduction (Adobe). The M2 Max gets less than half, closer to 1/3, of the performance of an RTX 4080, and even less compared to an RX 7900 XT. I don't know about gaming, but for compute they're really not performant chips at all. Actually, we have an RX 6700 XT in the lab which gets slightly better AI noise reduction performance than the M2 Max. They're really nothing special for our purposes.
A bit of an unfair comparison, Ian. As it stands (although I'm not sure about the latest version of Adobe), the Adobe apps don't use the neural engine but rather the CPU for AI noise reduction (as does DxO), due to a bug in the later versions of macOS. What I can tell you is that the AI NR in DxO, when it worked with the neural engine (pre-Ventura), was MUCH faster on Apple Silicon. Apple are terrible at fixing bugs, though; they've known about this one for months.
@@FantabMedia Cool. Let's talk when it's working. If it is.
The sad reality is that game developers are like old web developers: they have no clue, or refuse to do anything efficiently. And why is everything about building a PC based on gaming?
The average person who buys a Mac is just sick of being screwed by substandard PC builds, I think. My one PC is nearly unusable because every time Windows rolls out another update I can't use streaming services like YouTube smoothly any more. That's on an HP desktop.
For average computer users, I have converted many a PC user to entry-level Macs (that were on sale) for that very reason... and it has greatly helped me, as I no longer get calls for tech support!
I buy Mac products for the look and design of the case, but yeah, no, Apple silicon sucks.
I enjoy this kind of video, thanks. I'd be interested in a comparison with the mobile GPU scores from this gen, as they are much lower wattage and generally more directly comparable.
Strange to see the 13700K compared with the M2 Ultra, which runs laps around my 13900K for code compilation. Each specific use case will see different pros and cons, I guess.
Thank you and Great suggestion! And yes, for code compilation, Apple Silicon is extremely good at that specific task.
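For anyone who wants to reproduce that comparison, timing a clean parallel build on each machine is usually enough; a minimal sketch assuming a Makefile-based project in the current directory and that make is on the PATH:

import os
import subprocess
import time

# Time a clean, fully parallel build; swap in your own build command as needed.
jobs = os.cpu_count() or 1
subprocess.run(["make", "clean"], check=False)

start = time.perf_counter()
subprocess.run(["make", f"-j{jobs}"], check=True)
print(f"Build finished in {time.perf_counter() - start:.1f} s using {jobs} jobs")

Run the same script on the Mac and the PC against the same source tree and the wall-clock numbers speak for themselves.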
Great analysis
Extremely confusing and non-linear comparison. M2 direct comparisons of Pro / Max / Ultra versus the highest-end Intel, AMD, and NVIDIA would be more relevant than meandering stock footage. Even price comparisons are convoluted, factoring a strange mix of legacy systems with and without GPUs. Pick a strategy - affordable mid-tier and/or top-tier system compute, graphics, encoding, neural, usage, and competency per price point.
I think this comparison is not taking into account actual performance numbers in the applications Mac users would actually be using these machines for. If you consider software optimization, accelerator hardware for media such as ProRes, and efficiency, you aren't getting close to the Mac's real performance. Windows hardware should be praised for the massive increase in value, but they are throwing wattage at the problem to bridge the gap.
They are living with the burden of being backward compatible too. I can take a new NVIDIA GPU and new CPU and run a game from 15yrs ago. Macs can't do that. Apple broke free from that legacy compatibility.
@@ImaMac-PC Playing my GOG catalog is really easy using Wineskin
And what about the PC? Nvidia CUDA-accelerated workflows like 3D modelling with Blender OptiX and hardware-accelerated ray tracing, or Autodesk Maya with Arnold or V-Ray, or even AI workflows (like TensorFlow) powered by Nvidia Tensor Cores. You guys just keep praising Apple and lowballing the PC, when Macs are only good at a few select workflows and perform horribly at others. Those doing any 3D CAD/CAM work won't even look twice at a Mac.
I will say you are completely right in those areas, however I do think Apple is going to have some major agency now to start addressing the needs of the people in those demographics. I think PC hardware has always been good, but Windows/Software has always been the limiting factor. However, I don't think anyone can really deny the sheer practicality that Apple has introduced with having hardware that is power efficient and powerful at the same time. I think their route is going to prove very useful for closing the performance gap.
No, it's not. People in the comments talk a lot about "power efficiency" instead of speed. Buy an i9 and a 4090 if you need raw power or you're into gaming.
My laptop has an 11th-gen i9 and a 3080 (overclocked by 120MHz): 4K at 160fps :)
Uhm... I 'liked' because you finished with a conclusion to your argument, which I liked; NOT because you told me to 'like & subscribe' the moment I clicked 🤬
Performance per watt: Apple wins hands down.
The performance of their tablet chips is nowhere near Intel, AMD, and Nvidia; your talk is useless.
Nice job on performance per watt; not such a nice job on everything else.
You've done something wrong here.
A 7900 XTX is not 42% faster than a 7900 XT.
Not even close.
Edit: The average FPS in Wild Life Extreme for a 7900 XT is 281fps... so why did you get 240fps, which actually puts it in the bottom 5% of all 7900 XT results?
I don't understand your claims. I never showed 42% faster. I also never showed the XT at 240fps. How did you come to your conclusions? Please clarify.
Which one saves more on the electricity bill? That will be my first choice in today's recession.
The Macs
I need a completely silent device with at least 160GB of VRAM. What does the PC have to offer?
You can run a PC as quietly as an Apple Silicon Mac; however, the performance would be greatly reduced.
@@ImaMac-PC Liquid coolers exist
What about power consumption?
They are desktops.
Imagine a starship that goes to another planet.
Which chips will be used for this, Apple M chips or Intel?
The starship will need to save tons of energy, but some tasks will demand lots of computational power, and heat has to be controlled so there are no accidents.
Which one will you pick?
Apple M chips or Intel?
'Stay safe' from what?
From Apple.
As an Intel Mac gamer, I couldn't agree with you more. Apple will never care about gaming on the Mac. And I love how Apple threw Intel Macs under the bus: they didn't give Nvidia their developer licence back so they could keep supporting Intel Macs, and they stopped the 5,1 from installing Boot Camp on macOS Mojave. However, some are still making games for Intel Macs, and my Intel Mac still runs the latest games without problems. I'm playing Evil West with no issues.
You are making a sophomoric comparison. I've built PC gaming machines for years; the last one was a Threadripper 5975WX (24 cores), 64GB RAM, an Nvidia 4080, and SSDs, at a total cost of about $4K. Big case, lots of fan noise. I'd much rather have the M2 Ultra: a tiny case that fits under a monitor, no fan noise when maxed out, runs cool, the better option, and I don't game anyway, so that's irrelevant. There's more to a choice than just specs, which are not that different anyway; most people would not even notice the differences.
If not gaming, then the Mac Mini is an even smaller and more efficient option for the desktop.
@ImaMac-PC Not really. I have a Mac mini M2 Pro (12-core, 32GB, 1TB); it gets hot and sits around 80°C to 100°C when I'm using it for my workflow. It's still quiet with the fan at 5K RPM, but it has a lot of jank for my simulations. The Studio Max would at least run cool for almost the same price as the mini Pro, and the 30-core GPU vs. 19 in the mini Pro would help with the jank. The base mini would not work well for me at all.
I have a Mac mini with an M1 chip and 16GB of RAM. At work I have a high-end i7 laptop with 32GB of RAM. My Mac mini opens all apps and websites instantaneously. My work laptop always lags for a few milliseconds, or seconds, before opening any application. So I can completely and absolutely confirm your video is nonsense. You are comparing apples to oranges, my friend. I am not an Apple fanboy, only a normal user who uses his computers for work and for school. Performance-wise, Apple is way ahead of anyone. And this $1,500 Mac mini that I got two years ago is the best computer I have ever owned. Better than any gaming PC I built in the past.
A new Mac should always perform better than an old Mac. This video addresses new Mac vs. new PC. You should try a new PC to compare opening apps and websites.
And don't forget that the SoC also has dedicated cores for video encoding, a neural engine, etc. So when editing video it is not just the GPU and CPU working; it has dedicated hardware for these kinds of jobs, which makes them very fast. How fast? Well, never "instantaneous", but fast enough that you won't have time to make yourself an "instant" coffee.
Don't forget that Nvidia graphics cards also have dedicated video encoders/decoders, far more AI (tensor) cores than the Mac's neural engine, and dedicated ray tracing cores that can be used as well. Also, DaVinci and Premiere will use the Intel integrated GPU for additional built-in video encoders/decoders, in conjunction with the Nvidia GPU, if needed. THEY BOTH have dedicated encoders/decoders. That is just a Mac sales pitch to confuse you.
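For what it's worth, both vendors' hardware encoders are easy to exercise yourself through ffmpeg; a rough sketch assuming an ffmpeg build that includes the VideoToolbox and NVENC encoders, with a placeholder input file and bitrate:

import platform
import subprocess

# Pick a hardware H.265 encoder: VideoToolbox on macOS, NVENC on a machine with an
# Nvidia GPU. Assumes ffmpeg was built with the corresponding encoder enabled.
encoder = "hevc_videotoolbox" if platform.system() == "Darwin" else "hevc_nvenc"

subprocess.run([
    "ffmpeg", "-y",
    "-i", "input.mov",   # placeholder source clip
    "-c:v", encoder,
    "-b:v", "20M",       # placeholder target bitrate
    "output_hw.mp4",
], check=True)
print(f"Encoded with {encoder}")

Exporting the same clip both ways is a quick way to see how close the two hardware encoders really are for a given codec and bitrate.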
Global Gaming Content revenue by company 2021 :
Tencent 32.5 Billion
Apple 14.8 Billion
Alphabet 12.4 Billion
Sony 10.2 Billion
NetEase 9.7 Billion
Activision Blizzard 8.8 Billion
Nintendo 8.1 Billion
Electronic Arts 6.5 Billion
Sea ltd 4.3 Billion
I am confused by your comments about Apple's commitment to gaming. They have simply chosen a sector that works for them, to the tune of about $14.8 billion, and it has grown significantly since 2021. I just looked this up... If you look at the PC video card makers, their focus, where their investment goes, it does not look like gaming is their business either (it's machine learning and AI). And then there are all the online gaming platforms, like Steam... Gaming is clearly not the GPU/PC business one could be led to believe it is, or is it?
Thanks for sharing the data. Yes, Apple does make money from gaming, as you show. However, that is overwhelmingly from iOS devices. I was focused on AAA gaming on a desktop computer, titles like Death Stranding and Resident Evil Village. Apple will show off an example or two at every event, but they are not making any commitment to bring AAA gaming to the Mac. I would love to play my Steam library on a Mac. To play Resident Evil Village, I can't use my Steam version; I have to purchase it again through the Apple App Store. I'm not buying the game separately for each platform.
Also, NVIDIA and AMD are chasing A.I. as they see a larger revenue source from businesses and it is potentially more lucrative than gaming.
Speed does not matter; quality, however, does, and the Mac will always be better. Even if it is not always faster, it is better, and even you should know that.
But Apple's chip is half the size and wattage of everything in these comparisons.
Apple products are fine and dandy for video and photo editing, but that can be done on x86 machines as well. The big money is in engineering work, i.e. CAD. Good luck running Dassault or Siemens CAD software, i.e. the big boys in the industry, on an ARM Apple machine.
I had a CAD refresher class not long ago. A young woman asked if her Apple could run the software; everybody started laughing and said, yeah, if you want to play with tablets then Autodesk Fusion is a nice toy, but if you want to work in this industry you should throw your Apple toy in the trash ;P
That's funny! You're right about the photo and video editing, and that seems to be the focus for Apple going forward.
@@ImaMac-PC Is it possible for you to incorporate an Apple M-chip CPU/GPU hierarchy chart into future videos, pairing each Apple M chip with its desktop CPU/GPU equivalent, to make it easier for Mac owners to understand what kind of relative performance they'd be getting or already have? Also, would a base-model M2 Max Mac Studio be decent enough for 3D rendering/modeling/animation for game development and CGI work on the Mac platform? Thank you!
It has nothing to do with Apple hardware. There are good CAD and BIM programs for the Mac. It is just that Autodesk primarily targets Windows, and their code base is heavily intertwined with Windows. If the software does not exist for a hardware platform, it does not matter how good that platform is.
@@dsblue1977 Autodesk is on the Mac as well. The heavyweights of engineering software aren't, though, and it does not matter whether that is down to the Apple hardware or not. If it doesn't work, it doesn't work, regardless of the reason; that is how companies reason.
You're not running Cyberpunk on your M2, GRANDPA... end of discussion.
No, I run it on a 2010 Mac Pro under Windows 10.
Well made points, thx👍
You bet
If esports titles could get the same performance on an M2 Ultra, or even an upcoming M3 Ultra with hardware ray tracing cores on par with or surpassing a 4090/5090 😂, while maintaining the same power consumption as the M2 Ultra (SoC, SiP), then that, combined with a LOT MORE AAA titles like CP77, Gotham Knights, Hogwarts, the RE4 remake, SF6, BF2042, DQ11S, FF16, FN on UE5.2+, Apex, RL, R6 Siege, etc., would help save esports teams a ton on their electricity bills 🤠🫰🏻
So much excitement in your comment, but you know it's a joke. 😂
No one in esports titles is using ray tracing, nor do they care.
@@evacody1249 But a high refresh rate on a high-end GPU, like 360+ FPS, consumes tons of power, even more than a 4K HDR TV. This is why Apple made its own chip: to save power and energy. Don't underestimate its performance per watt and each tiny improvement; for every Apple silicon Mac sold, if hundreds of thousands of them are running 24/7, it would save a lot on electricity bills.
It is not very fair to compare apples with pears. It's not just graphics performance that matters. For example, access to memory is many times more efficient with Apple. The CPU performance is also very different. Separately, the power consumption is simply unmatched.
Why not buy a PC and spend $3,000 on solar panels? That way you will save more, because you can use that power for everything in your house.
@@jakejoyride I assume you're joking. The investment in actually usable solar panels costs much more.