The Most GPU Power in ANY Mac - What can the W6900X Do?!
- Published 16 Jul 2024
- AMD's W6900X is the MOST powerful graphics card the Mac has ever seen, so we put it to the test to find out if this expensive $6,000 GPU is actually worth it. Let's find out!
Can't wait for the M1Z? Here's the BEST value M1 Mac ➡ geni.us/yG9Zb
Support us by checking out our Merch! ➡ teespring.com/stores/max-tech...
Max tech wallpapers ➡ bit.ly/2WNc6Qw
Best deals on M1 Macs on Amazon ⬇️
M1 MacBook Air ($950 SALE) ➡ geni.us/1mJ41T
M1 MacBook Pro ➡ geni.us/7lb3Gn
M1 Mac Mini 2020 ➡ geni.us/Atm1
M1 iPad Pro ➡ geni.us/UnPQHJ
M1 24" iMac ➡ geni.us/UFPMx
In this video, we test out the W6900X graphics card to see how it performs and compares to other graphics cards like the W6800X which we already have.
If you enjoyed this video, Tap Like & Subscribe for more videos like this one!
Timestamps ⬇️
Radeon Pro W6800X vs W6900X - 00:00
Geekbench 5 Metal GPU - 1:21
Basemark GPU Test - 2:01
Blender 3D Rendering - 3:26
4K HEVC Stabilization - 4:36
4K H.264 Video Editing - 5:47
ProRes RAW Video Editing - 6:32
10bit R5 Video Editing - 9:31
C200 RAW Video Editing - 10:41
8K R3D RAW Video Editing - 11:06
Is the W6900X worth it? - 11:53
Buy one of our NEW T-Shirts to help support us! ➡ teespring.com/stores/max-tech...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Shop on Amazon ➡ geni.us/wB2mWqd
Shop on B&H ➡ bhpho.to/2kfoI34
Shop on Adorama ➡ bit.ly/2R7qezq
10% off unlimited yearly music licensing on Soundstripe (what I use for all my videos) use coupon code "Max" here: soundstripe.grsm.io/e/6lv
Shot with (Amazon) ➡ geni.us/XE0r
Lens (B&H) ➡ bhpho.to/2DZerxL
Mic (Amazon)➡ geni.us/83CN3V5
If you enjoy our content please consider supporting us on Patreon. Even $2 a month helps us make more and better content for you!
/ maxyuryev
~-~~-~~~-~~-~
PRIVACY POLICY and FULL DISCLOSURE:
°Max Tech is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for sites to earn advertising fees by advertising and linking to amazon.com
°Max Tech is a participant in the B&H Photo Video affiliate program that provides an advertising commission if you purchase through our links.
°If you purchase something through our affiliate links, we will get a small commission at no extra cost to you. This makes it possible for us to make more videos. Thank you!
°We DO NOT collect, store, use, or share any data about you.
~-~~-~~~-~~-~ - Science & Technology
It would be cool to bring in a guest that does animation, 3D modeling, CAD, etc to run some test of their projects compared to what they currently use.
I did a lot of that on my channel already with the W6800X DUO
@@morgonaut Thanks, reminded me I did not have the bell on for you
@@tomphillips8739 ...despite the fact that I say "enable All notifications" in every video :(
Macs don't have such software; all Apple products are designed in Windows software (Inventor, SolidWorks, PTC Creo)
@@FinoMarg Rahuj chill out and don't tell lies :)))
I’m thinking you’re gonna return the mislabeled 5900x with the 5900x packaging to Apple 😂😂😂
Totally!
And just tell them like “you owe me $6000” because if they don’t plug it in and find out otherwise that’s your free money!
Shh!
Nah, I think Apple tests returned products and they will see the actual card model.
@@normanpang9376 I mean, who knows! Apple misprinted a 5800 themselves; they could make a similar mistake again!
Great video! My tests revealed very similar results, in real world use and exporting, there just isn't much of a difference between these GPUs. The W5700x is a great value for the performance. The sweet spot is likely the W6800x Duo though, it will outperform the W6900x in apps that use 2 gpus, like FCP and Resolve, plus it's cheaper!
Where these new GPUs really shine is something like Octane X and 3D work, they have a much bigger difference. With 4 W6800x I was able to get the highest score on a Mac ever for Octane X, much faster than the last gen Vega ii duos.
But for video work, even the Vega ii duo is still great at the discounted price, the newer GPUs don't offer earth shattering performance as you would have thought.
In a lot of cases the CPU will be a bottleneck too - I have a 28 core and it fills up regularly, even with R3d Raw which does most of the work on the GPUs itself, it still uses it a lot.
max tech: has *2* new insane graphics cards
me: uses integrated graphics
Same for me, suffering with Intel HD 3000 graphics 😂
Me too with uhd 620
I've been watching you all for years. You all deserve 1,000,000 subs. Always QUALITY content.
You guys are amazing and thank you so much for bringing us such amazing content.
Max, thx so much for this. On my 2019 MacPro with the Vega II and 16 cores, my 10bit 422 R5 footage actually plays back smoothly. Scrubbing has a slight delay here and there, but it's pretty good. I think those extra CPU cores really help. I wonder how the 24 or 28 core CPUs would do. Looks like the video cards make very little difference for that R5 footage.
Actually, I was also going to comment about that. I built a Ryzentosh with a Ryzen 9 5950X and an RX 590, running the latest 11.5.2 Big Sur. Tested R5 4K and 8K slow-mo footage. No lag at all.
I added 3 layers with text and transitions, no issue at all.
My RX 590 was utilizing about 60 to 80%, and I am quite happy with the performance.
@@somade8424 yeah. In a couple years when they are cheaper I will pop in a 24 core Xeon in the MacPro. Should make it even better.
@@RockyColaFizz great idea 👌
I have a 24-core one with a Vega II Duo, and I cannot play back 8K RAW smoothly, but with 422 I can.
@@pacosstudio412 Great to know. Thx!
The first time stamp says “W6800x vs W6800x” might want to fix that🙂🙂🙂
Thanks doing those Redcode tests, super helpful for my work, and knowing what matters in a workstation!
Test them out in Resolve, a 4K and an 8K timeline, both with R3D RAW files at different resolutions, with a regular CC workflow, about 4 nodes, and then add denoise.
+1 on this (real world scenario for a Red user in Davinci)
Cant wait until Apple releases an Apple Silicon Mac Pro. That thing will be an absolute BEAST
another great, realistic and worth seeing video, keep it up, I am watching and learning
Hello, is the Radeon Pro W6900X, connected via a Razer Core X eGPU, compatible with the Mac mini 2018?
for that money, I would get a razer blade 14 with a RTX 3070 and a nice 5k monitor and have a nice gaming setup instead of buying a single GPU and spending more money for the rest of the mac pro
This isn’t a gaming GPU.
Seems like you are not at the age of doing some serious scientific work … Those professional GPUs are completely different cards than gaming cards …
@@miniroll32 well with an rtx 3070 I could do editing and whatever that gpu can
@@ThePianist51 an RTX 3070 can also do the same thing
@@rohitreddy7298 Not really comparable, this beast has 32gb of VRAM. Something you'd definitely need for huge scenes in CAD software an RTX 3070 just wouldn't have sufficient memory for that. Although the RTX 3090 could be a decent contender if you don't need professional drivers.
Hi. Can the same device be connected to a MacBook Air M1 for eGPU use with a Plugable 3?
Fantastic video. Really interested in the W5700X performance and the Mac Mini M1 comparison. It'd be nice to add maybe some benchmark with Adobe Premiere / Photoshop / Davinci, to see if these cards represent any advantage. Keep up the great work!
Hi Max PLEASE PLEASE do a video comparing Mac mini M1 vs 2019 Mac Pro exporting test in Adobe Premier.
Awesome video! Keen for the w5700x and Mac mini video. Could you do an after effects test? I know AE isn’t great with utilizing resources but it’s still be great to see. Personally I’d love to see a primatte green screen key test 😊
how do you get those system usage icons on your task bar??
Love the painstaking efforts you put into all your videos. Very much appreciated. Have you tried real time 3D graphics with this card? I wonder how that stacks up.
they did a blender 3d render which is representative of all 3d workflows
@@KaelDenna Yes, true. I guess I meant to say real time viewport engines like Redshift, Octane, etc. Stuff that really takes advantage of the card as opposed to standard benchmarks that utilize mainly the CPUs.
@@BLT-70s i assumed the blender render test was cycles on GPU compute? i think it is. otherwise why did it improve by 25%
@@KaelDenna Yea, true. Didn't think about that.
@@BLT-70s OK, so I did some calculations based on the relative speed of this card and, using opendata.blender.org, I estimated that it's about as fast as an RTX 3080, perhaps a touch slower.
I wonder which microphone you use. I realized I could hear footsteps from the apartment above you. Thought I was tripping but sure enough.
what software is he using to show the CPU and GPU processes and loads?
please do some gaming tests under bootcamp, both W6800x and W6900x?
It might be quite interesting to see how the latest Resolve 17.3 performs on the M1 released today. According to BMD it's up to 3x faster on the M1 architecture and also includes faster H265 encoding for M1 and new hardware decoding support for AVC Intra.
Do you have long videos like ~2hr interview or worship service?
Stress test the gpu with longer clips maybe there will be a difference.
Picture in picture 4k?
Right at 9:08 the top right picture looks like Coeur D Alene Idaho. Used to live there, and if I’m not mistaken 509 area code is east washington (Spokane area) which is an hour or less drive to
I'd be curious if the drivers in Monterey Beta are better. That may be where these cards really open up.
What's the app you're using at 5:07?
How about the W6800X Duo? The data on the website seems better than the W6900X, but it's cheaper. What's the actual situation?
W6900X can be tested for Shaders , particles , hair simulations , noise reduction, de flicker plugins , multiple footage stacked together 4k , 6k ,8k , Most number of effects in a footage , Gaming @ max specs etc etc.. Nice video & Thanks :) So keenly waiting for the comparison with M1 Mac Mini :) I think W6900X is for people who do professional job & won't update their machines for at the least six or seven years :)
Let’s appreciate the amount of effort he puts into every video 🙂
And money lol
@@amanagarwal1939 I was just gonna say that lol
For the resources, it doesn't give an accurate utilization though, does it? At least on Windows, when encoding video on my Vega 64, GPU utilization is around 7-10%, but the GPU encode utilization displayed by the Windows task manager says 100%.
Awesome to see you guys knowing about the newer GPUs and testing them too! Not even LTT or Dave2D have acknowledged these new GPUs yet :D
Noise reduction in Davinci Resolve Studio 17 would be nice!
what about the 6k xdr resolution?
For editing Red Komodo footage, would it be better to get the 6900 or 6800 duo if the extra cost is not a concern? Thanks in advance!
What a shame you didn't test it in After Effects where GPU really matters. You seem to forget about that all the time :(
The GPU doesn’t matter very much in AE, only for certain effects / operations.
@@eriklindahl7376 Are you sure about that? Have you used it on an integrated graphics vs like the NVIDIA one? I haven't done actual tests but I am pretty sure it matters. However as you said maybe it's the effects which is kinda the point of After Effects...
After Effects is mostly CPU.
@@MaxTechOfficial Maybe I had a weaker CPU I guess
@@MaxTechOfficial For AE functions yes, I was thinking more about GPU-accelerated effects. I maybe should've re-phrased my post.
Awesome Bru!
Blender real-time playback would be nice to see. I'm an animator and I have the 5700 and it does great, but I'm considering the 6800 to get better real-time playback (more assets in shot) and render times. Feels like the Mac Pro was built for my kind of workflow, but not really for video editing.
Glad to hear users of Blender on Mac, hopefully it will help steer a bit more of the efforts towards Metal Cycles and Co (where pertinent that is).
I’m no animator, but I seem to read that actually the M1 nails it on this regard? The mostly single threaded nature of playing back a scene preview makes it a prime candidate for that chip. Any insight appreciated.
@@alejmc I’m not sure during playback, but when rendering with eevee I use little cpu and at 4K renders it can go up to 95%gpu. 1080 runs around 50%gpu. That’s on the 5700
Anxiously waiting on your W5700X vs W6900x comparison video. I currently have two W5700X cards in my setup. Do you think it's worth trading those cards for a single W6900X? or a single W6800X?
I am using a Linux desktop primarily for programming, while also testing macOS in a VM. I enjoy watching YouTube videos about M1 MacBooks and would love to get my hands on a MacBook 14 :). Unix, an optimized OS, a boosted ARM chip, great battery; an unrivaled laptop for programming.
Max Tech, is it possible to replace the W6800X chip with 6900XT chip?
Would you test ethash in Windows? I wonder what RDNA can do without memory bandwidth bottleneck.
Hey Max, can u guys plz do a comparison between the Dell XPS 13 & Razer Book 13.
would like to see this test in Resolve .
How many MH/s does it give on ETH?
Would love to see comparisons with 2x w5700x which may provide a great cost to performance ratio. 👍🏻
Would you consider doing a video about installing an off the shelf 68 or 6900XT in a Mac Pro? I know people have had success with it but I find it all very confusing.
Man why didn’t you do the w6800x duo? 2x dual gpus totalling four gpus.
i really wanna know the w6800x duo vs W6900x vs 6900xt, thanks a lot
I wonder if the W6900X is much more capable of mining. 6900XT is quite restrained by memory bandwidth.
Jeez I thought the 3090 with scalper pricing was expensive
Bro can anyone tell me the difference between micro and mini leds displays?
Mini LED is just much bigger LEDs compared to Micro LED. The latter however is currently so small that it needs manual placement of the LEDs to get a functional panel (Samsungs "the wall" for example) so they're not really massproduceable yet & therefore cost a small fortune.
Benefit of smaller LEDs is that you could at some point have every LED behave like its own pixel if you get them small enough, meaning the full benefit of OLED with none of the downsides (theoretically though, practically it might not actually be all roses & moonshine).
Compare it to RTX3090 for productivity. Please.
What about w6800x duo?
Sheeeesh, I want that monitor soooo bad... I'm watching on a 5K iMac, and somehow in the frames where you filmed the monitor screen, the image looks so much sharper!! I just wish Apple reduced the price because I can't stand paying that premium, unless it's a write-off lol
Nothing at all wrong with your video, great job. My problem with hardware like this, just as with your previous video: at 5K and 6K on a graphics card, we need to see more than 50-60 percent usage on the GPU side. Almost as a justification.
Does he buy these and return them after review?
Red Raw 8K in Resolve and FCPX with NeatVideo and grade exported to UHD ProRes 4444. Make sure you debayer at full quality.
MacPro 2019 Vega II
MacPro 2019 W6800X
MacPro 2019 W6900X
MacMini M1
I think the latter might have… issues… :)
You can also test the above in Resolve with the W6800 + W6900 together.
You are really tripling down on that Mac Pro. Any takers yet?
I'd love to see the comparison video to the w5700x. Also, in hopes apple will create a replacement M1 chip for their 2019 Mac Pro model. ;)
It would be nice to see you try to edit 12K and 6K BRAW with highest quality settings for each resolution.
And do everything in the studio version of resolve
Boy, if your results are the same, the bottleneck lies somewhere else smh 🤦♂️
@@johalun But then its not testing the card is it?
Try to export to x265 with the very slow preset
I presume, CAD type of workflows would take advantage of this sort of graphics card, especially if it made a difference in blender
Bah, too bad you didn't display the M1-series results for these projects on the benchmark result sheets... :)
Now you need to try the duo w6800x
What is the eth mining performance
Double the price for 32% more performance? How does it compare to two 6800X MPX modules? Mac Pro can use two modules, right?
Which clip did you play that runs smoothly on the M1 Mac but not on the expensive card? You never explained why. I am very interested in that clip. (a girl with a cycle)
Max bringing the Emeril Lagasse - BAM!!
Could you do 6k gaming on it
Doing that right now.
But can it play Crysis?
I am so so excited to see the next gen mac pro with apple silicon with 64/128 cores.....
3D GPU rendering comparison video, yes please!
I did 3D tests of W6800X DUO on my YT channel yesterday
What is limiting it if the cpu and gpu aren’t being used at 100%
Storage, ram, data transfer......
It’s not that unbelievable that a GPU is idle during encoding/decoding. The system has native encoders for those files, so it uses those encoders and is limited by them, instead of brute force.
Temps?
Can it run cyberpunk?
I believe if you were to go completely ballistic in Final Cut, that's when the 6900 would start to shine.
Crazy that by the end the $1000 cards feel cheap by comparison. Several years ago I would not entertain anything over $500 maybe.
Great video.
Honestly, I'm not buying any computer until the M1X comes out.
Crazy how fast single-thread speed is on the M1 and how it can beat every single CPU on the market for professionals in many, many fields.
I was thinking like that too but I have a plugin heavy after effects workflow so I'm getting the last Intel imac.
@@_nebulousthoughts as a video editor and cgi artist i just press Render / Export. and go to sleep XD
i'm too poor to afford 2 computers so i'mma abuse the shit out of my 2016 macbook pro and pray to all the Gods that it survives until i can buy the m1x iMac
At these specs, its the code limiting things rather than the hardware. Definition of diminishing returns.
It's like cheap batteries from Ali, 8000Ah on the label but much less inside, fortunately they were not designed in CA )
Hurray for this test and benchmark of the $6000., graphics card. Very exciting and interesting as I can never afford this graphics card. I definitely want to see your comparison of the W6900X compared to the Apple silicon M1 system on a chip.
It's going to be pointless for real. The M1 GPU is comparable to an Nvidia 1060, I believe. I don't know why people think it's going to stand a chance against any higher-end GPU. It's beating every CPU's built-in graphics, but it can't hang with the 20-series Nvidias.
You need a MUCH faster disk, I suspect.
But can it game tho?
would be nice to see Neat Video performance :)
I wanted to ask why the graphics card is 6000 dollars, and whether normal cards can work with the Mac Pro.
Please try in Davinci! They are just much better with high end GPUs, fcpx is more friendly to MacBooks.
“Not that much better score than the MacPro launched with…”
40% higher seems like “much higher”?
Is it more powerful than a normal 6900XT? It has less RAM but more compute units.
About the same in terms of performance, but double the VRAM.
Put them both in and lets seeeeee 🙈!!!!!!!!
You MUST do the 6800 Duo card.
Can you mine with these cards?
the size of the GPU is proportional to the price?????
Imagine you get a $1,200 6900 XT with double the memory and pay $6k+ for it... wow, Apple heads...
I look forward to the M1 Mac Mini comparison video. If for no other reason than to see how even much more a poor value the Mac Pro is. 😂 But in all seriousness, the idea that comparing Apple's high-end desktop with a stupid expensive graphics card to their lowest end desktop is not only plausible but reasonable is remarkable.
The future is extremely bright for even higher end Apple Silicon machines.
The Mac Mini's Geekbench 5 Compute is 170.000. But then again, just the graphics card costs about ten times as much, and it is not giving you ten times more power.
Which is better: MacBook Air M1 256GB or MacBook Air i5 512GB?
That is a thick gpu
Please run some Unity game dev rendering