432k+ GPUs | Inside Microsoft's latest AI supercomputer with Mark Russinovich

  • Published 5 Nov 2024

COMMENTS • 77

  • @alexpearson415
    @alexpearson415 5 months ago +28

    This is my favorite video that Microsoft makes. So cool

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +3

      Thank you so much! Appreciate your taking the time to comment and glad you liked it.

  • @ThaLiquidEdit
    @ThaLiquidEdit 5 months ago +21

    Mark Russinovich is a legend!

  • @blitzio
    @blitzio 5 months ago +15

    Awesome to see this, especially the hardware, networking and data center breakdown and info.

  • @ds920
    @ds920 5 months ago +13

    That’s why I chose to buy their stock; they know what it means to actually work. It was a long way for me from the early ’90s, when I, a hardcore Unix user, described Windows only with the words “must die”, to spending my spare money on their stock and finally admitting what this company has really been doing all this time. Thank you guys for keeping that spirit!

    • @Gersberms
      @Gersberms 4 months ago

      They do awesome work, VS Code is basically the best program I've ever used. It's just such a shame Windows 11 is garbage all over again. I just moved to Ubuntu at home and couldn't be happier with it.

  • @Breaking_Bold
    @Breaking_Bold 5 months ago +3

    Very, very informative… sent it to my kid in college to watch and keep watching until they understand every word!!!

  • @BigEightiesNewWave
    @BigEightiesNewWave 5 months ago +9

    Man, Mark is God-status at Microsoft

  • @Daniel-es9dq
    @Daniel-es9dq 5 months ago +3

    I’m so glad people much smarter than I are working on this.

  • @user-b39z1
    @user-b39z1 5 months ago +3

    With Great Power comes Great Capabilities...
    Microsoft 📲💻🖥🎮

  • @IshaqIbrahim3
    @IshaqIbrahim3 5 months ago +4

    Timeline: 9:00 What happens to the heat energy extracted during cooling? Does it get used to generate electricity to power other devices, or to supply energy to some of the cooling fans, or is it not used for anything?

    • @jamieknight326
      @jamieknight326 4 months ago

      It’s not reused. The heat is distributed across millions of litres of water and can’t be concentrated back into a single spot. Sadly we can’t take 2 litres of 50 °C water and turn it into 1 litre of 100 °C water…
      The water is heated, but not enough to be useful for much beyond heating offices or nearby buildings.
      I’m curious whether someone will use the heat for some sort of low-energy industrial process like drying cement.
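
      The mixing argument in the reply above can be sanity-checked numerically. A minimal sketch (volumes and temperatures are illustrative):

```python
def mix_water(v1_l, t1_c, v2_l, t2_c):
    """Final temperature when two volumes of water are mixed.

    Heat is conserved, so the result is the volume-weighted average;
    mixing can never exceed the hotter input's temperature.
    """
    return (v1_l * t1_c + v2_l * t2_c) / (v1_l + v2_l)

# Two litres of 50 °C water mixed together stay at 50 °C - there is
# no way to "concentrate" them into 1 litre at 100 °C.
print(mix_water(1, 50, 1, 50))  # 50.0
print(mix_water(1, 80, 1, 20))  # 50.0 - hot + cold averages out
```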

    • @IshaqIbrahim3
      @IshaqIbrahim3 4 months ago

      @@jamieknight326 like keeping the tea, coffee, eggs etc. warm. 🤣

  • @ShpanMan
    @ShpanMan 5 months ago +2

    Underrated video, a lot of cool useful details!

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +1

      Thank you! Happy that it's useful - and it keeps evolving quickly.

  • @Merlin-b9f
    @Merlin-b9f 5 months ago +3

    Ah, the sysinternals guy. I owe half my career to this guy. Thx.

  • @LouSpironello
    @LouSpironello 5 months ago +5

    Great info about the architecture! Thank you.

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +1

      Thank you! Glad it helped on the architecture front.

  • @terogamer345
    @terogamer345 5 months ago +2

    5 times the Azure supercomputer deployed each month, that's insane!!! What does that mean for training next-gen frontier models? 30x November 2023: does it mean you can train 30x longer, 30x bigger, or 30x faster? Will this continue to the end of the year, reaching almost 65x compute in one year?

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +3

      Good questions. We have deployed 30x total, or on average 5 additional instances per month, of the November 2023 Top 500 submission with 14k networked GPUs, 1.1 million cores and 561 petaflops. These will continue getting bigger, with more instances provisioned in the future. And now there are more options for GPUs and AI accelerators too, plus the Nvidia H200 and Blackwell architectures are coming soon with more speed, power and efficiency.
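
      Taking the figures in the reply above at face value, the cumulative totals can be roughed out. A back-of-the-envelope sketch (the per-instance numbers come from the reply; the multiplication is my own extrapolation):

```python
# Per-instance figures from the November 2023 Top 500 submission
gpus_per_instance = 14_000
cores_per_instance = 1_100_000
pflops_per_instance = 561

instances = 30  # "30x total" deployed since November 2023

print(instances * gpus_per_instance)    # 420000 GPUs
print(instances * cores_per_instance)   # 33000000 cores
print(instances * pflops_per_instance)  # 16830 petaflops
```

      Roughly 420k GPUs at 30 instances, which lines up with the 432k+ figure in the video's title.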

  • @drivenbycuriosity
    @drivenbycuriosity 5 months ago +5

    The most fascinating part for me is the Multi-LoRA.

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +2

      It is. It's a little like differencing disks with the additional state/data.
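
      The differencing-disk analogy maps onto how a LoRA adapter works: the base weights stay read-only, and each adapter is a small low-rank delta applied on top. A minimal NumPy sketch (the shapes, rank, and adapter names are illustrative, not Azure's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                      # hidden size, adapter rank (r << d)
W = rng.standard_normal((d, d))  # frozen base weights, shared by everyone

# Each "skill" is just a pair of small matrices - the diff, not a full copy.
adapters = {name: (rng.standard_normal((d, r)), rng.standard_normal((r, d)))
            for name in ("legal", "chess")}

def forward(x, adapter=None):
    """Base model output, optionally patched with one adapter's delta."""
    y = x @ W
    if adapter is not None:
        B, A = adapters[adapter]
        y = y + x @ B @ A        # low-rank update: same as x @ (W + B @ A)
    return y

x = rng.standard_normal(d)
print(forward(x).shape, forward(x, "chess").shape)  # (8,) (8,)
```

      Many adapters can share one copy of W in memory, which is what makes serving thousands of fine-tunes at once tractable.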

  • @SrikarKura
    @SrikarKura 5 months ago +5

    Interesting architecture.

  • @SuperRider-RS
    @SuperRider-RS 5 months ago +4

    Great session, thank you!

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +1

      Appreciate the compliment, thank you!

  • @liberty-matrix
    @liberty-matrix 5 months ago +4

    "it's funny you know all these AI 'weights'. they're just basically numbers in a comma separated value file and that's our digital God, a CSV file." ~Elon Musk. 12/2023

  • @QuantumXdeveloper
    @QuantumXdeveloper 5 months ago +1

    Great session, Mark is, as always, the best❤

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +1

      Thanks so much! Appreciate your taking the time to comment.

  • @Jj-du8ls
    @Jj-du8ls 5 months ago +4

    5 times the Azure supercomputer deployed each month? Is that a typo?

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +8

      It's not. We just announced that 30x have been added since November 2023.

    • @Hashtag-Hashtagcucu
      @Hashtag-Hashtagcucu 5 months ago +1

      What he isn’t saying is for how long this rate goes on

    • @guruware8612
      @guruware8612 5 months ago +2

      @@Hashtag-Hashtagcucu Forever, as long as there are people thinking it's a great idea to chat with a machine or have a robot dog.
      Insanity is the new norm.

    • @coreystrait513
      @coreystrait513 5 months ago +2

      @@MSFTMechanics Stargate and quantum computing, hurry up

  • @jamieknight326
    @jamieknight326 4 months ago

    It’s amazing… an impressive budget for buying chips from NVIDIA. But is it worth it? Curious to see whether AI will take off or not.

  • @lifeslooker
    @lifeslooker 5 months ago +1

    What would it take to shrink a 175B model to run on a mobile phone? What are the limitations? The language used in the model? Could compression be used, or a language be developed that doesn't take up much space?

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +4

      The closest correlation to size is the parameter count, so Phi-3-mini has 3.8bn parameters and is roughly a 2.2GB file to run locally on the phone, as demonstrated by Mark in the video. There are things the larger models will do in terms of reasoning and built-in knowledge, as Mark said. One example we actually hit while planning this show is that the slightly larger Phi-3 models could phrase the cookie recipe in the writing style of Yoda from Star Wars. Because mini didn't have the pop culture references in its training set, we made the tone sarcastic instead.
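
      The parameter-count-to-file-size relationship in the reply is simple arithmetic. A sketch (the quantization levels are my assumption for how a 3.8B-parameter model fits in roughly 2.2 GB on a phone):

```python
params = 3.8e9  # Phi-3-mini parameter count from the reply above

# Bytes on disk scale with bits per parameter, before any overhead.
for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
    gb = params * bits / 8 / 1e9
    print(f"{label}: {gb:.1f} GB")
# fp16: 7.6 GB, int8: 3.8 GB, int4: 1.9 GB - only a ~4-bit variant
# (plus tokenizer/metadata overhead) lands near the ~2.2 GB figure.
```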

    • @lifeslooker
      @lifeslooker 5 months ago +2

      @@MSFTMechanics Funny, I’m watching Star Wars Episode 1 right now on Apple TV+😂😂😂😂
      Sarcasm is something that is very rich in style across different languages; it would be interesting to see how this is done in, say, Italian or French.

  • @Rafael555888
    @Rafael555888 4 months ago

    So they can now run the same LLM on different GPUs (Nvidia vs. Maia vs. AMD)?

  • @phobosmoon4643
    @phobosmoon4643 5 months ago +2

    Great video. I have a maybe-annoying question: how can we know that cloud AI services are selling us what they say they are? For example, context length could easily be fudged.

    • @phobosmoon4643
      @phobosmoon4643 5 months ago +1

      @@test-zg4hv Yeah, I'm asking how you test it. Is it kind of like an error-checking algorithm?

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +2

      You can stipulate that in code or using the Azure AI Studio, and you can test it. We cover that to some extent in this episode: ua-cam.com/video/3hZorLy6JiA/v-deo.html

  • @GhostyDog
    @GhostyDog 4 months ago +1

    What’s that again..? You’re adding the capacity of the third most powerful supercomputer every month! 😮

  • @duran5533
    @duran5533 5 months ago +1

    Did I understand correctly: "Today, 6 months later, we deploy the equivalent of 5 of those supercomputers every month"?!

    • @MSFTMechanics
      @MSFTMechanics 4 months ago +2

      That's right. 30+ instances have been built since November 2023.

  • @DanielSeacrest
    @DanielSeacrest 2 months ago

    So in November 2023 there was a supercomputer of ~14k H100s, and every month since then you have done an equivalent deployment of 5 of those clusters? That is quite a few hundred thousand GPUs. I wonder how many of these are being used to train the next generation of OpenAI's models. 100k? 200k?

  • @kylev.8248
    @kylev.8248 5 months ago +3

    This is awesome

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +2

      Glad you liked it and thank you!

  • @Facts-in--Nutshell
    @Facts-in--Nutshell 5 months ago +2

    Thanks, quite impressive!

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +1

      Thanks for watching and commenting!

  • @sceptic33
    @sceptic33 5 months ago +1

    On the subject of cooling and power requirements: I've been saying for ages that "waste heat" is only waste if you don't use it. Most electricity generators work by using heat to drive turbines. Instead of burning fuel or using nuclear reactions to create heat, we should use the heat generated by compute as the source for generating electricity: pump and compress the heat from the cooling fluid into a reservoir, which a second heat exchanger uses to vaporise a second working fluid that drives the turbines, turning generators that feed electricity back to the GPU clusters. Recycle the power endlessly.

    • @jamieknight326
      @jamieknight326 4 months ago +1

      The physics problem is around concentrating energy/heat into one spot.
      While the total heat energy is in the MW range, it’s distributed across millions of litres of fluid (water/air), which is lightly heated and can’t be concentrated into a single place. Thermodynamics doesn’t allow for addition of heat between working fluids: you can’t use 2 litres of 50 °C water to create 1 litre of 100 °C water.
      In a nutshell, we can’t take the distributed heat and convert it into the high-pressure, high-volume steam needed to run an electricity turbine.
      The heat may be useful for an industrial process like drying cement, but that ends up being uneconomical, as power from the grid is much cheaper than recovered heat.
      I wish this process worked. It would be amazing, but the physics doesn’t work out. :(
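
      The "low-grade heat" objection can be quantified: even an ideal Carnot engine running between warm coolant and ambient air recovers under 10% of the heat as work, and real engines do far worse. A sketch (the temperatures are illustrative assumptions, not datacenter specs):

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum fraction of heat convertible to work between two temperatures."""
    t_hot = t_hot_c + 273.15   # convert to kelvin
    t_cold = t_cold_c + 273.15
    return 1 - t_cold / t_hot

# ~50 °C coolant vs. ~20 °C ambient: under 10% even with a perfect engine.
print(f"{carnot_efficiency(50, 20):.1%}")  # 9.3%

# A fossil plant's steam at ~550 °C shows why generators run hot instead.
print(f"{carnot_efficiency(550, 20):.1%}")  # 64.4%
```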

    • @sceptic33
      @sceptic33 4 months ago

      @@jamieknight326 People always say it can't be done. I'm not convinced. Low-grade heat is raised when compressed by a heat pump. Using a multi-stage setup where a chain of pumps uses the increased temperature from the previous pump as the base to concentrate further, I see no reason why a final reservoir of compressed heat shouldn't be hot enough to drive a turbine and generate electricity. You can generate electricity with a Stirling engine and a cup of tea. A data centre converting 100MW of electricity into 99.9MW of heat should be able to provide 99.9MW of heat to a heat engine.

  • @jeffreyrh
    @jeffreyrh 5 months ago +2

    Wouldn't it be possible to create a distributed computing system like SETI or the protein-folding project, and use that computing power to train AI systems? Those projects used people's personal computers when they had idle time.

    • @Zreknarf
      @Zreknarf 5 months ago

      It's called a botnet, and yeah, you can do that. These are purpose-built AI chips, though; nobody has those at home because they are not for sale yet.

    • @Zreknarf
      @Zreknarf 5 months ago +1

      Also, from the video: inferencing requires high-bandwidth memory rather than a lot of compute power, and it would suffer greatly from latency.
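
      The bandwidth objection in the replies can be put in numbers: shipping one full set of gradients for a large model over a home connection takes hours, versus seconds on a datacenter interconnect. A sketch (the model size and link speeds are illustrative assumptions):

```python
def sync_time_s(payload_bytes, link_bits_per_s):
    """Seconds to ship one payload over a link, ignoring latency and overhead."""
    return payload_bytes / (link_bits_per_s / 8)

grad_bytes = 175e9 * 2  # 175B parameters at 2 bytes (fp16) each = 350 GB

home = sync_time_s(grad_bytes, 100e6)  # 100 Mbps home broadband
dc = sync_time_s(grad_bytes, 400e9)    # 400 Gbps InfiniBand-class link
print(f"home: {home / 3600:.1f} h, datacenter: {dc:.0f} s")
```

      Roughly 7.8 hours per sync at home versus about 7 seconds in the datacenter, which is why frontier training needs co-located, tightly networked hardware rather than idle PCs.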

  • @nestorreveron
    @nestorreveron 5 months ago +2

    Thanks.

  • @RohanKumar-vx5sb
    @RohanKumar-vx5sb 5 months ago +1

    cool stuff!

  • @synthwave7
    @synthwave7 5 months ago +2

    Glad Microsoft is making sure there is co-existence between all hardware manufacturers; otherwise AI hardware will become chaos.

  • @Crunch_dGH
    @Crunch_dGH 5 months ago +1

    I prefer the much more reliable/resilient iOS. Just replacing my trusty Air with a 2TB M3 iPad Pro.

  • @amg2u
    @amg2u 4 months ago +1

    iPhone?

    • @MSFTMechanics
      @MSFTMechanics 4 months ago

      Yes, iPhone 15 Pro Max in this case.

  • @bfg5244
    @bfg5244 5 months ago +1

    That's inspiring

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +1

      Glad you liked it. Thanks for taking the time to comment.

  • @kyber.octopus
    @kyber.octopus 5 months ago +1

    Nice

  • @Rkcuddles
    @Rkcuddles 5 months ago +2

    This dude AI?

    • @DeployJeremy
      @DeployJeremy 5 months ago +1

      Mark has been trained on at least 175 billion parameters, but he isn't AI 🙂

  • @Arcticwhir
    @Arcticwhir 5 months ago +1

    13:38 You used the exact same joke a year ago with Mark

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +2

      Yes, that was intentional, because Multi-LoRA would allow Neo to have hundreds or thousands of skills added simultaneously, not just the one like last year.

  • @youturunnyng
    @youturunnyng 4 months ago

    Rubén godoy islas 4:35

  • @donelson52
    @donelson52 5 months ago +3

    How much CO2 does this cost? EXACTLY how bad is it now, and EXACTLY HOW will you power this by 2030?

    • @MSFTMechanics
      @MSFTMechanics 5 months ago +3

      Check out the Microsoft sustainability site for details: www.microsoft.com/en-us/corporate-responsibility/sustainability-journey

  • @ArronLorenz
    @ArronLorenz 5 months ago +1

    Solid organic joke.