GPU vs CPU: Running Small Language Models with Ollama & C#

  • Published Dec 26, 2024

COMMENTS • 15

  • @RomuloMagalhaesAutoTOPO
    @RomuloMagalhaesAutoTOPO 15 days ago +1

    👍

  • @bdanuw
    @bdanuw 2 months ago +1

    Another great video from ElBruno :)
    Thank you Bruno.

    • @elbruno
      @elbruno  1 month ago

      Glad you liked it!

  • @eugene5096
    @eugene5096 2 months ago +1

    Thank you Bruno, interesting as usual !!!

    • @elbruno
      @elbruno  2 months ago

      @@eugene5096 Thanks! The CPU vs GPU comparison is a wow one 😁

  • @muluhsenet7582
    @muluhsenet7582 1 month ago +1

    ❤❤❤

  • @samirou976
    @samirou976 2 months ago

    Hi! This is very interesting, but I wonder how it would perform on an NPU? Is it possible to make it run on an NPU?

    • @cuachristine
      @cuachristine 2 months ago +1

      An NPU is a GPU with all the graphics bits removed.

    • @samirou976
      @samirou976 2 months ago

      @@cuachristine Yes, I know, thank you, but that was not my question.

    • @elbruno
      @elbruno  2 months ago +1

      Ohh, that's a great question! I still don't have access to an NPU machine; however, if Docker Desktop allows access to the NPU, it should work. Let me ask Justin (a fellow CA.NET), who rocks the Docker world, to see what he can share about this.

    • @samirou976
      @samirou976 2 months ago

      @@elbruno Thanks man, I would really appreciate that 🙂
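
A note on the NPU question above: from the C# side nothing changes when the hardware changes, because the app only talks to Ollama's local REST endpoint; whether the model runs on the CPU, a GPU, or eventually an NPU is decided by the Ollama runtime (or the container it runs in). A minimal sketch with plain HttpClient, assuming Ollama is listening on its default port 11434 and a small model such as phi3 has already been pulled; the endpoint, model name, and prompt are illustrative, not taken from the video:

    // Minimal Ollama call from C# (sketch; values below are assumptions).
    using System.Net.Http.Json;
    using System.Text.Json;

    using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

    // POST /api/generate with stream = false returns a single JSON object.
    var response = await http.PostAsJsonAsync("/api/generate", new
    {
        model = "phi3",      // any model already pulled with `ollama pull`
        prompt = "Why is the sky blue?",
        stream = false
    });
    response.EnsureSuccessStatusCode();

    using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
    Console.WriteLine(doc.RootElement.GetProperty("response").GetString());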

  • @bilalbilal7674
    @bilalbilal7674 2 months ago +1

    Bilal here 😊, I think you should create an extension; then it will be good and easy to access.

    • @elbruno
      @elbruno  2 months ago

      There is one in the Aspire Community Toolkit: github.com/CommunityToolkit/Aspire/tree/main
      I may record a video about that one!
      Best
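
The extension mentioned in that reply lives in the Aspire Community Toolkit repo linked above (the CommunityToolkit.Aspire.Hosting.Ollama package). A rough sketch of what the Aspire AppHost wiring could look like, assuming the toolkit's AddOllama/AddModel extension methods, a placeholder project named MyApi, and "phi3.5" as the model; check the repo for the current API and names:

    // Aspire AppHost Program.cs (sketch only; method and package names follow
    // the Aspire Community Toolkit's Ollama integration and may evolve).
    var builder = DistributedApplication.CreateBuilder(args);

    // Run Ollama as a container resource managed by Aspire.
    var ollama = builder.AddOllama("ollama")
                        .WithDataVolume();   // keep pulled models between runs

    // Expose a specific model as a resource other projects can reference.
    var phi = ollama.AddModel("phi3.5");

    // "MyApi" is a placeholder project; it receives the model's connection info.
    builder.AddProject<Projects.MyApi>("myapi")
           .WithReference(phi);

    builder.Build().Run();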

  • @jimmymac601
    @jimmymac601 2 months ago

    Does this support multiple GPUs?

    • @elbruno
      @elbruno  2 months ago

      I'm not sure; I'd say no. We may need to check with the Ollama team.
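
On the multi-GPU question, one practical check, whatever hardware Ollama ends up using, is to measure throughput from the client: the non-streaming generate response includes eval_count and eval_duration (in nanoseconds), so the same C# program can report tokens per second on a CPU-only box, a single GPU, or any other setup and you can compare runs. A sketch, with the endpoint, model, and prompt again as illustrative values:

    // Sketch: compare tokens/sec across hardware using Ollama's eval metrics.
    using System.Net.Http.Json;
    using System.Text.Json;

    using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

    var raw = await (await http.PostAsJsonAsync("/api/generate", new
    {
        model = "phi3",
        prompt = "Explain the difference between a CPU and a GPU in one paragraph.",
        stream = false
    })).Content.ReadAsStringAsync();

    using var doc = JsonDocument.Parse(raw);
    double tokens  = doc.RootElement.GetProperty("eval_count").GetDouble();
    double seconds = doc.RootElement.GetProperty("eval_duration").GetDouble() / 1e9;

    Console.WriteLine($"{tokens} tokens in {seconds:F2}s -> {tokens / seconds:F1} tokens/sec");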