Llamafile on Android Tutorial

  • Published 30 Jun 2024
  • llamafile github - github.com/Mozilla-Ocho/llama...
    (releases on the right side)
    tinyllama - huggingface.co/TheBloke/TinyL...
    ➤ Twitter - / techfrenaj
    ➤ Twitch - / techfren
    ➤ Discord - / discord
    ➤ TikTok - / techfren
    ➤ Instagram - / techfren
  • Science & Technology

COMMENTS • 11

  • @mofo78536 8 days ago +1

    Just saw your PR. Greatly appreciate your contribution to the project so far, especially with the cosmo backend.

  • @MikeBirdTech 8 days ago +3

    Debugging with Justine!! Well done dude, so happy you got this working

    • @techfren 8 days ago

      @@MikeBirdTech 🥳🥳🥳

  • @wardehaj 8 days ago +1

    Great video, thanks a lot!

    • @techfren 8 days ago

      @@wardehaj you're welcome! Thank you for commenting!

  • @fire17102 8 days ago +2

    Coming from Ollama, what's the benefit of llamafile (not including Android use)?
    Is there a recommended hybrid approach?
    Thanks for the bite-size clip! ❤

    • @techfren 8 days ago +1

      It's a small, portable file that works across different operating systems. It's also known to be faster on some architectures, comes with its own WebUI, and can host an inference server.
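
To make the reply above concrete, here is a rough sketch of the typical llamafile workflow. The file name is a placeholder (download a real one from the project's releases page); the `--server`, `--nobrowser`, and `--port` flags are taken from llamafile's documentation, but check the README for your release:

```shell
# Placeholder file name; get a real llamafile from the releases page.
chmod +x tinyllama.llamafile

# Running it with no arguments opens the built-in WebUI in a browser.
# To host it as a headless inference server instead:
./tinyllama.llamafile --server --nobrowser --port 8080 &

# The server exposes an OpenAI-style chat completions endpoint,
# so any plain HTTP client can talk to it:
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}]}'
```

The same single file runs on Linux, macOS, Windows, and the BSDs because it is built on Cosmopolitan Libc, the "cosmo backend" mentioned in the first comment.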

    • @fire17102 8 days ago +1

      @@techfren Love how it just runs the local server, that's awesome. Can this be bundled and run in the background with a "front" native Android app of our making?
      Makes me think about making sure that AI pipelines are built to be cross-platform and instantly transferable.
      Regarding llamafile, the only thing is that it doesn't have Ollama's amazing pull feature. I wish someone made a thin wrapper over Ollama and llamafile to get the best of both worlds.

    • @techfren 8 days ago

      @@fire17102 Yeah, I think bundling llamafile would be much easier than other methods. What's Ollama's pull method?

    • @fire17102 8 days ago +1

      @@techfren I meant that you can just do ollama run and it will pull and download the model for you, easy as pie. Llamafile needs that.
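
For contrast, the two flows being compared look roughly like this. The model name and RELEASE_URL are illustrative, not real values:

```shell
# Ollama: one command pulls the weights on first use, then drops into a chat REPL.
ollama run tinyllama

# llamafile today: the download step is manual before anything can run.
RELEASE_URL="https://example.invalid/tinyllama.llamafile"  # placeholder; use the project's releases page
curl -L -o tinyllama.llamafile "$RELEASE_URL"
chmod +x tinyllama.llamafile
./tinyllama.llamafile
```

The thin wrapper the comment wishes for would essentially automate the middle three lines.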