Installing and Running Code Llama 70B at Home - Ollama Tutorial - Best Practices - Open-Source Model

  • Published 4 Nov 2024

COMMENTS • 8

  • @Igbon5 · 10 days ago

    Why recommend against running on Windows?

  • @angryktulhu · 6 months ago +2

    How much RAM do I need to run a 70B model?
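A rough back-of-the-envelope answer (an editorial sketch, not a figure from the video): memory for weights is roughly parameter count times bytes per weight, plus some overhead for the KV cache and runtime. The helper below is a hypothetical illustration of that arithmetic; the 1.2 overhead factor and the assumption that Ollama's default quantized builds are around 4-bit are estimates, not measured values.

```python
def estimate_ram_gb(params_billion: float, bits_per_weight: int,
                    overhead: float = 1.2) -> float:
    """Rough RAM estimate in GB: weight bytes times an overhead
    factor for KV cache and runtime (assumed, not measured)."""
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

# 70B parameters at ~4-bit quantization vs. full fp16
print(round(estimate_ram_gb(70, 4), 1))   # roughly 42 GB
print(round(estimate_ram_gb(70, 16), 1))  # roughly 168 GB
```

By this estimate, a quantized 70B model wants on the order of 40+ GB of RAM, which is why it is slow or impossible on typical consumer machines.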

  • @prentrodgers · 9 months ago

    Awesome. The YouTube title refers to 7B, but your narration refers to 70B. Maybe the title is wrong? Regardless, great work again.

  • @drtristanbehrens · 9 months ago

    Post your questions here! 🤗

  • @kunalr_ai · 9 months ago

    I am kind enough to comment, like, and subscribe.

  • @RajinderYadav · 9 months ago

    Ran Code Llama 70B locally; not impressed. It runs slow, gets stuck in a loop, and doesn't generate anywhere near usable code. Maybe it's fine-tuned for tubers and their stupid coding questions and snake programs.