AI on x86 CPU

  • Published Nov 9, 2024

COMMENTS • 7

  • @proterotype • 8 days ago

    Dude, I love your delivery. It’s going to be fun watching your channel grow. I always like finding channels like yours and seeing them gain users

  • @DanielIntense • 1 month ago

    Thanks for sharing! I came across this video while building a PC with 196GB of RAM to be able to chat with models at 1/4 token per second :D

  • @daCount0 • 1 month ago

    Nicely done - thank you

  • @jazzargamer3064 • 1 month ago

    Nice video! Do you mind sharing what CPU you're using? And if you have the time, run a few models in the ollama CLI with the --verbose option and give us the tokens/sec. Thank you!
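For reference, `ollama run --verbose` prints timing statistics after each reply, including the number of generated tokens and the generation time. A minimal sketch of turning those two figures into a tokens/sec number (the values below are made-up examples, not a real benchmark):

```python
# Example numbers in the style of `ollama run --verbose` timing output:
#   eval count:    100 token(s)
#   eval duration: 8.0s
eval_count = 100        # tokens generated (hypothetical example value)
eval_duration_s = 8.0   # wall-clock generation time in seconds (hypothetical)

# Generation throughput is simply tokens divided by generation time.
tokens_per_sec = eval_count / eval_duration_s
print(f"{tokens_per_sec:.1f} tokens/sec")  # 12.5 tokens/sec
```

Note that prompt processing ("prompt eval") is reported separately and is usually much faster per token than generation, so the generation ("eval") figures are the ones worth comparing across machines.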

  • @VamosViverFora • 19 days ago

    Thank you for the excellent video! How usable is a 70B model on a CPU in a system with 128GB of RAM?

    • @VamosViverFora • 19 days ago

      P.S.: it seems crazy, but it's cheaper than a 4090. 😂

    • @VamosViverFora • 19 days ago

      I'm planning to buy a Ryzen 9 7900 and 128 GB of DDR5 at at least 6000 MT/s. I know it's obviously slower than a GPU, but the question is by how much.
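Single-stream token generation on a CPU is largely memory-bandwidth bound: every generated token has to stream the full set of weights through RAM once, so tokens/sec is roughly capped at memory bandwidth divided by model size. A back-of-the-envelope sketch for the setup described (dual-channel DDR5-6000; a 70B model at 4-bit quantization assumed to occupy roughly 40 GB); real sustained throughput typically lands well below this theoretical ceiling:

```python
# Rough upper bound for CPU token generation:
#   tokens/sec <= memory bandwidth / model size in RAM
# because each generated token must read all weights once.

# Dual-channel DDR5-6000: 6000 MT/s * 8 bytes/transfer * 2 channels = 96 GB/s peak.
bandwidth_gb_s = 6000e6 * 8 * 2 / 1e9   # theoretical peak; sustained is lower

model_size_gb = 40.0  # ~70B parameters at 4-bit quantization (assumed figure)

ceiling_tok_s = bandwidth_gb_s / model_size_gb
print(f"~{ceiling_tok_s:.1f} tokens/sec upper bound")  # ~2.4 tokens/sec upper bound
```

By the same arithmetic, a GPU with ~1 TB/s of memory bandwidth would have a ceiling roughly ten times higher, which is the main reason CPU inference on large models feels so much slower.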