Ollama Structured Outputs: LLM Data with Zero Parsing

COMMENTS • 10

  • @coolmcdude
@coolmcdude 3 days ago

    Awesome video. I’m from America and I visited Wales and England earlier this year for the first time. Wales was so beautiful.

  • @ofrylivney367
@ofrylivney367 19 days ago +3

I've been needing this feature, thanks for the info!

  • @IdPreferNot1
@IdPreferNot1 19 days ago +1

Great advancement for open-source efforts

  • @sMadaras
@sMadaras 18 days ago +3

I'd appreciate it if you could share the code.

    • @IanWootten
@IanWootten 18 days ago +2

Sure thing, I've added a link to it in the description.

    • @sMadaras
@sMadaras 18 days ago +2

@@IanWootten Diolch yn fawr (Welsh: "Thank you very much")

  • @jana171
@jana171 17 days ago +1

The Instructor-patched SDK worked well too, but this just cleans up the code and keeps it simple.
In my own implementation with Instructor, it obeys the Pydantic BaseModel structures in close to 100% of completions, compared to the few-shot prompting setup I had before that, which had about 10% success and LOADS of failures 🙂 This approach just works!
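    The pattern this comment describes — constraining a model's output to a Pydantic schema instead of relying on few-shot prompting — can be sketched with Ollama's structured outputs. This is a minimal sketch, assuming the `ollama` and `pydantic` packages; the model name, prompt, and `Country` fields are illustrative, and the actual chat call is commented out so the snippet doesn't require a running Ollama server:

    ```python
    from pydantic import BaseModel

    # Illustrative schema — any Pydantic BaseModel works the same way.
    class Country(BaseModel):
        name: str
        capital: str
        languages: list[str]

    # The model's JSON schema is what gets passed to Ollama as the
    # `format` argument, constraining generation to matching JSON.
    schema = Country.model_json_schema()
    print(sorted(schema["properties"]))

    # With a local Ollama server running, the call would look like:
    # from ollama import chat
    # response = chat(
    #     model="llama3.2",  # illustrative model name
    #     messages=[{"role": "user", "content": "Tell me about Wales."}],
    #     format=schema,
    # )
    # country = Country.model_validate_json(response.message.content)
    ```

    Because the server enforces the schema during generation, `model_validate_json` on the reply is a final sanity check rather than a parser for free-form text — which is what removes the failure modes the commenter saw with few-shot prompting.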

  • @StudyWithMe-mh6pi
@StudyWithMe-mh6pi 19 days ago

    👋👋👋

  • @user-wr4yl7tx3w
@user-wr4yl7tx3w 17 days ago

    But given that OpenAI, Claude can do this already, why do we need ollama for it? And here, isn’t Llama doing the heavy lifting? Not clear how ollama is helping.

    • @IanWootten
@IanWootten 17 days ago +5

A model that's local, open, and free: those are three good reasons.