LangChain Function Calling with Open Source LLMs | LlaMa-3 | Groq API

  • Published 2 Nov 2024

COMMENTS • 11

  • @ShammaAshraf-m6x
    @ShammaAshraf-m6x 2 months ago

    SO GLAD YOU POSTED THIS THANK YOU SM!!!

    • @CoderzColumn
      @CoderzColumn 2 months ago

      Thanks for your feedback. Really appreciate it.

  • @Himanshu-yb9kz
    @Himanshu-yb9kz 4 months ago

    great work man

    • @CoderzColumn
      @CoderzColumn 4 months ago

      Thanks for taking the time to comment! Appreciate it.

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 4 months ago

    Which notable models work with function calling?

    • @CoderzColumn
      @CoderzColumn 4 months ago

      GPT-3.5, GPT-4, Llama-3, Phi-3, Mistral 0.3, Gemma, and a few others. Most recent releases now add support for function calling by including it in the fine-tuning step.
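
A minimal sketch of what this looks like through LangChain with a Groq-served model (the model name "llama3-70b-8192" and the add_numbers tool are illustrative assumptions, not code from the video's notebook):

```python
# Sketch: function calling with a Groq-hosted open source LLM via LangChain.
# Assumes `langchain-groq` is installed and GROQ_API_KEY is set in the environment.
from langchain_core.tools import tool
from langchain_groq import ChatGroq


@tool
def add_numbers(a: int, b: int) -> int:
    """Add two integers and return the result."""
    return a + b


llm = ChatGroq(model="llama3-70b-8192")          # illustrative model name
llm_with_tools = llm.bind_tools([add_numbers])   # expose the tool schema to the model

response = llm_with_tools.invoke("What is 41 + 1?")
print(response.tool_calls)  # e.g. [{'name': 'add_numbers', 'args': {'a': 41, 'b': 1}, ...}]
```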

  • @NEUTRALEMPIRE
    @NEUTRALEMPIRE 3 months ago

    'Exception occurred: Error code: 400 - {'error': {'message': "'messages.1' : for 'role:user' the following must be satisfied[('messages.1.content' : one of the following must be satisfied[('messages.1.content' : value must be a string) OR ('messages.1.content.0' : one of the following must be satisfied[('messages.1.content.0.type' : value is not one of the allowed values ['text']) OR ('messages.1.content.0.type' : value is not one of the allowed values ['image_url'])])])]", 'type': 'invalid_request_error'}}'
    I am not able to fix this error.

    • @CoderzColumn
      @CoderzColumn 3 months ago

      Can you share which line of code is failing? Is it something from the tutorial notebook that is failing, or something new you are trying? I can't tell what is going wrong from this trace alone.
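
One possible cause of that 400 (an assumption, not confirmed in the thread) is the user message content being passed as a list of content blocks instead of a plain string. A sketch of the string-content form that avoids it (the model name is again illustrative):

```python
# Sketch: passing user message content as a plain string to a Groq model.
# Assumes `langchain-groq` is installed and GROQ_API_KEY is set in the environment.
from langchain_core.messages import HumanMessage
from langchain_groq import ChatGroq

llm = ChatGroq(model="llama3-70b-8192")  # illustrative model name

# Content built as a list of typed blocks can trigger a 400 like the one above:
# llm.invoke([HumanMessage(content=[{"type": "text", "text": "Hello"}])])

# Content passed as a plain string is accepted:
response = llm.invoke([HumanMessage(content="Hello")])
print(response.content)
```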

  • @amventures1
    @amventures1 3 months ago

    But Groq AI isn't Llama.
    It's no longer open source and isn't hosted locally / on-premise.

    • @CoderzColumn
      @CoderzColumn 3 months ago

      Groq provides free access to 4 open source LLMs (Llama-3 8B & 70B, Gemma 7B & Mixtral 8x7B). Groq specializes in building hardware that speeds up token generation for LLMs. The LLMs they serve through their API are not their own; they are open source LLMs deployed on Groq's high-performance hardware, with limited free access for everyone through the API (30 requests per minute).
      You can log in at console.groq.com/settings/limits and check the limits for a free account. I just noticed that they added two new models (Gemma 2 9B and Whisper Large v3). A short sketch for listing the currently served models follows after this thread.
      Yes, it's not locally hosted.

    • @amventures1
      @amventures1 3 months ago

      @CoderzColumn Yes, free isn't open source. I am trying to achieve something locally, as I don't want to share my functions with any third party.
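
Since the set of models a free Groq account can call changes over time, here is a short sketch (assuming the official `groq` Python client is installed and GROQ_API_KEY is set) that lists the models currently served:

```python
# Sketch: list the model ids currently available to your Groq account.
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

for model in client.models.list().data:
    print(model.id)  # e.g. "llama3-8b-8192", "gemma-7b-it", "mixtral-8x7b-32768"
```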