SO GLAD YOU POSTED THIS THANK YOU SM!!!
Thanks for your feedback. Really Appreciate it.
great work man
Thanks for sparing time to comment! Appreciate it.
Which notable models work with function calling?
GPT-3.5, GPT-4, Llama-3, Phi-3, Mistral 0.3, Gemma, and a few others. Most recent releases now add support for function calling by including it in the fine-tuning step.
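To make the idea concrete, here is a minimal sketch of what function calling looks like from the caller's side. The tool is described with a JSON Schema in the OpenAI-style `tools` format that these fine-tuned models consume; the `get_weather` function and its parameters are hypothetical examples, not part of any specific API.

```python
import json

def get_weather(city: str) -> str:
    """Hypothetical local function the model can ask us to call."""
    return f"Weather in {city}: sunny"

# OpenAI-style tool definition: the model never runs the function itself,
# it only returns the name and JSON arguments it wants us to call it with.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# A model that supports function calling replies with a tool call like
# {"name": "get_weather", "arguments": "{\"city\": \"Paris\"}"};
# the application parses the arguments and dispatches the call itself:
tool_call_args = json.loads('{"city": "Paris"}')
result = get_weather(**tool_call_args)
print(result)  # Weather in Paris: sunny
```

The key point is that "support for function calling" means the model was fine-tuned to emit this structured call format reliably, rather than free-form text.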
'Exception occurred: Error code: 400 - {'error': {'message': "'messages.1' : for 'role:user' the following must be satisfied[('messages.1.content' : one of the following must be satisfied[('messages.1.content' : value must be a string) OR ('messages.1.content.0' : one of the following must be satisfied[('messages.1.content.0.type' : value is not one of the allowed values ['text']) OR ('messages.1.content.0.type' : value is not one of the allowed values ['image_url'])])])]", 'type': 'invalid_request_error'}}'
I am not able to fix this error.
Can you share which line of code is failing? Is it failing in the tutorial's notebook, or are you trying something new? I can't tell what's wrong from this trace alone.
But Groq AI isn't Llama. It's no longer open source and hosted locally / on premise.
Groq provides free access to four open source LLMs (Llama-3 8B & 70B, Gemma 7B, and Mixtral 8x7B). Groq specializes in building hardware that speeds up token generation for LLMs. The LLMs they provide through their API are not their own; they are open source LLMs that they have deployed on their high-performance hardware, with limited access for everyone through the API (30 requests per minute).
You can log in to console.groq.com/settings/limits and check the limits for a free account. I just noticed that they added two new models (Gemma 2 9B and Whisper Large v3).
Yes, it's not locally hosted.
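Since Groq exposes an OpenAI-compatible endpoint, calling one of these hosted models is just a POST to their chat completions URL with an API key from the console. A minimal sketch, with the request payload built but the actual network call left commented out (the API key is a placeholder and the model name reflects Groq's hosted Llama-3 8B):

```python
import json

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

payload = {
    "model": "llama3-8b-8192",  # one of the free hosted open source models
    "messages": [{"role": "user", "content": "Hello from Groq!"}],
}

headers = {
    "Authorization": "Bearer YOUR_GROQ_API_KEY",  # placeholder, get yours from the console
    "Content-Type": "application/json",
}

# import requests
# resp = requests.post(GROQ_URL, headers=headers, data=json.dumps(payload))
# print(resp.json()["choices"][0]["message"]["content"])
print(json.dumps(payload, indent=2))
```

Because the endpoint is OpenAI-compatible, the same payload also works through the `openai` Python client by pointing `base_url` at Groq.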
@CoderzColumn Yes, free isn't open source. I am trying to achieve something locally, as I don't want to share my functions with any third party.
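For a fully local setup along those lines, one option is to point an OpenAI-compatible request at a model served on your own machine, e.g. via Ollama (whose default endpoint is `http://localhost:11434/v1/chat/completions`). A sketch under that assumption; the model name and tool schema are illustrative, and the network call is commented out since it requires a running local server:

```python
import json

LOCAL_URL = "http://localhost:11434/v1/chat/completions"

payload = {
    "model": "llama3",  # whichever model you have pulled locally
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                # Your private function schema never leaves your machine.
                "name": "get_weather",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# import requests
# resp = requests.post(LOCAL_URL, json=payload)  # no third-party API involved
print(payload["model"])
```

Note that tool/function-calling support depends on the specific model and server version, so it's worth checking your local server's documentation for which models handle the `tools` field.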