Wonderful as always and just in time. I was going to build a similar use case next week that auto-generates database docs for business users. This comes in handy 🎉
Thank you again and again
Glad it was helpful! Happy coding.
Great video
What if the response from the database exhausts the context window of the model?
Thanks. If you are hitting the model's maximum context length, you can try the following.
1. Choose a different LLM that supports a larger context window.
2. Brute force: chunk the document and extract content from each chunk.
3. RAG: chunk the document, then extract content only from the subset of chunks that look "relevant".
Here is an example of these approaches from LangChain, with a minimal sketch after the link:
js.langchain.com/v0.1/docs/use_cases/extraction/how_to/handle_long_text/
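For example, a minimal brute-force chunking sketch in Python using LangChain's text splitters (long_document and extract() are placeholders for your own document and extraction chain, not names from the video):

from langchain_text_splitters import RecursiveCharacterTextSplitter

# Split the long text into overlapping chunks that each fit the context window.
splitter = RecursiveCharacterTextSplitter(chunk_size=2000, chunk_overlap=200)
chunks = splitter.split_text(long_document)  # long_document: your oversized text (placeholder)

# Brute force: run extraction over every chunk and collect the results.
results = [extract(chunk) for chunk in chunks]  # extract(): your extraction chain (placeholder)

For the RAG variant, you would first embed the chunks and only pass the top-scoring ones to extract().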
Awesome video! How did you get the various categories when creating a model?
Thanks. Those are defaults in OpenWebUI. You can select the relevant categories for a custom model.
When it said "would you like me to break down the sales by product" and you responded with yes, will it perform the action it mentioned or not?
It may work if the SQL model is able to generate SQL for the question. You can try it and let us know if this extended option works; a rough sketch of that follow-up is below.
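A rough sketch of that follow-up with the Ollama Python client (the model name "sql-model" and the prompt are assumptions, not from the video):

import ollama

# Ask the custom text-to-SQL model to turn the follow-up question into a query.
response = ollama.chat(
    model="sql-model",  # placeholder: your custom SQL model
    messages=[{"role": "user", "content": "Break down total sales by product. Respond with SQL only."}],
)
print(response["message"]["content"])  # the generated SQL, if the model can produce it

If the model returns valid SQL here, the extended option should work end to end.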
Hi sir, the edited model can't be seen by Ollama. When I run ollama list in CMD, it displays only llama3.1. Why?
If you do not see the custom model in your Ollama ecosystem, check the model file to make sure it's correct. Here is an example of a custom model file from OpenWebUI: openwebui.com/m/darkstorm2150/Data-Scientist:latest
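For reference, a minimal Modelfile sketch (the base model and system prompt here are just illustrative):

FROM llama3.1
PARAMETER temperature 0.2
SYSTEM """You are a data scientist. Answer questions about the database schema."""

After saving it, run ollama create data-scientist -f Modelfile and the model should then show up in ollama list.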
Can we run Llama 3 locally on a simple VPS server, or do we need GPUs?
Hi, you'd need a GPU to run an LLM. By the way, VPS servers can have GPUs; a typical setup is sketched below.
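If your VPS does have a GPU, the setup looks like this (assuming a Linux server; the install script is Ollama's official one):

# Install Ollama via the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Pull and run the model; Ollama uses the GPU automatically if one is detected
ollama run llama3.1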