Spring AI - Bring your own data by Stuffing the Prompt
- Published Apr 25, 2024
- In this tutorial we take a look at a technique known as stuffing the prompt. It allows us to add context to our prompts when sending a request to the LLM. In this example we use OpenAI's GPT-4 model, but the technique works with all of the supported LLMs.
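The core idea from the description above can be sketched in plain Java: render a prompt template whose placeholders are filled with your own reference data before the prompt is sent to the model. This is a minimal, dependency-free sketch; the class and method names are illustrative, and in a real Spring AI application you would use its `PromptTemplate` abstraction instead of hand-rolled substitution.

```java
import java.util.Map;

// Sketch of "stuffing the prompt": inject your own context documents
// into the prompt template before calling the LLM. All names here are
// hypothetical, not Spring AI's actual API.
public class StuffThePrompt {

    // Replace each {placeholder} in the template with the supplied value.
    static String render(String template, Map<String, String> values) {
        String out = template;
        for (var entry : values.entrySet()) {
            out = out.replace("{" + entry.getKey() + "}", entry.getValue());
        }
        return out;
    }

    public static void main(String[] args) {
        String template = """
                Answer the question using only the context below.
                Context:
                {context}
                Question: {question}""";

        // The "stuffed" context would normally come from your own data source.
        String prompt = render(template, Map.of(
                "context", "Spring AI supports multiple chat model providers.",
                "question", "Which providers does Spring AI support?"));

        System.out.println(prompt);
    }
}
```

The rendered string is what gets sent as the user (or system) message, so the model answers from your data rather than only its training set. The trade-off raised in the comments below still applies: the stuffed context must fit within the model's token limit.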
🔗Resources & Links mentioned in this video:
Spring AI Introduction: • Spring AI Introduction...
GitHub Repository: github.com/danvega/spring-int...
👋🏻Connect with me:
Website: www.danvega.dev
Twitter: / therealdanvega
Github: github.com/danvega
LinkedIn: / danvega
Newsletter: www.danvega.dev/newsletter
SUBSCRIBE TO MY CHANNEL: bit.ly/2re4GH0 ❤️ - Science & Technology
As a junior developer, I find your explanations so good! Thanks again for the awesome content.
Dan! You're the man! These Spring AI OpenAI videos have been tremendously helpful in building out my own Spring Boot chatbot that uses the OpenAI Assistants API to get real-time data with an external function call. A video dealing with using Assistants to trigger external API calls would be great!
Thank you, Dan, for the back-to-back videos.
Thanks. And in the next video, extend this example to use a vector database.
Those are my next 2 videos 🤩
Hi Dan, this is a nicely explained video on AI using Spring. Trust me, no one really talks about it. However, I have one concern: as far as I know, this "context" value has a restricted token length. Because of that, we need to learn the RAG architecture. In short, solving the problem with this approach alone is not feasible in the real world.
👍👍👍