Great Tutorial - Keep the good work
Thanks, will do!
Nice simplified explanation ❤
Thanks, subscribed!
Awesome, thank you!
Great tutorial - thanks!
Sir! You're doing a great job
Great video!
The AzureOpenAI chat model doesn't seem to support LCEL. Am I doing something wrong?
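For what it's worth, recent versions of the `langchain-openai` package expose `AzureChatOpenAI`, which implements the Runnable interface, so LCEL piping should work with it. Below is a minimal sketch assuming that package is installed, the Azure endpoint and key are set via environment variables, and the deployment name and API version shown are placeholders for your own values.

```python
# Minimal LCEL sketch with Azure OpenAI.
# Assumes AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_API_KEY are set in the environment;
# the deployment name and API version below are hypothetical placeholders.
from langchain_openai import AzureChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

llm = AzureChatOpenAI(
    azure_deployment="gpt-4o-mini",   # hypothetical deployment name
    api_version="2024-02-01",         # hypothetical API version
)

prompt = ChatPromptTemplate.from_template("Summarize this in one sentence: {text}")

# LCEL pipe syntax: prompt -> model -> string output parser
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"text": "LangChain Expression Language composes runnables with the | operator."}))
```

If the pipe operator raises an error, it is usually a sign of an older LangChain version; upgrading `langchain-openai` and `langchain-core` typically resolves it.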
Awesome buddy
Thanks ✌️
Is there a way I could batch-invoke a list of text files? I want to extract certain text from each file using OpenAI. Or would I have to do it one by one in a loop?
You can extract the full text from each file and pass them all in a single batch (see the sketch below this thread).
@FutureSmartAI Cool, thanks, I shall try it out
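As a rough illustration of the reply above: instead of looping and calling the model once per file, you can read each file's text into a list of inputs and hand the whole list to the chain's `.batch()` method, which runs the calls concurrently. The file names, model, and extraction prompt below are hypothetical.

```python
# Sketch of batching extraction over several text files with LangChain's .batch().
# Assumes an OpenAI API key in the environment; file paths and prompt are hypothetical.
from pathlib import Path
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

prompt = ChatPromptTemplate.from_template(
    "Extract all dates and names from the following text:\n\n{text}"
)
chain = prompt | ChatOpenAI(model="gpt-4o-mini") | StrOutputParser()

# Read the full text of each file, then pass the whole list to .batch(),
# which executes the requests concurrently instead of one by one in a loop.
files = ["report1.txt", "report2.txt", "report3.txt"]
inputs = [{"text": Path(f).read_text()} for f in files]

results = chain.batch(inputs)
for name, result in zip(files, results):
    print(name, "->", result)
```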
First of all, thank you for the amazing way you explain stuff; it's elegant. Now I have one question: when we used the retriever to assign its output to the context variable, I assume the retriever passes all of the documents in the RAG list to the LLM, so that when it answers the question it uses all of the RAG documents as context in the prompt. Am I right? Would it be a good idea to create a chain that first uses the question to query the RAG store and fetch just the needed context, then passes that context together with the question to the LLM to form a human-readable answer? I'm asking to check whether I'm understanding it right or not. Thanks, man.
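A brief clarification on the question above: a retriever does not pass every document to the LLM; it performs a similarity search and returns only the top-k most relevant chunks, which is exactly the retrieve-then-answer chain described. Below is a minimal LCEL sketch of that flow; the sample documents, embeddings, FAISS vector store, and model name are all assumptions for illustration.

```python
# Minimal retrieve-then-answer LCEL sketch.
# Assumes an OpenAI API key in the environment; documents and model are hypothetical.
from langchain_community.vectorstores import FAISS
from langchain_openai import OpenAIEmbeddings, ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnablePassthrough

docs = [
    "LCEL composes runnables with the | operator.",
    "Retrievers return only the top-k most similar chunks, not every document.",
]
vectorstore = FAISS.from_texts(docs, OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 1})  # only the best match

def format_docs(retrieved):
    # Join the retrieved chunks into one context string for the prompt
    return "\n\n".join(d.page_content for d in retrieved)

prompt = ChatPromptTemplate.from_template(
    "Answer the question using only this context:\n{context}\n\nQuestion: {question}"
)

# The retriever fills {context} with just the relevant chunks;
# RunnablePassthrough forwards the original question unchanged.
chain = (
    {"context": retriever | format_docs, "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI(model="gpt-4o-mini")
    | StrOutputParser()
)

print(chain.invoke("What do retrievers return?"))
```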