Great video, lots of stuff is more understandable now. This video was uploaded 3 months ago, and today some of the modules have changed (Chroma DB / Chroma Search are now one module), so it's not a simple step-by-step follow-along that reproduces the same logic. Thanks for the video anyway.
Thank you. Yes, planning to do a revision of this soon.
Please provide the full workflow as an image for better understanding. Great video, thanks a lot.
thank you.
There is no Vector Search component anymore. Is there a workaround? How do you connect your DB to the {context} now?
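In case it helps anyone landing here later: outside the Langflow UI, the equivalent wiring is to run a similarity search against the store yourself and paste the results into the prompt's {context} slot. A minimal sketch, assuming LangChain's Chroma integration and OpenAI embeddings; the paths, model name, and question are placeholders, not anything from the video:

```python
# Sketch: feed retrieved chunks into a {context} prompt variable.
# Assumes the langchain-chroma and langchain-openai packages and an OPENAI_API_KEY.
from langchain_chroma import Chroma
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI, OpenAIEmbeddings

# Open the persisted Chroma store (the directory is a placeholder).
db = Chroma(persist_directory="./chroma_db", embedding_function=OpenAIEmbeddings())
retriever = db.as_retriever(search_kwargs={"k": 3})

prompt = ChatPromptTemplate.from_template(
    "Answer using only this context:\n{context}\n\nQuestion: {question}"
)

question = "What does the document say about pricing?"  # example question
docs = retriever.invoke(question)
context = "\n\n".join(d.page_content for d in docs)

chain = prompt | ChatOpenAI(model="gpt-4o-mini")  # model name is an assumption
print(chain.invoke({"context": context, "question": question}).content)
```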
Thank you for the video. I am facing an issue with Vector Search: I don't have an option in Langflow for "Vector Search". Please help me with the steps.
Which version of Langflow are you on?
Thank you for a very good tutorial.
Am I understanding correctly that if I select a vector store component, like Chroma, Langflow automatically creates the Chroma database, or do I have to have a separate Chroma DB elsewhere?
Thank you. Langflow will persist the Chroma DB in the path you specify; you will be able to see it within the working directory. At 10:26 in the video you can see that when I bring in the Chroma Langflow component there is an option to specify the persist directory.
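If it helps to see what that persist directory means, here is a minimal sketch using the chromadb Python client directly; the ./chroma_db path and collection name are just examples, not what Langflow uses internally:

```python
# Sketch of Chroma persistence, assuming the chromadb Python client (>= 0.4).
import chromadb

# Chroma creates the directory if it doesn't exist and stores collections there.
client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("langflow_docs")  # example name

collection.add(
    ids=["doc-1"],
    documents=["Example chunk of text from an uploaded PDF."],
)

# Because the data lives on disk, the count survives a restart of the process.
print(collection.count())
```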
When facing large and varied data, a cloud vector store with this kind of flow gives disappointing results; we need more structure so the user's question is really answered based on searching the whole document.
Yes, this is just a prototype and not meant for production purposes.
What Langflow version are you using? The one I am currently on is v0.5.12, and the UI and block headers seem to be different.
A revised video will be provided soon.
Hi Jain, thanks for this. Can I seek your help on the steps to deploy to a front end / web app, say via Azure Web Apps? Thank you.
Sure. Please send a note to vp@mindfulcto.com.
25:23 It's my understanding that the first time you will always get this message, because the PDF hasn't been saved to the vector database until the app has run at least once.
yes.
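To make that concrete, a sketch of the pattern: on the first run the collection is empty, so ingestion has to happen before any query can return results. This assumes the chromadb client; the chunks, path, and collection name are placeholders, and it illustrates the behaviour rather than the actual Langflow internals:

```python
# Illustration of why the very first query can come back empty:
# nothing is in the vector store until ingestion has run at least once.
import chromadb

client = chromadb.PersistentClient(path="./chroma_db")
collection = client.get_or_create_collection("langflow_docs")

if collection.count() == 0:
    # First run: nothing is persisted yet, so ingest before querying.
    # (In the real flow this is where the PDF loader + splitter output goes.)
    chunks = ["First chunk of the PDF...", "Second chunk of the PDF..."]  # placeholders
    collection.add(
        ids=[f"chunk-{i}" for i in range(len(chunks))],
        documents=chunks,
    )

# From here on (and on every later run) queries hit persisted data.
results = collection.query(query_texts=["What is this document about?"], n_results=3)
print(results["documents"])
```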
Just asking: why would I do this when I can just prompt something like Maestro to write the Python code? The LLM is the interface now. I tried this and it's neat, but at the end of the day it's faster to just ask Maestro to write the Python. Not trying to start a riot, I just don't see why I would use this.
This is just a demonstration of what's possible. The way to think about it is that if you know the capabilities, you can orchestrate these kinds of pieces and build a solution.
Did you ever get the memory working? :)