Chat with MySQL Database using GPT-4 and Mistral AI | Python GUI App

  • Published 10 Sep 2024

COMMENTS • 194

  • @jim02377
    @jim02377 3 months ago +1

    I just made it through the entire tutorial. As usual, your attention to detail is awesome. Now I want to try the agent version and see the difference!

    • @alejandro_ao
      @alejandro_ao  3 months ago

      thank you Jim, it's great to see you around here! i'll be putting a version with agents soon!

  • @BrandonFoltz
    @BrandonFoltz 5 months ago +3

    Oh yea! More to learn this weekend! Cool to see all of these technologies merging and synergizing. Keep up the great work!

    • @alejandro_ao
      @alejandro_ao  5 months ago

      Thank you Brandon! Always a pleasure to see you here!

  • @RahulM-lm6qg
    @RahulM-lm6qg 4 months ago +6

    please make a video for this use case using only open-source LLM models

  • @AasherKamal
    @AasherKamal 2 months ago

    The way you explain the details is impressive.

  • @aristoz1986
    @aristoz1986 5 months ago +4

    Great!! I will try it out this week!! Keep on going with the good stuff🎉

    • @alejandro_ao
      @alejandro_ao  5 months ago

      Thanks!! Let me know how it goes!

  • @lookingaround1586
    @lookingaround1586 5 months ago +3

    Thanks @alejandro_ao. Could you make a video on implementing graphs/charts alongside the NL response?
    Congrats on your diploma!

  • @Sanitiser254
    @Sanitiser254 22 days ago

    Love your work man

  • @songfromktown
    @songfromktown 4 hours ago

    Thanks for this great tutorial. In case I would like to have some plots (bar or line charts of any column of a table), how should I approach it?

  • @vinayakmane7569
    @vinayakmane7569 4 months ago

    I am loving your work bro. Don't stop. Keep making such unique projects

    • @alejandro_ao
      @alejandro_ao  4 months ago

      thank you! there is much more coming up :)

  • @scollin10
    @scollin10 5 months ago

    Love your walkthroughs! I’m using your method and approach to chat with a MySQL database of resumes. The eventual goal would be to build up the database of both resumes and job descriptions and have my team of recruiters be able to prompt it to optimize efficiency.

    • @alejandro_ao
      @alejandro_ao  5 months ago +2

      hey there, that's an awesome idea! just remember that, for a real application, you should not use the `root` user of your mysql database. create a new user that only has READ privileges. you wouldn't want your LLM to accidentally write anything into your database or delete some data!
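
      For reference, a minimal sketch of such a read-only user, assuming MySQL 8; the user name, database name, and password here are placeholders:

      -- run as an administrative user
      CREATE USER 'chatbot'@'localhost' IDENTIFIED BY 'a-strong-password';
      -- SELECT only: the LLM-generated queries can read but never modify data
      GRANT SELECT ON chinook.* TO 'chatbot'@'localhost';
      FLUSH PRIVILEGES;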

    • @scollin10
      @scollin10 5 months ago

      @@alejandro_ao I use a .env for now just as good practice but it is all local as I’m encountering some issues. I would love to pick your brain and chat over coffee or a consultation session if that’s cool.

    • @scollin10
      @scollin10 5 months ago

      @@alejandro_ao Thanks for the tip on using root! I would like to pick your brain and seek your help on my app. I'll shoot you an email.

  • @muhammadqasim6524
    @muhammadqasim6524 4 months ago

    Congratulations on your Diploma. 🎊 Enjoying your videos.

  • @peralser
    @peralser 2 months ago

    Great Work!! Thanks for sharing your time and knowledge with us.

    • @alejandro_ao
      @alejandro_ao  2 months ago

      i appreciate it man. it's my pleasure :)

  • @niyetheg
    @niyetheg 5 months ago +1

    I need your help. I followed all the steps and I am at the last part where I ask questions and get feedback, but I keep getting a validation error even though I have included the API key:
    for ChatGroq __root__ Did not find groq_api_key, please add an environment variable `GROQ_API_KEY` which contains it, or pass `groq_api_key` as a named parameter. What can I do?

    • @gianni4302
      @gianni4302 4 months ago

      print under it to see if it's accessing the Groq API key, or just input the key directly and see if it works. if it does, it's an issue with the path to your .env. make sure to call 'load_dotenv()' at the start, that may help
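
      For reference, a minimal sketch of that check, assuming a .env file next to the script and the python-dotenv package installed (the model name is just an example):

      import os
      from dotenv import load_dotenv
      from langchain_groq import ChatGroq

      load_dotenv()  # reads .env from the current working directory

      # confirm the key was actually picked up before building the chain
      print("GROQ_API_KEY loaded:", bool(os.getenv("GROQ_API_KEY")))

      # you can also pass the key explicitly instead of relying on the env var
      llm = ChatGroq(model="mixtral-8x7b-32768", groq_api_key=os.getenv("GROQ_API_KEY"))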

  • @OwaisBinMushtaq
    @OwaisBinMushtaq 5 months ago +1

    Great .... Will try to implement this week
    🎉🎉🎉🎉

  • @shashankkumardubey6260
    @shashankkumardubey6260 5 months ago

    Great project. As a beginner, this project can help me learn better. Do make some more unique projects. Keep doing great work.

  • @adityabhagwat6965
    @adityabhagwat6965 4 months ago

    Great content! Appreciate you sharing. Excited to give it a go!

  • @MarioLopez-bm9mf
    @MarioLopez-bm9mf 19 days ago

    Great, thanks for sharing. Quick question: Can it be used with large databases that have more than 100 tables? What do you recommend for handling large databases?

  • @trideepsaha2594
    @trideepsaha2594 5 months ago

    This is what we have been waiting for for a while. No words, only 🎈🤩. We must all do our homework. Special thanks for GroqCloud. Congratulations on your diploma, AO.

    • @alejandro_ao
      @alejandro_ao  5 months ago

      thank you!! very glad to hear this was useful! let me know what else you would like to see!

    • @trideepsaha2594
      @trideepsaha2594 5 months ago

      @@alejandro_ao From this side, anything you want to teach. #AI #RAG #LangChain #FineTune

  • @matiasparouy
    @matiasparouy 5 months ago

    Thanks Alejandro for sharing! excellent video! 👏

  • @babas5990
    @babas5990 5 months ago

    Congrats on earning your diploma!!! Thank you for your excellent video tutorials.

  • @ArusaKhalfay
    @ArusaKhalfay 1 month ago

    I just built this. It does basic stuff and is great, but if you test it with the last user question, it actually gives the wrong response and does not correctly map the artist IDs. It also makes mistakes writing simple joins and throws an error. Do you have ways to improve this performance?

  • @akshaysangwan2172
    @akshaysangwan2172 5 months ago

    The code you taught works properly for small queries but fails to write complex queries and provides incorrect information. Can you help? I'm using ChatGPT-3

  • @mlg4035
    @mlg4035 5 months ago

    Congratulations on your diploma!! Great video!

  • @TYTennis
    @TYTennis 3 months ago

    Hi, this is great thank you so much I'm learning a lot from this. I see you said this wasn't production ready and that improvements could be made. What improvements do you think you would make and why would that be an improvement? Thanks again for this video, you're very ahead with tech :)

  • @LukeS-e3m
    @LukeS-e3m 28 days ago

    Are you able to create a video where you're searching through multiple tables? Or would this approach work for this too?

  • @thiagomantuanidesouza136
    @thiagomantuanidesouza136 1 month ago

    What should I do if the column names contain underscores? Is there any configuration to solve it?

  • @MohitThehuman
    @MohitThehuman 1 month ago

    Hi, great video, thank you!
    I have one question: how will we deal with a schema that has around 100 tables?
    It will probably lead to a context window issue.
    Do we need to add a vector DB to first search and filter out the relevant tables, then pass them to the LLM?
    Any comments?

  • @arvindprajapath2315
    @arvindprajapath2315 2 months ago

    How are you getting those code suggestions / autocomplete when you are typing code in VS Code? Can someone tell me which extension this is?

  • @AyomideKazeem-g7n
    @AyomideKazeem-g7n 3 months ago +1

    Hello, great tutorial as always. I am completely new to this, so after running the code and changing a few things I keep getting this error "AttributeError: st.session_state has no attribute "db"" and I do not know how to solve it.

  • @openyard
    @openyard 4 months ago

    Thank you. Another video with great content.

  • @nguyenduythai4585
    @nguyenduythai4585 1 month ago

    If using open_ai_key, will the data in the database be leaked? How can it be secured?

  •  5 months ago

    Thx Alejandro, great stuff as always... crystal clear 😎. I changed your code a bit to use it with a Postgres database, works like a charm! ✊❤‍🔥

    • @alejandro_ao
      @alejandro_ao  5 months ago +1

      that's so cool! which adapter did you use to connect? psycopg2?

    •  5 months ago

      @@alejandro_ao yes this one

    •  5 months ago

      I started to push the prompts a little further (formatting, tone, etc...) it's a really good base to work with. Thanks again for this starter kit 🙏💪

    •  5 months ago

      @@alejandro_ao did you ever try with a mongodb?

    • @alejandro_ao
      @alejandro_ao  5 months ago +1

      @ so glad to hear this!

  • @RakshithML-vo1tr
    @RakshithML-vo1tr 9 days ago

    Any plans to make videos using the Llama 405B parameter model?

  • @AJITKUMAR-k9k2g
    @AJITKUMAR-k9k2g 2 months ago

    Are you using any extension which gives you autocomplete code suggestion?

  • @Parthi97
    @Parthi97 3 months ago

    Can you show the steps for when my Streamlit app is hosted on the cloud, so that when the user enters connection details my web app is able to connect?

  • @Shivang0369
    @Shivang0369 3 months ago

    Hello brother, where have you used LangChain's API?
    You mentioned it in the .env file?
    Also, I am using the OpenAI 3.5 model, for which I am getting the error below:
    openai.BadRequestError: Error code: 400 - {'error': {'message': "This model's maximum context length is 16385 tokens. However, your messages resulted in 16734 tokens. Please reduce the length of the messages.", 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}}
    Kindly tell me what I should do? I am planning to build this chatbot at the enterprise level, with a larger database.
    Keep uploading brother, great explanation

  • @showbikshowmma3520
    @showbikshowmma3520 2 months ago

    Firstly, love you 3000 times!
    I have a question. I want to build a chatbot using an open-source language model like LLaMA 3 or another available large language model (LLM). How can I integrate the LLM with my MySQL database so it can answer questions based on the information it finds in the database? Additionally, I would like to integrate an API into the LLM because I have a hosted Python backend server. Through the API, the LLM will also be able to respond to user queries. I believe you will help me out in this case.

  • @delgrave4786
    @delgrave4786 2 months ago

    Hey Alejandro, amazing project! I implemented this and made a bunch of changes to it too.
    I had a request: do you have a video about creating and working with agents using open-source LLMs, like those on Groq? If not, can you make a tutorial for that? Most videos I see always use OpenAI and I just am not able to implement them with open-source LLMs. And OpenAI refuses to take my debit card

  • @RaushanKumar-ut2ke
    @RaushanKumar-ut2ke 4 months ago

    All the methods I have seen use a single schema, but I haven't found any workaround for connecting multiple schemas for database chatting

  • @kingfunny4821
    @kingfunny4821 4 months ago

    Thank you for your wonderful efforts and excellent explanation.
    Please, I would like to ask whether you have explained how to have a conversation with files without using the internet, that is, only locally

  • @MamirZhan-de7fv
    @MamirZhan-de7fv 1 month ago

    I have been following your tutorial, it helped me a lot! Thanks for sharing. All these LLM models greatly depend on the semantic meaning of the tables and columns. But in my case, my database came from the Open Way loan management system and most of my table and column names don't have semantic meaning. For example, the transaction table is called DOC instead of transactions. Inside the DOC table, none of the field names are self-explanatory. As a result, most of the time my model can't answer my questions.
    What should I do?

  • @LieutenantGeneral
    @LieutenantGeneral 2 months ago

    I tried to train a BERT model on the Spider dataset but realised that I need a dataset that also consists of DDL commands. Any suggestions, or do I need to make a custom dataset?

    • @LieutenantGeneral
      @LieutenantGeneral 2 months ago

      Also, my choice of BERT was not good, so now I am going to use a T5 model.

  • @ethanxsun
    @ethanxsun 4 months ago

    So cooooool! You are a hero. Talented

    • @alejandro_ao
      @alejandro_ao  4 months ago

      glad this was useful! keep it up!!

  • @ritishajaiswal9918
    @ritishajaiswal9918 5 months ago

    Hey Alejandro,
    Thanks a bunch for the helpful video. Could you make another one where you show us how to use a model we've downloaded locally (can be a quantized one)? Also, it'd be awesome if you could include the Vertica database as the DB in the process.
    You don't have to go through the whole process again, just maybe explain how to connect to the database using the local model. I've heard there are models like SQL-coder-7b that are designed for translating plain English into queries. It'd be really helpful if you could check out a few of these models and share your thoughts on which one would work best in this kind of situation.
    My organization is pretty cautious about sending sensitive data to LLM APIs, so it's important for us to be able to do this kind of thing locally without relying on APIs.
    Looking forward to the next video. Thanks again for all your helpful content!

    • @laurarusu1989
      @laurarusu1989 3 months ago +1

      Hi, I did this with codellama:7b. Actually, the code is the same as in the video. First, you install the local model: in my case I installed it by running 'ollama pull codellama:7b' in the terminal, then 'ollama run codellama:7b' (you can go to the Ollama site and see the actual command). Then, when I set the llm, I used
      'from langchain_community.chat_models import ChatOllama
      llm = ChatOllama(model="codellama:7b")'.
      Everything else should be the same. LangChain supports many different LLMs; you can check the documentation to see whether the LLM you're interested in is available and use the corresponding module. I hope this helps you :)

  • @ilsms
    @ilsms 5 months ago

    Thanks for sharing ! The best video

  • @luismario6808
    @luismario6808 2 months ago

    amazing video, thanks!

  • @djaz7347
    @djaz7347 5 months ago

    Hi Alejandro, is it possible to connect a DB2 database through JDBC?

  • @udaynj
    @udaynj 4 months ago

    Does this mean that OpenAI and/or Langchain have access to your database schemas and also the data generated? That could be a huge security risk for most companies.

  • @andersonkoh1382
    @andersonkoh1382 2 months ago

    Do you have a series for beginners on how to build a custom theme and create pages with these? I'm used to building with the builder in WordPress.

  • @zefur321
    @zefur321 1 month ago

    What is the name of the extension you use to predict the next steps of coding (I see it shows up in gray) in VS Code?

  • @tapanpati9452
    @tapanpati9452 5 months ago

    Cool. Bro, I have a scenario to create a chatbot using 36,000 PDFs, do you have any idea? RAG is not giving accurate answers.

  • @RaushanKumar-ut2ke
    @RaushanKumar-ut2ke 4 months ago

    Hey Alejandro, do we have any method to connect multiple schemas for database chatting?

  • @Jtheshooter
    @Jtheshooter 5 months ago

    Congrats on the diploma sir 🎓

  • @aadilgani5528
    @aadilgani5528 5 months ago

    Congratulations on getting your diploma. Can you please make a video on chatting with a SAS (clinical data .bat files) database using an open-source LLM?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      I have never used sas, but a senior technical architect there implemented an agent that interacts with sas data. Maybe this can be useful to you: www.linkedin.com/posts/bogdanteleuca_how-to-create-your-custom-langchain-agent-activity-7161603166952734720-jqRo/

  • @TheAbanik
    @TheAbanik 5 months ago

    Amazing video as usual, very helpful. Could you explain why you decided to use GPT 4 instead of 3.5 Turbo ? Is GPT 4 better in crafting the sql queries based on natural language ? Also can you try creating a video based on langchain agent to showcase a use case where user asks a question, the response can be either in a document/pdf or from a database and agent needs to figure out the correct way to respond.

    • @alejandro_ao
      @alejandro_ao  5 months ago

      hey there, sure. i used gpt-4 because i needed its 128k token context window to be sure that it could ingest the entire database schema. but actually, when the schema could fit in GPT-3's context window, i found that it actually worked better. i will be doing videos on agents very soon!

  • @yugeshkaran7547
    @yugeshkaran7547 5 months ago

    Thanks for the video bro. I have now built and hosted a similar chatbot on Streamlit Community Cloud, but I encountered an error connecting to the MySQL database. I have checked my database server but still haven't resolved it. Could you please guide me on what the issue might be?

  • @redo22011
    @redo22011 3 months ago

    Thanks for sharing, it's great work!!! Can you add a similar example for Amazon Bedrock Claude 3?

  • @JoEl-jx7dm
    @JoEl-jx7dm 5 months ago

    Hey, I have an issue hosting on Streamlit Cloud, could you try?

  • @AIWALABRO
    @AIWALABRO 5 months ago

    I love this video! Can I put it into my resume?
    At the end, you said that in real life we do things with "agents" instead of "chains"; here we did it with "chains".
    1) So for a production-ready approach, should we use chains or agents?
    2) Can you make a video using agents?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      hey there, absolutely! feel free to add this to your cv :)
      1) i would test both. the only difference is that an agent would probably be better at processing your query. it would be able to check the result from the mysql database and reformulate its sql query if the result does not answer the user's (your) question. best would be to test both and see which one performs better!
      2) coming up!

    • @AIWALABRO
      @AIWALABRO 5 months ago

      @@alejandro_ao thanks for your response! Eagerly waiting for the agents video on MySQL, we can call it MySQL part 3.

  • @victorchrist9899
    @victorchrist9899 5 months ago

    Congratulations on your diploma

  • @MaxSvid
    @MaxSvid 1 month ago

    Great stuff!

  • @AmirShafiq
    @AmirShafiq 5 months ago

    Good stuff and great explanation

  • @saibaldasgupta1
    @saibaldasgupta1 5 months ago +1

    Can this code be used for PostgreSQL?

    • @vincentfernandez7145
      @vincentfernandez7145 5 months ago

      You need to use psycopg2-binary instead of mysql-connector

    • @tofani-pintudo
      @tofani-pintudo 4 months ago

      Yes, you just need to change the db config.
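
      For reference, a minimal sketch of that config change, assuming the psycopg2-binary package is installed; the credentials are placeholders:

      from langchain_community.utilities import SQLDatabase

      # MySQL (as in the video):  mysql+mysqlconnector://user:password@localhost:3306/dbname
      # PostgreSQL with psycopg2:
      pg_uri = "postgresql+psycopg2://user:password@localhost:5432/dbname"
      db = SQLDatabase.from_uri(pg_uri)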

  • @bobcravens
    @bobcravens 5 months ago

    I’ve been enjoying your channel. I wonder at what temperature there is a probability of the LLM dropping tables from your DB ;-)

    • @alejandro_ao
      @alejandro_ao  5 months ago

      oh yeah, absolutely, please DO NOT use this with a mysql user that has write privileges

  • @Ganitham-GenAI
    @Ganitham-GenAI 4 months ago

    Will the query results and the data be exposed to the LLM, or only the metadata?

    • @alejandro_ao
      @alejandro_ao  3 months ago

      yes, all data goes through the LLM. if privacy is your concern, you can load an open source model locally, such as llama3

  • @aniketdeshmukh5776
    @aniketdeshmukh5776 2 months ago

    Thanks Alejandro for the excellent video!
    I have one doubt:
    when we are using an OpenAI key with our SQL database,
    are they able to access all our SQL database data?
    If yes, then please let me know how to prevent that.
    If no, then it's fine.
    Thanks in advance for the clarification...!

    • @alejandro_ao
      @alejandro_ao  2 months ago

      hey there, great question! according to openai's docs and privacy policy, they don't keep logs of your prompts for more than 30 days, and they do not use these prompts to train or fine-tune the model. so if that is your concern, you should be fine.
      however, you may absolutely not want any of your data to go through openai's servers regardless of their promise of privacy. if that is the case, here is some info about the implementation in this video: openai's servers only see what you add to the prompts that you send to them. in this case, we are sending two prompts to openai during our process:
      1. first, we ask the LLM to generate the SQL query. here, we send the schema of the database in our prompt. the schema itself does not include the data from your database, but it does say what kind of data your database contains, such as the names of the tables, the data types, etc. the LLM uses this schema to generate the SQL query.
      2. second, we run the SQL query against our database. this operation happens only between you and your database. openai is not involved here.
      3. third, we take the results that our database returned and we send them to openai for interpretation. this is the other step where we are sending data to openai.
      so in short, in this tutorial, we are sending two pieces of information to openai: the schema of your database and the results from the query.
      if you are not comfortable with this, you can always run a local model using ollama! this will work great :)
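
      In code, the two round-trips described above look roughly like this. It is a sketch with simplified prompts and placeholder credentials and model names, not the exact chain from the video:

      from langchain_core.prompts import ChatPromptTemplate
      from langchain_openai import ChatOpenAI
      from langchain_community.utilities import SQLDatabase

      db = SQLDatabase.from_uri("mysql+mysqlconnector://user:password@localhost:3306/dbname")
      llm = ChatOpenAI(model="gpt-4-turbo", temperature=0)
      question = "How many artists are there?"

      # 1. the schema goes to openai so the model can write the SQL query
      sql_prompt = ChatPromptTemplate.from_template(
          "Given this schema:\n{schema}\n\nWrite only the SQL query answering: {question}"
      )
      sql_query = (sql_prompt | llm).invoke(
          {"schema": db.get_table_info(), "question": question}
      ).content

      # 2. the query runs locally against your own database; openai is not involved here
      result = db.run(sql_query)

      # 3. the raw result goes back to openai to be turned into a natural-language answer
      answer_prompt = ChatPromptTemplate.from_template(
          "Question: {question}\nSQL result: {result}\n\nAnswer in plain English."
      )
      print((answer_prompt | llm).invoke({"question": question, "result": result}).content)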

    • @aniketdeshmukh5776
      @aniketdeshmukh5776 2 months ago

      Thanks @@alejandro_ao for the excellent clarification. Really appreciate it 👏👏

  • @stanTrX
    @stanTrX 4 months ago

    How are those code tips popping up in VS Code? Does anyone know?

    • @alejandro_ao
      @alejandro_ao  4 months ago +1

      hey there, that's github copilot. saves a lot of time :)

  • @SatyendraJaiswalsattu
    @SatyendraJaiswalsattu 5 months ago

    Nice tutorial. Please make one video on how to interact with my Excel sheet data

  • @PhamDuc8504
    @PhamDuc8504 2 months ago

    I want to ask how the VS Code IDE can suggest the remaining parts of the code? Thanks!!!

  • @stanTrX
    @stanTrX 4 months ago

    Thanks, but traditional forms to pull data from DBs would probably be more efficient.

  • @hadjersa28
    @hadjersa28 2 months ago

    Why did you choose to use Mixtral and GPT-4?

    • @alejandro_ao
      @alejandro_ao  2 months ago

      i wanted to show one proprietary and one open source llm. but you can use any supported integration with langchain!

  • @manikandanr1242
    @manikandanr1242 5 months ago

    If we ask a question that is outside the DB, for example the user enters "hi", what will its response be?

    • @alejandro_ao
      @alejandro_ao  5 months ago +1

      hopefully the overall chain would be able to get that you were actually just saying "hi". but in order for your program to decide whether or not to call the database depending on the user question, you might need to create an agent or a bit more sophisticated chain
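
      One way to sketch that decision step is a small classification prompt in front of the SQL chain. Everything here is illustrative (model name, prompt wording, and the `sql_chain` argument stands in for the chain built in the video):

      from langchain_core.output_parsers import StrOutputParser
      from langchain_core.prompts import ChatPromptTemplate
      from langchain_openai import ChatOpenAI

      llm = ChatOpenAI(model="gpt-4-turbo", temperature=0)

      # tiny router: decide whether the user message needs the database at all
      router = (
          ChatPromptTemplate.from_template(
              "Does this message require querying a SQL database? Answer only yes or no.\n\nMessage: {question}"
          )
          | llm
          | StrOutputParser()
      )

      def respond(question: str, sql_chain) -> str:
          if router.invoke({"question": question}).strip().lower().startswith("yes"):
              return sql_chain.invoke({"question": question})  # the chain from the tutorial
          return llm.invoke(question).content  # plain small talk, no SQL involved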

  • @JuanPalacios
    @JuanPalacios 5 months ago

    Excellent information!!!!

  • @gianni4302
    @gianni4302 4 months ago

    Great tutorial, but I keep running into error 400 due to token counts. Any real-world schema alone will bump up to almost 10k tokens. Has anyone found a workaround? A tutorial on this would be lifesaving!

    • @VibhuSharma
      @VibhuSharma 4 months ago

      I am running into the same issue, always getting exceeded token limit error. Were you able to figure out any workaround?
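
      One workaround that is sometimes used is to hand the model only the tables that matter instead of the whole schema. A sketch using SQLDatabase's built-in filters; the URI and table names are placeholders:

      from langchain_community.utilities import SQLDatabase

      uri = "mysql+mysqlconnector://user:password@localhost:3306/dbname"

      # option 1: only expose a subset of tables to the chain
      db = SQLDatabase.from_uri(uri, include_tables=["customers", "orders", "products"])

      # option 2: keep the full connection but only put selected tables' DDL in the prompt
      schema_snippet = db.get_table_info(table_names=["customers", "orders"])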

  • @akhil0227
    @akhil0227 3 months ago

    Has anyone managed to work with postgresql?

  • @nguyenduythai4585
    @nguyenduythai4585 4 months ago

    I want it to print a message that the question is not related to the DB. Can you help me?

    • @alejandro_ao
      @alejandro_ao  4 months ago

      hey there, maybe you can add to the final prompt that "if the context does not contain the answer to the question, please say that the information could not be retrieved from the database", or something like that!
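
      A sketch of that instruction added to the final prompt (the wording is just an example):

      from langchain_core.prompts import ChatPromptTemplate

      answer_prompt = ChatPromptTemplate.from_template(
          "Based on the SQL response below, answer the user's question.\n"
          "If the SQL response does not contain the answer, reply exactly: "
          "'I could not retrieve that information from the database.'\n\n"
          "Question: {question}\nSQL response: {response}"
      )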

    • @nguyenduythai4585
      @nguyenduythai4585 4 months ago

      @@alejandro_ao you can help me with code...

    • @nguyenduythai4585
      @nguyenduythai4585 4 months ago

      @@alejandro_ao can you help me with code

    • @nguyenduythai4585
      @nguyenduythai4585 3 months ago

      @@alejandro_ao you can write code here.

  • @NasserTabook
    @NasserTabook 5 months ago

    Great tutorial, Thank you for the hard work

    • @alejandro_ao
      @alejandro_ao  5 months ago

      thank you, i appreciate it!!

  • @fozantalat4509
    @fozantalat4509 4 months ago

    Can you create this app tutorial using agents instead of chains? That would be really awesome.

    • @alejandro_ao
      @alejandro_ao  4 months ago

      i will probably do something like this using CrewAI

  • @thangarajerode7971
    @thangarajerode7971 5 months ago

    Could you please create a video about how to deploy the Streamlit app with LangServe?

  • @jimerlozano6523
    @jimerlozano6523 2 months ago

    interested in more about sql.

  • @utk1000
    @utk1000 5 months ago

    bro, your LangChain PDF reader with API code is outdated. it doesn't work, I've been stuck for a couple of days

    • @zeelthumar
      @zeelthumar 5 months ago

      Langchain is updating their docs rapidly 😢

    • @utk1000
      @utk1000 4 months ago

      @@zeelthumar you're right

  • @mbanduk
    @mbanduk 3 months ago

    How would I go about this if Airtable was my database?

    • @alejandro_ao
      @alejandro_ao  2 months ago

      that's a cool idea. you can use this integration with langchain: python.langchain.com/v0.2/docs/integrations/document_loaders/airtable/
      but you would have to build your own chain, which might look a bit different from what i show here. still, the implementation should not be too different. i hope this helps!

  • @nguyenduyta7136
    @nguyenduyta7136 5 months ago

    Thank god for having you, cool man

  • @blackpanther3005
    @blackpanther3005 5 months ago

    Hello, good day, excellent video, sorry, how much VRAM did you use for the project?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      hey there, all the completions of this app are done in remote servers, through the providers' APIs so they are the ones dealing with all the processing power. you could run this in an old raspberry pi tbh 😎

  • @marvinmarkham8405
    @marvinmarkham8405 3 months ago

    don't talk so close to the mic, but great vid

  • @AIgencify
    @AIgencify 5 months ago

    Next video with agents!!

  • @1sanak
    @1sanak 5 months ago

    Grats on the diploma! Can I use the same method to connect to a MSSQL DB?

    • @alejandro_ao
      @alejandro_ao  5 months ago +1

      thanks!! i heard some guys from the channel were trying to do this. i haven't tested it myself. but look, apparently you can pass in a driver just like we did here with mysql, but for MSSQL: docs.sqlalchemy.org/en/20/dialects/mssql.html#module-sqlalchemy.dialects.mssql.pyodbc
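
      For reference, a sketch of what that MSSQL connection string might look like with the pyodbc dialect. Untested; the driver name, port, and credentials are placeholders, and it requires the pyodbc package plus a Microsoft ODBC driver installed on the machine:

      from langchain_community.utilities import SQLDatabase

      mssql_uri = (
          "mssql+pyodbc://user:password@localhost:1433/dbname"
          "?driver=ODBC+Driver+17+for+SQL+Server"
      )
      db = SQLDatabase.from_uri(mssql_uri)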

  • @kaleshashaik5959
    @kaleshashaik5959 5 months ago

    Can we do this using offline LLM model?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      absolutely, just initialize your LLM using ollama instead of openai or groq
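
      For reference, a minimal sketch of that swap, assuming Ollama is running locally and a model such as llama3 has already been pulled:

      from langchain_community.chat_models import ChatOllama

      # assumes you already ran: ollama pull llama3
      llm = ChatOllama(model="llama3", temperature=0)
      # the rest of the tutorial stays the same: pass this llm wherever ChatOpenAI/ChatGroq was used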

  • @VoltVandal
    @VoltVandal 5 months ago

    congrats !

  • @user-bb6hp8hn2f
    @user-bb6hp8hn2f 5 months ago

    Hello, greetings and congratulations on obtaining your diploma.
    Once again, I would like to ask you a question. I may need to analyze MySQL data. The first step is to execute sql_chain and run SQL statements, and the second step is to add an analysis chain. The problem I encountered is: when I add self-awareness knowledge to the prompt and ask a self-awareness question, both chains participate at the same time. How can I control the first chain so that it does not execute, or is there another way to run only one chain? That might make the program more robust.
    Let me give an example:
    User: hello!
    AI: hello, can I help you?
    When I ask this, it doesn't need to execute SQL statements, but right now it still does. Hence the issues mentioned above.
    I hope you can understand my language, because I am Chinese and this question was translated with translation software. I do not know English.
    Best wishes again.

    • @alejandro_ao
      @alejandro_ao  5 months ago

      Hey there. If you want something more interactive, that is able to respond to other questions and not only those strictly related to generating a SQL query, you should probably implement an agent. LangChain makes it very straightforward to implement agents: python.langchain.com/docs/modules/agents/.
      You can also use CrewAI to create a team of agents: docs.crewai.com/
      I hope this helps!
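
      For anyone curious what the agent version can look like, LangChain ships a prebuilt SQL agent. A sketch, assuming a recent langchain-community; the URI and model name are placeholders:

      from langchain_community.agent_toolkits import create_sql_agent
      from langchain_community.utilities import SQLDatabase
      from langchain_openai import ChatOpenAI

      db = SQLDatabase.from_uri("mysql+mysqlconnector://user:password@localhost:3306/dbname")
      llm = ChatOpenAI(model="gpt-4-turbo", temperature=0)

      # the agent inspects the schema, writes a query, runs it, and can retry if the query errors out
      agent = create_sql_agent(llm, db=db, agent_type="openai-tools", verbose=True)
      agent.invoke({"input": "Which 3 artists have the most tracks?"})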

    • @user-bb6hp8hn2f
      @user-bb6hp8hn2f 5 months ago

      @@alejandro_ao Thank you very much. I have benefited a lot from watching your latest video.

  • @adelinadonisa
    @adelinadonisa 5 months ago

    At 9:50, how did that SQL database chat page open 😢? Maybe I am stupid, but I can't figure out whether you ran the code or not 😂

    • @alejandro_ao
      @alejandro_ao  5 months ago +1

      hey there, that’s a good question. what i opened there is not actually part of the app. i just opened the command line of my computer and showed that i have a mysql server running locally ;)

    • @adelinadonisa
      @adelinadonisa 5 months ago

      @@alejandro_ao Thank you!

  • @historus
    @historus 3 months ago

    Hello Alejandro and thx for the video. I'm having an error with "from langchain_community.utilities import SQLDatabase". It says "Import "langchain_community.utilities" could not be resolved". Also, I have this error for "from langchain_groq import ChatGroq": "Import "langchain_groq" could not be resolved". Any solution plz? thx.

    • @alejandro_ao
      @alejandro_ao  3 months ago +1

      hey there. it seems like you don't have the packages installed. did you run `pip install langchain-community langchain-groq` ?
      be sure that you have your conda environment activated when running this!

    • @historus
      @historus 3 months ago

      @@alejandro_ao hi. Yes, I have all the packages. I just restarted VS Code and all the errors are gone. Now it's ok 👍. Thank you for the video and for the answer 🙏.

  • @syedhashir5014
    @syedhashir5014 5 months ago

    can i build the same with MongoDB?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      of course! but you would need another type of chain. i'll make a video about this soon

    • @syedhashir5014
      @syedhashir5014 5 months ago

      @@alejandro_ao plzz do

  • @Namita212
    @Namita212 5 months ago

    Can anyone tell me where I can get the keys mentioned in the .env file?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      hey there. if you want to use openai's models, you have to create an account at platform.openai.com. their models are paid, but they are the best, plus each query is like 2 cents and i think they are still giving away a few dollars for first-time signups.
      if you want to use mistral (for free), just go to groq.com and create an account.
      you can now create an API key in their portal and use it like in the video. make sure to use the same variable names as in my .env file
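
      For reference, the .env file is just plain text with the keys, something like this (the values are placeholders):

      OPENAI_API_KEY=your-openai-key
      GROQ_API_KEY=your-groq-key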

    • @Namita212
      @Namita212 5 months ago

      Hey, thanks and congrats on your diploma. The first two I have got; where do I get the LangChain API key?

  • @zefur321
    @zefur321 1 month ago

    How do I obtain these API keys?

    • @alejandro_ao
      @alejandro_ao  1 month ago

      you can go here: platform.openai.com/
      remember this is a premium api. if you want a free version of this, you can use groq by creating an account here: console.groq.com/login and implementing it like this: python.langchain.com/v0.2/docs/integrations/providers/groq/

  • @tapanpati9452
    @tapanpati9452 5 months ago

    Can you please make a video on how to use LangSmith?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      definitely, i'm working on it!

  • @Namita212
    @Namita212 5 months ago

    Please tell me where I can get the LangChain API key, sorry to bother you again

    • @alejandro_ao
      @alejandro_ao  5 months ago

      no bother at all. you don’t actually need that to run this app. the api key you see there is to use Langsmith (the logging platform where i showed the steps of the chain and the intermediate results).
      you can run the whole app without it. but if you want to check the logs of each step of your chain, you can get it here: www.langchain.com/langsmith
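
      For reference, enabling the LangSmith traces is just a couple of extra environment variables in the same .env file (names as in the LangSmith docs; the project name is a placeholder):

      LANGCHAIN_TRACING_V2=true
      LANGCHAIN_API_KEY=your-langsmith-key
      LANGCHAIN_PROJECT=mysql-chat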

    • @Namita212
      @Namita212 5 months ago

      Thanks for explaining. You are an awesome guy. ❤

  • @galdakaMusic
    @galdakaMusic 5 months ago

    Is it possible to do it all in local mode?

    • @alejandro_ao
      @alejandro_ao  5 months ago

      of course, in this video i am using a local mysql database. if you refer to a local model, yes. you can use ollama instead of openai or mistral.