Data Analysis with PandasAI and Ollama - Locally and Free

  • Published 7 Jan 2025

COMMENTS • 52

  • @HistoryIsAbsurd
    @HistoryIsAbsurd 10 months ago +4

    Definitely worth the sub! Was looking for exactly this.
    Edit: also, thank you!

  • @mapledev9335
    @mapledev9335 9 months ago +2

    PandasAI is brilliant. Thanks for putting this video together!

    • @mapledev9335
      @mapledev9335 9 months ago

      I used the same question but only get the table. Change it a bit and I get a bar graph. Rephrase the question again and I get an error. Any suggestions on how to get both the table and the chart like in your video?

    • @TirendazAI
      @TirendazAI 9 months ago

      Prompt engineering is important for getting good results. I got this response after trying a few prompts. Try writing more detailed prompts.

  • @aroonsubway2079
    @aroonsubway2079 8 months ago +2

    Thanks for this great video! One question from me: since classical pandas-style data analysis performed well, what is the potential advantage of using an LLM? Is it possible that an LLM can provide more intelligent analysis than Excel?

    • @TirendazAI
      @TirendazAI 8 months ago

      The advantage of this tool is that you can use it to explore, clean, and analyze your data using generative AI. All you need to do is talk to your data. If you put in good prompts, you can get good outputs. You can also use this library to analyze your Excel data.

    • @aroonsubway2079
      @aroonsubway2079 8 months ago

      @@TirendazAI Yes, a user-friendly Q&A is definitely important. Thanks! But do you think there is a scenario where Excel fails because it is not intelligent enough, but an LLM has a chance to outperform it?

    • @TirendazAI
      @TirendazAI 8 months ago

      LLMs have huge potential for data analysis. True, they are not intelligent enough on their own; we can think of them as a black box. But you can help them return good outputs by using agents or RAG techniques.

  • @SajaniAbeysiriwardhana-fv7sj
    @SajaniAbeysiriwardhana-fv7sj 3 months ago

    Thanks for the video! I tried this using Ollama in Docker with different models like llama3, llama2, and mistral on a network at my workplace, but I get a gateway timeout. Why do I get this error? (Ollama is running without errors.)

  • @llIllIlI
    @llIllIlI 3 months ago

    Great video!
    How accurate will this be if your dataset is about 100 MB in file size?

  • @PuffNSnort
    @PuffNSnort 7 months ago +1

    Great video! What are the dataset size limitations? I get an answer 30% of the time and errors the rest of the time.

    • @TirendazAI
      @TirendazAI 7 months ago

      Large models like Llama-3:70b and GPT-4 respond better.

  • @AliAlias
    @AliAlias 10 months ago +1

    Thanks very much ❤❤❤
    I've been waiting for this video for a long time 😊
    Q: What is the best open-source LLM from Hugging Face for PandasAI & SQL data analysis?

    • @TirendazAI
      @TirendazAI 10 months ago

      You're welcome. Choosing the best model varies from task to task. I love working with Mixtral for PandasAI.

  • @GhostCoder83
    @GhostCoder83 8 months ago +1

    Thanks Bro. Very Helpful.

  • @jason77nhri
    @jason77nhri 8 months ago

    The tutorial video was truly amazing, with very clear subtitle translations.
    However, I'd like to know how to use PandasAI + Ollama in VS Code like you demonstrated.
    Also, how can I implement Ollama + Llama3 + LlamaIndex?
    What are the minimum computer specifications required for such usage?
    Additionally, how should I write the Python code to ensure that consecutive queries are processed in the same thread?
    Best regards.

    • @TirendazAI
      @TirendazAI 8 months ago

      For example, you need roughly 8 GB of RAM to work with a 7B LLM. Unfortunately, the responses generated by LLMs are not stable; you may need to try several prompts to get the output you want.

    • @jason77nhri
      @jason77nhri 8 months ago

      @@TirendazAI
      Thank you very much. However, my current computer setup includes an i7 processor, an RTX 2060 6 GB graphics card, and 16 GB of RAM. Can I run a local LLM with these specifications?

    • @TirendazAI
      @TirendazAI 8 months ago +1

      Yes, you can run a local LLM such as Llama3:8b or mistral:7b.

    • @jason77nhri
      @jason77nhri 8 months ago

      @@TirendazAI Thank you for your response. Originally, I planned to spend about $120 to upgrade my laptop's RAM from 16 GB to 64 GB.
      Additionally, could you tell me what software you used in your video demonstration? The interface looks very similar to VS Code, which I am currently trying to learn on my own.
      However, I'm not sure how to set up the environment. My command prompt shows Python 3.10.6.
      Do you have any videos that explain how to install Python and use related editors? Setting up the Python environment seems a bit complex, as it appears you have to set up the environment before running the program.

  • @sorgulabiraz3161
    @sorgulabiraz3161 5 months ago

    Thank you. I've got an error like this:
    NameError Traceback (most recent call last)
    Cell In[30], line 2
    1 from pandasai import SmartDataframe
    ----> 2 df = SmartDataframe(data, config={"llm": llm})
    NameError: name 'data' is not defined

    • @Lifes_Student
      @Lifes_Student 4 months ago

      You had a traceback error and your first thought was to put it in a YouTube comment rather than something like ChatGPT? ...Wow.

    • @sorgulabiraz3161
      @sorgulabiraz3161 4 months ago

      @@Lifes_Student I asked ChatGPT before and it didn't give me a solution. I am a beginner, and I think I am free to ask questions on any platform. Is there a problem for you? Who are you, the thought police?

    • @Lifes_Student
      @Lifes_Student 4 months ago +1

      @@sorgulabiraz3161 lol ...I'm just surprised someone is willing to wait for a solution in a comment rather than instantly debugging it with a language model these days. You should probably also know that you'll need a powerful graphics card to run the Mistral model locally. I recommend an RTX 3060.

    • @Lifes_Student
      @Lifes_Student 4 months ago +1

      And just an FYI, "data" is a variable. You need to define it first, i.e.:
      data = yourdata
      Then you can call 'data' or pass it to a function, e.g. function(data).

    • @sorgulabiraz3161
      @sorgulabiraz3161 4 months ago

      @@Lifes_Student Thank you.
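A minimal, concrete version of that fix (illustrative data and column names; plain pandas only):

```python
import pandas as pd

# `data` must exist before it is passed to SmartDataframe.
# Define it first, e.g. by loading your CSV or building a DataFrame:
data = pd.DataFrame({
    "Country": ["China", "India", "USA"],
    "Population": [1412, 1408, 333],  # millions, illustrative values
})

# Now `data` is defined and can be passed on, e.g.:
# from pandasai import SmartDataframe
# df = SmartDataframe(data, config={"llm": llm})
print(data.shape)  # (3, 2)
```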

  • @deepakkapoor5427
    @deepakkapoor5427 9 months ago

    I'm facing this error:
    'Unfortunately, I was not able to answer your question, because of the following error:

    No code found in the response
    '

    • @TirendazAI
      @TirendazAI 7 months ago

      When a prompt doesn't work, I try again with a reworded prompt.

  • @ajithnaidu6978
    @ajithnaidu6978 8 months ago

    Hi, unfortunately I'm not able to install pandasai in the terminal; it shows "Could not find a version that satisfies the requirement pandasai".

    • @TirendazAI
      @TirendazAI 8 months ago

      Hi, did you create a virtual environment? If yes, you can also use this command: poetry add pandasai

    • @ajithnaidu6978
      @ajithnaidu6978 8 months ago

      Is "poetry add pandasai" used in the terminal or in a cell?
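In the terminal, not a notebook cell. A sketch of both install routes, assuming a virtual environment is already created and activated:

```shell
# With plain pip inside the activated virtual environment:
pip install pandasai

# Or, if the project is managed with Poetry:
poetry add pandasai
```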

  • @sonidosdetranquilidad
    @sonidosdetranquilidad 10 months ago

    It does not work for me. I got the following error message:
    40 if llm is None or not isinstance(llm, LLM or LangchainLLM):
    ---> 41 raise LLMNotFoundError("LLM is required")
    42 return llm
    LLMNotFoundError: LLM is required

    • @TirendazAI
      @TirendazAI 10 months ago

      Make sure you've installed Ollama and downloaded a model.

    • @JAlcocerTech
      @JAlcocerTech 10 months ago

      Hello @tirendazakademi,
      Same error for me.
      I have Ollama version 0.1.28.
      Used: ollama pull mistral
      I tried these dependencies (the latest versions a week ago, when you uploaded) and got the very same error: LLMNotFoundError("LLM is required")
      pandasai==2.0
      langchain==0.1.10
      langchain-community==0.0.25
      Could you please verify exactly which packages you used?
      Thanks, and keep up the great content!😁

    • @AlessandroPerugini-hz7pk
      @AlessandroPerugini-hz7pk 10 months ago

      I have the same error. The LLM works with LangChain; I can call:
      llm.invoke("Tell me a joke")
      and get a valid response.
      But SmartDataframe does not recognize Ollama as a valid LLM. I have also tried installing the GitHub package.
      The only difference is that my Ollama version is 0.1.28.

    • @AlessandroPerugini-hz7pk
      @AlessandroPerugini-hz7pk 10 months ago

      @@JAlcocerTech With Miniconda I have the same versions and the same error; with Visual Studio Code it seems to work.
      These are the library versions I have:
      pandas 1.5.3
      pandasai 2.0.8
      langchain 0.1.11
      langchain-community 0.0.27
      langchain-core 0.1.30
      langchain-text-splitters 0.0.1
      I hope this helps.

  • @charlesoni2787
    @charlesoni2787 9 months ago +1

    Nice. Thanks

  • @liuivy2840
    @liuivy2840 9 months ago

    'Unfortunately, I was not able to answer your question, because of the following error:

    No code found in the response
    '

    • @liuivy2840
      @liuivy2840 9 months ago

      I changed the population.csv columns: Country to CountryName and Population to populations; now it works.

    • @TirendazAI
      @TirendazAI 9 months ago

      👍
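The column rename that resolved the error can be reproduced with plain pandas (illustrative rows; only the renaming step is essential):

```python
import pandas as pd

# Original column names from population.csv (illustrative rows):
data = pd.DataFrame({
    "Country": ["China", "India"],
    "Population": [1412, 1408],  # millions
})

# Renaming the columns as described above avoided the
# "No code found in the response" error for this commenter:
data = data.rename(columns={"Country": "CountryName", "Population": "populations"})
print(list(data.columns))  # ['CountryName', 'populations']
```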

  • @justicecharles3336
    @justicecharles3336 10 months ago

    My genai environment refuses to activate in conda.

    • @TirendazAI
      @TirendazAI 10 months ago

      Make sure you have the Anaconda platform installed on your computer.

    • @justicecharles3336
      @justicecharles3336 10 months ago

      @@TirendazAI I do... But whenever I try to activate genai in the terminal, it doesn't work... Do you think it's a problem with my VS Code?

  • @yusuf64956
    @yusuf64956 10 months ago

    Sir, why aren't you speaking Turkish?

    • @TirendazAI
      @TirendazAI 10 months ago

      Our Turkish channel is separate; you can reach it here: @tirendazakademi

  • @dominikcislak191
    @dominikcislak191 1 month ago

    Waiting ~1 min for trivial questions. Try using bigger files and you will get errors. AI is not ready for this.

  • @souravbarua3991
    @souravbarua3991 7 months ago

    It doesn't work all the time. It's handy but not reliable. By comparison, LangChain DataFrame agents work better than this.

    • @TirendazAI
      @TirendazAI 7 months ago

      If you are using a smaller model like Llama-3:8b, sometimes you may need to try a few prompts to get a good response.

    • @souravbarua3991
      @souravbarua3991 7 months ago

      @@TirendazAI I am using the same model as shown in the video.