Let's build a Local, 100% Private Tamil Bot 🤖 using Tamil-LLaMA, Streamlit, Langchain & Ollama

  • Published 27 Jan 2024
  • In this video we will learn how to create a local, 100% private, Tamil-capable chatbot using the newly available Tamil-Llama model with Ollama, Streamlit, Langchain & a little bit of Python (a rough code sketch follows this description).
    Tamil-LLM Bot Github Repo : github.com/Concepts-in-tamil/...
    Tamil-LLaMA model published on Ollama : ollama.ai/conceptsintamil
    Abhinand's arXiv paper : arxiv.org/pdf/2311.05845.pdf
    Link to HF GGUF Model : huggingface.co/abhinand/tamil...
    This video is produced with utmost passion & care by me, without any external help or team. Right from concept ideation, content preparation, scripting, voice recording, video editing, hand-drawn animation, visual effects and music curation/generation, I do everything on my own. Hence your simple like / comment / share / subscribe will be an immense motivation for me.
    Website : www.vizkumar.com
    Course Landing Page : courses.vizkumar.com
    Discord : / discord
    #tamil #technology #career #ml #ai #python #cloud #programming #langchain #llm #chatbot
  • Science & Technology
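
A rough, minimal sketch of the kind of app built in the video: Streamlit for the chat UI, Langchain's Ollama wrapper for the model call, and a Tamil-Llama model served locally by Ollama. The model tag "tamil-llama" is a placeholder for whichever tag you pull from ollama.ai/conceptsintamil; it assumes streamlit, langchain-community and a running Ollama server.

```python
# Minimal sketch: Streamlit chat UI + Langchain's Ollama wrapper + a local
# Tamil-Llama model. Run with `streamlit run app.py` while Ollama is running.
import streamlit as st
from langchain_community.llms import Ollama

llm = Ollama(model="tamil-llama")  # placeholder: the tag you pulled via Ollama

st.title("Tamil Bot 🤖 (100% local & private)")

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay the conversation so far
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.markdown(msg["content"])

# Read the next user message and answer it with the local model
if question := st.chat_input("உங்கள் கேள்வியை இங்கே எழுதுங்கள்..."):
    st.session_state.messages.append({"role": "user", "content": question})
    with st.chat_message("user"):
        st.markdown(question)
    with st.chat_message("assistant"):
        answer = llm.invoke(question)
        st.markdown(answer)
    st.session_state.messages.append({"role": "assistant", "content": answer})
```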

COMMENTS • 17

  • @anandg1141 · 2 months ago +1

    Super bro! Please upload a Chainlit tutorial when you can.

  • @user-oy1uq3nd6d · 3 months ago +1

    Bro, please prepare and post a video like this one using a Llama API key with your own data.

    • @conceptsintamil · 3 months ago

      You don't need an API key at all 😃. That's the beauty of hosting the LLMs locally. If you use OpenAI's LLMs (like ChatGPT), you have to pay OpenAI and get an API key. With a local LLM, it's free, private and secure.
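
For illustration, a minimal sketch of that point using Langchain's Ollama wrapper, assuming a locally pulled model (the "tamil-llama" tag is a placeholder): no key, no credentials, no payment anywhere in the code.

```python
# The model runs on your own machine via Ollama, so there is nothing to pay
# for and no API key to manage. The model tag below is a placeholder.
from langchain_community.llms import Ollama

llm = Ollama(model="tamil-llama")            # served locally, no credentials
print(llm.invoke("வணக்கம்! நீங்கள் யார்?"))  # "Hello! Who are you?"
```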

  • @sibichakkaravarthi3574 · 3 months ago +1

    Hi bro, Ollama now works on Windows too, but how do we use it for a particular domain?

    • @conceptsintamil · 3 months ago +1

      Yes bro. Ollama recently added Windows support. Once you have installed Ollama and downloaded any local LLM, you can easily fine-tune it for your domain either via a prompt-engineered system prompt or via LoRA methods. Most of the latest LLMs like Llama 2, Mistral, Mixtral, Dolphin, Gemma etc. are very versatile across domains; you can make them work with a little effort on prompt engineering.
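
A sketch of the prompt-engineering route mentioned above, assuming langchain-core / langchain-community and any model already pulled with `ollama pull`; the model tag and the domain wording are placeholders, not from the video.

```python
# Steer a general local model toward one domain with a fixed instruction
# baked into the prompt template (no weight updates involved).
from langchain_community.llms import Ollama
from langchain_core.prompts import PromptTemplate

llm = Ollama(model="mistral")  # placeholder: any locally pulled model

prompt = PromptTemplate.from_template(
    "You are a helpful assistant for Tamil-language banking customers. "
    "Answer only questions in that domain and politely refuse anything else.\n\n"
    "Question: {question}\nAnswer:"
)

chain = prompt | llm  # LCEL: render the prompt, then call the local model
print(chain.invoke({"question": "How do I reset my net banking password?"}))
```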

    • @sibichakkaravarthi3574 · 3 months ago +1

      @@conceptsintamil Thank you so much bro.

    • @mithunkumar25557 · 3 months ago +1

      @@conceptsintamil Bro, please put up a tutorial for this next. It will really help non-technical people in fine-tuning an LLM for a particular domain.

    • @conceptsintamil · 3 months ago

      @@mithunkumar25557 Yes, it's on the way 😊

    • @mithunkumar25557 · 3 months ago +1

      @@conceptsintamil Thanks a lot bro. Will be very helpful. I am also currently trying to fine-tune an LLM for my domain in Tamil.

  • @mithunkumar25557 · 3 months ago +1

    Brother, please also post a video on how to fine-tune Llama.

  • @vengatesan26 · 4 months ago +1

    I'm using Windows, so how do I do this?

    • @conceptsintamil · 4 months ago

      Ollama doesn’t support windows bro. One option is to run a Linux vm and run ollama inside . The chat bot can still run in windows

  • @nonamescurrently8373 · 2 months ago +1

    I'm getting the error "Ollama call failed with status code 404". What should I do?

    • @conceptsintamil · 2 months ago +1

      Ollama should be running, bro. Which OS are you using? I cannot paste external links here; can you join either the CIT Telegram or Discord and ask your question there? Happy to help!
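
A quick sanity check for that 404, sketched against Ollama's HTTP API (GET /api/tags lists locally available models; 11434 is the default port): it confirms the server is up and shows which models have actually been pulled.

```python
# Verify the Ollama server is reachable and list the models it knows about.
import requests

try:
    resp = requests.get("http://localhost:11434/api/tags", timeout=5)
    resp.raise_for_status()
    names = [m["name"] for m in resp.json().get("models", [])]
    print("Ollama is running. Models available locally:", names)
except requests.exceptions.ConnectionError:
    print("Ollama is not reachable. Start it (e.g. `ollama serve`) and retry.")
```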

    • @nonamescurrently8373 · 1 month ago

      @@conceptsintamil Bro, I asked on Discord, please reply.