Run 3 Open-Source LLMs on Google Colab - for FREE ⚡️ Top Generative AI Model Hands-on (Hugging Face)

  • Published Aug 12, 2024
  • Here's how to get started and get hands-on with leading open-source Generative models, using the free T4 GPU on Google Colab.
    ✅ Download Free Resources & Code Here 🔗 bit.ly/45X8k8N
    ✅ Run Stable Diffusion on Colab [Part-2] 🔗 • Run Stable Diffusion i...
    Over the past year, the world has been wholeheartedly appreciative of ChatGPT and all the cool things we can do with it. But all this while, a silent revolution has been taking shape and is now ready to disrupt the whole domain of Generative AI. This silent revolution is nothing but the emergence of more than a handful of Open Source Models released into the public domain. Some of these Open Source Models have even beaten, or come fairly close to, GPT-4 on certain task-specific use cases, like code writing.
    So, in this video, we are going to get hands-on with three different Open Source Generative Models:
    1️⃣ Dolly-v2-3b Model (Databricks) 🔗 huggingface.co/databricks/dol...
    2️⃣ Falcon-7b-Instruct Model (TII) 🔗 huggingface.co/tiiuae/falcon-...
    3️⃣ Stable-Diffusion-v1-5 (RunwayML) 🔗 huggingface.co/runwayml/stabl...
    ---------------------
    Sections 🔥
    ---------------------
    00:00 Introduction
    01:44 Dolly-v2-3b (Databricks)
    07:17 Let's build an AI App on Dolly-v2-3b
    09:28 Falcon-7B-Instruct (TII)
    10:22 Stable Diffusion v1-5
    To ensure that all of us are able to follow along, we'll be running these models on the free version of Google Colab, using its free T4 GPU. In fact, that's precisely why we have chosen these smaller-sized models: so that inference runs smoothly. Yes, a smaller model means weaker performance, but the idea here is to explore the possibilities rather than pursue perfection.
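    To give you a taste of the hands-on part, here's a minimal loading sketch for the first model, assuming the databricks/dolly-v2-3b checkpoint listed above and a Colab notebook with the T4 GPU runtime enabled (the prompt and the half-precision setting are illustrative, following the model card's suggested usage):
    # minimal sketch: load Dolly-v2-3b on the free Colab T4 (Runtime > Change runtime type > T4 GPU)
    !pip install -q transformers accelerate
    import torch
    from transformers import pipeline
    generate_text = pipeline(
        model="databricks/dolly-v2-3b",
        torch_dtype=torch.float16,   # half precision so the 3B model fits comfortably in the T4's 16 GB
        trust_remote_code=True,      # Dolly ships a custom instruction-following pipeline
        device_map="auto",           # place the model on the GPU automatically
    )
    res = generate_text("Explain what an open-source LLM is in two sentences.")
    print(res[0]["generated_text"])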
    ---------------------
    Tags ⭐️
    ---------------------
    run open-source LLM for free in Google Colab/Kaggle
    free gpu
    open source llm
    open source llm models
    llm open source
    opensource llm
    free member nvidia
    llm inference
    free cloud gpu
    free gpu cloud
    cloud gpu free
    free gpu for deep learning
    free gpu online
    free gpu cloud computing
    google cloud gpu availability
    free online gpu

COMMENTS • 30

  • @Analyticsvidhya • 14 days ago

    Book FREE 1:1 Mentorship for Gen AI / Data Science
    Link 🔗 bit.ly/3wlIIGz

  • @swetasharma8467 • 9 months ago

    Excellent video! Thanks for explaining the process in such a detailed manner. Looking forward to more videos!!

  • @souravbarua3991 • 9 months ago +3

    Thank you for the video. I have used the Mistral-7b open-source model with ctransformers and LangChain in my project. From this video I came to know of 3 more open-source models I can use in my project.

    • @Analyticsvidhya • 9 months ago

      Hey Sourav, it would be interesting to hear more about your project. Is it documented on GitHub?
      Also, share your experience with the Mistral-7b model with us.

    • @souravbarua3991 • 9 months ago +1

      @@Analyticsvidhya Glad to hear that you liked my project. No, the project is not on GitHub yet; I will upload it soon. The project is about chatting with a local PDF using the Mistral-7b model. My experience with Mistral has been good, and using an open-source LLM with ctransformers is quite easy.

    • @sv4647 • 6 months ago +1

      @@Analyticsvidhya Hi, for my project I am looking for a small Hugging Face model that can understand both code and text.
      In my chatbot project I am loading GitHub repositories and asking questions about them.

    • @knkn5049 • 6 months ago

      @@souravbarua3991 Do you know how to change the code so it remembers your dialogue? Today I found a step-by-step tutorial for Colab and launched it, and every answer takes 7 minutes; is that normal?
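
      For anyone who wants to try the ctransformers route mentioned above, and as one simple answer to the dialogue-memory question: here is a minimal sketch, assuming a quantized Mistral-7B-Instruct GGUF build from the Hugging Face Hub (the repo and file names are illustrative, not from the video). The "memory" is just the running transcript fed back in with each turn. Answers that take several minutes are usually a sign the model is running on the CPU, so make sure the GPU runtime is enabled and layers are offloaded.

      !pip install -q "ctransformers[cuda]"   # the [cuda] extra enables GPU offloading

      from ctransformers import AutoModelForCausalLM

      # illustrative 4-bit quantized Mistral build; any GGUF file from the Hub works the same way
      llm = AutoModelForCausalLM.from_pretrained(
          "TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
          model_file="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
          model_type="mistral",
          gpu_layers=50,        # offload layers to the Colab GPU; 0 means CPU-only (much slower)
      )

      # naive dialogue memory: keep the transcript and prepend it to every new prompt
      history = ""
      for question in ["What is retrieval-augmented generation?", "Give one example of it."]:
          prompt = f"{history}[INST] {question} [/INST]"
          answer = llm(prompt, max_new_tokens=256)
          history += f"[INST] {question} [/INST] {answer}\n"
          print(answer)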

  • @ayushsharma413 • 6 months ago

    Thank you so much, I've been searching for how to host LLMs on Colab and use them.

  • @hajamydeen2025 • 6 months ago

    Thank you for the good teaching.

  • @abdulazizdeveloper7915 • 4 months ago

    Excellent explanation, thank you very much.

  • @AliYlmaz-nb4vn • 2 months ago

    Thank you for sharing. One note: transformers > 4.38 causes a bug in inference for these models.
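
    If you hit that, a hedged workaround is to pin the library before loading the models; note that the 4.38 threshold is the commenter's observation and has not been verified here.

    !pip install -q "transformers<4.38"   # pin below the version reported to break inference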

  • @mekkicharfi5454 • 2 months ago

    Very nice job. I subscribed to your page. Thank you very much!

  • @eduardoerlo5197 • 3 months ago

    This channel is amazing!

  • @user-tj1io2ts4b • 9 months ago

    Great video!!

  • @aatkafaryal1199 • 6 months ago

    I am getting the below error when running the Gradio snippet:
    cannot import name 'Doc' from 'typing_extensions'. I tried this but it is still not working.
    !pip install gradio
    import gradio
    import torch

    • @Analyticsvidhya • 6 months ago

      Try restarting the runtime and then executing the Gradio part.

  • @bibimblapblap • 5 months ago

    How can you avoid needing to download the model every time? I saved the model to Google Drive and transferred it to the cache in a new session, but I can't successfully load it from the cache.

    • @Analyticsvidhya • 5 months ago

      Refer to this discussion thread for your resolution: discuss.huggingface.co/t/saving-loading-model-in-colab-and-making-predictions/6723
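
      For anyone with the same question, a minimal sketch of one common approach: save the model to a mounted Google Drive once and load it from there in later sessions. The Drive path below is illustrative, and Dolly-v2-3b is used only because it is the model from the video.

      from google.colab import drive
      from transformers import AutoModelForCausalLM, AutoTokenizer

      drive.mount('/content/drive')
      save_dir = "/content/drive/MyDrive/dolly-v2-3b"   # illustrative folder on your Drive

      # first session: download once, then persist the weights to Drive
      model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-3b")
      tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-3b")
      model.save_pretrained(save_dir)
      tokenizer.save_pretrained(save_dir)

      # later sessions: load directly from Drive instead of re-downloading
      model = AutoModelForCausalLM.from_pretrained(save_dir)
      tokenizer = AutoTokenizer.from_pretrained(save_dir)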

  • @swapnil0402 • 8 months ago

    Thanks for the video. I am getting the below error when running the Gradio snippet:
    cannot import name 'Doc' from 'typing_extensions'
    Can you please help me understand how to resolve it? Thanks.

    • @Analyticsvidhya • 8 months ago

      Can you share your query on our community platform: community.analyticsvidhya.com/
      Also share screenshots. Let's discuss this there.

    • @vassilissolachidis1199 • 8 months ago +1

      It's Colab's issue. Try this:
      !pip install gradio   # reinstalling gradio also pulls in a compatible typing_extensions
      import gradio
      import torch
      (if you have already imported torch, restart the kernel first)

    • @swapnil0402 • 8 months ago

      @@vassilissolachidis1199 Thanks for the help. I was able to resolve the issue.

  • @abdulazizdeveloper7915 • 4 months ago

    I have a question: can I expose this code as an API? I want to build a Generative AI mobile app for my graduation project.

    • @Analyticsvidhya • 3 months ago

      Dear learner, drop an email to our Career Counselling team for a free mentorship session: ummed@analyticsvidhya.com
      Alternatively, you may call any of the following numbers:
      Line 1: +91 8068342847
      Line 2: +91 8046107668
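
      To the original question: yes, the same pipeline can sit behind a small web API. Below is a minimal sketch using FastAPI, which is not covered in the video; the route name and request schema are illustrative, and the Dolly-v2-3b pipeline is reused from earlier. On Colab you would also need a tunnel (e.g. ngrok or cloudflared) to reach the server from a mobile app; for a real project, proper hosting is the better route.

      import torch
      from fastapi import FastAPI
      from pydantic import BaseModel
      from transformers import pipeline

      # load the same instruction-following pipeline used in the video
      generate_text = pipeline(
          model="databricks/dolly-v2-3b",
          torch_dtype=torch.float16,
          trust_remote_code=True,
          device_map="auto",
      )

      app = FastAPI()

      class Prompt(BaseModel):
          text: str

      @app.post("/generate")                    # illustrative route name
      def generate(prompt: Prompt):
          result = generate_text(prompt.text)   # returns a list of dicts with "generated_text"
          return {"response": result[0]["generated_text"]}

      # run with: uvicorn your_module:app --host 0.0.0.0 --port 8000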

  • @Kkschannel-zk3up • 16 days ago

    Does it have a word limit?

    • @Analyticsvidhya • 6 days ago

      Yes, look up the context window length. Generally it's 128k.