GPT4ALL Tutorial: Create a Local RAG with No Coding (Bonus: Vision and Uncensored Models)

  • Published Jan 12, 2025

COMMENTS • 24

  • @dognini
    @dognini 22 days ago +1

    Thank you. This is great content.

  • @intelligentestate
    @intelligentestate 12 days ago +1

    Hey, I'm really curious how you were able to get Hermes in GPT4ALL to view online photos. I've tried using LocalDocs to connect my models to web content, but no dice....

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  12 days ago

      What version of gpt4all are you using? It was reported earlier that certain versions after 3.4.2 had issues with some of the models. In my case, no special configuration was required.

    • @intelligentestate
      @intelligentestate 12 days ago

      @@AISoftwareDevelopers I updated to their latest version (1.6.1). It's on Windows. I can get my visual models to hallucinate an answer (based on whatever the address says), but other than that, no dice.

  • @capt2026
    @capt2026 21 days ago +1

    Newbie here. Very interesting, thanks. You have several models downloaded. Are they on an external drive, and what is the configuration of your machine?

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  21 days ago +1

      The models are stored on my local drive: a 14'' MBP (late 2023) with an M3 and 1TB of storage. You can store them on an external drive if space is an issue; operations will be slower, but it will work. Thanks for the comment!

  • @geelws8880
    @geelws8880 25 days ago +1

    I would love a video on how to build custom, high-quality datasets with Nomic.

    • @albertcurtis1201
      @albertcurtis1201 25 days ago

      You really can't with their PC models ... they are very stupid.

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  25 days ago

      Can you elaborate on the use case and the tools? Is this with Nomic Atlas or GPT4ALL?

  • @mohameddonia6544
    @mohameddonia6544 23 days ago +1

    Hey, thanks for sharing! Does it have a limit on PDF file size? I've got some files that are almost 5GB. Will it work?

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  23 days ago +2

      I am not aware of any limits, but parsing a PDF of that size will be a challenge for any application, not just gpt4all. A powerful CPU, plenty of RAM, and a GPU may help. Otherwise, you may want to parse the PDFs into Markdown first, using something like LlamaParse (paid), and then process the MD files in gpt4all. The embeddings will still take time, though.
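
      A minimal sketch of that PDF-to-Markdown step, assuming the llama_parse Python package and a LlamaCloud API key (not shown in the video; the file names are placeholders). The resulting .md file can then go into a LocalDocs folder:

      ```python
      # Hypothetical sketch: convert a large PDF to Markdown with LlamaParse,
      # then save the result so GPT4All's LocalDocs can index the .md file.
      from llama_parse import LlamaParse  # pip install llama-parse

      parser = LlamaParse(
          api_key="llx-...",       # your LlamaCloud API key (paid service)
          result_type="markdown",  # ask for Markdown output
      )

      # "big_report.pdf" is a placeholder for your large PDF
      documents = parser.load_data("big_report.pdf")

      # Write the parsed text out as Markdown for LocalDocs to pick up
      with open("big_report.md", "w", encoding="utf-8") as f:
          for doc in documents:
              f.write(doc.text + "\n\n")
      ```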

  • @SteveHodgkiss1
    @SteveHodgkiss1 23 days ago +1

    As it's an installed application, is there any way to use the local models inside an editor such as Visual Studio or Windsurf?

    • @themax2go
      @themax2go 23 days ago +2

      Yes, in the options activate the OpenAI endpoint.

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  23 days ago

      I don’t see a reason why not. The models are downloaded to a folder you can configure, so you can load and use them from anywhere else you need to. Great question!
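
      As an illustration, a minimal sketch using the gpt4all Python bindings to reuse a model file the app has already downloaded; the folder and file names below are placeholders and should match your configured download directory:

      ```python
      # Hypothetical sketch: load a model file already downloaded by the GPT4All app.
      from gpt4all import GPT4All  # pip install gpt4all

      model = GPT4All(
          model_name="Meta-Llama-3-8B-Instruct.Q4_0.gguf",  # placeholder file name
          model_path="/path/to/your/gpt4all/models",        # placeholder: your configured download folder
          allow_download=False,  # use the existing file instead of fetching it again
      )

      with model.chat_session():
          print(model.generate("What is retrieval-augmented generation?", max_tokens=200))
      ```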

  • @naitik_patel
    @naitik_patel 23 days ago +1

    Can I use this as a local server and use its locally hosted API for my other projects? If it does, that will be awesome; if not, I think that's a good feature for the next iteration to implement ❤

    • @themax2go
      @themax2go 23 days ago +1

      Already done, see my other post.

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  23 days ago

      Yes, as @themax2go pointed out, you can configure and expose an API endpoint and have other apps use the models.
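
      For other projects, that ends up looking like a standard OpenAI client pointed at localhost. A minimal sketch, assuming the local API server option is enabled in GPT4All's settings and left on its default port (4891); the model name is a placeholder for whatever you have downloaded:

      ```python
      # Hypothetical sketch: call GPT4All's OpenAI-compatible local endpoint.
      # Assumes the local API server is enabled in the app's settings.
      from openai import OpenAI  # pip install openai

      client = OpenAI(
          base_url="http://localhost:4891/v1",  # default GPT4All server address; check your settings
          api_key="not-needed",                 # the local server does not validate the key
      )

      response = client.chat.completions.create(
          model="Llama 3 8B Instruct",  # placeholder: a model you have downloaded
          messages=[{"role": "user", "content": "Summarize what RAG means in one sentence."}],
      )
      print(response.choices[0].message.content)
      ```

      Editor extensions or other apps that accept a custom OpenAI-compatible base URL can point at the same address.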

    • @naitik_patel
      @naitik_patel 23 days ago

      @@AISoftwareDevelopers Thanks, I will surely try it.

  • @albertcurtis1201
    @albertcurtis1201 25 days ago +1

    If you use their 3.5.0 and above, you won't be able to side-load models ... Downgrade to 3.4.2, which rocks.

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  25 days ago

      I wasn't aware of this, but after checking, I see they have already released three minor updates since the video was recorded. A fast-paced team, for sure 😃

    • @albertcurtis1201
      @albertcurtis1201 24 days ago +1

      @@AISoftwareDevelopers Those minor updates still don't load most HF models out of the box. Your mileage may vary. I use 3.4.2.

    • @adamtreat7582
      @adamtreat7582 24 days ago +1

      You can use side-loaded models just fine, but it might require tweaking the chat template. The latest version (3.6.0), which was just released, does have replacements and examples for several well-known side-loaded models.

    • @AISoftwareDevelopers
      @AISoftwareDevelopers  23 days ago +1

      @@adamtreat7582 thanks for chiming in. What is a good link to learn more about this? If there's enough interest, maybe I can throw together a quick tutorial on how to side-load models?