Chat with an entire library of 100 books and 17 million tokens using RAG, Chromadb and GPT-4o.

  • Published Nov 2, 2024

COMMENTS • 13

  • @echohive  3 months ago +5

    Download the full project files for this project at my Patreon, along with 250+ other projects: www.patreon.com/posts/chat-with-whole-108904494
    Talk with me this Sunday the 28th, AMA for Architect+ level Patrons: www.patreon.com/posts/ama-meetings-for-108628691
    Learn to code fast with AI assistance with my 1000x MasterClass: www.patreon.com/posts/1000x-dev-103326330
    Search 200+ echohive videos and code download links: www.echohive.live/
    Auto Streamer: www.autostreamer.live/
    FastAPI course: www.patreon.com/posts/learn-fastapi-26-95041684
    Chat with us on Discord: discord.gg/PPxTP3Cs3G
    Follow on Twitter (X): twitter.com/hive_echo

  • @dixon1e  3 months ago +11

    This feels like sitting next to a friend with shared interests, leading me through the cool stuff he’s been up to. This is deeply appreciated.

    • @echohive  3 months ago +1

      This made my day! Thank you very much. I am happy to hear that my daily endeavors of learning and coding are appreciated ❤️

    • @dittoXtime  3 months ago +1

      Agreed! What a treat to have found this channel.

    • @echohive  3 months ago

      @dittoXtime thank you very much as well 🙏

  • @jankothyson  3 months ago +6

    Hey man, just wanted to let you know that I've been following you for close to a year now and really appreciate your content - especially your laid-back, down-to-earth style and the fact that you always make your code accessible. I'm thinking about joining your Patreon once I can make more time to actually engage with the content. Keep up the great work 🙂✌️

    • @echohive  3 months ago

      Thank you very much for the kind words and the feedback. I am happy to hear you find the projects useful. Appreciate it 🙏

  • @MichaelWoodrum  3 months ago +2

    I'm definitely going to look through these code files to see how I can adapt this for a large memory system, and see how well it works. My assistant needs context awareness, including months of context it can search through so its comments are more relevant than a single day's context window allows. I can't wait until we have seemingly unlimited context windows, but we may get there through back-end tricks like this one, or through several techniques combined. A parallel agent that constantly searches for relevant memories on every submission could inject past data into the main agent's context window, making responses more relevant and giving it something like a real memory system, so I don't have to keep re-explaining things I've already talked about.

    • @echohive  3 months ago

      Yeah, that is a great idea! I want to work on a personal memory project as well, similar to what you mentioned. Check out this GitHub repository for memory, too: github.com/mem0ai/mem0

  • @john_blues  3 months ago +1

    Thanks for this video. Can you show how to set this up using an open-source model like Llama 3.1?

    • @echohive  3 months ago +1

      Thank you as well 🙏 You can easily do that with OpenRouter (which I've made videos about before): just change the base URL in the OpenAI library's client initialization to OpenRouter's, then use "meta-llama/llama-3.1-405b-instruct" as the model. Here is a link to OpenRouter: openrouter.ai/models
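      The base-URL swap described above can be sketched as follows. This is a minimal sketch, not the project's code: the `ask` helper and the `OPENROUTER_API_KEY` environment-variable name are assumptions, while the model name comes from the comment and the endpoint is OpenRouter's OpenAI-compatible API.

      ```python
      import os

      # OpenRouter exposes an OpenAI-compatible API, so the standard OpenAI
      # client works unchanged; only the base_url and API key differ.
      BASE_URL = "https://openrouter.ai/api/v1"
      MODEL = "meta-llama/llama-3.1-405b-instruct"  # model name from the comment above

      def ask(question: str) -> str:
          from openai import OpenAI  # pip install openai
          client = OpenAI(
              base_url=BASE_URL,  # point the client at OpenRouter instead of OpenAI
              api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
          )
          resp = client.chat.completions.create(
              model=MODEL,
              messages=[{"role": "user", "content": question}],
          )
          return resp.choices[0].message.content

      # Only make a live request if a key is actually configured.
      if __name__ == "__main__" and os.environ.get("OPENROUTER_API_KEY"):
          print(ask("Summarize chapter one of the first book."))
      ```

      Because the rest of the RAG pipeline only talks to the chat-completions interface, this one-line `base_url` change is typically the only edit needed to swap GPT-4o for an open model.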

    • @john_blues  3 months ago +1

      @echohive Thanks! I've had OpenRouter open in another tab for a few days. I guess it's time to take a look at it. :)

    • @echohive  3 months ago

      @john_blues yeah it is super easy. You will love it.