Data Agents with LlamaIndex

  • Published Nov 22, 2024

COMMENTS • 10

  • @AI-Makerspace • 7 months ago +2

    Google Colab Notebook: colab.research.google.com/drive/1JOzbVzrm8_GJAmuh2Qcjsxf5Rg0yK3AG?usp=sharing
    Event Slides: www.canva.com/design/DAGCrgbYdc0/Q56HqhQAp_-163pJNm_fmA/view?DAGCrgbYdc0&

  • @prasad_yt • 5 months ago +1

    I like the way you simplify and explain: starting with the big picture and then breaking it down into the details. ❤

  • @roopad8742 • 7 months ago +1

    Really good session! Always looking forward to your slide decks and easy explanations!

  • @sitedev • 7 months ago +1

    Awesome as usual! I’m doing all of my development in Flowise, and all of this information is useful and mostly transferable to it. Thanks.

  • @techgiantt • 7 months ago +4

    Can you guys showcase the function calling feature with smaller models? Some of us are trying to build locally. Anyway, great work 👍🏻

    • @AI-Makerspace • 7 months ago

      We'll see when we can slide this in!

  • @tyessenov • 7 months ago +1

    Great video, thanks! Is it possible to include an image in an LLM response? Text from RAG and also an image as base64 sent to the chat app?

    • @AI-Makerspace • 7 months ago +1

      It is possible, yes! You just need to handle the image pipeline, and LlamaIndex has pipelines for this built in!
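      The reply above leaves the wiring to the reader. As a minimal stdlib sketch of what "sending text plus a base64 image to the chat app" can look like, the helper below packages a RAG answer and raw image bytes into one JSON payload; the function name and field names are illustrative assumptions, not LlamaIndex API.

      ```python
      import base64
      import json

      def build_multimodal_response(answer_text: str,
                                    image_bytes: bytes,
                                    mime: str = "image/png") -> str:
          """Bundle a RAG text answer and an image (base64-encoded) into one
          JSON payload a chat front end can render. Schema is hypothetical."""
          payload = {
              "text": answer_text,
              "image": {
                  "mime_type": mime,
                  # base64 turns arbitrary bytes into ASCII-safe text for JSON
                  "data": base64.b64encode(image_bytes).decode("ascii"),
              },
          }
          return json.dumps(payload)

      # Usage: the client parses the JSON and decodes the image back to bytes.
      raw = build_multimodal_response("Here is the chart you asked for.", b"\x89PNG...")
      decoded = json.loads(raw)
      original_bytes = base64.b64decode(decoded["image"]["data"])
      ```

      The same shape works whether the image comes from a multi-modal model or from a retrieved document; only the producer of `image_bytes` changes.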

  • @vijaybrock • 7 months ago

    Hello sir, can we add query transformations to the whole pipeline above? My use case is building a RAG pipeline that chats with multiple 10-K reports, using a domain-specific LLM for embeddings and query transformations; of course, I also want to leverage the AutoRetrieveingTool. Can you suggest an approach?