Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks, with Patrick Lewis, Facebook AI

  • Published 26 Oct 2024

COMMENTS • 7

  • @TheMriganks · 4 years ago · +9

    Good session, possible to get the slides?

  • @JohnCrafts-c7s · 1 year ago

    Use a pre-completion document that stores the cached response of the model to write to, as a sort of reverse RAG (it gives a space for multi-hop to happen, and stores context for cross-document RAG). (A rough sketch of this idea appears after the comments.)

  • @_HarshVerma · 3 years ago · +7

    Now this guy knows what he is talking about; otherwise every other person talking about RAG on UA-cam is just saying gibberish.

    • @bhaumikpatel4902 · 1 year ago · +5

      Probably because he was a co-author 😅

    • @ZKYQUQ · 10 months ago

      You're right bro. Love u. @bhaumikpatel4902

  • @FunCodingwithRahul · 1 year ago

    Can I integrate any LLM as the generator model, e.g. Falcon? I am getting errors while doing it. Any lead would be of great help! (A possible workaround is sketched after the comments.)

  • @DistortedV12 · 3 years ago

    Awesome work
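
Two of the comments above touch on concrete implementation ideas; rough sketches follow. First, the "pre-completion document" idea from @JohnCrafts-c7s: a minimal Python sketch, assuming hypothetical `retrieve`, `generate`, and `add_document` helpers (none of these come from the original RAG codebase), of writing each completion back into the document store so a later hop, or a later cross-document query, can retrieve it as context.

```python
# Minimal sketch of the "pre-completion document" idea: every generated answer
# is written back into the document store, so a later query (or a later hop of
# the same query) can retrieve earlier completions as context.
# `retrieve`, `generate`, and `add_document` are hypothetical stand-ins for a
# real retriever, generator, and index writer.

from typing import Callable, List


def multi_hop_answer(
    question: str,
    retrieve: Callable[[str], List[str]],   # top-k passages for a query
    generate: Callable[[str], str],          # completion for a prompt
    add_document: Callable[[str], None],     # write a passage into the store
    hops: int = 2,
) -> str:
    answer = ""
    for hop in range(hops):
        # First hop retrieves on the question; later hops retrieve on the
        # previous completion, which may itself have been cached below.
        passages = retrieve(question if hop == 0 else answer)
        prompt = "\n".join(passages) + f"\n\nQuestion: {question}\nAnswer:"
        answer = generate(prompt)
        # "Reverse RAG": cache the intermediate completion as a document so the
        # next hop (or a later, cross-document query) can retrieve it.
        add_document(f"Q: {question}\nA (hop {hop}): {answer}")
    return answer
```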
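On @FunCodingwithRahul's question about using Falcon as the generator: a likely cause of the errors is that the Hugging Face RAG classes (e.g. RagSequenceForGeneration) are built around a seq2seq generator such as BART, so a decoder-only model like Falcon cannot simply be dropped in. A common workaround is a plain retrieve-then-prompt loop. The sketch below assumes an illustrative model id and a hypothetical `retrieve` helper that returns passage strings.

```python
# Hedged sketch of a retrieve-then-prompt workaround for using a decoder-only
# LLM (e.g. Falcon) as the generator instead of RAG's seq2seq generator.
# The model id and the `retrieve` helper are illustrative assumptions.

from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "tiiuae/falcon-7b-instruct"  # any causal LM works; this id is just an example

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# device_map="auto" requires the `accelerate` package; drop it to run on CPU.
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")


def answer(question: str, retrieve) -> str:
    # `retrieve(question)` is assumed to return a list of passage strings
    context = "\n".join(retrieve(question))
    prompt = f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=128)
    # Strip the prompt tokens and return only the newly generated answer.
    new_tokens = output_ids[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```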