Knowledge Graphs, Theorem Provers & Language Models - Vijay Saraswat & Nikolaos Vasiloglou | KGC '24

  • Published 6 Sep 2024
  • In this masterclass, Saraswat and Vasiloglou give a comprehensive review of the reasoning techniques developed for Language Models, such as Chain of Thought, Tree of Thoughts, Analogical Thinking, and Reasoning via Abstraction, along with their limitations.
    As a supplement to LLMs, Knowledge Graphs and Theorem Provers can be used as verification engines for the generated reasoning plans. In many cases there is not enough data to pretrain LLMs on complicated reasoning tasks; Knowledge Graphs can serve as generators of reasoning plans, producing a virtually unlimited amount of training data for language models.
    Watch this session to see Vijay and Nikolaos review recent examples from the literature of how KGs can act as a handy simulation tool. They go on to demonstrate how LLMs can be thought of as a Knowledge Hub. A year later, researchers came to see the LLM more as a world model, which is far more general. There is a strong consensus that the interface to the world model is natural language; the response, however, has to pass through a symbolic representation (a Knowledge Graph or a Theorem Prover) to be validated before it is communicated back to the user (a small sketch of this loop follows the description).
    This session was recorded at The Knowledge Graph Conference 2024. For unlimited access to all the keynotes, talks, masterclasses, and panels from this year's conference, sign up for #KGCReplay using the following link:
    events.knowled...
    ‪@relationalai8382‬ #LLM #languagemodels #techtalk
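
Below is a minimal, illustrative Python sketch (not taken from the talk) of that last idea: a language model proposes supporting facts, and a small knowledge graph acts as the symbolic validator before anything is returned to the user. The toy knowledge graph, the stubbed LLM call, and all function names are hypothetical.

```python
from typing import List, Set, Tuple

Triple = Tuple[str, str, str]  # (subject, predicate, object)

# Toy knowledge graph acting as the symbolic source of truth (hypothetical data).
KNOWLEDGE_GRAPH: Set[Triple] = {
    ("Socrates", "is_a", "human"),
    ("human", "subclass_of", "mortal"),
}

def stub_llm_propose(question: str) -> List[Triple]:
    """Stand-in for an LLM call: returns candidate triples supporting an answer."""
    return [
        ("Socrates", "is_a", "human"),
        ("human", "subclass_of", "mortal"),
        ("Socrates", "is_a", "god"),  # a hallucinated claim the KG will reject
    ]

def validate(triples: List[Triple]) -> Tuple[List[Triple], List[Triple]]:
    """Split proposed triples into those grounded in the KG and those that are not."""
    grounded = [t for t in triples if t in KNOWLEDGE_GRAPH]
    rejected = [t for t in triples if t not in KNOWLEDGE_GRAPH]
    return grounded, rejected

def answer(question: str) -> str:
    """The LLM proposes, the knowledge graph verifies, and only verified facts are returned."""
    grounded, rejected = validate(stub_llm_propose(question))
    facts = "; ".join(f"{s} {p} {o}" for s, p, o in grounded)
    note = f" (discarded {len(rejected)} unverified claim(s))" if rejected else ""
    return f"Verified support: {facts}{note}"

if __name__ == "__main__":
    print(answer("Is Socrates mortal?"))
```

In a real system the stubbed call would be an actual LLM, the set lookup would be a query against a graph database or a theorem prover, and rejected claims could be sent back to the model for revision rather than simply dropped.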
