Building a multi-agent concierge system using LlamaIndex Workflows

  • Published Dec 12, 2024

COMMENTS • 13

  • @BrettKromkamp 14 days ago

    Good overview of more advanced workflows functionality. Thanks! Nevertheless, it seems that the code you are walking us through is not the same as the code in the linked repository? Could you provide us with (a link to) the updated code? Thanks in advance.

  • @xuexileader 3 months ago +5

    I'm confused: your project doesn't use llama-agents; instead it does `from llama_index.core.agent import FunctionCallingAgentWorker`. What is this?

  • @chirwatra 4 months ago +1

    Can we do this using a local LLM that we host on vLLM? We don't want to send customer data to OpenAI.
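
    [Editor's note] vLLM exposes an OpenAI-compatible API, so in principle any LlamaIndex agent can be pointed at a local server. A minimal sketch using the `OpenAILike` wrapper, assuming the `llama-index-llms-openai-like` package is installed and a vLLM server is running locally (the model name and port are illustrative assumptions, not from the video):

    ```python
    # Sketch: routing LlamaIndex LLM calls to a local vLLM server instead of OpenAI.
    # Assumes vLLM was started with something like:
    #   vllm serve meta-llama/Llama-3-8B-Instruct
    from llama_index.llms.openai_like import OpenAILike

    llm = OpenAILike(
        model="meta-llama/Llama-3-8B-Instruct",  # must match the model vLLM is serving
        api_base="http://localhost:8000/v1",     # vLLM's OpenAI-compatible endpoint
        api_key="not-needed",                    # vLLM accepts any key by default
        is_chat_model=True,                      # use the chat completions endpoint
    )

    # Any agent or workflow that takes an `llm=` argument can then use this object,
    # so customer data never leaves your own infrastructure.
    ```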

  • @ashwanidangwal6446 23 days ago

    I cannot find the code... can you please provide exactly the one shown?

  • @eneskosar4649 2 months ago +1

    It has so many build errors. Please provide a requirements.txt file with pinned versions.
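
    [Editor's note] For readers hitting the same version drift, a pinned requirements.txt might look like the sketch below. The package set and version ranges are assumptions based on the libraries mentioned in the video, not taken from the actual repository; match them to whatever the project was built against:

    ```text
    # Illustrative pins only -- replace with the versions the repo actually used
    llama-index-core==0.11.*
    llama-index-llms-openai==0.2.*
    llama-index-agent-openai==0.3.*
    ```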

  • @shubhammittal7832 3 months ago

    I'm seeing errors in the project. If I type "stop" as the first input, it goes into some kind of error loop and takes some time to come out of it.

  • @GuruprasadGV 3 months ago +2

    I think this is overly complicated. 90% of this has nothing to do with LLMs, and those are all solved problems. I don't think LlamaIndex should try to reinvent application-building mechanics like workflows.

  • @SamiSabirIdrissi 4 months ago

    First

  • @BerndPrager 1 month ago

    If you want to explain LlamaIndex workflows, I believe you would be far better off simplifying the example: fewer agents, fewer tools, fewer events... for a first-time viewer this quickly becomes confusing and convoluted.

  • @RajeshGupta-gx3yz 1 month ago

    I wish the explanation was better! Disappointed!