Agents Tools & Function Calling with Amazon Bedrock (How-to)

  • Published 23 Apr 2024
  • Agents for Amazon Bedrock 👉 docs.aws.amazon.com/bedrock/l...
    Resources:
    🌐 Learn more: aws.amazon.com/bedrock/
    Follow AWS Developers!
    🐦 Twitter: / awsdevelopers
    💼 LinkedIn: / aws-developers
    👾 Twitch: / aws
    📺 Instagram: awsdevelope...
    #generativeai #amazonbedrock #codingtutorial
  • Science & Technology

COMMENTS • 13

  • @brendenriggs9018 • 1 day ago

    I'm a manager for the team responsible for all of my company's GenAI features. I'll definitely be asking everyone on my team to watch this video and any others related to this. Looking forward to seeing more vids like this one.

  • @anatoliyshuba8983 • 10 days ago +2

    Mike speaks very fast, but he uses very simple English, so any non-native speaker can easily understand him. Thanks Mike!

  • @tobyrigby7 • 19 days ago +4

    Literally in love with your videos, Mike! I always learn something new, and in an easy-to-digest way.

  • @codeinrust • 1 day ago

    Excellent plain English explanations, and a real live demonstration, instead of pre-recorded cheating videos. Excellent!

  • @DionisioMichael • 1 day ago +1

    Great job. Very easy to follow!

  • @powerofzero5370 • 11 days ago +1

    I still have some questions around the bigger picture of the Bedrock architecture... I understand agents and their use cases, but I thought the idea when building an app was to 'front-end' the agents with a broader-context FM that would be the actual chatbot interface? In other words, I am working on an application that will have a number (maybe 6 or so) of specialized agents that I thought would be invoked by the chat-interface FM on an as-needed basis. Also, can agents interact with each other in the background? If I can't front-end the agents, I would need a chat interface for every agent I build, which I very much doubt is the way the architecture is designed. Do you have something that shows a complete end-to-end application that encompasses all components?

  • @generatiacloud • 8 days ago +1

    What about a demo using multiple agents, multiple LLMs, LangChain, and LangSmith to do tracing?

  • @saeedesmailii • 19 days ago +1

    There is no link to the GitHub repo in the description.

  • @pythonantole9892 • 19 days ago +1

    This is cool. Can this be modified to use Alexa, so that the input comes in through voice as slots?

    • @mikegchambers • 18 days ago

      That's conceivable. Have an experiment! The Alexa platform already extracts intent from the input, so the input through Alexa becomes deterministic... but that sounds fun to play around with.