What is Model Lifecycle | Everything About Generative AI Explained | Part 4

  • Published 15 Sep 2024
  • In Part 4 we discuss AWS Regions and the model lifecycle! There is a surprise giveaway for suggestions & questions.
    Pages:
    docs.aws.amazo...
    docs.aws.amazo...
    docs.aws.amazo...
    docs.aws.amazo...
    docs.aws.amazo...
    Amazon Bedrock is used to create applications that use artificial intelligence. It is an AWS service used by leading industries and is shaping the future of AI-based app development!
    As a software engineer, I took up the challenge of summarising all 1300 pages of the Amazon Bedrock documentation so that you can watch or listen to these videos while travelling or doing anything else and learn about the technology of the future in no time!
    Learning Amazon Bedrock will give you an edge, as you'll be able to build applications that use artificial intelligence with models provided by Amazon, Meta, and others.
    There is also a surprise giveaway for asking questions/sharing suggestions in the comments section.
    Keep learning & subscribe to show your support!
    Welcome to part 4 of exploring the entire 1300 pages of the Amazon Bedrock documentation. In this video we discuss AWS Regions and dive deep into the model lifecycle. Previously we looked at on-demand and provisioned capacity modes and talked a bit about embeddings. Don't forget to watch the earlier videos, but for now let's move further.
    On this page, access to different models in different AWS Regions is listed. If you're unsure what an AWS Region is, it is simply a geographic area in which multiple data centres are located. Many resources in an AWS account are Region-specific, which means the data centres hosting them are located in that Region.
    Not all models are available in all Regions, so you can consult this page for your specific needs, but for now let's move to the next one.
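    If you prefer to check availability programmatically instead of reading the table, here is a minimal sketch, assuming you have boto3 installed and AWS credentials configured; the Region name is just an example:
    ```python
    # Minimal sketch: list the foundation models available in one Region.
    # The Bedrock control-plane client is Region-specific, so the models it
    # returns are the ones offered in that Region.
    import boto3

    bedrock = boto3.client("bedrock", region_name="us-east-1")  # example Region

    response = bedrock.list_foundation_models()
    for model in response["modelSummaries"]:
        print(model["modelId"], "-", model["providerName"])
    ```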
    Here a list of supported features per model is provided. For example, the Anthropic Claude v2 model supports querying a knowledge base, which means you can add your use-case-specific dataset to augment the model's responses.
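    As an illustration only, a knowledge base query might look roughly like the sketch below; the knowledge base ID and model ARN are hypothetical placeholders, and this assumes a knowledge base has already been created in your account:
    ```python
    # Rough sketch: query a knowledge base (retrieval-augmented generation)
    # through the bedrock-agent-runtime client. knowledgeBaseId and modelArn
    # are hypothetical placeholder values.
    import boto3

    agent_runtime = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

    response = agent_runtime.retrieve_and_generate(
        input={"text": "What does our refund policy say?"},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": "KB1234567890",  # placeholder
                "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-v2",
            },
        },
    )
    print(response["output"]["text"])
    ```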
    Now, if you are using a model, chances are that newer versions with better performance will become available; Amazon Bedrock continuously makes new model versions available.
    When a newer and better version arrives, the older version is marked as legacy, and an end-of-life (EOL) date is set after which you can no longer use that model.
    Another thing: when you configure Provisioned Throughput, you must specify a model version, and that version remains unchanged for the entire term. The commitment term must end before the EOL date. Some models that are already legacy, along with their EOL dates, are highlighted on this page.
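    To check where a model stands in its lifecycle from code, a minimal sketch along these lines should work; the model ID is just an example:
    ```python
    # Minimal sketch: read a model's lifecycle status (ACTIVE or LEGACY)
    # before committing to it for Provisioned Throughput.
    import boto3

    bedrock = boto3.client("bedrock", region_name="us-east-1")

    details = bedrock.get_foundation_model(
        modelIdentifier="anthropic.claude-v2"  # example model ID
    )["modelDetails"]

    print(details["modelId"], details["modelLifecycle"]["status"])
    ```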
    On this page, the IDs of models for on-demand and Provisioned Throughput modes are provided. We discussed these in the previous part, but in short: use on-demand mode when you are fine with multiple users sharing the underlying resources, though you might observe throttling when overall traffic to the model is high. You can get dedicated resources by calling CreateProvisionedModelThroughput with the provisioned model ID and then using the ID returned in the response as the model ID.
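    As a rough sketch of that flow (not the exact code from the documentation), creating Provisioned Throughput and then invoking through it might look like this; the model ID, name, prompt, and commitment term are example values:
    ```python
    # Rough sketch: buy dedicated capacity for a specific model version,
    # then pass the returned ARN as the modelId when invoking.
    # All names and values below are examples only.
    import json
    import boto3

    bedrock = boto3.client("bedrock", region_name="us-east-1")
    runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

    provisioned = bedrock.create_provisioned_model_throughput(
        provisionedModelName="my-claude-capacity",  # example name
        modelId="anthropic.claude-v2:1",            # example provisioned-throughput model ID
        modelUnits=1,
        commitmentDuration="OneMonth",
    )

    # In practice, wait until the Provisioned Throughput is InService before invoking.
    response = runtime.invoke_model(
        modelId=provisioned["provisionedModelArn"],  # dedicated capacity instead of the on-demand ID
        body=json.dumps({
            "prompt": "\n\nHuman: Summarise Amazon Bedrock in one line.\n\nAssistant:",
            "max_tokens_to_sample": 200,
        }),
    )
    print(json.loads(response["body"].read())["completion"])
    ```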
    Here we end this video, in which we briefly discussed the model lifecycle, AWS Regions, and model IDs for provisioned and on-demand modes. In the next videos we discuss in depth how to use different types of models, such as text models and image models, so don't forget to subscribe and to watch the previous videos for the key points you may have missed.
