04-Config and Deploy Phi3:14b model with Ollama in SAP AI Core

  • Published Oct 4, 2024
  • Configure and deploy Ollama as a Custom Inference Server in SAP AI Core to serve Microsoft's Phi3:14b (ollama.com/lib...) through an OpenAI-like chat completion API that is compatible with the SAP Generative AI Hub SDK. This ensures smooth portability to other foundation models in SAP Generative AI Hub with minimal code change (see the client sketch after the links below).
    Sample Code on Github Repo: github.com/SAP...
    Blog Post about Bring Open-Source LLMs into SAP AI Core with Ollama: community.sap....
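A minimal client sketch, not taken from the video's sample code, of what calling such an OpenAI-like chat completion endpoint from Python might look like. The deployment URL, bearer token, resource group header, and endpoint path are placeholders/assumptions; substitute the values from your own SAP AI Core deployment.

```python
# Hypothetical example: calling an OpenAI-like chat completion endpoint exposed by
# an Ollama deployment in SAP AI Core. All URLs, tokens, and headers below are
# placeholders/assumptions, not values from the video or its sample repository.
import requests

DEPLOYMENT_URL = "https://<your-ai-core-inference-host>/v2/inference/deployments/<deployment-id>"  # placeholder
TOKEN = "<bearer-token-from-your-ai-core-service-key>"  # placeholder

response = requests.post(
    f"{DEPLOYMENT_URL}/v1/chat/completions",  # assumed OpenAI-like path
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "AI-Resource-Group": "default",  # assumed resource group name
        "Content-Type": "application/json",
    },
    json={
        "model": "phi3:14b",
        "messages": [{"role": "user", "content": "Summarize what SAP AI Core does."}],
    },
)
# The response shape mirrors the OpenAI chat completion format.
print(response.json()["choices"][0]["message"]["content"])
```

Because the request and response mirror the OpenAI chat completion format, swapping in another foundation model served through SAP Generative AI Hub should mostly reduce to changing the model name and deployment URL.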

COMMENTS • 1

  • @shotbotop3790 · 3 months ago

    Hi sir, I've recently created a RAG chatbot using Ollama and want to deploy it to a cloud platform. Could you briefly outline the process and requirements for doing so? Regards.