Very informative! Thanks for sharing.
I came here to quickly figure out how to run BQML models in Jupyter notebooks, and I walked away learning how to import and deploy those models in Vertex AI, awesome! 😁
Glad I could help! There is a vastly expanded BQML series coming this Fall. The content is already on the GitHub repo under the 03 series folder - check it out for a preview!
@statmike-channel Looks great, looking forward to the videos!
Thank you very much for providing such a detailed tutorial.
Hi Mike,
- Can I deploy my own sklearn models to the Vertex AI Model Registry and then deploy them to an endpoint? Do I just need to provide a "consistent" image? How will the endpoint "discover" how to load the model and make the inference?
- I saw in the documentation that there are specific formats depending on the chosen modelling technique, but I don't know if I can upload my own custom models in different formats (.pkl, H2O, .onnx) :(
Good work!! Thanks a lot for the knowledge!!
Hi Rafael, Thank you for the questions. The videos are not up yet, but there are sklearn examples in the repository linked in the description. Here is a direct link to the sklearn folder: github.com/statmike/vertex-ai-mlops/tree/main/04%20-%20scikit-learn
At the core, Vertex AI Model Registry has links to the model save files (a GCS URI) and a container that has the components needed to serve the model files. There are prebuilt containers for common frameworks. Another option is providing your own custom container that can serve the model. A tricky part of that is building the pieces needed to handle and serve prediction requests, but Vertex AI has a great tool to help with this called Custom Prediction Routines. It makes it easy to build a custom container to serve a model and is introduced here: cloud.google.com/vertex-ai/docs/predictions/custom-prediction-routines
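To make that concrete, here is a rough sketch of registering and deploying a saved sklearn model with the Python SDK and a prebuilt container. The project ID, region, bucket path, machine type, and container tag are placeholders you would swap for your own values:
```python
# Rough sketch, not the exact notebook code: registers a saved sklearn model
# (model.joblib in a GCS folder) with a prebuilt serving container, deploys it
# to an endpoint, and requests an online prediction.
from google.cloud import aiplatform

aiplatform.init(project="your-project-id", location="us-central1")

# Upload the model artifacts to the Vertex AI Model Registry,
# pointing at a prebuilt sklearn serving container image.
model = aiplatform.Model.upload(
    display_name="my-sklearn-model",
    artifact_uri="gs://your-bucket/models/my-sklearn-model",  # folder holding model.joblib
    serving_container_image_uri="us-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-0:latest",
)

# Deploy the registered model to an endpoint for online predictions.
endpoint = model.deploy(machine_type="n1-standard-4")

# The instances format depends on the features your model expects.
prediction = endpoint.predict(instances=[[5.1, 3.5, 1.4, 0.2]])
print(prediction.predictions)
```
For a custom format like a .pkl pickled pipeline or ONNX, the same Model.upload call works with your own serving container (or one built with Custom Prediction Routines) in place of the prebuilt image.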
There is more content coming to the repository related to this and videos will follow. Thank you! Mike