Thanks for the wonderful explanation of k8sgpt capabilities. I tried it with Azure OpenAI as the backend and it worked very well. Looking forward to the future improvements and enhancements, which will make this a handy tool in the hands of SRE and DevOps engineers. Great stuff.
Pretty cool project, keep it up guys!
This is epic, especially for a start.
Nice workflow, easing the pains of SRE through an AI-based solution.
If I am not wrong, OpenAI will be monetized down the line (maybe only basic features would be available in the open-source version). Does that affect API calls to retrieve the most accurate data?
There are a few options, but the number one priority is to give people choice by supporting either additional proprietary backends or their own AI/ML model endpoint (e.g. Kubeflow). This project will stay OSS regardless of any machinery/SaaS that might be built around it.
very nice
Hi, very interesting tool. Is it possible to filter the analysis to a specific namespace?
It is not currently possible, but we could add that if you are interested.
sounds good
I have a few questions:
How much does the OpenAI API cost on average for a 100-pod cluster?
How would it scale for a cluster with 1,000 pods?
I think a more accurate mindset for cost is "how much text do I need processed?" Currently you'll find the free OpenAI backend can adequately give you multiple scans an hour of around 20-40 broken resources. Beyond that, I would imagine you'll start to see 429 (rate limit) responses and need to pay for OpenAI tokens. Going forward we will support multiple AI backends so you can find the most cost-effective one for you. I hope that helps!
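Since free-tier rate limits surface as 429 responses, a scan wrapper can back off and retry instead of failing outright. Here is a minimal shell sketch of that pattern; `flaky_call` is a stub standing in for a real backend request, not an actual k8sgpt or OpenAI command.

```shell
# Retry a command with exponential backoff, the usual way to handle
# rate-limited (HTTP 429 style) backend calls.
retry_with_backoff() {
  delay=1
  attempt=0
  while [ "$attempt" -lt 5 ]; do
    if "$1"; then
      return 0            # call succeeded
    fi
    sleep "$delay"        # back off: 1s, 2s, 4s, ...
    delay=$((delay * 2))
    attempt=$((attempt + 1))
  done
  return 1                # gave up after 5 attempts
}

# Stub: pretend the first two calls are rate-limited, then succeed.
CALLS=0
flaky_call() {
  CALLS=$((CALLS + 1))
  [ "$CALLS" -gt 2 ]
}

retry_with_backoff flaky_call && echo "scan complete"
```

The same idea applies whichever backend you pick: treat a 429 as "slow down", not "broken".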
No package installer for Linux distros at the moment, right?
There is the brew installer, which is cross-platform and works on Linux too, so it seemed like the lowest common denominator. But there is no distro-specific package like a deb or snap.
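For reference, the Homebrew route is the same on Linux as on macOS. The tap and formula names below are taken from the project's install instructions, so double-check them against the current README before running.

```shell
# Install k8sgpt via Homebrew (works on macOS and Linux alike).
brew tap k8sgpt-ai/k8sgpt
brew install k8sgpt
k8sgpt version   # sanity check the install
```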
`k8sgpt version` is throwing the error below:
`Error initialising kubernetes client: no Auth Provider found for name "azure"`
Just wondering, is this channel affiliated with Lockheed?
Not at all!
Would be good to also provide Nix packages…
Nix packages are now supported in the latest release
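Assuming the package has landed in nixpkgs under the attribute name `k8sgpt`, trying it without a permanent install might look like this (flakes syntax; verify the attribute name with `nix search` first).

```shell
# Run k8sgpt ad hoc from nixpkgs (requires flakes enabled);
# the attribute name "k8sgpt" is an assumption, verify with nix search.
nix run nixpkgs#k8sgpt -- version
```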