Well, you should directly code a clone of the Ollama console, adding a Go exec function so the model can exec anything... including agents of any kind. There are also Go agent examples in the Ollama git repo that you can enable, so it's all Go. Just write your model code and exec in Go. Stop the Python over-bloating. Friendly.
ollama list
Error: Head "127.0.0.1:11434/": read tcp 127.0.0.1:50704->127.0.0.1:11434: wsarecv: An existing connection was forcibly closed by the remote host.
Never mind then.
You basically recreated CrewAI with just a few lines of code! This is super cool! Unfortunately, it suffers from the same weaknesses.
Too many mistakes. Will you teach us how to overcome these mistakes?
Human-in-the-loop is the usual approach.