Thread

sufyaan khateeb 6:35 PM
We are trying to configure Archestra with an Ollama LLM, but we are getting an error. Any idea how to resolve this?

4 replies
joey (archestra team) 6:49 PM
hello 👋 what error are you getting?
sufyaan khateeb 7:03 PM
We have Ollama running on localhost. When setting it up, we see that the API key is optional, since Ollama runs without any authentication locally. But after clicking Create, the entry shows up in the list of API keys and chat is still not enabled.
sufyaan khateeb 7:04 PM
This is where Ollama is hosted: http://localhost:11434/
sufyaan khateeb 9:00 PM
This was somewhat resolved after providing the env variable.
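For readers hitting the same issue, a minimal sketch of the kind of fix described above. The thread does not name the env variable, so `OLLAMA_BASE_URL` below is a hypothetical placeholder; check Archestra's own configuration docs for the actual name. The `curl` check uses Ollama's real `/api/tags` endpoint on its default port, 11434.

```shell
# Sanity check: confirm Ollama is reachable at its default local address.
# /api/tags lists installed models; a JSON response means the server is up.
curl -s http://localhost:11434/api/tags || true

# Hypothetical: point the client at the local Ollama instance via an
# environment variable, since local Ollama needs no API key. The actual
# variable name Archestra expects is not given in the thread.
export OLLAMA_BASE_URL="http://localhost:11434"
```

If Ollama is bound to a non-default host or port, the URL above must match it.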