Thread

DAKSH VERMA 4:43 PM
Can we use a model that we have downloaded locally with Ollama? And if yes, can you guide me?

9 replies
Tushar Khatri 4:51 PM
Visit <http://localhost:3000/settings/llm-api-keys> and select Ollama as the provider
❤️1
DAKSH VERMA 4:52 PM
it asks for an optional API key
Tushar Khatri 4:54 PM
i think you won’t need an API key. you'll have to run Ollama locally, and Archestra might be able to connect to it. maybe you'll have to configure something on the Ollama side
DAKSH VERMA 4:55 PM
I will try it
Tushar Khatri 5:05 PM
found it bro
DAKSH VERMA 5:07 PM
thanks bro
🙂1
Tushar Khatri 5:07 PM
just make sure ollama is running on "http://localhost:11434/v1"
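For reference, a quick way to check the setup described in this thread is to run Ollama locally and hit its OpenAI-compatible endpoint directly. This is a sketch assuming a default Ollama install; `llama3` is just an example model name:

```shell
# Start the Ollama server if it isn't already running
# (it listens on http://localhost:11434 by default)
ollama serve &

# Pull a model so something is available locally
# (llama3 is an example; use whichever model you downloaded)
ollama pull llama3

# Verify the OpenAI-compatible /v1 endpoint responds
# by listing the locally available models
curl http://localhost:11434/v1/models
```

If the last command returns a JSON list that includes your model, Archestra (or any OpenAI-compatible client) should be able to connect using `http://localhost:11434/v1` as the base URL, with no API key required.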