Thread

Vedant Thanekar  5:11 PM
Hello! I am trying to chat with my LLM, but it tells me "LLM Provider API key not configured. Please configure it in Chat Settings".
I have already configured a Mistral API key in the settings but it still throws me this error.
I checked the logs and it looks like it's using the Anthropic API by default. Is there a way to fix this?

3 replies
vision39  6:44 PM
It might be that you've added the wrong API key or selected the wrong model.
Vedant Thanekar  3:34 AM
I first added Mistral and it wasn't working, but then I added Gemini and it started working.
akash  6:49 AM
The Mistral API wasn't working for me either; I faced the same issue. Try another provider instead. Gemini worked for me.