This error occurs at startup:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.
"GET /progress HTTP/1.1" 200 -

Instead of an OpenAI API key, I am using a local LLM served through LM Studio; its documentation says that in place of an OpenAI API key you can use 'http://localhost:1234/v1/'.
What is the problem, and how do I fix it?
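For context, a common source of this kind of confusion is that 'http://localhost:1234/v1/' is a base URL, not an API key: the client still expects a (placeholder) key, and requests must go to the OpenAI-compatible endpoints under that base URL. Below is a minimal stdlib-only sketch of what a correct request to LM Studio's local server looks like, assuming its default address and that a model is already loaded. The function names (`build_chat_request`, `ask_local`), the model name `local-model`, and the placeholder key `lm-studio` are illustrative assumptions, not part of LM Studio's or LiteLLM's documented API.

```python
# Sketch of calling LM Studio's OpenAI-compatible local server directly.
# Assumptions: LM Studio is running at http://localhost:1234/v1 with a
# model loaded; names below are illustrative placeholders.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"  # base URL, used INSTEAD of api.openai.com

def build_chat_request(prompt: str, model: str = "local-model"):
    """Build the URL and JSON body for the chat-completions endpoint."""
    url = f"{BASE_URL}/chat/completions"
    body = {
        # LM Studio typically serves whichever model is loaded,
        # so this field is often just a label.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, body

def ask_local(prompt: str) -> str:
    """Send the request; only works while the LM Studio server is running."""
    url, body = build_chat_request(prompt)
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # A placeholder key: the local server does not check it,
            # but OpenAI-style clients require one to be present.
            "Authorization": "Bearer lm-studio",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The same idea applies when going through LiteLLM: pass the local address as the base URL (e.g. an `api_base` setting) and supply any non-empty string as the API key, rather than putting the URL where the key goes.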