Check for existing issues
- I have searched the existing issues and checked that my issue is not a duplicate.
What happened?
Running LiteLLM threw errors demanding `OLLAMA_API_BASE` even when `api_base` was being passed in.
I expected LiteLLM to use the passed-in `api_base`. I have no idea where my clients are going to be hosting their Ollama, so they need to be able to add it to the app dynamically. There is no way to have upfront knowledge of where to set `OLLAMA_API_BASE` to.
Steps to Reproduce
- Don't set `OLLAMA_API_BASE`
- Call LiteLLM with `api_base`
- Get error
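The steps above can be sketched as a minimal reproduction. The model name comes from the log output below; the URL and helper name are assumptions for illustration, and `litellm.completion` is the SDK's standard entry point:

```python
# Hypothetical reproduction sketch; the URL is an assumption and the
# model name is taken from the log output in this report.
def call_ollama(api_base: str, prompt: str):
    import litellm  # the real SDK package this issue is about

    # `api_base` is passed explicitly here, yet the cost-calculation
    # step still demands the OLLAMA_API_BASE environment variable.
    return litellm.completion(
        model="ollama/ibm/granite4:tiny-h",
        api_base=api_base,  # e.g. "http://localhost:11434", set per client
        messages=[{"role": "user", "content": prompt}],
    )
```

With `OLLAMA_API_BASE` unset in the environment, calling this still produces the `OllamaError` shown below instead of using the supplied `api_base`.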
Relevant log output
litellm_logging.py:1493 - response_cost_failure_debug_information: {'error_str': 'OllamaError: Error getting model info for ibm/granite4:tiny-h. Set Ollama API Base via `OLLAMA_API_BASE` environment variable.
What part of LiteLLM is this about?
SDK (litellm Python package)
What LiteLLM version are you on?
v1.81.12-stable
Twitter / LinkedIn details
No response