[Bug]: LiteLLM requires OLLAMA_API_BASE even when passing in api_base #21967

@mgiacomi

Description

Check for existing issues

  • I have searched the existing issues and checked that my issue is not a duplicate.

What happened?

Running LiteLLM threw errors about needing OLLAMA_API_BASE even when api_base was being passed in.

I expected LiteLLM to use the passed-in api_base. I have no idea where my clients are going to be hosting their Ollama, so they need to be able to add it to the app dynamically. There is no way to have upfront knowledge of what to set OLLAMA_API_BASE to.

Steps to Reproduce

  1. Don't set an OLLAMA_API_BASE
  2. Call LiteLLM with api_base
  3. Get error
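The steps above can be sketched as the following minimal repro. The api_base URL is illustrative (any reachable Ollama host), and the model name is taken from the log output below; this assumes the litellm Python package and a running Ollama server.

```python
# Minimal repro sketch. Assumptions: `litellm` is installed, an Ollama
# server is reachable at the URL below (URL is hypothetical), and the
# OLLAMA_API_BASE environment variable is NOT set.
import litellm

response = litellm.completion(
    model="ollama/ibm/granite4:tiny-h",
    messages=[{"role": "user", "content": "hello"}],
    # api_base is passed explicitly per request instead of via env var:
    api_base="http://192.168.1.50:11434",
)
```

Expected: the completion (including cost tracking) uses the explicit api_base. Observed: the cost-calculation path raises the OllamaError shown in the log output below, demanding OLLAMA_API_BASE.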

Relevant log output

litellm_logging.py:1493 - response_cost_failure_debug_information: {'error_str': 'OllamaError: Error getting model info for ibm/granite4:tiny-h. Set Ollama API Base via `OLLAMA_API_BASE` environment variable.

What part of LiteLLM is this about?

SDK (litellm Python package)

What LiteLLM version are you on?

v1.81.12-stable

Twitter / LinkedIn details

No response
