
oobabooga completion path ignores custom_prompt_dict (parameter is never used and not forwarded from main) #21946

@dario-fumarola

Description


What happened

The oobabooga completion integration exposes a custom_prompt_dict parameter, but the current call chain never applies it.

Evidence

  1. litellm/main.py (`elif custom_llm_provider == "oobabooga":` branch)
     • Calls `oobabooga.completion(...)` without passing `custom_prompt_dict`.
  2. litellm/llms/oobabooga/chat/oobabooga.py
     • The function signature includes `custom_prompt_dict={}`.
     • Inside the function, `custom_prompt_dict` is never referenced.
     • `transform_request(...)` is called with model/messages/optional_params/litellm_params/headers; no custom prompt handling exists in this path.
  3. Contrast with other providers
     • litellm/llms/vllm/completion/handler.py explicitly checks and applies `custom_prompt_dict`:
       • `if model in custom_prompt_dict: ... prompt = custom_prompt(...)`
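For reference, the check-and-apply pattern used by the vllm handler can be sketched as below. The `custom_prompt` helper here is a minimal stand-in for litellm's prompt-factory helper, and the registration field names (`roles`, `initial_prompt_value`, `final_prompt_value`, `pre_message`, `post_message`) follow litellm's documented custom-prompt format; the fallback formatting is simplified for illustration.

```python
def custom_prompt(role_dict, initial_prompt_value, final_prompt_value, messages):
    # Minimal stand-in for litellm's custom_prompt helper: wrap each message
    # with its role's pre/post markers, between initial and final values.
    prompt = initial_prompt_value
    for m in messages:
        role_cfg = role_dict.get(m["role"], {})
        prompt += (
            role_cfg.get("pre_message", "")
            + m["content"]
            + role_cfg.get("post_message", "")
        )
    return prompt + final_prompt_value


def build_prompt(model, messages, custom_prompt_dict):
    # This check-and-apply step is what the oobabooga path never performs.
    if model in custom_prompt_dict:
        details = custom_prompt_dict[model]
        return custom_prompt(
            role_dict=details["roles"],
            initial_prompt_value=details["initial_prompt_value"],
            final_prompt_value=details["final_prompt_value"],
            messages=messages,
        )
    # Fallback: default formatting (prompt_factory in litellm proper).
    return "\n".join(m["content"] for m in messages)


# Hypothetical model name and template, for illustration only.
custom_prompt_dict = {
    "my-ooba-model": {
        "roles": {"user": {"pre_message": "[INST] ", "post_message": " [/INST]"}},
        "initial_prompt_value": "<s>",
        "final_prompt_value": "</s>",
    }
}
messages = [{"role": "user", "content": "Hello"}]
print(build_prompt("my-ooba-model", messages, custom_prompt_dict))
# → <s>[INST] Hello [/INST]</s>
```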

Why this matters

Users can reasonably expect registered custom prompt templates to affect provider formatting consistently. For oobabooga, the API surface suggests support, but the parameter is currently a no-op.

Expected behavior

For oobabooga requests, custom_prompt_dict should either:

  • be forwarded and applied in prompt construction, or
  • be removed from the signature/documented as unsupported to avoid misleading behavior.

Suggested fix

In main.py, forward custom_prompt_dict in the oobabooga provider call, and implement usage in the oobabooga request/prompt transformation path (similar to vllm/replicate/sagemaker-style integrations that apply custom_prompt).
