What happened
The oobabooga completion integration exposes a `custom_prompt_dict` parameter, but the current call chain never applies it.
Evidence
- `litellm/main.py` (the `elif custom_llm_provider == "oobabooga":` branch)
  - Calls `oobabooga.completion(...)` without passing `custom_prompt_dict`.
- `litellm/llms/oobabooga/chat/oobabooga.py`
  - The function signature includes `custom_prompt_dict={}`.
  - Inside the function, `custom_prompt_dict` is never referenced.
  - `transform_request(...)` is called with `model`/`messages`/`optional_params`/`litellm_params`/`headers`, but no custom prompt handling exists in this path.
- Contrast with other providers
  - `litellm/llms/vllm/completion/handler.py` explicitly checks and applies `custom_prompt_dict`: `if model in custom_prompt_dict: ... prompt = custom_prompt(...)`
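For reference, the vllm-style check reduces to the pattern below. This is a standalone sketch, not litellm's actual code: the real rendering is done by the `custom_prompt(...)` helper in litellm's prompt-template factory, and the template keys shown here (`roles`, `pre_message`/`post_message`, `initial_prompt_value`, `final_prompt_value`) are assumed from litellm's documented template shape.

```python
def render_with_custom_prompt(model, messages, custom_prompt_dict):
    """Sketch: apply a registered template if one exists for `model`,
    otherwise fall back to a naive join (stand-in for the default path)."""
    if model not in custom_prompt_dict:
        return "\n".join(m["content"] for m in messages)
    tpl = custom_prompt_dict[model]
    parts = [tpl.get("initial_prompt_value", "")]
    for m in messages:
        # Wrap each message in its role's pre/post markers.
        fmt = tpl.get("roles", {}).get(m["role"], {})
        parts.append(fmt.get("pre_message", "") + m["content"] + fmt.get("post_message", ""))
    parts.append(tpl.get("final_prompt_value", ""))
    return "".join(parts)


templates = {
    "llama-2": {
        "roles": {"user": {"pre_message": "[INST] ", "post_message": " [/INST]"}},
        "initial_prompt_value": "<s>",
        "final_prompt_value": "",
    }
}
print(render_with_custom_prompt("llama-2", [{"role": "user", "content": "hi"}], templates))
# → <s>[INST] hi [/INST]
```

This is the behavior vllm's handler provides and the oobabooga path currently skips entirely.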
Why this matters
Users can reasonably expect registered custom prompt templates to affect provider formatting consistently. For oobabooga, the API surface suggests support, but behavior is currently a no-op.
Expected behavior
For oobabooga requests, `custom_prompt_dict` should either:
- be forwarded and applied in prompt construction, or
- be removed from the signature/documented as unsupported to avoid misleading behavior.
Suggested fix
In `main.py`, forward `custom_prompt_dict` in the oobabooga provider call, and apply it in the oobabooga request/prompt transformation path (similar to the vllm/replicate/sagemaker-style integrations that apply `custom_prompt`).
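A sketch of where the transform-side change could slot in. The function name mirrors the existing `transform_request(...)` in `litellm/llms/oobabooga/chat/oobabooga.py`, but the signature and template handling below are illustrative assumptions, not a verified patch against litellm:

```python
def transform_request(model, messages, optional_params, custom_prompt_dict=None):
    """Sketch: build the request payload, honoring a registered template."""
    custom_prompt_dict = custom_prompt_dict or {}
    data = dict(optional_params)
    if model in custom_prompt_dict:
        # Render messages into a single prompt string via the template,
        # instead of forwarding raw chat messages.
        tpl = custom_prompt_dict[model]
        rendered = tpl.get("initial_prompt_value", "")
        for m in messages:
            fmt = tpl.get("roles", {}).get(m["role"], {})
            rendered += fmt.get("pre_message", "") + m["content"] + fmt.get("post_message", "")
        rendered += tpl.get("final_prompt_value", "")
        data["prompt"] = rendered
    else:
        # No template registered: keep the current messages-based behavior.
        data["messages"] = messages
    return data
```

The `main.py` side of the fix is then just passing `custom_prompt_dict=custom_prompt_dict` through in the `oobabooga.completion(...)` call, matching what the other providers already do.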