
[bug] _llm_input_messages fails to parse ChatMessage objects in smolagents >= 1.x, resulting in empty LLM input spans #2962

@zhihuat

Description


Describe the bug
Since smolagents changed its internal message format from plain dicts to ChatMessage objects, _llm_input_messages in _wrappers.py silently drops all messages, and LLM input spans appear empty in the trace.

Root Cause
In _wrappers.py, _llm_input_messages checks:

for i, message in enumerate(messages):
    if not isinstance(message, dict):
        continue   # ← ChatMessage objects are skipped here

smolagents' generate() now passes a list[ChatMessage] instead of a list[dict]. Because ChatMessage is not a dict, every message is skipped.


Environment
openinference-instrumentation-smolagents: 0.1.25
smolagents: 1.24.0

Expected Behavior
llm.input_messages.* span attributes are populated with the full conversation history.

Actual Behavior
llm.input_messages.* attributes are empty. input.value falls back to str(messages), producing Python object repr strings like:

ChatMessage(role=<MessageRole.USER: 'user'>, content=[{'type': 'text', 'text': '...'}])

instead of JSON.
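For the input.value fallback, one way to get JSON instead of a Python repr is to serialize with a default handler that converts dataclass instances. This is a minimal sketch under the assumption that ChatMessage is a dataclass; the ChatMessage definition and safe_json helper below are illustrative, not the instrumentation's actual code.

```python
import json
from dataclasses import dataclass, asdict, is_dataclass
from typing import Any

# Illustrative stand-in for smolagents' ChatMessage.
@dataclass
class ChatMessage:
    role: str
    content: Any

def safe_json(messages) -> str:
    """Serialize messages to JSON, converting dataclasses along the way."""
    return json.dumps(
        messages,
        default=lambda o: asdict(o) if is_dataclass(o) else str(o),
    )

msgs = [ChatMessage("user", [{"type": "text", "text": "..."}])]
print(safe_json(msgs))
# → [{"role": "user", "content": [{"type": "text", "text": "..."}]}]
```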
