OpenAI model listing includes incompatible models #5500

@cbruyndoncx

Description

The OpenAI provider lists all available models, but goose only supports v1/chat/completions, so models that require v1/responses fail at request time.

To Reproduce:

  1. Verify the exact model name from the model listing using goose configure
  2. Start a session:
    $ GOOSE_MODEL=gpt-5-codex goose session
    starting session | provider: openai model: gpt-5-codex
    ...

Goose is running! Enter your instructions, or try asking what goose can do.

Context: ○○○○○○○○○○ 0% (0/400000 tokens)
( O)>
◒ Harmonizing heuristics... Error: Request failed: This model is only supported in v1/responses and not in v1/chat/completions. (type: invalid_request_error) (status 404)
Interrupted before the model replied and removed the last message.

Expected behavior
Only supported models should appear in the listing, so the listing needs to be aware of which API each model requires.
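The OpenAI /v1/models endpoint does not report which API surface a model supports, so one hypothetical way to implement this is a deny-list filter applied to the listing. This is only a sketch, not goose's actual code; the prefix list below is an assumption, not an official OpenAI list.

```python
# Hypothetical sketch: filter the /v1/models output against a deny-list
# of models believed to require v1/responses. The prefixes here are
# assumptions for illustration, not an authoritative list.
RESPONSES_ONLY_PREFIXES = ("gpt-5-codex", "o1-pro", "computer-use-preview")

def supports_chat_completions(model_id: str) -> bool:
    """Return False for models assumed to accept only v1/responses."""
    return not model_id.startswith(RESPONSES_ONLY_PREFIXES)

# Example listing as it might come back from /v1/models:
models = ["gpt-4o", "gpt-5-codex", "gpt-4o-mini"]
chat_models = [m for m in models if supports_chat_completions(m)]
print(chat_models)  # → ['gpt-4o', 'gpt-4o-mini']
```

A deny-list is brittle (it goes stale as new models ship), so a more robust variant would probe each model with a minimal chat-completion request and hide models that return the invalid_request_error shown above.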


Please provide the following information:

  • OS & Arch: [e.g. Ubuntu 22.04 x86]
  • Interface: [CLI]
  • Version: [e.g. v1.9.3]
  • Extensions enabled: [e.g. Computer Controller, Figma]
  • Provider & Model: [OpenAI gpt-5-codex]

Additional context
I want to use goose more for coding, but at this point it is not possible this way, so I have to revert to codex cli. I think goose developer is generally faster at pinpointing specific files, so I wanted to see how token usage compares between goose cli and codex cli.

Metadata
Labels

p2 (Priority 2 - Medium)
