feat: add MiniMax provider support with M2.7 default #1707

Open

octo-patch wants to merge 3 commits into openinterpreter:main from octo-patch:feature/add-minimax-provider
Conversation

@octo-patch octo-patch commented Mar 13, 2026

Summary

Add MiniMax as a provider for Open Interpreter, with the latest M2.7 model as the default.

Changes

  • Add MiniMax profile (minimax.py) with MiniMax-M2.7 as default model
  • Support MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, and MiniMax-M2.5-highspeed models
  • Configure OpenAI-compatible API endpoint (https://api.minimax.io/v1)
  • Add MiniMax usage documentation to README
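
The profile file itself is not shown in this conversation; as a rough sketch, a `minimax.py` profile along the lines described above might look like the following. It assumes Open Interpreter's convention of configuring the shared `interpreter` object at import time; the exact attribute names (`llm.model`, `llm.api_base`, etc.) are taken from that convention, not from this PR's diff.

```python
# minimax.py — hypothetical sketch of the profile, not the PR's actual file.
import os

from interpreter import interpreter

# MiniMax exposes an OpenAI-compatible API, so the generic OpenAI-style
# settings suffice: point api_base at MiniMax's endpoint and supply the key.
interpreter.llm.model = "openai/MiniMax-M2.7"  # default; the M2.5 and
                                               # -highspeed variants are
                                               # drop-in alternatives
interpreter.llm.api_base = "https://api.minimax.io/v1"
interpreter.llm.api_key = os.environ.get("MINIMAX_API_KEY")
interpreter.llm.context_window = 204_800
```

Run with `interpreter --profile minimax.py` once `MINIMAX_API_KEY` is exported.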

Usage

export MINIMAX_API_KEY=your_api_key
interpreter --profile minimax.py

Why

MiniMax-M2.7 is the latest flagship model, with enhanced reasoning and coding capabilities and a 204,800-token context window.
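
As a worked illustration of what that window means in practice, the rough check below estimates whether a prompt fits. The 4-characters-per-token ratio is a common heuristic, not MiniMax's actual tokenizer, and the output reservation of 4,096 tokens is an assumed figure; real usage should count tokens via the provider's API.

```python
# Rough feasibility check against MiniMax-M2.7's advertised context window.
CONTEXT_WINDOW = 204_800  # tokens, per the PR description

def fits_in_context(prompt: str, reserved_for_output: int = 4_096) -> bool:
    """Estimate whether `prompt` plus an output budget fits the window.

    Uses the crude ~4 characters per token heuristic; this is an
    approximation, not an exact tokenizer.
    """
    estimated_tokens = len(prompt) // 4
    return estimated_tokens + reserved_for_output <= CONTEXT_WINDOW

print(fits_in_context("Refactor this module."))   # small prompt: True
print(fits_in_context("x" * 1_000_000))           # ~250k tokens: False
```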

Testing

  • Profile syntax validated
  • All existing models retained as alternatives

Contributor

endolith commented Mar 13, 2026

By the way, I have a fork with support for OpenRouter and reasoning tokens.

@octo-patch
Author

Thanks for sharing, @endolith! OpenRouter and reasoning token support sounds great. If this PR gets traction, it would be nice to see those capabilities integrated too.

- Add MiniMax-M2.7 and MiniMax-M2.7-highspeed to model list
- Set MiniMax-M2.7 as default model
- Keep all previous models as alternatives
- Update README documentation
@octo-patch octo-patch changed the title feat: add MiniMax provider support feat: add MiniMax provider support with M2.7 default Mar 18, 2026