feat: add MiniMax as LLM provider#838

Open
octo-patch wants to merge 2 commits into harry0703:main from octo-patch:add-minimax-llm-provider

Conversation


@octo-patch octo-patch commented Mar 13, 2026

Summary

Add MiniMax as an LLM provider for MoneyPrinterTurbo, using the OpenAI-compatible API.

Changes

  • Add MiniMax provider routing in app/services/llm.py
  • Add MiniMax configuration (API key, base URL, model name) in config.example.toml
  • Default model: MiniMax-M2.7 (also available: MiniMax-M2.7-highspeed)
  • Update README (CN & EN) to list MiniMax as a supported provider

Configuration

llm_provider = "minimax"
minimax_api_key = "your-api-key"
minimax_base_url = "https://api.minimax.io/v1"
minimax_model_name = "MiniMax-M2.7"
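Since MiniMax exposes an OpenAI-compatible API, the provider branch in app/services/llm.py can reuse the OpenAI SDK with a custom base URL, as the other OpenAI-compatible providers do. A minimal sketch of what that routing might look like (the helper name `resolve_minimax_settings` is illustrative, not from the PR; the fallback defaults mirror the configuration above):

```python
def resolve_minimax_settings(config: dict) -> dict:
    """Pick the MiniMax connection settings out of the app config,
    falling back to the defaults shown in config.example.toml."""
    return {
        "api_key": config["minimax_api_key"],
        "base_url": config.get("minimax_base_url", "https://api.minimax.io/v1"),
        "model": config.get("minimax_model_name", "MiniMax-M2.7"),
    }


# Because the API is OpenAI-compatible, the call itself can go through
# the OpenAI SDK (sketch; assumes the `openai` package is installed):
#
#   from openai import OpenAI
#   s = resolve_minimax_settings(config)
#   client = OpenAI(api_key=s["api_key"], base_url=s["base_url"])
#   resp = client.chat.completions.create(
#       model=s["model"],
#       messages=[{"role": "user", "content": prompt}],
#   )
#   content = resp.choices[0].message.content
```

Switching to the high-speed variant is then just a config change: set `minimax_model_name = "MiniMax-M2.7-highspeed"` with no code changes needed.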

API Documentation

octo-patch and others added 2 commits March 13, 2026 20:36
Add MiniMax (https://platform.minimaxi.com) as a new LLM provider option.
MiniMax API is OpenAI-compatible and supports models like MiniMax-M1.

Changes:
- Add minimax provider branch in app/services/llm.py
- Add minimax configuration in config.example.toml
- Update both README.md and README-en.md to list MiniMax
- Update default model from MiniMax-M1 to MiniMax-M2.7
- Add MiniMax-M2.7-highspeed as available model option
- Update platform URL to platform.minimax.io
