
feat: add MiniMax as alternative LLM provider#45

Open
octo-patch wants to merge 1 commit into CopilotKit:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

  • Add multi-provider LLM factory (llm_provider.py) with auto-detection from API keys
  • MiniMax M2.7 / M2.7-highspeed are supported via an OpenAI-compatible API, with temperature clamping
  • New env vars: LLM_PROVIDER, LLM_BASE_URL, LLM_TEMPERATURE, MINIMAX_API_KEY
  • Updated .env.example and README model table with MiniMax docs

Changes

| File | Change |
| --- | --- |
| apps/agent/src/llm_provider.py | New provider factory with presets, auto-detection, temperature clamping |
| apps/agent/main.py | Replace hardcoded `ChatOpenAI()` with `create_llm()` factory |
| .env.example | Add MiniMax API key and provider config docs |
| README.md | Add MiniMax M2.7 to model table + usage instructions |
| apps/agent/tests/ | 28 unit tests + 3 integration tests |

How it works

Set MINIMAX_API_KEY in your .env and the provider is auto-detected. It defaults to MiniMax-M2.7 (1M-token context window).

Existing OpenAI usage is fully backward-compatible.

Test plan

  • 28 unit tests covering provider detection, presets, factory, temperature clamping, edge cases
  • 3 integration tests against real MiniMax API (basic completion, streaming, multi-turn)
  • Manual verification with make dev using MiniMax M2.7 for generative UI
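One of the clamping unit tests mentioned above might look like the following. The helper shown is a stand-in, since `llm_provider.py` is not reproduced in this PR description, and the clamp bounds are assumed:

```python
# Stand-in for the clamping helper the PR's tests exercise; the real
# function lives in apps/agent/src/llm_provider.py and its bounds may differ.
def clamp_temperature(value: float, lo: float = 0.01, hi: float = 1.0) -> float:
    """Clamp a requested temperature into the provider-supported range."""
    return min(max(value, lo), hi)

def test_temperature_clamping():
    assert clamp_temperature(2.0) == 1.0    # above range: clamped down
    assert clamp_temperature(-1.0) == 0.01  # below range: clamped up
    assert clamp_temperature(0.5) == 0.5    # in range: unchanged

test_temperature_clamping()
```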

Add multi-provider LLM factory with auto-detection from API keys.
MiniMax M2.7/M2.7-highspeed models supported via OpenAI-compatible API
with temperature clamping and configurable base URL.

- New llm_provider.py module with create_llm() factory
- Provider auto-detection: MINIMAX_API_KEY → minimax, else openai
- LLM_PROVIDER, LLM_BASE_URL, LLM_TEMPERATURE env vars
- 28 unit tests + 3 integration tests
- Updated .env.example and README with MiniMax docs
