feat: add MiniMax as first-class LLM provider #996

Open
octo-patch wants to merge 1 commit into dataease:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Add MiniMax as a built-in LLM supplier in SQLBot, alongside existing providers like OpenAI, DeepSeek, Gemini, and Kimi.

MiniMax provides an OpenAI-compatible API at https://api.minimax.io/v1, making it a seamless addition to SQLBot's existing LangChain-based architecture.

Changes

  • Supplier registry (frontend/src/entity/supplier.ts): Add MiniMax entry (id=13) with:
    • API endpoint: https://api.minimax.io/v1
    • Models: MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed
    • Temperature range: [0, 1] with default 0.7
  • Icon: Add 96×96 PNG icon for MiniMax brand display
  • i18n: Add "MiniMax" translations in all 3 locales (en, zh-CN, ko-KR)
  • README: Add supported LLM providers table to both Chinese and English README files
  • Tests: Add 24 unit tests + 3 integration tests validating the configuration
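The registry addition from the list above can be sketched as follows. This is illustrative only: the field names (`apiBase`, `temperature`, etc.) are assumptions, and the real interface is whatever frontend/src/entity/supplier.ts already defines.

```typescript
// Illustrative sketch of the new registry entry; field names are assumed,
// the actual schema lives in frontend/src/entity/supplier.ts.
interface Supplier {
  id: number
  name: string
  apiBase: string
  models: string[]
  temperature: { min: number; max: number; default: number }
}

const minimax: Supplier = {
  id: 13, // next free supplier id per the PR description
  name: 'MiniMax',
  apiBase: 'https://api.minimax.io/v1',
  models: ['MiniMax-M2.7', 'MiniMax-M2.5', 'MiniMax-M2.5-highspeed'],
  temperature: { min: 0, max: 1, default: 0.7 },
}
```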

How It Works

MiniMax speaks the OpenAI-compatible protocol, so it works through the existing OpenAILLM / BaseChatOpenAI path in model_factory.py. No backend changes are needed: users simply configure MiniMax via the model settings UI.
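Because the protocol is OpenAI-compatible, a chat call to MiniMax is just the standard chat-completions request shape pointed at the MiniMax base URL. A minimal sketch follows; the buildChatRequest helper and the Bearer placeholder are illustrative, not code from this PR:

```typescript
// Sketch of the standard OpenAI-style chat-completions request that an
// OpenAI-compatible endpoint like MiniMax accepts. buildChatRequest is an
// illustrative helper, not SQLBot code.
type ChatMessage = { role: 'system' | 'user' | 'assistant'; content: string }

function buildChatRequest(
  apiBase: string,
  model: string,
  messages: ChatMessage[],
  temperature = 0.7,
) {
  return {
    url: `${apiBase}/chat/completions`,
    headers: {
      'Content-Type': 'application/json',
      // Placeholder: the real key is supplied via the model settings UI.
      Authorization: 'Bearer <API_KEY>',
    },
    body: JSON.stringify({ model, messages, temperature }),
  }
}

const req = buildChatRequest('https://api.minimax.io/v1', 'MiniMax-M2.5', [
  { role: 'user', content: 'List the top 5 customers by revenue.' },
])
```

Swapping only the base URL and model name is exactly why no backend changes are required.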

Test Plan

  • 24 unit tests pass (supplier config, i18n, icon, model factory, README)
  • 3 integration tests pass (API connectivity, chat completions, temperature=0)
  • Manual verification: add MiniMax model in UI, configure API key, run a query
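As an example of the kind of configuration check the unit tests cover, a test along these lines could validate the advertised temperature range. The names here are hypothetical sketches, not the tests shipped with the PR:

```typescript
// Illustrative configuration check: the default temperature must fall
// inside the advertised [0, 1] range. Names are hypothetical.
type TemperatureConfig = { min: number; max: number; default: number }

function temperatureConfigValid(t: TemperatureConfig): boolean {
  return t.min === 0 && t.max === 1 && t.min <= t.default && t.default <= t.max
}

const minimaxTemperature: TemperatureConfig = { min: 0, max: 1, default: 0.7 }
```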

Screenshots

The MiniMax provider appears in the "Select supplier" list alongside existing providers, with its brand icon and pre-configured models.

Commit message

Add MiniMax AI as a built-in LLM supplier alongside existing providers
(OpenAI, DeepSeek, Gemini, Kimi, etc.) via OpenAI-compatible API.

Changes:
- Add MiniMax supplier entry (id=13) in supplier.ts with M2.7, M2.5,
  M2.5-highspeed models and temperature range [0, 1]
- Add MiniMax icon (96x96 PNG) to frontend assets
- Add i18n translations for MiniMax in en, zh-CN, ko-KR locales
- Add supported LLM providers table to both README files
- Add 24 unit tests + 3 integration tests