To meet developers' needs within the Anthropic API ecosystem, our API now supports the Anthropic API format. With simple configuration, you can integrate MiniMax capabilities into tools and workflows built on the Anthropic API.

Quick Start

1. Install Anthropic SDK

pip install anthropic

2. Call API

Python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.minimax.io/anthropic",
    api_key="your_api_key"
)

message = client.messages.create(
    model="MiniMax-M2",
    max_tokens=1000,
    system="You are a helpful assistant.",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Hi, how are you?"
                }
            ]
        }
    ]
)

for block in message.content:
    if block.type == "thinking":
        print(f"Thinking:\n{block.thinking}\n")
    elif block.type == "text":
        print(f"Text:\n{block.text}\n")

3. Important Note

In multi-turn function call conversations, the complete model response (i.e., the assistant message) must be appended to the conversation history to maintain the continuity of the reasoning chain.
  • Append the full response.content list to the message history (it includes all content blocks: thinking/text/tool_use)
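The note above can be sketched as follows. The block contents and tool name (get_weather) are invented for illustration; with the SDK you would append the actual response.content list rather than hand-built dicts:

```python
# Illustrative sketch: maintaining the reasoning chain across a tool-call turn.
# The block dicts below are hypothetical stand-ins for the SDK's content block
# objects; with the real SDK, append `response.content` directly.

history = [
    {"role": "user", "content": [{"type": "text", "text": "What's the weather in Tokyo?"}]}
]

# Suppose the model replied with thinking + a tool call (shapes follow the
# Anthropic Messages format; the values here are made up).
assistant_content = [
    {"type": "thinking", "thinking": "The user wants weather; call the tool."},
    {"type": "tool_use", "id": "toolu_01", "name": "get_weather",
     "input": {"city": "Tokyo"}},
]

# Append the FULL content list -- including the thinking block -- not just
# the tool_use part, so the reasoning chain stays intact.
history.append({"role": "assistant", "content": assistant_content})

# Then return the tool result in a user turn, referencing the tool_use id.
history.append({
    "role": "user",
    "content": [{"type": "tool_result", "tool_use_id": "toolu_01",
                 "content": "Sunny, 24 C"}],
})
```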

Supported Models

When using the Anthropic SDK, currently only the MiniMax-M2 and MiniMax-M2-Stable models are supported:

Model Name          Description
MiniMax-M2          Agentic capabilities, advanced reasoning
MiniMax-M2-Stable   High concurrency and commercial use

The Anthropic API compatibility interface currently supports only the MiniMax-M2 and MiniMax-M2-Stable models. For other models, please use the standard MiniMax API interface.

Compatibility

Supported Parameters

When using the Anthropic SDK, we support the following input parameters:

Parameter            Support Status     Description
model                Fully supported    Supports the MiniMax-M2 and MiniMax-M2-Stable models
messages             Partial support    Supports text and tool calls; no image/document input
max_tokens           Fully supported    Maximum number of tokens to generate
stream               Fully supported    Streaming response
system               Fully supported    System prompt
temperature          Fully supported    Range (0.0, 1.0], controls output randomness; recommended value: 1
tool_choice          Fully supported    Tool selection strategy
tools                Fully supported    Tool definitions
top_p                Fully supported    Nucleus sampling parameter
metadata             Fully supported    Metadata
thinking             Fully supported    Reasoning content
top_k                Ignored            This parameter will be ignored
stop_sequences       Ignored            This parameter will be ignored
service_tier         Ignored            This parameter will be ignored
mcp_servers          Ignored            This parameter will be ignored
context_management   Ignored            This parameter will be ignored
container            Ignored            This parameter will be ignored
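As a quick illustration of the supported parameters, a request might be assembled like this. This is a minimal sketch: the tool definition (get_weather) is invented, and in practice these kwargs would be passed to client.messages.create(**request):

```python
# Hypothetical request using only parameters the compatibility layer supports.
# In practice, pass these kwargs to client.messages.create(**request).
request = {
    "model": "MiniMax-M2",
    "max_tokens": 1000,
    "system": "You are a helpful assistant.",
    "temperature": 1.0,   # must be in (0.0, 1.0]; 1 is the recommended value
    "top_p": 0.95,        # nucleus sampling
    "tools": [            # invented example tool definition
        {
            "name": "get_weather",
            "description": "Look up current weather for a city.",
            "input_schema": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
    "tool_choice": {"type": "auto"},
    "messages": [
        {"role": "user", "content": [{"type": "text", "text": "Weather in Tokyo?"}]}
    ],
}

# Parameters such as top_k or stop_sequences could be included without error,
# but the compatibility layer would silently ignore them.
```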

Messages Field Support

Field Type           Support Status     Description
type="text"          Fully supported    Text messages
type="tool_use"      Fully supported    Tool calls
type="tool_result"   Fully supported    Tool call results
type="thinking"      Fully supported    Reasoning content
type="image"         Not supported      Image input not supported yet
type="document"      Not supported      Document input not supported yet

Examples

Streaming Response

Python
import anthropic

client = anthropic.Anthropic(
    base_url="https://api.minimax.io/anthropic", api_key="your_api_key"
)

print("Starting stream response...\n")
print("=" * 60)
print("Thinking Process:")
print("=" * 60)

stream = client.messages.create(
    model="MiniMax-M2",
    max_tokens=1000,
    system="You are a helpful assistant.",
    messages=[
        {"role": "user", "content": [{"type": "text", "text": "Hi, how are you?"}]}
    ],
    stream=True,
)

reasoning_buffer = ""
text_buffer = ""

for chunk in stream:
    if chunk.type == "content_block_start":
        if hasattr(chunk, "content_block") and chunk.content_block:
            if chunk.content_block.type == "text":
                print("\n" + "=" * 60)
                print("Response Content:")
                print("=" * 60)

    elif chunk.type == "content_block_delta":
        if hasattr(chunk, "delta") and chunk.delta:
            if chunk.delta.type == "thinking_delta":
                # Stream output thinking process
                new_thinking = chunk.delta.thinking
                if new_thinking:
                    print(new_thinking, end="", flush=True)
                    reasoning_buffer += new_thinking
            elif chunk.delta.type == "text_delta":
                # Stream output text content
                new_text = chunk.delta.text
                if new_text:
                    print(new_text, end="", flush=True)
                    text_buffer += new_text

print("\n")

Tool Use & Interleaved Thinking

To learn how to use M2 Tool Use and Interleaved Thinking capabilities with the Anthropic SDK, please refer to the following documentation.
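As a rough sketch of what a tool-use round trip looks like in this format (the tool name, its result, and the response blocks are invented for illustration; a real loop would obtain the assistant turn from client.messages.create rather than hard-coding it):

```python
# Hypothetical tool-use round trip in the Anthropic messages format.
# Block dicts stand in for SDK objects; a real loop would call
# client.messages.create(...) where noted.

def run_tool(name, tool_input):
    # Invented local tool executor.
    if name == "get_time":
        return "12:00 UTC"
    raise ValueError(f"unknown tool: {name}")

messages = [{"role": "user", "content": [{"type": "text", "text": "What time is it?"}]}]

# 1) First model turn (would come from client.messages.create): the model
#    thinks, then requests a tool. With interleaved thinking, thinking blocks
#    can appear before and between tool_use blocks.
assistant_turn = [
    {"type": "thinking", "thinking": "Need the current time; use the tool."},
    {"type": "tool_use", "id": "toolu_42", "name": "get_time", "input": {}},
]
messages.append({"role": "assistant", "content": assistant_turn})

# 2) Execute every tool_use block and send the results back in a user turn.
results = [
    {"type": "tool_result", "tool_use_id": b["id"],
     "content": run_tool(b["name"], b["input"])}
    for b in assistant_turn if b["type"] == "tool_use"
]
messages.append({"role": "user", "content": results})

# 3) A second client.messages.create(messages=messages) call would now
#    produce the final thinking + text answer.
```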

Important Notes

  1. The Anthropic API compatibility interface currently supports only the MiniMax-M2 and MiniMax-M2-Stable models
  2. The temperature parameter range is (0.0, 1.0], values outside this range will return an error
  3. Some Anthropic parameters (such as top_k, stop_sequences, service_tier, mcp_servers, context_management, container) will be ignored
  4. Image and document type inputs are not currently supported