To meet developers' needs in the Anthropic ecosystem, the MiniMax API now supports the Anthropic API format. With a simple configuration change, you can bring MiniMax capabilities into any tool or codebase built on the Anthropic SDK.

Quick Start

1. Install Anthropic SDK

pip install anthropic

2. Configure Environment Variables

export ANTHROPIC_BASE_URL=https://api.minimax.io/anthropic
export ANTHROPIC_API_KEY=${YOUR_API_KEY}
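
If you prefer not to set environment variables, the Anthropic Python SDK also accepts the same settings directly in the client constructor. A minimal sketch (the API key string is a placeholder for your MiniMax API Key):

import anthropic

# Equivalent to exporting ANTHROPIC_BASE_URL / ANTHROPIC_API_KEY in the shell.
client = anthropic.Anthropic(
    base_url="https://api.minimax.io/anthropic",
    api_key="YOUR_API_KEY",  # placeholder: your MiniMax API Key
)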

3. Call the API

import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_BASE_URL and ANTHROPIC_API_KEY from the environment

message = client.messages.create(
    model="MiniMax-M2",
    max_tokens=1000,
    system="You are a helpful assistant.",
    messages=[
        {
            "role": "user",
            "content": [
                {
                    "type": "text",
                    "text": "Hi, how are you?"
                }
            ]
        }
    ]
)
print(message.content)

Supported Models

When using the Anthropic SDK, currently only the MiniMax-M2 model is supported:
Model Name | Description
MiniMax-M2 | MiniMax-M2, launching on October 27th, stay tuned
The Anthropic API compatibility interface currently only supports the MiniMax-M2 model. For other models, please use the standard MiniMax API interface.

Compatibility

Supported Parameters

When using the Anthropic SDK, we support the following input parameters:
Parameter | Support Status | Description
model | Fully supported | Only supports the MiniMax-M2 model
messages | Partially supported | Supports text and tool calls; image/document input is not supported
max_tokens | Fully supported | Maximum number of tokens to generate
stream | Fully supported | Streaming response
system | Fully supported | System prompt
temperature | Fully supported | Range (0.0, 1.0], controls output randomness; recommended value: 1
tool_choice | Fully supported | Tool selection strategy
tools | Fully supported | Tool definitions
top_p | Fully supported | Nucleus sampling parameter
metadata | Fully supported | Metadata
thinking | Fully supported | Reasoning content
top_k | Ignored | This parameter will be ignored
stop_sequences | Ignored | This parameter will be ignored
service_tier | Ignored | This parameter will be ignored
mcp_servers | Ignored | This parameter will be ignored
context_management | Ignored | This parameter will be ignored
container | Ignored | This parameter will be ignored
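
As a quick illustration of the sampling parameters above, the call below sets temperature and top_p explicitly. This is a minimal sketch; the specific values are only examples, with temperature kept inside the (0.0, 1.0] range:

import anthropic

client = anthropic.Anthropic()

# temperature must lie in (0.0, 1.0]; 1 is the recommended value.
message = client.messages.create(
    model="MiniMax-M2",
    max_tokens=512,
    temperature=1.0,
    top_p=0.95,
    system="You are a helpful assistant.",
    messages=[{"role": "user", "content": "Give me one fun fact about the ocean."}],
)
print(message.content[0].text)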

Messages Field Support

Field Type | Support Status | Description
type="text" | Fully supported | Text messages
type="tool_use" | Fully supported | Tool calls
type="tool_result" | Fully supported | Tool call results
type="thinking" | Fully supported | Reasoning content
type="image" | Not supported | Image input not supported yet
type="document" | Not supported | Document input not supported yet

Examples

Basic Conversation

import anthropic

client = anthropic.Anthropic()

message = client.messages.create(
    model="MiniMax-M2",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Explain machine learning in simple terms"
        }
    ]
)

print(message.content[0].text)

Streaming Response

import anthropic

client = anthropic.Anthropic()

with client.messages.stream(
    model="MiniMax-M2",
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": "Write a poem about spring"
        }
    ]
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
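
If you also need the fully assembled response object once streaming finishes, the Anthropic SDK's stream helper can return it. A minimal sketch extending the example above:

import anthropic

client = anthropic.Anthropic()

with client.messages.stream(
    model="MiniMax-M2",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a poem about spring"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
    # Once the stream is exhausted, the complete Message is available.
    final_message = stream.get_final_message()

print()
print(final_message.usage)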

Tool Calling (Function Calling)

For more details, see the MiniMax-M2 Function Calling Guide.
import anthropic

client = anthropic.Anthropic()

tools = [
    {
        "name": "get_weather",
        "description": "Get weather information for a specified city",
        "input_schema": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name"
                }
            },
            "required": ["city"]
        }
    }
]

message = client.messages.create(
    model="MiniMax-M2",
    max_tokens=1024,
    tools=tools,
    messages=[
        {
            "role": "user",
            "content": "What's the weather like in Beijing today?"
        }
    ]
)

print(message.content)
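
The response above ends in a tool_use block; to complete the loop, run the tool yourself and send its output back as a tool_result block (see the Messages field table). A minimal sketch that continues the example and reuses client, tools, and message; the weather_report dictionary is a hypothetical stand-in for a real weather lookup:

# Continues from the previous example: client, tools, and message are reused.
tool_use = next(block for block in message.content if block.type == "tool_use")

# Run your own tool implementation here; this result is a made-up placeholder.
weather_report = {"city": tool_use.input["city"], "condition": "sunny", "temperature_c": 25}

follow_up = client.messages.create(
    model="MiniMax-M2",
    max_tokens=1024,
    tools=tools,
    messages=[
        {"role": "user", "content": "What's the weather like in Beijing today?"},
        {"role": "assistant", "content": message.content},
        {
            "role": "user",
            "content": [
                {
                    "type": "tool_result",
                    "tool_use_id": tool_use.id,
                    "content": str(weather_report),
                }
            ],
        },
    ],
)

for block in follow_up.content:
    if block.type == "text":
        print(block.text)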

Important Notes

  1. The Anthropic API compatibility interface currently only supports the MiniMax-M2 model
  2. Set ANTHROPIC_BASE_URL to https://api.minimax.io/anthropic when using the Anthropic SDK
  3. Set ANTHROPIC_API_KEY to your MiniMax API Key
  4. The temperature parameter range is (0.0, 1.0], values outside this range will return an error
  5. Some Anthropic parameters (such as top_k, stop_sequences, service_tier, mcp_servers, context_management, container) will be ignored
  6. Image and document type inputs are not currently supported
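
For example, a temperature outside (0.0, 1.0] should be rejected by the server. A minimal sketch of catching that failure with the Anthropic SDK's exception types; the exact status code and error message returned by the endpoint are not guaranteed:

import anthropic

client = anthropic.Anthropic()

try:
    client.messages.create(
        model="MiniMax-M2",
        max_tokens=64,
        temperature=2.0,  # outside (0.0, 1.0], expected to be rejected
        messages=[{"role": "user", "content": "Hello"}],
    )
except anthropic.APIStatusError as e:  # typically a BadRequestError for invalid parameters
    print("Request rejected:", e)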