Use the Anthropic API-compatible format to call MiniMax models, supporting role-playing, multi-turn conversations, and other dialogue scenarios. Rich role settings (system, user_system, group, etc.) and example-dialogue learning are supported.
Authorization
HTTP: Bearer Auth

Headers
Content-Type: Media type of the request body; must be set to application/json to ensure JSON format.

Body
model: Model ID. Options: MiniMax-M2.7, MiniMax-M2.7-highspeed, MiniMax-M2.5, MiniMax-M2.1.
messages: A list of messages containing the conversation history.
system: Sets the role and behavior of the model.
stream: Whether to use streaming output; defaults to false. When set to true, the response is returned in chunks.
max_tokens: Upper limit for generated content length in tokens (x >= 1); maximum is 2048. Content exceeding the limit is truncated. If generation stops due to length, try increasing this value.
temperature: Temperature coefficient affecting output randomness; range (0, 1], default 1.0 for MiniMax models. Higher values produce more random output; lower values produce more deterministic output.
top_p: Sampling strategy affecting output randomness; range (0, 1], default 0.95 for MiniMax models.
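The request parameters above can be sketched as follows. This is a minimal illustration, not a definitive client: the endpoint URL and environment-variable name are placeholders (substitute the values from your MiniMax console), and the payload uses only the fields documented above.

```python
import json
import os

# Placeholder endpoint and env var for illustration only; replace with the
# actual values from your MiniMax console.
API_URL = "https://api.example.com/v1/messages"
API_KEY = os.environ.get("MINIMAX_API_KEY", "sk-placeholder")

def build_request(user_text: str,
                  system_prompt: str = "You are a helpful assistant.") -> dict:
    """Assemble the headers and JSON body per the parameters above."""
    headers = {
        "Authorization": f"Bearer {API_KEY}",  # HTTP Bearer auth
        "Content-Type": "application/json",    # body must be JSON
    }
    body = {
        "model": "MiniMax-M2.5",
        "system": system_prompt,               # role/behavior of the model
        "messages": [{"role": "user", "content": user_text}],
        "max_tokens": 1024,                    # x >= 1, at most 2048
        "temperature": 1.0,                    # (0, 1], default 1.0
        "top_p": 0.95,                         # (0, 1], default 0.95
        "stream": False,                       # chunked output when True
    }
    return {"headers": headers, "json": body}

req = build_request("Introduce yourself in one sentence.")
# To send, e.g.: requests.post(API_URL, headers=req["headers"], json=req["json"])
print(json.dumps(req["json"], indent=2))
```

Keeping payload construction separate from transport makes the parameter constraints easy to validate before any network call is made.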
Response
id: Unique ID of this response.
type: Object type, fixed as message.
role: Role, fixed as assistant.
model: Model ID used for this request.
content: List of response content blocks.
stop_reason: Reason for stopping generation. One of:
  end_turn: the model ended naturally
  max_tokens: the max_tokens limit was reached
  stop_sequence: a stop sequence was hit
usage: Token usage for this request.
error: Error status code and details.
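A minimal sketch of consuming such a response body. The helper name is hypothetical, and the `{"type": "text", "text": ...}` content-block shape is an assumption based on the Anthropic Messages format the endpoint is compatible with; it shows how `stop_reason` and `error` from the fields above might be handled.

```python
def extract_reply(resp: dict) -> str:
    """Concatenate text blocks from a Messages-style response dict and
    surface errors/truncation, per the response fields described above."""
    if "error" in resp:
        raise RuntimeError(f"API error: {resp['error']}")
    text = "".join(
        block.get("text", "")
        for block in resp.get("content", [])
        if block.get("type") == "text"  # assumed Anthropic-style text block
    )
    if resp.get("stop_reason") == "max_tokens":
        # Generation was cut off; consider retrying with a larger
        # max_tokens value (up to the 2048 limit).
        text += " [truncated]"
    return text

# Illustrative response shape (not captured from a live call):
sample = {
    "id": "msg_123",
    "type": "message",
    "role": "assistant",
    "model": "MiniMax-M2.5",
    "content": [{"type": "text", "text": "Hello!"}],
    "stop_reason": "end_turn",
    "usage": {"input_tokens": 10, "output_tokens": 3},
}
print(extract_reply(sample))  # Hello!
```

Checking `stop_reason` before trusting the output lets a caller distinguish a natural `end_turn` from a `max_tokens` truncation that warrants a retry.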