Record Class McpSchema.CreateMessageRequest

java.lang.Object
java.lang.Record
io.modelcontextprotocol.spec.McpSchema.CreateMessageRequest
Record Components:
messages - The conversation messages to send to the LLM
modelPreferences - The server's preferences for which model to select. The client MAY ignore these preferences
systemPrompt - An optional system prompt the server wants to use for sampling. The client MAY modify or omit this prompt
includeContext - A request to include context from one or more MCP servers (including the caller), to be attached to the prompt. The client MAY ignore this request
temperature - Optional temperature parameter for sampling
maxTokens - The maximum number of tokens to sample, as requested by the server. The client MAY choose to sample fewer tokens than requested
stopSequences - Optional stop sequences for sampling
metadata - Optional metadata to pass through to the LLM provider. The format of this metadata is provider-specific
meta - See specification for notes on _meta usage
All Implemented Interfaces:
McpSchema.Request
Enclosing class:
McpSchema

public static record McpSchema.CreateMessageRequest(
        List<McpSchema.SamplingMessage> messages,
        McpSchema.ModelPreferences modelPreferences,
        String systemPrompt,
        McpSchema.CreateMessageRequest.ContextInclusionStrategy includeContext,
        Double temperature,
        int maxTokens,
        List<String> stopSequences,
        Map<String,Object> metadata,
        Map<String,Object> meta)
extends Record implements McpSchema.Request
A request from the server to sample an LLM via the client. The client has full discretion over which model to select. The client should also inform the user before beginning sampling, allowing them to inspect the request (human-in-the-loop) and decide whether to approve it.
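The sketch below illustrates constructing such a request through the canonical constructor shown in the signature above. It is a minimal, self-contained illustration: the nested stub types (`SamplingMessage`, `ModelPreferences`, `ContextInclusionStrategy`) are hypothetical stand-ins mirroring only the shape implied by this page, not the real classes from `io.modelcontextprotocol.spec.McpSchema`, and the field values are invented for demonstration.

```java
import java.util.List;
import java.util.Map;

public class CreateMessageRequestSketch {

    // Hypothetical stand-ins for the SDK types referenced in the record
    // components; the real definitions live in McpSchema.
    record SamplingMessage(String role, String text) {}
    record ModelPreferences(Double intelligencePriority) {}
    enum ContextInclusionStrategy { NONE, THIS_SERVER, ALL_SERVERS }

    // Mirrors the canonical constructor from the signature above.
    record CreateMessageRequest(
            List<SamplingMessage> messages,
            ModelPreferences modelPreferences,
            String systemPrompt,
            ContextInclusionStrategy includeContext,
            Double temperature,
            int maxTokens,
            List<String> stopSequences,
            Map<String, Object> metadata,
            Map<String, Object> meta) {}

    public static void main(String[] args) {
        CreateMessageRequest request = new CreateMessageRequest(
                List.of(new SamplingMessage("user", "Summarize the build log")),
                null,                               // no model preferences; client picks freely
                "You are a concise assistant.",     // client MAY modify or omit this prompt
                ContextInclusionStrategy.NONE,      // client MAY ignore this request
                0.2,                                // optional sampling temperature
                256,                                // client MAY sample fewer tokens
                List.of("END"),                     // optional stop sequences
                Map.of(),                           // provider-specific metadata passthrough
                Map.of());                          // _meta; see specification notes
        System.out.println(request.maxTokens());
    }
}
```

Note that `temperature` is a nullable `Double` while `maxTokens` is a primitive `int`, reflecting that a token cap is always present in the request while temperature is optional.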