public class ControlAIConversationRequest extends AbstractModel
Fields inherited from class AbstractModel: header, skipSign

| Constructor and Description | 
|---|
| ControlAIConversationRequest() | 
| ControlAIConversationRequest(ControlAIConversationRequest source) NOTE: Any ambiguous key set via .set("AnyKey", "value") will be a shallow copy, and any explicit key, i.e. Foo, set via .setFoo("value") will be a deep copy. | 
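
A minimal sketch of the copy semantics described in the note above, assuming the class lives in the Tencent Cloud TRTC models package (the package path com.tencentcloudapi.trtc.v20190722.models is an assumption); the copy constructor, the setters, and the inherited set(...) method are the ones listed on this page:

```java
import com.tencentcloudapi.trtc.v20190722.models.ControlAIConversationRequest;

public class CopySemanticsSketch {
    public static void main(String[] args) throws Exception {
        ControlAIConversationRequest original = new ControlAIConversationRequest();
        original.setTaskId("task-123");   // explicit key: deep-copied by the copy constructor
        original.set("AnyKey", "value");  // ambiguous key: shallow-copied by the copy constructor

        // Clone the request, then mutate the copy; the original's explicit fields stay intact.
        ControlAIConversationRequest copy = new ControlAIConversationRequest(original);
        copy.setTaskId("task-456");
        System.out.println(original.getTaskId()); // prints "task-123"
    }
}
```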
| Modifier and Type | Method and Description | 
|---|---|
| String | getCommand() Get Control command. Currently supported commands: - ServerPushText: the server sends text to the AI robot, and the AI robot broadcasts it. | 
| InvokeLLM | getInvokeLLM() Get The server sends a command to proactively request the LLM. When Command is InvokeLLM, the content is sent to the LLM with the header X-Invoke-LLM="1" added. | 
| ServerPushText | getServerPushText() Get Server-pushed broadcast text command. Required when Command is ServerPushText. | 
| String | getTaskId() Get Unique task identifier. | 
| void | setCommand(String Command) Set Control command. Currently supported commands: - ServerPushText: the server sends text to the AI robot, and the AI robot broadcasts it. | 
| void | setInvokeLLM(InvokeLLM InvokeLLM) Set The server sends a command to proactively request the LLM. When Command is InvokeLLM, the content is sent to the LLM with the header X-Invoke-LLM="1" added. | 
| void | setServerPushText(ServerPushText ServerPushText) Set Server-pushed broadcast text command. Required when Command is ServerPushText. | 
| void | setTaskId(String TaskId) Set Unique task identifier. | 
| void | toMap(HashMap<String,String> map, String prefix) Internal implementation, normal users should not use it. | 
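
For a ServerPushText command, a typical request is assembled with the setters above. A minimal sketch, assuming the same TRTC models package path and a hypothetical setText(...) field on ServerPushText (check that class's own Javadoc for the real field names):

```java
import com.tencentcloudapi.trtc.v20190722.models.ControlAIConversationRequest;
import com.tencentcloudapi.trtc.v20190722.models.ServerPushText;

public class ServerPushTextSketch {
    public static void main(String[] args) {
        ControlAIConversationRequest req = new ControlAIConversationRequest();
        req.setTaskId("your-task-id");    // unique task identifier
        req.setCommand("ServerPushText"); // the AI robot will broadcast the pushed text

        ServerPushText push = new ServerPushText();
        push.setText("Hello from the server"); // hypothetical field name, for illustration only
        req.setServerPushText(push);           // required when Command is ServerPushText
    }
}
```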
Methods inherited from class AbstractModel: any, fromJsonString, getBinaryParams, GetHeader, getMultipartRequestParams, getSkipSign, isStream, set, SetHeader, setParamArrayObj, setParamArraySimple, setParamObj, setParamSimple, setSkipSign, toJsonString

public ControlAIConversationRequest()
public ControlAIConversationRequest(ControlAIConversationRequest source)
public String getTaskId()
public void setTaskId(String TaskId)
TaskId - Unique task identifier

public String getCommand()
public void setCommand(String Command)
Command - Control command. Currently supported commands: - ServerPushText: the server sends text to the AI robot, and the AI robot broadcasts it. - InvokeLLM: the server sends text to the LLM to trigger a conversation.

public ServerPushText getServerPushText()
public void setServerPushText(ServerPushText ServerPushText)
ServerPushText - Server-pushed broadcast text command. Required when Command is ServerPushText.

public InvokeLLM getInvokeLLM()
public void setInvokeLLM(InvokeLLM InvokeLLM)
InvokeLLM - The server sends a command to proactively request the LLM. When Command is InvokeLLM, the content is sent to the LLM with the header X-Invoke-LLM="1" added.
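
Putting the InvokeLLM variant together, a minimal end-to-end sketch, again assuming the TRTC models package path, a hypothetical setContent(...) field on InvokeLLM, and the toJsonString helper inherited from AbstractModel for inspecting the built request:

```java
import com.tencentcloudapi.common.AbstractModel;
import com.tencentcloudapi.trtc.v20190722.models.ControlAIConversationRequest;
import com.tencentcloudapi.trtc.v20190722.models.InvokeLLM;

public class InvokeLLMSketch {
    public static void main(String[] args) {
        ControlAIConversationRequest req = new ControlAIConversationRequest();
        req.setTaskId("your-task-id"); // unique task identifier
        req.setCommand("InvokeLLM");   // send text to the LLM to trigger a conversation

        InvokeLLM invoke = new InvokeLLM();
        invoke.setContent("Summarize the last answer."); // hypothetical field name, for illustration only
        req.setInvokeLLM(invoke);                        // required when Command is InvokeLLM

        // toJsonString comes from AbstractModel (see the inherited-methods list above).
        System.out.println(AbstractModel.toJsonString(req));
    }
}
```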