Class PutAmazonbedrockRequest
- All Implemented Interfaces:
JsonpSerializable
Creates an inference endpoint to perform an inference task with the amazonbedrock service.
Note: You need to provide the access and secret keys only once, during the inference model creation. The get inference API does not retrieve your access or secret keys. After creating the inference model, you cannot change the associated key pair. If you want to use a different access and secret key pair, delete the inference model and recreate it with the same name and the updated keys.
When you create an inference endpoint, the associated machine learning model is automatically deployed if it is not already running. After creating the endpoint, wait for the model deployment to complete before using it. To verify the deployment status, use the get trained model statistics API. Look for "state": "fully_allocated" in the response and ensure that the "allocation_count" matches the "target_allocation_count". Avoid creating multiple endpoints for the same model unless required, as each endpoint consumes significant resources.
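As a sketch of typical usage (assuming the elasticsearch-java client is on the classpath and an `ElasticsearchClient` named `client` is already configured; the inference ID, region, provider, model, and credential placeholders below are illustrative, and the builder and enum names follow the client's usual conventions rather than being quoted from this page):

```java
// Hedged sketch: create an amazonbedrock inference endpoint.
// All concrete values below are illustrative placeholders.
PutAmazonbedrockResponse response = client.inference().putAmazonbedrock(r -> r
    .taskType(AmazonBedrockTaskType.TextEmbedding)
    .amazonbedrockInferenceId("amazon_bedrock_embeddings")
    .service(AmazonBedrockServiceType.Amazonbedrock)
    .serviceSettings(s -> s
        .accessKey("<AWS_ACCESS_KEY>")     // provided only once, at creation
        .secretKey("<AWS_SECRET_KEY>")     // not retrievable afterwards
        .region("us-east-1")
        .provider("amazontitan")
        .model("amazon.titan-embed-text-v1")
    )
);
```

Because the key pair cannot be changed after creation, a different key pair requires deleting the endpoint and recreating it with the same name.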
-
Nested Class Summary
Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
-
Field Summary
Fields:
static final JsonpDeserializer<PutAmazonbedrockRequest> _DESERIALIZER
Json deserializer for PutAmazonbedrockRequest
static final Endpoint<PutAmazonbedrockRequest, PutAmazonbedrockResponse, ErrorResponse> _ENDPOINT
Endpoint "inference.put_amazonbedrock".
-
Method Summary
final String amazonbedrockInferenceId()
Required - The unique identifier of the inference endpoint.
final InferenceChunkingSettings chunkingSettings()
The chunking configuration object.
static PutAmazonbedrockRequest of(Function<PutAmazonbedrockRequest.Builder, ObjectBuilder<PutAmazonbedrockRequest>> fn)
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
final AmazonBedrockServiceType service()
Required - The type of service supported for the specified task type.
final AmazonBedrockServiceSettings serviceSettings()
Required - Settings used to install the inference model.
protected static void setupPutAmazonbedrockRequestDeserializer(ObjectDeserializer<PutAmazonbedrockRequest.Builder> op)
final AmazonBedrockTaskSettings taskSettings()
Settings to configure the inference task.
final AmazonBedrockTaskType taskType()
Required - The type of the inference task that the model will perform.
Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase:
toString
-
Field Details
-
_DESERIALIZER
public static final JsonpDeserializer<PutAmazonbedrockRequest> _DESERIALIZER
Json deserializer for PutAmazonbedrockRequest
-
_ENDPOINT
public static final Endpoint<PutAmazonbedrockRequest, PutAmazonbedrockResponse, ErrorResponse> _ENDPOINT
Endpoint "inference.put_amazonbedrock".
-
-
Method Details
-
of
public static PutAmazonbedrockRequest of(Function<PutAmazonbedrockRequest.Builder, ObjectBuilder<PutAmazonbedrockRequest>> fn)
-
amazonbedrockInferenceId
Required - The unique identifier of the inference endpoint.
API name: amazonbedrock_inference_id
-
chunkingSettings
The chunking configuration object.
API name: chunking_settings
-
service
Required - The type of service supported for the specified task type. In this case, amazonbedrock.
API name: service
-
serviceSettings
Required - Settings used to install the inference model. These settings are specific to the amazonbedrock service.
API name: service_settings
-
taskSettings
Settings to configure the inference task. These settings are specific to the task type you specified.
API name: task_settings
-
taskType
Required - The type of the inference task that the model will perform.
API name: task_type
-
serialize
Serialize this object to JSON.
- Specified by:
serialize in interface JsonpSerializable
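A minimal sketch of rendering a request as a JSON string via this method (assuming elasticsearch-java with its Jackson-based mapper is on the classpath, and that `request` is a previously built PutAmazonbedrockRequest; this follows the client's usual JsonpSerializable pattern and is not quoted from this page):

```java
import co.elastic.clients.json.JsonpMapper;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import java.io.StringWriter;

// Sketch: serialize the request body to a JSON string.
// "request" is assumed to be a previously built PutAmazonbedrockRequest.
JsonpMapper mapper = new JacksonJsonpMapper();
StringWriter writer = new StringWriter();
try (jakarta.json.stream.JsonGenerator generator =
         mapper.jsonProvider().createGenerator(writer)) {
    request.serialize(generator, mapper);
}
String json = writer.toString();
```

This can be handy for logging or inspecting the exact body the client would send; note that secret values in the request would appear in the output.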
-
serializeInternal
-
setupPutAmazonbedrockRequestDeserializer
protected static void setupPutAmazonbedrockRequestDeserializer(ObjectDeserializer<PutAmazonbedrockRequest.Builder> op)
-