Class PutHuggingFaceRequest
java.lang.Object
co.elastic.clients.elasticsearch._types.RequestBase
co.elastic.clients.elasticsearch.inference.PutHuggingFaceRequest
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable
public class PutHuggingFaceRequest
extends RequestBase
implements JsonpSerializable
Create a Hugging Face inference endpoint.
Create an inference endpoint to perform an inference task with the hugging_face service.
You must first create an inference endpoint on the Hugging Face endpoint page to get an endpoint URL. Select the model you want to use on the new endpoint creation page (for example intfloat/e5-small-v2), then select the sentence embeddings task under the advanced configuration section. Create the endpoint and copy the URL after the endpoint initialization has finished.
The following models are recommended for the Hugging Face service:
all-MiniLM-L6-v2
all-MiniLM-L12-v2
all-mpnet-base-v2
e5-base-v2
e5-small-v2
multilingual-e5-base
multilingual-e5-small
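For orientation, here is a minimal sketch of creating such an endpoint with the fluent builder. It assumes an already-configured ElasticsearchClient, the generated inference().putHuggingFace(...) client method, HuggingFaceServiceSettings properties apiKey and url, and the enum constant names noted in the comments; the URL and access-token values are placeholders.

import co.elastic.clients.elasticsearch.ElasticsearchClient;
import co.elastic.clients.elasticsearch.inference.HuggingFaceServiceType;
import co.elastic.clients.elasticsearch.inference.HuggingFaceTaskType;
import co.elastic.clients.elasticsearch.inference.PutHuggingFaceResponse;

public class PutHuggingFaceExample {

    // Sketch only: client wiring is omitted, and the constant and property names
    // below follow the usual code-generation conventions rather than being
    // confirmed by this page.
    static PutHuggingFaceResponse createEndpoint(ElasticsearchClient client) throws Exception {
        return client.inference().putHuggingFace(r -> r
            .huggingfaceInferenceId("hugging-face-embeddings")
            .taskType(HuggingFaceTaskType.TextEmbedding)     // assumed constant for "text_embedding"
            .service(HuggingFaceServiceType.HuggingFace)     // assumed constant for "hugging_face"
            .serviceSettings(s -> s
                .apiKey("<HF_ACCESS_TOKEN>")                 // placeholder Hugging Face access token
                .url("<HF_ENDPOINT_URL>")                    // URL copied from the Hugging Face endpoint page
            )
        );
    }
}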
-
Nested Class Summary
Nested Classes
static class PutHuggingFaceRequest.Builder
    Builder for PutHuggingFaceRequest.
Nested classes/interfaces inherited from class co.elastic.clients.elasticsearch._types.RequestBase
RequestBase.AbstractBuilder<BuilderT extends RequestBase.AbstractBuilder<BuilderT>>
-
Field Summary
Fields
static final JsonpDeserializer<PutHuggingFaceRequest> _DESERIALIZER
    Json deserializer for PutHuggingFaceRequest
static final Endpoint<PutHuggingFaceRequest, PutHuggingFaceResponse, ErrorResponse> _ENDPOINT
    Endpoint "inference.put_hugging_face".
-
Method Summary
chunkingSettings()
    The chunking configuration object.
final String huggingfaceInferenceId()
    Required - The unique identifier of the inference endpoint.
static PutHuggingFaceRequest of(Function<PutHuggingFaceRequest.Builder, ObjectBuilder<PutHuggingFaceRequest>> fn)
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
    Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
final HuggingFaceServiceType service()
    Required - The type of service supported for the specified task type.
serviceSettings()
    Required - Settings used to install the inference model.
protected static void setupPutHuggingFaceRequestDeserializer(ObjectDeserializer<PutHuggingFaceRequest.Builder> op)
final HuggingFaceTaskType taskType()
    Required - The type of the inference task that the model will perform.
final Time timeout()
    Specifies the amount of time to wait for the inference endpoint to be created.
Methods inherited from class co.elastic.clients.elasticsearch._types.RequestBase
toString
-
Field Details
-
_DESERIALIZER
Json deserializer for PutHuggingFaceRequest
-
_ENDPOINT
Endpoint "inference.put_hugging_face
".
-
-
Method Details
-
of
public static PutHuggingFaceRequest of(Function<PutHuggingFaceRequest.Builder, ObjectBuilder<PutHuggingFaceRequest>> fn)
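A short sketch of the lambda-based factory, as it would appear inside a method body (constant and property names assumed as in the class-level example above):

PutHuggingFaceRequest request = PutHuggingFaceRequest.of(r -> r
    .huggingfaceInferenceId("hugging-face-embeddings")
    .taskType(HuggingFaceTaskType.TextEmbedding)           // assumed enum constant
    .service(HuggingFaceServiceType.HuggingFace)           // assumed enum constant
    .serviceSettings(s -> s.apiKey("<HF_ACCESS_TOKEN>").url("<HF_ENDPOINT_URL>"))
);
-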
chunkingSettings
The chunking configuration object.
API name: chunking_settings
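A hedged sketch of supplying custom chunking through the lambda form, assuming the chunking-settings builder exposes strategy, maxChunkSize and sentenceOverlap properties mirroring the REST chunking_settings object (verify against your client version):

PutHuggingFaceRequest request = PutHuggingFaceRequest.of(r -> r
    .huggingfaceInferenceId("hugging-face-embeddings")
    .taskType(HuggingFaceTaskType.TextEmbedding)
    .service(HuggingFaceServiceType.HuggingFace)
    .serviceSettings(s -> s.apiKey("<HF_ACCESS_TOKEN>").url("<HF_ENDPOINT_URL>"))
    // Assumed property names on the chunking settings builder.
    .chunkingSettings(c -> c.strategy("sentence").maxChunkSize(250).sentenceOverlap(1))
);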
-
huggingfaceInferenceId
Required - The unique identifier of the inference endpoint.
API name: huggingface_inference_id
-
service
Required - The type of service supported for the specified task type. In this case, hugging_face.
API name: service
-
serviceSettings
Required - Settings used to install the inference model. These settings are specific to the hugging_face service.
API name: service_settings
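The settings object can also be built standalone and passed to the builder's serviceSettings(...) setter; a sketch assuming HuggingFaceServiceSettings exposes apiKey and url, matching the REST api_key and url fields:

// Placeholder values; property names assumed.
HuggingFaceServiceSettings settings = HuggingFaceServiceSettings.of(s -> s
    .apiKey("<HF_ACCESS_TOKEN>")
    .url("<HF_ENDPOINT_URL>"));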
-
taskType
Required - The type of the inference task that the model will perform.
API name: task_type
-
timeout
Specifies the amount of time to wait for the inference endpoint to be created.
API name: timeout
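The value is a Time; it can be built directly or supplied through the lambda form on the builder, for example:

Time timeout = Time.of(t -> t.time("45s"));      // co.elastic.clients.elasticsearch._types.Time
// or, on the request builder: .timeout(t -> t.time("45s"))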
-
serialize
Serialize this object to JSON.
Specified by:
serialize in interface JsonpSerializable
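For logging or tests, the request body can be rendered to a JSON string by serializing into a generator backed by a StringWriter; a sketch assuming the Jackson-backed mapper is available on the classpath:

import co.elastic.clients.elasticsearch.inference.PutHuggingFaceRequest;
import co.elastic.clients.json.JsonpMapper;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import jakarta.json.stream.JsonGenerator;
import java.io.StringWriter;

final class RequestJson {

    // Renders the request body as a JSON string; closing the generator flushes it.
    static String toJson(PutHuggingFaceRequest request) {
        JsonpMapper mapper = new JacksonJsonpMapper();
        StringWriter writer = new StringWriter();
        try (JsonGenerator generator = mapper.jsonProvider().createGenerator(writer)) {
            request.serialize(generator, mapper);
        }
        return writer.toString();
    }
}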
-
serializeInternal
-
setupPutHuggingFaceRequestDeserializer
protected static void setupPutHuggingFaceRequestDeserializer(ObjectDeserializer<PutHuggingFaceRequest.Builder> op)
-