Class TrainedModelInferenceStats
java.lang.Object
co.elastic.clients.elasticsearch.ml.TrainedModelInferenceStats
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable
public class TrainedModelInferenceStats
extends Object
implements JsonpSerializable
Nested Class Summary
Nested Classes:
static class TrainedModelInferenceStats.Builder
Field Summary
Fields:
static final JsonpDeserializer<TrainedModelInferenceStats> _DESERIALIZER: Json deserializer for TrainedModelInferenceStats
Method Summary
final int cacheMissCount(): Required - The number of times the model was loaded for inference and was not retrieved from the cache.
final int failureCount(): Required - The number of failures when using the model for inference.
final int inferenceCount(): Required - The total number of times the model has been called for inference.
final int missingAllFieldsCount(): Required - The number of inference calls where all the training features for the model were missing.
static TrainedModelInferenceStats of(Function<TrainedModelInferenceStats.Builder, ObjectBuilder<TrainedModelInferenceStats>> fn)
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper): Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
protected static void setupTrainedModelInferenceStatsDeserializer(ObjectDeserializer<TrainedModelInferenceStats.Builder> op)
final long timestamp(): Required - The time when the statistics were last updated.
String toString()
Field Details
_DESERIALIZER
static final JsonpDeserializer<TrainedModelInferenceStats> _DESERIALIZER
Json deserializer for TrainedModelInferenceStats
Method Details
of
public static TrainedModelInferenceStats of(Function<TrainedModelInferenceStats.Builder, ObjectBuilder<TrainedModelInferenceStats>> fn)
cacheMissCount
public final int cacheMissCount()
Required - The number of times the model was loaded for inference and was not retrieved from the cache. If this number is close to the inference_count, the cache is not being appropriately used. This can be solved by increasing the cache size or its time-to-live (TTL). Refer to the general machine learning settings for the appropriate settings.
API name: cache_miss_count
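As the description above suggests, comparing cache_miss_count with inference_count indicates whether the inference cache is effective. A minimal sketch of that check (the helper class and method are illustrative, not part of the Elasticsearch Java client):

```java
// Illustrative helper, not part of the Elasticsearch Java client:
// estimates how well the inference cache is being used from the two
// counters documented above.
public class InferenceCacheCheck {

    /**
     * Fraction of inference calls that were not served from the cache.
     * A ratio close to 1.0 suggests increasing the cache size or its TTL.
     */
    public static double cacheMissRatio(int cacheMissCount, int inferenceCount) {
        if (inferenceCount == 0) {
            return 0.0; // no inference calls yet, nothing to report
        }
        return (double) cacheMissCount / inferenceCount;
    }
}
```

For example, 95 cache misses out of 100 inference calls yields a ratio of 0.95, a strong hint that the cache size or TTL should be raised.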
failureCount
public final int failureCount()
Required - The number of failures when using the model for inference.
API name: failure_count
inferenceCount
public final int inferenceCount()
Required - The total number of times the model has been called for inference. This is across all inference contexts, including all pipelines.
API name: inference_count
missingAllFieldsCount
public final int missingAllFieldsCount()
Required - The number of inference calls where all the training features for the model were missing.
API name: missing_all_fields_count
timestamp
public final long timestamp()
Required - The time when the statistics were last updated.
API name: timestamp
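The timestamp() accessor returns a raw long. Elasticsearch timestamps are conventionally Unix epoch milliseconds, though this is an assumption worth verifying for your client version; under that assumption, the value can be converted to java.time.Instant for easier handling:

```java
import java.time.Instant;

// Illustrative conversion helper, not part of the Elasticsearch Java
// client. Assumes timestamp() returns Unix epoch milliseconds, the
// usual convention for Elasticsearch timestamp fields.
public class StatsTimestamp {

    public static Instant toInstant(long timestampMillis) {
        return Instant.ofEpochMilli(timestampMillis);
    }
}
```

For example, a value of 0L corresponds to 1970-01-01T00:00:00Z.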
serialize
public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
Specified by: serialize in interface JsonpSerializable
serializeInternal
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
toString
public String toString()
Overrides: toString in class Object
setupTrainedModelInferenceStatsDeserializer
protected static void setupTrainedModelInferenceStatsDeserializer(ObjectDeserializer<TrainedModelInferenceStats.Builder> op)