Class TrainedModelInferenceStats
java.lang.Object
co.elastic.clients.elasticsearch.ml.TrainedModelInferenceStats
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable
public class TrainedModelInferenceStats
extends Object
implements JsonpSerializable
Nested Class Summary
Nested Classes
static class TrainedModelInferenceStats.Builder
Builder for TrainedModelInferenceStats.
Field Summary
Fields
static final JsonpDeserializer<TrainedModelInferenceStats> _DESERIALIZER
Json deserializer for TrainedModelInferenceStats
Method Summary
final long cacheMissCount()
Required - The number of times the model was loaded for inference and was not retrieved from the cache.
final long failureCount()
Required - The number of failures when using the model for inference.
final long inferenceCount()
Required - The total number of times the model has been called for inference.
final long missingAllFieldsCount()
Required - The number of inference calls where all the training features for the model were missing.
static TrainedModelInferenceStats of(Function<TrainedModelInferenceStats.Builder, ObjectBuilder<TrainedModelInferenceStats>> fn)
void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
protected static void setupTrainedModelInferenceStatsDeserializer(ObjectDeserializer<TrainedModelInferenceStats.Builder> op)
final Time timestamp()
Required - The time when the statistics were last updated.
String toString()
Field Details
_DESERIALIZER
static final JsonpDeserializer<TrainedModelInferenceStats> _DESERIALIZER
Json deserializer for TrainedModelInferenceStats
Method Details
of
public static TrainedModelInferenceStats of(Function<TrainedModelInferenceStats.Builder, ObjectBuilder<TrainedModelInferenceStats>> fn)
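As a sketch, the static of method builds an instance from a lambda over TrainedModelInferenceStats.Builder, the usual functional-builder shortcut in the Elasticsearch Java client. This example assumes the client library is on the classpath; all counter values are made up for illustration, and in practice the object is deserialized from a stats response rather than built by hand.

```java
import co.elastic.clients.elasticsearch.ml.TrainedModelInferenceStats;

public class OfExample {
    public static void main(String[] args) {
        // Hypothetical values; real instances come from an ML stats response.
        TrainedModelInferenceStats stats = TrainedModelInferenceStats.of(b -> b
            .failureCount(2)
            .inferenceCount(1000)
            .cacheMissCount(150)
            .missingAllFieldsCount(0)
            .timestamp(t -> t.time("2024-01-01T00:00:00Z"))
        );
        System.out.println(stats.inferenceCount()); // 1000
    }
}
```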
failureCount
public final long failureCount()
Required - The number of failures when using the model for inference.
API name: failure_count
inferenceCount
public final long inferenceCount()
Required - The total number of times the model has been called for inference. This is across all inference contexts, including all pipelines.
API name: inference_count
cacheMissCount
public final long cacheMissCount()
Required - The number of times the model was loaded for inference and was not retrieved from the cache. If this number is close to the inference_count, then the cache is not being appropriately used. This can be solved by increasing the cache size or its time-to-live (TTL). See General machine learning settings for the appropriate settings.
API name: cache_miss_count
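The guidance above, that a cache_miss_count close to inference_count signals an underused cache, can be checked with simple arithmetic once the two counters have been read. A minimal sketch, using plain long values standing in for the results of cacheMissCount() and inferenceCount():

```java
public class CacheMissRatio {
    // Fraction of inference calls that bypassed the model cache.
    static double missRatio(long cacheMissCount, long inferenceCount) {
        if (inferenceCount == 0) {
            return 0.0; // no inference calls yet, nothing to report
        }
        return (double) cacheMissCount / inferenceCount;
    }

    public static void main(String[] args) {
        // Hypothetical counter values for illustration.
        long inferenceCount = 1_000;
        long cacheMissCount = 950;
        double ratio = missRatio(cacheMissCount, inferenceCount);
        // A ratio near 1.0 suggests increasing the cache size or its TTL.
        System.out.println(ratio > 0.9 ? "cache underused" : "cache ok");
        // prints "cache underused"
    }
}
```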
missingAllFieldsCount
public final long missingAllFieldsCount()
Required - The number of inference calls where all the training features for the model were missing.
API name: missing_all_fields_count
timestamp
public final Time timestamp()
Required - The time when the statistics were last updated.
API name: timestamp
serialize
public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
Specified by:
serialize in interface JsonpSerializable
serializeInternal
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
toString
-
setupTrainedModelInferenceStatsDeserializer
protected static void setupTrainedModelInferenceStatsDeserializer(ObjectDeserializer<TrainedModelInferenceStats.Builder> op)