Class TrainedModelInferenceStats
java.lang.Object
co.elastic.clients.elasticsearch.ml.TrainedModelInferenceStats
- All Implemented Interfaces:
JsonpSerializable
@JsonpDeserializable
public class TrainedModelInferenceStats
extends Object
implements JsonpSerializable
-
Nested Class Summary
static class TrainedModelInferenceStats.Builder
Builder for TrainedModelInferenceStats.
-
Field Summary
- static final JsonpDeserializer<TrainedModelInferenceStats> _DESERIALIZER
  Json deserializer for TrainedModelInferenceStats
-
Method Summary
- final int cacheMissCount()
  Required - The number of times the model was loaded for inference and was not retrieved from the cache.
- final int failureCount()
  Required - The number of failures when using the model for inference.
- final int inferenceCount()
  Required - The total number of times the model has been called for inference.
- final int missingAllFieldsCount()
  Required - The number of inference calls where all the training features for the model were missing.
- static TrainedModelInferenceStats of(Function<TrainedModelInferenceStats.Builder, ObjectBuilder<TrainedModelInferenceStats>> fn)
- void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
  Serialize this object to JSON.
- protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
- protected static void setupTrainedModelInferenceStatsDeserializer(ObjectDeserializer<TrainedModelInferenceStats.Builder> op)
- final DateTime timestamp()
  Required - The time when the statistics were last updated.
- String toString()
-
Field Details
-
_DESERIALIZER
public static final JsonpDeserializer<TrainedModelInferenceStats> _DESERIALIZER
Json deserializer for TrainedModelInferenceStats
-
-
Method Details
-
of
public static TrainedModelInferenceStats of(Function<TrainedModelInferenceStats.Builder, ObjectBuilder<TrainedModelInferenceStats>> fn)
-
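A minimal usage sketch (not part of the generated Javadoc): of applies the given lambda to a fresh TrainedModelInferenceStats.Builder and builds the result. All counts below are hypothetical, and DateTime.ofEpochMilli is assumed as the factory on co.elastic.clients.util.DateTime.

```java
import co.elastic.clients.elasticsearch.ml.TrainedModelInferenceStats;
import co.elastic.clients.util.DateTime;

public class OfExample {
    public static void main(String[] args) {
        // Build the stats object with the lambda-based builder shorthand.
        // All values are hypothetical; every Required property must be set,
        // otherwise the builder throws on build.
        TrainedModelInferenceStats stats = TrainedModelInferenceStats.of(b -> b
                .inferenceCount(128)
                .cacheMissCount(3)
                .failureCount(0)
                .missingAllFieldsCount(1)
                // DateTime.ofEpochMilli is assumed here for the Required timestamp
                .timestamp(DateTime.ofEpochMilli(1_700_000_000_000L)));

        System.out.println(stats.inferenceCount()); // prints 128
    }
}
```
In application code these objects are normally read from a stats response rather than built by hand; the builder form is mainly useful in tests and fixtures.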
cacheMissCount
public final int cacheMissCount()
Required - The number of times the model was loaded for inference and was not retrieved from the cache. If this number is close to the inference_count, the cache is not being appropriately used. This can be solved by increasing the cache size or its time-to-live (TTL). Refer to general machine learning settings for the appropriate settings.
API name: cache_miss_count
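The "close to inference_count" check above can be sketched with plain arithmetic (illustrative only; cacheMissRatio is a hypothetical helper, not a method of this class):

```java
public class CacheMissRatio {
    // Fraction of inference calls that missed the model cache.
    // A ratio near 1.0 suggests increasing the cache size or its TTL.
    static double cacheMissRatio(int cacheMissCount, int inferenceCount) {
        if (inferenceCount == 0) {
            return 0.0; // no inference calls yet, nothing to measure
        }
        return (double) cacheMissCount / inferenceCount;
    }

    public static void main(String[] args) {
        // Hypothetical counts, e.g. from cacheMissCount() and inferenceCount()
        System.out.println(cacheMissRatio(120, 128)); // prints 0.9375
    }
}
```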
-
failureCount
public final int failureCount()
Required - The number of failures when using the model for inference.
API name: failure_count
-
inferenceCount
public final int inferenceCount()
Required - The total number of times the model has been called for inference. This is across all inference contexts, including all pipelines.
API name: inference_count
-
missingAllFieldsCount
public final int missingAllFieldsCount()
Required - The number of inference calls where all the training features for the model were missing.
API name: missing_all_fields_count
-
timestamp
public final DateTime timestamp()
Required - The time when the statistics were last updated.
API name: timestamp
-
serialize
public void serialize(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
Serialize this object to JSON.
- Specified by:
serialize in interface JsonpSerializable
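A hedged sketch of writing the object out as JSON, assuming the JacksonJsonpMapper from the client's optional Jackson integration; the stats values are hypothetical.

```java
import java.io.StringWriter;

import co.elastic.clients.elasticsearch.ml.TrainedModelInferenceStats;
import co.elastic.clients.json.jackson.JacksonJsonpMapper;
import co.elastic.clients.util.DateTime;
import jakarta.json.stream.JsonGenerator;

public class SerializeExample {
    public static void main(String[] args) {
        // Hypothetical stats object; DateTime.ofEpochMilli is an assumed factory.
        TrainedModelInferenceStats stats = TrainedModelInferenceStats.of(b -> b
                .inferenceCount(128)
                .cacheMissCount(3)
                .failureCount(0)
                .missingAllFieldsCount(1)
                .timestamp(DateTime.ofEpochMilli(1_700_000_000_000L)));

        // Serialize into a string using a generator from the mapper's JSON provider.
        JacksonJsonpMapper mapper = new JacksonJsonpMapper();
        StringWriter writer = new StringWriter();
        try (JsonGenerator generator = mapper.jsonProvider().createGenerator(writer)) {
            stats.serialize(generator, mapper);
        }
        // The output is a JSON object keyed by the API names documented above,
        // e.g. inference_count, cache_miss_count, failure_count.
        System.out.println(writer.toString());
    }
}
```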
-
serializeInternal
protected void serializeInternal(jakarta.json.stream.JsonGenerator generator, JsonpMapper mapper)
-
toString
public String toString()
- Overrides:
toString in class Object
-
setupTrainedModelInferenceStatsDeserializer
protected static void setupTrainedModelInferenceStatsDeserializer(ObjectDeserializer<TrainedModelInferenceStats.Builder> op)
-