Uses of Class
tensorflow.serving.Inference.MultiInferenceResponse
Packages that use Inference.MultiInferenceResponse:
  tensorflow.serving

Uses of Inference.MultiInferenceResponse in tensorflow.serving
Methods in tensorflow.serving that return Inference.MultiInferenceResponse:

  Inference.MultiInferenceResponse.Builder.build()
  Inference.MultiInferenceResponse.Builder.buildPartial()
  static Inference.MultiInferenceResponse.getDefaultInstance()
  Inference.MultiInferenceResponse.Builder.getDefaultInstanceForType()
  Inference.MultiInferenceResponse.getDefaultInstanceForType()
  PredictionLogOuterClass.MultiInferenceLog.Builder.getResponse()
      .tensorflow.serving.MultiInferenceResponse response = 2;
  PredictionLogOuterClass.MultiInferenceLog.getResponse()
      .tensorflow.serving.MultiInferenceResponse response = 2;
  PredictionLogOuterClass.MultiInferenceLogOrBuilder.getResponse()
      .tensorflow.serving.MultiInferenceResponse response = 2;
  PredictionServiceGrpc.PredictionServiceBlockingStub.multiInference(Inference.MultiInferenceRequest request)
      MultiInference API for multi-headed models.
  PredictionServiceGrpc.PredictionServiceBlockingV2Stub.multiInference(Inference.MultiInferenceRequest request)
      MultiInference API for multi-headed models.
  static Inference.MultiInferenceResponse.parseDelimitedFrom(InputStream input)
  static Inference.MultiInferenceResponse.parseDelimitedFrom(InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
  static Inference.MultiInferenceResponse.parseFrom(byte[] data)
  static Inference.MultiInferenceResponse.parseFrom(byte[] data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
  static Inference.MultiInferenceResponse.parseFrom(com.google.protobuf.ByteString data)
  static Inference.MultiInferenceResponse.parseFrom(com.google.protobuf.ByteString data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
  static Inference.MultiInferenceResponse.parseFrom(com.google.protobuf.CodedInputStream input)
  static Inference.MultiInferenceResponse.parseFrom(com.google.protobuf.CodedInputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
  static Inference.MultiInferenceResponse.parseFrom(InputStream input)
  static Inference.MultiInferenceResponse.parseFrom(InputStream input, com.google.protobuf.ExtensionRegistryLite extensionRegistry)
  static Inference.MultiInferenceResponse.parseFrom(ByteBuffer data)
  static Inference.MultiInferenceResponse.parseFrom(ByteBuffer data, com.google.protobuf.ExtensionRegistryLite extensionRegistry)

Methods in tensorflow.serving that return types with arguments of type Inference.MultiInferenceResponse:

  static io.grpc.MethodDescriptor<Inference.MultiInferenceRequest, Inference.MultiInferenceResponse>
      PredictionServiceGrpc.getMultiInferenceMethod()
  com.google.protobuf.Parser<Inference.MultiInferenceResponse>
      Inference.MultiInferenceResponse.getParserForType()
  com.google.common.util.concurrent.ListenableFuture<Inference.MultiInferenceResponse>
      PredictionServiceGrpc.PredictionServiceFutureStub.multiInference(Inference.MultiInferenceRequest request)
      MultiInference API for multi-headed models.
  static com.google.protobuf.Parser<Inference.MultiInferenceResponse>
      Inference.MultiInferenceResponse.parser()

Methods in tensorflow.serving with parameters of type Inference.MultiInferenceResponse:

  Inference.MultiInferenceResponse.Builder.mergeFrom(Inference.MultiInferenceResponse other)
  PredictionLogOuterClass.MultiInferenceLog.Builder.mergeResponse(Inference.MultiInferenceResponse value)
      .tensorflow.serving.MultiInferenceResponse response = 2;
  static Inference.MultiInferenceResponse.newBuilder(Inference.MultiInferenceResponse prototype)
  PredictionLogOuterClass.MultiInferenceLog.Builder.setResponse(Inference.MultiInferenceResponse value)
      .tensorflow.serving.MultiInferenceResponse response = 2;

Method parameters in tensorflow.serving with type arguments of type Inference.MultiInferenceResponse:

  default void PredictionServiceGrpc.AsyncService.multiInference(Inference.MultiInferenceRequest request, io.grpc.stub.StreamObserver<Inference.MultiInferenceResponse> responseObserver)
      MultiInference API for multi-headed models.
  void PredictionServiceGrpc.PredictionServiceStub.multiInference(Inference.MultiInferenceRequest request, io.grpc.stub.StreamObserver<Inference.MultiInferenceResponse> responseObserver)
      MultiInference API for multi-headed models.
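As a sketch of how these methods fit together, the following example calls the blocking MultiInference RPC and then round-trips the response through its wire format with parseFrom. The server address ("localhost:8500") and the empty request are assumptions for illustration; a real request would populate its tasks and input, and the code requires the tensorflow-serving-api and grpc-java dependencies on the classpath.

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import tensorflow.serving.Inference;
import tensorflow.serving.PredictionServiceGrpc;

public class MultiInferenceExample {
    public static void main(String[] args) throws Exception {
        // Assumed Model Server address; adjust for your deployment.
        ManagedChannel channel = ManagedChannelBuilder
                .forTarget("localhost:8500")
                .usePlaintext()
                .build();
        try {
            PredictionServiceGrpc.PredictionServiceBlockingStub stub =
                    PredictionServiceGrpc.newBlockingStub(channel);

            // Minimal, illustrative request; a real one would set the
            // per-head InferenceTask entries and the input examples.
            Inference.MultiInferenceRequest request =
                    Inference.MultiInferenceRequest.newBuilder().build();

            // Blocking call to the MultiInference API for multi-headed models.
            Inference.MultiInferenceResponse response = stub.multiInference(request);

            // The response can be serialized and re-parsed via the
            // static parseFrom overloads listed above.
            byte[] wire = response.toByteArray();
            Inference.MultiInferenceResponse reparsed =
                    Inference.MultiInferenceResponse.parseFrom(wire);
            System.out.println(reparsed.getResultsCount() + " result(s)");
        } finally {
            channel.shutdownNow();
        }
    }
}
```

For non-blocking use, the same call is available on PredictionServiceFutureStub (returning a ListenableFuture) and on PredictionServiceStub (taking a StreamObserver), as listed above.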