Package dev.langchain4j.model
Interface StreamingResponseHandler
public interface StreamingResponseHandler<T>
Represents a handler for streaming responses from a language model. The handler is invoked each time the model generates a new token in a textual response. If the model executes a tool instead of generating text, the complete response, including the tool execution request(s), is delivered via onComplete.
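A minimal usage sketch is shown below. It assumes a configured StreamingChatLanguageModel is available; the createModel() helper is purely illustrative and not part of this API.

    import dev.langchain4j.data.message.AiMessage;
    import dev.langchain4j.model.StreamingResponseHandler;
    import dev.langchain4j.model.chat.StreamingChatLanguageModel;
    import dev.langchain4j.model.output.Response;

    public class StreamingExample {

        public static void main(String[] args) {
            // Assumption: createModel() returns a concrete streaming model
            // (e.g. an OpenAI-backed implementation); it is not part of this interface.
            StreamingChatLanguageModel model = createModel();

            model.generate("Tell me a joke", new StreamingResponseHandler<AiMessage>() {

                @Override
                public void onNext(String token) {
                    // Called for each newly generated token of the textual response.
                    System.out.print(token);
                }

                @Override
                public void onComplete(Response<AiMessage> response) {
                    // Called once, when the model has finished streaming.
                    System.out.println("\nComplete: " + response);
                }

                @Override
                public void onError(Throwable error) {
                    // Called if an error occurs during streaming.
                    error.printStackTrace();
                }
            });
        }

        private static StreamingChatLanguageModel createModel() {
            // Placeholder for a real implementation such as an OpenAI streaming chat model.
            throw new UnsupportedOperationException("configure a concrete model here");
        }
    }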
Method Summary
Modifier and Type    Method                              Description
abstract void        onNext(String token)                Invoked each time the language model generates a new token in a textual response.
void                 onComplete(Response<T> response)    Invoked when the language model has finished streaming a response.
abstract void        onError(Throwable error)            This method is invoked when an error occurs during streaming.
Method Detail
onNext
abstract void onNext(String token)
Invoked each time the language model generates a new token in a textual response. If the model executes a tool instead, this method is not invoked; the complete response is delivered via onComplete.
Parameters:
token - The newly generated token, which is a part of the complete response.
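Because each call delivers only a fragment, callers typically accumulate tokens. A sketch of such a handler follows; the CollectingHandler class is illustrative and not part of langchain4j.

    import dev.langchain4j.data.message.AiMessage;
    import dev.langchain4j.model.StreamingResponseHandler;
    import dev.langchain4j.model.output.Response;

    import java.util.concurrent.CompletableFuture;

    // Accumulates streamed tokens and exposes the final response as a future.
    class CollectingHandler implements StreamingResponseHandler<AiMessage> {

        private final StringBuilder text = new StringBuilder();
        final CompletableFuture<Response<AiMessage>> whenDone = new CompletableFuture<>();

        @Override
        public void onNext(String token) {
            text.append(token); // each token is only a part of the complete response
        }

        @Override
        public void onComplete(Response<AiMessage> response) {
            whenDone.complete(response);
        }

        @Override
        public void onError(Throwable error) {
            whenDone.completeExceptionally(error);
        }
    }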
onComplete
void onComplete(Response<T> response)
Invoked when the language model has finished streaming a response. If the model executed one or multiple tools, the corresponding tool execution requests are accessible via toolExecutionRequests (see the sketch below).
Parameters:
response - The complete response generated by the language model.
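A sketch of an onComplete implementation that checks for tool execution requests, intended to sit inside a StreamingResponseHandler<AiMessage> implementation such as the one shown earlier. The AiMessage accessors used here (content(), hasToolExecutionRequests(), toolExecutionRequests(), text(), name()) are assumptions based on the description above, not guaranteed by this interface.

    @Override
    public void onComplete(Response<AiMessage> response) {
        AiMessage aiMessage = response.content();
        if (aiMessage.hasToolExecutionRequests()) {
            // The model decided to call one or more tools; inspect the requests.
            aiMessage.toolExecutionRequests()
                    .forEach(request -> System.out.println("Tool requested: " + request.name()));
        } else {
            // Plain textual answer.
            System.out.println(aiMessage.text());
        }
    }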
onError
abstract void onError(Throwable error)
This method is invoked when an error occurs during streaming.
Parameters:
error - The error that occurred.