Package dev.langchain4j.model.openai
Class OpenAiStreamingLanguageModel
java.lang.Object
dev.langchain4j.model.openai.OpenAiStreamingLanguageModel
All Implemented Interfaces:
StreamingLanguageModel, TokenCountEstimator
public class OpenAiStreamingLanguageModel
extends Object
implements StreamingLanguageModel, TokenCountEstimator
Represents an OpenAI language model with a completion interface, such as gpt-3.5-turbo-instruct. The model's response is streamed token by token and should be handled with StreamingResponseHandler. However, it is recommended to use OpenAiStreamingChatModel instead, as it offers more advanced features such as function calling and multi-turn conversations.
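A minimal usage sketch, assuming the API key is supplied through an OPENAI_API_KEY environment variable and that StreamingResponseHandler exposes the onNext, onComplete and onError callbacks together with the dev.langchain4j.model.output.Response type (callback names and packages may differ between langchain4j versions):

import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingLanguageModel;
import dev.langchain4j.model.output.Response;

public class StreamingCompletionExample {

    public static void main(String[] args) {

        // withApiKey(String) is listed in the method summary below; it creates a model with default settings.
        OpenAiStreamingLanguageModel model =
                OpenAiStreamingLanguageModel.withApiKey(System.getenv("OPENAI_API_KEY"));

        // generate(...) returns immediately; tokens arrive asynchronously through the handler.
        model.generate("Write a haiku about streaming APIs.", new StreamingResponseHandler<String>() {

            @Override
            public void onNext(String token) {
                System.out.print(token); // called once per streamed token
            }

            @Override
            public void onComplete(Response<String> response) {
                System.out.println();
                System.out.println("Finish reason: " + response.finishReason());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}

Because generate returns before streaming finishes, a real application typically blocks on a latch or a future until onComplete fires; a sketch of that pattern is shown under the generate method details below.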
Nested Class Summary

Nested Classes
static class  OpenAiStreamingLanguageModel.OpenAiStreamingLanguageModelBuilder
Constructor Summary

Constructors
Method Summary

static OpenAiStreamingLanguageModel.OpenAiStreamingLanguageModelBuilder builder()
int estimateTokenCount(String prompt)
void generate(String prompt, StreamingResponseHandler<String> handler)
static OpenAiStreamingLanguageModel withApiKey(String apiKey)

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface dev.langchain4j.model.language.StreamingLanguageModel
generate

Methods inherited from interface dev.langchain4j.model.language.TokenCountEstimator
estimateTokenCount, estimateTokenCount
Constructor Details

OpenAiStreamingLanguageModel
Method Details

generate
public void generate(String prompt, StreamingResponseHandler<String> handler)
Specified by:
generate in interface StreamingLanguageModel
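Because the handler-based API is asynchronous, callers that need the whole completion can bridge the callbacks to a CompletableFuture. A sketch under the same assumptions as the example above; BlockingGenerate is a hypothetical helper, not part of the library:

import java.util.concurrent.CompletableFuture;

import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingLanguageModel;
import dev.langchain4j.model.output.Response;

final class BlockingGenerate {

    // Hypothetical helper: completes a future once the streamed response is finished.
    static CompletableFuture<Response<String>> generate(OpenAiStreamingLanguageModel model, String prompt) {

        CompletableFuture<Response<String>> future = new CompletableFuture<>();

        model.generate(prompt, new StreamingResponseHandler<String>() {

            @Override
            public void onNext(String token) {
                // per-token callback; tokens could be forwarded to a UI here
            }

            @Override
            public void onComplete(Response<String> response) {
                future.complete(response); // the full text is available via response.content()
            }

            @Override
            public void onError(Throwable error) {
                future.completeExceptionally(error);
            }
        });

        return future;
    }
}

A caller can then obtain the full text with BlockingGenerate.generate(model, prompt).join().content().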
estimateTokenCount
public int estimateTokenCount(String prompt)
Specified by:
estimateTokenCount in interface TokenCountEstimator
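A small sketch of using the estimate as a pre-flight check before sending a prompt; the 4096-token limit is an assumed value for illustration, and the count itself is an approximation produced by the model's tokenizer:

import dev.langchain4j.model.openai.OpenAiStreamingLanguageModel;

final class TokenBudgetCheck {

    // Assumed context limit, for illustration only; the real limit depends on the model in use.
    private static final int MAX_PROMPT_TOKENS = 4096;

    static void requireWithinBudget(OpenAiStreamingLanguageModel model, String prompt) {
        int estimated = model.estimateTokenCount(prompt);
        if (estimated > MAX_PROMPT_TOKENS) {
            throw new IllegalArgumentException("Prompt is too long: roughly " + estimated + " tokens");
        }
    }
}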
withApiKey
public static OpenAiStreamingLanguageModel withApiKey(String apiKey)
builder
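A construction sketch using builder(); the setter names shown (apiKey, modelName, temperature, timeout) follow the usual langchain4j builder conventions but are assumptions here and may differ between versions:

import java.time.Duration;

import dev.langchain4j.model.openai.OpenAiStreamingLanguageModel;

final class ModelFactory {

    static OpenAiStreamingLanguageModel create() {
        // Builder setters are assumed; see the nested builder class for the options actually available.
        return OpenAiStreamingLanguageModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-3.5-turbo-instruct")
                .temperature(0.2)
                .timeout(Duration.ofSeconds(60))
                .build();
    }
}

Compared with withApiKey(String), the builder lets the model name, sampling parameters, and timeouts be configured explicitly.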