Package dev.langchain4j.model.openai
Class OpenAiStreamingLanguageModel
java.lang.Object
dev.langchain4j.model.openai.OpenAiStreamingLanguageModel
All Implemented Interfaces:
    StreamingLanguageModel, TokenCountEstimator
public class OpenAiStreamingLanguageModel
extends Object
implements StreamingLanguageModel, TokenCountEstimator
Represents an OpenAI language model with a completion interface, such as gpt-3.5-turbo-instruct.
The model's response is streamed token by token and should be handled with
StreamingResponseHandler.
However, it's recommended to use OpenAiStreamingChatModel instead,
as it offers more advanced features such as function calling and multi-turn conversations.
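Below is a minimal sketch of streaming a completion with this model. It assumes the API key is supplied via the OPENAI_API_KEY environment variable and that the builder exposes apiKey and modelName options (option names may vary between versions); the callbacks follow the onNext/onComplete/onError shape of StreamingResponseHandler, and the CountDownLatch only keeps the example's main thread alive until streaming finishes, since generate returns immediately.

import dev.langchain4j.model.StreamingResponseHandler;
import dev.langchain4j.model.openai.OpenAiStreamingLanguageModel;
import dev.langchain4j.model.output.Response;

import java.util.concurrent.CountDownLatch;

public class StreamingCompletionExample {

    public static void main(String[] args) throws InterruptedException {

        // Assumed builder options; adjust to whatever your version exposes.
        OpenAiStreamingLanguageModel model = OpenAiStreamingLanguageModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-3.5-turbo-instruct")
                .build();

        CountDownLatch latch = new CountDownLatch(1);

        model.generate("Write a haiku about streaming APIs.", new StreamingResponseHandler<String>() {

            @Override
            public void onNext(String token) {
                // Invoked for each token as it arrives from the model.
                System.out.print(token);
            }

            @Override
            public void onComplete(Response<String> response) {
                // Invoked once the whole completion has been received.
                System.out.println();
                latch.countDown();
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
                latch.countDown();
            }
        });

        // generate() is asynchronous; wait for the stream to finish before exiting.
        latch.await();
    }
}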
Nested Class Summary
Nested Classes
Modifier and Type    Class    Description

static class
    OpenAiStreamingLanguageModel.OpenAiStreamingLanguageModelBuilder

Constructor Summary
Constructors

Method Summary
Modifier and Type    Method    Description

static OpenAiStreamingLanguageModel.OpenAiStreamingLanguageModelBuilder
    builder()
int
    estimateTokenCount(String prompt)
void
    generate(String prompt, StreamingResponseHandler<String> handler)
static OpenAiStreamingLanguageModel
    withApiKey(String apiKey)

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface dev.langchain4j.model.language.StreamingLanguageModel
generate

Methods inherited from interface dev.langchain4j.model.language.TokenCountEstimator
estimateTokenCount, estimateTokenCount

Constructor Details

OpenAiStreamingLanguageModel

Method Details

generate
public void generate(String prompt, StreamingResponseHandler<String> handler)
Specified by:
    generate in interface StreamingLanguageModel

estimateTokenCount
public int estimateTokenCount(String prompt)
Specified by:
    estimateTokenCount in interface TokenCountEstimator
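A short sketch of estimating a prompt's token count before sending it. The value is an estimate produced by the model's tokenizer and may differ slightly from what the OpenAI API reports; the builder options shown are assumptions and may vary between versions.

// Assumes OPENAI_API_KEY is set in the environment.
OpenAiStreamingLanguageModel model = OpenAiStreamingLanguageModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-3.5-turbo-instruct")
        .build();

String prompt = "Summarize the plot of Hamlet in one sentence.";
int tokens = model.estimateTokenCount(prompt);
System.out.println("Estimated prompt tokens: " + tokens);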

withApiKey
public static OpenAiStreamingLanguageModel withApiKey(String apiKey)
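A one-line sketch of this convenience factory, assuming the key is read from an environment variable; it is intended for the common case where only the API key needs to be set and every other setting keeps its default.

OpenAiStreamingLanguageModel model =
        OpenAiStreamingLanguageModel.withApiKey(System.getenv("OPENAI_API_KEY"));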

builder
public static OpenAiStreamingLanguageModel.OpenAiStreamingLanguageModelBuilder builder()
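A sketch of the builder when finer-grained configuration is needed; temperature, timeout, and logRequests are assumed builder options here and may differ between versions (timeout takes a java.time.Duration).

OpenAiStreamingLanguageModel model = OpenAiStreamingLanguageModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-3.5-turbo-instruct")
        .temperature(0.2)                         // lower temperature for more deterministic output
        .timeout(java.time.Duration.ofSeconds(30)) // fail the request if no response within 30 s
        .logRequests(true)                         // log outgoing requests for debugging
        .build();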