Package dev.langchain4j.model.openai
Class OpenAiChatModel
java.lang.Object
dev.langchain4j.model.openai.OpenAiChatModel
- All Implemented Interfaces:
dev.langchain4j.model.chat.ChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
public class OpenAiChatModel
extends Object
implements dev.langchain4j.model.chat.ChatLanguageModel, dev.langchain4j.model.chat.TokenCountEstimator
Represents an OpenAI language model with a chat completion interface, such as gpt-3.5-turbo and gpt-4.
You can find a description of the parameters in the OpenAI chat completions API reference.
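A minimal quick-start sketch (the prompt is illustrative; withApiKey is assumed to apply the library defaults for everything except the API key):

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.model.output.Response;

import java.util.List;

public class OpenAiChatModelQuickStart {
    public static void main(String[] args) {
        // Build a model with default settings; the API key is read from an environment variable here.
        OpenAiChatModel model = OpenAiChatModel.withApiKey(System.getenv("OPENAI_API_KEY"));

        // Send a single user message and print the generated reply.
        List<ChatMessage> messages = List.of(UserMessage.from("Say hello in one sentence."));
        Response<AiMessage> response = model.generate(messages);
        System.out.println(response.content().text());
    }
}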
-
Nested Class Summary
Nested Classes
Constructor Summary
Constructors
OpenAiChatModel(String baseUrl, String apiKey, String organizationId, String modelName, Double temperature, Double topP, List<String> stop, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias, String responseFormat, Integer seed, String user, Duration timeout, Integer maxRetries, Proxy proxy, Boolean logRequests, Boolean logResponses, dev.langchain4j.model.Tokenizer tokenizer)
Method Summary
builder()
int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages)
dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages)
dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification)
dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications)
static OpenAiChatModel withApiKey(String apiKey)
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.ChatLanguageModel
generate, generate
Methods inherited from interface dev.langchain4j.model.chat.TokenCountEstimator
estimateTokenCount, estimateTokenCount, estimateTokenCount, estimateTokenCount
-
Constructor Details
-
OpenAiChatModel
public OpenAiChatModel(String baseUrl, String apiKey, String organizationId, String modelName, Double temperature, Double topP, List<String> stop, Integer maxTokens, Double presencePenalty, Double frequencyPenalty, Map<String, Integer> logitBias, String responseFormat, Integer seed, String user, Duration timeout, Integer maxRetries, Proxy proxy, Boolean logRequests, Boolean logResponses, dev.langchain4j.model.Tokenizer tokenizer)
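Most settings are optional; the builder() and withApiKey(String) factories below are the usual way to construct the model. A hedged sketch of calling the constructor directly, in the parameter order shown above (passing null for a setting is assumed to fall back to the library default):

import dev.langchain4j.model.openai.OpenAiChatModel;
import java.time.Duration;

OpenAiChatModel model = new OpenAiChatModel(
        null,                               // baseUrl (null: default OpenAI endpoint, assumed)
        System.getenv("OPENAI_API_KEY"),    // apiKey
        null,                               // organizationId
        "gpt-3.5-turbo",                    // modelName
        0.7,                                // temperature
        null, null, null,                   // topP, stop, maxTokens
        null, null, null,                   // presencePenalty, frequencyPenalty, logitBias
        null, null, null,                   // responseFormat, seed, user
        Duration.ofSeconds(60),             // timeout
        3,                                  // maxRetries
        null,                               // proxy
        false, false,                       // logRequests, logResponses
        null);                              // tokenizer (null: default, assumed)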
-
-
Method Details
-
modelName
-
generate
public dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages) - Specified by:
generate in interface dev.langchain4j.model.chat.ChatLanguageModel
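A sketch of a multi-message call, assuming model is an OpenAiChatModel built as in the examples above; the Response accessors used (content, tokenUsage, finishReason) are the standard langchain4j response API:

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.output.Response;

import java.util.List;

List<ChatMessage> messages = List.of(
        SystemMessage.from("You are a terse assistant."),
        UserMessage.from("What is the capital of France?"));

Response<AiMessage> response = model.generate(messages);

System.out.println(response.content().text());   // the assistant's reply
System.out.println(response.tokenUsage());       // prompt/completion/total token counts
System.out.println(response.finishReason());     // e.g. STOP or LENGTH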
-
generate
public dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages, List<dev.langchain4j.agent.tool.ToolSpecification> toolSpecifications) - Specified by:
generate in interface dev.langchain4j.model.chat.ChatLanguageModel
-
generate
public dev.langchain4j.model.output.Response<dev.langchain4j.data.message.AiMessage> generate(List<dev.langchain4j.data.message.ChatMessage> messages, dev.langchain4j.agent.tool.ToolSpecification toolSpecification) - Specified by:
generate in interface dev.langchain4j.model.chat.ChatLanguageModel
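A hedged sketch of tool calling, covering both this overload and the List<ToolSpecification> one; the ToolSpecification builder and the JsonSchemaProperty-based parameter schema shown are assumptions about this version of dev.langchain4j.agent.tool, so check your version for the exact schema methods:

import dev.langchain4j.agent.tool.JsonSchemaProperty;
import dev.langchain4j.agent.tool.ToolSpecification;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.output.Response;

import java.util.List;

// Describe a tool the model may ask to call (name and schema are illustrative).
ToolSpecification weatherTool = ToolSpecification.builder()
        .name("get_weather")
        .description("Returns the current weather for a city")
        .addParameter("city", JsonSchemaProperty.STRING)
        .build();

List<ChatMessage> messages = List.of(UserMessage.from("What is the weather in Paris?"));

// Single-specification overload; generate(messages, List.of(weatherTool)) behaves the same way.
Response<AiMessage> response = model.generate(messages, weatherTool);

AiMessage aiMessage = response.content();
if (aiMessage.toolExecutionRequests() != null) {
    // The model asked for a tool call instead of answering directly.
    aiMessage.toolExecutionRequests().forEach(request ->
            System.out.println(request.name() + " " + request.arguments()));
}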
-
estimateTokenCount
public int estimateTokenCount(List<dev.langchain4j.data.message.ChatMessage> messages) - Specified by:
estimateTokenCount in interface dev.langchain4j.model.chat.TokenCountEstimator
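A sketch of estimating prompt size before sending a request, assuming model is built as above; the result comes from the configured Tokenizer, so treat it as an estimate rather than an exact billing figure:

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.UserMessage;

import java.util.List;

List<ChatMessage> messages = List.of(UserMessage.from("Summarize the following document ..."));

int estimatedTokens = model.estimateTokenCount(messages);
if (estimatedTokens > 3_000) {
    // Illustrative guard: trim or split the prompt before calling generate(...).
    System.out.println("Prompt is large: ~" + estimatedTokens + " tokens");
}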
-
withApiKey
public static OpenAiChatModel withApiKey(String apiKey)
-
builder
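A builder sketch; the setter names are assumed to mirror the constructor parameters above (apiKey, modelName, temperature, maxTokens, timeout, maxRetries, logRequests, logResponses, ...), which is how the langchain4j builders are typically shaped, so verify against your version:

import dev.langchain4j.model.openai.OpenAiChatModel;
import java.time.Duration;

OpenAiChatModel model = OpenAiChatModel.builder()
        .apiKey(System.getenv("OPENAI_API_KEY"))
        .modelName("gpt-4")
        .temperature(0.2)
        .maxTokens(500)
        .timeout(Duration.ofSeconds(60))
        .maxRetries(2)
        .logRequests(true)
        .logResponses(true)
        .build();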
-