Package dev.langchain4j.model.ollama
Class OllamaStreamingChatModel.OllamaStreamingChatModelBuilder
java.lang.Object
dev.langchain4j.model.ollama.OllamaStreamingChatModel.OllamaStreamingChatModelBuilder
Enclosing class: OllamaStreamingChatModel
-
Constructor Summary
Constructors
OllamaStreamingChatModelBuilder()
Method Summary
build()
customHeaders(Map<String, String> customHeaders)
format(String format) - Deprecated. Use responseFormat(ResponseFormat) instead.
httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder httpClientBuilder) - timeout(Duration) overrides timeouts set on the HttpClientBuilder.
logRequests(Boolean logRequests)
logResponses(Boolean logResponses)
numPredict(Integer numPredict)
repeatPenalty(Double repeatPenalty)
responseFormat(dev.langchain4j.model.chat.request.ResponseFormat responseFormat)
supportedCapabilities(dev.langchain4j.model.chat.Capability... supportedCapabilities)
supportedCapabilities(Set<dev.langchain4j.model.chat.Capability> supportedCapabilities)
temperature(Double temperature)
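A minimal construction sketch using the builder methods listed above and the public constructor documented below. The base URL, model name, and parameter values are illustrative assumptions, not documented defaults:

import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

import java.time.Duration;

public class BuilderExample {

    public static void main(String[] args) {
        // Illustrative values only: base URL and model name depend on your Ollama setup.
        OllamaStreamingChatModel model = new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .baseUrl("http://localhost:11434")
                .modelName("llama3.1")
                .temperature(0.2)
                .numPredict(256)                    // cap on the number of generated tokens
                .timeout(Duration.ofSeconds(60))
                .logRequests(true)
                .logResponses(true)
                .build();

        System.out.println("Model built: " + model);
    }
}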
-
Constructor Details
-
OllamaStreamingChatModelBuilder
public OllamaStreamingChatModelBuilder()
-
Method Details
-
httpClientBuilder
public OllamaStreamingChatModel.OllamaStreamingChatModelBuilder httpClientBuilder(dev.langchain4j.http.client.HttpClientBuilder httpClientBuilder)
Sets the HttpClientBuilder used for HTTP communication with the Ollama server. Note that timeout(Duration) overrides any timeouts set on the HttpClientBuilder.
Parameters:
httpClientBuilder -
Returns:
the builder instance
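A sketch of wiring in a custom HttpClientBuilder. How the HttpClientBuilder instance is obtained depends on which langchain4j HTTP client module is on the classpath, so it is taken as a parameter here; per the note above, timeout(Duration) overrides any timeouts configured on it:

import dev.langchain4j.http.client.HttpClientBuilder;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

import java.time.Duration;

public class HttpClientBuilderExample {

    // The HttpClientBuilder is passed in rather than constructed, because the concrete
    // implementation depends on the HTTP client module in use.
    static OllamaStreamingChatModel buildModel(HttpClientBuilder httpClientBuilder) {
        return new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .httpClientBuilder(httpClientBuilder)
                .timeout(Duration.ofSeconds(30))   // overrides timeouts set on the HttpClientBuilder
                .baseUrl("http://localhost:11434") // illustrative
                .modelName("llama3.1")             // illustrative
                .build();
    }
}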
-
baseUrl
-
modelName
-
temperature
-
topK
-
topP
-
repeatPenalty
-
seed
-
numPredict
-
numCtx
-
stop
-
format
Deprecated. Please use responseFormat(ResponseFormat) instead, for example: responseFormat(ResponseFormat.JSON).
Rather than plain JSON mode, consider using structured outputs with a JSON schema; see the documentation for more information.
responseFormat
public OllamaStreamingChatModel.OllamaStreamingChatModelBuilder responseFormat(dev.langchain4j.model.chat.request.ResponseFormat responseFormat)
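A sketch of the migration suggested by the deprecation note above, switching from the deprecated format setter to responseFormat(ResponseFormat.JSON). Base URL and model name are illustrative:

import dev.langchain4j.model.chat.request.ResponseFormat;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

public class ResponseFormatExample {

    public static void main(String[] args) {
        OllamaStreamingChatModel model = new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .baseUrl("http://localhost:11434")   // illustrative
                .modelName("llama3.1")               // illustrative
                .responseFormat(ResponseFormat.JSON) // replaces the deprecated format(...) setter
                .build();
    }
}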
timeout
-
customHeaders
public OllamaStreamingChatModel.OllamaStreamingChatModelBuilder customHeaders(Map<String, String> customHeaders)
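A sketch of supplying custom HTTP headers; the header name and value are placeholders:

import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

import java.util.Map;

public class CustomHeadersExample {

    public static void main(String[] args) {
        OllamaStreamingChatModel model = new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .baseUrl("http://localhost:11434")                        // illustrative
                .modelName("llama3.1")                                    // illustrative
                .customHeaders(Map.of("Authorization", "Bearer <token>")) // placeholder header
                .build();
    }
}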
logRequests
-
logResponses
-
listeners
public OllamaStreamingChatModel.OllamaStreamingChatModelBuilder listeners(List<dev.langchain4j.model.chat.listener.ChatModelListener> listeners)
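A sketch of registering chat model listeners. The ChatModelListener instance is assumed to exist and is passed in; implement its callbacks as needed:

import dev.langchain4j.model.chat.listener.ChatModelListener;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

import java.util.List;

public class ListenersExample {

    // The listener is passed in rather than implemented here.
    static OllamaStreamingChatModel buildModel(ChatModelListener listener) {
        return new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .baseUrl("http://localhost:11434") // illustrative
                .modelName("llama3.1")             // illustrative
                .listeners(List.of(listener))
                .build();
    }
}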
supportedCapabilities
public OllamaStreamingChatModel.OllamaStreamingChatModelBuilder supportedCapabilities(Set<dev.langchain4j.model.chat.Capability> supportedCapabilities)
supportedCapabilities
public OllamaStreamingChatModel.OllamaStreamingChatModelBuilder supportedCapabilities(dev.langchain4j.model.chat.Capability... supportedCapabilities)
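A sketch using the varargs overload. The Capability constant shown is an assumption based on recent langchain4j releases; check the constants actually exposed by the Capability enum in your version:

import dev.langchain4j.model.chat.Capability;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

public class CapabilitiesExample {

    public static void main(String[] args) {
        OllamaStreamingChatModel model = new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .baseUrl("http://localhost:11434") // illustrative
                .modelName("llama3.1")             // illustrative
                // RESPONSE_FORMAT_JSON_SCHEMA is assumed to be available on the Capability enum
                .supportedCapabilities(Capability.RESPONSE_FORMAT_JSON_SCHEMA)
                .build();
    }
}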
build
-
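A sketch of using the built model for streaming. The chat(String, StreamingChatResponseHandler) call and the handler callbacks reflect the current langchain4j streaming API and may differ in older versions; base URL and model name are illustrative:

import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.chat.response.StreamingChatResponseHandler;
import dev.langchain4j.model.ollama.OllamaStreamingChatModel;

public class StreamingExample {

    public static void main(String[] args) {
        OllamaStreamingChatModel model = new OllamaStreamingChatModel.OllamaStreamingChatModelBuilder()
                .baseUrl("http://localhost:11434") // illustrative
                .modelName("llama3.1")             // illustrative
                .build();

        model.chat("Why is the sky blue?", new StreamingChatResponseHandler() {

            @Override
            public void onPartialResponse(String partialResponse) {
                System.out.print(partialResponse); // tokens arrive incrementally
            }

            @Override
            public void onCompleteResponse(ChatResponse completeResponse) {
                // Full response text is also available once streaming finishes.
                System.out.println("\nComplete: " + completeResponse.aiMessage().text());
            }

            @Override
            public void onError(Throwable error) {
                error.printStackTrace();
            }
        });
    }
}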