Package io.codemodder.plugins.llm

Class OpenAIService

java.lang.Object
    io.codemodder.plugins.llm.OpenAIService

A custom service class to wrap the OpenAIClient.
Method Summary

static OpenAIService fromAzureOpenAI(String token, String endpoint)
    Creates a new OpenAIService instance with the given Azure OpenAI token and endpoint.

static OpenAIService fromOpenAI(String token)
    Creates a new OpenAIService instance with the given OpenAI token.

String getJSONCompletion(List<com.azure.ai.openai.models.ChatRequestMessage> messages, Model modelOrDeploymentName)
    Gets the completion for the given messages.

<T> T getResponseForPrompt(List<com.azure.ai.openai.models.ChatRequestMessage> messages, Model modelName, Class<T> responseType)
    Returns an object of the given type based on the completion for the given messages.
Method Details
fromOpenAI

public static OpenAIService fromOpenAI(String token)

Creates a new OpenAIService instance with the given OpenAI token.

Parameters:
    token - the token to use
Returns:
    the new instance
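A minimal usage sketch for this factory method. The OPENAI_API_KEY environment variable name is an illustrative choice, not something the class mandates; reading the token from the environment simply avoids hard-coding a secret.

```java
import io.codemodder.plugins.llm.OpenAIService;

public class FromOpenAIExample {
  public static void main(String[] args) {
    // Illustrative: pull the token from an environment variable.
    String token = System.getenv("OPENAI_API_KEY");
    if (token == null || token.isBlank()) {
      throw new IllegalStateException("OPENAI_API_KEY is not set");
    }
    // Build a service backed by the public OpenAI API.
    OpenAIService service = OpenAIService.fromOpenAI(token);
  }
}
```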
fromAzureOpenAI

public static OpenAIService fromAzureOpenAI(String token, String endpoint)

Creates a new OpenAIService instance with the given Azure OpenAI token and endpoint.

Parameters:
    token - the token to use
    endpoint - the endpoint to use
Returns:
    the new instance
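The Azure variant works the same way but additionally needs the resource endpoint. A sketch, assuming both values are supplied via environment variables (the variable names and the endpoint shape are illustrative, not required by the class):

```java
import io.codemodder.plugins.llm.OpenAIService;

public class FromAzureOpenAIExample {
  public static void main(String[] args) {
    // Illustrative variable names; both values are deployment-specific.
    String token = System.getenv("AZURE_OPENAI_KEY");
    // Typically of the form https://<resource-name>.openai.azure.com/
    String endpoint = System.getenv("AZURE_OPENAI_ENDPOINT");
    if (token == null || endpoint == null) {
      throw new IllegalStateException("Azure OpenAI credentials are not configured");
    }
    // Build a service backed by an Azure OpenAI resource.
    OpenAIService service = OpenAIService.fromAzureOpenAI(token, endpoint);
  }
}
```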
getJSONCompletion

public String getJSONCompletion(List<com.azure.ai.openai.models.ChatRequestMessage> messages, Model modelOrDeploymentName)

Gets the completion for the given messages.

Parameters:
    messages - the messages
    modelOrDeploymentName - the model or deployment name
Returns:
    the completion
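A sketch of building the message list and requesting a JSON completion. ChatRequestSystemMessage and ChatRequestUserMessage are the Azure OpenAI SDK's concrete ChatRequestMessage subtypes; how a Model instance is obtained is project-specific, so that step is shown only as a commented placeholder rather than invented here.

```java
import com.azure.ai.openai.models.ChatRequestMessage;
import com.azure.ai.openai.models.ChatRequestSystemMessage;
import com.azure.ai.openai.models.ChatRequestUserMessage;
import io.codemodder.plugins.llm.OpenAIService;
import java.util.List;

public class JsonCompletionExample {
  public static void main(String[] args) {
    OpenAIService service = OpenAIService.fromOpenAI(System.getenv("OPENAI_API_KEY"));

    // One system turn instructing JSON output, one user turn with the task.
    List<ChatRequestMessage> messages =
        List.of(
            new ChatRequestSystemMessage("Respond only with a JSON object."),
            new ChatRequestUserMessage("Summarize the change as {\"summary\": \"...\"}"));

    // `model` must be a Model instance (a model or Azure deployment name)
    // appropriate to your setup; constructing one is elided here.
    // String json = service.getJSONCompletion(messages, model);
  }
}
```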
getResponseForPrompt

public <T> T getResponseForPrompt(List<com.azure.ai.openai.models.ChatRequestMessage> messages, Model modelName, Class<T> responseType) throws IOException

Returns an object of the given type based on the completion for the given messages.

Parameters:
    messages - the messages
    modelName - the model name
    responseType - the type of object to return
Returns:
    an object of the given type based on the completion
Throws:
    IOException
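A sketch of the typed variant. The Analysis record here is hypothetical: its fields must match whatever JSON shape the prompt instructs the model to produce, since the completion is deserialized into responseType. As above, obtaining a Model instance is project-specific and left as a commented placeholder.

```java
import com.azure.ai.openai.models.ChatRequestMessage;
import com.azure.ai.openai.models.ChatRequestSystemMessage;
import com.azure.ai.openai.models.ChatRequestUserMessage;
import io.codemodder.plugins.llm.OpenAIService;
import java.io.IOException;
import java.util.List;

public class TypedResponseExample {
  // Hypothetical response shape; field names must line up with the JSON
  // the prompt asks the model to emit.
  record Analysis(String summary, boolean vulnerable) {}

  public static void main(String[] args) throws IOException {
    OpenAIService service = OpenAIService.fromOpenAI(System.getenv("OPENAI_API_KEY"));

    List<ChatRequestMessage> messages =
        List.of(
            new ChatRequestSystemMessage(
                "Answer as JSON: {\"summary\": string, \"vulnerable\": boolean}"),
            new ChatRequestUserMessage("Does this snippet contain a vulnerability? ..."));

    // `model` must be a Model instance appropriate to your setup.
    // Analysis result = service.getResponseForPrompt(messages, model, Analysis.class);
    // result.summary() and result.vulnerable() would then be populated
    // from the model's JSON reply.
  }
}
```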