Class LangchainModelHandler

java.lang.Object
com.langchainbeam.LangchainModelHandler
All Implemented Interfaces:
Serializable

public class LangchainModelHandler extends Object implements Serializable
A handler class for managing LangChain model options and instruction prompts. This class is used to configure the model options (e.g., model name, temperature) and the instruction prompt that is passed to the model for inference.

The handler encapsulates the LangchainModelOptions and the instruction prompt, which are necessary to interact with LangChain's model provider interface. The handler is designed to be used in conjunction with LangchainBeam to run inference tasks on a PCollection of data.
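A rough sketch of how the handler might plug into a Beam pipeline follows. Apart from the `LangchainModelHandler` constructor documented below, the options construction and the `LangchainBeam.run` transform name are assumptions for illustration, not verified API:

```java
// Sketch only: how a handler could be wired into a Beam pipeline.
// The options instance and LangchainBeam.run(...) are assumed, not verified.
LangchainModelOptions options = /* model options with model name, API key, temperature */;
LangchainModelHandler handler = new LangchainModelHandler(
        options,
        "Classify the sentiment of the review as POSITIVE or NEGATIVE.");

Pipeline p = Pipeline.create();
p.apply(TextIO.read().from("reviews.txt"))        // a PCollection<String> of input elements
 .apply(LangchainBeam.run(handler));              // assumed transform entry point
p.run().waitUntilFinish();
```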

  • Constructor Details

    • LangchainModelHandler

      public LangchainModelHandler(LangchainModelOptions options, String instructionPrompt)
      Constructs a new LangchainModelHandler with the specified model options and instruction prompt.
      Parameters:
      options - the LangchainModelOptions containing model configurations such as model name and API key
      instructionPrompt - the instruction prompt that will guide the model's behavior (e.g., for classification tasks)
    • LangchainModelHandler

      public LangchainModelHandler(LangchainModelOptions options, String instructionPrompt, Map<String,String> outputFormat)
      Constructs a new LangchainModelHandler with the specified model options, instruction prompt, and output format.
      Parameters:
      options - the LangchainModelOptions containing model configurations
      instructionPrompt - the instruction prompt to guide the model on processing each element. Note: instruct the model to respond in JSON to get the output as a JSON string, and use the `outputFormat` map to specify the desired fields
      outputFormat - the desired output format, represented as a map from output field names to descriptions of their values
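The three-argument constructor expects a map describing the desired JSON output fields. A minimal sketch of building such a map follows; the field names, descriptions, and the commented-out handler construction are illustrative assumptions:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class OutputFormatExample {

    // Build the outputFormat map: each key is a field the model should emit,
    // each value is a plain-language description of that field's contents.
    static Map<String, String> buildOutputFormat() {
        Map<String, String> outputFormat = new LinkedHashMap<>();
        outputFormat.put("sentiment", "POSITIVE, NEGATIVE, or NEUTRAL");
        outputFormat.put("confidence", "a number between 0 and 1");
        return outputFormat;
    }

    public static void main(String[] args) {
        Map<String, String> outputFormat = buildOutputFormat();

        // Hypothetical handler construction (requires the langchainbeam
        // dependency and a configured LangchainModelOptions instance):
        // LangchainModelHandler handler = new LangchainModelHandler(
        //         options,
        //         "Classify the review and respond in JSON.",
        //         outputFormat);

        System.out.println(outputFormat);
    }
}
```

Pairing a "respond in JSON" instruction with an explicit field map keeps the model's output machine-parseable downstream in the pipeline.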
  • Method Details

    • getOptions

      public LangchainModelOptions getOptions()
      Returns the LangchainModelOptions for this handler, which includes model configurations such as the model name and API key.
      Returns:
      the model options used for inference
    • getPrompt

      public String getPrompt()
      Returns the instruction prompt that guides the model in performing tasks such as classification or generating outputs.
      Returns:
      the instruction prompt string