Class PerceptronTrainer

All Implemented Interfaces:
EventTrainer

public class PerceptronTrainer extends AbstractEventTrainer
Trains models using the perceptron algorithm. Each outcome is represented as a binary perceptron classifier. This supports standard (integer) weighting as well as averaged weighting, as described in: Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with the Perceptron Algorithm, Michael Collins, EMNLP 2002.
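This trainer is usually reached through the EventTrainer API rather than instantiated directly. The sketch below is a minimal, illustrative example assuming a recent opennlp-tools release; the PerceptronTrainingExample class name, the events stream, and the iteration/cutoff values are placeholders chosen for this example, not defaults of this class.

    import java.io.IOException;
    import java.util.HashMap;

    import opennlp.tools.ml.EventTrainer;
    import opennlp.tools.ml.TrainerFactory;
    import opennlp.tools.ml.model.Event;
    import opennlp.tools.ml.model.MaxentModel;
    import opennlp.tools.util.ObjectStream;
    import opennlp.tools.util.TrainingParameters;

    public class PerceptronTrainingExample {

      // Trains a perceptron model from a stream of training events.
      static MaxentModel train(ObjectStream<Event> events) throws IOException {
        TrainingParameters params = new TrainingParameters();
        // Select the perceptron algorithm instead of the default trainer.
        params.put(TrainingParameters.ALGORITHM_PARAM, "PERCEPTRON");
        // Illustrative settings: 100 iterations, no feature-count cutoff.
        params.put(TrainingParameters.ITERATIONS_PARAM, "100");
        params.put(TrainingParameters.CUTOFF_PARAM, "0");

        EventTrainer trainer = TrainerFactory.getEventTrainer(params, new HashMap<>());
        return trainer.train(events);
      }
    }

With the PERCEPTRON algorithm selected, TrainerFactory is expected to hand back an instance of this class configured from the supplied parameters.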
  • Field Details

  • Constructor Details

    • PerceptronTrainer

      public PerceptronTrainer()
    • PerceptronTrainer

      public PerceptronTrainer(TrainingParameters parameters)
  • Method Details

    • validate

      public void validate()
      Description copied from class: AbstractTrainer
      Checks the training parameters. If a subclass overrides this method, it should call super.validate();
      Overrides:
      validate in class AbstractEventTrainer
    • isValid

      @Deprecated public boolean isValid()
      Deprecated.
      Overrides:
      isValid in class AbstractEventTrainer
      Returns:
      true if the training parameters are valid, false otherwise
    • isSortAndMerge

      public boolean isSortAndMerge()
      Specified by:
      isSortAndMerge in class AbstractEventTrainer
    • doTrain

      public AbstractModel doTrain(DataIndexer indexer) throws IOException
      Specified by:
      doTrain in class AbstractEventTrainer
      Throws:
      IOException
    • setTolerance

      public void setTolerance(double tolerance)
      Specifies the tolerance. If the change in training set accuracy is less than this value, iteration stops.
      A combined usage sketch covering this and the other training setters follows the method details below.
      Parameters:
      tolerance - the tolerance threshold for the change in training set accuracy
    • setStepSizeDecrease

      public void setStepSizeDecrease(double decrease)
      Enables and sets step size decrease. The step size is decreased every iteration by the specified value.
      Parameters:
      decrease - step size decrease in percent
    • setSkippedAveraging

      public void setSkippedAveraging(boolean averaging)
      Enables skipped averaging; this flag replaces the standard averaging with a skipped-averaging scheme.

      If averaging is enabled and the current iteration is one of the first 20 or is a perfect square, the summed parameters are updated.

      The reason not to take all iterations is that the parameters change less toward the end of training, so they would drown out the contributions of the more volatile early iterations. The use of perfect squares allows sampling from successively farther-apart iterations.

      Parameters:
      averaging - averaging flag
    • trainModel

      public AbstractModel trainModel(int iterations, DataIndexer di, int cutoff)
    • trainModel

      public AbstractModel trainModel(int iterations, DataIndexer di, int cutoff, boolean useAverage)
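Taken together, the setters and trainModel overloads above can also be driven directly, without going through TrainerFactory. The following is a minimal sketch assuming a DataIndexer that has already been populated from training events (how it is built depends on the DataIndexer implementation in use); the tolerance, step-size, iteration, and cutoff values are illustrative, not documented defaults.

    import opennlp.tools.ml.model.AbstractModel;
    import opennlp.tools.ml.model.DataIndexer;
    import opennlp.tools.ml.perceptron.PerceptronTrainer;

    public class DirectPerceptronExample {

      // Trains an averaged perceptron model from an already-indexed training set.
      static AbstractModel train(DataIndexer indexer) {
        PerceptronTrainer trainer = new PerceptronTrainer();
        trainer.setTolerance(0.00001);      // stop once the accuracy change drops below this
        trainer.setStepSizeDecrease(0.05);  // decrease the step size by 0.05 percent each iteration
        trainer.setSkippedAveraging(true);  // average only the first 20 and perfect-square iterations

        // 100 iterations, cutoff 0, averaged weighting enabled.
        return trainer.trainModel(100, indexer, 0, true);
      }
    }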