Class StochasticPathwiseLevenbergMarquardt

  • All Implemented Interfaces:
    Serializable, Cloneable, StochasticOptimizer
  • Direct Known Subclasses:
    StochasticPathwiseLevenbergMarquardtAD

    public abstract class StochasticPathwiseLevenbergMarquardt
    extends Object
    implements Serializable, Cloneable, StochasticOptimizer
    This class implements a stochastic Levenberg-Marquardt non-linear least-squares fit algorithm.

    The design avoids the need to define the objective function as a separate class. The objective function is defined by overriding a class method, see the sample code below.

    The Levenberg-Marquardt solver is implemented using multi-threading. The calculation of the derivatives (in case a specific implementation of setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives) is not provided) may be performed in parallel by setting the parameter numberOfThreads.

    To use the solver, inherit from it and implement the objective function as setValues(RandomVariable[] parameters, RandomVariable[] values), where values has to be set to the values of the objective function for the given parameters.
    You may also provide a derivative for your objective function by additionally overriding the function setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives); otherwise the solver will calculate the derivative via finite differences (see the sketch following the example below).

    To reject a point, it is allowed to set an element of values to Double.NaN in the implementation of setValues(RandomVariable[] parameters, RandomVariable[] values). Put differently: The solver handles NaN values in values as an error larger than the current one (regardless of the current error) and rejects the point.
    Note, however, that it is an error if the initial parameter guess results in a NaN value. That is, the solver should be initialized with an initial parameter in an admissible region.
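
    For illustration, a rejection inside setValues may look like the following sketch. Note that Scalar here refers to net.finmath.stochastic.Scalar and the admissibility check is a hypothetical example; both are assumptions of this sketch and not part of this class's API.

            // Hypothetical sketch: reject an inadmissible point by returning NaN values.
            public void setValues(RandomVariable[] parameters, RandomVariable[] values) {
                    if(parameters[0].getMin() < 0) {
                            // NaN is treated as an error larger than the current one, hence the point is rejected.
                            java.util.Arrays.fill(values, new Scalar(Double.NaN));
                            return;
                    }
                    // ... otherwise fill values with the values of the objective function ...
            }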

    The following simple example finds a solution for the linear system of equations
    0.0 * x1 + 1.0 * x2 = 5.0
    2.0 * x1 + 1.0 * x2 = 10.0
     
            StochasticPathwiseLevenbergMarquardt optimizer = new StochasticPathwiseLevenbergMarquardt(
                            // using net.finmath.stochastic.Scalar to wrap constant values as RandomVariable
                            new RandomVariable[] { new Scalar(0.0), new Scalar(0.0) },     // initial parameters
                            new RandomVariable[] { new Scalar(5.0), new Scalar(10.0) },    // target values
                            100,                                                            // maximum number of iterations
                            2                                                               // number of threads
            ) {
                    // Override your objective function here
                    public void setValues(RandomVariable[] parameters, RandomVariable[] values) {
                            values[0] = parameters[0].mult(0.0).add(parameters[1]);
                            values[1] = parameters[0].mult(2.0).add(parameters[1]);
                    }
            };
    
            optimizer.run();
    
            RandomVariable[] bestParameters = optimizer.getBestFitParameters();
     
     
    See the example in the main method below.
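
    If an analytic derivative is available, it may additionally be provided by overriding setDerivatives. For the linear system above the partial derivatives are just the constant coefficients; a minimal sketch (again assuming net.finmath.stochastic.Scalar as the RandomVariable implementation used to wrap constants) could read:

            public void setDerivatives(RandomVariable[] parameters, RandomVariable[][] derivatives) {
                    // derivatives[i][j] = d(value(j)) / d(parameter(i))
                    derivatives[0][0] = new Scalar(0.0);        // d(value 0) / d(parameter 0)
                    derivatives[0][1] = new Scalar(2.0);        // d(value 1) / d(parameter 0)
                    derivatives[1][0] = new Scalar(1.0);        // d(value 0) / d(parameter 1)
                    derivatives[1][1] = new Scalar(1.0);        // d(value 1) / d(parameter 1)
            }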

    The class can be initialized to use a multi-threaded valuation. If initialized this way, the implementation of setValues must be thread-safe. The solver will evaluate the gradient of the value vector in parallel, i.e., use as many threads as the number of parameters.
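
    For example, a solver using an ExecutorService for the concurrent valuation of the derivatives may be constructed as sketched below. Here initialParameters, targetValues and weights denote pre-existing RandomVariable[] arrays; the thread pool size, the use of null for the parameter steps and the Scalar-based error tolerance are assumptions of this sketch, not prescribed by the API.

            ExecutorService executorService = Executors.newFixedThreadPool(4);

            StochasticPathwiseLevenbergMarquardt optimizer = new StochasticPathwiseLevenbergMarquardt(
                            initialParameters,          // RandomVariable[] with the initial guess
                            targetValues,               // RandomVariable[] with the target values
                            weights,                    // RandomVariable[] with the weights applied to the error
                            null,                       // parameter steps for the finite difference approximation (null assumed to select a default)
                            100,                        // maximum number of iterations
                            new Scalar(0.0),            // error tolerance
                            executorService             // executor used for the concurrent valuation of the derivatives
            ) {
                    public void setValues(RandomVariable[] parameters, RandomVariable[] values) {
                            // objective function; must be thread-safe when used with an executor
                    }
            };

            optimizer.run();
            executorService.shutdown();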

    Note: Iteration steps will be logged (java.util.logging) with Level.FINE
    Version:
    1.6
    Author:
    Christian Fries
    See Also:
    Serialized Form
    • Constructor Detail

      • StochasticPathwiseLevenbergMarquardt

        public StochasticPathwiseLevenbergMarquardt​(RandomVariable[] initialParameters,
                                                    RandomVariable[] targetValues,
                                                    RandomVariable[] weights,
                                                    RandomVariable[] parameterSteps,
                                                    int maxIteration,
                                                    RandomVariable errorTolerance,
                                                    ExecutorService executorService)
        Create a Levenberg-Marquardt solver.
        Parameters:
        initialParameters - Initial value for the parameters where the solver starts its search.
        targetValues - Target values to achieve.
        weights - Weights applied to the error.
        parameterSteps - Step used for finite difference approximation.
        maxIteration - Maximum number of iterations.
        errorTolerance - Error tolerance / accuracy.
        executorService - Executor to be used for concurrent valuation of the derivatives. This is only used if setDerivatives is not overridden. Warning: The implementation of setValues has to be thread-safe!
      • StochasticPathwiseLevenbergMarquardt

        public StochasticPathwiseLevenbergMarquardt​(RandomVariable[] initialParameters,
                                                    RandomVariable[] targetValues,
                                                    int maxIteration,
                                                    int numberOfThreads)
        Create a Levenberg-Marquardt solver.
        Parameters:
        initialParameters - Initial value for the parameters where the solver starts its search.
        targetValues - Target values to achieve.
        maxIteration - Maximum number of iterations.
        numberOfThreads - Maximum number of threads. Warning: If this number is larger than one, the implementation of setValues has to be thread safe!
      • StochasticPathwiseLevenbergMarquardt

        public StochasticPathwiseLevenbergMarquardt​(List<RandomVariable> initialParameters,
                                                    List<RandomVariable> targetValues,
                                                    int maxIteration,
                                                    ExecutorService executorService)
        Create a Levenberg-Marquardt solver.
        Parameters:
        initialParameters - List of initial values for the parameters where the solver starts its search.
        targetValues - List of target values to achieve.
        maxIteration - Maximum number of iterations.
        executorService - Executor to be used for concurrent valuation of the derivatives. This is only used if setDerivatives is not overridden. Warning: The implementation of setValues has to be thread-safe!
      • StochasticPathwiseLevenbergMarquardt

        public StochasticPathwiseLevenbergMarquardt​(List<RandomVariable> initialParameters,
                                                    List<RandomVariable> targetValues,
                                                    int maxIteration,
                                                    int numberOfThreads)
        Create a Levenberg-Marquardt solver.
        Parameters:
        initialParameters - List of initial values for the parameters where the solver starts its search.
        targetValues - List of target values to achieve.
        maxIteration - Maximum number of iterations.
        numberOfThreads - Maximum number of threads. Warning: If this number is larger than one, the implementation of setValues has to be thread safe!
    • Method Detail

      • getLambda

        public double[] getLambda()
        Get the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is the \( \lambda \) in \( H + \lambda \, \mathrm{diag}(H) \).
        Returns:
        the parameter \( \lambda \).
      • setLambda

        public void setLambda​(double[] lambda)
        Set the parameter λ used in the Tikhonov-like regularization of the Hessian matrix, that is the \( \lambda \) in \( H + \lambda \, \mathrm{diag}(H) \).
        Parameters:
        lambda - the lambda to set
      • getLambdaMultiplicator

        public double getLambdaMultiplicator()
        Get the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is not invertible.
        Returns:
        the lambdaMultiplicator
      • setLambdaMultiplicator

        public void setLambdaMultiplicator​(double lambdaMultiplicator)
        Set the multiplicator applied to lambda if the inversion of the regularized Hessian fails, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is not invertible. This makes lambda larger and hence the stepping slower.
        Parameters:
        lambdaMultiplicator - the lambdaMultiplicator to set. Should be > 1.
      • getLambdaDivisor

        public double getLambdaDivisor()
        Get the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is invertible.
        Returns:
        the lambdaDivisor
      • setLambdaDivisor

        public void setLambdaDivisor​(double lambdaDivisor)
        Set the divisor applied to lambda (for the next iteration) if the inversion of the regularized Hessian succeeds, that is, if \( H + \lambda \, \mathrm{diag}(H) \) is invertible. This makes lambda smaller and hence the stepping faster.
        Parameters:
        lambdaDivisor - the lambdaDivisor to set. Should be > 1.
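
        For illustration, the damping behaviour can be tuned before running the solver; the values below are purely illustrative and not recommended defaults:

                optimizer.setLambdaMultiplicator(4.0);      // slow down the stepping more aggressively after a failed inversion
                optimizer.setLambdaDivisor(1.3);            // speed up the stepping more cautiously after a successful inversion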
      • getRootMeanSquaredError

        public double getRootMeanSquaredError()
        Specified by:
        getRootMeanSquaredError in interface StochasticOptimizer
        Returns:
        The root mean squared error achieved with the best fit parameters.
      • setErrorMeanSquaredCurrent

        public void setErrorMeanSquaredCurrent​(RandomVariable errorMeanSquaredCurrent)
        Parameters:
        errorMeanSquaredCurrent - the errorMeanSquaredCurrent to set
      • setValues

        public abstract void setValues​(RandomVariable[] parameters,
                                       RandomVariable[] values)
                                throws SolverException
        The objective function. Override this method to implement your custom function.
        Parameters:
        parameters - Input value. The parameter vector.
        values - Output value. The vector of values f(i,parameters), i=1,...,n
        Throws:
        SolverException - Thrown if the valuation fails; the specific cause may be available via the getCause() method.
      • setDerivatives

        public void setDerivatives​(RandomVariable[] parameters,
                                   RandomVariable[][] derivatives)
                            throws SolverException
        The derivative of the objective function. You may override this method if you would like to provide your own (e.g., analytic) derivative; otherwise it is approximated via finite differences.
        Parameters:
        parameters - Input value. The parameter vector.
        derivatives - Output value, where derivatives[i][j] is d(value(j)) / d(parameter(i)).
        Throws:
        SolverException - Thrown if the valuation fails; the specific cause may be available via the getCause() method.
      • getCloneWithModifiedTargetValues

        public StochasticPathwiseLevenbergMarquardt getCloneWithModifiedTargetValues​(RandomVariable[] newTargetVaues,
                                                                                     RandomVariable[] newWeights,
                                                                                     boolean isUseBestParametersAsInitialParameters)
                                                                              throws CloneNotSupportedException
        Create a clone of this LevenbergMarquardt optimizer with a new vector for the target values and weights. The clone will use the same objective function as this implementation, i.e., the implementations of setValues(RandomVariable[], RandomVariable[]) and setDerivatives(RandomVariable[], RandomVariable[][]) are reused. The initial values of the cloned optimizer will either be the original initial values of this object or the best parameters obtained by this optimizer, the latter being used only if this optimizer signals done().
        Parameters:
        newTargetVaues - New array of target values.
        newWeights - New array of weights.
        isUseBestParametersAsInitialParameters - If true and this optimizer is done(), then the clone will use this.getBestFitParameters() as initial parameters.
        Returns:
        A new LevenbergMarquardt optimizer, cloning this one except for the modified target values and weights.
        Throws:
        CloneNotSupportedException - Thrown if this optimizer cannot be cloned.
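
        A minimal usage sketch of this method, assuming optimizer is a StochasticPathwiseLevenbergMarquardt that has already been run and newTargetValues and newWeights are given RandomVariable[] arrays:

                // Re-run the calibration against new target values, re-using the objective function and,
                // since the original run is done, its best fit parameters as the initial guess.
                StochasticPathwiseLevenbergMarquardt recalibration =
                                optimizer.getCloneWithModifiedTargetValues(newTargetValues, newWeights, true);
                recalibration.run();
                RandomVariable[] recalibratedParameters = recalibration.getBestFitParameters();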