gov.sandia.cognition.learning.algorithm.minimization
Class FunctionMinimizerConjugateGradient

java.lang.Object
  extended by gov.sandia.cognition.util.AbstractCloneableSerializable
      extended by gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
          extended by gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm<ResultType>
              extended by gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner<EvaluatorType,InputOutputPair<InputType,OutputType>>
                  extended by gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer<Vector,Double,DifferentiableEvaluator<? super Vector,Double,Vector>>
                      extended by gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizerConjugateGradient
All Implemented Interfaces:
AnytimeAlgorithm<InputOutputPair<Vector,Double>>, IterativeAlgorithm, StoppableAlgorithm, AnytimeBatchLearner<DifferentiableEvaluator<? super Vector,Double,Vector>,InputOutputPair<Vector,Double>>, BatchLearner<DifferentiableEvaluator<? super Vector,Double,Vector>,InputOutputPair<Vector,Double>>, FunctionMinimizer<Vector,Double,DifferentiableEvaluator<? super Vector,Double,Vector>>, CloneableSerializable, Serializable, Cloneable
Direct Known Subclasses:
FunctionMinimizerFletcherReeves, FunctionMinimizerLiuStorey, FunctionMinimizerPolakRibiere

@PublicationReferences(references={@PublicationReference(author="R. Fletcher",title="Practical Methods of Optimization, Second Edition",type=Book,year=1987,pages={80,87},notes="Section 4.1"),@PublicationReference(author="Wikipedia",title="Nonlinear conjugate gradient method",type=WebPage,url="http://en.wikipedia.org/wiki/Nonlinear_conjugate_gradient_method",year=2008),@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},title="Numerical Recipes in C, Second Edition",type=Book,year=1992,pages={423,424},notes="Section 10.6",url="http://www.nrbook.com/a/bookcpdf.php")})
public abstract class FunctionMinimizerConjugateGradient
extends AbstractAnytimeFunctionMinimizer<Vector,Double,DifferentiableEvaluator<? super Vector,Double,Vector>>

The conjugate gradient (CG) method is a class of algorithms for finding the unconstrained local minimum of a nonlinear function. CG algorithms find the local minimum by performing a line search along a "conjugate gradient" direction computed from first-order (gradient) information. The particular approaches vary only slightly in how that direction is updated. However, in my experience, the Liu-Storey CG variant (FunctionMinimizerLiuStorey) performs the best.

All CG variants tend to require more function/gradient evaluations than their Quasi-Newton cousins. However, CG methods only require O(N) storage, whereas Quasi-Newton algorithms (FunctionMinimizerQuasiNewton) require O(N*N) storage, where N is the size of the input space. So, if your input space is large, then CG algorithms may be the method of choice. In any case, the Liu-Storey CG variant tends to perform fairly well compared to the best Quasi-Newton algorithms, BFGS in particular.
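
A typical use of one of the concrete subclasses might look like the sketch below. It assumes that FunctionMinimizerLiuStorey exposes a no-argument constructor with sensible defaults, that DifferentiableEvaluator requires only evaluate and differentiate, and that VectorFactory.getDefault().copyValues(...) is available for building Vectors; the quadratic objective is purely illustrative.

    import gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizerLiuStorey;
    import gov.sandia.cognition.learning.data.InputOutputPair;
    import gov.sandia.cognition.math.DifferentiableEvaluator;
    import gov.sandia.cognition.math.matrix.Vector;
    import gov.sandia.cognition.math.matrix.VectorFactory;

    public class ConjugateGradientExample
    {
        public static void main( String[] args )
        {
            // Illustrative objective: f(x) = 0.5 * ||x - c||^2, whose gradient is (x - c).
            final Vector target = VectorFactory.getDefault().copyValues( 1.0, 2.0, 3.0 );
            DifferentiableEvaluator<Vector, Double, Vector> f =
                new DifferentiableEvaluator<Vector, Double, Vector>()
            {
                public Double evaluate( Vector input )
                {
                    Vector delta = input.minus( target );
                    return 0.5 * delta.dotProduct( delta );
                }

                public Vector differentiate( Vector input )
                {
                    return input.minus( target );
                }
            };

            // Liu-Storey variant; assumes the no-argument constructor and the default
            // line minimizer, tolerance, and maximum number of iterations.
            FunctionMinimizerLiuStorey cg = new FunctionMinimizerLiuStorey();
            cg.setInitialGuess( VectorFactory.getDefault().copyValues( -4.0, 0.0, 10.0 ) );

            // The result pairs the minimizing input with its function value.
            InputOutputPair<Vector, Double> minimum = cg.learn( f );
            System.out.println( "x* = " + minimum.getInput()
                + ", f(x*) = " + minimum.getOutput() );
        }
    }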

Since:
2.1
Author:
Kevin R. Dixon
See Also:
FunctionMinimizerQuasiNewton, Serialized Form

Field Summary
static LineMinimizer<?> DEFAULT_LINE_MINIMIZER
          Default line minimization algorithm, LineMinimizerDerivativeBased
static int DEFAULT_MAX_ITERATIONS
          Default maximum number of iterations before stopping, 1000
static double DEFAULT_TOLERANCE
          Default tolerance, 1.0E-5
protected  DirectionalVectorToDifferentiableScalarFunction lineFunction
          Function that maps an Evaluator onto an Evaluator using a set point, direction, and scale factor
 
Fields inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer
initialGuess, result, tolerance
 
Fields inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner
data, keepGoing
 
Fields inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm
maxIterations
 
Fields inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
DEFAULT_ITERATION, iteration
 
Constructor Summary
FunctionMinimizerConjugateGradient(LineMinimizer<?> lineMinimizer, Vector initialGuess, double tolerance, int maxIterations)
          Creates a new instance of FunctionMinimizerConjugateGradient
 
Method Summary
protected  void cleanupAlgorithm()
          Called to clean up the learning algorithm's state after learning has finished.
protected abstract  double computeScaleFactor(Vector gradientCurrent, Vector gradientPrevious)
          Computes the conjugate gradient parameter for the particular update scheme.
 LineMinimizer<?> getLineMinimizer()
          Getter for lineMinimizer
protected  boolean initializeAlgorithm()
          Called to initialize the learning algorithm's state based on the data that is stored in the data field.
 void setLineMinimizer(LineMinimizer<?> lineMinimizer)
          Setter for lineMinimizer
protected  boolean step()
          Called to take a single step of the learning algorithm.
 
Methods inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer
getInitialGuess, getResult, getTolerance, setInitialGuess, setResult, setTolerance
 
Methods inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner
clone, getData, getKeepGoing, learn, setData, setKeepGoing, stop
 
Methods inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm
getMaxIterations, isResultValid, setMaxIterations
 
Methods inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
addIterativeAlgorithmListener, fireAlgorithmEnded, fireAlgorithmStarted, fireStepEnded, fireStepStarted, getIteration, getListeners, removeIterativeAlgorithmListener, setIteration, setListeners
 
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 
Methods inherited from interface gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizer
learn
 
Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable
clone
 
Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm
getMaxIterations, setMaxIterations
 
Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm
addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener
 
Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm
isResultValid, stop
 

Field Detail

DEFAULT_MAX_ITERATIONS

public static final int DEFAULT_MAX_ITERATIONS
Default maximum number of iterations before stopping, 1000

See Also:
Constant Field Values

DEFAULT_TOLERANCE

public static final double DEFAULT_TOLERANCE
Default tolerance, 1.0E-5

See Also:
Constant Field Values

DEFAULT_LINE_MINIMIZER

public static final LineMinimizer<?> DEFAULT_LINE_MINIMIZER
Default line minimization algorithm, LineMinimizerDerivativeBased


lineFunction

protected DirectionalVectorToDifferentiableScalarFunction lineFunction
Function that maps an Evaluator onto an Evaluator using a set point, direction, and scale factor

Constructor Detail

FunctionMinimizerConjugateGradient

public FunctionMinimizerConjugateGradient(LineMinimizer<?> lineMinimizer,
                                          Vector initialGuess,
                                          double tolerance,
                                          int maxIterations)
Creates a new instance of FunctionMinimizerConjugateGradient

Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction
initialGuess - Initial guess about the location of the minimum
tolerance - Tolerance of the minimization algorithm, must be >= 0.0, typically ~1e-10
maxIterations - Maximum number of iterations, must be > 0, typically ~100
Method Detail

getLineMinimizer

public LineMinimizer<?> getLineMinimizer()
Getter for lineMinimizer

Returns:
Work-horse algorithm that minimizes the function along a direction

setLineMinimizer

public void setLineMinimizer(LineMinimizer<?> lineMinimizer)
Setter for lineMinimizer

Parameters:
lineMinimizer - Work-horse algorithm that minimizes the function along a direction

initializeAlgorithm

protected boolean initializeAlgorithm()
Description copied from class: AbstractAnytimeBatchLearner
Called to initialize the learning algorithm's state based on the data that is stored in the data field. The return value indicates if the algorithm can be run or not based on the initialization.

Specified by:
initializeAlgorithm in class AbstractAnytimeBatchLearner<DifferentiableEvaluator<? super Vector,Double,Vector>,InputOutputPair<Vector,Double>>
Returns:
True if the learning algorithm can be run and false if it cannot.

step

protected boolean step()
Description copied from class: AbstractAnytimeBatchLearner
Called to take a single step of the learning algorithm.

Specified by:
step in class AbstractAnytimeBatchLearner<DifferentiableEvaluator<? super Vector,Double,Vector>,InputOutputPair<Vector,Double>>
Returns:
True if another step can be taken and false if the algorithm should halt.
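
In broad terms, each step of a nonlinear conjugate gradient method performs a line search along the current direction and then folds the new gradient into the next direction. The outline below is a conceptual sketch of that flow, not the literal body of step():

    x_{k+1} = x_k + alpha_k * d_k,           alpha_k chosen by the line minimizer
    g_{k+1} = grad f( x_{k+1} )
    beta    = computeScaleFactor( g_{k+1}, g_k )
    d_{k+1} = -g_{k+1} + beta * d_k

Iteration halts when the change falls below the configured tolerance or maxIterations is reached.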

cleanupAlgorithm

protected void cleanupAlgorithm()
Description copied from class: AbstractAnytimeBatchLearner
Called to clean up the learning algorithm's state after learning has finished.

Specified by:
cleanupAlgorithm in class AbstractAnytimeBatchLearner<DifferentiableEvaluator<? super Vector,Double,Vector>,InputOutputPair<Vector,Double>>

computeScaleFactor

protected abstract double computeScaleFactor(Vector gradientCurrent,
                                             Vector gradientPrevious)
Computes the conjugate gradient parameter for the particular update scheme.

Parameters:
gradientCurrent - Gradient at the current evaluation point
gradientPrevious - Gradient at the previous evaluation point
Returns:
"beta" scale factor