gov.sandia.cognition.learning.algorithm.regression
Class ParameterDifferentiableCostMinimizer

java.lang.Object
  extended by gov.sandia.cognition.util.AbstractCloneableSerializable
      extended by gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
          extended by gov.sandia.cognition.algorithm.AnytimeAlgorithmWrapper<ResultType,FunctionMinimizer<Vector,Double,? super EvaluatorType>>
              extended by gov.sandia.cognition.learning.algorithm.regression.AbstractMinimizerBasedParameterCostMinimizer<GradientDescendable,DifferentiableEvaluator<Vector,Double,Vector>>
                  extended by gov.sandia.cognition.learning.algorithm.regression.ParameterDifferentiableCostMinimizer
All Implemented Interfaces:
AnytimeAlgorithm<GradientDescendable>, IterativeAlgorithm, IterativeAlgorithmListener, MeasurablePerformanceAlgorithm, StoppableAlgorithm, BatchCostMinimizationLearner<Collection<? extends InputOutputPair<? extends Vector,Vector>>,GradientDescendable>, BatchLearner<Collection<? extends InputOutputPair<? extends Vector,Vector>>,GradientDescendable>, ParameterCostMinimizer<GradientDescendable>, SupervisedBatchLearner<Vector,Vector,GradientDescendable>, CloneableSerializable, Serializable, Cloneable

public class ParameterDifferentiableCostMinimizer
extends AbstractMinimizerBasedParameterCostMinimizer<GradientDescendable,DifferentiableEvaluator<Vector,Double,Vector>>

This class adapts the unconstrained nonlinear minimization algorithms in the "minimization" package to the task of estimating locally optimal (minimum-cost) parameter sets. This allows us to use algorithms like BFGS (FunctionMinimizerBFGS) to find locally optimal parameters of, for example, a DifferentiableFeedforwardNeuralNetwork. Any FunctionMinimizer that makes use of first-order derivative information may be dropped into this class.

My current preference is to use BFGS (FunctionMinimizerBFGS) for virtually all problems. However, when there are very many parameters, BFGS's dense approximation of the Hessian becomes expensive to store, and Liu-Storey conjugate gradient (FunctionMinimizerLiuStorey) is another good choice.

When first-order derivative information is not available, you may either approximate the gradient by finite differences (GradientDescendableApproximator) or use the derivative-free minimization routines, such as those used by ParameterDerivativeFreeCostMinimizer.
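For example, here is a minimal sketch of the typical workflow. The createModel() helper is a hypothetical placeholder, and the supporting classes used here (VectorFactory, DefaultInputOutputPair, SumSquaredErrorCostFunction) are Foundry classes whose exact signatures may vary by version:

    import java.util.Arrays;
    import java.util.Collection;

    import gov.sandia.cognition.learning.algorithm.regression.ParameterDifferentiableCostMinimizer;
    import gov.sandia.cognition.learning.data.DefaultInputOutputPair;
    import gov.sandia.cognition.learning.data.InputOutputPair;
    import gov.sandia.cognition.learning.function.cost.SumSquaredErrorCostFunction;
    import gov.sandia.cognition.learning.function.gradient.GradientDescendable;
    import gov.sandia.cognition.math.matrix.Vector;
    import gov.sandia.cognition.math.matrix.VectorFactory;

    public class MinimizerExample
    {
        public static void main(String[] args)
        {
            // Hypothetical helper that builds some GradientDescendable,
            // e.g. a DifferentiableFeedforwardNeuralNetwork.
            GradientDescendable model = createModel();

            // The no-argument constructor uses DEFAULT_FUNCTION_MINIMIZER
            // (FunctionMinimizerBFGS with LineMinimizerBacktracking).
            ParameterDifferentiableCostMinimizer learner =
                new ParameterDifferentiableCostMinimizer();
            learner.setObjectToOptimize(model);
            learner.setCostFunction(new SumSquaredErrorCostFunction());

            // A toy supervised data set of input-output Vector pairs.
            Vector in1 = VectorFactory.getDefault().copyValues(0.0, 1.0);
            Vector out1 = VectorFactory.getDefault().copyValues(1.0);
            Vector in2 = VectorFactory.getDefault().copyValues(1.0, 0.0);
            Vector out2 = VectorFactory.getDefault().copyValues(0.0);
            Collection<InputOutputPair<Vector, Vector>> data =
                Arrays.<InputOutputPair<Vector, Vector>>asList(
                    new DefaultInputOutputPair<Vector, Vector>(in1, out1),
                    new DefaultInputOutputPair<Vector, Vector>(in2, out2));

            // learn() tunes the model's parameters against the cost function
            // and returns the locally optimal result.
            GradientDescendable result = learner.learn(data);
        }

        private static GradientDescendable createModel()
        {
            // Placeholder: construct and return a GradientDescendable here.
            throw new UnsupportedOperationException("model construction omitted");
        }
    }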

Since:
2.1
Author:
Kevin R. Dixon
See Also:
FunctionMinimizer, ParameterDerivativeFreeCostMinimizer, Serialized Form

Nested Class Summary
static class ParameterDifferentiableCostMinimizer.ParameterCostEvaluatorDerivativeBased
          Function that exposes the parameters of an object as its inputs, so that minimization algorithms can tune the parameters of the object against a cost function.
 
Field Summary
static FunctionMinimizer<Vector,Double,DifferentiableEvaluator<? super Vector,Double,Vector>> DEFAULT_FUNCTION_MINIMIZER
          Default function minimizer, FunctionMinimizerBFGS with LineMinimizerBacktracking
 
Fields inherited from class gov.sandia.cognition.learning.algorithm.regression.AbstractMinimizerBasedParameterCostMinimizer
DEFAULT_COST_FUNCTION
 
Fields inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
DEFAULT_ITERATION, iteration
 
Constructor Summary
ParameterDifferentiableCostMinimizer()
          Creates a new instance of ParameterDifferentiableCostMinimizer
ParameterDifferentiableCostMinimizer(FunctionMinimizer<Vector,Double,? super DifferentiableEvaluator<Vector,Double,Vector>> minimizer)
          Creates a new instance of ParameterDifferentiableCostMinimizer with the given minimizer
 
Method Summary
 ParameterDifferentiableCostMinimizer clone()
          This makes public the clone method on the Object class and removes the exception that it throws.
 ParameterDifferentiableCostMinimizer.ParameterCostEvaluatorDerivativeBased createInternalFunction()
          Creates the internal function that treats the parameter set of the result object as the input to a function, so that the minimization algorithms can perturb this input in their minimization schemes.
 
Methods inherited from class gov.sandia.cognition.learning.algorithm.regression.AbstractMinimizerBasedParameterCostMinimizer
getCostFunction, getObjectToOptimize, getPerformance, getResult, learn, setCostFunction, setObjectToOptimize, setResult
 
Methods inherited from class gov.sandia.cognition.algorithm.AnytimeAlgorithmWrapper
algorithmEnded, algorithmStarted, getAlgorithm, getIteration, getMaxIterations, isResultValid, readResolve, setAlgorithm, setMaxIterations, stepEnded, stepStarted, stop
 
Methods inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
addIterativeAlgorithmListener, fireAlgorithmEnded, fireAlgorithmStarted, fireStepEnded, fireStepStarted, getListeners, removeIterativeAlgorithmListener, setIteration, setListeners
 
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 
Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm
getMaxIterations, setMaxIterations
 
Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm
addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener
 
Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm
isResultValid, stop
 

Field Detail

DEFAULT_FUNCTION_MINIMIZER

public static final FunctionMinimizer<Vector,Double,DifferentiableEvaluator<? super Vector,Double,Vector>> DEFAULT_FUNCTION_MINIMIZER
Default function minimizer, FunctionMinimizerBFGS with LineMinimizerBacktracking

Constructor Detail

ParameterDifferentiableCostMinimizer

public ParameterDifferentiableCostMinimizer()
Creates a new instance of ParameterDifferentiableCostMinimizer


ParameterDifferentiableCostMinimizer

public ParameterDifferentiableCostMinimizer(FunctionMinimizer<Vector,Double,? super DifferentiableEvaluator<Vector,Double,Vector>> minimizer)
Creates a new instance of ParameterDifferentiableCostMinimizer with the given minimizer

Parameters:
minimizer - Minimization algorithm used to find locally optimal parameters
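For example, a hedged sketch of supplying a Liu-Storey conjugate gradient minimizer in place of the default BFGS (this assumes FunctionMinimizerLiuStorey's no-argument constructor, which uses its default line minimizer):

    import gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizerLiuStorey;

    // Useful when the parameter count is large enough that BFGS's dense
    // Hessian approximation becomes costly to store.
    ParameterDifferentiableCostMinimizer learner =
        new ParameterDifferentiableCostMinimizer(
            new FunctionMinimizerLiuStorey());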
Method Detail

clone

public ParameterDifferentiableCostMinimizer clone()
Description copied from class: AbstractCloneableSerializable
This makes public the clone method on the Object class and removes the exception that it throws. Its default behavior is to create a clone of the exact type of object that the method is called on, copying all primitives but keeping all references; in other words, it is a shallow copy. Extensions of this class may want to override this method (but call super.clone()) to implement a "smart copy", that is, to target the most common use case for creating a copy of the object. Because the default behavior is a shallow copy, extending classes only need to handle fields that require a deeper copy (or that need to be reset). Some of the methods in ObjectUtil may be helpful in implementing a custom clone method. Note: the contract of this method is that you must use super.clone() as the basis for your implementation.

Specified by:
clone in interface CloneableSerializable
Overrides:
clone in class AbstractMinimizerBasedParameterCostMinimizer<GradientDescendable,DifferentiableEvaluator<Vector,Double,Vector>>
Returns:
A clone of this object.
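As a hedged illustration of this contract, an extending class (MyLearner and its mutableField are hypothetical) might override clone() as follows:

    import gov.sandia.cognition.math.matrix.Vector;
    import gov.sandia.cognition.util.ObjectUtil;

    public class MyLearner
        extends ParameterDifferentiableCostMinimizer
    {
        // Hypothetical mutable field that needs a deep copy.
        private Vector mutableField;

        @Override
        public MyLearner clone()
        {
            // Per the contract, start from super.clone(), which yields a
            // shallow copy of the correct runtime type.
            MyLearner clone = (MyLearner) super.clone();
            // Then deep-copy only the fields that need it.
            clone.mutableField = ObjectUtil.cloneSafe(this.mutableField);
            return clone;
        }
    }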

createInternalFunction

public ParameterDifferentiableCostMinimizer.ParameterCostEvaluatorDerivativeBased createInternalFunction()
Description copied from class: AbstractMinimizerBasedParameterCostMinimizer
Creates the internal function that treats the parameter set of the result object as the input to a function, so that the minimization algorithms can perturb this input in their minimization schemes.

Specified by:
createInternalFunction in class AbstractMinimizerBasedParameterCostMinimizer<GradientDescendable,DifferentiableEvaluator<Vector,Double,Vector>>
Returns:
Evaluator to use internally.
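For illustration, a hedged sketch of how a caller might exercise the returned evaluator, reusing the learner and model from the earlier workflow sketch (this usage is an assumption; normally the wrapped FunctionMinimizer drives these calls internally):

    // The evaluator implements DifferentiableEvaluator<Vector, Double, Vector>,
    // mapping a candidate parameter vector to its cost and gradient.
    ParameterDifferentiableCostMinimizer.ParameterCostEvaluatorDerivativeBased
        function = learner.createInternalFunction();
    Vector parameters = model.convertToVector(); // current parameter set
    Double cost = function.evaluate(parameters);
    Vector gradient = function.differentiate(parameters);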