

Type Parameters:
InputType - Input class of the Evaluator that we are trying to minimize, such as Vector
OutputType - Output class of the Evaluator that we are trying to minimize, such as Double
EvaluatorType - Evaluator class that this minimization algorithm can handle, such as Evaluator or DifferentiableEvaluator

public interface FunctionMinimizer<InputType,OutputType,EvaluatorType extends Evaluator<? super InputType,? extends OutputType>>
Interface for unconstrained minimization of nonlinear functions. Implementing classes find the input that minimizes the output of a function. To find the parameter set that minimizes a cost function, use ParameterCostMinimizer.
Broadly speaking, this package is decomposed into multivariate (Vector) and line (scalar, or Double) minimization. Furthermore, each of these categories can rely exclusively on function evaluations (derivative-free) or can additionally require first-order derivative (gradient) information. Generally speaking, derivative-based minimizers require fewer total function evaluations than their derivative-free counterparts, at the cost of the gradient evaluations.
Here are my opinions:
Multivariate minimization, reported below as average function evaluations/gradient evaluations over a random (though repeated) set of initial conditions for minimizing the Rosenbrock function:
There is broad consensus that the BFGS Quasi-Newton algorithm (FunctionMinimizerBFGS, 73.49/59.35) is the most efficient on real-world problems. However, all Quasi-Newton algorithms require the storage of an N-by-N matrix, where N is the size of the input space. If storing that matrix is impractical, then use Liu-Storey Conjugate Gradient (FunctionMinimizerLiuStorey, 92.91/44.58). In my experience, this CG variant performs almost as well as BFGS but doesn't require the N-by-N matrix.
When gradient information isn't available, or gradients are expensive to compute, then I would strongly suggest trying finite-difference approximation to emulate first-order derivatives and then using one of the algorithms mentioned above. If you are still not satisfied, then we have implemented minimization algorithms that do not require derivative information. The very clever direction-set minimization method of Smith, Powell, and Brent (FunctionMinimizerDirectionSet, 448.66/0.0) is my personal method of choice for derivative-free minimization. Another option is the brute-force downhill simplex algorithm of Nelder and Mead (FunctionMinimizerNelderMead, 187.32/0.0), which can be quite zoomy on some problems. Since they are both inherently heuristic and neither is uniformly better than the other, try them both before settling on a particular algorithm for a particular domain.
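For intuition about the finite-difference approach suggested above, here is a minimal, self-contained sketch (not the Foundry's own implementation) of a central-difference gradient approximation, evaluated on the same Rosenbrock function used in the benchmark. The class and method names are hypothetical; a wrapper like this is the kind of thing one might use to feed a gradient-free cost function to a gradient-based minimizer.

```java
import java.util.function.Function;

public class FiniteDifferenceGradient {

    /** Central-difference approximation of the gradient of f at x. */
    public static double[] gradient(Function<double[], Double> f, double[] x, double h) {
        double[] grad = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            double[] plus = x.clone();
            double[] minus = x.clone();
            plus[i] += h;
            minus[i] -= h;
            // f'(x_i) ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h), O(h^2) error
            grad[i] = (f.apply(plus) - f.apply(minus)) / (2.0 * h);
        }
        return grad;
    }

    /** The 2-D Rosenbrock function used in the benchmark above; minimum at (1, 1). */
    public static double rosenbrock(double[] x) {
        double a = 1.0 - x[0];
        double b = x[1] - x[0] * x[0];
        return a * a + 100.0 * b * b;
    }

    public static void main(String[] args) {
        double[] g = gradient(FiniteDifferenceGradient::rosenbrock,
                new double[] {1.0, 1.0}, 1e-5);
        System.out.println(g[0] + " " + g[1]); // both components ~0 at the minimizer
    }
}
```

Note that each central-difference gradient costs 2N function evaluations, which is why the finite-difference route is only attractive when true gradients are unavailable or expensive.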
Line minimization, reported below as function evaluations/gradient evaluations for minimizing a cosine and then a nonlinear polynomial:
We have three line minimizers: Fletcher-type derivative-based, Brent-type derivative-free, and Newton-type backtracking. I have yet to find an instance when backtracking is beneficial, so I won't discuss it further. An extremely important distinction between line search and multivariate minimization is that derivative information appears to be much less important in line search, so it is not a given that a derivative-based line minimizer is inherently superior to a derivative-free one. With that said, in my experience the Fletcher-type line minimizer has superior performance, particularly because it is vastly superior in the manner by which it brackets a minimum. However, I would try both the Fletcher-type (LineMinimizerDerivativeBased, 4.300/3.435, 7.982/4.987) and Brent-type (LineMinimizerDerivativeFree, 7.705/0.0, 11.300/0.00) algorithms before settling on one of them for a particular domain.
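As a rough illustration of derivative-free line minimization, here is a self-contained golden-section search applied to the cosine example above. The Foundry's LineMinimizerDerivativeFree is a Brent-type method, which adds parabolic interpolation on top of this kind of interval-shrinking scheme, so this sketch (with hypothetical names) is only a simplified analogue.

```java
import java.util.function.DoubleUnaryOperator;

public class GoldenSectionLineSearch {

    // Inverse golden ratio, ~0.618: each step shrinks the bracket by this factor.
    private static final double PHI = (Math.sqrt(5.0) - 1.0) / 2.0;

    /** Derivative-free minimization of a unimodal f over the bracket [a, b]. */
    public static double minimize(DoubleUnaryOperator f, double a, double b, double tol) {
        while (b - a > tol) {
            // Two interior probe points, placed symmetrically in the bracket.
            double c = b - PHI * (b - a);
            double d = a + PHI * (b - a);
            if (f.applyAsDouble(c) < f.applyAsDouble(d)) {
                b = d; // minimum lies in [a, d]
            } else {
                a = c; // minimum lies in [c, b]
            }
        }
        return (a + b) / 2.0;
    }

    public static void main(String[] args) {
        // Minimize cos(x) on [0, 2*pi]; the minimum is at pi.
        double xmin = minimize(Math::cos, 0.0, 2.0 * Math.PI, 1e-8);
        System.out.println(xmin); // ~= Math.PI
    }
}
```

Note that this routine uses no derivative at all, consistent with the 7.705/0.0 and 11.300/0.00 figures above, where the second number (gradient evaluations) is zero for the derivative-free minimizer.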
See Also: ParameterCostMinimizer
Method Summary

InputType getInitialGuess()
          Gets the initial guess of the minimization routine.
double getTolerance()
          Gets the tolerance of the minimization algorithm.
InputOutputPair<InputType,OutputType> learn(EvaluatorType function)
          Finds the (local) minimum of the given function.
void setInitialGuess(InputType initialGuess)
          Sets the initial guess of the minimization routine.
void setTolerance(double tolerance)
          Sets the tolerance of the minimization algorithm.
Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable 

clone 
Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm 

getMaxIterations, getResult, setMaxIterations 
Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm 

addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener 
Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm 

isResultValid, stop 
Method Detail

learn

InputOutputPair<InputType,OutputType> learn(EvaluatorType function)
          Finds the (local) minimum of the given function.
          Specified by: learn in interface BatchLearner<EvaluatorType extends Evaluator<? super InputType,? extends OutputType>,InputOutputPair<InputType,OutputType>>
          Parameters: function - Evaluator to minimize

getTolerance

double getTolerance()
          Gets the tolerance of the minimization algorithm.

setTolerance

void setTolerance(double tolerance)
          Sets the tolerance of the minimization algorithm.
          Parameters: tolerance - Tolerance of the minimization algorithm

getInitialGuess

InputType getInitialGuess()
          Gets the initial guess of the minimization routine.

setInitialGuess

void setInitialGuess(InputType initialGuess)
          Sets the initial guess of the minimization routine.
          Parameters: initialGuess - Initial guess of the minimization routine
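To illustrate the contract above (set an initial guess and a tolerance, then call learn to obtain a local minimizer), here is a hypothetical, stripped-down scalar analogue of the interface. It is not the Foundry implementation; the fixed-step finite-difference descent inside learn is purely illustrative.

```java
import java.util.function.DoubleUnaryOperator;

/** Toy scalar analogue of the FunctionMinimizer contract:
 *  an initial guess, a tolerance, and learn() returning a (local) minimizer. */
public class ToyScalarMinimizer {

    private double initialGuess;
    private double tolerance = 1e-8;

    public double getInitialGuess() { return initialGuess; }
    public void setInitialGuess(double initialGuess) { this.initialGuess = initialGuess; }
    public double getTolerance() { return tolerance; }
    public void setTolerance(double tolerance) { this.tolerance = tolerance; }

    /** Fixed-step descent on a central-difference slope; stops when
     *  the slope magnitude falls below the tolerance. */
    public double learn(DoubleUnaryOperator f) {
        double x = initialGuess;
        double h = 1e-6;
        double step = 0.1;
        for (int i = 0; i < 10000; i++) {
            double slope = (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2.0 * h);
            if (Math.abs(slope) < tolerance) {
                break; // converged to a (local) minimizer
            }
            x -= step * slope;
        }
        return x;
    }

    public static void main(String[] args) {
        ToyScalarMinimizer minimizer = new ToyScalarMinimizer();
        minimizer.setInitialGuess(5.0);
        minimizer.setTolerance(1e-6);
        // Minimize (x - 2)^2, whose minimum is at x = 2.
        System.out.println(minimizer.learn(x -> (x - 2.0) * (x - 2.0))); // ~= 2.0
    }
}
```

As with the real interface, the result is only a local minimum: starting the same routine in a different basin of a multimodal function will generally return a different answer.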

