

public interface LineMinimizer<EvaluatorType extends Evaluator<Double,Double>>

Type Parameters:
    EvaluatorType - Type of Evaluator to use.
Defines the functionality of a line-minimization algorithm, often called a
"line search" algorithm. These algorithms find the minimum of a scalar
function near an "initial guess". Generally, these algorithms operate in
two phases. The first phase is known as the "bracketing phase", where the
algorithm attempts to find an interval along the x-axis that contains a
known minimum. Once the bracketing phase is complete, and a minimum is
bracketed, the second phase begins. The second phase is known as the
"sectioning phase", where the size of the bracket is repeatedly reduced,
while keeping the minimum between the bracket bounds, until a desired
accuracy is reached.
Again, just to recap: line minimizers operate in two phases. First, rope
off x-axis territory until you can demonstrate you've got a minimum in your
bracket. Then, squeeze the bracket together until you get sufficiently
close to the minimum.
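The two phases described above can be sketched in a few lines. The following is a minimal illustration, not the Foundry's implementation: it brackets by stepping downhill with a growing step, then sections with golden-section reduction, on the sample function f(x) = (x - 2)^2.

```java
import java.util.function.DoubleUnaryOperator;

public class TwoPhaseLineSearch {

    static final double GOLDEN = 0.381966; // 2 minus the golden ratio

    /** Phase 1 (bracketing): step downhill, doubling the step, until f rises. */
    static double[] bracket(DoubleUnaryOperator f, double x0) {
        double a = x0, b = x0 + 1.0;
        if (f.applyAsDouble(b) > f.applyAsDouble(a)) { // ensure a -> b is downhill
            double t = a; a = b; b = t;
        }
        double c = b + (b - a);
        while (f.applyAsDouble(c) < f.applyAsDouble(b)) {
            a = b;
            b = c;
            c = b + 2.0 * (b - a); // grow the step each time
        }
        // f decreased then increased, so a minimum lies inside this interval
        return new double[] {Math.min(a, c), Math.max(a, c)};
    }

    /** Phase 2 (sectioning): shrink the bracket, keeping the minimum inside. */
    static double section(DoubleUnaryOperator f, double a, double b, double tol) {
        double x1 = a + GOLDEN * (b - a);
        double x2 = b - GOLDEN * (b - a);
        double f1 = f.applyAsDouble(x1), f2 = f.applyAsDouble(x2);
        while (b - a > tol) {
            if (f1 < f2) {       // minimum lies in [a, x2]
                b = x2; x2 = x1; f2 = f1;
                x1 = a + GOLDEN * (b - a); f1 = f.applyAsDouble(x1);
            } else {             // minimum lies in [x1, b]
                a = x1; x1 = x2; f1 = f2;
                x2 = b - GOLDEN * (b - a); f2 = f.applyAsDouble(x2);
            }
        }
        return 0.5 * (a + b);
    }

    public static void main(String[] args) {
        DoubleUnaryOperator f = x -> (x - 2.0) * (x - 2.0);
        double[] ab = bracket(f, 0.0);
        double xMin = section(f, ab[0], ab[1], 1e-6);
        System.out.printf("minimum near x = %.4f%n", xMin); // prints "minimum near x = 2.0000"
    }
}
```

A real implementation replaces the fixed doubling and golden-section rules with an interpolator (see getInterpolator() below) that proposes smarter trial points.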
Some LineMinimizers do not require derivative information in either the
bracketing or sectioning phase, such as Brent's method
(LineMinimizerDerivativeFree). Others make use of derivative information
during both bracketing and sectioning, such as Fletcher's line search
(LineMinimizerDerivativeBased).
In my experience, derivative-free methods are as efficient as
derivative-based line minimizers. Because I do not see large
computational or real-world time savings, I tend to just use derivative-free
line minimizers with Brent's interpolator (which is very clever and
extremely robust). But it's always worth trying out derivative-based line
minimizers using Hermite polynomial interpolators (you can also use Brent's
interpolator with derivative-based methods, but it doesn't make use of the
gradients that are being calculated anyway, so it tends to slow things
down).
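The core of the derivative-free interpolation idea mentioned above is to fit a parabola through three bracket points and propose its vertex as the next trial point. The sketch below shows only that single step; Brent's actual method wraps it in safeguards (golden-section fallback, bounds checks) that are omitted here.

```java
import java.util.function.DoubleUnaryOperator;

public class ParabolicStep {

    /** x-coordinate of the vertex of the parabola through (a,fa), (b,fb), (c,fc). */
    static double vertex(double a, double fa, double b, double fb,
                         double c, double fc) {
        double num = (b - a) * (b - a) * (fb - fc) - (b - c) * (b - c) * (fb - fa);
        double den = (b - a) * (fb - fc) - (b - c) * (fb - fa);
        // Caller must guard against den == 0 (the three points are collinear).
        return b - 0.5 * num / den;
    }

    public static void main(String[] args) {
        // For a quadratic such as f(x) = (x - 2)^2, one parabolic step
        // lands exactly on the minimizer x = 2.
        DoubleUnaryOperator f = x -> (x - 2.0) * (x - 2.0);
        double x = vertex(0.0, f.applyAsDouble(0.0),
                          1.0, f.applyAsDouble(1.0),
                          4.0, f.applyAsDouble(4.0));
        System.out.println(x); // prints 2.0
    }
}
```

Derivative-based interpolators play the same role, but fit Hermite polynomials to function values and gradients instead of a parabola to three values.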
Method Summary

boolean bracketingStep()
    Continues the bracketing phase of the algorithm, which attempts to place a bracket around a known minimum.

LineBracket getBracket()
    Gets the LineBracket used to bound the search.

LineBracketInterpolator<? super EvaluatorType> getInterpolator()
    Gets the interpolator used to fit data points and derive an interpolated (hypothesized) minimum to try next.

boolean isValidBracket()
    Returns true if the algorithm has found a valid bracket on a minimum, false if the algorithm needs to continue the bracketing phase.

WeightedInputOutputPair<Vector,Double> minimizeAlongDirection(DirectionalVectorToScalarFunction function, Double functionValue, Vector gradient)
    Minimizes a Vector function along the direction given by the DirectionalVectorToScalarFunction.

boolean sectioningStep()
    Continues the sectioning phase of the algorithm.
Methods inherited from interface gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizer 

getInitialGuess, getTolerance, learn, setInitialGuess, setTolerance 
Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable 

clone 
Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm 

getMaxIterations, getResult, setMaxIterations 
Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm 

addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener 
Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm 

isResultValid, stop 
Method Detail 

LineBracketInterpolator<? super EvaluatorType> getInterpolator()
LineBracket getBracket()
boolean isValidBracket()
boolean bracketingStep()
boolean sectioningStep()
WeightedInputOutputPair<Vector,Double> minimizeAlongDirection(DirectionalVectorToScalarFunction function, Double functionValue, Vector gradient)
Minimizes a Vector function along the direction given by the
DirectionalVectorToScalarFunction; the initial guess of the line search is
taken from function.

Parameters:
    function - Defines the direction to search along, and the initial guess.
    The direction is scaled by the line-search solution.
    functionValue - Value of function at the initial guess; may be null.
    gradient - Derivative of the output with respect to the input of function
    at the initial guess. May be null if it's not being computed; a gradient
    is not required for all line-search methods, but an exception will be
    thrown if one is expected and not available.
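Conceptually, the DirectionalVectorToScalarFunction passed to minimizeAlongDirection restricts a vector function to a line, g(t) = f(x0 + t*d), which is the scalar slice the line search then minimizes. The stand-in below illustrates that restriction; the class and method names are illustrative, not the Foundry's API.

```java
import java.util.function.ToDoubleFunction;

public class DirectionalRestriction {

    /** g(t) = f(x0 + t * d): the scalar slice a line search minimizes. */
    static double along(ToDoubleFunction<double[]> f,
                        double[] x0, double[] d, double t) {
        double[] x = new double[x0.length];
        for (int i = 0; i < x.length; i++) {
            x[i] = x0[i] + t * d[i]; // move distance t along direction d
        }
        return f.applyAsDouble(x);
    }

    public static void main(String[] args) {
        // f(x, y) = x^2 + y^2; from x0 = (3, 4) along d = -x0, the slice
        // g(t) = |x0|^2 (1 - t)^2 is minimized at t = 1, i.e. at the origin.
        ToDoubleFunction<double[]> f = v -> v[0] * v[0] + v[1] * v[1];
        double[] x0 = {3.0, 4.0}, d = {-3.0, -4.0};
        System.out.println(along(f, x0, d, 1.0)); // prints 0.0
    }
}
```

The t that minimizes g is the "line-search solution" by which the direction is scaled, per the parameter description above.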

