gov.sandia.cognition.learning.algorithm.minimization.line
Interface LineMinimizer<EvaluatorType extends Evaluator<Double,Double>>

Type Parameters:
EvaluatorType - Type of Evaluator to use.
All Superinterfaces:
AnytimeAlgorithm<InputOutputPair<Double,Double>>, BatchLearner<EvaluatorType,InputOutputPair<Double,Double>>, Cloneable, CloneableSerializable, FunctionMinimizer<Double,Double,EvaluatorType>, IterativeAlgorithm, Serializable, StoppableAlgorithm
All Known Implementing Classes:
AbstractAnytimeLineMinimizer, LineMinimizerBacktracking, LineMinimizerDerivativeBased, LineMinimizerDerivativeFree

public interface LineMinimizer<EvaluatorType extends Evaluator<Double,Double>>
extends FunctionMinimizer<Double,Double,EvaluatorType>

Defines the functionality of a line-minimization algorithm, often called a "line search" algorithm. These algorithms find the minimum of a scalar function near an "initial guess". Generally, these algorithms operate in two phases. The first phase is known as the "bracketing phase", where the algorithm attempts to find an interval along the x-axis that contains a known minimum. Once the bracketing phase is complete and a minimum is bracketed, the second phase begins. The second phase is known as the "sectioning phase", where the size of the bracket is repeatedly reduced, while keeping the minimum between the bracket bounds, until the desired accuracy is reached.
To recap: line minimizers operate in two phases. First, rope off x-axis territory until you can demonstrate you've got a minimum in your bracket. Then, squeeze the bracket together until you get sufficiently close to the minimum.
Some LineMinimizers do not require derivative information in either the bracketing or sectioning, such as Brent's method (LineMinimizerDerivativeFree). Some LineMinimizers make use of derivative information both during bracketing and sectioning, such as Fletcher's line search (LineMinimizerDerivativeBased).
In my experience, derivative-free methods are about as efficient as derivative-based line minimizers. Because I do not see large computational or wall-clock savings, I tend to use derivative-free line minimizers with Brent's interpolator (which is very clever and extremely robust). But it is always worth trying derivative-based line minimizers with Hermite polynomial interpolators. (You can also use Brent's interpolator with derivative-based methods, but since it does not make use of the gradients that are being computed anyway, it tends to slow things down.)
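For example, a derivative-free line minimizer can be applied directly to a scalar function through the learn method inherited from FunctionMinimizer. The following is a minimal sketch, assuming the default LineMinimizerDerivativeFree constructor; the import paths follow the current package layout and may differ in older releases:

    import gov.sandia.cognition.evaluator.Evaluator;
    import gov.sandia.cognition.learning.algorithm.minimization.line.LineMinimizerDerivativeFree;
    import gov.sandia.cognition.learning.data.InputOutputPair;

    public class LineSearchExample
    {
        public static void main(String[] args)
        {
            // f(x) = (x - 3)^2, which has its minimum at x = 3.
            Evaluator<Double, Double> f = new Evaluator<Double, Double>()
            {
                public Double evaluate(Double x)
                {
                    return (x - 3.0) * (x - 3.0);
                }
            };

            LineMinimizerDerivativeFree minimizer = new LineMinimizerDerivativeFree();
            minimizer.setInitialGuess(0.0);

            // learn() runs the bracketing and sectioning phases internally
            // and returns the (x, f(x)) pair at the located minimum.
            InputOutputPair<Double, Double> result = minimizer.learn(f);
            System.out.println("x = " + result.getInput()
                + ", f(x) = " + result.getOutput());
        }
    }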

Since:
2.1
Author:
Kevin R. Dixon

Method Summary
 boolean bracketingStep()
          Continues the bracketing phase of the algorithm, which attempts to place a bracket around a known minimum.
 LineBracket getBracket()
          Gets the LineBracket used to bound the search.
 LineBracketInterpolator<? super EvaluatorType> getInterpolator()
          Gets the interpolator used to fit data points and derive an interpolated (hypothesized) minimum to try next.
 boolean isValidBracket()
          Returns true if the algorithm has found a valid bracket on a minimum, false if the algorithm needs to continue the bracketing phase.
 WeightedInputOutputPair<Vector,Double> minimizeAlongDirection(DirectionalVectorToScalarFunction function, Double functionValue, Vector gradient)
          Minimizes a Vector function along the direction given by the DirectionalVectorToScalarFunction.
 boolean sectioningStep()
          Continues the sectioning phase of the algorithm.
 
Methods inherited from interface gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizer
getInitialGuess, getTolerance, learn, setInitialGuess, setTolerance
 
Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable
clone
 
Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm
getMaxIterations, getResult, setMaxIterations
 
Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm
addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener
 
Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm
isResultValid, stop
 

Method Detail

getInterpolator

LineBracketInterpolator<? super EvaluatorType> getInterpolator()
Gets the interpolator used to fit data points and derive an interpolated (hypothesized) minimum to try next.

Returns:
LineBracketInterpolator used in this optimization

getBracket

LineBracket getBracket()
Gets the LineBracket used to bound the search.

Returns:
LineBracket used to bound the search

isValidBracket

boolean isValidBracket()
Returns true if the algorithm has found a valid bracket on a minimum, false if the algorithm needs to continue the bracketing phase.

Returns:
True if a valid bracket on a minimum has been found, false otherwise

bracketingStep

boolean bracketingStep()
Continues the bracketing phase of the algorithm, which attempts to place a bracket around a known minimum.

Returns:
True if a valid bracket has been found, false to continue the bracketing phase of the algorithm.

sectioningStep

boolean sectioningStep()
Continues the sectioning phase of the algorithm. This phase occurs after the bracketing phase and attempts to shrink the size of the bracket (using a LineBracketInterpolator) until sufficient accuracy is attained.

Returns:
True if a valid minimum has been found, false to continue sectioning the bracket.
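
Taken together, bracketingStep and sectioningStep expose the two phases to callers that drive the algorithm step by step. The sketch below illustrates only the contract of the two return values; in practice, learn() performs the equivalent of this loop internally after initializing the search:

    import gov.sandia.cognition.learning.algorithm.minimization.line.LineMinimizer;

    static void runPhases(LineMinimizer<?> minimizer)
    {
        // Phase 1: bracketing. bracketingStep() returns true once a
        // valid bracket around a minimum has been found.
        while (!minimizer.bracketingStep())
        {
            // keep expanding or moving the bracket
        }
        assert minimizer.isValidBracket();

        // Phase 2: sectioning. sectioningStep() returns true once the
        // bracket has been shrunk to the desired tolerance.
        while (!minimizer.sectioningStep())
        {
            // keep shrinking the bracket
        }
    }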

minimizeAlongDirection

WeightedInputOutputPair<Vector,Double> minimizeAlongDirection(DirectionalVectorToScalarFunction function,
                                                              Double functionValue,
                                                              Vector gradient)
Minimizes a Vector function along the direction given by the DirectionalVectorToScalarFunction. The offset in function is taken to be the initialGuess.

Parameters:
function - Defines the direction to search along, and the initial guess. The direction is scaled by the line-search solution.
functionValue - Value of function at the initial guess; may be null.
gradient - Derivative of the output with respect to the input of function at the initial guess. May be null if it is not being computed; a gradient is not required by all line-search methods, but an exception will be thrown if one is expected but not available.
Returns:
Location of the minimum-value Vector solution to function and its corresponding output value. The weight is the length of the optimal line search along "direction".
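
A rough usage sketch follows. The DirectionalVectorToScalarFunction constructor arguments shown here (function, offset, direction) are an assumption based on its role in this method, and the two null arguments rely on a derivative-free minimizer that requires neither the function value nor the gradient up front; the import paths follow the current package layout:

    import gov.sandia.cognition.evaluator.Evaluator;
    import gov.sandia.cognition.learning.algorithm.minimization.DirectionalVectorToScalarFunction;
    import gov.sandia.cognition.learning.algorithm.minimization.line.LineMinimizerDerivativeFree;
    import gov.sandia.cognition.learning.data.WeightedInputOutputPair;
    import gov.sandia.cognition.math.matrix.Vector;
    import gov.sandia.cognition.math.matrix.VectorFactory;

    // f(v) = v . v, minimized at the origin.
    Evaluator<Vector, Double> f = new Evaluator<Vector, Double>()
    {
        public Double evaluate(Vector v)
        {
            return v.dotProduct(v);
        }
    };

    Vector offset = VectorFactory.getDefault().copyValues(1.0, 1.0);      // initial guess
    Vector direction = VectorFactory.getDefault().copyValues(-1.0, -1.0); // search direction

    // Constructor argument order is an assumption; see that class's documentation.
    DirectionalVectorToScalarFunction line =
        new DirectionalVectorToScalarFunction(f, offset, direction);

    LineMinimizerDerivativeFree minimizer = new LineMinimizerDerivativeFree();
    WeightedInputOutputPair<Vector, Double> result =
        minimizer.minimizeAlongDirection(line, null, null);

    Vector xMin = result.getInput();   // minimizing point
    double fMin = result.getOutput();  // function value at that point
    double step = result.getWeight();  // step length along the search direction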