gov.sandia.cognition.learning.algorithm.minimization.line
Class LineMinimizerDerivativeFree

java.lang.Object
  extended by gov.sandia.cognition.util.AbstractCloneableSerializable
      extended by gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
          extended by gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm<ResultType>
              extended by gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner<EvaluatorType,InputOutputPair<InputType,OutputType>>
                  extended by gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer<Double,Double,EvaluatorType>
                      extended by gov.sandia.cognition.learning.algorithm.minimization.line.AbstractAnytimeLineMinimizer<Evaluator<Double,Double>>
                          extended by gov.sandia.cognition.learning.algorithm.minimization.line.LineMinimizerDerivativeFree
All Implemented Interfaces:
AnytimeAlgorithm<InputOutputPair<Double,Double>>, IterativeAlgorithm, StoppableAlgorithm, AnytimeBatchLearner<Evaluator<Double,Double>,InputOutputPair<Double,Double>>, BatchLearner<Evaluator<Double,Double>,InputOutputPair<Double,Double>>, FunctionMinimizer<Double,Double,Evaluator<Double,Double>>, LineMinimizer<Evaluator<Double,Double>>, CloneableSerializable, Serializable, Cloneable

@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},
                      title="Numerical Recipes in C, Second Edition",
                      type=Book,
                      year=1992,
                      pages={400,405},
                      url="http://www.nrbook.com/a/bookcpdf.php")
public class LineMinimizerDerivativeFree
extends AbstractAnytimeLineMinimizer<Evaluator<Double,Double>>

This is an implementation of a LineMinimizer that does not require derivative information. This implementation is much slower than its cousin that uses derivative information, LineMinimizerDerivativeBased. In particular, the bracketing phase of this class is much slower than its cousin's. However, this implementation appears to be faster than using finite differences to approximate derivative information in conjunction with the derivative-based line minimizer.

My recommendation: use LineMinimizerDerivativeBased whenever derivatives are available, but LineMinimizerDerivativeFree otherwise (even when approximating derivatives).
This implementation is loosely based on the Numerical Recipes "Brent method" routine; however, I've corrected some serious inefficiencies in that code.
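The core of the Brent-style approach is fitting a parabola through three sample points and jumping to its vertex. As a rough, self-contained sketch (the class and method names `ParabolicVertex` and `vertex` are illustrative only and not part of the Cognitive Foundry API), the standard three-point inverse parabolic interpolation looks like this:

```java
import java.util.function.DoubleUnaryOperator;

/** Illustrative only: vertex of the parabola through three sample points. */
public class ParabolicVertex {

    /**
     * Abscissa of the vertex of the parabola through (a, fa), (b, fb), (c, fc).
     * Callers must guard against a (near-)zero denominator, which signals
     * collinear points or numerical instability.
     */
    public static double vertex(double a, double fa,
                                double b, double fb,
                                double c, double fc) {
        double num = (b - a) * (b - a) * (fb - fc) - (b - c) * (b - c) * (fb - fa);
        double den = (b - a) * (fb - fc) - (b - c) * (fb - fa);
        return b - 0.5 * num / den;
    }

    public static void main(String[] args) {
        // For a true parabola the fit recovers the minimum exactly.
        DoubleUnaryOperator f = x -> (x - 2.0) * (x - 2.0) + 1.0;
        System.out.println(vertex(0.0, f.applyAsDouble(0.0),
                                  1.0, f.applyAsDouble(1.0),
                                  3.0, f.applyAsDouble(3.0))); // prints 2.0
    }
}
```

When the function really is parabolic near the minimum, this single fit converges immediately; the class's bracketing and sectioning phases handle the general case where it does not.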

Since:
2.2
Author:
Kevin R. Dixon
See Also:
Serialized Form

Field Summary
static LineBracketInterpolator<? super Evaluator<Double,Double>> DEFAULT_INTERPOLATOR
          Default interpolation algorithm, LineBracketInterpolatorBrent.
static double STEP_MAX
          Maximum step size allowed by a parabolic fit, 100.0.
 
Fields inherited from class gov.sandia.cognition.learning.algorithm.minimization.line.AbstractAnytimeLineMinimizer
DEFAULT_MAX_ITERATIONS, DEFAULT_TOLERANCE
 
Fields inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer
initialGuess, result, tolerance
 
Fields inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner
data, keepGoing
 
Fields inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm
maxIterations
 
Fields inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
DEFAULT_ITERATION, iteration
 
Constructor Summary
LineMinimizerDerivativeFree()
          Creates a new instance of LineMinimizerDerivativeFree
LineMinimizerDerivativeFree(LineBracketInterpolator<? super Evaluator<Double,Double>> interpolator)
          Creates a new instance of LineMinimizerDerivativeFree
 
Method Summary
 boolean bracketingStep()
          Here's the general idea of derivative-free minimum bracketing:

Given an initial point, a={x,f(x)}, we're looking to find a triplet of points {a,b,c} such that bx is between ax and cx.
 boolean sectioningStep()
          Continues the sectioning phase of the algorithm.
 
Methods inherited from class gov.sandia.cognition.learning.algorithm.minimization.line.AbstractAnytimeLineMinimizer
cleanupAlgorithm, getBracket, getInitialGuessFunctionValue, getInitialGuessSlope, getInterpolator, initializeAlgorithm, isValidBracket, minimizeAlongDirection, setBracket, setData, setInitialGuess, setInitialGuessFunctionValue, setInitialGuessSlope, setInterpolator, setValidBracket, step
 
Methods inherited from class gov.sandia.cognition.learning.algorithm.minimization.AbstractAnytimeFunctionMinimizer
getInitialGuess, getResult, getTolerance, setResult, setTolerance
 
Methods inherited from class gov.sandia.cognition.learning.algorithm.AbstractAnytimeBatchLearner
clone, getData, getKeepGoing, learn, setKeepGoing, stop
 
Methods inherited from class gov.sandia.cognition.algorithm.AbstractAnytimeAlgorithm
getMaxIterations, isResultValid, setMaxIterations
 
Methods inherited from class gov.sandia.cognition.algorithm.AbstractIterativeAlgorithm
addIterativeAlgorithmListener, fireAlgorithmEnded, fireAlgorithmStarted, fireStepEnded, fireStepStarted, getIteration, getListeners, removeIterativeAlgorithmListener, setIteration, setListeners
 
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 
Methods inherited from interface gov.sandia.cognition.learning.algorithm.minimization.FunctionMinimizer
getInitialGuess, getTolerance, learn, setTolerance
 
Methods inherited from interface gov.sandia.cognition.util.CloneableSerializable
clone
 
Methods inherited from interface gov.sandia.cognition.algorithm.AnytimeAlgorithm
getMaxIterations, getResult, setMaxIterations
 
Methods inherited from interface gov.sandia.cognition.algorithm.IterativeAlgorithm
addIterativeAlgorithmListener, getIteration, removeIterativeAlgorithmListener
 
Methods inherited from interface gov.sandia.cognition.algorithm.StoppableAlgorithm
isResultValid, stop
 

Field Detail

STEP_MAX

public static final double STEP_MAX
Maximum step size allowed by a parabolic fit, 100.0.

See Also:
Constant Field Values

DEFAULT_INTERPOLATOR

public static final LineBracketInterpolator<? super Evaluator<Double,Double>> DEFAULT_INTERPOLATOR
Default interpolation algorithm, LineBracketInterpolatorBrent.

Constructor Detail

LineMinimizerDerivativeFree

public LineMinimizerDerivativeFree()
Creates a new instance of LineMinimizerDerivativeFree


LineMinimizerDerivativeFree

public LineMinimizerDerivativeFree(LineBracketInterpolator<? super Evaluator<Double,Double>> interpolator)
Creates a new instance of LineMinimizerDerivativeFree

Parameters:
interpolator - Algorithm used to fit the known data points and find an interpolated minimum among them.
Method Detail

bracketingStep

@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},
                      title="Numerical Recipes in C, Second Edition",
                      type=Book,
                      year=1992,
                      pages={400,401},
                      url="http://www.nrbook.com/a/bookcpdf.php")
public boolean bracketingStep()
Here's the general idea of derivative-free minimum bracketing:

Given an initial point, a={x,f(x)}, we're looking to find a triplet of points {a,b,c} such that bx is between ax and cx. Furthermore, we need f(bx) less than both f(ax) and f(cx). This necessarily implies that there exists a minimum between ax and cx. To find these mythical points, the first step here is to fit a parabola to the existing points to determine where there should be a local minimum. A few things can happen due to this parabolic fit. The hypothesized minimum of the parabola:
- provides a proper bracket to a minimum on [b,newpoint,c] or [a,b,newpoint]
- is outside the current set [a,b,c,newpoint], so we'll look for a minimum between c and newpoint using a golden-section step
- is exhibiting signs of numerical instability, in which case we'll throw out the minimum, set it to a maximum step, and use golden-section between c and maxstep.

However, it's possible/likely that the parabola doesn't provide any information about where a minimum is. If this is the case, then we'll just take a golden-section step between b and c. If that doesn't expose a minimum, then we'll replace the oldest point, a, with newpoint and then we'll fit another parabola next iteration to the new points.

When the method returns true, we have a bracket with a minimum between ax and cx; furthermore, bx lies between ax and cx with f(bx) less than both f(ax) and f(cx).
Specified by:
bracketingStep in interface LineMinimizer<Evaluator<Double,Double>>
Specified by:
bracketingStep in class AbstractAnytimeLineMinimizer<Evaluator<Double,Double>>
Returns:
true means we have a valid bracket, false means we do not (yet) have a valid bracket
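The downhill-expansion part of the bracketing loop can be sketched in plain Java. The sketch below deliberately omits the parabolic-fit shortcut and the STEP_MAX safeguard that this class layers on top, and the class name `GoldenBracket` is hypothetical, not part of the library:

```java
import java.util.function.DoubleUnaryOperator;

/** Illustrative only: minimal derivative-free bracketing by golden-ratio steps. */
public class GoldenBracket {

    /** Default magnification applied to each successive downhill step. */
    static final double GOLD = 1.618034;

    /** Expands downhill from a and b until f turns uphill; returns {a, b, c}. */
    public static double[] bracket(DoubleUnaryOperator f, double a, double b) {
        double fa = f.applyAsDouble(a);
        double fb = f.applyAsDouble(b);
        if (fb > fa) {
            // Swap so that we are walking downhill from a toward b.
            double t = a; a = b; b = t;
            t = fa; fa = fb; fb = t;
        }
        double c = b + GOLD * (b - a);
        double fc = f.applyAsDouble(c);
        while (fc < fb) {
            // Still going downhill: slide the triplet forward and step again.
            a = b; fa = fb;
            b = c; fb = fc;
            c = b + GOLD * (b - a);
            fc = f.applyAsDouble(c);
        }
        // Now f(b) <= f(a) and f(b) <= f(c): a minimum lies between a and c.
        return new double[] { a, b, c };
    }

    public static void main(String[] args) {
        double[] r = bracket(x -> (x - 3.0) * (x - 3.0), 0.0, 1.0);
        System.out.println(r[0] + " " + r[1] + " " + r[2]);
    }
}
```

For the quadratic above, starting from 0 and 1, the loop steps through 2.618... and stops at 5.236..., yielding a bracket that contains the minimum at 3.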

sectioningStep

@PublicationReference(author={"William H. Press","Saul A. Teukolsky","William T. Vetterling","Brian P. Flannery"},
                      title="Numerical Recipes in C, Second Edition",
                      type=Book,
                      year=1992,
                      pages={404,405},
                      url="http://www.nrbook.com/a/bookcpdf.php")
public boolean sectioningStep()
Description copied from interface: LineMinimizer
Continues the sectioning phase of the algorithm. This phase occurs after the bracketing phase and attempts to shrink the size of the bracket (using a LineBracketInterpolator) until sufficient accuracy is attained.

Specified by:
sectioningStep in interface LineMinimizer<Evaluator<Double,Double>>
Specified by:
sectioningStep in class AbstractAnytimeLineMinimizer<Evaluator<Double,Double>>
Returns:
True that a valid minimum has been found, false to continue sectioning the bracket.
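The sectioning idea can be illustrated with a plain golden-section shrink of a valid bracket. The real class delegates each shrink to its LineBracketInterpolator (Brent interpolation by default) and uses a relative tolerance, so the sketch below, with its hypothetical `GoldenSection` class and absolute tolerance, is a simplification:

```java
import java.util.function.DoubleUnaryOperator;

/** Illustrative only: shrink a bracket (a, b, c) by golden sectioning. */
public class GoldenSection {

    /** Golden-section fractions; R + C = 1. */
    static final double R = 0.618034;
    static final double C = 1.0 - R;

    /** Requires b between a and c with f(b) below both f(a) and f(c). */
    public static double minimize(DoubleUnaryOperator f,
                                  double a, double b, double c, double tol) {
        double x0 = a, x3 = c, x1, x2;
        if (Math.abs(c - b) > Math.abs(b - a)) {
            // Larger segment is (b, c): place the new trial point inside it.
            x1 = b;
            x2 = b + C * (c - b);
        } else {
            x2 = b;
            x1 = b - C * (b - a);
        }
        double f1 = f.applyAsDouble(x1);
        double f2 = f.applyAsDouble(x2);
        while (Math.abs(x3 - x0) > tol) {
            if (f2 < f1) {
                // Minimum lies in (x1, x3): discard x0.
                x0 = x1; x1 = x2; f1 = f2;
                x2 = R * x2 + C * x3;
                f2 = f.applyAsDouble(x2);
            } else {
                // Minimum lies in (x0, x2): discard x3.
                x3 = x2; x2 = x1; f2 = f1;
                x1 = R * x1 + C * x0;
                f1 = f.applyAsDouble(x1);
            }
        }
        return f1 < f2 ? x1 : x2;
    }

    public static void main(String[] args) {
        System.out.println(minimize(x -> (x - 3.0) * (x - 3.0),
                                    1.0, 2.618, 5.236, 1e-8));
    }
}
```

Each iteration shrinks the bracket by a factor of about 0.618, so convergence is linear; interpolators such as LineBracketInterpolatorBrent accelerate this by mixing in parabolic fits when they are trustworthy.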