edu.stanford.nlp.optimization
Class SQNMinimizer

java.lang.Object
  extended by edu.stanford.nlp.optimization.SQNMinimizer
All Implemented Interfaces:
Minimizer<DiffFunction>

public class SQNMinimizer
extends Object
implements Minimizer<DiffFunction>

Online Limited-Memory Quasi-Newton BFGS implementation based on the algorithms in

Nocedal, Jorge, and Stephen J. Wright. 2000. Numerical Optimization. Springer. pp. 224--

and modified to the online version presented in

Schraudolph, Yu, and Günter. 2007. A Stochastic Quasi-Newton Method for Online Convex Optimization.

As of now, it requires a Stochastic differentiable function (AbstractStochasticCachingDiffFunction) as input.

The basic way to use the minimizer is with the no-argument constructor, then the simple minimize method. NOTE: the example below has not yet been updated for the stochastic version.

Minimizer<DiffFunction> qnm = new SQNMinimizer();
DiffFunction df = new SomeDiffFunction();
double tol = 1e-4;
double[] initial = getInitialGuess();
double[] minimum = qnm.minimize(df, tol, initial);

If you do not choose a value of M, it will use the maximum amount of memory available, up to M = 20. This will slow things down a bit at first due to forced garbage collection, but is probably faster overall because you are guaranteed the largest possible M.

The stochastic version was written by Alex Kleeman, but about 95% of the code was taken directly from the previous QNMinimizer, written mostly by Jenny Finkel.
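Here M is the number of stored curvature pairs used by the limited-memory BFGS update. For intuition about what that memory buys, the following is an illustrative, self-contained sketch of the standard L-BFGS two-loop recursion from Nocedal and Wright, which computes the quasi-Newton direction from the M most recent pairs. This is not code from this class; the names TwoLoopDemo, direction, dot, and axpy are invented for the example:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class TwoLoopDemo {
    // L-BFGS two-loop recursion: computes d = -H*grad from stored
    // curvature pairs (s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i),
    // oldest pair first in the deque.
    static double[] direction(double[] grad, Deque<double[][]> pairs) {
        double[] q = grad.clone();
        double[][][] hist = pairs.toArray(new double[0][][]);
        double[] alpha = new double[hist.length];
        // First loop: newest pair to oldest.
        for (int i = hist.length - 1; i >= 0; i--) {
            double[] s = hist[i][0], y = hist[i][1];
            double rho = 1.0 / dot(y, s);
            alpha[i] = rho * dot(s, q);
            axpy(q, y, -alpha[i]);
        }
        // Scale by gamma*I, a cheap initial Hessian approximation.
        if (hist.length > 0) {
            double[] s = hist[hist.length - 1][0], y = hist[hist.length - 1][1];
            double gamma = dot(s, y) / dot(y, y);
            for (int j = 0; j < q.length; j++) q[j] *= gamma;
        }
        // Second loop: oldest pair to newest.
        for (int i = 0; i < hist.length; i++) {
            double[] s = hist[i][0], y = hist[i][1];
            double rho = 1.0 / dot(y, s);
            double beta = rho * dot(y, q);
            axpy(q, s, alpha[i] - beta);
        }
        for (int j = 0; j < q.length; j++) q[j] = -q[j]; // descent direction
        return q;
    }

    static double dot(double[] a, double[] b) {
        double sum = 0.0;
        for (int j = 0; j < a.length; j++) sum += a[j] * b[j];
        return sum;
    }

    static void axpy(double[] x, double[] v, double c) { // x += c * v
        for (int j = 0; j < x.length; j++) x[j] += c * v[j];
    }

    public static void main(String[] args) {
        Deque<double[][]> pairs = new ArrayDeque<>();
        double[] g = {2.0};
        // With no stored pairs, the direction is simply -gradient.
        System.out.println(direction(g, pairs)[0]); // prints -2.0
    }
}
```

A larger M means the recursion sees more curvature pairs and so builds a better implicit Hessian approximation, which is why the class defaults to the largest M that memory permits.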

Since:
1.0
Author:
Jenny Finkel, Galen Andrew, Alex Kleeman

Field Summary
 double calcTime
           
 int outputFrequency
           
 boolean outputIterationsToFile
           
 
Constructor Summary
SQNMinimizer()
           
SQNMinimizer(FloatFunction monitor)
           
SQNMinimizer(Function monitor, int m)
           
SQNMinimizer(int m)
           
SQNMinimizer(int mem, double initialGain, int batchSize)
           
 
Method Summary
 double[] minimize(DiffFunction function, double functionTolerance, double[] initial)
          Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.
 double[] minimize(DiffFunction function, double functionTolerance, double[] initial, int maxIterations)
           
 void setM(int m)
           
 void shutUp()
           
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Field Detail

outputIterationsToFile

public boolean outputIterationsToFile

outputFrequency

public int outputFrequency

calcTime

public double calcTime
Constructor Detail

SQNMinimizer

public SQNMinimizer(int m)

SQNMinimizer

public SQNMinimizer()

SQNMinimizer

public SQNMinimizer(int mem,
                    double initialGain,
                    int batchSize)

SQNMinimizer

public SQNMinimizer(Function monitor,
                    int m)

SQNMinimizer

public SQNMinimizer(FloatFunction monitor)
Method Detail

shutUp

public void shutUp()

setM

public void setM(int m)

minimize

public double[] minimize(DiffFunction function,
                         double functionTolerance,
                         double[] initial)
Description copied from interface: Minimizer
Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.

Specified by:
minimize in interface Minimizer<DiffFunction>
Parameters:
function - the objective function
functionTolerance - a double value specifying the convergence tolerance
initial - an initial feasible point

minimize

public double[] minimize(DiffFunction function,
                         double functionTolerance,
                         double[] initial,
                         int maxIterations)
Specified by:
minimize in interface Minimizer<DiffFunction>
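As context for the minimize contract above (iterate from initial until convergence within functionTolerance), here is a toy, self-contained illustration in which plain gradient descent stands in for the quasi-Newton update. SimpleDiffFunction and ToyMinimizer are invented stand-ins for DiffFunction and this class, not the real API, and the stopping test on the gradient norm is a simplification:

```java
// Pared-down stand-in for edu.stanford.nlp.optimization.DiffFunction.
interface SimpleDiffFunction {
    double valueAt(double[] x);
    double[] derivativeAt(double[] x);
}

public class ToyMinimizer {
    // Gradient descent with a fixed step, stopping when the gradient
    // norm drops below tol (a simplified stand-in for the real
    // function-tolerance test).
    static double[] minimize(SimpleDiffFunction f, double tol, double[] initial) {
        double[] x = initial.clone();
        double step = 0.1;
        for (int iter = 0; iter < 100_000; iter++) {
            double[] g = f.derivativeAt(x);
            double normSq = 0.0;
            for (double gi : g) normSq += gi * gi;
            if (Math.sqrt(normSq) < tol) break;
            for (int j = 0; j < x.length; j++) x[j] -= step * g[j];
        }
        return x;
    }

    public static void main(String[] args) {
        // f(x) = (x0 - 3)^2 + (x1 + 1)^2, minimized at (3, -1).
        SimpleDiffFunction f = new SimpleDiffFunction() {
            public double valueAt(double[] x) {
                return (x[0] - 3) * (x[0] - 3) + (x[1] + 1) * (x[1] + 1);
            }
            public double[] derivativeAt(double[] x) {
                return new double[] { 2 * (x[0] - 3), 2 * (x[1] + 1) };
            }
        };
        double[] min = minimize(f, 1e-8, new double[] { 0.0, 0.0 });
        System.out.printf("%.4f %.4f%n", min[0], min[1]); // prints 3.0000 -1.0000
    }
}
```

The real class replaces the fixed gradient step with the stochastic quasi-Newton direction and gain schedule, but the caller-facing shape of minimize is the same.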


Stanford NLP Group