edu.stanford.nlp.optimization
Class SMDMinimizer
java.lang.Object
edu.stanford.nlp.optimization.SMDMinimizer
- All Implemented Interfaces:
- Minimizer

public class SMDMinimizer
extends Object
implements Minimizer
Stochastic Meta-Descent (SMD) minimizer based on:
S. V. N. Vishwanathan, Nicol N. Schraudolph, Mark W. Schmidt, and Kevin P. Murphy.
"Accelerated Training of Conditional Random Fields with Stochastic Gradient Methods."
In Proceedings of the 23rd International Conference on Machine Learning (ICML '06),
June 2006. ACM Press.
The basic way to use the minimizer is with the no-argument constructor, then
the simple minimize method:

Minimizer smd = new SMDMinimizer();
DiffFunction df = new SomeDiffFunction();
double tol = 1e-4;
double[] initial = getInitialGuess();
int maxIterations = someSafeNumber;
double[] minimum = smd.minimize(df, tol, initial, maxIterations);
Constructing with the no-argument constructor uses the default values:
batchSize = 15;
initialGain = 0.1;
useAlgorithmicDifferentiation = true;
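
For non-default settings, the three-argument constructor takes the gain, batch
size, and derivative calculation method directly. A minimal sketch; the
StochasticCalculateMethods constant named below is an assumption, so check the
enum for the values available in your version:

// Sketch: explicit construction with non-default settings.
// StochasticCalculateMethods.AlgorithmicDifferentiation is assumed to be
// one of the enum's constants.
double initialGain = 0.05;  // default is 0.1
int batchSize = 30;         // default is 15
Minimizer smd2 = new SMDMinimizer(initialGain, batchSize,
    StochasticCalculateMethods.AlgorithmicDifferentiation);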
- Since:
- 1.0
- Author:
- Alex Kleeman
Method Summary

static void      main(String[] args)

double[]         minimize(Function function, double functionTolerance, double[] initial)
                 Attempts to find an unconstrained minimum of the objective function starting at initial, within functionTolerance.

double[]         minimize(Function function, double functionTolerance, double[] initial, int maxIterations)

static long[]    primeFactors(long N)

void             setBatchSize(int batchSize)

void             shutUp()

boolean          testObjectiveFunction(Function function, double[] x, double functionTolerance)
                 Tests the accuracy of stochastic objective functions.

Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Field Detail
outputIterationsToFile
public boolean outputIterationsToFile
outputFrequency
public int outputFrequency
initialGain
public double initialGain
restrictSteps
public boolean restrictSteps
useAlgorithmicDifferentiation
public boolean useAlgorithmicDifferentiation
mu
public double mu
lam
public double lam
cPosDef
public double cPosDef
testObjFunc
public boolean testObjFunc
printMinMax
public boolean printMinMax
finalVal
public double finalVal
useGaussNewton
public boolean useGaussNewton
Constructor Detail

SMDMinimizer
public SMDMinimizer()

SMDMinimizer
public SMDMinimizer(double initialSMDGain,
                    int batchSize,
                    StochasticCalculateMethods method)
Method Detail

shutUp
public void shutUp()
setBatchSize
public void setBatchSize(int batchSize)
minimize
public double[] minimize(Function function,
double functionTolerance,
double[] initial)
- Description copied from interface: Minimizer
Attempts to find an unconstrained minimum of the objective function
starting at initial, within functionTolerance.
- Specified by:
minimize in interface Minimizer
- Parameters:
function - the objective function
functionTolerance - a double value
initial - an initial feasible point
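
For example, reusing the placeholder names from the class-level example
above, a call without an explicit iteration cap might look like:

// Sketch: the three-argument minimize runs until functionTolerance is met,
// with no fixed limit on the number of iterations.
double[] minimum = smd.minimize(df, 1e-4, initial);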
minimize
public double[] minimize(Function function,
double functionTolerance,
double[] initial,
int maxIterations)
- Specified by:
minimize in interface Minimizer
primeFactors
public static long[] primeFactors(long N)
testObjectiveFunction
public boolean testObjectiveFunction(Function function,
double[] x,
double functionTolerance)
This function was written to provide a test for the accuracy of stochastic
objective functions. The test checks the following properties:
1) The sum of the values over each batch equals the full value
2) The sum of the gradients over each batch equals the full gradient
3) The gradient calculated using Incorporated Finite Difference is never more
than functionTolerance from the gradient using External Finite Difference
4) The Hessian-vector product likewise does not vary between Incorporated and
External Finite Difference
- Parameters:
function - the function to test
x - the point to use for testing (the vector v for the Hessian-vector product is generated randomly)
functionTolerance - the tolerance
- Returns:
true if the function passes all of the above checks
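
As a sketch, reusing the placeholder names from the class-level example, the
check might be run before minimizing:

// Sketch: verify a stochastic objective's batch consistency before use.
SMDMinimizer smd = new SMDMinimizer();
DiffFunction df = new SomeDiffFunction();
double[] initial = getInitialGuess();
if (!smd.testObjectiveFunction(df, initial, 1e-4)) {
  throw new IllegalStateException("stochastic objective failed consistency checks");
}
double[] minimum = smd.minimize(df, 1e-4, initial);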
main
public static void main(String[] args)