Package edu.stanford.nlp.optimization

Interface Summary
DiffFloatFunction An interface for once-differentiable float-valued functions over float arrays.
DiffFunction An interface for once-differentiable double-valued functions over double arrays (see the implementation sketch after this table).
FloatFunction An interface for float-valued functions over float arrays.
Function An interface for double-valued functions over double arrays.
HasInitial Indicates that a function has a method for supplying an initial value.
LineSearcher The interface for one-variable function minimizers.
Minimizer<T extends Function> The interface for unconstrained function minimizers.
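
For orientation, the following is a minimal sketch of a differentiable objective that the minimizers in this package could consume. It assumes the Function/DiffFunction contract of domainDimension(), valueAt(double[]), and derivativeAt(double[]); the quadratic objective and the QuadraticDiffFunction class name are illustrative only, not part of this package.

    import edu.stanford.nlp.optimization.DiffFunction;

    // Toy once-differentiable objective f(x) = sum_i (x_i - 1)^2,
    // used only to illustrate the DiffFunction contract.
    public class QuadraticDiffFunction implements DiffFunction {

      private final int dim;

      public QuadraticDiffFunction(int dim) {
        this.dim = dim;
      }

      @Override
      public int domainDimension() {
        return dim;
      }

      @Override
      public double valueAt(double[] x) {
        double value = 0.0;
        for (double xi : x) {
          value += (xi - 1.0) * (xi - 1.0);
        }
        return value;
      }

      @Override
      public double[] derivativeAt(double[] x) {
        double[] grad = new double[dim];
        for (int i = 0; i < dim; i++) {
          grad[i] = 2.0 * (x[i] - 1.0);  // d/dx_i of (x_i - 1)^2
        }
        return grad;
      }
    }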
 

Class Summary
AbstractCachingDiffFunction An abstract base class for differentiable functions that caches the value and derivative computed at the most recently queried point.
AbstractStochasticCachingDiffFunction An abstract caching differentiable function that can also evaluate its value and gradient on sampled batches of the data, for use with the stochastic minimizers.
CGMinimizer Conjugate-gradient implementation based on the code in Numerical Recipes in C.
GoldenSectionLineSearch A class to do golden section line search.
HybridMinimizer A minimizer set up as a combination of two minimizers.
QNMinimizer Limited-Memory Quasi-Newton BFGS implementation based on the algorithms in ...
SGDMinimizer Stochastic Gradient Descent minimizer. The basic way to use the minimizer is the no-argument constructor followed by the simple minimize method (see the usage sketch after this table).
SGDToQNMinimizer Stochastic Gradient Descent to Quasi-Newton minimizer. An experimental minimizer that takes a stochastic function (one extending AbstractStochasticCachingDiffFunction), runs SGD for the first few passes, and during the final iterations builds up a series of approximate Hessian-vector products...
SMDMinimizer Stochastic Meta-Descent minimizer based on ...
SQNMinimizer Online Limited-Memory Quasi-Newton BFGS implementation based on the algorithms in ...
StochasticDiffFunctionTester A utility class for checking that a stochastic function's sampled gradients are consistent with its full gradient.
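
As a usage sketch of the constructor-then-minimize pattern noted for SGDMinimizer above: the example below uses QNMinimizer and the toy QuadraticDiffFunction from the interface sketch, and assumes QNMinimizer implements Minimizer<DiffFunction> with a no-argument constructor and a minimize(function, tolerance, initial) method; the tolerance and starting point are arbitrary.

    import edu.stanford.nlp.optimization.DiffFunction;
    import edu.stanford.nlp.optimization.Minimizer;
    import edu.stanford.nlp.optimization.QNMinimizer;

    public class MinimizeExample {
      public static void main(String[] args) {
        DiffFunction f = new QuadraticDiffFunction(3);          // toy objective from the sketch above
        Minimizer<DiffFunction> minimizer = new QNMinimizer();  // L-BFGS, no-argument constructor
        double[] initial = new double[f.domainDimension()];     // start at the origin
        double[] minimum = minimizer.minimize(f, 1e-6, initial);
        // For this toy objective the minimum should be close to (1, 1, 1).
      }
    }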
 

Enum Summary
AbstractStochasticCachingDiffFunction.SamplingMethod Specifies how data are sampled when computing stochastic estimates of the value and gradient.
StochasticCalculateMethods This enumeration organizes the selection of different methods for stochastic calculations.
 


