Interface | Description |
---|---|
DiffFloatFunction | An interface for once-differentiable float-valued functions over float arrays. |
DiffFunction | An interface for once-differentiable double-valued functions over double arrays. |
Evaluator | |
FloatFunction | An interface for float-valued functions over float arrays. |
Function | An interface for double-valued functions over double arrays. |
HasEvaluators | Indicates that a minimizer supports periodic evaluation. |
HasFeatureGrouping | Indicates that a minimizer supports grouping features for g-lasso or ae-lasso. |
HasFloatInitial | Indicates that a function has a method for supplying an initial value. |
HasInitial | Indicates that a function has a method for supplying an initial value. |
HasRegularizerParamRange | Indicates that a Function should only be regularized on a subset of its parameters. |
LineSearcher | The interface for one-variable function minimizers. |
Minimizer<T extends Function> | The interface for unconstrained function minimizers. |
SparseMinimizer<K,T extends SparseOnlineFunction<K>> | The interface for unconstrained function minimizers; like Minimizer, except with sparse parameters. |
SparseOnlineFunction<K> | An interface for functions over sparse parameters. |
StochasticMinimizer.PropertySetter<T1> | |
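Taken together, Function supplies a value and DiffFunction adds a gradient, which is what a Minimizer consumes. As a rough sketch of those contracts (the method signatures below are assumptions inferred from the descriptions above, not this package's actual source), a once-differentiable function might be implemented like this:

```java
// Hypothetical mirrors of the Function / DiffFunction contracts described above.
interface Function {
    int domainDimension();          // number of parameters
    double valueAt(double[] x);     // f(x)
}

interface DiffFunction extends Function {
    double[] derivativeAt(double[] x);  // gradient of f at x
}

// Example implementation: f(x) = sum_i (x_i - 1)^2, minimized at x = (1, ..., 1).
class SumSquares implements DiffFunction {
    private final int dim;

    SumSquares(int dim) { this.dim = dim; }

    public int domainDimension() { return dim; }

    public double valueAt(double[] x) {
        double v = 0.0;
        for (double xi : x) v += (xi - 1.0) * (xi - 1.0);
        return v;
    }

    public double[] derivativeAt(double[] x) {
        double[] g = new double[dim];
        for (int i = 0; i < dim; i++) g[i] = 2.0 * (x[i] - 1.0);
        return g;
    }
}
```

A minimizer only ever sees a function through these two methods, which is why the same implementation can be handed to any of the minimizer classes below.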
Class | Description |
---|---|
AbstractCachingDiffFloatFunction | |
AbstractCachingDiffFunction | A differentiable function that caches the last evaluation of its value and derivative. |
AbstractStochasticCachingDiffFunction | |
AbstractStochasticCachingDiffUpdateFunction | Function for stochastic calculations that updates in place (instead of maintaining and returning the derivative). |
CGMinimizer | Conjugate-gradient implementation based on the code in Numerical Recipes in C. |
CmdEvaluator | Runs a command line to evaluate a dataset (assumes the command takes input from stdin). |
GoldenSectionLineSearch | A class to do golden-section line search. |
HybridMinimizer | A hybrid minimizer set up as a combination of two minimizers. |
InefficientSGDMinimizer<T extends Function> | Stochastic gradient descent minimizer. |
MemoryEvaluator | Evaluates current memory usage. |
QNMinimizer | An implementation of L-BFGS for quasi-Newton unconstrained minimization. |
ResultStoringFloatMonitor | |
ResultStoringMonitor | |
ScaledSGDMinimizer<Q extends AbstractStochasticCachingDiffFunction> | Stochastic gradient descent to quasi-Newton minimizer. |
ScaledSGDMinimizer.Weights | |
SGDMinimizer<T extends Function> | In-place stochastic gradient descent minimizer. |
SGDToQNMinimizer | Stochastic gradient descent to quasi-Newton minimizer: an experimental minimizer which takes a stochastic function (one implementing AbstractStochasticCachingDiffFunction) and runs SGD for the first few passes. |
SGDWithAdaGradAndFOBOS<T extends DiffFunction> | Stochastic gradient descent with AdaGrad and FOBOS in batch mode. |
SMDMinimizer<T extends Function> | Stochastic meta-descent minimizer, based on "Accelerated training of conditional random fields with stochastic gradient methods" (Vishwanathan et al.). |
SparseAdaGradMinimizer<K,F extends SparseOnlineFunction<K>> | An AdaGrad optimizer that works online and uses sparse gradients; it needs a function that takes a Counter<K> as argument and returns a Counter<K> as the gradient. |
SQNMinimizer<T extends Function> | Online limited-memory quasi-Newton BFGS implementation based on the algorithms in Nocedal and Wright, Numerical Optimization. |
StochasticDiffFunctionTester | |
StochasticMinimizer<T extends Function> | Stochastic gradient descent minimizer. |
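All of the minimizer classes above drive a differentiable function toward a local minimum through some variant of gradient steps. A toy fixed-step gradient-descent minimizer, sketched under an assumed DiffFunction shape (hypothetical signatures for illustration, not the actual QNMinimizer or SGDMinimizer implementation), shows the basic loop they share:

```java
// Assumed shape of the differentiable-function contract described in this package.
interface DiffFunction {
    int domainDimension();
    double valueAt(double[] x);
    double[] derivativeAt(double[] x);
}

// Toy fixed-step gradient descent in the style of Minimizer<T extends Function>.
class ToyGradientDescent {
    private final double stepSize;
    private final int maxIterations;

    ToyGradientDescent(double stepSize, int maxIterations) {
        this.stepSize = stepSize;
        this.maxIterations = maxIterations;
    }

    double[] minimize(DiffFunction f, double tolerance, double[] initial) {
        double[] x = initial.clone();
        for (int iter = 0; iter < maxIterations; iter++) {
            double[] g = f.derivativeAt(x);
            double norm = 0.0;
            for (double gi : g) norm += gi * gi;
            if (Math.sqrt(norm) < tolerance) break;   // gradient small: converged
            for (int i = 0; i < x.length; i++) x[i] -= stepSize * g[i];
        }
        return x;
    }
}
```

The real minimizers differ mainly in how they pick the step (line search, L-BFGS curvature pairs, AdaGrad per-coordinate scaling) and in whether they sample the data stochastically, but the contract — take a function and an initial point, return an improved parameter vector — is the same.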
Enum | Description |
---|---|
AbstractStochasticCachingDiffFunction.SamplingMethod | |
SGDWithAdaGradAndFOBOS.Prior | |
StochasticCalculateMethods | This enumeration was created to organize the selection of different methods for stochastic calculations. |

Exception | Description |
---|---|
QNMinimizer.SurpriseConvergence | |