Interface | Description
---|---
`GradientUpdater<T extends IUpdater>` | Calculates gradient updates and tracks the state needed to modify gradients over time
Class | Description
---|---
AdaDeltaUpdater | The AdaDelta updater. See http://www.matthewzeiler.com/pubs/googleTR2012/googleTR2012.pdf and https://arxiv.org/pdf/1212.5701v1.pdf
AdaGradUpdater | Vectorized learning rate, adapted per connection weight. Adapted from http://xcorr.net/2014/01/23/adagrad-eliminating-learning-rates-in-stochastic-gradient-descent/; see also http://cs231n.github.io/neural-networks-3/#ada
AdaMaxUpdater | The AdaMax updater, a variant of Adam.
AdamUpdater | The Adam updater.
NadamUpdater | The Nadam updater (Adam with Nesterov momentum).
NesterovsUpdater | Nesterov's momentum updater.
NoOpUpdater | NoOp updater: a gradient updater that makes no changes to the gradient.
RmsPropUpdater | The RMSProp updater: scales each update by a moving average of recent squared gradients.
SgdUpdater | SGD updater: applies a learning rate only.
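To make the `GradientUpdater` contract concrete, here is a minimal sketch of two of the updaters above, SGD and RMSProp, operating on plain `double[]` gradients. This is an illustrative assumption, not the DL4J implementation: the real updaters operate on `INDArray`s and carry additional configuration, and the class names below (`SgdStep`, `RmsPropStep`) are hypothetical.

```java
// Hypothetical, simplified analogue of the GradientUpdater idea: each updater
// transforms a raw gradient in place into the step applied to the parameters.

class SgdStep {
    private final double lr;
    SgdStep(double lr) { this.lr = lr; }
    // SGD applies a learning rate only: update = lr * gradient
    void applyUpdater(double[] gradient) {
        for (int i = 0; i < gradient.length; i++) gradient[i] *= lr;
    }
}

class RmsPropStep {
    private final double lr, decay, eps;
    private double[] cache; // moving average of squared gradients (updater state)
    RmsPropStep(double lr, double decay, double eps) {
        this.lr = lr; this.decay = decay; this.eps = eps;
    }
    // RMSProp divides each component by the root of a decaying average
    // of its recent squared gradients.
    void applyUpdater(double[] gradient) {
        if (cache == null) cache = new double[gradient.length];
        for (int i = 0; i < gradient.length; i++) {
            cache[i] = decay * cache[i] + (1 - decay) * gradient[i] * gradient[i];
            gradient[i] = lr * gradient[i] / (Math.sqrt(cache[i]) + eps);
        }
    }
}

public class UpdaterSketch {
    public static void main(String[] args) {
        double[] g1 = {1.0, -2.0};
        new SgdStep(0.1).applyUpdater(g1);
        System.out.println(g1[0] + " " + g1[1]); // lr * gradient, within rounding

        double[] g2 = {1.0};
        new RmsPropStep(0.1, 0.95, 1e-8).applyUpdater(g2);
        System.out.println(g2[0]); // roughly 0.447 on the first step
    }
}
```

The statefulness difference is the point of the interface: SGD needs no history, while RMSProp must track per-weight information across iterations, which is the "related information for gradient changes over time" that `GradientUpdater` implementations encapsulate.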
Copyright © 2017. All rights reserved.