| Package | Description |
|---|---|
| org.nd4j.autodiff.samediff | |
| org.nd4j.linalg.learning | |
| org.nd4j.linalg.learning.config | |
| Class | Description |
|---|---|
| IUpdater | IUpdater interface: used for configuration and instantiation of updaters, both built-in and custom. Note that the actual implementations for updaters are in GradientUpdater. |
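The configuration/implementation split described for IUpdater (a configuration object that instantiates a stateful GradientUpdater) can be sketched as a minimal pair of types. This is an illustrative sketch only, not ND4J's actual interfaces or signatures:

```java
/** Config side: holds serializable hyperparameters, no mutable state (illustrative). */
interface Updater {
    StepFunction instantiate();   // builds the stateful implementation
}

/** Implementation side: the stateful object that actually applies updates (illustrative). */
interface StepFunction {
    double apply(double param, double grad);
}

/** Example "built-in": plain SGD configured with a learning rate. */
class SgdConfig implements Updater {
    final double lr;
    SgdConfig(double lr) { this.lr = lr; }
    public StepFunction instantiate() {
        // The config only stores lr; the returned function does the work.
        return (param, grad) -> param - lr * grad;
    }
}
```

The point of the split is that the configuration can be serialized and compared, while per-parameter state (momentum buffers, moment estimates) lives only in the instantiated updater.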
| Class | Description |
|---|---|
| AdaDelta | The AdaDelta updater. References: https://www.matthewzeiler.com/mattzeiler/adadelta.pdf, https://arxiv.org/pdf/1212.5701v1.pdf |
| AdaGrad | Vectorized learning rate, applied per connection weight. Adapted from: http://xcorr.net/2014/01/23/adagrad-eliminating-learning-rates-in-stochastic-gradient-descent/; see also http://cs231n.github.io/neural-networks-3/#ada |
| Adam | The Adam updater. |
| AdaMax | The AdaMax updater, a variant of Adam. |
| AMSGrad | The AMSGrad updater. Reference: On the Convergence of Adam and Beyond - https://openreview.net/forum?id=ryQu7f-RZ |
| IUpdater | IUpdater interface: used for configuration and instantiation of updaters, both built-in and custom. Note that the actual implementations for updaters are in GradientUpdater. |
| Nadam | Setup and DynamicCustomOpsBuilder for the Nadam updater. |
| Nesterovs | Nesterov's momentum. |
| NoOp | NoOp updater: a gradient updater that makes no changes to the gradient. |
| RmsProp | The RMSProp updater. |
| Sgd | SGD updater: applies a learning rate only. |
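To make the Adam entry above concrete, the update rule Adam implements (first- and second-moment estimates with bias correction) can be sketched in plain Java for a single scalar parameter. This is an illustrative sketch, not ND4J's implementation; the class name, method shape, and default hyperparameters here are assumptions:

```java
public class AdamSketch {
    // Commonly used Adam defaults (illustrative)
    static final double LR = 0.001, BETA1 = 0.9, BETA2 = 0.999, EPS = 1e-8;

    /**
     * One Adam step on a single parameter.
     * m and v are the running first/second moment estimates; t is the 1-based step count.
     * Returns {updatedParam, m, v}.
     */
    static double[] step(double param, double grad, double m, double v, int t) {
        m = BETA1 * m + (1 - BETA1) * grad;            // first-moment (mean) estimate
        v = BETA2 * v + (1 - BETA2) * grad * grad;     // second-moment (uncentered variance) estimate
        double mHat = m / (1 - Math.pow(BETA1, t));    // bias correction for zero-initialized m
        double vHat = v / (1 - Math.pow(BETA2, t));    // bias correction for zero-initialized v
        param -= LR * mHat / (Math.sqrt(vHat) + EPS);
        return new double[]{param, m, v};
    }

    public static void main(String[] args) {
        double param = 1.0, m = 0, v = 0;
        for (int t = 1; t <= 3; t++) {
            double grad = 2 * param;                   // gradient of f(x) = x^2
            double[] s = step(param, grad, m, v, t);
            param = s[0]; m = s[1]; v = s[2];
        }
        System.out.println(param);                     // param has moved toward the minimum at 0
    }
}
```

The bias-correction terms are what distinguish Adam from plain RMSProp-with-momentum: without them, the zero-initialized moment estimates would bias early steps toward zero.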
| Class | Description |
|---|---|
| AdaDelta | The AdaDelta updater. References: https://www.matthewzeiler.com/mattzeiler/adadelta.pdf, https://arxiv.org/pdf/1212.5701v1.pdf |
| AdaGrad | Vectorized learning rate, applied per connection weight. Adapted from: http://xcorr.net/2014/01/23/adagrad-eliminating-learning-rates-in-stochastic-gradient-descent/; see also http://cs231n.github.io/neural-networks-3/#ada |
| Adam | The Adam updater. |
| AMSGrad | The AMSGrad updater. Reference: On the Convergence of Adam and Beyond - https://openreview.net/forum?id=ryQu7f-RZ |
| IUpdater | IUpdater interface: used for configuration and instantiation of updaters, both built-in and custom. Note that the actual implementations for updaters are in GradientUpdater. |
| Nadam | Setup and DynamicCustomOpsBuilder for the Nadam updater. |
| Nesterovs | Nesterov's momentum. |
| NoOp | NoOp updater: a gradient updater that makes no changes to the gradient. |
| RmsProp | The RMSProp updater. |
| Sgd | SGD updater: applies a learning rate only. |
Copyright © 2020. All rights reserved.