Package | Description
---|---
org.nd4j.linalg.learning |

Class | Description
---|---
AdaGrad | Vectorized learning rate, maintained per connection weight. Adapted from: http://xcorr.net/2014/01/23/adagrad-eliminating-learning-rates-in-stochastic-gradient-descent/ See also: http://cs231n.github.io/neural-networks-3/#ada
GradientUpdater | Calculates an update from a raw gradient and tracks the related state (e.g. gradient history) needed to handle gradient changes over time.
GradientUpdaterAggregator | Combines separate GradientUpdater instances from different networks, usually by averaging; typically used in distributed learning scenarios.
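The AdaGrad idea summarized above (a per-weight learning rate driven by each weight's accumulated squared-gradient history) can be sketched in plain Java. This is an illustrative, self-contained sketch of the update rule only, not the ND4J `AdaGrad` class or its API; the class and method names below are hypothetical.

```java
// Minimal AdaGrad sketch (NOT the ND4J implementation): each parameter keeps
// a running sum of squared gradients, so parameters that have received large
// or frequent gradients get a smaller effective learning rate over time.
public class AdaGradSketch {
    private final double learningRate;
    private final double epsilon = 1e-8;        // stability term, avoids division by zero
    private final double[] historicalGradient;  // per-parameter sum of squared gradients

    public AdaGradSketch(double learningRate, int numParams) {
        this.learningRate = learningRate;
        this.historicalGradient = new double[numParams];
    }

    /** Returns the per-parameter step: lr * g / (sqrt(sum of g^2) + eps). */
    public double[] getUpdate(double[] gradient) {
        double[] update = new double[gradient.length];
        for (int i = 0; i < gradient.length; i++) {
            historicalGradient[i] += gradient[i] * gradient[i];
            update[i] = learningRate * gradient[i]
                    / (Math.sqrt(historicalGradient[i]) + epsilon);
        }
        return update;
    }

    public static void main(String[] args) {
        AdaGradSketch adaGrad = new AdaGradSketch(0.1, 2);
        double[] first  = adaGrad.getUpdate(new double[]{1.0, 0.5});
        double[] second = adaGrad.getUpdate(new double[]{1.0, 0.5});
        // As history accumulates, the effective step for each weight shrinks.
        System.out.println(first[0] > second[0]);
    }
}
```

Note how repeated identical gradients still yield shrinking steps, which is the "eliminating learning rates" behavior discussed in the linked xcorr.net post.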
Copyright © 2016. All Rights Reserved.