Class | Description |
---|---|
BatchNorm | In batch training (training with more than one sample per iteration), a batch normalization layer works by normalizing the values of the input data to have a mean of 0 and a variance of 1. |
BatchNorm.Builder | The Builder to construct a BatchNorm. |
Dropout | A dropout layer benefits a network by allowing some units (neurons), and hence their respective connections, to be randomly and temporarily removed by setting their values to 0 during training with a specified probability \(p\), usually set to 0.5. |
Dropout.Builder |
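The two operations summarized above can be sketched in plain Java. This is an illustrative sketch of the underlying math, not the library's actual implementation; the class name `NormDropoutSketch`, the method names, and the epsilon constant are assumptions made for this example.

```java
import java.util.Arrays;
import java.util.Random;

public class NormDropoutSketch {

    // Batch normalization core: shift and scale the batch so the
    // result has mean 0 and variance 1 (epsilon avoids division by zero).
    static double[] batchNorm(double[] x) {
        double mean = Arrays.stream(x).average().orElse(0.0);
        double var = Arrays.stream(x)
                .map(v -> (v - mean) * (v - mean))
                .average().orElse(0.0);
        double eps = 1e-5; // assumed small stability constant
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = (x[i] - mean) / Math.sqrt(var + eps);
        }
        return out;
    }

    // Dropout core: during training, zero each unit independently
    // with probability p, temporarily removing it and its connections.
    static double[] dropout(double[] x, double p, Random rng) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = (rng.nextDouble() < p) ? 0.0 : x[i];
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};
        System.out.println(Arrays.toString(batchNorm(x)));
        System.out.println(Arrays.toString(dropout(x, 0.5, new Random())));
    }
}
```

In the real layers, batch normalization additionally learns a scale and shift applied after normalization, and dropout is disabled at inference time so all units contribute.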