Package ai.djl.nn.norm
Contains classes that define normalizing neural network operations.
Class Summary

BatchNorm
    In batch training (training with more than one sample per iteration), a batch normalization layer works by normalizing the values of the input data to have a mean of 0 and a variance of 1.

BatchNorm.BaseBuilder<T extends BatchNorm.BaseBuilder<T>>

BatchNorm.Builder
    The Builder to construct a BatchNorm.

Dropout
    A dropout layer benefits a network by allowing some units (neurons), and hence their respective connections, to be randomly and temporarily removed by setting their values to 0 during training only, with a specified probability \(p\), usually set to 0.5.

Dropout.Builder
    The Builder to construct a Dropout.

GhostBatchNorm
    GhostBatchNorm is similar to BatchNorm except that it splits a batch into smaller sub-batches, also known as ghost batches, normalizes each of them individually to have a mean of 0 and a variance of 1, and finally concatenates them back into a single batch.

GhostBatchNorm.Builder
    The Builder to construct a GhostBatchNorm.

LayerNorm
    Layer normalization works by normalizing the values of the input data for each input sample to have a mean of 0 and a variance of 1.

LayerNorm.Builder
    The Builder to construct a LayerNorm.
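For orientation, here is a minimal sketch of how these blocks are typically assembled via their builders. It assumes the builder entry points BatchNorm.builder(), Dropout.builder(), and LayerNorm.builder(), the optRate option on Dropout.Builder, and default options elsewhere; the individual class pages list the full set of options.

    import ai.djl.nn.Activation;
    import ai.djl.nn.SequentialBlock;
    import ai.djl.nn.core.Linear;
    import ai.djl.nn.norm.BatchNorm;
    import ai.djl.nn.norm.Dropout;
    import ai.djl.nn.norm.LayerNorm;

    public class NormDemo {
        public static void main(String[] args) {
            // A small MLP that interleaves the normalization blocks from this package.
            SequentialBlock net =
                    new SequentialBlock()
                            .add(Linear.builder().setUnits(128).build())
                            // Normalize activations across the batch to mean 0, variance 1.
                            .add(BatchNorm.builder().build())
                            .add(Activation.reluBlock())
                            // Randomly zero out units with probability 0.5 during training.
                            .add(Dropout.builder().optRate(0.5f).build())
                            // Normalize each sample's activations independently.
                            .add(LayerNorm.builder().build())
                            .add(Linear.builder().setUnits(10).build());
            System.out.println(net);
        }
    }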
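GhostBatchNorm additionally needs the size of the ghost sub-batches. A sketch, assuming GhostBatchNorm.Builder exposes an optVirtualBatchSize option (verify against the GhostBatchNorm.Builder page):

    import ai.djl.nn.Block;
    import ai.djl.nn.norm.GhostBatchNorm;

    // Normalize each batch in virtual ("ghost") sub-batches of 16 samples,
    // then concatenate the normalized sub-batches back into one batch.
    Block ghostNorm =
            GhostBatchNorm.builder()
                    .optVirtualBatchSize(16) // assumed option name
                    .build();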