Package ai.djl.nn.norm
Contains classes that define normalizing neural network operations.
Class Summary

- BatchNorm: In batch training (training with more than one sample per iteration), a batch normalization layer normalizes the values of the input data to have a mean of 0 and a variance of 1.
- BatchNorm.Builder: The Builder to construct a BatchNorm.
- Dropout: A dropout layer benefits a network by randomly and temporarily removing some units (neurons), and hence their respective connections, by setting their values to 0 during training with a specified probability \(p\), usually 0.5.
- Dropout.Builder: The Builder to construct a Dropout.
- LayerNorm: Layer normalization normalizes the values of the input data for each individual sample to have a mean of 0 and a variance of 1.
- LayerNorm.Builder: The Builder to construct a LayerNorm.
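The normalization that BatchNorm and LayerNorm describe (shifting values to mean 0 and scaling to variance 1) can be sketched in plain Java. This is a hypothetical standalone illustration of the math only, not DJL code: the class and method names are invented for this example, and the real layers additionally learn scale and shift parameters and choose which axis to normalize over (the batch axis for BatchNorm, the per-sample feature axis for LayerNorm).

```java
import java.util.Arrays;

// Illustrative sketch (not DJL code): normalize a vector of values to
// mean 0 and variance 1, the core operation behind BatchNorm and LayerNorm.
public class NormSketch {
    // epsilon guards against division by zero when the variance is tiny
    static double[] normalize(double[] x, double epsilon) {
        double mean = Arrays.stream(x).average().orElse(0.0);
        double variance = Arrays.stream(x)
                .map(v -> (v - mean) * (v - mean))
                .average()
                .orElse(0.0);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = (x[i] - mean) / Math.sqrt(variance + epsilon);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] normed = normalize(new double[] {1.0, 2.0, 3.0, 4.0}, 1e-5);
        System.out.println(Arrays.toString(normed));
    }
}
```

The difference between the two layers is only which slice of the input this function is applied to: BatchNorm normalizes each feature across the samples of a batch, while LayerNorm normalizes the features within each single sample.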
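The Dropout behavior described above can likewise be sketched in a few lines. This is a hypothetical illustration, not DJL's implementation; it uses the common "inverted dropout" formulation, in which surviving units are scaled by \(1/(1-p)\) during training so that the expected activation is unchanged, and inputs pass through untouched at inference time.

```java
import java.util.Random;

// Illustrative sketch (not DJL code): inverted dropout. During training,
// each unit is zeroed with probability p and survivors are scaled by
// 1/(1-p); outside of training the input is returned unchanged.
public class DropoutSketch {
    static double[] dropout(double[] x, double p, boolean training, Random rng) {
        if (!training || p <= 0.0) {
            return x.clone(); // inference: identity
        }
        double scale = 1.0 / (1.0 - p);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = rng.nextDouble() < p ? 0.0 : x[i] * scale;
        }
        return out;
    }
}
```

Because dropout only fires during training, frameworks key this behavior off the training flag; forgetting to disable it at inference is a classic source of noisy predictions.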