A normalization method, softmax ( pj = exp(yj) / Σi exp(yi) ) or simplemax ( pj = yj / Σi yi ), can be applied
to the computed activation values. The attribute normalizationMethod is defined for the network with the default
value none ( pj = yj ), but can also be specified for each layer. Softmax normalization is most often applied to
the output layer of a classification network to obtain the probabilities of all answers. Simplemax normalization
is often applied to a hidden layer consisting of elements with a radial basis activation function to obtain a
"normalized RBF" activation.
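The three methods above can be sketched as follows. This is a minimal illustration of the formulas, not the library's own implementation; the object and method names are hypothetical.

```scala
// Hypothetical helpers illustrating the normalization formulas:
//   softmax:   pj = exp(yj) / Σi exp(yi)
//   simplemax: pj = yj / Σi yi
//   none:      pj = yj
object NormalizationSketch {
  def softmax(ys: Seq[Double]): Seq[Double] = {
    val exps = ys.map(math.exp) // exponentiate each activation
    val sum  = exps.sum         // normalizing constant
    exps.map(_ / sum)
  }

  def simplemax(ys: Seq[Double]): Seq[Double] = {
    val sum = ys.sum            // divide each value by the plain sum
    ys.map(_ / sum)
  }

  def none(ys: Seq[Double]): Seq[Double] = ys // identity
}
```

Both softmax and simplemax produce values that sum to 1; softmax additionally guarantees positive outputs regardless of the sign of the activations, which is why it is the usual choice for class probabilities.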
Linear Supertypes
Enumeration, Serializable, Serializable, AnyRef, Any