Build graph: some other modules point to the current module.
Upstream variables.
Variable containing the current module.
A single shape, not including the batch dimension.
The number of input maps.
The number of input maps. The default is 0, which means PReLU is used in its shared version and has only one learnable parameter.
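The shared vs. per-channel behavior described above can be sketched in plain NumPy. This is an illustrative sketch only, not the library's implementation; the name `n_output_plane` and the initial slope of 0.25 are this sketch's own assumptions.

```python
import numpy as np

def prelu(x, n_output_plane=0, init_a=0.25):
    """Sketch of PReLU: f(x) = max(0, x) + a * min(0, x).

    n_output_plane = 0 -> shared version: a single slope `a` for all inputs.
    n_output_plane = C -> one slope per input map (channel), broadcast over
                          the spatial dimensions of an input shaped (C, H, W).
    """
    if n_output_plane == 0:
        a = np.array(init_a)                          # one shared parameter
    else:
        a = np.full((n_output_plane, 1, 1), init_a)   # one parameter per map
    return np.maximum(0.0, x) + a * np.minimum(0.0, x)

x = np.array([[[-2.0, 4.0]],
              [[1.0, -8.0]]])          # shape (2, 1, 2): two input maps
print(prelu(x))                        # shared slope: [[[-0.5, 4.]], [[1., -2.]]]
print(prelu(x, n_output_plane=2))      # per-map slopes; same values at init
```

At initialization both modes give identical outputs; they diverge during training, when the per-map version learns a separate slope for each input map.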
(Deprecated since version 0.3.0) Please use the recommended saveModule(path, overWrite) instead.
Applies a parametric ReLU, where the learned parameter controls the slope of the negative part.
f(x) = max(0, x) + a * min(0, x)
Notice: please do not apply weight decay to this layer's parameters.
When you use this layer as the first layer of a model, you need to provide the argument inputShape (a single shape that does not include the batch dimension).
Remark: This layer comes from Torch and is wrapped in Keras style.
The numeric type of the parameters (e.g. weight, bias). Only Float and Double are supported at the moment.