public class MaskZeroLayer extends BaseWrapperLayer
Nested classes inherited from class Layer: Layer.Builder&lt;T extends Layer.Builder&lt;T&gt;&gt;

Fields inherited from class BaseWrapperLayer: underlying

Fields inherited from class Layer: constraints, iDropout, layerName

| Constructor and Description |
|---|
| MaskZeroLayer(Layer underlying) |
| Modifier and Type | Method and Description |
|---|---|
| double | getL1ByParam(String paramName): Get the L1 coefficient for the given parameter. |
| double | getL2ByParam(String paramName): Get the L2 coefficient for the given parameter. |
| LayerMemoryReport | getMemoryReport(InputType inputType): This is a report of the estimated memory consumption for the given layer. |
| InputType | getOutputType(int layerIndex, InputType inputType): For a given type of input to this layer, what is the type of the output? |
| InputPreProcessor | getPreProcessorForInputType(InputType inputType): For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor. |
| Layer | instantiate(NeuralNetConfiguration conf, Collection&lt;TrainingListener&gt; trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams) |
| boolean | isPretrainParam(String paramName): Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs. |
| void | setNIn(InputType inputType, boolean override): Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
| String | toString() |
Methods inherited from class BaseWrapperLayer: initializer

Methods inherited from class Layer: clone, getUpdaterByParam, initializeConstraints, resetLayerDefaultConfig

public MaskZeroLayer(Layer underlying)
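The constructor wraps an existing layer configuration. As a rough illustration (not part of this page), the sketch below wraps an LSTM configuration in a MaskZeroLayer inside a MultiLayerConfiguration; the LSTM, RnnOutputLayer, package paths, and builder calls around it are assumptions about the surrounding DL4J API, and the nIn/nOut values are arbitrary.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.Layer;
import org.deeplearning4j.nn.conf.layers.RnnOutputLayer;
import org.deeplearning4j.nn.conf.layers.util.MaskZeroLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class MaskZeroLayerExample {
    public static void main(String[] args) {
        // Recurrent layer configuration to be wrapped
        Layer lstm = new LSTM.Builder().nIn(10).nOut(20).build();

        // Wrap it so that all-zero time steps in the input are treated as masked
        MaskZeroLayer maskZero = new MaskZeroLayer(lstm);

        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, maskZero)
                .layer(1, new RnnOutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .activation(Activation.SOFTMAX)
                        .nIn(20).nOut(5).build())
                .build();

        // Uses the toString() method listed in the summary above
        System.out.println(maskZero);
    }
}
```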
public Layer instantiate(NeuralNetConfiguration conf, Collection&lt;TrainingListener&gt; trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)

Specified by: instantiate in class Layer

public InputType getOutputType(int layerIndex, InputType inputType)

For a given type of input to this layer, what is the type of the output?

Overrides: getOutputType in class BaseWrapperLayer
Parameters: layerIndex - Index of the layer; inputType - Type of input for the layer

public void setNIn(InputType inputType, boolean override)

Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.

Overrides: setNIn in class BaseWrapperLayer
Parameters: inputType - Input type for this layer; override - If false: only set the nIn value if it's not already set. If true: set it regardless of whether it's already set or not.

public InputPreProcessor getPreProcessorForInputType(InputType inputType)

For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.

Overrides: getPreProcessorForInputType in class BaseWrapperLayer
Parameters: inputType - InputType to this layer

public double getL1ByParam(String paramName)

Get the L1 coefficient for the given parameter.

Overrides: getL1ByParam in class BaseWrapperLayer
Parameters: paramName - Parameter name

public double getL2ByParam(String paramName)

Get the L2 coefficient for the given parameter.

Overrides: getL2ByParam in class BaseWrapperLayer
Parameters: paramName - Parameter name

public boolean isPretrainParam(String paramName)

Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs.

Overrides: isPretrainParam in class BaseWrapperLayer
Parameters: paramName - Parameter name/key

public LayerMemoryReport getMemoryReport(InputType inputType)

This is a report of the estimated memory consumption for the given layer.

Overrides: getMemoryReport in class BaseWrapperLayer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type.
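To make the shape-related methods above concrete, here is a small sketch of calling them directly on a MaskZeroLayer configuration. In normal use these methods are invoked by the network builder rather than by user code; the wrapped LSTM, the InputType values, and the expected results noted in the comments are assumptions for illustration, not behavior documented on this page.

```java
import org.deeplearning4j.nn.conf.InputPreProcessor;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.util.MaskZeroLayer;
import org.deeplearning4j.nn.conf.memory.LayerMemoryReport;

public class MaskZeroShapeInference {
    public static void main(String[] args) {
        MaskZeroLayer maskZero = new MaskZeroLayer(
                new LSTM.Builder().nOut(20).build());

        // Recurrent input with 10 features per time step
        InputType in = InputType.recurrent(10);

        // Set nIn on the wrapped layer from the input type (override any existing value)
        maskZero.setNIn(in, true);

        // Output type should follow the wrapped LSTM: recurrent with 20 features
        InputType out = maskZero.getOutputType(0, in);

        // Typically null for recurrent input feeding a recurrent layer
        InputPreProcessor pre = maskZero.getPreProcessorForInputType(in);

        // Estimated memory consumption for this layer given the input type
        LayerMemoryReport report = maskZero.getMemoryReport(in);

        System.out.println(out + ", preprocessor: " + pre);
        System.out.println(report);
    }
}
```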