Class BaseLayer

- java.lang.Object
  - org.deeplearning4j.nn.conf.layers.Layer
    - org.deeplearning4j.nn.conf.layers.BaseLayer

- All Implemented Interfaces:
  Serializable, Cloneable, TrainingConfig
- Direct Known Subclasses:
  FeedForwardLayer, PReLULayer

public abstract class BaseLayer extends Layer implements Serializable, Cloneable

A neural network layer.

- See Also:
  - Serialized Form
-
-
Nested Class Summary

Nested Classes
- static class BaseLayer.Builder<T extends BaseLayer.Builder<T>>
-
Field Summary

Fields
- protected IActivation activationFn
- protected double biasInit
- protected IUpdater biasUpdater
- protected double gainInit
- protected GradientNormalization gradientNormalization
- protected double gradientNormalizationThreshold
- protected IUpdater iUpdater
- protected List<Regularization> regularization
- protected List<Regularization> regularizationBias
- protected IWeightInit weightInitFn
- protected IWeightNoise weightNoise
-
Fields inherited from class org.deeplearning4j.nn.conf.layers.Layer
constraints, iDropout, layerName
-
-
Constructor Summary

Constructors
- BaseLayer(BaseLayer.Builder builder)
-
Method Summary

All Methods | Instance Methods | Concrete Methods
- BaseLayer clone()
- GradientNormalization getGradientNormalization()
- List<Regularization> getRegularizationByParam(String paramName)
  Get the regularization types (l1/l2/weight decay) for the given parameter.
- IUpdater getUpdaterByParam(String paramName)
  Get the updater for the given parameter.
- void resetLayerDefaultConfig()
  Reset the learning related configs of the layer to default.
Methods inherited from class org.deeplearning4j.nn.conf.layers.Layer
getMemoryReport, getOutputType, getPreProcessorForInputType, initializeConstraints, initializer, instantiate, isPretrainParam, setDataType, setNIn
-
Methods inherited from class java.lang.Object
equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
-
Methods inherited from interface org.deeplearning4j.nn.api.TrainingConfig
getGradientNormalizationThreshold, getLayerName
-
-
-
-
Field Detail
-
activationFn
protected IActivation activationFn
-
weightInitFn
protected IWeightInit weightInitFn
-
biasInit
protected double biasInit
-
gainInit
protected double gainInit
-
regularization
protected List<Regularization> regularization
-
regularizationBias
protected List<Regularization> regularizationBias
-
iUpdater
protected IUpdater iUpdater
-
biasUpdater
protected IUpdater biasUpdater
-
weightNoise
protected IWeightNoise weightNoise
-
gradientNormalization
protected GradientNormalization gradientNormalization
-
gradientNormalizationThreshold
protected double gradientNormalizationThreshold
-
-
Constructor Detail
-
BaseLayer
public BaseLayer(BaseLayer.Builder builder)
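
BaseLayer is abstract, so in practice this constructor is invoked indirectly by building a concrete subclass such as DenseLayer (a FeedForwardLayer, and thus a BaseLayer). The following sketch uses standard DL4J builder methods; the specific sizes and hyperparameter values are illustrative, not recommendations:

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;

public class BaseLayerConfigExample {
    public static void main(String[] args) {
        // The subclass builder populates the inherited BaseLayer fields:
        // activationFn, iUpdater, regularization, etc.
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(784)
                .nOut(256)
                .activation(Activation.RELU)   // sets BaseLayer.activationFn
                .updater(new Adam(1e-3))       // sets BaseLayer.iUpdater
                .l2(1e-4)                      // adds an L2 entry to BaseLayer.regularization
                .build();
    }
}
```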
-
-
Method Detail
-
resetLayerDefaultConfig
public void resetLayerDefaultConfig()
Reset the learning related configs of the layer to default. When instantiated with a global neural network configuration, the parameters specified in the neural network configuration will be used. For internal use with the transfer learning API; users should not have to call this method directly.

- Overrides:
  resetLayerDefaultConfig in class Layer
-
getUpdaterByParam
public IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter. Typically the same updater will be used for all parameters, but this is not necessarily the case.

- Specified by:
  getUpdaterByParam in interface TrainingConfig
- Overrides:
  getUpdaterByParam in class Layer
- Parameters:
  paramName - Parameter name
- Returns:
  IUpdater for the parameter
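
A sketch of the per-parameter lookup, assuming a DenseLayer configured with a separate bias updater ("W" and "b" are the conventional DL4J parameter keys for weights and biases):

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.learning.config.IUpdater;
import org.nd4j.linalg.learning.config.Sgd;

public class UpdaterByParamExample {
    public static void main(String[] args) {
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(10).nOut(5)
                .updater(new Adam(1e-3))      // default updater for all parameters
                .biasUpdater(new Sgd(1e-2))   // overrides the updater for bias parameters
                .build();

        // Expected to resolve to the Adam and Sgd configs respectively,
        // since biasUpdater takes precedence for "b" when it is set.
        IUpdater weightUpdater = layer.getUpdaterByParam("W");
        IUpdater biasUpdater   = layer.getUpdaterByParam("b");
    }
}
```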
-
getGradientNormalization
public GradientNormalization getGradientNormalization()
- Specified by:
  getGradientNormalization in interface TrainingConfig
- Returns:
  The gradient normalization configuration
-
getRegularizationByParam
public List<Regularization> getRegularizationByParam(String paramName)
Description copied from class: Layer

Get the regularization types (l1/l2/weight decay) for the given parameter. Different parameters may have different regularization types.

- Specified by:
  getRegularizationByParam in interface TrainingConfig
- Specified by:
  getRegularizationByParam in class Layer
- Parameters:
  paramName - Parameter name ("W", "b" etc)
- Returns:
  Regularization types (if any) for the specified parameter
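
The weight/bias split corresponds to the regularization and regularizationBias fields above. A sketch, assuming the standard DL4J builder methods for per-parameter L2 (l2Bias configures the bias regularization list separately from l2):

```java
import java.util.List;

import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.nd4j.linalg.learning.regularization.Regularization;

public class RegularizationByParamExample {
    public static void main(String[] args) {
        DenseLayer layer = new DenseLayer.Builder()
                .nIn(10).nOut(5)
                .l2(1e-4)        // L2 for weight parameters -> BaseLayer.regularization
                .l2Bias(1e-5)    // L2 for bias parameters -> BaseLayer.regularizationBias
                .build();

        // "W" should report the weight regularization, "b" the bias regularization.
        List<Regularization> weightReg = layer.getRegularizationByParam("W");
        List<Regularization> biasReg   = layer.getRegularizationByParam("b");
    }
}
```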
-
-