public abstract class AbstractSameDiffLayer extends Layer
Nested Class Summary

| Modifier and Type | Class and Description |
|---|---|
| static class | AbstractSameDiffLayer.Builder<T extends AbstractSameDiffLayer.Builder<T>> |
Field Summary

| Modifier and Type | Field and Description |
|---|---|
| protected IUpdater | biasUpdater |
| protected GradientNormalization | gradientNormalization |
| protected double | gradientNormalizationThreshold |
| protected List<Regularization> | regularization |
| protected List<Regularization> | regularizationBias |
| protected IUpdater | updater |
Fields inherited from class Layer: constraints, iDropout, layerName

Constructor Summary

| Modifier | Constructor and Description |
|---|---|
| protected | AbstractSameDiffLayer() |
| protected | AbstractSameDiffLayer(AbstractSameDiffLayer.Builder builder) |
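The nested Builder uses a self-referential type parameter (T extends AbstractSameDiffLayer.Builder<T>) so that setters inherited from parent builders return the concrete subclass type and method chaining is preserved. A minimal sketch of a concrete builder, as it might appear inside a hypothetical MyDenseLayer subclass (the class name and the nIn/nOut fields are illustrative, not part of this API):

```java
// Hypothetical builder fragment for a custom layer class MyDenseLayer.
public static class Builder extends AbstractSameDiffLayer.Builder<Builder> {
    private int nIn;
    private int nOut;

    public Builder nIn(int nIn) { this.nIn = nIn; return this; }
    public Builder nOut(int nOut) { this.nOut = nOut; return this; }

    @Override
    @SuppressWarnings("unchecked")
    public MyDenseLayer build() {
        // MyDenseLayer's constructor would chain to the protected
        // AbstractSameDiffLayer(Builder) constructor shown above.
        return new MyDenseLayer(this);
    }
}
```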
Method Summary

| Modifier and Type | Method and Description |
|---|---|
| void | applyGlobalConfig(NeuralNetConfiguration.Builder b) |
| void | applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig) |
| abstract void | defineParameters(SDLayerParams params) - Define the parameters for the network. |
| SDLayerParams | getLayerParams() |
| LayerMemoryReport | getMemoryReport(InputType inputType) - This is a report of the estimated memory consumption for the given layer. |
| InputPreProcessor | getPreProcessorForInputType(InputType inputType) - For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor. |
| List<Regularization> | getRegularizationByParam(String paramName) - Get the regularization types (l1/l2/weight decay) for the given parameter. |
| IUpdater | getUpdaterByParam(String paramName) - Get the updater for the given parameter. |
| abstract void | initializeParameters(Map<String,INDArray> params) - Set the initial parameter values for this layer, if required. |
| ParamInitializer | initializer() |
| protected void | initWeights(int fanIn, int fanOut, WeightInit weightInit, INDArray array) |
| abstract Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType) |
| boolean | isPretrainParam(String paramName) - Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs. |
| INDArray | onesMaskForInput(INDArray input) - This method generates an "all ones" mask array for use in the SameDiff model when none is provided. |
| char | paramReshapeOrder(String param) - Returns the memory layout ('c' or 'f' order - i.e., row/column major) of the parameters. |
| void | setNIn(InputType inputType, boolean override) - Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type. |
Methods inherited from class Layer: clone, getOutputType, initializeConstraints, resetLayerDefaultConfig, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getGradientNormalization, getGradientNormalizationThreshold, getLayerName

Field Detail

protected List<Regularization> regularization
protected List<Regularization> regularizationBias
protected IUpdater updater
protected IUpdater biasUpdater
protected GradientNormalization gradientNormalization
protected double gradientNormalizationThreshold
Constructor Detail

protected AbstractSameDiffLayer(AbstractSameDiffLayer.Builder builder)
protected AbstractSameDiffLayer()
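The abstract members detailed below (defineParameters, initializeParameters, instantiate) are the contract a concrete layer must fill in. A minimal end-to-end sketch, assuming a recent DL4J version and extending SameDiffLayer (the subclass of this class that supplies instantiate() and adds defineLayer for the forward pass); MyDenseLayer and the "W"/"b" parameter keys are hypothetical:

```java
import java.util.Map;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.samediff.SDLayerParams;
import org.deeplearning4j.nn.conf.layers.samediff.SameDiffLayer;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// Hypothetical minimal dense layer built on the SameDiff layer API.
public class MyDenseLayer extends SameDiffLayer {
    private final int nIn;
    private final int nOut;

    public MyDenseLayer(int nIn, int nOut) {
        this.nIn = nIn;
        this.nOut = nOut;
    }

    @Override
    public void defineParameters(SDLayerParams params) {
        // Register parameter names and shapes; DL4J manages the storage.
        params.addWeightParam("W", nIn, nOut);
        params.addBiasParam("b", 1, nOut);
    }

    @Override
    public void initializeParameters(Map<String, INDArray> params) {
        // The arrays are views into the network's parameter vector; assign in place.
        params.get("W").assign(Nd4j.randn(nIn, nOut).muli(Math.sqrt(2.0 / nIn)));
        params.get("b").assign(0.0);
    }

    @Override
    public SDVariable defineLayer(SameDiff sd, SDVariable layerInput,
                                  Map<String, SDVariable> paramTable, SDVariable mask) {
        // Forward pass as a SameDiff graph: out = tanh(input * W + b).
        SDVariable z = sd.mmul(layerInput, paramTable.get("W")).add(paramTable.get("b"));
        return sd.math().tanh("out", z);
    }

    @Override
    public InputType getOutputType(int layerIndex, InputType inputType) {
        return InputType.feedForward(nOut);
    }
}
```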
Method Detail

public List<Regularization> getRegularizationByParam(String paramName)
Get the regularization types (l1/l2/weight decay) for the given parameter. Description copied from class: Layer.
Specified by: getRegularizationByParam in interface TrainingConfig
Specified by: getRegularizationByParam in class Layer
Parameters: paramName - Parameter name ("W", "b" etc)

public SDLayerParams getLayerParams()
public void setNIn(InputType inputType, boolean override)
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Specified by: setNIn in class Layer

public InputPreProcessor getPreProcessorForInputType(InputType inputType)
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor. Description copied from class: Layer.
Specified by: getPreProcessorForInputType in class Layer
Parameters: inputType - InputType to this layer

public void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)
public abstract void defineParameters(SDLayerParams params)
Define the parameters for the network. Use SDLayerParams.addWeightParam(String, long...) and SDLayerParams.addBiasParam(String, long...).
Parameters: params - Object used to set parameters for this layer
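For instance, a sketch of a hypothetical override registering one weight matrix and one bias row vector (the "W"/"b" keys and the nIn/nOut fields are illustrative, from the MyDenseLayer sketch above, not part of this API):

```java
@Override
public void defineParameters(SDLayerParams params) {
    // Declares names and shapes only; DL4J allocates the backing arrays.
    params.addWeightParam("W", nIn, nOut);  // shape [nIn, nOut]
    params.addBiasParam("b", 1, nOut);      // shape [1, nOut]
}
```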
public abstract void initializeParameters(Map<String,INDArray> params)
Set the initial parameter values for this layer, if required.
Parameters: params - Parameter arrays that may be initialized
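A sketch of a matching override, reusing the hypothetical "W"/"b" keys from above. The arrays are views into the network's parameter vector, so they are filled in place; the inherited initWeights helper (detailed below) can apply a standard WeightInit scheme:

```java
@Override
public void initializeParameters(Map<String, INDArray> params) {
    params.get("b").assign(0.0);
    // Fill "W" using a standard initialization scheme via the inherited helper.
    initWeights(nIn, nOut, WeightInit.XAVIER, params.get("W"));
}
```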
public abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, DataType networkDataType)
Specified by: instantiate in class Layer

public ParamInitializer initializer()
Specified by: initializer in class Layer

public IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter. Description copied from class: Layer.
Specified by: getUpdaterByParam in interface TrainingConfig
Specified by: getUpdaterByParam in class Layer
Parameters: paramName - Parameter name

public boolean isPretrainParam(String paramName)
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or, decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc) with no pretrainable parameters will return false for all (valid) inputs. Description copied from class: Layer.
Specified by: isPretrainParam in interface TrainingConfig
Specified by: isPretrainParam in class Layer
Parameters: paramName - Parameter name/key

public LayerMemoryReport getMemoryReport(InputType inputType)
This is a report of the estimated memory consumption for the given layer. Description copied from class: Layer.
Specified by: getMemoryReport in class Layer
Parameters: inputType - Input type to the layer. Memory consumption is often a function of the input type.

public char paramReshapeOrder(String param)
Returns the memory layout ('c' or 'f' order - i.e., row/column major) of the parameters.
Parameters: param - Name of the parameter

protected void initWeights(int fanIn, int fanOut, WeightInit weightInit, INDArray array)
public void applyGlobalConfig(NeuralNetConfiguration.Builder b)
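applyGlobalConfig and applyGlobalConfigToLayer are how network-level defaults (updater, regularization, gradient normalization) reach the fields this layer left unset. A hedged usage sketch, assuming the hypothetical MyDenseLayer from the earlier example:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class CustomLayerExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3)) // global updater; becomes this layer's updater unless overridden
                .l2(1e-4)                // global L2; becomes this layer's regularization list
                .list()
                .layer(new MyDenseLayer(784, 128)) // hypothetical custom layer from the sketch above
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(128).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init(); // instantiate(...) is called per layer and parameters are initialized
    }
}
```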