public abstract class AbstractSameDiffLayer extends Layer
| Modifier and Type | Class and Description |
|---|---|
| static class | AbstractSameDiffLayer.Builder<T extends AbstractSameDiffLayer.Builder<T>> |
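
Concrete layers typically supply their own builder extending this nested class; the self-referential generic parameter (T extends Builder<T>) lets inherited setters return the subclass builder type so calls chain without casts. A minimal sketch, assuming a hypothetical MyDenseLayer with nIn/nOut settings (none of which are part of AbstractSameDiffLayer itself):

```java
// Sketch of a concrete builder for a hypothetical custom layer.
// MyDenseLayer, nIn and nOut are illustrative names, not part of this API.
public static class Builder extends AbstractSameDiffLayer.Builder<Builder> {
    private int nIn;
    private int nOut;

    public Builder nIn(int nIn) { this.nIn = nIn; return this; }    // returns Builder, not the base type
    public Builder nOut(int nOut) { this.nOut = nOut; return this; }

    @Override
    @SuppressWarnings("unchecked")
    public MyDenseLayer build() {
        return new MyDenseLayer(this);
    }
}
```
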
| Modifier and Type | Field and Description |
|---|---|
| protected org.nd4j.linalg.learning.config.IUpdater | biasUpdater |
| protected double | l1 |
| protected double | l1Bias |
| protected double | l2 |
| protected double | l2Bias |
| protected org.nd4j.linalg.learning.config.IUpdater | updater |
Fields inherited from class Layer: constraints, iDropout, layerName

| Modifier | Constructor and Description |
|---|---|
| protected | AbstractSameDiffLayer() |
| protected | AbstractSameDiffLayer(AbstractSameDiffLayer.Builder builder) |
| Modifier and Type | Method and Description |
|---|---|
| void | applyGlobalConfig(NeuralNetConfiguration.Builder b) |
| abstract void | applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig) - Apply the global configuration (weight init, activation function, etc.) to this layer |
| abstract void | defineParameters(SDLayerParams params) - Define the parameters for the network. |
| double | getL1ByParam(String paramName) - Get the L1 coefficient for the given parameter. |
| double | getL2ByParam(String paramName) - Get the L2 coefficient for the given parameter. |
| SDLayerParams | getLayerParams() |
| LayerMemoryReport | getMemoryReport(InputType inputType) - Returns a report of the estimated memory consumption for the given layer. |
| abstract InputType | getOutputType(int layerIndex, InputType inputType) - For a given type of input to this layer, what is the type of the output? |
| abstract InputPreProcessor | getPreProcessorForInputType(InputType inputType) - For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor. |
| org.nd4j.linalg.learning.config.IUpdater | getUpdaterByParam(String paramName) - Get the updater for the given parameter. |
| abstract void | initializeParameters(Map<String,org.nd4j.linalg.api.ndarray.INDArray> params) - Set the initial parameter values for this layer, if required. |
| ParamInitializer | initializer() |
| protected void | initWeights(int fanIn, int fanOut, WeightInit weightInit, org.nd4j.linalg.api.ndarray.INDArray array) |
| abstract Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams) |
| boolean | isPretrainParam(String paramName) - Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer) with no pretrainable parameters will return false for all (valid) inputs. |
| char | paramReshapeOrder(String param) - Returns the memory layout ('c' or 'f' order, i.e., row major or column major) of the parameters. |
| abstract void | setNIn(InputType inputType, boolean override) - Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type |
Methods inherited from class Layer: clone, initializeConstraints, resetLayerDefaultConfig

protected double l1
protected double l2
protected double l1Bias
protected double l2Bias
protected org.nd4j.linalg.learning.config.IUpdater updater
protected org.nd4j.linalg.learning.config.IUpdater biasUpdater
protected AbstractSameDiffLayer(AbstractSameDiffLayer.Builder builder)
protected AbstractSameDiffLayer()
public SDLayerParams getLayerParams()
public abstract InputType getOutputType(int layerIndex, InputType inputType)
For a given type of input to this layer, what is the type of the output?
Specified by: getOutputType in class Layer
Parameters:
layerIndex - Index of the layer
inputType - Type of input for the layer

public abstract void setNIn(InputType inputType, boolean override)
Set the nIn value (number of inputs, or input channels for CNNs) based on the given input type.
Specified by: setNIn in class Layer
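
For a dense-style layer, these two methods are mostly bookkeeping over feed-forward input types. A sketch inside a subclass such as the MyDenseLayer example above, assuming hypothetical nIn/nOut fields (not part of this class):

```java
@Override
public InputType getOutputType(int layerIndex, InputType inputType) {
    // Output of a dense-style layer: nOut features per example.
    return InputType.feedForward(nOut);
}

@Override
public void setNIn(InputType inputType, boolean override) {
    // Infer nIn from the incoming activations if it is unset, or if override is true.
    if (nIn <= 0 || override) {
        InputType.InputTypeFeedForward ff = (InputType.InputTypeFeedForward) inputType;
        nIn = (int) ff.getSize();
    }
}
```
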
public abstract InputPreProcessor getPreProcessorForInputType(InputType inputType)
For the given type of input to this layer, what preprocessor (if any) is required? Returns null if no preprocessor is required, otherwise returns an appropriate InputPreProcessor for this layer, such as a CnnToFeedForwardPreProcessor.
Specified by: getPreProcessorForInputType in class Layer
Parameters:
inputType - InputType to this layer
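
A sketch of a common implementation for a layer that only consumes flat (feed-forward) activations; a real layer might instead return a preprocessor such as CnnToFeedForwardPreProcessor for CNN input:

```java
@Override
public InputPreProcessor getPreProcessorForInputType(InputType inputType) {
    // Flat activations need no preprocessing in this sketch; other input
    // types are rejected rather than silently mis-shaped.
    if (inputType.getType() == InputType.Type.FF) {
        return null;
    }
    throw new IllegalStateException("Unsupported input type for this layer: " + inputType);
}
```
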
public abstract void defineParameters(SDLayerParams params)
Define the parameters for the network. Use SDLayerParams.addWeightParam(String, int...) and SDLayerParams.addBiasParam(String, int[]) to register them.
Parameters:
params - Object used to set parameters for this layer
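
Parameter shapes are registered here, but no arrays are created yet; the names chosen here are the keys later passed to initializeParameters and getUpdaterByParam. A sketch using the hypothetical nIn/nOut fields and the names "W" and "b":

```java
@Override
public void defineParameters(SDLayerParams params) {
    // Register one weight matrix and one (row-vector) bias.
    params.addWeightParam("W", nIn, nOut);
    params.addBiasParam("b", new int[]{1, nOut});
}
```
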
public abstract void initializeParameters(Map<String,org.nd4j.linalg.api.ndarray.INDArray> params)
Set the initial parameter values for this layer, if required.
Parameters:
params - Parameter arrays that may be initialized
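
A sketch pairing with the defineParameters example above; it uses this class's protected initWeights(int, int, WeightInit, INDArray) helper for the weights and zeros the bias in place:

```java
@Override
public void initializeParameters(Map<String, org.nd4j.linalg.api.ndarray.INDArray> params) {
    // The arrays in this map are views of the network's parameter array,
    // so initialization must be done in place.
    initWeights(nIn, nOut, WeightInit.XAVIER, params.get("W"));
    params.get("b").assign(0.0);
}
```
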
public abstract void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig)
Apply the global configuration (weight init, activation function, etc.) to this layer.
Parameters:
globalConfig - Global configuration
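
Typical implementations copy only the settings the user did not configure on the layer itself. A sketch, assuming a hypothetical activation field on the layer; SameDiffLayerUtils.fromIActivation is the conversion helper used in DL4J's own custom SameDiff layer examples:

```java
@Override
public void applyGlobalConfigToLayer(NeuralNetConfiguration.Builder globalConfig) {
    // Fall back to the network-wide activation function when none was
    // set on this layer directly.
    if (activation == null) {
        activation = SameDiffLayerUtils.fromIActivation(globalConfig.getActivationFn());
    }
}
```
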
public abstract Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, org.nd4j.linalg.api.ndarray.INDArray layerParamsView, boolean initializeParams)
Specified by: instantiate in class Layer
public ParamInitializer initializer()
Specified by: initializer in class Layer
public double getL1ByParam(String paramName)
Get the L1 coefficient for the given parameter.
Specified by: getL1ByParam in class Layer
Parameters:
paramName - Parameter name
public double getL2ByParam(String paramName)
Get the L2 coefficient for the given parameter.
Specified by: getL2ByParam in class Layer
Parameters:
paramName - Parameter name
public org.nd4j.linalg.learning.config.IUpdater getUpdaterByParam(String paramName)
Get the updater for the given parameter.
Specified by: getUpdaterByParam in class Layer
Parameters:
paramName - Parameter name
public boolean isPretrainParam(String paramName)
Is the specified parameter a layerwise pretraining only parameter?
Specified by: isPretrainParam in class Layer
Parameters:
paramName - Parameter name/key
public LayerMemoryReport getMemoryReport(InputType inputType)
This is a report of the estimated memory consumption for the given layer.
Specified by: getMemoryReport in class Layer
Parameters:
inputType - Input type to the layer. Memory consumption is often a function of the input type.
public char paramReshapeOrder(String param)
Returns the memory layout ('c' or 'f' order, i.e., row major or column major) of the parameters.
Parameters:
param - Name of the parameter
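
A subclass that needs a different layout can override this; a sketch returning the row-major default for every parameter:

```java
@Override
public char paramReshapeOrder(String param) {
    // 'c' = row-major (C order); 'f' = column-major (Fortran order).
    return 'c';
}
```
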
protected void initWeights(int fanIn, int fanOut, WeightInit weightInit, org.nd4j.linalg.api.ndarray.INDArray array)
public void applyGlobalConfig(NeuralNetConfiguration.Builder b)
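
Putting it together: a sketch of using a finished custom layer (the hypothetical MyDenseLayer from the examples above; in practice custom layers usually extend SameDiffLayer, a subclass of this class) in a network configuration. Network-level settings such as the updater and activation function are propagated to each layer via applyGlobalConfig(NeuralNetConfiguration.Builder):

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class CustomLayerExample {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))         // becomes this layer's updater unless overridden
                .activation(Activation.TANH)     // picked up in applyGlobalConfigToLayer
                .list()
                .layer(new MyDenseLayer.Builder().nIn(784).nOut(128).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(128).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
        System.out.println(net.summary());
    }
}
```
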