public class SameDiffLayer extends AbstractLayer&lt;AbstractSameDiffLayer&gt;

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected ExternalErrorsFunction | fn |
| protected INDArray | gradients |
| protected Map&lt;String,INDArray&gt; | gradTable |
| static String | INPUT_KEY |
| static String | MASK_KEY |
| protected String | outputKey |
| protected SDVariable | outputVar |
| protected INDArray | params |
| protected Map&lt;String,INDArray&gt; | paramTable |
| protected SameDiff | sameDiff |

Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| SameDiffLayer(NeuralNetConfiguration conf, DataType dataType) |
| Modifier and Type | Method and Description |
|---|---|
| INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr)<br>Perform forward pass and return the activations array with the last set input |
| Pair&lt;Gradient,INDArray&gt; | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)<br>Calculate the gradient relative to the error in the next layer |
| void | clearNoiseWeightParams() |
| Layer | clone() |
| protected void | doInit() |
| Pair&lt;INDArray,MaskState&gt; | feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)<br>Feed forward the input mask array, setting in the layer as appropriate |
| INDArray | getGradientsViewArray() |
| INDArray | getParam(String param)<br>Get the parameter |
| boolean | isPretrainLayer()<br>Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc) |
| long | numParams()<br>The number of parameters for the model |
| INDArray | params()<br>Returns the parameters of the neural network as a flattened row vector |
| Map&lt;String,INDArray&gt; | paramTable()<br>The param table |
| Map&lt;String,INDArray&gt; | paramTable(boolean backpropParamsOnly)<br>Table of parameters by key, for backprop; for many models (dense layers, etc) all parameters are backprop parameters |
| void | setBackpropGradientsViewArray(INDArray gradients)<br>Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users |
| void | setParam(String key, INDArray val)<br>Set the parameter with a new ndarray |
| void | setParams(INDArray params)<br>Set the parameters for this model |
| protected void | setParams(INDArray params, char order) |
| void | setParamsViewArray(INDArray params)<br>Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users |
| void | setParamTable(Map&lt;String,INDArray&gt; paramTable)<br>Setter for the param table |
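The epsilon argument to backpropGradient above is dC/da, the cost gradient with respect to this layer's activations. Multiplying it elementwise by the activation derivative da/dz yields dC/dz, which is what gets propagated further back. A simplified plain-Java sketch of that chain-rule step (not DL4J code; the class name and use of sigmoid here are illustrative assumptions):

```java
// Illustrative chain-rule step behind backpropGradient's epsilon parameter.
// epsilon = dC/da arrives from the layer above; elementwise multiplication by
// sigma'(z) produces dC/dz. Sigmoid is chosen here purely as an example.
public class EpsilonChainRule {
    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    // dC/dz = dC/da * da/dz, with a = sigmoid(z) so da/dz = a * (1 - a)
    static double[] dCdz(double[] epsilon, double[] z) {
        double[] out = new double[z.length];
        for (int i = 0; i < z.length; i++) {
            double a = sigmoid(z[i]);
            out[i] = epsilon[i] * a * (1.0 - a);
        }
        return out;
    }

    public static void main(String[] args) {
        double[] epsilon = {1.0, -0.5};   // dC/da from the layer above
        double[] z = {0.0, 0.0};          // pre-activations
        double[] g = dCdz(epsilon, z);
        System.out.println(g[0] + " " + g[1]); // at z = 0, sigma'(z) = 0.25
    }
}
```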
Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, calcRegularizationScore, clear, close, computeGradientAndScore, conf, fit, fit, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, input, layerConf, layerId, numParams, score, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, update, update, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface Layer: getIterationCount, setIterationCount

Field Detail

public static final String INPUT_KEY

public static final String MASK_KEY

protected SameDiff sameDiff

protected SDVariable outputVar

protected ExternalErrorsFunction fn

protected String outputKey

protected INDArray params

protected INDArray gradients

Constructor Detail

public SameDiffLayer(NeuralNetConfiguration conf, DataType dataType)
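Note that the sameDiff, outputVar, and paramTable fields above are not populated by the constructor: the layer builds its SameDiff graph lazily via doInit() on first use. A simplified plain-Java sketch of that lazy-initialization pattern (not DL4J code; the class name and parameter shapes are illustrative assumptions):

```java
// Illustrative lazy-initialization pattern: the internal state (standing in
// for sameDiff / outputVar / paramTable) is built once, on first access,
// rather than in the constructor.
import java.util.LinkedHashMap;
import java.util.Map;

public class LazyGraphLayer {
    // null until first use, mirroring SameDiffLayer's uninitialized fields
    private Map<String, double[]> paramTable;

    private void doInit() {
        // Stand-in for building the SameDiff graph and parameter arrays
        paramTable = new LinkedHashMap<>();
        paramTable.put("W", new double[]{0.1, 0.2}); // hypothetical weight
        paramTable.put("b", new double[]{0.0});      // hypothetical bias
    }

    public Map<String, double[]> paramTable() {
        if (paramTable == null) {
            doInit(); // build on first access
        }
        return paramTable;
    }

    public static void main(String[] args) {
        LazyGraphLayer layer = new LazyGraphLayer();
        System.out.println(layer.paramTable().keySet()); // graph built here
    }
}
```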
Method Detail

public boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc).
Specified by: isPretrainLayer in interface Layer

public void clearNoiseWeightParams()
Specified by: clearNoiseWeightParams in interface Layer

public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Perform forward pass and return the activations array with the last set input.
Specified by: activate in interface Layer
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, allocated in the ArrayType.ACTIVATIONS workspace via the workspace manager

public Pair&lt;Gradient,INDArray&gt; backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: the gradient and the epsilon for the layer below, allocated in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public INDArray params()
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model
Specified by: params in interface Trainable
Overrides: params in class AbstractLayer&lt;AbstractSameDiffLayer&gt;

public INDArray getParam(String param)
Get the parameter.
Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
param - the key of the parameter

public long numParams()
The number of parameters for the model.
Specified by: numParams in interface Model
Specified by: numParams in interface Trainable
Overrides: numParams in class AbstractLayer&lt;AbstractSameDiffLayer&gt;

public void setParam(String key, INDArray val)
Set the parameter with a new ndarray.
Specified by: setParam in interface Model
Overrides: setParam in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
key - the key to set
val - the new ndarray

public void setParams(INDArray params)
Set the parameters for this model.
Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
params - the parameters for the model

protected void setParams(INDArray params, char order)
Overrides: setParams in class AbstractLayer&lt;AbstractSameDiffLayer&gt;

public void setParamsViewArray(INDArray params)
Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setParamsViewArray in interface Model
Overrides: setParamsViewArray in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
params - a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array

public INDArray getGradientsViewArray()
Specified by: getGradientsViewArray in interface Model
Specified by: getGradientsViewArray in interface Trainable
Overrides: getGradientsViewArray in class AbstractLayer&lt;AbstractSameDiffLayer&gt;

public void setBackpropGradientsViewArray(INDArray gradients)
Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users.
Specified by: setBackpropGradientsViewArray in interface Model
Overrides: setBackpropGradientsViewArray in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
gradients - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array

public void setParamTable(Map&lt;String,INDArray&gt; paramTable)
Setter for the param table.
Specified by: setParamTable in interface Model
Overrides: setParamTable in class AbstractLayer&lt;AbstractSameDiffLayer&gt;

public Map&lt;String,INDArray&gt; paramTable()
The param table.
Specified by: paramTable in interface Model
Overrides: paramTable in class AbstractLayer&lt;AbstractSameDiffLayer&gt;

public Map&lt;String,INDArray&gt; paramTable(boolean backpropParamsOnly)
Table of parameters by key, for backprop. For many models (dense layers, etc) all parameters are backprop parameters.
Specified by: paramTable in interface Model
Specified by: paramTable in interface Trainable
Overrides: paramTable in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramTable())

protected void doInit()
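Several of the methods above (params(), setParamsViewArray(), getGradientsViewArray()) share one idea: every named parameter is a slice of a single flattened 1 x nParams buffer owned by the enclosing MultiLayerNetwork or ComputationGraph, so writing through the flat vector updates each named parameter. A simplified plain-Java sketch of that view layout (not DL4J code; the class name and shapes are illustrative assumptions):

```java
// Illustrative flattened-parameter-view layout: named parameters are
// contiguous slices of one shared flat buffer, addressed by offset/length,
// so setParams on the flat vector updates every named parameter at once.
import java.util.LinkedHashMap;
import java.util.Map;

public class ParamView {
    final double[] flatParams;                               // the 1 x nParams buffer
    final Map<String, int[]> index = new LinkedHashMap<>();  // name -> {offset, length}

    ParamView(String[] names, int[] lengths) {
        int total = 0;
        for (int i = 0; i < names.length; i++) {
            index.put(names[i], new int[]{total, lengths[i]});
            total += lengths[i];
        }
        flatParams = new double[total];
    }

    // Analogue of getParam(String): read element i of the named slice
    double get(String key, int i) {
        int[] span = index.get(key);
        return flatParams[span[0] + i];
    }

    // Analogue of setParams(INDArray): write through the flat vector
    void setParams(double[] values) {
        System.arraycopy(values, 0, flatParams, 0, flatParams.length);
    }

    public static void main(String[] args) {
        // Hypothetical 2x2 weight (flattened to length 4) plus length-2 bias
        ParamView pv = new ParamView(new String[]{"W", "b"}, new int[]{4, 2});
        pv.setParams(new double[]{1, 2, 3, 4, 5, 6});
        System.out.println(pv.get("b", 0)); // bias slice starts at offset 4
    }
}
```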
public Pair&lt;INDArray,MaskState&gt; feedForwardMaskArray(INDArray maskArray, MaskState currentMaskState, int minibatchSize)
Feed forward the input mask array, setting in the layer as appropriate.
Specified by: feedForwardMaskArray in interface Layer
Overrides: feedForwardMaskArray in class AbstractLayer&lt;AbstractSameDiffLayer&gt;
Parameters:
maskArray - Mask array to set
currentMaskState - Current state of the mask - see MaskState
minibatchSize - Current minibatch size. Needs to be known as it cannot always be inferred from the activations array due to reshaping (such as a DenseLayer within a recurrent neural network)

Copyright © 2020. All rights reserved.