public class SameDiffOutputLayer extends AbstractLayer<SameDiffOutputLayer> implements IOutputLayer

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

| Modifier and Type | Field and Description |
|---|---|
| protected INDArray | gradients |
| protected Map<String,INDArray> | gradTable |
| static String | INPUT_KEY |
| protected INDArray | labels |
| static String | LABELS_KEY |
| protected String | outputKey |
| protected SDVariable | outputVar |
| protected INDArray | params |
| protected Map<String,INDArray> | paramTable |
| protected SameDiff | sameDiff |
Fields inherited from class AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| SameDiffOutputLayer(NeuralNetConfiguration conf, DataType dataType) |
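This internal layer class is not normally constructed by users: DL4J instantiates it from a configuration-level `org.deeplearning4j.nn.conf.layers.samediff.SameDiffOutputLayer` subclass. The sketch below shows a minimal softmax/cross-entropy output layer defined that way. It is an illustration only: the method names (`defineLayer`, `activationsVertexName`, `defineParameters`, `initializeParameters`, `getOutputType`) follow the conf-level API of recent DL4J versions and should be checked against the version you are using; the SameDiff op calls are likewise assumptions that may differ slightly between releases.

```java
import java.util.Map;

import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.samediff.SDLayerParams;
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

// Minimal custom output layer configuration (sketch, not a definitive implementation).
public class SoftmaxSameDiffOutput extends org.deeplearning4j.nn.conf.layers.samediff.SameDiffOutputLayer {

    private final int nIn;
    private final int nOut;

    public SoftmaxSameDiffOutput(int nIn, int nOut) {
        this.nIn = nIn;
        this.nOut = nOut;
    }

    // Build the forward pass + loss graph; the returned variable is the loss to minimize.
    @Override
    public SDVariable defineLayer(SameDiff sd, SDVariable layerInput, SDVariable labels,
                                  Map<String, SDVariable> paramTable) {
        SDVariable z = layerInput.mmul(paramTable.get("W")).add(paramTable.get("b"));
        SDVariable out = sd.nn().softmax("out", z);
        // Cross-entropy built from primitives: -mean(sum(labels * log(out)))
        return sd.math().log(out).mul(labels).sum(1).neg().mean();
    }

    // Name of the SDVariable holding the layer's activations (predictions)
    @Override
    public String activationsVertexName() {
        return "out";
    }

    @Override
    public void defineParameters(SDLayerParams params) {
        params.addWeightParam("W", nIn, nOut);
        params.addBiasParam("b", 1, nOut);
    }

    @Override
    public void initializeParameters(Map<String, INDArray> params) {
        params.get("W").assign(Nd4j.randn(nIn, nOut).muli(0.01));
        params.get("b").assign(0.0);
    }

    @Override
    public InputType getOutputType(int layerIndex, InputType inputType) {
        return InputType.feedForward(nOut);
    }
}
```

When such a configuration is placed in a network, DL4J builds the internal `SameDiffOutputLayer` documented on this page around it, storing the input and labels under the `INPUT_KEY` and `LABELS_KEY` placeholders listed above.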
| Modifier and Type | Method and Description |
|---|---|
| INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) Perform forward pass and return the activations array with the last set input |
| Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr) Calculate the gradient relative to the error in the next layer |
| void | clearNoiseWeightParams() |
| Layer | clone() |
| double | computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr) Compute score after labels and input have been set. |
| INDArray | computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr) Compute the score for each example individually, after labels and input have been set. |
| protected void | doInit() |
| double | f1Score(DataSet data) Sets the input and labels and returns a score for the prediction with respect to the true labels |
| double | f1Score(INDArray examples, INDArray labels) Returns the F1 score for the given examples. |
| void | fit(DataSet data) Fit the model |
| void | fit(DataSetIterator iter) Train the model based on the DataSetIterator |
| void | fit(INDArray examples, INDArray labels) Fit the model |
| void | fit(INDArray examples, int[] labels) Fit the model |
| INDArray | getGradientsViewArray() |
| INDArray | getParam(String param) Get the parameter |
| boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc.) |
| boolean | needsLabels() Returns true if labels are required for this output layer |
| int | numLabels() Returns the number of possible labels |
| long | numParams() The number of parameters for the model |
| INDArray | params() Returns the parameters of the neural network as a flattened row vector |
| Map<String,INDArray> | paramTable() The param table |
| Map<String,INDArray> | paramTable(boolean backpropParamsOnly) Table of parameters by key, for backprop. For many models (dense layers, etc.) all parameters are backprop parameters |
| List<String> | predict(DataSet dataSet) Takes in a DataSet of examples; for each row, returns a label |
| int[] | predict(INDArray examples) Takes in a list of examples; for each row, returns a label |
| void | setBackpropGradientsViewArray(INDArray gradients) Set the gradients array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
| void | setParam(String key, INDArray val) Set the parameter with a new ndarray |
| void | setParams(INDArray params) Set the parameters for this model. |
| protected void | setParams(INDArray params, char order) |
| void | setParamsViewArray(INDArray params) Set the initial parameters array as a view of the full (backprop) network parameters. NOTE: this is intended to be used internally in MultiLayerNetwork and ComputationGraph, not by users. |
| void | setParamTable(Map<String,INDArray> paramTable) Setter for the param table |
Methods inherited from class AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, calcRegularizationScore, clear, close, computeGradientAndScore, conf, feedForwardMaskArray, fit, fit, getConfig, getEpochCount, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradient, gradientAndScore, init, input, layerConf, layerId, numParams, score, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, type, update, update, updaterDivideByMinibatch

Methods inherited from class Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface IOutputLayer: getLabels, setLabels

Methods inherited from interface Layer: activate, allowInputModification, calcRegularizationScore, feedForwardMaskArray, getEpochCount, getHelper, getIndex, getInputMiniBatchSize, getIterationCount, getListeners, getMaskArray, setCacheMode, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setIterationCount, setListeners, setListeners, setMaskArray, type

Methods inherited from interface Trainable: getConfig, updaterDivideByMinibatch

Methods inherited from interface Model: addListeners, applyConstraints, batchSize, clear, close, computeGradientAndScore, conf, fit, fit, getOptimizer, gradient, gradientAndScore, init, input, numParams, score, setConf, update, update

public static final String INPUT_KEY
public static final String LABELS_KEY
protected SameDiff sameDiff
protected SDVariable outputVar
protected String outputKey
protected INDArray labels
protected INDArray params
protected INDArray gradients
public SameDiffOutputLayer(NeuralNetConfiguration conf, DataType dataType)

public boolean isPretrainLayer()
Specified by: isPretrainLayer in interface Layer

public void clearNoiseWeightParams()
Specified by: clearNoiseWeightParams in interface Layer

public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Parameters: training - training or test mode; workspaceMgr - Workspace manager
Returns: the activations, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: Layer
Parameters: epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation; workspaceMgr - Workspace manager
Returns: the gradient array, placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

public INDArray params()
Specified by: params in interface Model, params in interface Trainable; overrides params in class AbstractLayer<SameDiffOutputLayer>

public INDArray getParam(String param)
Description copied from interface: Model
Specified by: getParam in interface Model; overrides getParam in class AbstractLayer<SameDiffOutputLayer>
Parameters: param - the key of the parameter

public long numParams()
Description copied from class: AbstractLayer
Specified by: numParams in interface Model, numParams in interface Trainable; overrides numParams in class AbstractLayer<SameDiffOutputLayer>

public void setParam(String key, INDArray val)
Description copied from interface: Model
Specified by: setParam in interface Model; overrides setParam in class AbstractLayer<SameDiffOutputLayer>
Parameters: key - the key to set; val - the new ndarray

public void setParams(INDArray params)
Description copied from interface: Model
Specified by: setParams in interface Model; overrides setParams in class AbstractLayer<SameDiffOutputLayer>
Parameters: params - the parameters for the model

protected void setParams(INDArray params, char order)
Overrides setParams in class AbstractLayer<SameDiffOutputLayer>

public void setParamsViewArray(INDArray params)
Description copied from interface: Model
Specified by: setParamsViewArray in interface Model; overrides setParamsViewArray in class AbstractLayer<SameDiffOutputLayer>
Parameters: params - a 1 x nParams row vector that is a view of the larger (MLN/CG) parameters array

public INDArray getGradientsViewArray()
Specified by: getGradientsViewArray in interface Model, getGradientsViewArray in interface Trainable; overrides getGradientsViewArray in class AbstractLayer<SameDiffOutputLayer>

public void setBackpropGradientsViewArray(INDArray gradients)
Description copied from interface: Model
Specified by: setBackpropGradientsViewArray in interface Model; overrides setBackpropGradientsViewArray in class AbstractLayer<SameDiffOutputLayer>
Parameters: gradients - a 1 x nParams row vector that is a view of the larger (MLN/CG) gradients array

public void setParamTable(Map<String,INDArray> paramTable)
Description copied from interface: Model
Specified by: setParamTable in interface Model; overrides setParamTable in class AbstractLayer<SameDiffOutputLayer>

public Map<String,INDArray> paramTable()
Description copied from interface: Model
Specified by: paramTable in interface Model; overrides paramTable in class AbstractLayer<SameDiffOutputLayer>

public Map<String,INDArray> paramTable(boolean backpropParamsOnly)
Description copied from interface: Model
Specified by: paramTable in interface Model, paramTable in interface Trainable; overrides paramTable in class AbstractLayer<SameDiffOutputLayer>
Parameters: backpropParamsOnly - If true, return backprop params only. If false, return all params (equivalent to paramTable())

protected void doInit()
public boolean needsLabels()
Specified by: needsLabels in interface IOutputLayer

public double computeScore(double fullNetRegTerm, boolean training, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Specified by: computeScore in interface IOutputLayer
Parameters: fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network; training - whether the score should be calculated at train or test time (this affects things like the application of dropout)

public INDArray computeScoreForExamples(double fullNetRegTerm, LayerWorkspaceMgr workspaceMgr)
Description copied from interface: IOutputLayer
Specified by: computeScoreForExamples in interface IOutputLayer
Parameters: fullNetRegTerm - Regularization score (l1/l2/weight decay) for the entire network

public double f1Score(DataSet data)
Specified by: f1Score in interface Classifier
Parameters: data - the data to score

public double f1Score(INDArray examples, INDArray labels)
Specified by: f1Score in interface Classifier
Parameters: examples - the examples to classify (one example in each row); labels - the true labels

public int numLabels()
Specified by: numLabels in interface Classifier

public void fit(DataSetIterator iter)
Specified by: fit in interface Classifier
Parameters: iter - the iterator to train on

public int[] predict(INDArray examples)
Specified by: predict in interface Classifier
Parameters: examples - the examples to classify (one example in each row)

public List<String> predict(DataSet dataSet)
Specified by: predict in interface Classifier
Parameters: dataSet - the examples to classify

public void fit(INDArray examples, INDArray labels)
Specified by: fit in interface Classifier
Parameters: examples - the examples to classify (one example in each row); labels - the example labels (a binary outcome matrix)

public void fit(DataSet data)
Specified by: fit in interface Classifier
Parameters: data - the data to train on

public void fit(INDArray examples, int[] labels)
Specified by: fit in interface Classifier
Parameters: examples - the examples to classify (one example in each row); labels - the labels for each example (the number of labels must match the number of rows in the examples)

Copyright © 2020. All rights reserved.
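In practice, the Classifier-style methods above (fit, predict, f1Score) are usually invoked on a network containing an output layer rather than on the layer directly. The sketch below shows that workflow with the built-in OutputLayer configuration; the class name `OutputLayerUsage` and the random data are illustrative only, and a SameDiff-based output layer configuration could be substituted for the built-in one.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.DataSet;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class OutputLayerUsage {
    public static void main(String[] args) {
        // Two-layer network ending in a softmax output layer
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new DenseLayer.Builder().nIn(4).nOut(8).activation(Activation.RELU).build())
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(8).nOut(3).activation(Activation.SOFTMAX).build())
                .build();
        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // Random data purely for illustration
        INDArray features = Nd4j.rand(10, 4);
        INDArray labels = Nd4j.zeros(10, 3);
        for (int i = 0; i < 10; i++) labels.putScalar(i, i % 3, 1.0);

        net.fit(new DataSet(features, labels));   // fit(DataSet), as documented above
        int[] predicted = net.predict(features);  // predict(INDArray): one label index per row
        System.out.println(java.util.Arrays.toString(predicted));
    }
}
```

The output layer's computeScore and computeScoreForExamples methods are called internally by the network during fitting, after it has set the layer's input and labels.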