public abstract class BasePretrainNetwork<LayerConfT extends BasePretrainNetwork>
extends BaseLayer<LayerConfT>

Nested classes/interfaces inherited from interface Layer: Layer.TrainingMode, Layer.Type

**Field Summary**

| Modifier and Type | Field and Description |
|---|---|
| protected Collection<TrainingListener> | trainingListeners |

Fields inherited from class BaseLayer: gradient, gradientsFlattened, gradientViews, optimizer, params, paramsFlattened, score, solver

Fields inherited from class AbstractLayer: cacheMode, conf, dropoutApplied, dropoutMask, index, input, iterationListeners, maskArray, maskState, preOutput

**Constructor Summary**

| Constructor and Description |
|---|
| BasePretrainNetwork(NeuralNetConfiguration conf) |
| BasePretrainNetwork(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input) |
**Method Summary**

| Modifier and Type | Method and Description |
|---|---|
| Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> | backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon) Calculate the gradient relative to the error in the next layer. |
| double | calcL1(boolean backpropParamsOnly) Calculate the L1 regularization term; 0.0 if regularization is not used. |
| double | calcL2(boolean backpropParamsOnly) Calculate the L2 regularization term; 0.0 if regularization is not used. |
| protected Gradient | createGradient(org.nd4j.linalg.api.ndarray.INDArray wGradient, org.nd4j.linalg.api.ndarray.INDArray vBiasGradient, org.nd4j.linalg.api.ndarray.INDArray hBiasGradient) |
| org.nd4j.linalg.api.ndarray.INDArray | getCorruptedInput(org.nd4j.linalg.api.ndarray.INDArray x, double corruptionLevel) Corrupts the given input by doing a binomial sampling given the corruption level. |
| int | numParams() The number of parameters for the model, for backprop (i.e., excluding visible bias). |
| int | numParams(boolean backwards) The number of parameters for the model. |
| org.nd4j.linalg.api.ndarray.INDArray | params() Returns the parameters of the neural network as a flattened row vector. |
| Map<String,org.nd4j.linalg.api.ndarray.INDArray> | paramTable(boolean backpropParamsOnly) Table of parameters by key, for backprop. For many models (dense layers, etc.), all parameters are backprop parameters. |
| abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | sampleHiddenGivenVisible(org.nd4j.linalg.api.ndarray.INDArray v) Sample the hidden distribution given the visible. |
| abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> | sampleVisibleGivenHidden(org.nd4j.linalg.api.ndarray.INDArray h) Sample the visible distribution given the hidden. |
| void | setListeners(Collection<IterationListener> listeners) Set the iteration listeners for this layer. |
| void | setListeners(IterationListener... listeners) Set the iteration listeners for this layer. |
| void | setParams(org.nd4j.linalg.api.ndarray.INDArray params) Set the parameters for this model. |
| protected void | setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z) |
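As the summary notes, `numParams()` counts only the backprop parameters (weights plus hidden bias), while the visible bias exists only for pretraining. A minimal plain-Java sketch of that counting logic, assuming a layer shaped like a typical pretrain layer (weight matrix nIn x nOut, hidden bias of length nOut, visible bias of length nIn); the class and the `backpropOnly` flag here are illustrative, not the actual DL4J implementation:

```java
public class ParamCountSketch {
    /**
     * Hypothetical parameter counting for a pretrain layer holding a
     * weight matrix (nIn x nOut), a hidden bias (nOut), and a visible
     * bias (nIn). The backprop-only count excludes the visible bias,
     * which is used only during pretraining.
     */
    static int numParams(int nIn, int nOut, boolean backpropOnly) {
        int count = nIn * nOut + nOut;   // weights + hidden bias
        if (!backpropOnly) {
            count += nIn;                // visible bias (pretrain only)
        }
        return count;
    }

    public static void main(String[] args) {
        // A 3-visible, 2-hidden layer: 3*2 + 2 = 8 backprop params,
        // plus 3 visible-bias params when counting everything.
        System.out.println(numParams(3, 2, true));
        System.out.println(numParams(3, 2, false));
    }
}
```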
Methods inherited from class BaseLayer: accumulateScore, activate, activate, activate, activationMean, applyLearningRateScoreDecay, calcGradient, clone, computeGradientAndScore, error, fit, fit, getGradientsViewArray, getOptimizer, getParam, gradient, initParams, iterate, layerConf, merge, paramTable, preOutput, preOutput, score, setBackpropGradientsViewArray, setParam, setParams, setParamsViewArray, setParamTable, toString, transpose, update, update

Methods inherited from class AbstractLayer: activate, activate, activate, addListeners, applyDropOutIfNecessary, applyMask, batchSize, clear, conf, derivativeActivation, feedForwardMaskArray, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, gradientAndScore, init, input, layerId, preOutput, preOutput, setCacheMode, setConf, setIndex, setInput, setInputMiniBatchSize, setMaskArray, type, validateInput

Methods inherited from class Object: equals, finalize, getClass, hashCode, notify, notifyAll, wait, wait, wait

Methods inherited from interface Layer: isPretrainLayer

**Field Detail**

protected Collection<TrainingListener> trainingListeners
**Constructor Detail**

public BasePretrainNetwork(NeuralNetConfiguration conf)

public BasePretrainNetwork(NeuralNetConfiguration conf, org.nd4j.linalg.api.ndarray.INDArray input)

**Method Detail**

public void setListeners(Collection<IterationListener> listeners)
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer; setListeners in interface Model. Overrides: setListeners in class AbstractLayer<LayerConfT extends BasePretrainNetwork>.

public void setListeners(IterationListener... listeners)
Set the iteration listeners for this layer.
Specified by: setListeners in interface Layer; setListeners in interface Model. Overrides: setListeners in class AbstractLayer<LayerConfT extends BasePretrainNetwork>.

public org.nd4j.linalg.api.ndarray.INDArray getCorruptedInput(org.nd4j.linalg.api.ndarray.INDArray x, double corruptionLevel)
Corrupts the given input by doing a binomial sampling given the corruption level.
Parameters: x - the input to corrupt; corruptionLevel - the corruption value.

protected Gradient createGradient(org.nd4j.linalg.api.ndarray.INDArray wGradient, org.nd4j.linalg.api.ndarray.INDArray vBiasGradient, org.nd4j.linalg.api.ndarray.INDArray hBiasGradient)
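The binomial (masking) corruption described for `getCorruptedInput` is the scheme used by denoising autoencoders: each element is kept with probability 1 - corruptionLevel and zeroed otherwise. A minimal plain-Java sketch of that behavior on a flat array, not the actual ND4J implementation (which operates on `INDArray`); the class and helper names are illustrative:

```java
import java.util.Random;

public class CorruptionSketch {
    /**
     * Masking corruption in the style of getCorruptedInput: each input
     * value is zeroed with probability corruptionLevel via a per-element
     * Bernoulli (binomial n=1) draw, and kept unchanged otherwise.
     */
    static double[] corrupt(double[] x, double corruptionLevel, Random rng) {
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            // nextDouble() is uniform in [0, 1): the value is dropped
            // with probability corruptionLevel.
            out[i] = rng.nextDouble() < corruptionLevel ? 0.0 : x[i];
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0, 3.0, 4.0};
        // Roughly half of the entries survive at corruptionLevel = 0.5.
        double[] noisy = corrupt(x, 0.5, new Random(42));
        for (double v : noisy) {
            System.out.print(v + " ");
        }
        System.out.println();
    }
}
```

At corruptionLevel 0.0 the input passes through untouched; at 1.0 every element is zeroed.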
public int numParams(boolean backwards)
The number of parameters for the model.
Specified by: numParams in interface Model. Overrides: numParams in class AbstractLayer<LayerConfT extends BasePretrainNetwork>.

public abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> sampleHiddenGivenVisible(org.nd4j.linalg.api.ndarray.INDArray v)
Sample the hidden distribution given the visible.
Parameters: v - the visible to sample from.

public abstract Pair<org.nd4j.linalg.api.ndarray.INDArray,org.nd4j.linalg.api.ndarray.INDArray> sampleVisibleGivenHidden(org.nd4j.linalg.api.ndarray.INDArray h)
Sample the visible distribution given the hidden.
Parameters: h - the hidden to sample from.

protected void setScoreWithZ(org.nd4j.linalg.api.ndarray.INDArray z)
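In RBM-style subclasses, `sampleHiddenGivenVisible` conventionally returns a `Pair` of the hidden activation probabilities and a binary sample drawn from them. A minimal plain-Java sketch of that pattern, assuming sigmoid units; the class, array layout, and helper names are illustrative, not the DL4J implementation:

```java
import java.util.Random;

public class SamplingSketch {
    /** Logistic sigmoid. */
    static double sigmoid(double z) {
        return 1.0 / (1.0 + Math.exp(-z));
    }

    /**
     * RBM-style hidden sampling given a visible vector: compute the
     * hidden activation probabilities sigmoid(v*W + hBias), then draw
     * one Bernoulli sample per hidden unit. Returns {probabilities,
     * sample}, mirroring the Pair<INDArray,INDArray> return type of
     * sampleHiddenGivenVisible.
     */
    static double[][] sampleHiddenGivenVisible(double[] v, double[][] W,
                                               double[] hBias, Random rng) {
        int nHidden = hBias.length;
        double[] probs = new double[nHidden];
        double[] sample = new double[nHidden];
        for (int j = 0; j < nHidden; j++) {
            double pre = hBias[j];
            for (int i = 0; i < v.length; i++) {
                pre += v[i] * W[i][j];   // W is nVisible x nHidden
            }
            probs[j] = sigmoid(pre);
            sample[j] = rng.nextDouble() < probs[j] ? 1.0 : 0.0;
        }
        return new double[][]{probs, sample};
    }

    public static void main(String[] args) {
        double[] v = {1.0, 0.0, 1.0};
        double[][] W = {{0.5, -0.2}, {0.1, 0.3}, {-0.4, 0.8}};
        double[] hBias = {0.0, 0.0};
        double[][] out = sampleHiddenGivenVisible(v, W, hBias, new Random(7));
        System.out.println("p(h=1|v): " + java.util.Arrays.toString(out[0]));
        System.out.println("sample:   " + java.util.Arrays.toString(out[1]));
    }
}
```

`sampleVisibleGivenHidden` is the mirror image: probabilities over visible units given a hidden sample.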
Overrides: setScoreWithZ in class BaseLayer<LayerConfT extends BasePretrainNetwork>.

public Map<String,org.nd4j.linalg.api.ndarray.INDArray> paramTable(boolean backpropParamsOnly)
Table of parameters by key, for backprop.
Specified by: paramTable in interface Model. Overrides: paramTable in class BaseLayer<LayerConfT extends BasePretrainNetwork>.
Parameters: backpropParamsOnly - if true, return backprop params only; if false, return all params (equivalent to paramTable()).

public org.nd4j.linalg.api.ndarray.INDArray params()
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model. Overrides: params in class BaseLayer<LayerConfT extends BasePretrainNetwork>.

public int numParams()
The number of parameters for the model, for backprop (i.e., excluding visible bias).
Specified by: numParams in interface Model. Overrides: numParams in class BaseLayer<LayerConfT extends BasePretrainNetwork>.
public void setParams(org.nd4j.linalg.api.ndarray.INDArray params)
Set the parameters for this model.
Specified by: setParams in interface Model. Overrides: setParams in class BaseLayer<LayerConfT extends BasePretrainNetwork>.
Parameters: params - the parameters for the model.

public Pair<Gradient,org.nd4j.linalg.api.ndarray.INDArray> backpropGradient(org.nd4j.linalg.api.ndarray.INDArray epsilon)
Calculate the gradient relative to the error in the next layer.
Specified by: backpropGradient in interface Layer. Overrides: backpropGradient in class BaseLayer<LayerConfT extends BasePretrainNetwork>.
Parameters: epsilon - w^(L+1)*delta^(L+1), or equivalently dC/da, where C is the cost function and a = sigma(z) is the activation.

public double calcL2(boolean backpropParamsOnly)
Calculate the L2 regularization term; 0.0 if regularization is not used.
Specified by: calcL2 in interface Layer. Overrides: calcL2 in class BaseLayer<LayerConfT extends BasePretrainNetwork>.
Parameters: backpropParamsOnly - if true, calculate L2 based on backprop params only; if false, calculate based on all params (including pretrain params, if any).

public double calcL1(boolean backpropParamsOnly)
Calculate the L1 regularization term; 0.0 if regularization is not used.
Specified by: calcL1 in interface Layer. Overrides: calcL1 in class BaseLayer<LayerConfT extends BasePretrainNetwork>.
Parameters: backpropParamsOnly - if true, calculate L1 based on backprop params only; if false, calculate based on all params (including pretrain params, if any).

Copyright © 2017. All rights reserved.