public class SpaceToBatch extends AbstractLayer<SpaceToBatchLayer>
Does a 2-dimensional space-to-batch operation, i.e. transforms data from the 2 spatial dimensions of a tensor into the batch dimension, according to the "blocks" specified (a vector of length 2). Afterwards the spatial dimensions are optionally padded, as specified in "padding", a tensor of dim (2, 2), denoting the padding range.
Example:
input:        [[[[1], [2]], [[3], [4]]]]
input shape:  [1, 2, 2, 1]
blocks:       [2, 2]
padding:      [[0, 0], [0, 0]]
output:       [[[[1]]], [[[2]]], [[[3]]], [[[4]]]]
output shape: [4, 1, 1, 1]
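The rearrangement above can be reproduced in a few lines of plain Java. The sketch below is illustrative only: the helper method `spaceToBatch` and the NHWC int-array representation are assumptions of this example, not part of the DL4J API. It simply walks the documented example through blocks = [2, 2] with zero padding.

```java
// Illustrative sketch of the space-to-batch rearrangement (NHWC layout, no padding).
// Not DL4J API; the real layer performs this on INDArrays.
public class SpaceToBatchExample {

    // Output shape: [n * bH * bW, h / bH, w / bW, c]
    static int[][][][] spaceToBatch(int[][][][] in, int bH, int bW) {
        int n = in.length, h = in[0].length, w = in[0][0].length, c = in[0][0][0].length;
        int[][][][] out = new int[n * bH * bW][h / bH][w / bW][c];
        for (int ni = 0; ni < n; ni++)
            for (int hi = 0; hi < h; hi++)
                for (int wi = 0; wi < w; wi++)
                    for (int ci = 0; ci < c; ci++) {
                        // The block offset (hi % bH, wi % bW) selects the output batch entry
                        int outBatch = ((hi % bH) * bW + (wi % bW)) * n + ni;
                        out[outBatch][hi / bH][wi / bW][ci] = in[ni][hi][wi][ci];
                    }
        return out;
    }

    public static void main(String[] args) {
        int[][][][] in = {{{{1}, {2}}, {{3}, {4}}}};   // input shape [1, 2, 2, 1]
        int[][][][] out = spaceToBatch(in, 2, 2);      // output shape [4, 1, 1, 1]
        for (int[][][] batch : out) {
            System.out.println(batch[0][0][0]);        // prints 1, 2, 3, 4 on separate lines
        }
    }
}
```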
Nested classes/interfaces inherited from interface org.deeplearning4j.nn.api.Layer: Layer.TrainingMode, Layer.Type

Fields inherited from class org.deeplearning4j.nn.layers.AbstractLayer: cacheMode, conf, dataType, dropoutApplied, epochCount, index, input, inputModificationAllowed, iterationCount, maskArray, maskState, preOutput, trainingListeners

| Constructor and Description |
|---|
| SpaceToBatch(NeuralNetConfiguration conf, DataType dataType) |
| Modifier and Type | Method and Description |
|---|---|
| INDArray | activate(boolean training, LayerWorkspaceMgr workspaceMgr) Perform forward pass and return the activations array with the last set input |
| Pair<Gradient,INDArray> | backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr) Calculate the gradient relative to the error in the next layer |
| double | calcRegularizationScore(boolean backpropParamsOnly) Calculate the regularization component of the score, for the parameters in this layer (for example, the L1, L2 and/or weight decay components of the loss function) |
| void | clearNoiseWeightParams() |
| INDArray | getParam(String param) Get the parameter |
| Gradient | gradient() Get the gradient. |
| boolean | isPretrainLayer() Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc) |
| long | numParams() The number of parameters for the model |
| INDArray | params() Returns the parameters of the neural network as a flattened row vector |
| protected INDArray | preOutput(boolean training, boolean forBackprop, LayerWorkspaceMgr workspaceMgr) |
| double | score() The score for the model |
| void | setParams(INDArray params) Set the parameters for this model. |
| Layer.Type | type() Returns the layer type |
| void | update(INDArray gradient, String paramType) Perform one update applying the gradient |
Methods inherited from class org.deeplearning4j.nn.layers.AbstractLayer: activate, addListeners, allowInputModification, applyConstraints, applyDropOutIfNecessary, applyMask, assertInputSet, backpropDropOutIfPresent, batchSize, clear, close, computeGradientAndScore, conf, feedForwardMaskArray, fit, fit, getConfig, getEpochCount, getGradientsViewArray, getHelper, getIndex, getInput, getInputMiniBatchSize, getListeners, getMaskArray, getOptimizer, gradientAndScore, init, input, layerConf, layerId, numParams, paramTable, paramTable, setBackpropGradientsViewArray, setCacheMode, setConf, setEpochCount, setIndex, setInput, setInputMiniBatchSize, setListeners, setListeners, setMaskArray, setParam, setParams, setParamsViewArray, setParamTable, update, updaterDivideByMinibatch

Methods inherited from class java.lang.Object: clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.deeplearning4j.nn.api.Layer: getIterationCount, setIterationCount

Constructor Detail

public SpaceToBatch(NeuralNetConfiguration conf, DataType dataType)
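Users rarely invoke the SpaceToBatch(conf, dataType) constructor directly; the implementation class is normally instantiated by the framework from its configuration counterpart. The sketch below is a hedged illustration: it assumes the configuration class org.deeplearning4j.nn.conf.layers.SpaceToBatchLayer with a Builder(int[] blocks) constructor (as in recent DL4J releases) and builds a minimal single-layer network; a real model would typically contain further layers.

```java
import org.deeplearning4j.nn.api.Layer;
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.SpaceToBatchLayer;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

public class SpaceToBatchConstructionSketch {
    public static void main(String[] args) {
        // Hedged sketch: the SpaceToBatchLayer configuration class and its
        // Builder(int[] blocks) constructor are assumed here; the framework creates the
        // SpaceToBatch implementation instance internally when the network is initialized.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(new SpaceToBatchLayer.Builder(new int[]{2, 2}).build()) // blocks = [2, 2]
                .setInputType(InputType.convolutional(2, 2, 1))                // height, width, channels
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();

        // The initialized network holds the SpaceToBatch layer implementation at index 0
        Layer s2b = net.getLayer(0);
    }
}
```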
Method Detail

public Layer.Type type()
Returns the layer type.
Specified by: type in interface Layer
Overrides: type in class AbstractLayer<SpaceToBatchLayer>

public Pair<Gradient,INDArray> backpropGradient(INDArray epsilon, LayerWorkspaceMgr workspaceMgr)
Calculate the gradient relative to the error in the next layer.
Parameters:
epsilon - w^(L+1)*delta^(L+1). Or, equivalently: dC/da, i.e., (dC/dz)*(dz/da) = dC/da, where C is the cost function and a = sigma(z) is the activation.
workspaceMgr - Workspace manager
Returns: the gradient for this layer and the epsilon (activation gradient) to pass on during backprop, the latter placed in the ArrayType.ACTIVATION_GRAD workspace via the workspace manager

protected INDArray preOutput(boolean training, boolean forBackprop, LayerWorkspaceMgr workspaceMgr)
public INDArray activate(boolean training, LayerWorkspaceMgr workspaceMgr)
Perform forward pass and return the activations array with the last set input.
Parameters:
training - training or test mode
workspaceMgr - Workspace manager
Returns: the activations array, placed in the ArrayType.ACTIVATIONS workspace via the workspace manager

public double calcRegularizationScore(boolean backpropParamsOnly)
Calculate the regularization component of the score, for the parameters in this layer (for example, the L1, L2 and/or weight decay components of the loss function).
Specified by: calcRegularizationScore in interface Layer
Overrides: calcRegularizationScore in class AbstractLayer<SpaceToBatchLayer>
Parameters:
backpropParamsOnly - If true: calculate regularization score based on backprop params only. If false: calculate based on all params (including pretrain params, if any)

public boolean isPretrainLayer()
Returns true if the layer can be trained in an unsupervised/pretrain manner (AE, VAE, etc).

public void clearNoiseWeightParams()
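As a usage illustration for activate, the continuation below runs a standalone forward pass through the layer built in the construction sketch above (inside the same main method, reusing net and s2b). It is a hedged sketch: it assumes DL4J's default NCHW activation layout ([minibatch, channels, height, width]) and uses the inherited activate(INDArray, boolean, LayerWorkspaceMgr) overload together with LayerWorkspaceMgr.noWorkspaces() to set the input and run without workspace scoping; the expected output shape is taken from the class example above.

```java
// Continuing the construction sketch (net, s2b). Assumptions: NCHW layout and the
// inherited activate(INDArray, boolean, LayerWorkspaceMgr) overload from AbstractLayer.
INDArray input = Nd4j.create(new float[]{1, 2, 3, 4}, new int[]{1, 1, 2, 2}); // [n, c, h, w]

// training = false (test mode); noWorkspaces() skips workspace scoping entirely
INDArray out = s2b.activate(input, false, LayerWorkspaceMgr.noWorkspaces());

System.out.println(java.util.Arrays.toString(out.shape()));   // expected: [4, 1, 1, 1]
```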
public Gradient gradient()
Get the gradient. See Model.computeGradientAndScore(LayerWorkspaceMgr).
Specified by: gradient in interface Model
Overrides: gradient in class AbstractLayer<SpaceToBatchLayer>

public long numParams()
The number of parameters for the model.
Specified by: numParams in interface Model, numParams in interface Trainable
Overrides: numParams in class AbstractLayer<SpaceToBatchLayer>

public double score()
The score for the model.
Specified by: score in interface Model
Overrides: score in class AbstractLayer<SpaceToBatchLayer>

public void update(INDArray gradient, String paramType)
Perform one update applying the gradient.
Specified by: update in interface Model
Overrides: update in class AbstractLayer<SpaceToBatchLayer>
Parameters:
gradient - the gradient to apply

public INDArray params()
Returns the parameters of the neural network as a flattened row vector.
Specified by: params in interface Model, params in interface Trainable
Overrides: params in class AbstractLayer<SpaceToBatchLayer>

public INDArray getParam(String param)
Get the parameter.
Specified by: getParam in interface Model
Overrides: getParam in class AbstractLayer<SpaceToBatchLayer>
Parameters:
param - the key of the parameter

public void setParams(INDArray params)
Set the parameters for this model.
Specified by: setParams in interface Model
Overrides: setParams in class AbstractLayer<SpaceToBatchLayer>
Parameters:
params - the parameters for the model
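To make the parameter-related methods above concrete: space-to-batch is a purely structural rearrangement, so this layer carries no trainable weights and these methods are expected to be trivial for it. The continuation below (reusing s2b from the earlier sketch) only illustrates that expectation; the exact return values noted in the comments are assumptions, not statements from this page.

```java
// Continuing the earlier sketch (s2b = net.getLayer(0)). Expected values are hedged:
System.out.println(s2b.numParams());        // expected: 0 - no trainable weights
System.out.println(s2b.params());           // expected: null or an empty parameter view
System.out.println(s2b.isPretrainLayer());  // expected: false - not an unsupervised layer
System.out.println(s2b.type());             // whatever Layer.Type the implementation reports
```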