public class FrozenLayerWithBackprop extends BaseWrapperLayer

See Also: FrozenLayer, Serialized Form

Nested classes/interfaces inherited from class Layer: Layer.Builder<T extends Layer.Builder<T>>

Fields inherited from class BaseWrapperLayer: underlying

Fields inherited from class Layer: constraints, iDropout, layerName

| Constructor and Description |
|---|
| FrozenLayerWithBackprop(Layer layer) |
| Modifier and Type | Method and Description |
|---|---|
| Layer | clone() |
| NeuralNetConfiguration | getInnerConf(NeuralNetConfiguration conf) |
| List<Regularization> | getRegularizationByParam(String paramName) Get the regularization types (l1/l2/weight decay) for the given parameter. |
| IUpdater | getUpdaterByParam(String paramName) Get the updater for the given parameter. |
| ParamInitializer | initializer() |
| Layer | instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType) |
| boolean | isPretrainParam(String paramName) Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs. |
| void | setConstraints(List<LayerConstraint> constraints) |
| void | setLayerName(String layerName) |
Methods inherited from class BaseWrapperLayer: getGradientNormalization, getGradientNormalizationThreshold, getMemoryReport, getOutputType, getPreProcessorForInputType, setNIn

Methods inherited from class Layer: initializeConstraints, resetLayerDefaultConfig, setDataType

Methods inherited from class java.lang.Object: equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface TrainingConfig: getLayerName

public FrozenLayerWithBackprop(Layer layer)
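A minimal usage sketch follows. The constructor takes the layer configuration to freeze, and the resulting wrapper is added to a network configuration like any other layer; the intent, as the class name suggests, is that the wrapped layer's parameters stay fixed while backpropagation still flows through it to earlier layers. The layer sizes, updater, activations, and loss function below are illustrative assumptions, not part of this API.

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Adam;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class FrozenLayerWithBackpropSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .updater(new Adam(1e-3))
                .list()
                // Layer 0: trained normally
                .layer(new DenseLayer.Builder().nIn(784).nOut(128)
                        .activation(Activation.RELU).build())
                // Layer 1: the wrapped layer's parameters are frozen,
                // while gradients still propagate through it to layer 0
                .layer(new FrozenLayerWithBackprop(
                        new DenseLayer.Builder().nIn(128).nOut(64)
                                .activation(Activation.RELU).build()))
                // Layer 2: trained normally
                .layer(new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(64).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        MultiLayerNetwork net = new MultiLayerNetwork(conf);
        net.init();
    }
}
```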
public NeuralNetConfiguration getInnerConf(NeuralNetConfiguration conf)
public Layer instantiate(NeuralNetConfiguration conf, Collection<TrainingListener> trainingListeners, int layerIndex, INDArray layerParamsView, boolean initializeParams, org.nd4j.linalg.api.buffer.DataType networkDataType)
Specified by: instantiate in class Layer

public ParamInitializer initializer()

Overrides: initializer in class BaseWrapperLayer
public List<Regularization> getRegularizationByParam(String paramName)

Description copied from class: Layer
Get the regularization types (l1/l2/weight decay) for the given parameter.

Specified by: getRegularizationByParam in interface TrainingConfig
Overrides: getRegularizationByParam in class BaseWrapperLayer
Parameters: paramName - Parameter name ("W", "b" etc)

public boolean isPretrainParam(String paramName)

Description copied from class: Layer
Is the specified parameter a layerwise pretraining only parameter? For example, visible bias params in an autoencoder (or decoder params in a variational autoencoder) aren't used during supervised backprop. Layers (like DenseLayer, etc.) with no pretrainable parameters will return false for all (valid) inputs.

Specified by: isPretrainParam in interface TrainingConfig
Overrides: isPretrainParam in class BaseWrapperLayer
Parameters: paramName - Parameter name/key
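For illustration, these per-parameter queries can be called directly on a standalone wrapper instance. The sketch below wraps a hypothetical DenseLayer (sizes are arbitrary) and uses the parameter name "W" mentioned above; it relies on the documented behaviour that layers with no pretrainable parameters return false from isPretrainParam.

```java
import java.util.List;

import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.nd4j.linalg.learning.regularization.Regularization;

public class FrozenParamQuerySketch {
    public static void main(String[] args) {
        // Wrap a plain DenseLayer configuration (sizes are arbitrary)
        FrozenLayerWithBackprop frozen = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(128).nOut(64).build());

        // DenseLayer has no layerwise-pretraining-only parameters,
        // so this returns false for its weight parameter "W"
        boolean pretrain = frozen.isPretrainParam("W");

        // Regularization (l1/l2/weight decay) for "W", as seen through the wrapper
        List<Regularization> regularization = frozen.getRegularizationByParam("W");

        System.out.println("isPretrainParam(\"W\") = " + pretrain
                + ", regularization = " + regularization);
    }
}
```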
public IUpdater getUpdaterByParam(String paramName)

Description copied from class: Layer
Get the updater for the given parameter.

Specified by: getUpdaterByParam in interface TrainingConfig
Overrides: getUpdaterByParam in class Layer
Parameters: paramName - Parameter name

public void setLayerName(String layerName)

Overrides: setLayerName in class BaseWrapperLayer

public void setConstraints(List<LayerConstraint> constraints)
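Along the same lines, the updater lookup and the setters can be exercised on a wrapper instance outside of a network. The class and layer names below are illustrative assumptions; whatever updater is actually returned for a frozen parameter is decided by the wrapper, so the sketch only shows the calls.

```java
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.misc.FrozenLayerWithBackprop;
import org.nd4j.linalg.learning.config.IUpdater;

public class FrozenUpdaterQuerySketch {
    public static void main(String[] args) {
        FrozenLayerWithBackprop frozen = new FrozenLayerWithBackprop(
                new DenseLayer.Builder().nIn(128).nOut(64).build());

        // Assign a name via the overridden setter
        frozen.setLayerName("frozenDense");

        // Updater used for the "W" parameter, as seen through the frozen wrapper
        IUpdater updater = frozen.getUpdaterByParam("W");

        System.out.println(frozen.getLayerName() + " -> " + updater);
    }
}
```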