Package org.deeplearning4j.nn.conf
Class NeuralNetConfiguration.ListBuilder
java.lang.Object
    org.deeplearning4j.nn.conf.MultiLayerConfiguration.Builder
        org.deeplearning4j.nn.conf.NeuralNetConfiguration.ListBuilder

Enclosing class:
    NeuralNetConfiguration
public static class NeuralNetConfiguration.ListBuilder extends MultiLayerConfiguration.Builder
Fluent interface for building a list of configurations
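A minimal usage sketch (the seed, layer sizes, and activations are illustrative assumptions, not values from this page): a ListBuilder is obtained from NeuralNetConfiguration.Builder.list(), layers are added in order, and build() produces the MultiLayerConfiguration.

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class ListBuilderExample {
    public static void main(String[] args) {
        // NeuralNetConfiguration.Builder.list() returns a ListBuilder;
        // global settings (seed, updater, ...) are inherited by each layer.
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .seed(123)
                .list()                                             // -> ListBuilder
                .layer(0, new DenseLayer.Builder()
                        .nIn(784).nOut(100)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(100).nOut(10)
                        .activation(Activation.SOFTMAX).build())
                .setInputType(InputType.feedForward(784))           // see setInputType below
                .build();
    }
}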
-
Nested Class Summary
class NeuralNetConfiguration.ListBuilder.InputTypeBuilder
    Helper class for setting input types
-
Field Summary
-
Fields inherited from class org.deeplearning4j.nn.conf.MultiLayerConfiguration.Builder
backpropType, cacheMode, confs, dampingFactor, dataType, inferenceWorkspaceMode, inputPreProcessors, inputType, overrideNinUponBuild, tbpttBackLength, tbpttFwdLength, trainingWorkspaceMode, validateOutputConfig, validateTbpttConfig
-
Constructor Summary
ListBuilder(NeuralNetConfiguration.Builder globalConfig)
ListBuilder(NeuralNetConfiguration.Builder globalConfig, Map<Integer,NeuralNetConfiguration.Builder> layerMap)
-
Method Summary
MultiLayerConfiguration.Builder backpropType(@NonNull BackpropType type)
    The type of backprop.
MultiLayerConfiguration build()
    Build the multi layer network based on this neural network and overridden parameters.
NeuralNetConfiguration.ListBuilder cacheMode(@NonNull CacheMode cacheMode)
    This method defines how/if the preOutput cache is handled: NONE: cache disabled (default value); HOST: host memory will be used; DEVICE: GPU memory will be used (on CPU backends the effect is the same as for HOST).
NeuralNetConfiguration.ListBuilder confs(List<NeuralNetConfiguration> confs)
NeuralNetConfiguration.ListBuilder dataType(@NonNull DataType dataType)
    Set the DataType for the network parameters and activations for all layers in the network.
protected void finalize()
List<InputType> getLayerActivationTypes()
    For the (perhaps partially constructed) network configuration, return a list of activation sizes for each layer in the network. Note: to use this method, the network input type must have been set using setInputType(InputType) first.
Map<Integer,NeuralNetConfiguration.Builder> getLayerwise()
NeuralNetConfiguration.ListBuilder inputPreProcessor(Integer layer, InputPreProcessor processor)
    Specify the processors.
NeuralNetConfiguration.ListBuilder inputPreProcessors(Map<Integer,InputPreProcessor> processors)
NeuralNetConfiguration.ListBuilder.InputTypeBuilder inputType()
    A convenience method for setting input types: for example, .inputType().convolutional(h,w,d) is equivalent to .setInputType(InputType.convolutional(h,w,d)).
NeuralNetConfiguration.ListBuilder layer(int ind, @NonNull Layer layer)
NeuralNetConfiguration.ListBuilder layer(Layer layer)
NeuralNetConfiguration.ListBuilder overrideNinUponBuild(boolean overrideNinUponBuild)
    Whether to override the nIn configuration forcibly upon construction.
NeuralNetConfiguration.ListBuilder setInputType(InputType inputType)
NeuralNetConfiguration.ListBuilder tBPTTBackwardLength(int backwardLength)
    When doing truncated BPTT: how many steps of backward should we do? Only applicable when doing backpropType(BackpropType.TruncatedBPTT). This is the k2 parameter on pg. 23 of http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
NeuralNetConfiguration.ListBuilder tBPTTForwardLength(int forwardLength)
    When doing truncated BPTT: how many steps of forward pass should we do before doing (truncated) backprop? Only applicable when doing backpropType(BackpropType.TruncatedBPTT). Typically the tBPTTForwardLength parameter is the same as the tBPTTBackwardLength parameter, but it may be larger in some circumstances (never smaller). Ideally your training data time series length should be divisible by this value. This is the k1 parameter on pg. 23 of http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
NeuralNetConfiguration.ListBuilder tBPTTLength(int bpttLength)
    When doing truncated BPTT: how many steps should we do? Only applicable when doing backpropType(BackpropType.TruncatedBPTT). See: http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
NeuralNetConfiguration.ListBuilder validateOutputLayerConfig(boolean validate)
    Enabled by default.
NeuralNetConfiguration.ListBuilder validateTbpttConfig(boolean validate)
    Enabled by default.
Methods inherited from class org.deeplearning4j.nn.conf.MultiLayerConfiguration.Builder
inferenceWorkspaceMode, trainingWorkspaceMode
-
Constructor Detail
-
ListBuilder
public ListBuilder(NeuralNetConfiguration.Builder globalConfig, Map<Integer,NeuralNetConfiguration.Builder> layerMap)
-
ListBuilder
public ListBuilder(NeuralNetConfiguration.Builder globalConfig)
-
Method Detail
-
layer
public NeuralNetConfiguration.ListBuilder layer(int ind, @NonNull Layer layer)
-
layer
public NeuralNetConfiguration.ListBuilder layer(Layer layer)
-
getLayerwise
public Map<Integer,NeuralNetConfiguration.Builder> getLayerwise()
-
overrideNinUponBuild
public NeuralNetConfiguration.ListBuilder overrideNinUponBuild(boolean overrideNinUponBuild)
Description copied from class: MultiLayerConfiguration.Builder
Whether to override the nIn configuration forcibly upon construction. Default value is true.
Overrides:
    overrideNinUponBuild in class MultiLayerConfiguration.Builder
Parameters:
    overrideNinUponBuild - Whether to override the nIn configuration forcibly upon construction.
Returns:
    builder pattern
-
inputPreProcessor
public NeuralNetConfiguration.ListBuilder inputPreProcessor(Integer layer, InputPreProcessor processor)
Description copied from class: MultiLayerConfiguration.Builder
Specify the processors. These are used at each layer for doing things like normalization and shaping of input.
Overrides:
    inputPreProcessor in class MultiLayerConfiguration.Builder
Parameters:
    processor - what to use to preProcess the data.
Returns:
    builder pattern
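A sketch of wiring a preprocessor between a convolutional and a dense layer (the input shape, kernel size, and layer sizes are assumptions for illustration; in practice setInputType(InputType) can add suitable preprocessors automatically):

import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.ConvolutionLayer;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.preprocessor.CnnToFeedForwardPreProcessor;

// With a 16 x 16 single-channel input, the 5 x 5 kernel (stride 1, no padding)
// in layer 0 produces 12 x 12 x 20 activations, which must be flattened to a
// vector before reaching the DenseLayer at index 1.
NeuralNetConfiguration.ListBuilder listBuilder = new NeuralNetConfiguration.Builder()
        .list()
        .layer(0, new ConvolutionLayer.Builder(5, 5).nIn(1).nOut(20).build())
        .layer(1, new DenseLayer.Builder().nIn(12 * 12 * 20).nOut(100).build())
        .inputPreProcessor(1, new CnnToFeedForwardPreProcessor(12, 12, 20));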
-
inputPreProcessors
public NeuralNetConfiguration.ListBuilder inputPreProcessors(Map<Integer,InputPreProcessor> processors)
Overrides:
    inputPreProcessors in class MultiLayerConfiguration.Builder
-
cacheMode
public NeuralNetConfiguration.ListBuilder cacheMode(@NonNull CacheMode cacheMode)
Description copied from class: MultiLayerConfiguration.Builder
This method defines how/if the preOutput cache is handled:
    NONE: cache disabled (default value)
    HOST: Host memory will be used
    DEVICE: GPU memory will be used (on CPU backends effect will be the same as for HOST)
Overrides:
    cacheMode in class MultiLayerConfiguration.Builder
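For example (a sketch, with listBuilder standing for an existing ListBuilder; CacheMode is org.deeplearning4j.nn.conf.CacheMode):

import org.deeplearning4j.nn.conf.CacheMode;

// Keep the preOutput cache in host memory; NONE (the default) disables caching.
listBuilder.cacheMode(CacheMode.HOST);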
-
backpropType
public MultiLayerConfiguration.Builder backpropType(@NonNull BackpropType type)
Description copied from class: MultiLayerConfiguration.Builder
The type of backprop. The default setting is used for most networks (MLP, CNN etc), but optionally truncated BPTT can be used for training recurrent neural networks. If using TruncatedBPTT make sure you set both tBPTTForwardLength() and tBPTTBackwardLength()
Overrides:
    backpropType in class MultiLayerConfiguration.Builder
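A sketch of a TBPTT setup for a recurrent network (the segment length of 20 time steps is an assumed example value, and listBuilder stands for an existing ListBuilder); the tBPTT* methods are documented below:

import org.deeplearning4j.nn.conf.BackpropType;

// Truncated BPTT: set both segment lengths, then switch the backprop type
// (tBPTTLength(k) would set both forward and backward lengths to k).
listBuilder.tBPTTForwardLength(20)
        .tBPTTBackwardLength(20)
        .backpropType(BackpropType.TruncatedBPTT);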
-
tBPTTLength
public NeuralNetConfiguration.ListBuilder tBPTTLength(int bpttLength)
Description copied from class: MultiLayerConfiguration.Builder
When doing truncated BPTT: how many steps should we do?
Only applicable when doing backpropType(BackpropType.TruncatedBPTT)
See: http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
Overrides:
    tBPTTLength in class MultiLayerConfiguration.Builder
Parameters:
    bpttLength - length > 0
-
tBPTTForwardLength
public NeuralNetConfiguration.ListBuilder tBPTTForwardLength(int forwardLength)
Description copied from class: MultiLayerConfiguration.Builder
When doing truncated BPTT: how many steps of forward pass should we do before doing (truncated) backprop?
Only applicable when doing backpropType(BackpropType.TruncatedBPTT)
Typically the tBPTTForwardLength parameter is the same as the tBPTTBackwardLength parameter, but it may be larger in some circumstances (never smaller).
Ideally your training data time series length should be divisible by this value. This is the k1 parameter on pg. 23 of http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
Overrides:
    tBPTTForwardLength in class MultiLayerConfiguration.Builder
Parameters:
    forwardLength - Forward length > 0, >= backwardLength
-
tBPTTBackwardLength
public NeuralNetConfiguration.ListBuilder tBPTTBackwardLength(int backwardLength)
Description copied from class: MultiLayerConfiguration.Builder
When doing truncated BPTT: how many steps of backward should we do?
Only applicable when doing backpropType(BackpropType.TruncatedBPTT)
This is the k2 parameter on pg. 23 of http://www.cs.utoronto.ca/~ilya/pubs/ilya_sutskever_phd_thesis.pdf
Overrides:
    tBPTTBackwardLength in class MultiLayerConfiguration.Builder
Parameters:
    backwardLength - <= forwardLength
-
confs
public NeuralNetConfiguration.ListBuilder confs(List<NeuralNetConfiguration> confs)
Overrides:
    confs in class MultiLayerConfiguration.Builder
-
validateOutputLayerConfig
public NeuralNetConfiguration.ListBuilder validateOutputLayerConfig(boolean validate)
Description copied from class: MultiLayerConfiguration.Builder
Enabled by default. If enabled, the output layer configuration will be validated, to throw an exception on likely invalid outputs - such as softmax + nOut=1, or LossMCXENT + Tanh.
If disabled (false) no output layer validation will be performed.
Disabling this validation is not recommended, as the configurations that fail validation usually will not be able to learn correctly. However, the option to disable this validation is provided for advanced users when creating non-standard architectures.
Overrides:
    validateOutputLayerConfig in class MultiLayerConfiguration.Builder
Parameters:
    validate - If true: validate output layer configuration. False: don't validate
-
validateTbpttConfig
public NeuralNetConfiguration.ListBuilder validateTbpttConfig(boolean validate)
Description copied from class: MultiLayerConfiguration.Builder
Enabled by default. If enabled, an exception will be thrown when using the (invalid) combination of truncated backpropagation through time (TBPTT) with either a GlobalPoolingLayer or LastTimeStepLayer.
It is possible to disable this validation to allow what is almost certainly an invalid configuration to be used, however this is not recommended.
Overrides:
    validateTbpttConfig in class MultiLayerConfiguration.Builder
Parameters:
    validate - Whether TBPTT validation should be performed
-
dataType
public NeuralNetConfiguration.ListBuilder dataType(@NonNull DataType dataType)
Description copied from class: MultiLayerConfiguration.Builder
Set the DataType for the network parameters and activations for all layers in the network. Default: Float
Overrides:
    dataType in class MultiLayerConfiguration.Builder
Parameters:
    dataType - Datatype to use for parameters and activations
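For example (a sketch, with listBuilder standing for an existing ListBuilder; DataType is org.nd4j.linalg.api.buffer.DataType):

import org.nd4j.linalg.api.buffer.DataType;

// Use double precision for parameters and activations instead of the float default.
listBuilder.dataType(DataType.DOUBLE);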
-
finalize
protected void finalize() throws Throwable
-
setInputType
public NeuralNetConfiguration.ListBuilder setInputType(InputType inputType)
Overrides:
    setInputType in class MultiLayerConfiguration.Builder
-
inputType
public NeuralNetConfiguration.ListBuilder.InputTypeBuilder inputType()
A convenience method for setting input types: note that for example .inputType().convolutional(h,w,d) is equivalent to .setInputType(InputType.convolutional(h,w,d))
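Per the equivalence stated above, the two calls below configure the same convolutional input type (the 28 x 28 single-channel shape is an assumed example, and listBuilder stands for an existing ListBuilder):

import org.deeplearning4j.nn.conf.inputs.InputType;

// Equivalent ways of declaring a 28 x 28 single-channel image input:
listBuilder.setInputType(InputType.convolutional(28, 28, 1));
listBuilder.inputType().convolutional(28, 28, 1);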
-
getLayerActivationTypes
public List<InputType> getLayerActivationTypes()
For the (perhaps partially constructed) network configuration, return a list of activation sizes for each layer in the network.
Note: To use this method, the network input type must have been set using setInputType(InputType) first
Returns:
    A list of activation types for the network, indexed by layer number
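A sketch of querying per-layer activation types from a partially constructed configuration (layer sizes are illustrative assumptions); as noted above, the input type must be set first:

import java.util.List;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.inputs.InputType;
import org.deeplearning4j.nn.conf.layers.DenseLayer;

List<InputType> actTypes = new NeuralNetConfiguration.Builder()
        .list()
        .layer(new DenseLayer.Builder().nIn(784).nOut(100).build())
        .layer(new DenseLayer.Builder().nIn(100).nOut(50).build())
        .setInputType(InputType.feedForward(784))
        .getLayerActivationTypes();
// actTypes.get(i) is the activation type produced by layer i.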
-
build
public MultiLayerConfiguration build()
Build the multi layer network based on this neural network and overridden parameters
Overrides:
    build in class MultiLayerConfiguration.Builder
Returns:
    the configuration to build
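The resulting MultiLayerConfiguration is typically passed to a MultiLayerNetwork (a sketch, with listBuilder standing for a fully specified ListBuilder):

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.multilayer.MultiLayerNetwork;

MultiLayerConfiguration conf = listBuilder.build();
MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.init();   // allocate parameters before training or inference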