Size of input
Size of output
Apply this layer (forward computation)
Output computation information.
Apply this layer to an RDD of vectors.
RDD of (ID, Value) pairs, where ID is a Long.
RDD of (ID, Vector) pairs, with the same IDs as the input.
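A minimal sketch of this keyed forward pass, with a plain `Seq` standing in for a Spark RDD; `forward` is a hypothetical element-wise stand-in (here a ReLU) for the layer's real computation, and the IDs pass through unchanged:

```scala
// Hypothetical forward pass; a Seq stands in for an RDD of (ID, Vector).
def forward(x: Array[Double]): Array[Double] =
  x.map(v => math.max(0.0, v)) // e.g. a ReLU activation

// Apply the layer to every (ID, Vector) pair, preserving the IDs.
def applyAll(input: Seq[(Long, Array[Double])]): Seq[(Long, Array[Double])] =
  input.map { case (id, vec) => (id, forward(vec)) }
```

With a real RDD the body would be the same `map` over the pair RDD.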
Backward computation
Sequence of entries to be used for backward computation.
Error sequence to backpropagate into the previous layer.
For the computation, we use denominator layout only (cf. the Wikipedia page on matrix calculus). For the computation rules, see "The Matrix Cookbook" from the Technical University of Denmark.
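As a concrete instance of these rules (a standard result, not taken from the source): for a fully connected map $y = Wx + b$ with upstream error $\delta = \partial E / \partial y$, denominator layout gives

```latex
\frac{\partial E}{\partial W} = \delta\, x^{\top}, \qquad
\frac{\partial E}{\partial b} = \delta, \qquad
\frac{\partial E}{\partial x} = W^{\top} \delta
```

where $\partial E / \partial x$ is the error propagated into the previous layer.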
Backward computation using propagated error.
Propagated error sequence.
Error sequence for backpropagation.
ClassTag for input
Apply this layer to a parallel sequence of vectors.
Parallel sequence of input vectors.
Parallel sequence of output vectors.
Apply this layer to an RDD of vectors.
RDD of (ID, Value) pairs, where ID is a Long.
RDD of (ID, Vector) pairs, with the same IDs as the input.
Apply this layer (forward computation)
Retrieve first input
input to be separated
first input
Retrieve second input
input to be separated
second input
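A sketch of how the two inputs can be separated from one concatenated vector; `firstSize` is a hypothetical dimension of the first input, not a name from the library:

```scala
// Split a concatenated input vector into its two parts.
// firstSize is the (assumed) dimension of the first input.
def firstInput(x: Array[Double], firstSize: Int): Array[Double] =
  x.take(firstSize)

def secondInput(x: Array[Double], firstSize: Int): Array[Double] =
  x.drop(firstSize)
```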
Set weight builder for this layer.
Weight builder to be applied
self
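The "self" return value implies a fluent-setter pattern: each setter mutates the layer and returns `this` so calls can be chained. A sketch with illustrative names (not the library's actual API):

```scala
// Fluent setters returning this.type so configuration calls chain.
class Layer {
  var inSize = 0
  var outSize = 0
  def withInput(n: Int): this.type = { inSize = n; this }
  def withOutput(n: Int): this.type = { outSize = n; this }
}
```

Usage: `new Layer().withInput(4).withOutput(2)` configures both sizes in one expression.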
Sequence for backpropagation. Stores output values.
True if this layer is affected by backward propagation
Weight loss of this layer
Output converter from OutInfo to Vector
Reconstruct error from fragments
error of input1
error of input2
restored error
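If the input was split by concatenation, the restored error is simply the two error fragments joined back together; a minimal sketch under that assumption:

```scala
// Restore the full error vector by concatenating the two fragments,
// the inverse of splitting the concatenated input.
def restoreError(err1: Array[Double], err2: Array[Double]): Array[Double] =
  err1 ++ err2
```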
Assign whether this layer is updatable or not.
True if this layer is used in backpropagation.
Set the activation function.
Set the input size.
Set the output size.
Layer: Basic, Fully-connected Rank 3 Tensor Layer.
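One common form of a fully connected rank-3 tensor layer combines, per output unit, a bilinear term over the two inputs with a linear term: out = f(x1ᵀ Q x2 + L·[x1; x2] + b). The sketch below computes a single output unit under that assumption, with the activation omitted; `q`, `l`, and `b` are illustrative parameters, not the library's actual fields:

```scala
// One output unit of a rank-3 tensor layer (activation omitted):
// out = x1^T Q x2 + L . [x1; x2] + b, with Q a matrix, L a vector, b a scalar.
def rank3Unit(x1: Array[Double], x2: Array[Double],
              q: Array[Array[Double]], l: Array[Double], b: Double): Double = {
  // Bilinear term: sum over i, j of x1(i) * Q(i)(j) * x2(j).
  val bilinear = (for {
    i <- x1.indices
    j <- x2.indices
  } yield x1(i) * q(i)(j) * x2(j)).sum
  // Linear term over the concatenated input [x1; x2].
  val linear = (x1 ++ x2).zip(l).map { case (x, w) => x * w }.sum
  bilinear + linear + b
}
```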