| Package | Description |
|---|---|
| org.tensorflow.op | |
| org.tensorflow.op.train | |
| Modifier and Type | Method and Description |
|---|---|
| `<T extends TType> ResourceSparseApplyAdagradDa` | `TrainOps.resourceSparseApplyAdagradDa(Operand<? extends TType> var, Operand<? extends TType> gradientAccumulator, Operand<? extends TType> gradientSquaredAccumulator, Operand<T> grad, Operand<? extends TNumber> indices, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, ResourceSparseApplyAdagradDa.Options... options)` Update entries in '*var' and '*accum' according to the proximal adagrad scheme. |
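
As a rough usage sketch, the call below builds this op through `tf.train` on an `Ops` instance. The shapes, hyperparameter values, and the use of `varHandleOp` to obtain resource handles are illustrative assumptions from the core API rather than anything documented on this page; real code would also initialize the variables (e.g. with `assignVariableOp`) before running the update.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.train.ResourceSparseApplyAdagradDa;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.TInt32;
import org.tensorflow.types.TInt64;
import org.tensorflow.types.family.TType;

public class AdagradDaTrainOpsSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);

      // Resource handles for the variable and its two accumulators.
      // Shapes are illustrative; real code would also initialize these
      // variables (e.g. via assignVariableOp) before running the update.
      Operand<TType> varHandle = tf.varHandleOp(TFloat32.class, Shape.of(4, 2));
      Operand<TType> gradAccum = tf.varHandleOp(TFloat32.class, Shape.of(4, 2));
      Operand<TType> gradSquaredAccum = tf.varHandleOp(TFloat32.class, Shape.of(4, 2));

      // Sparse gradient: row i of `grad` applies to row indices[i] of the variable.
      Operand<TFloat32> grad = tf.constant(new float[][] {{0.1f, -0.2f}, {0.05f, 0.3f}});
      Operand<TInt32> indices = tf.constant(new int[] {0, 3});

      // Hyperparameters and the global step counter (placeholder values).
      Operand<TFloat32> lr = tf.constant(0.01f);
      Operand<TFloat32> l1 = tf.constant(0.0f);
      Operand<TFloat32> l2 = tf.constant(0.0f);
      Operand<TInt64> globalStep = tf.constant(1L);

      // Build the sparse AdagradDA update; useLocking is the option listed below.
      ResourceSparseApplyAdagradDa update = tf.train.resourceSparseApplyAdagradDa(
          varHandle, gradAccum, gradSquaredAccum,
          grad, indices, lr, l1, l2, globalStep,
          ResourceSparseApplyAdagradDa.useLocking(true));
    }
  }
}
```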
| Modifier and Type | Method and Description |
|---|---|
| `static ResourceSparseApplyAdagradDa.Options` | `ResourceSparseApplyAdagradDa.useLocking(Boolean useLocking)` Sets the useLocking option. |
| `ResourceSparseApplyAdagradDa.Options` | `ResourceSparseApplyAdagradDa.Options.useLocking(Boolean useLocking)` Sets the useLocking option. |
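
A minimal sketch of the difference between the two setters above: the static method on the op class creates a fresh `Options` instance, while the instance method sets the flag on an existing `Options` object and returns it for chaining. Only the class and method names come from the table; the surrounding code is illustrative.

```java
import org.tensorflow.op.train.ResourceSparseApplyAdagradDa;

public class UseLockingOptionSketch {
  public static void main(String[] args) {
    // Static factory on the op class: creates an Options holder with useLocking set.
    ResourceSparseApplyAdagradDa.Options opts =
        ResourceSparseApplyAdagradDa.useLocking(true);

    // Instance setter on Options: updates the builder and returns it,
    // so further option setters could be chained here.
    opts = opts.useLocking(false);

    // The resulting Options value is what gets passed as the trailing varargs
    // argument of resourceSparseApplyAdagradDa / ResourceSparseApplyAdagradDa.create.
  }
}
```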
| Modifier and Type | Method and Description |
|---|---|
| `static <T extends TType> ResourceSparseApplyAdagradDa` | `ResourceSparseApplyAdagradDa.create(Scope scope, Operand<? extends TType> var, Operand<? extends TType> gradientAccumulator, Operand<? extends TType> gradientSquaredAccumulator, Operand<T> grad, Operand<? extends TNumber> indices, Operand<T> lr, Operand<T> l1, Operand<T> l2, Operand<TInt64> globalStep, ResourceSparseApplyAdagradDa.Options... options)` Factory method to create a class wrapping a new ResourceSparseApplyAdagradDA operation. |
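
For completeness, a sketch of the lower-level factory method with an explicit `Scope` (obtained here via `Ops.scope()`, an assumption from the core API). The inputs mirror the `tf.train` example above and are placeholders.

```java
import org.tensorflow.Graph;
import org.tensorflow.Operand;
import org.tensorflow.ndarray.Shape;
import org.tensorflow.op.Ops;
import org.tensorflow.op.Scope;
import org.tensorflow.op.train.ResourceSparseApplyAdagradDa;
import org.tensorflow.types.TFloat32;
import org.tensorflow.types.family.TType;

public class AdagradDaCreateSketch {
  public static void main(String[] args) {
    try (Graph graph = new Graph()) {
      Ops tf = Ops.create(graph);
      Scope scope = tf.scope();  // explicit Scope for the low-level factory

      // Placeholder resource handles; real code would initialize them first.
      Operand<TType> varHandle = tf.varHandleOp(TFloat32.class, Shape.of(3));
      Operand<TType> gradAccum = tf.varHandleOp(TFloat32.class, Shape.of(3));
      Operand<TType> gradSquaredAccum = tf.varHandleOp(TFloat32.class, Shape.of(3));

      // Same inputs as tf.train.resourceSparseApplyAdagradDa, but the Scope
      // is passed explicitly instead of coming from the Ops instance.
      ResourceSparseApplyAdagradDa op = ResourceSparseApplyAdagradDa.create(
          scope,
          varHandle, gradAccum, gradSquaredAccum,
          tf.constant(new float[] {0.1f}),   // grad: one updated element
          tf.constant(new int[] {1}),        // indices
          tf.constant(0.01f),                // lr
          tf.constant(0.0f),                 // l1
          tf.constant(0.0f),                 // l2
          tf.constant(1L),                   // globalStep
          ResourceSparseApplyAdagradDa.useLocking(true));
    }
  }
}
```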