org.platanios.tensorflow.api.ops.training.optimizers.schedules
Decay rate.
Decay steps.
If true, the decay will occur at discrete intervals.
Step after which to start decaying the learning rate.
Composes the provided other schedule with this schedule and returns the resulting schedule.
Applies the decay method to value, where step is the current iteration in the optimization loop, and returns the result.
Value to decay.
Option containing the current iteration in the optimization loop, if one has been provided.
Decayed value.
IllegalArgumentException: If the decay method requires a value for step, but the provided option is empty.
Composes this schedule with the provided other schedule and returns the resulting schedule.
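Composition can be pictured with a minimal sketch. Note that the Schedule trait below, its method signature, and the order in which the two schedules apply are assumptions made for illustration; they are not the library's actual definitions:

```scala
// Illustrative sketch only; NOT the library's actual Schedule type.
trait Schedule {
  def apply(value: Double, step: Long): Double

  // Assumed semantics: apply this schedule first, then `other` to its result.
  def >>(other: Schedule): Schedule = (value: Double, step: Long) =>
    other(this(value, step), step)
}

object ScheduleExample {
  def main(args: Array[String]): Unit = {
    val halve: Schedule = (v, _) => v / 2.0   // toy schedule
    val tenth: Schedule = (v, _) => v / 10.0  // toy schedule
    val composed = halve >> tenth
    println(composed(1.0, 0))  // 1.0 halved, then divided by 10: 0.05
  }
}
```

The point of composition is that the result is itself a schedule, so chains of decays can be passed anywhere a single schedule is expected.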
Decay rate.
Decay steps.
If true, the decay will occur at discrete intervals.
Step after which to start decaying the learning rate.
Exponential decay method.
This method applies an exponential decay function to a provided initial learning rate (i.e., value). It requires a step value to be provided in its application function, in order to compute the decayed learning rate. You may simply pass a TensorFlow variable that you increment at each training step.

The decayed value is computed as follows:

  decayedValue = value * decayRate ^ (step / decaySteps)

where, if staircase = true, then (step / decaySteps) is an integer division and the decayed learning rate follows a staircase function.
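The decay formula can be checked numerically with a short sketch in plain Scala, standing in for the graph ops the library actually builds. The function name and parameter names mirror the parameters described above but are chosen here for illustration:

```scala
// Numeric sketch of the exponential decay formula (not the library code):
//   decayedValue = value * decayRate ^ (step / decaySteps)
def exponentialDecay(
    value: Double,     // initial learning rate to decay
    step: Long,        // current iteration in the optimization loop
    decayRate: Double,
    decaySteps: Long,
    staircase: Boolean
): Double = {
  val exponent =
    if (staircase) (step / decaySteps).toDouble // integer division: staircase
    else step.toDouble / decaySteps             // continuous decay
  value * math.pow(decayRate, exponent)
}

// With staircase = true, the rate is constant within each window of
// decaySteps steps, dropping by a factor of decayRate at each boundary;
// with staircase = false, it decays smoothly between those boundaries.
val stepped = exponentialDecay(0.1, 150, 0.96, 100, staircase = true)
val smooth  = exponentialDecay(0.1, 150, 0.96, 100, staircase = false)
```

In the library itself, step would be the TensorFlow variable mentioned above rather than a plain Long, and the arithmetic would be performed with tensor ops.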