Package com.github.benmanes.caffeine.cache.simulator.policy.sketch.climbing.gradient
Class Summary

Class        Description
Adam         Adaptive Moment Estimation (Adam) optimizer.
AmsGrad      AMSGrad, an improvement on Adam that keeps the second-moment estimate non-decreasing, correcting cases where Adam can fail to converge to the optimal solution.
Nadam        Nesterov-accelerated Adaptive Moment Estimation (Nadam) optimizer.
Stochastic   Stochastic gradient descent (SGD) optimizer.
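To make the Adam/AMSGrad distinction concrete, here is a minimal, hypothetical sketch of a single-parameter update step. The class name, fields, and `step` method are illustrative assumptions, not the simulator's actual API; the only difference between the two variants is that AMSGrad clamps the bias-corrected second moment to its running maximum.

```java
// Hypothetical sketch of the Adam vs. AMSGrad update rule for one scalar
// parameter; names are illustrative, not the simulator's actual API.
public final class AdamSketch {
  private final double learningRate = 0.001;
  private final double beta1 = 0.9;   // first-moment (mean) decay rate
  private final double beta2 = 0.999; // second-moment (variance) decay rate
  private final double epsilon = 1e-8;
  private final boolean amsgrad;      // true selects the AMSGrad variant

  private double m;    // exponential moving average of gradients
  private double v;    // exponential moving average of squared gradients
  private double vMax; // running maximum of vHat (AMSGrad only)
  private int t;       // time step, used for bias correction

  public AdamSketch(boolean amsgrad) {
    this.amsgrad = amsgrad;
  }

  /** Returns the additive step to apply to the parameter for this gradient. */
  public double step(double gradient) {
    t++;
    m = beta1 * m + (1 - beta1) * gradient;
    v = beta2 * v + (1 - beta2) * gradient * gradient;
    double mHat = m / (1 - Math.pow(beta1, t)); // bias-corrected moments
    double vHat = v / (1 - Math.pow(beta2, t));
    if (amsgrad) {
      // AMSGrad: use the non-decreasing maximum of the second moment,
      // which prevents the effective step size from growing back later.
      vMax = Math.max(vMax, vHat);
      vHat = vMax;
    }
    return -learningRate * mHat / (Math.sqrt(vHat) + epsilon);
  }
}
```

On the first step both variants behave identically; they diverge only once the second-moment estimate would otherwise shrink, at which point AMSGrad keeps the larger historical value and so takes more conservative steps.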