Convergence Analysis of Stochastic Accelerated Gradient Methods for Generalized Smooth Optimization
We investigate the Randomized Stochastic Accelerated Gradient (RSAG) method, with either constant or adaptive step sizes, for stochastic optimization problems whose objective functions are generalized smooth. Under relaxed affine-variance assumptions on the stochastic gradient noise, we establish h…