Speaker: Dr. Difei Cheng, Institute of Automation, Chinese Academy of Sciences
Inviter: Prof. Bo Zhang
Title: Stochastic Gradient Descent in Non-Convex Problems: Asymptotic Convergence with Relaxed Step-Size via Stopping Time Methods
Time & Venue: 2025.03.25, 10:00-11:00, Room N208, South Building
Abstract: Stochastic Gradient Descent (SGD) is widely adopted in machine learning; however, existing convergence analyses typically assume step sizes satisfying the Robbins-Monro conditions. In practice, a broader class of step sizes is frequently used, yet theoretical convergence guarantees under such relaxed conditions remain limited. To bridge this gap, we introduce a novel stopping-time-based analytical framework from probability theory, enabling rigorous asymptotic convergence analysis of SGD under significantly relaxed step-size conditions. This is joint work with Ruinan Jin, Bo Zhang, Hong Qiao, and others.
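For context, the Robbins-Monro conditions referenced in the abstract require a positive step-size sequence $(\alpha_t)$ satisfying

\[ \sum_{t=1}^{\infty} \alpha_t = \infty \qquad \text{and} \qquad \sum_{t=1}^{\infty} \alpha_t^2 < \infty. \]

As one illustrative example (not drawn from the abstract itself), the schedule $\alpha_t = c/\sqrt{t}$, common in practice, satisfies the first condition but violates the second, since $\sum_t \alpha_t^2 = c^2 \sum_t 1/t$ diverges; step sizes of this kind fall outside the classical assumptions and exemplify the relaxed regime the talk addresses.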