Speaker:Prof. Guoyin Li,The University of New South Wales
Inviter: Prof. Chao Ding
Title:On some recent developments on Kurdyka-Łojasiewicz (KL) inequality
Language:English
Time & Venue: 2025.4.22, 9:30-10:30, Room 813, Siyuan Building, Academy of Mathematics and Systems Science
Abstract: The Kurdyka-Łojasiewicz (KL) inequality is a fundamental tool for analyzing the convergence of numerical methods for nonsmooth and nonconvex optimization problems. In this talk, we discuss recent developments on two aspects of the KL inequality. In the first part, we establish an abstract extended convergence framework that enables one to derive superlinear convergence towards a specific target set (such as the set of second-order stationary points) under a generalized metric subregularity condition, extending the widely used analysis framework based on the KL inequality. We then show that this generalized metric subregularity for second-order stationary points is ensured by the KL inequality together with the strict saddle point condition, which, in turn, is easily satisfied in several important applications. In the second part, we explain a lift-and-project approach for estimating the exponent (when it exists) in the KL inequality. This enables us to estimate KL exponents for functions involving semidefinite programming representability and $C^2$-cone reducible structures. As an application, we establish a convergence analysis for the cubic regularized Newton's method with momentum steps. Specifically, when applying this method to solve the (nonconvex) over-parameterized compressed sensing model, we obtain a (local) quadratic convergence rate to a global minimizer under the strict complementarity condition. In the absence of the strict complementarity condition, we obtain a sublinear convergence rate of $O(\frac{1}{k^2})$ to a global minimizer.
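For readers unfamiliar with the KL exponent mentioned in the abstract, the following is a standard statement of the KL inequality in exponent form, as it commonly appears in the nonsmooth optimization literature; the notation below is chosen for illustration and is not taken from the talk itself.

```latex
% Standard KL inequality with exponent \theta (illustrative notation).
% Let f be proper and lower semicontinuous with limiting subdifferential
% \partial f, and let \bar{x} satisfy 0 \in \partial f(\bar{x}).
% Then f satisfies the KL inequality at \bar{x} with exponent
% \theta \in [0,1) if there exist c, \epsilon, \eta > 0 such that
\[
  \operatorname{dist}\bigl(0, \partial f(x)\bigr)
  \;\ge\; c\,\bigl(f(x) - f(\bar{x})\bigr)^{\theta}
\]
% for all x with \|x - \bar{x}\| \le \epsilon and
% f(\bar{x}) < f(x) < f(\bar{x}) + \eta.
```

Smaller exponents translate into faster convergence rates for first-order methods: $\theta = 1/2$ typically yields linear convergence, while $\theta \in (1/2, 1)$ yields sublinear rates, which is why estimating the exponent (as in the second part of the talk) is of practical interest.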