A Stochastic Block-coordinate Proximal Newton Method for Nonconvex Composite Minimization


This paper presents a stochastic block-coordinate proximal Newton method for minimizing the sum of a blockwise Lipschitz-continuously differentiable function and a separable nonsmooth convex function. At each iteration, the method randomly selects one block and approximately solves a strongly convex regularized quadratic subproblem built from a second-order local model of the smooth part of the objective function, with a backtracking line search to ensure monotonicity of the objective. Under mild sampling assumptions, we show that its convergence properties match those of the inexact proximal Newton method. We further develop a line-search-free variant, where the strongly convex regularized quadratic subproblem is constructed using the Lipschitz constant of the gradient of the smooth component. For this variant, under a suitable parameter setting, we establish the global convergence rate of the residual mapping as well as the superlinear convergence rate of the iterates under the metric $q$-subregularity property with $q > 1$ of the residual mapping for nonconvex composite problems. Under a suitable parameter setting, a more restrictive condition on the Hessian approximation, and the Hölderian error bound condition ($q \in (0, 1]$) of the residual mapping, we also prove the local superlinear/quadratic convergence rate of both the residual mapping and the iterates for convex composite problems. Finally, numerical experiments are conducted to demonstrate the effectiveness and convergence behavior of the proposed algorithm.
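To make the iteration concrete, the following is a minimal sketch of one step of such a method, not the paper's exact algorithm: a block is sampled uniformly, a strongly convex regularized quadratic model over that block is solved approximately by a few proximal-gradient passes, and a backtracking line search enforces monotone decrease. For illustration the separable nonsmooth term is taken to be $g(x) = \lambda\|x\|_1$; the function names, the regularization parameter `mu`, and the inner-solver choice are all assumptions for this sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1, the separable nonsmooth term assumed here.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sbpn_step(x, blocks, grad_f, hess_block, f, lam=0.1, mu=1.0,
              beta=0.5, max_ls=20, rng=None):
    """One (illustrative) stochastic block-coordinate proximal Newton step
    with backtracking line search; g is lam*||.||_1 for concreteness."""
    rng = rng or np.random.default_rng()
    idx = blocks[rng.integers(len(blocks))]          # sample one block uniformly
    g = grad_f(x)[idx]
    H = hess_block(x, idx) + mu * np.eye(len(idx))   # strongly convex local model
    # Approximately solve the regularized quadratic subproblem
    #   min_d  g^T d + 0.5 d^T H d + lam * ||x_idx + d||_1
    # by a fixed number of proximal-gradient passes on the model.
    d = np.zeros(len(idx))
    L = np.linalg.norm(H, 2)                         # Lipschitz const. of model gradient
    for _ in range(50):
        d = soft_threshold(x[idx] + d - (g + H @ d) / L, lam / L) - x[idx]
    # Backtracking line search to ensure monotone decrease of F = f + g.
    F = lambda z: f(z) + lam * np.abs(z).sum()
    t, Fx = 1.0, F(x)
    for _ in range(max_ls):
        x_new = x.copy()
        x_new[idx] = x[idx] + t * d
        if F(x_new) <= Fx:
            break
        t *= beta
    return x_new
```

Because the subproblem's model is strongly convex, its (approximate) solution yields a descent direction for the composite objective, so the backtracking loop terminates after finitely many halvings of the step size.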


💡 Research Summary

This paper addresses the large-scale nonconvex composite optimization problem
$$\min_{x \in \mathbb{R}^n} F(x) := f(x) + g(x),$$
where $f$ is blockwise Lipschitz-continuously differentiable (possibly nonconvex) and $g(x) = \sum_{i=1}^{s} g_i(x_i)$ is a block-separable nonsmooth convex function.
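The convergence rates in the abstract are stated in terms of a residual mapping. A standard choice in this composite setting (the paper's exact definition may differ in scaling) is the proximal-gradient residual:

```latex
\mathcal{R}(x) := x - \operatorname{prox}_{g}\bigl(x - \nabla f(x)\bigr),
```

whose zeros coincide with the stationary points of $F = f + g$, so $\|\mathcal{R}(x^k)\| \to 0$ serves as a natural optimality measure.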

