
Minimization Over the Nonconvex Sparsity Constraint Using a Hybrid First-Order Method

Xiangyu Yang, Hao Wang, Yichen Zhu, Xiao Wang
Feb 2024
Abstract
We investigate a class of nonconvex optimization problems with a continuously differentiable objective, whose feasible set is defined by a level-bounded nonconvex regularizer. We propose a novel hybrid approach that tackles such structured problems within a first-order algorithmic framework by combining the Frank-Wolfe method and the gradient projection method. The Frank-Wolfe step admits a closed-form solution, while the gradient projection step can be performed efficiently in a reduced subspace. Notably, our approach requires no smoothing parameters, so it solves the original nonsmooth problem directly. We establish the global convergence of the proposed algorithm and show an $O(1/\sqrt{k})$ convergence rate in terms of the optimality error for nonconvex objectives under reasonable assumptions. Numerical experiments underscore the practicality and efficiency of the proposed algorithm compared to existing cutting-edge methods. Furthermore, we highlight how the proposed algorithm contributes to the advancement of nonconvex regularizer-constrained optimization.
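To make the abstract's hybrid idea concrete, here is a minimal sketch on one specific nonconvex feasible set, the sparsity-plus-norm ball $\{x : \|x\|_0 \le s,\ \|x\|_2 \le r\}$, where both the linear minimization oracle (Frank-Wolfe subproblem) and the Euclidean projection have closed forms. This is an illustration only, not the authors' algorithm: the constraint set, the backtracking switch between the two steps, and the re-projection after the Frank-Wolfe update are all simplifying assumptions made for this toy.

```python
import numpy as np

def project_sparse_ball(x, s, r):
    """Projection onto {||x||_0 <= s, ||x||_2 <= r}: hard-threshold to the
    s largest-magnitude entries, then rescale into the l2 ball."""
    p = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -s)[-s:]
    p[idx] = x[idx]
    nrm = np.linalg.norm(p)
    return p * (r / nrm) if nrm > r else p

def lmo_sparse_ball(g, s, r):
    """Closed-form linear minimization oracle: argmin <g, v> over the set."""
    v = np.zeros_like(g)
    idx = np.argpartition(np.abs(g), -s)[-s:]
    nrm = np.linalg.norm(g[idx])
    if nrm > 0:
        v[idx] = -r * g[idx] / nrm
    return v

def hybrid_fw_gp(f, grad_f, x0, s, r, lipschitz, iters=500, tol=1e-8):
    """Toy hybrid of a Frank-Wolfe step and a reduced-subspace gradient
    projection step; keeps whichever trial point decreases f more."""
    x = project_sparse_ball(np.asarray(x0, dtype=float), s, r)
    for _ in range(iters):
        g = grad_f(x)
        v = lmo_sparse_ball(g, s, r)
        gap = g @ (x - v)          # Frank-Wolfe gap as an optimality measure
        if gap < tol:
            break
        # Frank-Wolfe trial step with simple backtracking; re-project to
        # restore feasibility, since the set here is nonconvex.
        fw, gamma = x, 1.0
        while gamma > 1e-4:
            cand = project_sparse_ball(x + gamma * (v - x), s, r)
            if f(cand) < f(x):
                fw = cand
                break
            gamma *= 0.5
        # Gradient-projection trial step restricted to the current support
        # (the "reduced subspace"), with the classical 1/L step size.
        supp = x != 0
        y = x.copy()
        y[supp] -= g[supp] / lipschitz
        gp = project_sparse_ball(y, s, r)
        x = fw if f(fw) < f(gp) else gp
    return x
```

For instance, minimizing $\tfrac12\|x-b\|^2$ with $b=(3,1,0.5,0.2)$ under $s=2$ recovers the best 2-sparse point $(3,1,0,0)$: the Frank-Wolfe step discovers the support and the reduced-subspace gradient step then solves exactly on it.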