# Distribution-aware $\ell_1$ Analysis Minimization

Dec 2022

This work is about recovering an analysis-sparse vector, i.e. a vector that is sparse in some transform domain, from under-sampled measurements. In real-world applications, there often exist random analysis-sparse vectors whose distribution in the analysis domain is known. To exploit this information, a weighted $\ell_1$ analysis minimization is often considered. Choosing the weights in this case is, however, challenging and non-trivial. In this work, we provide an analytical method to choose suitable weights. Specifically, we first obtain a tight upper-bound expression for the expected number of required measurements. This bound depends on two critical parameters: the support distribution and the expected sign in the analysis domain, both of which are accessible in advance. We then calculate near-optimal weights by minimizing this expression with respect to the weights. Our strategy works in both noiseless and noisy settings. Numerical results demonstrate the superiority of the proposed method: weighted $\ell_1$ analysis minimization with our near-optimal weights needs considerably fewer measurements than its regular $\ell_1$ analysis counterpart.
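To make the setting concrete, here is a minimal sketch of weighted $\ell_1$ analysis minimization, $\min_x \sum_i w_i |(\Omega x)_i|$ subject to $Ax = y$, cast as a linear program. The instance sizes, the square random analysis operator, and the oracle-style weights (zero on the known analysis support) are all illustrative assumptions for this sketch, not the paper's actual weighting design, which is derived from the support distribution and expected sign.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative instance (all sizes are assumptions, not from the paper):
# signal dimension n, measurements m, analysis rows p, analysis sparsity s.
rng = np.random.default_rng(0)
n, m, p, s = 40, 25, 40, 5

# Square, generically invertible analysis operator Omega, and a vector
# x_true that is s-sparse in the analysis domain: Omega @ x_true = z.
Omega = rng.standard_normal((p, n))
z = np.zeros(p)
S = rng.choice(p, size=s, replace=False)
z[S] = rng.standard_normal(s)
x_true = np.linalg.solve(Omega, z)

# Under-sampled Gaussian measurements y = A x_true (noiseless setting).
A = rng.standard_normal((m, n))
y = A @ x_true

# Oracle-style weights for illustration: 0 on the known analysis support,
# 1 elsewhere. The paper instead derives near-optimal weights analytically.
w = np.ones(p)
w[S] = 0.0

# LP in variables (x, t):  minimize w @ t
#   s.t.  Omega x - t <= 0,  -Omega x - t <= 0  (i.e. |Omega x| <= t),
#         A x = y,  with all variables free.
c = np.concatenate([np.zeros(n), w])
A_ub = np.block([[Omega, -np.eye(p)], [-Omega, -np.eye(p)]])
b_ub = np.zeros(2 * p)
A_eq = np.hstack([A, np.zeros((m, p))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * (n + p))
x_hat = res.x[:n]
print("solver status:", res.status)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With the support known and down-weighted, the LP essentially minimizes the off-support analysis $\ell_1$ norm, so far fewer measurements suffice than for the unweighted $\ell_1$ analysis problem; this is the effect the paper's weighting design aims to achieve without oracle knowledge, using only the support distribution.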
