All Dates/Times are Australian Eastern Standard Time (AEST)

Paper Detail

Paper ID: D5-S4-T3.3
Paper Title: Local Convergence of an AMP Variant to the LASSO Solution in Finite Dimensions
Authors: Yanting Ma, Mitsubishi Electric Research Laboratories, United States; Min Kang, Jack W. Silverstein, Dror Baron, North Carolina State University, United States
Session: D5-S4-T3: Topics in Recovery I
Chaired Session: Friday, 16 July, 23:00 - 23:20
Engagement Session: Friday, 16 July, 23:20 - 23:40
Abstract: A common formulation of sparse linear regression is l1-regularized least squares, also known as the least absolute shrinkage and selection operator (LASSO). Approximate message passing (AMP) has been proved to achieve the LASSO solution asymptotically when the regression matrix has independent and identically distributed (i.i.d.) Gaussian entries, in the sense that the averaged per-coordinate l2 distance between the AMP iterates and the LASSO solution vanishes as the signal dimension goes to infinity, with this limit taken before the iteration number goes to infinity. However, in finite-dimensional settings, the behavior of the AMP iterates in the limit of large iteration number has not been characterized. In this work, we propose an AMP variant that includes a parameter depending on the largest singular value of the regression matrix. The proposed algorithm can also be viewed as a primal-dual hybrid gradient (PDHG) algorithm with adaptive stepsizes. We show that whenever the AMP variant converges, it converges to the LASSO solution for arbitrary finite-dimensional regression matrices. Moreover, we show that the AMP variant is locally stable around the LASSO solution under the conditions that the LASSO solution is unique and that the regression matrix is drawn from a continuous distribution. Our local stability result implies that when the regression matrix is large and has i.i.d. random entries, the original AMP, which is a special case of the proposed variant, is locally stable around the LASSO solution.
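
The abstract does not spell out the variant's update equations, so the following is only a point of reference, not the paper's algorithm: a minimal sketch of the standard PDHG (Chambolle-Pock) iteration applied to the LASSO objective min_x 0.5*||Ax - y||^2 + lam*||x||_1, with fixed stepsizes set from the largest singular value of A so that the usual PDHG convergence condition sigma*tau*||A||_2^2 < 1 holds. The function names, stepsize choice, and example data are illustrative assumptions; the paper's variant instead adapts its stepsizes in an AMP-like fashion.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1 (entrywise soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def pdhg_lasso(A, y, lam, num_iters=500):
    # Standard PDHG for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
    # Illustrative fixed stepsizes; the paper's AMP variant is adaptive.
    m, n = A.shape
    s_max = np.linalg.norm(A, 2)   # largest singular value of A
    sigma = tau = 0.99 / s_max     # gives sigma*tau*s_max^2 = 0.9801 < 1
    x = np.zeros(n)
    x_bar = x.copy()
    z = np.zeros(m)
    for _ in range(num_iters):
        # Dual update: prox of sigma*f* for f(u) = 0.5*||u - y||^2,
        # whose closed form is (v - sigma*y) / (1 + sigma).
        z = (z + sigma * (A @ x_bar) - sigma * y) / (1.0 + sigma)
        # Primal update: prox of tau*lam*||.||_1 is soft thresholding.
        x_new = soft_threshold(x - tau * (A.T @ z), tau * lam)
        # Extrapolation step (theta = 1).
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

# Tiny usage example on a synthetic sparse-recovery instance.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 100)) / np.sqrt(50)
    x_true = np.zeros(100)
    x_true[:5] = 1.0
    y = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = pdhg_lasso(A, y, lam=0.05)
    print("recovered support:", np.nonzero(np.abs(x_hat) > 1e-3)[0])

With theta = 1 and sigma*tau*||A||_2^2 < 1, PDHG is guaranteed to converge to a LASSO minimizer for any fixed matrix A; the abstract indicates that the proposed variant's contribution lies in making the stepsizes adaptive while retaining AMP as a special case.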