(03/28/19) New paper: Asymptotics and optimal designs of SLOPE
In sparse linear regression, the SLOPE estimator generalizes the LASSO by assigning magnitude-dependent regularization weights to the different coordinates of the estimate. In our new paper, we present an asymptotically exact characterization of the performance of SLOPE in the high-dimensional regime. This characterization enables us to derive optimal regularization sequences that either minimize the MSE or maximize the power in variable selection at any given level of Type-I error.
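As a concrete reference point, here is a minimal sketch of the sorted-ℓ1 penalty that defines SLOPE; the function name `slope_penalty` and the calling convention are illustrative assumptions, not code from the paper.

```python
import numpy as np

def slope_penalty(beta, lam):
    """Sorted-L1 (SLOPE) penalty: sum_i lam[i] * |beta|_(i).

    lam is assumed non-increasing (lam[0] >= lam[1] >= ...), so the
    largest coordinate in magnitude receives the heaviest weight.
    """
    abs_desc = np.sort(np.abs(beta))[::-1]  # |beta|_(1) >= |beta|_(2) >= ...
    return float(np.sum(lam * abs_desc))
```

With a constant sequence `lam`, this reduces to the usual LASSO penalty; the design question studied in the paper is how to choose the sequence itself.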
In our recent paper, we present the optimal design of a spectral method widely used to initialize nonconvex optimization algorithms for solving phase retrieval and other signal recovery problems. Our work leverages recent results that provide an exact characterization of the performance of the spectral method in the high-dimensional limit. Interestingly, under a mild technical condition, our results show that there exists a fixed design that is uniformly optimal over all sampling...
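For readers unfamiliar with the method, the following is a minimal sketch of a spectral initializer: form a weighted sample covariance from the measurements and take its leading eigenvector. The preprocessing function `T` is the design object being optimized; the identity default below is a placeholder assumption, not the optimal design from the paper.

```python
import numpy as np

def spectral_init(A, y, T=lambda t: t):
    """Spectral initialization: leading eigenvector of
    D = (1/m) * sum_i T(y_i) * a_i a_i^T,
    where a_i is the i-th row of the m x n sensing matrix A.
    """
    m = A.shape[0]
    D = (A * T(y)[:, None]).T @ A / m  # n x n symmetric matrix
    _, eigvecs = np.linalg.eigh(D)     # eigenvalues in ascending order
    return eigvecs[:, -1]              # top eigenvector as the initial estimate
```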
(09/26/18) New paper: Nonconvex optimization meets low-rank matrix factorization
Substantial progress has been made recently on developing provably accurate and efficient algorithms for low-rank matrix factorization via nonconvex optimization. In our recent paper, we (Yuejie Chi, Yuxin Chen, and I) present a technical overview highlighting the important role of statistical models in enabling efficient nonconvex optimization with performance guarantees. We review two contrasting approaches: (1) two-stage algorithms, which consist of a tailored...
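As a toy illustration of the factorized, nonconvex approach (not the specific algorithms analyzed in the overview), here is gradient descent on f(X) = ¼‖XXᵀ − M‖²_F for a symmetric PSD matrix M; the step size, iteration count, and random initialization are arbitrary assumptions for the sketch.

```python
import numpy as np

def factored_gd(M, r, step=0.01, iters=500, seed=0):
    """Gradient descent on the factorized objective
    f(X) = 0.25 * ||X X^T - M||_F^2 over n x r factors X,
    for a symmetric PSD matrix M of rank roughly r.
    """
    rng = np.random.default_rng(seed)
    n = M.shape[0]
    # Random initialization; two-stage algorithms would replace this
    # with a spectral initialization.
    X = rng.standard_normal((n, r)) / np.sqrt(n)
    for _ in range(iters):
        grad = (X @ X.T - M) @ X  # gradient of f at X (M symmetric)
        X -= step * grad
    return X
```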
Together with Yuxin Chen (Princeton) and Yuejie Chi (CMU), I gave a tutorial at this year's ICASSP on recent advances in nonconvex statistical estimation. We covered topics including the landscapes of nonconvex estimation, the analysis of gradient descent and stochastic gradient descent methods, spectral methods for initialization, and example applications to phase retrieval, low-rank matrix recovery, and blind deconvolution.
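As a small taste of the gradient-descent material, here is a minimal real-valued sketch of gradient descent on the intensity loss for phase retrieval (Wirtinger-flow style); the loss, step size, and the generic starting point `x0` are illustrative assumptions, and in practice one would start from a spectral initialization like the one sketched above.

```python
import numpy as np

def phase_retrieval_gd(A, y, x0, step=0.1, iters=200):
    """Gradient descent on the real-valued intensity loss
    f(x) = (1/4m) * sum_i ((a_i @ x)**2 - y_i)**2,
    where y_i ~ (a_i @ x_star)**2 are intensity measurements.
    """
    m = len(y)
    x = x0.astype(float).copy()
    for _ in range(iters):
        z = A @ x                          # linear measurements a_i @ x
        grad = A.T @ ((z**2 - y) * z) / m  # gradient of f at x
        x -= step * grad
    return x
```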