(03/28/19) New paper: Asymptotics and optimal designs of SLOPE

In sparse linear regression, the SLOPE estimator generalizes the LASSO by assigning a different regularization weight to each coordinate of the estimate according to its magnitude rank. In our new paper, we present an asymptotically exact characterization of the performance of SLOPE in the high-dimensional regime. This characterization enables us to derive regularization sequences that are optimal in two senses: minimizing the MSE, or maximizing the power in variable selection at any given level of Type-I error.
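For readers unfamiliar with SLOPE: the estimator solves min_b (1/2)||y - Xb||^2 + sum_i lambda_i |b|_(i), where the nonincreasing weights lambda_1 >= ... >= lambda_p are matched to the coordinates sorted by magnitude. Below is a minimal sketch of the proximal operator of this sorted-L1 penalty, the building block of proximal-gradient solvers for SLOPE, using the standard pool-adjacent-violators approach. The function name and interface are illustrative, not from the paper:

```python
import numpy as np

def prox_slope(y, lam):
    """Prox of the sorted-L1 (SLOPE) penalty:
    argmin_b 0.5*||b - y||^2 + sum_i lam_i * |b|_(i),
    where lam is a nonincreasing, nonnegative weight sequence.
    Stack-based pool-adjacent-violators; O(n log n) overall."""
    sign = np.sign(y)
    mag = np.abs(y)
    order = np.argsort(mag)[::-1]      # sort magnitudes in decreasing order
    z = mag[order] - lam               # shifted values to be made nonincreasing
    # Pool adjacent violators: merge blocks until block averages are nonincreasing.
    blocks = []                        # each block stores [sum, count]
    for v in z:
        blocks.append([v, 1])
        while len(blocks) > 1 and blocks[-2][0] / blocks[-2][1] <= blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    x = np.concatenate([np.full(c, max(s / c, 0.0)) for s, c in blocks])
    out = np.empty_like(y)
    out[order] = x                     # undo the sorting
    return sign * out
```

With a constant weight sequence, this prox reduces to ordinary soft thresholding, which is a convenient sanity check.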
In our recent paper, we present the optimal design of a spectral method widely used to initialize nonconvex optimization algorithms for solving phase retrieval and other signal recovery problems. Our work leverages recent results that provide an exact characterization of the performance of the spectral method in the high-dimensional limit. Interestingly, under a mild technical condition, our results show that there exists a fixed design that is uniformly optimal over all sampling...
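Concretely, the spectral method forms the weighted matrix D = (1/m) * sum_i T(y_i) a_i a_i^T from the sensing vectors a_i and measurements y_i, and outputs its leading eigenvector; the design question studied in the paper is the choice of the preprocessing function T. A minimal sketch, with the identity as a placeholder T (the paper's optimal T is not reproduced here, and the function name is illustrative):

```python
import numpy as np

def spectral_init(A, y, preprocess=lambda t: t):
    """Spectral initialization: return the leading eigenvector of
    D = (1/m) * sum_i T(y_i) * a_i a_i^T, where a_i are the rows of A.
    The preprocessing function T defaults to the identity here; the
    optimal design of T is the subject of the paper."""
    m, _ = A.shape
    w = preprocess(y)
    D = (A * w[:, None]).T @ A / m     # (1/m) * A^T diag(T(y)) A
    eigvals, eigvecs = np.linalg.eigh(D)
    return eigvecs[:, -1]              # eigenvector of the largest eigenvalue
```

For phase retrieval with Gaussian sensing vectors and y_i = (a_i^T x)^2, even the identity preprocessing yields an estimate correlated with the true signal once the number of measurements is large relative to the dimension.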
(09/26/18) New paper: Nonconvex optimization meets low-rank matrix factorization

Substantial progress has been made recently on developing provably accurate and efficient algorithms for low-rank matrix factorization via nonconvex optimization. In our recent paper, we (Yuejie Chi, Yuxin Chen, and I) present a technical overview highlighting the important role of statistical models in enabling efficient nonconvex optimization with performance guarantees. We review two contrasting approaches: (1) two-stage algorithms, which consist of a tailored...
(05/27/18) Phase retrieval via polytope optimization: Geometry, phase transitions, and new algorithms

In our recent paper, we study algorithms for solving quadratic systems of equations via optimization over polytopes. Our work is inspired by a recently proposed convex formulation of the phase retrieval problem, which estimates the unknown signal by solving a simple linear program over a polytope constructed from the measurements. We present a sharp characterization of the high-dimensional geometry of this polytope under Gaussian measurements. This...
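For concreteness, a PhaseMax-style convex formulation of the kind referenced above maximizes correlation with an anchor vector over the polytope {x : |a_i^T x| <= b_i} built from the measurements b_i = |a_i^T x0|. A hedged real-valued sketch using SciPy's LP solver; the function name and the choice of anchor (e.g. a spectral initializer) are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def phasemax_lp(A, b, x_init):
    """PhaseMax-style recovery (real-valued case): maximize <x_init, x>
    over the polytope {x : |a_i^T x| <= b_i}, which is a linear program.
    x_init is an anchor vector assumed to be correlated with the signal."""
    m, n = A.shape
    A_ub = np.vstack([A, -A])          # encode |A x| <= b as two one-sided constraints
    b_ub = np.concatenate([b, b])
    res = linprog(c=-x_init, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * n)
    return res.x
```

When the anchor is well correlated with the true signal and there are enough measurements, the true signal is a vertex of the polytope and the LP recovers it exactly (up to global sign).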
Together with Yuxin Chen (Princeton) and Yuejie Chi (CMU), I gave a tutorial at this year's ICASSP on recent advances in nonconvex statistical estimation. The tutorial covered the landscapes of nonconvex estimation, the analysis of gradient descent and stochastic gradient descent methods, spectral methods for initialization, and example applications to phase retrieval, low-rank matrix recovery, and blind deconvolution.
(12/06/17) Understanding the Dynamics of Online Learning Algorithms via Scaling and Mean-Field Limits

In our recent paper, we present a tractable and asymptotically exact framework for analyzing the dynamics of online learning algorithms in the high-dimensional scaling limit. We apply our results to two concrete examples: online regularized linear regression and principal component analysis. As the ambient dimension tends to infinity, and with proper time scaling, we show that the time-varying joint empirical measures of the target feature vector and its estimates provided by the...
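As a concrete instance of the PCA example, here is a hedged sketch of Oja's rule for online PCA, the kind of streaming algorithm whose high-dimensional dynamics such scaling limits describe. The spiked-covariance data model and all parameter names below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def oja_online_pca(stream, x0, step):
    """Oja's rule: one streaming update per sample a_k, followed by
    renormalization. Maintains an estimate of the leading eigenvector
    of the covariance of the stream."""
    x = x0 / np.linalg.norm(x0)
    for a in stream:
        x = x + step * (a @ x) * a     # stochastic gradient step on the Rayleigh quotient
        x = x / np.linalg.norm(x)      # project back to the unit sphere
    return x
```

In the mean-field analysis, one tracks the overlap between the running estimate and the planted spike direction as the dimension grows; in simulation, that overlap climbs from near zero to close to one.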