%0 Journal Article
%J IEEE Transactions on Information Theory
%D 2022
%T Asymptotics and Optimal Designs of SLOPE for Sparse Linear Regression
%A Hong Hu
%A Yue M. Lu
%X In sparse linear regression, the SLOPE estimator generalizes LASSO by assigning magnitude-dependent regularizations to different coordinates of the estimate. In this paper, we present an asymptotically exact characterization of the performance of SLOPE in the high-dimensional regime where the number of unknown parameters grows in proportion to the number of observations. Our asymptotic characterization enables us to derive optimal regularization sequences to either minimize the MSE or to maximize the power in variable selection under any given level of Type-I error. In both cases, we show that the optimal design can be recast as certain infinite-dimensional convex optimization problems, which have efficient and accurate finite-dimensional approximations. Numerical simulations verify our asymptotic predictions. They also demonstrate the superiority of our optimal design over LASSO and a regularization sequence previously proposed in the literature.
%B IEEE Transactions on Information Theory
%V 68
%P 7627--7664
%G eng
%U https://arxiv.org/abs/1903.11582
%N 11
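
For context, the SLOPE estimator discussed in the abstract is usually written as the solution of a sorted-$\ell_1$ penalized least-squares problem; the display below is the standard formulation (not quoted from the paper) and is included only to make "magnitude-dependent regularizations" concrete:
\[
\hat{\beta} \;=\; \underset{\beta \in \mathbb{R}^p}{\arg\min}\; \tfrac{1}{2}\,\|y - X\beta\|_2^2 \;+\; \sum_{i=1}^{p} \lambda_i\, |\beta|_{(i)}, \qquad \lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p \ge 0,
\]
where $|\beta|_{(1)} \ge |\beta|_{(2)} \ge \cdots \ge |\beta|_{(p)}$ denote the entries of $\beta$ sorted by absolute value. LASSO corresponds to the special case of a constant sequence $\lambda_1 = \cdots = \lambda_p$; the paper's optimal designs choose this sequence either to minimize the MSE or to maximize selection power at a given Type-I error level.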