# Publications

“Spectral Universality of Regularized Linear Regression with Nearly Deterministic Sensing Matrices,” Submitted. arXiv:2208.02753 [cs.IT]

“Universality of Approximate Message Passing with Semi-Random Matrices,” Submitted. arXiv:2204.04281 [math.PR]

“Universality Laws for High-Dimensional Learning with Random Features,” IEEE Transactions on Information Theory, in press, 2022. arXiv:2009.07669 [cs.IT]

“Asymptotics and Optimal Designs of SLOPE for Sparse Linear Regression,” IEEE Transactions on Information Theory, in press, 2022. arXiv:1903.11582 [cs.IT]

“Analysis of Random Sequential Message Passing Algorithms for Approximate Inference,” Journal of Statistical Mechanics: Theory and Experiment, no. 073401, 2022. arXiv:2202.08198 [cs.LG]

“Householder Dice: A Matrix-Free Algorithm for Simulating Dynamics on Gaussian and Random Orthogonal Ensembles,” IEEE Transactions on Information Theory, vol. 67, no. 12, pp. 8264-8272, 2021. arXiv:2101.07464 [cs.IT]

“A Precise Performance Analysis of Learning with Random Features,” Technical report, 2021. arXiv:2008.11904 [cs.IT]

“Construction of optimal spectral methods in phase retrieval,” in Mathematical and Scientific Machine Learning, 2021. arXiv:2012.04524 [cs.IT]

“On the Inherent Regularization Effects of Noise Injection During Training,” in International Conference on Machine Learning (ICML), 2021. arXiv:2102.07379 [cs.LG]

“Phase Transitions in Transfer Learning for High-Dimensional Perceptrons,” Entropy, Special Issue “The Role of Signal Processing and Information Theory in Modern Machine Learning,” vol. 23, no. 4, 2021. arXiv:2101.01918 [cs.LG]

“The Limiting Poisson Law of Massive MIMO Detection with Box Relaxation,” IEEE Journal on Selected Areas in Information Theory, vol. 1, no. 3, pp. 695-704, 2020. arXiv:2006.08416 [cs.IT]

“Phase Transitions of Spectral Initialization for High-Dimensional Nonconvex Estimation,” Information and Inference: A Journal of the IMA, vol. 9, no. 3, pp. 507-541, 2020. arXiv:1702.06435 [cs.IT]

“Generalization error in high-dimensional perceptrons: Approaching Bayes error with convex optimization,” in Conference on Neural Information Processing Systems (NeurIPS), 2020. arXiv:2006.06560 [stat.ML]

“The role of regularization in classification of high-dimensional noisy Gaussian mixture,” in International Conference on Machine Learning (ICML), 2020. arXiv:2002.11544 [stat.ML]

“Optimal Spectral Initialization for Signal Recovery with Applications to Phase Retrieval,” IEEE Transactions on Signal Processing, vol. 67, no. 9, pp. 2347-2356, 2019. arXiv:1811.04420 [cs.IT]

“Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview,” IEEE Transactions on Signal Processing, vol. 67, no. 20, pp. 5239-5269, 2019. arXiv:1809.09573 [cs.LG]

“The scaling limit of high-dimensional online independent component analysis,” Journal of Statistical Mechanics (Special Issue on Machine Learning), vol. 2019, 2019.

“A Solvable High-Dimensional Model of GAN,” in Proc. Thirty-third Conference on Neural Information Processing Systems (NeurIPS), 2019. arXiv:1805.08349 [cs.LG]