Andrew and Erna Viterbi Early Career Chair

Assistant Professor, Departments of Electrical and Computer Engineering and Computer Science

University of Southern California

I'm currently a Ph.D. student in the Electrical Engineering department at Stanford, advised by Emmanuel Candes.

Starting September, I will spend one year at UC Berkeley EECS/Statistics as a postdoctoral researcher. My sponsors at Berkeley are Ben Recht and Martin Wainwright. In August 2015, I will join the USC EE department as an assistant professor.

I'm interested in problems that lie at the interplay of statistics, convex optimization, geometric functional analysis, and theoretical computer science, and in using these tools to design and analyze algorithms for data mining, machine learning, computer vision, and signal/image processing. Recently, I've also become interested in the analysis of non-convex problems via iterative methods and in the interactions between convex optimization and algebraic geometry.

Email: mahdisol@stanford.edu

Office: 380-U, Mathematics, Building 380, 450 Serra Mall, Stanford, CA 94305


- Compressive sensing with un-trained neural networks: Gradient descent finds the smoothest approximation. R. Heckel and M. Soltanolkotabi. International Conference on Machine Learning (ICML 2020).

- Denoising and Regularization via Exploiting the Structural Bias of Convolutional Generators. R. Heckel and M. Soltanolkotabi. International Conference on Learning Representations (ICLR 2020).

- Precise Tradeoffs in Adversarial Training for Linear Regression. A. Javanmard, M. Soltanolkotabi, and H. Hassani. Conference on Learning Theory (COLT 2020).

- High-dimensional Robust Mean Estimation via Gradient Descent. Y. Cheng, I. Diakonikolas, R. Ge, and M. Soltanolkotabi. International Conference on Machine Learning (ICML 2020).

- Approximation Schemes for ReLU Regression. I. Diakonikolas, S. Goel, S. Karmalkar, A. Klivans, and M. Soltanolkotabi. Conference on Learning Theory (COLT 2020).

- Convergence and sample complexity of gradient methods for the model-free linear quadratic regulator problem. H. Mohammadi, A. Zare, M. Soltanolkotabi, and M. R. Jovanovic. Submitted.

- Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks. M. Li, M. Soltanolkotabi, and S. Oymak. International Conference on Artificial Intelligence and Statistics (AISTATS 2020).

- Towards moderate overparameterization: global convergence guarantees for training shallow neural networks. S. Oymak and M. Soltanolkotabi. Journal on Selected Areas in Information Theory, Deep Learning: Mathematical Foundations and Applications to Information Science, 2020.

- 3D Phase Retrieval at Nano-Scale via Accelerated Wirtinger Flow. Z. Fabian, J. Haldar, R. Leahy, and M. Soltanolkotabi. EUSIPCO 2020.

- Generalization Guarantees for Neural Networks via Harnessing the Low-rank Structure of the Jacobian. S. Oymak, Z. Fabian, M. Li, and M. Soltanolkotabi. Submitted.

- Compressed Sensing with Deep Image Prior and Learned Regularization. D. V. Veen, A. Jalal, M. Soltanolkotabi, E. Price, S. Vishwanath, and A. G. Dimakis. Submitted.

- End-to-end Learning of a Convolutional Neural Network via Deep Tensor Decomposition. S. Oymak and M. Soltanolkotabi. Submitted.

- Overparameterized Nonlinear Learning: Gradient Descent Takes the Shortest Path? S. Oymak and M. Soltanolkotabi. International Conference on Machine Learning (ICML 2019).

- Lagrange Coded Computing: Optimal Design for Resiliency, Security and Privacy. Q. Yu, S. Li, N. Raviv, M. Mousavi Kalan, M. Soltanolkotabi, and S. Avestimehr. International Conference on Artificial Intelligence and Statistics (AISTATS 2019).

- Fitting ReLUs via SGD and Quantized SGD. M. Mousavi Kalan, M. Soltanolkotabi, and A. Avestimehr. International Symposium on Information Theory (ISIT 2019).

- Fundamental Resource Trade-offs for Encoded Distributed Optimization. A. Avestimehr, M. Mousavi Kalan, and M. Soltanolkotabi. Submitted.

- Accelerated Wirtinger Flow: A Fast Algorithm for Ptychography. R. Xu, M. Soltanolkotabi, J. P. Haldar, W. Unglaub, J. Zusman, A. F. J. Levi, and R. M. Leahy. Submitted.

- Accelerated Wirtinger Flow for Multiplexed Fourier Ptychographic Microscopy. E. Bostan, M. Soltanolkotabi, D. Ren, and L. Waller. IEEE International Conference on Image Processing (ICIP 2018).

- Structured signal recovery from quadratic measurements: Breaking sample complexity barriers via nonconvex optimization. M. Soltanolkotabi. IEEE Trans. on Info. Theory.

- Theoretical insights into the optimization landscape of over-parameterized shallow neural networks. M. Soltanolkotabi, A. Javanmard, and J. D. Lee. IEEE Trans. on Info. Theory.

- Near-Optimal Straggler Mitigation for Distributed Gradient Methods. S. Li, M. Mousavi Kalan, S. Avestimehr, and M. Soltanolkotabi. The 7th International Workshop on Parallel and Distributed Computing for Large Scale Machine Learning and Big Data Analytics.

- Learning ReLUs via Gradient Descent. M. Soltanolkotabi. Neural Information Processing Systems (NIPS 2017).

- Gradient Methods for Submodular Maximization. H. Hassani, M. Soltanolkotabi, and A. Karbasi. Neural Information Processing Systems (NIPS 2017).

- Fast and Reliable Parameter Estimation from Nonlinear Observations. S. Oymak and M. Soltanolkotabi. SIAM Journal on Optimization.

- Generalized Line Spectral Estimation. R. Heckel and M. Soltanolkotabi. IEEE Transactions on Information Theory.

- Sharp Time-Data Tradeoffs for Linear Inverse Problems. S. Oymak, B. Recht, and M. Soltanolkotabi. IEEE Transactions on Information Theory.

- Isometric Sketching of Any Set via the Restricted Isometry Property. S. Oymak, B. Recht, and M. Soltanolkotabi. Information and Inference.

- Low-rank Solutions of Linear Matrix Equations via Procrustes Flow. S. Tu, R. Boczar, M. Soltanolkotabi, and B. Recht. Proceedings of the 33rd International Conference on Machine Learning, 2016.

- Super-Resolution Radar. R. Heckel, V. I. Morghenstern, and M. Soltanolkotabi. Information and Inference (2016) 5(1): 22-75.

- Experimental robustness of Fourier Ptychography phase retrieval algorithms. L. Yeh, J. Dong, J. Zhong, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, and L. Waller. Optics Express, Vol. 23, Issue 26, pp. 33214-33240 (2015).

- Phase Retrieval via Wirtinger Flow: Theory and Algorithms. (website) E. J. Candes, X. Li, and M. Soltanolkotabi.

- Algorithms and Theory for Clustering and Nonconvex Quadratic Programming. M. Soltanolkotabi. Stanford University Ph.D. Dissertation, August 2014.

- Phase retrieval from coded diffraction patterns. (website) E. J. Candes, X. Li, and M. Soltanolkotabi.

- Robust Subspace Clustering. (website) M. Soltanolkotabi, E. Elhamifar, and E. J. Candes. Annals of Statistics 2014, Vol. 42, No. 2, 669-699.

- A Geometric Analysis of Subspace Clustering with Outliers. (code) M. Soltanolkotabi and E. J. Candes. Annals of Statistics 40(4), 2195-2238, 2012.

- Discussion of "Latent Variable Graphical Model Selection via Convex Optimization". E. J. Candes and M. Soltanolkotabi. Annals of Statistics 40(2), 1997-2004, 2012.

- From robust subspace clustering to full-rank matrix completion. (Extended abstract; more detail available in my dissertation.) E. J. Candes, L. Mackey, and M. Soltanolkotabi.

- A Unified Approach to Sparse Signal Processing. (during undergrad) F. Marvasti, A. Amini, F. Haddadi, M. Soltanolkotabi, B. Khalaj, A. Aldroubi, S. Sanei, and J. Chambers. EURASIP Journal on Advances in Signal Processing.

- Errorless Codes for Over-loaded Wireless CDMA with Active User Detection. P. Pad, M. Soltanolkotabi, S. Hadikhanlou, A. Enayati, and F. Marvasti. Proc. of the International Conference on Communications (ICC 2009), Dresden, Germany.

- OFDM Channel Estimation based on Adaptive Thresholding for Sparse Signal Detection. M. Soltanolkotabi, A. Amini, and F. Marvasti. Proc. of the European Signal Processing Conference (EUSIPCO 2009).

- Throughput Capacity of a Multi-channel Multi-hop Mobile Ad-hoc Network. M. Soltanolkotabi and F. Ashtiani. Proc. of the International Conference on Telecommunications (ICT 2009).

- A Practical Sparse Channel Estimation for Current OFDM Standards. M. Soltanolkotabi, M. Soltanalian, A. Amini, and F. Marvasti. Proc. of the International Conference on Telecommunications (ICT 2009).

- Salt and Pepper Noise Removal for Images. S. Feizi-Khankandi, S. Zahed Pour, M. Soltanolkotabi, A. Amini, and F. Marvasti. Proc. of the International Conference on Telecommunications (ICT 2008).

- Robust Subspace Clustering.

- Stanford Biostatistics Seminar, February 2014.

- ICML Spectral Learning Workshop, June 2013.

- Asilomar Conference on Signals, Systems, and Computers.

- MURI annual meeting, Princeton, October 2012.

- Information Theory and Applications workshop, Feb. 2013.

- A Geometric Analysis of Subspace Clustering with Outliers.

- Berkeley Robotics Lab, February 2012.

- High-Dimensional Phenomena in Statistics and Machine Learning Seminar, Georgia Tech., July 2012.

- Workshop on Modern Massive Data Sets (MMDS), Stanford, July 2012.

- CS 229: Machine Learning, Fall 2012.

Stanford EE:

- EE 278: Statistical Signal Processing, Summer 2010.

Stanford Math:

- Math 104: Linear Algebra, Winter 2012.

Stanford Statistics: