Publications about 'Convergence rate'


S. Samuelson.
Performance tradeoffs of accelerated first-order optimization algorithms.
PhD thesis,
University of Southern California,
2024.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Gradient-flow dynamics,
Heavy-ball method,
Nesterov's accelerated method,
Non-normal dynamics,
Noise amplification,
Optimization,
Transient growth.

H. Mohammadi.
Robustness of gradient methods for data-driven decision making.
PhD thesis,
University of Southern California,
2022.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Data-driven control,
Gradient descent,
Gradient-flow dynamics,
Heavy-ball method,
Integral quadratic constraints,
Linear quadratic regulator,
Model-free control,
Nesterov's accelerated method,
Nonconvex optimization,
Non-normal dynamics,
Noise amplification,
Optimization,
Optimal control,
Polyak-Lojasiewicz inequality,
Random search method,
Reinforcement learning,
Sample complexity,
Second-order moments,
Transient growth.

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms.
IEEE Trans. Automat. Control,
2024.
Note: DOI: 10.1109/TAC.2024.3453656.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Fundamental limitations,
Heavy-ball method,
Nesterov's accelerated method,
Non-normal dynamics,
Noise amplification,
Second-order moments.

W. Wu,
J. Chen,
M. R. Jovanovic,
and T. T. Georgiou.
Tannenbaum's gain-margin optimization meets Polyak's heavy-ball algorithm.
IEEE Trans. Automat. Control,
2024.
Note: Submitted; also arXiv:2409.19882.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Fundamental limitations,
Heavy-ball method,
Integral quadratic constraints,
Nesterov's accelerated method,
Nevanlinna-Pick interpolation,
Optimization,
Optimal control,
Robust control.

W. Wu,
J. Chen,
M. R. Jovanovic,
and T. T. Georgiou.
Frequency-domain synthesis of implicit algorithms.
In Proceedings of the 2025 American Control Conference,
Denver, CO,
2025.
Note: Submitted.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Heavy-ball method,
Integral quadratic constraints,
Nesterov's accelerated method,
Optimization,
Optimal control.

S. Samuelson and M. R. Jovanovic.
Tradeoffs between convergence speed and noise amplification in first-order optimization: the role of averaging.
In Proceedings of the 2024 American Control Conference,
Toronto, Canada,
pages 650-655,
2024.
Keyword(s): Accelerated first-order algorithms,
Averaging,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient flow dynamics,
Noise amplification,
Non-normal dynamics,
Two-step momentum algorithm.

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Noise amplification of momentum-based optimization algorithms.
In Proceedings of the 2023 American Control Conference,
San Diego, CA,
pages 849-854,
2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Heavy-ball method,
Nesterov's accelerated method,
Noise amplification,
Non-normal dynamics,
Two-step momentum algorithm.

I. K. Ozaslan and M. R. Jovanovic.
Tight lower bounds on the convergence rate of primal-dual dynamics for equality constrained convex problems.
In Proceedings of the 62nd IEEE Conference on Decision and Control,
Singapore,
pages 7312-7317,
2023.
Keyword(s): Gradient flow dynamics,
Exponential stability,
Integral quadratic constraints,
Primal-dual gradient flow dynamics,
Primal-dual methods.

S. Samuelson,
H. Mohammadi,
and M. R. Jovanovic.
Performance of noisy higher-order accelerated gradient flow dynamics for strongly convex quadratic optimization problems.
In Proceedings of the 2023 American Control Conference,
San Diego, CA,
pages 3839-3844,
2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient flow dynamics,
Noise amplification,
Non-normal dynamics,
Two-step momentum algorithm.

S. Samuelson,
H. Mohammadi,
and M. R. Jovanovic.
Performance of noisy three-step accelerated first-order optimization algorithms for strongly convex quadratic problems.
In Proceedings of the 62nd IEEE Conference on Decision and Control,
Singapore,
pages 1300-1305,
2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient flow dynamics,
Noise amplification,
Non-normal dynamics,
Three-step momentum algorithm.

S. Hassan-Moghaddam and M. R. Jovanovic.
On the exponential convergence rate of proximal gradient flow algorithms.
In Proceedings of the 57th IEEE Conference on Decision and Control,
Miami, FL,
pages 4246-4251,
2018.
Note: (Invited paper).
Keyword(s): Control for optimization,
Distributed optimization,
Forward-backward envelope,
Exponential convergence,
Global exponential stability,
Gradient flow dynamics,
Large-scale systems,
Nonsmooth optimization,
Primal-dual method,
Proximal algorithms,
Proximal augmented Lagrangian.
Disclaimer:
This material is presented to ensure timely dissemination of
scholarly and technical work. Copyright and all rights therein
are retained by authors or by other copyright holders.
All persons copying this information are expected to adhere to
the terms and constraints invoked by each author's copyright.
In most cases, these works may not be reposted
without the explicit permission of the copyright holder.
Last modified: Sat Oct 5 22:00:41 2024
Author: mihailo.
This document was translated from BibTeX by bibtex2html.