Publications about 'Control for optimization'


S. Samuelson.
Performance tradeoffs of accelerated first-order optimization algorithms.
PhD thesis,
University of Southern California,
2024.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Gradient-flow dynamics,
Heavy-ball method,
Nesterov's accelerated method,
Non-normal dynamics,
Noise amplification,
Optimization,
Transient growth.
[bibtex-entry]

H. Mohammadi.
Robustness of gradient methods for data-driven decision making.
PhD thesis,
University of Southern California,
2022.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Data-driven control,
Gradient descent,
Gradient-flow dynamics,
Heavy-ball method,
Integral quadratic constraints,
Linear quadratic regulator,
Model-free control,
Nesterov's accelerated method,
Nonconvex optimization,
Non-normal dynamics,
Noise amplification,
Optimization,
Optimal control,
Polyak-Lojasiewicz inequality,
Random search method,
Reinforcement learning,
Sample complexity,
Second-order moments,
Transient growth.
[bibtex-entry]

S. Hassan-Moghaddam.
Analysis, design, and optimization of large-scale networks of dynamical systems.
PhD thesis,
University of Southern California,
2019.
Keyword(s): Consensus,
Control for optimization,
Convex optimization,
Distributed control,
Forward-backward envelope,
Douglas-Rachford splitting,
Global exponential stability,
Integral quadratic constraints,
Networks of dynamical systems,
Nonsmooth optimization,
Polyak-Lojasiewicz inequality,
Proximal algorithms,
Primal-dual methods,
Proximal augmented Lagrangian,
Regularization for design,
Sparse graphs,
Sparsity-promoting optimal control,
Structured optimal control,
Structure identification,
Topology design.
[bibtex-entry]

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms.
IEEE Trans. Automat. Control,
2024.
Note: DOI: 10.1109/TAC.2024.3453656.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Fundamental limitations,
Heavy-ball method,
Nesterov's accelerated method,
Non-normal dynamics,
Noise amplification,
Second-order moments.
[bibtex-entry]

W. Wu,
J. Chen,
M. R. Jovanovic,
and T. T. Georgiou.
Tannenbaum's gain-margin optimization meets Polyak's heavy-ball algorithm.
IEEE Trans. Automat. Control,
2024.
Note: Submitted; also arXiv:2409.19882.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Fundamental limitations,
Heavy-ball method,
Integral quadratic constraints,
Nesterov's accelerated method,
Nevanlinna-Pick interpolation,
Optimization,
Optimal control,
Robust control.
[bibtex-entry]

H. Mohammadi,
S. Samuelson,
and M. R. Jovanovic.
Transient growth of accelerated optimization algorithms.
IEEE Trans. Automat. Control,
68(3):1823-1830,
March 2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convex optimization,
Gradient descent,
Heavy-ball method,
Integral quadratic constraints,
Nesterov's accelerated method,
Non-normal dynamics,
Transient growth.
[bibtex-entry]

I. K. Ozaslan and M. R. Jovanovic.
Accelerated forward-backward and Douglas-Rachford splitting dynamics.
Automatica,
2023.
Note: Submitted; also arXiv:2407.20620.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convex optimization,
Forward-backward envelope,
Douglas-Rachford splitting,
Global exponential stability,
Integral quadratic constraints,
Nesterov's accelerated method,
Nonsmooth optimization,
Proximal algorithms.
[bibtex-entry]

S. Hassan-Moghaddam and M. R. Jovanovic.
Proximal gradient flow and Douglas-Rachford splitting dynamics: global exponential stability via integral quadratic constraints.
Automatica,
123:109311,
January 2021.
Keyword(s): Control for optimization,
Convex optimization,
Forward-backward envelope,
Douglas-Rachford splitting,
Global exponential stability,
Integral quadratic constraints,
Nonsmooth optimization,
Polyak-Lojasiewicz inequality,
Proximal algorithms,
Primal-dual methods,
Proximal augmented Lagrangian.
[bibtex-entry]

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Robustness of accelerated first-order algorithms for strongly convex optimization problems.
IEEE Trans. Automat. Control,
66(6):2480-2495,
June 2021.
Keyword(s): Accelerated first-order algorithms,
Consensus networks,
Control for optimization,
Convex optimization,
Integral quadratic constraints,
Linear matrix inequalities,
Noise amplification,
Second-order moments,
Semidefinite programming.
[bibtex-entry]

M. Chertkov,
M. R. Jovanovic,
B. Lesieutre,
S. Low,
P. van Hentenryck,
and L. Wehenkel.
Guest Editorial Special Issue on Analysis, Control, and Optimization of Energy Networks.
IEEE Trans. Control Netw. Syst.,
6(3):922-924,
September 2019.
Keyword(s): Optimization,
Control,
Energy networks,
Power networks.
[bibtex-entry]

N. K. Dhingra,
S. Z. Khong,
and M. R. Jovanovic.
The proximal augmented Lagrangian method for nonsmooth composite optimization.
IEEE Trans. Automat. Control,
64(7):2861-2868,
July 2019.
Keyword(s): Augmented Lagrangian,
Control for optimization,
Exponential convergence,
Global exponential stability,
Method of multipliers,
Nonsmooth optimization,
Primal-dual gradient flow dynamics,
Proximal algorithms,
Proximal augmented Lagrangian,
Regularization for design,
Sparsity-promoting optimal control,
Structured optimal control,
Structure identification.
[bibtex-entry]

K. Sawant,
P. Seiler,
M. R. Jovanovic,
J. Poon,
and S. Dhople.
Real-time solution strategy for linearly constrained quadratic programs with proportional-integral control and variants.
In Proceedings of the 2025 American Control Conference,
Denver, CO,
2025.
Note: Submitted.
Keyword(s): Control for optimization,
Convex optimization,
Exponential convergence,
Global exponential stability,
Optimization.
[bibtex-entry]

W. Wu,
J. Chen,
M. R. Jovanovic,
and T. T. Georgiou.
Frequency-domain synthesis of implicit algorithms.
In Proceedings of the 2025 American Control Conference,
Denver, CO,
2025.
Note: Submitted.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Heavy-ball method,
Integral quadratic constraints,
Nesterov's accelerated method,
Optimization,
Optimal control.
[bibtex-entry]

S. Samuelson and M. R. Jovanovic.
Tradeoffs between convergence speed and noise amplification in first-order optimization: the role of averaging.
In Proceedings of the 2024 American Control Conference,
Toronto, Canada,
pages 650-655,
2024.
Keyword(s): Accelerated first-order algorithms,
Averaging,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient flow dynamics,
Noise amplification,
Non-normal dynamics,
Two-step momentum algorithm.
[bibtex-entry]

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Noise amplification of momentum-based optimization algorithms.
In Proceedings of the 2023 American Control Conference,
San Diego, CA,
pages 849-854,
2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient descent,
Heavy-ball method,
Nesterov's accelerated method,
Noise amplification,
Non-normal dynamics,
Two-step momentum algorithm.
[bibtex-entry]

S. Samuelson,
H. Mohammadi,
and M. R. Jovanovic.
Performance of noisy higher-order accelerated gradient flow dynamics for strongly convex quadratic optimization problems.
In Proceedings of the 2023 American Control Conference,
San Diego, CA,
pages 3839-3844,
2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient flow dynamics,
Noise amplification,
Non-normal dynamics,
Two-step momentum algorithm.
[bibtex-entry]

S. Samuelson,
H. Mohammadi,
and M. R. Jovanovic.
Performance of noisy three-step accelerated first-order optimization algorithms for strongly convex quadratic problems.
In Proceedings of the 62nd IEEE Conference on Decision and Control,
Singapore,
pages 1300-1305,
2023.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convergence rate,
Convex optimization,
Gradient flow dynamics,
Noise amplification,
Non-normal dynamics,
Three-step momentum algorithm.
[bibtex-entry]

H. Mohammadi and M. R. Jovanovic.
On the noise amplification of primal-dual gradient flow dynamics based on proximal augmented Lagrangian.
In Proceedings of the 2022 American Control Conference,
Atlanta, GA,
pages 926-931,
2022.
Keyword(s): Control for optimization,
Convex optimization,
Integral quadratic constraints,
Linear matrix inequalities,
Noise amplification,
Nonsmooth optimization,
Proximal algorithms,
Primal-dual gradient flow dynamics,
Primal-dual methods,
Proximal augmented Lagrangian,
Second-order moments,
Semidefinite programming.
[bibtex-entry]

I. K. Ozaslan,
S. Hassan-Moghaddam,
and M. R. Jovanovic.
On the asymptotic stability of proximal algorithms for convex optimization problems with multiple nonsmooth regularizers.
In Proceedings of the 2022 American Control Conference,
Atlanta, GA,
pages 132-137,
2022.
Keyword(s): Control for optimization,
Convex optimization,
Douglas-Rachford splitting,
Global asymptotic stability,
Lyapunov-based analysis,
Nonsmooth optimization,
Proximal algorithms,
Primal-dual gradient flow dynamics,
Primal-dual methods,
Proximal augmented Lagrangian.
[bibtex-entry]

D. Ding and M. R. Jovanovic.
Global exponential stability of primal-dual gradient flow dynamics based on the proximal augmented Lagrangian: A Lyapunov-based approach.
In Proceedings of the 59th IEEE Conference on Decision and Control,
Jeju Island, Republic of Korea,
pages 4836-4841,
2020.
Keyword(s): Augmented Lagrangian,
Control for optimization,
Convex optimization,
Global exponential stability,
Lyapunov-based approach,
Nonsmooth optimization,
Primal-dual gradient flow dynamics,
Primal-dual methods,
Proximal augmented Lagrangian.
[bibtex-entry]

S. Hassan-Moghaddam and M. R. Jovanovic.
Global exponential stability of the Douglas-Rachford splitting dynamics.
In Preprints of the 21st IFAC World Congress,
Berlin, Germany,
pages 7350-7354,
2020.
Keyword(s): Control for optimization,
Convex optimization,
Forward-backward envelope,
Douglas-Rachford splitting,
Global exponential stability,
Integral quadratic constraints,
Nonsmooth optimization,
Polyak-Lojasiewicz inequality,
Proximal algorithms,
Primal-dual methods,
Proximal augmented Lagrangian.
[bibtex-entry]

S. Samuelson,
H. Mohammadi,
and M. R. Jovanovic.
On the transient growth of Nesterov's accelerated method for strongly convex optimization problems.
In Proceedings of the 59th IEEE Conference on Decision and Control,
Jeju Island, Republic of Korea,
pages 5911-5916,
2020.
Note: (Invited paper).
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convex optimization,
Gradient descent,
Integral quadratic constraints,
Nesterov's accelerated method,
Non-normal dynamics,
Transient growth.
[bibtex-entry]

S. Samuelson,
H. Mohammadi,
and M. R. Jovanovic.
Transient growth of accelerated first-order methods.
In Proceedings of the 2020 American Control Conference,
Denver, CO,
pages 2858-2863,
2020.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convex optimization,
Gradient descent,
Transient growth.
[bibtex-entry]

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Performance of noisy Nesterov's accelerated method for strongly convex optimization problems.
In Proceedings of the 2019 American Control Conference,
Philadelphia, PA,
pages 3426-3431,
2019.
Keyword(s): Accelerated first-order algorithms,
Control for optimization,
Convex optimization,
Integral quadratic constraints,
Linear matrix inequalities,
Noise amplification,
Second-order moments,
Semidefinite programming.
[bibtex-entry]

D. Ding,
B. Hu,
N. K. Dhingra,
and M. R. Jovanovic.
An exponentially convergent primal-dual algorithm for nonsmooth composite minimization.
In Proceedings of the 57th IEEE Conference on Decision and Control,
Miami, FL,
pages 4927-4932,
2018.
Keyword(s): Control for optimization,
Convex optimization,
Euler discretization,
Exponential convergence,
Global exponential stability,
Integral quadratic constraints,
Proximal augmented Lagrangian,
Nonsmooth optimization,
Primal-dual gradient flow dynamics,
Proximal algorithms,
Regularization.
[bibtex-entry]

S. Hassan-Moghaddam and M. R. Jovanovic.
On the exponential convergence rate of proximal gradient flow algorithms.
In Proceedings of the 57th IEEE Conference on Decision and Control,
Miami, FL,
pages 4246-4251,
2018.
Note: (Invited paper).
Keyword(s): Control for optimization,
Distributed optimization,
Forward-backward envelope,
Exponential convergence,
Global exponential stability,
Gradient flow dynamics,
Large-scale systems,
Nonsmooth optimization,
Primal-dual method,
Proximal algorithms,
Proximal augmented Lagrangian.
[bibtex-entry]

H. Mohammadi,
M. Razaviyayn,
and M. R. Jovanovic.
Variance amplification of accelerated first-order algorithms for strongly convex quadratic optimization problems.
In Proceedings of the 57th IEEE Conference on Decision and Control,
Miami, FL,
pages 5753-5758,
2018.
Keyword(s): Accelerated optimization algorithms,
Control for optimization,
Input-output analysis,
Large-scale networks,
Fundamental limitations,
Robustness,
Variance amplification.
[bibtex-entry]
Disclaimer:
This material is presented to ensure timely dissemination of
scholarly and technical work. Copyright and all rights therein
are retained by authors or by other copyright holders.
All persons copying this information are expected to adhere to
the terms and constraints invoked by each author's copyright.
In most cases, these works may not be reposted
without the explicit permission of the copyright holder.
Last modified: Sat Oct 5 22:00:41 2024
Author: mihailo.
This document was translated from BibTeX by
bibtex2html