CS675: Convex and Combinatorial Optimization (Fall 2023)
Basic information
- Lecture time: Monday and Wednesday 4:00pm - 5:50pm
- Lecture place: VPD LL101
- Instructor: Shaddin Dughmi
- Email: shaddin@usc.edu
- Office: SAL 234
- Office Hours: Mondays and Wednesdays 2:00pm-3:00pm
- TA: Neel Patel
- Email: neelbpat@usc.edu
- Office Hours Time: Tuesday and Thursday 10:00am - 11:00am
- Office Hours Location: SAL 246
- Course Homepage: viterbi-web.usc.edu/~shaddin/cs675fa23
Announcements
- Nov 14: Homework 9 is out! It will be due Wednesday Nov 22.
- Nov 6: Homework 8 is out! It will be due Monday, Nov 13.
- Oct 23: Homework 7 is out! It will be due Wednesday, Nov 1.
- Oct 12: Project instructions are up. Proposals due October 30th, final report due December 11th.
- Oct 12: Homework 6 is out! It will be due Friday, October 20.
- Sept 27: Homework 5 is out! It will be due Monday, October 9.
- Sept 19: Homework 4 is out! It will be due next Wednesday.
- Sept 11: Homework 3 is out! It will be due in 1 week.
- Sept 11: Homework solutions can be found here throughout the class.
- Aug 30: Homework 2 is out! It will be due in 1 week.
- Aug 29: We have a new room! See above.
- Aug 27: We have a mailing list for the class. If you officially registered before today, you should already be on it. If not, then you will have to email the TA to join the list.
- Aug 23: Homework 1 is out! It will be due in 1 week.
- Aug 23: Course website is up! The topics, schedule, and slides below were initialized from the last iteration of the class. They will change throughout the semester.
Schedule by Week
- Weeks 1-2: Introduction to Optimization. Linear Programming and Duality.
- Week 3: Convex Sets, Convex Functions
- Slides: Convex Sets, Convex Functions.
- Reading: BV Chapters 2, 3. Even though we didn't cover parts of Chapter 3 in class (e.g. quasiconvexity and log-concavity), please do read the entire chapter.
- Week 4: Geometric Duality, Convex Optimization Problems.
- Week 5: Duality of Convex Optimization Problems. KKT conditions.
- Weeks 6-7: Combinatorial problems as linear and convex programs
- Weeks 8-9: Algorithms: Simplex method, ellipsoid method and its consequences, interior point methods.
- Slides: Simplex, Ellipsoid, Consequences of Ellipsoid.
- Reading: Korte Vygen Chapter 3 (Simplex Algorithm) and Chapter 4 (Ellipsoid Algorithm).
- Additional Reading: Luenberger and Ye Chapter 3, and Vince Conitzer's lecture notes, for an algebraic treatment of the simplex method.
- Highly recommended: Ryan O'Donnell's lecture notes on the Ellipsoid method: here and here.
- Highly recommended: For equivalence of separation and optimization, I recommend the impeccably written breakthrough paper by Grotschel, Lovasz and Schrijver. You may have to use the USC library proxy to get access, or find it elsewhere online.
- Highly recommended: For a more thorough treatment of all topics related to the Ellipsoid method and its consequences, check out the book Geometric Algorithms and Combinatorial Optimization by Grotschel, Lovasz, and Schrijver. Available electronically for free through USC libraries.
- Highly recommended: For a more thorough treatment of algorithms for convex optimization and linear programming, including interior point methods, check out the book Algorithms for Convex Optimization by Nisheeth Vishnoi. The book is available online through USC libraries.
- Highly Recommended: Anupam Gupta's notes (Chapters 16-19) on gradient descent and interior point methods.
- Week 10: Introduction to Matroid theory. Optimization over matroids and matroid intersections.
- Weeks 11-12: Submodular functions and optimization.
- Slides
- Lecture notes by Jan Vondrak (lectures 16-19)
- A survey article by your instructor on continuous extensions of submodular functions and their algorithmic applications.
- The original paper by Laszlo Lovasz on submodular functions and convexity. You may have to use the USC library proxy to get access, or find it elsewhere online.
- The Calinescu et al. paper on maximizing a monotone submodular function subject to a matroid constraint.
- The recent paper by Buchbinder et al. on unconstrained maximization of non-monotone submodular functions.
- Slides from a recent tutorial by Jan Vondrak, with an overview of many of the "state of the art" results.
- For those interested in applications of submodularity to machine learning, see the materials and references on this page maintained by Andreas Krause and Carlos Guestrin.
- Weeks 13-15: TBD
Course Description
Over the past half century or so, computer science and mathematical optimization have witnessed the development and maturity of two different paradigms for algorithm design.
The first approach, most familiar to computer scientists, is combinatorial in nature. The tools of discrete mathematics are used to understand the structure of the problem, and algorithms effectively exploit this structure to search over a large yet finite set of possible solutions. The second approach, standard in much of the operations research and mathematical optimization communities, primarily employs the tools of continuous mathematics, high dimensional geometry, and convex analysis. Problems are posed as a search over a set of points in high-dimensional Euclidean space, which can be performed efficiently when the search space and objective function are "convex."
While many optimization problems are best modeled as either a discrete or a convex optimization problem, researchers have increasingly discovered that many problems are best tackled by a combination of combinatorial and continuous techniques. The ability to seamlessly transition between the two views has become an important skill for every researcher working in algorithm design and analysis. This course intends to instill this skill by presenting a unified treatment of both approaches, focusing on algorithm design tasks that employ techniques from both. The intended audience for this course is PhD students, Masters students, and advanced undergraduates interested in research questions in algorithm design, mathematical optimization, or related disciplines.
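For concreteness, the canonical object of the continuous viewpoint is the convex program in standard form (following the notation of Boyd and Vandenberghe, one of the course texts):

```latex
\begin{aligned}
\text{minimize}   \quad & f_0(x) \\
\text{subject to} \quad & f_i(x) \le 0, \quad i = 1, \dots, m, \\
                        & a_j^\top x = b_j, \quad j = 1, \dots, p,
\end{aligned}
```

where $f_0, \dots, f_m : \mathbb{R}^n \to \mathbb{R}$ are convex functions and the equality constraints are affine. Convexity of both the feasible region and the objective is precisely what makes efficient search over the high-dimensional space possible.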
Prerequisites
- Prerequisite Courses: CSCI 570 or CSCI 670 or permission of instructor.
- Recommended Preparation: Mathematical maturity and a solid grounding in linear algebra.
Requirements and Grading
Homework assignments will count for 75% of the grade. There will be 6-8 assignments, which will be proof-based and are intended to be challenging. Collaboration and discussion among students are allowed, even encouraged, though students must write up their solutions independently.
The remaining 25% of the grade will be allocated to a final project. Students will have to choose a related research topic, read several papers in that area, and write a survey of the area.
Late Homework Policy: Students will be allowed 6 late days for homework, to be used in integer amounts and distributed as the student sees fit. No additional late days are allowed.
References
We will refer to two main texts: Convex Optimization by Boyd and Vandenberghe, available free online, and Combinatorial Optimization, Fifth edition, by Korte and Vygen, available online through USC libraries. Additional references include Combinatorial Optimization by Schrijver; Linear and Nonlinear Programming, Fourth edition, by Luenberger and Ye, available online through USC libraries; as well as research papers and lecture notes from related courses elsewhere, which will be linked on the course website.