CS675: Convex and Combinatorial Optimization (Spring 2022)
Basic information
- Lecture time: Monday + Wednesday 2:00pm - 3:50pm
- Lecture place: Zoom for the first couple of weeks (see blackboard for the link), then VHE 210
- Instructor: Shaddin Dughmi
- Email: shaddin@usc.edu
- Office: SAL 234
- Office Hours: Mondays and Wednesdays 4pm-5pm
- TA: Curtis Bechtel
- Email: bechtel@usc.edu
- Office Hours Time: Tuesdays and Fridays 10am-11am
- Office Hours Location: SAL 246
- Course Homepage: viterbi-web.usc.edu/~shaddin/cs675sp22
Announcements
- April 18: Homework 5 is up. It will be due in 2 weeks, on Monday May 2nd.
- April 1: Homework 4 is up. It will be due in 2 weeks, on Friday April 15th.
- March 7: Homework 3 is up. It will be due in 3 weeks, on Monday March 28th.
- March 7: Project instructions are up. Proposal is due Monday March 21st, and final report is due Monday May 9th.
- Feb 13: Homework 2 is out, and will be due on Feb 28.
- Jan 26: Homework 1 is out, and will be due on Feb 9.
- Jan 10: Mailing list has been created, and I added everyone who is already registered. If you were added then you should have received a welcome email. If you need to be added manually, please email Curtis.
- Jan 9: Course website is up! See you all (virtually) tomorrow! We will be remote for the first couple of weeks (see blackboard for link). Afterwards we will hopefully move to KDC 241. I initialized the website using the schedule, slides, and documents from the last iteration, but I reserve the right to update and change things arbitrarily as we go along without warning.
Schedule by Week
- Weeks 1-2: Introduction to Optimization. Linear Programming and Duality.
- Week 3: Convex Sets, Convex Functions
- Slides: Convex Sets, Convex Functions.
- Reading: BV Chapters 2, 3. We did not cover parts of Chapter 3 in class (e.g. quasiconvexity and log-concavity), but please do read the entire chapter.
- Week 4: Geometric Duality, Convex Optimization Problems.
- Week 5: Duality of Convex Optimization Problems. KKT conditions.
- Weeks 6-7: Combinatorial problems as linear and convex programs
- Weeks 8-9: Algorithms: Simplex method, ellipsoid method and its consequences.
- Slides: Simplex, Ellipsoid, Consequences of Ellipsoid.
- Reading: Korte Vygen Chapter 3 (Simplex Algorithm) and Chapter 4 (Ellipsoid Algorithm).
- Additional Reading: Luenberger and Ye Chapter 3 and Vince Conitzer's lecture notes for an algebraic treatment of the simplex method.
- Highly recommended: Ryan O'Donnell's lecture notes on the Ellipsoid method: here and here.
- Highly recommended: For the equivalence of separation and optimization, I recommend the impeccably written breakthrough paper by Grotschel, Lovasz and Schrijver. You may have to use the USC library proxy to get access, or find it elsewhere online.
- Highly recommended: For a more thorough treatment of all topics related to the Ellipsoid method and its consequences, check out the book Geometric Algorithms and Combinatorial Optimization by Grotschel, Lovasz, and Schrijver. Available electronically for free through USC libraries.
- Highly recommended: For a more thorough treatment of algorithms for convex optimization and linear programming, including interior point methods, check out the book Algorithms for Convex Optimization by Nisheeth Vishnoi. The book is available online through USC libraries.
- Week 10: Introduction to Matroid theory. Optimization over matroids and matroid intersections.
- Weeks 11-12: More Matroid theory: Intersection and Union theorems.
- Weeks 12-14: Submodular functions and optimization.
- Slides
- Lecture notes by Jan Vondrak (lectures 16-19)
- A survey article by your instructor on continuous extensions of submodular functions and their algorithmic applications.
- The original paper by Laszlo Lovasz on submodular functions and convexity. You may have to use the USC library proxy to get access, or find it elsewhere online.
- The Calinescu et al paper on maximizing a monotone submodular function subject to a matroid constraint.
- The recent paper by Buchbinder et al on unconstrained maximization of non-monotone submodular functions.
- Slides from a recent tutorial by Jan Vondrak, with an overview of many of the "state of the art" results.
- For those interested in applications of submodularity to machine learning, see the materials and references on this page maintained by Andreas Krause and Carlos Guestrin.
- Week 15: Proper scoring rules and crowdsourcing information.
- Reading: See readings under lectures 9-11 here
Course Description
Over the past half century or so, computer science and mathematical optimization have witnessed the development and maturity of two different paradigms for algorithm design.
The first approach, most familiar to computer scientists, is combinatorial in nature. The tools of discrete mathematics are used to understand the structure of the problem, and algorithms effectively exploit this structure to search over a large yet finite set of possible solutions. The second approach, standard in much of the operations research and mathematical optimization communities, primarily employs the tools of continuous mathematics, high dimensional geometry, and convex analysis. Problems are posed as a search over a set of points in high-dimensional Euclidean space, which can be performed efficiently when the search space and objective function are "convex."
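As a quick refresher on what "convex" means here, the standard definitions (as in BV Chapters 2-4; this is background material, not course-specific notation) are:

```latex
% A set C \subseteq \mathbb{R}^n is convex if it contains every segment between its points:
\forall x, y \in C,\ \forall \theta \in [0,1]:\quad \theta x + (1-\theta)y \in C.

% A function f : \mathbb{R}^n \to \mathbb{R} is convex if its graph lies below every chord:
f(\theta x + (1-\theta)y) \;\le\; \theta f(x) + (1-\theta) f(y).

% A generic convex optimization problem, in the standard form of BV Chapter 4:
\begin{aligned}
\text{minimize}\quad   & f_0(x) \\
\text{subject to}\quad & f_i(x) \le 0, \quad i = 1, \dots, m, \\
                       & a_j^\top x = b_j, \quad j = 1, \dots, p,
\end{aligned}
% where f_0, \dots, f_m are convex and the equality constraints are affine.
```

Linear programming is the special case where every $f_i$ is affine, which is one reason the course treats LP duality before general convex duality.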
While many optimization problems are best modeled as either a discrete or a convex optimization problem, researchers have increasingly discovered that many are best tackled by a combination of combinatorial and continuous techniques. The ability to transition seamlessly between the two views has become an important skill for every researcher working in algorithm design and analysis. This course aims to instill that skill by presenting a unified treatment of both approaches, focusing on algorithm design tasks that employ techniques from both. The intended audience is PhD students, Masters students, and advanced undergraduates interested in research questions in algorithm design, mathematical optimization, or related disciplines.
Prerequisites
- Prerequisite Courses: CSCI 570 or CSCI 670 or permission of instructor.
- Recommended Preparation: Mathematical maturity and a solid grounding in linear algebra.
Requirements and Grading
Homework assignments will count for 75% of the grade. There will be 4-6 proof-based assignments, intended to be very challenging. Collaboration and discussion among students are allowed, even encouraged, though students must write up their solutions independently.
The remaining 25% of the grade will be allocated to a final project. Students will have to choose a related research topic, read several papers in that area, and write a survey of the area.
Late Homework Policy: Students will be allowed 6 late days for homework, to be used in integer amounts and distributed as the student sees fit. No additional late days are allowed.
References
We will refer to two main texts: Convex Optimization by Boyd and Vandenberghe, available free online, and Combinatorial Optimization (Fifth Edition) by Korte and Vygen, available online through USC libraries. Additional references include Combinatorial Optimization by Schrijver; Linear and Nonlinear Programming (Fourth Edition) by Luenberger and Ye, available online through USC libraries; and research papers and lecture notes from related courses elsewhere, which will be linked on the course website.