CS675: Convex and Combinatorial Optimization (Fall 2019)
Basic information
- Lecture time: Monday + Wednesday 4:00pm - 5:50pm
- Lecture place: SGM 226
- Instructor: Shaddin Dughmi
- Email: shaddin@usc.edu
- Office: SAL 234
- Office Hours: Thursday 10:30am - noon.
- TA: Yury Zemlyanskiy
- Email: yury.zemlyanskiy@usc.edu
- Office Hours Time: Monday 2-3pm and Friday 3-4pm
- Office Hours Location: Michelson Hall, first floor (tables at the KAP-facing entrance)
- Course Homepage: viterbi-web.usc.edu/~shaddin/cs675fa19
Announcements
- Dec 12: Homework 4 solutions are out.
- Nov 22: Homework 3 solutions are out.
- Nov 17: Homework 4 is out. It will be due at the beginning of class in 2 weeks, on Monday Dec 2.
- Oct 25: Homework 3 is out. It will be due at the beginning of class in a little over 2 weeks, on Monday Nov 11.
- Oct 24: Homework 2 solutions are out.
- Oct 7: You can find a couple of examples of successful projects from years past here
- Oct 7: Homework 1 solutions are out.
- Sep 29: Project instructions have been posted. The proposal will be due on Oct 28, and the final report will be due on Dec 16.
- Sep 29: Homework 2 is out. It will be due at the beginning of class in 2.5 weeks, on Wed Oct 16.
- Sep 11: Yury's office hours are now posted.
- Sep 11: Homework 1 is out. It will be due at the beginning of class in two weeks, on Wed Sep 25. Though this is not the most difficult homework assignment of the course, it is the longest, so please start immediately!
- Sep 7: Shaddin's office hours will be on Thursdays 10:30am - noon.
- Sep 5: We have a course mailing list: csci675-usc-fa19@googlegroups.com. I added everyone who is registered as of today. If you aren't on the list, please search for the list on googlegroups and request joining. There was an issue with group visibility which I just fixed, so if you tried already and couldn't find the list, try again.
- Sep 4: We have a new room! Starting today, we will meet in SGM 226.
- Aug 30: The class is currently full. It remains to be seen whether we can increase the number of students who can officially register for the class. It also remains to be seen whether we will get a bigger room. However, everyone is welcome to show up to the lectures, and we will do our best to fit everyone in the room.
- Aug 30: Course website is up!
Schedule by Week
- Weeks 1-2: Introduction to Optimization. Linear Programming and Duality.
- Week 3: Convex Sets, Convex Functions
- Slides: Convex Sets, Convex Functions.
- Reading: BV Chapters 2, 3. Even though we didn't cover parts of chapter 3 in class (e.g. quasiconvexity and log-concavity), please do read the entire chapter.
- Week 4: Geometric Duality, Convex Optimization Problems.
- Week 5: Duality of Convex Optimization Problems. KKT conditions.
- Weeks 6-7: Combinatorial problems as linear and convex programs
- Weeks 8-9: Algorithms: Simplex method, ellipsoid method and its consequences.
- Slides: Simplex, Ellipsoid, Consequences of Ellipsoid.
- Reading: Korte and Vygen, Chapter 3 (Simplex Algorithm) and Chapter 4 (Ellipsoid Algorithm).
- Additional Reading: Luenberger and Ye, Chapter 3, and Vince Conitzer's lecture notes, for an algebraic treatment of the simplex method.
- Highly recommended: Ryan O'Donnell's lecture notes on the Ellipsoid method: here and here.
- Highly recommended: For the equivalence of separation and optimization, I recommend the impeccably written breakthrough paper by Grotschel, Lovasz, and Schrijver. You may have to use the USC library proxy to get access, or find it elsewhere online.
- Highly recommended: For a more thorough treatment of all topics related to the Ellipsoid method and its consequences, check out the book Geometric Algorithms and Combinatorial Optimization by Grotschel, Lovasz, and Schrijver. Available electronically for free through USC libraries.
- Week 10: Introduction to Matroid theory. Optimization over matroids and matroid intersections.
- Weeks 11-12: More Matroid theory: Intersection and Union theorems.
- Weeks 12-14: Submodular functions and optimization.
- Slides
- Lecture notes by Jan Vondrak (lectures 16-19)
- A survey article by your instructor on continuous extensions of submodular functions and their algorithmic applications.
- The original paper by Laszlo Lovasz on submodular functions and convexity. You may have to use the USC library proxy to get access, or find it elsewhere online.
- The Calinescu et al. paper on maximizing a monotone submodular function subject to a matroid constraint.
- The recent paper by Buchbinder et al. on unconstrained maximization of non-monotone submodular functions.
- Slides from a recent tutorial by Jan Vondrak, with an overview of many of the "state of the art" results.
- For those interested in applications of submodularity to machine learning, see the materials and references on this page maintained by Andreas Krause and Carlos Guestrin.
- Week 15: Proper scoring rules and crowdsourcing information.
- Reading: See readings under lectures 9-11 here
Course Description
Over the past half century or so, computer science and mathematical optimization have witnessed the development and maturity of two different paradigms for algorithm design.
The first approach, most familiar to computer scientists, is combinatorial in nature. The tools of discrete mathematics are used to understand the structure of the problem, and algorithms effectively exploit this structure to search over a large yet finite set of possible solutions. The second approach, standard in much of the operations research and mathematical optimization communities, primarily employs the tools of continuous mathematics, high dimensional geometry, and convex analysis. Problems are posed as a search over a set of points in high-dimensional Euclidean space, which can be performed efficiently when the search space and objective function are "convex."
While many optimization problems are best modeled as either a discrete or a convex optimization problem, researchers have increasingly discovered that many problems are best tackled by a combination of combinatorial and continuous techniques. The ability to seamlessly transition between the two views has become an important skill for every researcher working in algorithm design and analysis. This course intends to instill this skill by presenting a unified treatment of both approaches, focusing on algorithm design tasks that employ techniques from both. The intended audience for this course is PhD students, Master's students, and advanced undergraduates interested in research questions in algorithm design, mathematical optimization, or related disciplines.
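The interplay between the two paradigms can be seen in a small example (an illustrative sketch using SciPy, not part of the course materials): maximum-weight bipartite matching, a combinatorial problem, solved by a continuous method via its linear programming relaxation. Because the constraint matrix of bipartite matching is totally unimodular, the LP relaxation has an integral optimal vertex, so the continuous solver recovers a genuinely combinatorial solution.

```python
# Max-weight bipartite matching as a linear program (illustrative sketch).
# The LP relaxation is integral for bipartite graphs, so a continuous
# solver returns a 0/1 matching.
import numpy as np
from scipy.optimize import linprog

# Edge weights between left vertices {0,1} and right vertices {0,1}.
w = np.array([[3.0, 1.0],
              [2.0, 4.0]])

# Variables x[i,j] in [0,1], flattened row-major.
# Maximize sum w[i,j] * x[i,j]; linprog minimizes, so negate the weights.
c = -w.flatten()

# Degree constraints: each vertex is matched at most once.
A_ub = np.array([
    [1, 1, 0, 0],  # left vertex 0:  x00 + x01 <= 1
    [0, 0, 1, 1],  # left vertex 1:  x10 + x11 <= 1
    [1, 0, 1, 0],  # right vertex 0: x00 + x10 <= 1
    [0, 1, 0, 1],  # right vertex 1: x01 + x11 <= 1
])
b_ub = np.ones(4)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, 1)] * 4)
x = res.x.reshape(2, 2)
print(np.round(x))   # integral optimum: the matching {(0,0), (1,1)}
print(-res.fun)      # total weight 7.0
```

This is exactly the perspective of Weeks 6-7 (combinatorial problems as linear and convex programs): a finite search over matchings is replaced by optimization over a polytope whose vertices are the matchings themselves.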
Prerequisites
- Prerequisite Courses: CSCI 570 or CSCI 670 or permission of instructor.
- Recommended Preparation: Mathematical maturity and a solid grounding in linear algebra.
Requirements and Grading
Homework assignments will count for 75% of the grade. There will be 4-6 assignments, which will be proof-based, and are intended to be very challenging. Collaboration and discussion among students is allowed, even encouraged, though students must write up their solutions independently.
The remaining 25% of the grade will be allocated to a final project. Students will have to choose a related research topic, read several papers in that area, and write a survey of the area.
Late Homework Policy: Students will be allowed 5 late days for homework, to be used in integer amounts and distributed as the student sees fit. No additional late days are allowed.
References
We will refer to two main texts: Convex Optimization by Boyd and Vandenberghe, available free online, and Combinatorial Optimization, Fifth edition, by Korte and Vygen, available online through USC libraries. Additional references include Combinatorial Optimization by Schrijver; Linear and Nonlinear Programming by Luenberger and Ye, Fourth edition, available online through USC libraries; and research papers and lecture notes from related courses elsewhere, which will be linked on the course website.