Course Description
This course offers a comprehensive exploration of modern optimization theory and algorithms, with applications in machine learning and operations research. It begins with an examination of fundamental concepts and problem properties in optimization, such as convexity, duality, smoothness, and subdifferentials. Students will then learn to design first-order optimization algorithms tailored to various problem characteristics, with a strong focus on how these structures impact convergence analysis.
The course also covers lower complexity bounds for different function classes, giving students a critical understanding of the computational limits inherent in optimization algorithms. In the final part of the course, students will apply these advanced optimization techniques to real-world challenges in machine learning, data science, and operations research, with particular attention to topics such as optimal transport and distributionally robust optimization.
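As a flavor of the first-order methods studied in the course, here is a minimal, illustrative gradient descent sketch (not part of the official course material; the objective and step size are chosen for demonstration only):

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize the smooth convex function f(x) = 0.5 * ||x - b||^2,
# whose gradient is x - b; the unique minimizer is x* = b.
b = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: x - b, x0=[0.0, 0.0], step=0.5, iters=50)
print(x_star)  # converges to b = [1, -2]
```

For an L-smooth convex objective, a constant step size of 1/L (here L = 1) guarantees convergence; the course's convergence analyses make such rates precise for broader function classes.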
General Information
Please see [this information sheet] for syllabus and grading details.
Template
Please use the provided template to scribe the lecture notes.
Lecture Notes
- Week 1 (Sep 3rd, 2024) — [Lecture 1: Overview], [Lecture 2: Elements of Convex Analysis I]
- Week 2 (Sep 10th, 2024) — [Lecture 3: Elements of Convex Analysis II], [Lecture 4: Duality Theory]
- Week 3 (Sep 17th, 2024) — [Lecture 5: Duality Theory II and Conjugate Functions]
- Week 4 (Sep 24th, 2024) — [Lecture 6: Optimality Conditions (Connection between Duality Theory and Subdifferential Calculus)]
- Week 5 (Oct 1st, 2024) — [Lecture 7: (Sub)Gradient Descent under Different Conditions]
- Week 6 (Oct 8th, 2024) — [Lecture 8: Subgradient Methods under Weak Convexity]
- Week 7 (Oct 15th, 2024) — [Lecture 9: Unified Convergence Analysis Framework for PGD]
- Week 8 (Oct 22nd, 2024) — No class!
- Week 9 (Oct 29th, 2024)