
Optimization in Computer Science

This is the course blog/website for students enrolled in CS 690OP Optimization in Computer Science in spring 2019 at UMass Amherst.

Your instructor is Arya Mazumdar.

Class Meets: MoWe 9:05AM – 10:20AM Room CS 142

Instructor Office Hour: We 10:30AM – 11:30AM Room CS 332

TA: Liu Yang (lyang@cs.umass.edu)

Office Hour: Every Monday 10:40AM – 11:40AM Room CS 207

Please stay tuned for more updates.


Class Schedule

Jan 23 Lecture 1 Introduction to Optimization, Course Logistics, Example of Optimization: Least squares, Curve fitting, Classification, Perceptron Notes
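
As a warm-up for the least-squares example from Lecture 1, here is a minimal NumPy sketch (not from the course notes; the toy data and noise level are made up) of fitting a line by minimizing ||Ac - y||^2:

    import numpy as np

    # Toy curve-fitting problem: fit y = c0 + c1 * t to noisy points (made-up data).
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 20)
    y = 2.0 + 3.0 * t + 0.1 * rng.standard_normal(20)

    A = np.column_stack([np.ones_like(t), t])  # design matrix
    c, *_ = np.linalg.lstsq(A, y, rcond=None)  # minimizes ||Ac - y||^2
    print(c)  # approximately [2, 3]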

Jan 28 Lecture 2 SVM and optimal separating hyperplane, Basics of linear algebra, calculus, geometry and convexity Notes

Jan 30 Lecture 3 Basics of linear algebra, calculus, geometry and convexity Notes

Feb 4 Lecture 4 More on convexity, epigraph and sublevel sets, convex characterizations Notes Homework 1 Out

Feb 6 Lecture 5 Generic optimization problems, structures of optimal points, global minimum of convex functions Notes

Feb 11 Lecture 6 How to optimize, descent direction, black-box model, first order methods, gradient descent Notes Homework 1 Submission
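
To accompany Lecture 6, here is a minimal sketch of gradient descent on a toy quadratic; the matrix, starting point, and step size below are illustrative choices, not from the lecture:

    import numpy as np

    # Minimize f(x) = 0.5 x^T A x - b^T x, whose gradient is A x - b (made-up A, b).
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])

    x = np.zeros(2)
    eta = 0.1  # fixed step size, small enough for this A
    for _ in range(200):
        grad = A @ x - b
        x = x - eta * grad

    print(x, np.linalg.solve(A, b))  # both approach the minimizer A^{-1} b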

Feb 13 University Closed Due to Snow Storm (No Lecture 7)

Feb 19 Lecture 8 Steepest descent, convergence of gradient descent, strongly convex functions, convergence based on potential functions Notes

Feb 20 In-Class Midterm

Feb 25 Lecture 9 Smooth convex function and faster convergence, condition number, linear-rate convergence Notes

Feb 27 Lecture 10 Constrained convex optimization, projected gradient descent, convergence of PGD Notes
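
Here is a hedged sketch of projected gradient descent from Lecture 10: take a gradient step, then project back onto the feasible set. The objective and box constraint are made-up examples; for a box, the Euclidean projection is just coordinate-wise clipping.

    import numpy as np

    # Minimize f(x) = ||x - c||^2 over the box [0, 1]^2, with c outside the box (made up).
    c = np.array([1.5, -0.5])

    def project_box(x, lo=0.0, hi=1.0):
        # Euclidean projection onto a box is coordinate-wise clipping.
        return np.clip(x, lo, hi)

    x = np.array([0.5, 0.5])
    eta = 0.1
    for _ in range(100):
        grad = 2.0 * (x - c)
        x = project_box(x - eta * grad)

    print(x)  # converges to the projection of c onto the box: [1, 0]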

Mar 4 University Closed Due to Snow Storm (No Lecture 11)

Mar 6 Lecture 12 Conditional gradient descent (Frank-Wolfe), Non-differentiable convex functions, subgradient descent Notes
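
For the subgradient method of Lecture 12, here is a tiny sketch on the non-differentiable f(x) = |x - 2|; the diminishing step size 1/sqrt(t) and the practice of tracking the best iterate are standard, but the specific problem is made up:

    # Minimize f(x) = |x - 2| with the subgradient method (made-up toy problem).
    x = 10.0
    best = x
    for t in range(1, 1001):
        g = 1.0 if x > 2 else (-1.0 if x < 2 else 0.0)  # a subgradient of |x - 2|
        x = x - g / (t ** 0.5)                          # diminishing step 1/sqrt(t)
        if abs(x - 2) < abs(best - 2):
            best = x                                    # keep the best iterate seen

    print(best)  # close to the minimizer x* = 2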

Mar 18 Lecture 13 Center of gravity method, method of conjugate gradients Notes Homework 2 Out

Mar 20 Lecture 14 Conjugate gradients, Accelerated gradient descent Notes
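
To accompany Lecture 14, a short sketch of the conjugate gradient method for A x = b with A symmetric positive definite (equivalently, minimizing 0.5 x^T A x - b^T x); the 2x2 system is a made-up example:

    import numpy as np

    # Conjugate gradients on a made-up SPD system A x = b.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])

    x = np.zeros(2)
    r = b - A @ x                     # residual = negative gradient
    p = r.copy()                      # first search direction
    for _ in range(len(b)):           # exact convergence in at most n steps
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)    # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p          # next A-conjugate direction
        r = r_new

    print(x, np.linalg.solve(A, b))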

Mar 25 Lecture 15 Nesterov’s acceleration, why acceleration works Notes Homework 2 Submission
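
Here is a sketch of one common variant of Nesterov's accelerated gradient descent (Lecture 15) on a toy ill-conditioned quadratic; the momentum schedule is the standard t_{k+1} = (1 + sqrt(1 + 4 t_k^2))/2 rule, and all constants are illustrative:

    import numpy as np

    # Accelerated gradient descent on f(x) = 0.5 x^T A x - b^T x (made-up A, b).
    A = np.array([[10.0, 0.0], [0.0, 1.0]])  # condition number 10
    b = np.array([1.0, 1.0])
    L = 10.0                                 # smoothness constant = largest eigenvalue

    x = np.zeros(2)
    y = x.copy()
    t = 1.0
    for _ in range(100):
        x_next = y - (A @ y - b) / L         # gradient step at the lookahead point
        t_next = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next

    print(x, np.linalg.solve(A, b))  # both approach the minimizer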

Mar 27 Lecture 16 Accelerated gradient descent: proof of convergence, stochastic methods Notes

Apr 1 Lecture 17 Stochastic gradient descent and convergence, revisiting SVM and perceptron Notes
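
For Lecture 17, a hedged sketch of stochastic gradient descent on a least-squares objective: each step uses the gradient of a single randomly sampled term instead of the full sum. Data, step size, and iteration count are made up:

    import numpy as np

    # SGD for least squares: f(w) = (1/n) sum_i (a_i . w - y_i)^2 (made-up data).
    rng = np.random.default_rng(0)
    n, d = 200, 3
    A = rng.standard_normal((n, d))
    w_true = np.array([1.0, -2.0, 0.5])
    y = A @ w_true + 0.01 * rng.standard_normal(n)

    w = np.zeros(d)
    for _ in range(5000):
        i = rng.integers(n)                 # sample one data point
        g = 2.0 * (A[i] @ w - y[i]) * A[i]  # stochastic gradient of that term
        w -= 0.01 * g                       # fixed step (a 1/t schedule also works)

    print(w)  # close to w_true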

Apr 3 Lecture 18 Second order: Newton’s method Notes
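
A one-dimensional sketch of Newton's method from Lecture 18, iterating x <- x - f'(x)/f''(x); the quartic objective is a made-up example:

    # Newton's method for minimizing f(x) = x^4 - 3 x^2 + x (made-up example).
    def fp(x):   # first derivative
        return 4 * x**3 - 6 * x + 1

    def fpp(x):  # second derivative
        return 12 * x**2 - 6

    x = 2.0      # start near the right-hand local minimum
    for _ in range(20):
        x = x - fp(x) / fpp(x)

    print(x, fp(x))  # a stationary point: f'(x) is ~0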

Apr 8 Lecture 19 Theory of optimization, Lagrangian, dual problems, strong duality and Slater’s condition, complementary slackness Notes

Apr 10 Lecture 20 KKT conditions, Linear Program and duality Notes
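
To go with Lecture 20, here is a small linear program solved with SciPy (assuming scipy is installed; the data are made up). By strong duality for feasible, bounded LPs, the dual optimum equals the primal optimum here:

    from scipy.optimize import linprog

    # Primal LP:  min c^T x  s.t.  A_ub x <= b_ub, x >= 0  (made-up data).
    c = [1.0, 2.0]
    A_ub = [[-1.0, -1.0]]  # encodes the constraint x1 + x2 >= 1
    b_ub = [-1.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, res.fun)  # optimum x = [1, 0] with value 1; the dual optimum is also 1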

Apr 17 In-Class Midterm

Apr 22 Lecture 21 Discrete optimization, integer LP and rounding, integrality gap, submodular set functions Notes

Apr 24 Homework 3 Out

Apr 29 Lecture 23 Submodular minimization, Lovász extension, gradient descent for submodular minimization Notes

May 1 Lecture 24 Submodular maximization and greedy algorithm, generic recipes for nonconvex optimization: sparse recovery, low-rank approximation, PCA Notes Homework 3 Submission
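
Finally, a sketch of the greedy algorithm for monotone submodular maximization under a cardinality constraint (Lecture 24), using a made-up coverage function; greedy is the classic (1 - 1/e)-approximation:

    # Greedy for max coverage (a monotone submodular function) with |S| <= k.
    # The candidate sets below are made-up toy data.
    sets = {
        "a": {1, 2, 3},
        "b": {3, 4},
        "c": {4, 5, 6, 7},
        "d": {1, 7},
    }
    k = 2

    chosen, covered = [], set()
    for _ in range(k):
        # Pick the set with the largest marginal gain f(S + e) - f(S).
        best = max((s for s in sets if s not in chosen),
                   key=lambda s: len(sets[s] - covered))
        chosen.append(best)
        covered |= sets[best]

    print(chosen, covered)  # ['c', 'a'] covering {1, 2, 3, 4, 5, 6, 7}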