Machine Learning Theory
(STATS214/CS229M, Fall 2022)
- Instructor: Tselil Schramm
- Time: Mondays & Wednesdays, 13:30–14:50 Pacific
- Location: 530-127
- TAs: John Cherian, Yash Nair, Asher Spector, and Yu Wang
- Office hours: Monday 17:30 in 160-B40 (Yash), Tuesday 17:30 in 200-303 (Asher), Wednesday 15:30 in TBD (John), Thursday 10:00 in Sequoia Hall 220 (Yu)
There is no official course text, but you may find the following resources useful:
- Lecture notes from a previous iteration of this course, taught by Tengyu Ma.
- Lecture notes from an earlier iteration of this course, taught by Percy Liang.
- "High-dimensional statistics: a non-asymptotic viewpoint" by Martin Wainwright, available for free with your Stanford ID.
- "Convex Optimization" by Boyd and Vandenberghe, freely available online.
Course Policies: A detailed overview of course policies (including grading and assignments) can be found in the course syllabus.
The goal of this course is to develop mathematical tools for understanding machine learning.
We will explore the following questions:
- How do machine learning algorithms work?
- How can we quantify their success?
- How much data do we need in order to guarantee that we actually learned something?
This is a theoretical, proof-based course, and our focus will be on algorithms with rigorous guarantees wherever such guarantees exist.
Lectures and Reading
Below is a preliminary schedule (subject to change), including the readings relevant to each lecture.