Machine Learning Theory
(STATS214/CS229M, Fall 2022)
- Instructor: Tselil Schramm
- Time: Mondays & Wednesdays, 13:30–14:50 Pacific
- Location: 530-127
- TAs: John Cherian, Yash Nair, Asher Spector, and Yu Wang
- Office hours:
  - Yash: Monday 17:30–19:30 in 160-B40
  - Asher: Tuesday 17:30–19:30 in 200-303
  - John: Wednesday 15:30–17:30 in 120-314
  - Yu: Thursday 12:30–14:30 in 80-113
  - Professor: by appointment
Resources:
We will have no official course text, but you may find the following resources useful:
- Lecture notes from the previous iteration of this course, taught by Tengyu Ma.
- Lecture notes from an earlier iteration of this course, taught by Percy Liang.
- "High-dimensional statistics: a non-asymptotic viewpoint" by Martin Wainwright, available for free with your Stanford ID.
- "Convex Optimization" by Boyd and Vandenberghe, freely available online.
Prerequisites:
Course Policies: A detailed overview of course policies (including grading and assignments) can be found in the course syllabus.
Course description
The goal of this course is to provide mathematical tools for understanding machine learning.
We will explore the following questions:
- How do machine learning algorithms work?
- How can we quantify their success?
- How much data do we need to guarantee that we actually learned something?
This is a theoretical, proof-based course, and our focus will be on algorithms with rigorous guarantees wherever such guarantees are available (an illustrative example of such a guarantee follows below).
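As one illustrative example of the kind of guarantee the last question asks for (not taken from the course materials, and stated under simplifying assumptions: a loss bounded in [0,1], a finite hypothesis class, and i.i.d. samples), here is a standard uniform-convergence bound sketched in our own notation; the symbols L, L-hat, H, n, and delta below are illustrative, not the course's.

```latex
% A sketch of a standard uniform-convergence guarantee, under the assumptions
% stated above: loss values in [0,1], a finite hypothesis class \mathcal{H},
% and n i.i.d. training samples. By Hoeffding's inequality and a union bound,
% with probability at least 1 - \delta over the draw of the sample,
\[
  \forall\, h \in \mathcal{H}: \quad
  L(h) \;\le\; \widehat{L}(h) \;+\; \sqrt{\frac{\ln\lvert\mathcal{H}\rvert + \ln(1/\delta)}{2n}} ,
\]
% where L(h) is the population risk and \widehat{L}(h) the empirical risk.
% Equivalently, n = O\!\bigl((\ln\lvert\mathcal{H}\rvert + \ln(1/\delta))/\varepsilon^2\bigr)
% samples suffice to make the generalization gap at most \varepsilon for every h.
```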
Lectures and Reading
Below is a preliminary schedule (subject to change), including the readings relevant to each lecture.