Machine Learning Theory
(STATS214/CS229M, Fall 2024)
- Instructor: Tselil Schramm
- Time: Mondays & Wednesdays, 13:30–14:50 Pacific
- Location: Econ 140
- TAs: Yash Nair and Ziang Song
- Office hours: listed on Canvas
Resources:
We will have no official course text, but you may find the following resources useful:
- Handwritten lecture notes from the previous iteration of this course on Canvas.
- Typed unofficial lecture notes written by students from a previous iteration of this course: one set by Max Du, and another by Neil Rathi. Thank you Max and Neil! Note that these notes have not been checked for correctness.
- Lecture notes from a previous iteration of this course, taught by Tengyu Ma.
- Lecture notes from a previous iteration of this course, taught by Percy Liang.
- "High-dimensional statistics: a non-asymptotic viewpoint" by Martin Wainwright, available for free with your Stanford ID.
- "Convex Optimization" by Boyd and Vandenberghe, freely available online.
Prerequisites:
Course Policies: A detailed overview of course policies (including grading and assignments) can be found in the course syllabus.
Course description
The goal of this course is to give a mathematical framework for understanding machine learning.
We will explore the following questions:
How do machine learning algorithms work?
How can we formally quantify their success?
How much data do they need in order to learn?
This is a theoretical, proof-based course, and our focus will be on algorithms with rigorous guarantees wherever such guarantees are available.
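As an illustrative example (not drawn from the course materials) of the kind of guarantee we have in mind, a classical result for a finite hypothesis class $\mathcal{H}$ relates a predictor's test performance to its training performance and the number of samples $n$: with probability at least $1-\delta$ over $n$ i.i.d. training examples, every $h \in \mathcal{H}$ satisfies
\[
  R(h) \;\le\; \widehat{R}_n(h) \;+\; \sqrt{\frac{\log|\mathcal{H}| + \log(2/\delta)}{2n}},
\]
where $R(h)$ is the population (test) risk and $\widehat{R}_n(h)$ is the empirical (training) risk. Bounds of this flavor speak directly to "how much data do they need?": the gap between training and test error shrinks at rate $O(1/\sqrt{n})$.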
Lectures and Reading
Below is a preliminary schedule (subject to change), including the readings relevant to each lecture.