Machine Learning Theory
(STATS214/CS229M, Fall 2023)
- Instructor: Tselil Schramm
- Time: Mondays & Wednesdays, 13:30–14:50 Pacific
- Location: 200-205
- TAs: Yash Nair and Asher Spector
- Office hours: TBA
Resources:
We will have no official course text, but you may find the following resources useful:
- Handwritten lecture notes from the previous iteration of this course, on Canvas.
- Lecture notes from a previous iteration of this course, taught by Tengyu Ma.
- Lecture notes from a previous iteration of this course, taught by Percy Liang.
-  "High-dimensional statistics: a non-asymptotic viewpoint" by Martin Wainwright, available for free with your Stanford ID.
-  "Convex Optimization" by Boyd and Vandenberghe, freely available online.
Prerequisites:

Course Policies:
A detailed overview of course policies (including grading and assignments) can be found in the course syllabus.

Course Description
The goal of this course is to give a mathematical framework for understanding machine learning.
We will explore the following questions:
- How do machine learning algorithms work?
- How can we formally quantify their success?
- How much data do they need in order to learn?
This is a theoretical, proof-based course, and our focus will be on algorithms with rigorous guarantees wherever such guarantees are available.
Lectures and Reading 
Below is a preliminary schedule (subject to change), including the readings relevant to each lecture.