Tselil Schramm

tselil AT stanford DOT edu

I am an assistant professor at Stanford in the Department of Statistics (and in Computer Science and Mathematics, by courtesy).

My research is at the intersection of theoretical computer science and statistics. I study algorithms for high-dimensional estimation problems, and I work to characterize and explain information-computation tradeoffs.

Before joining Stanford, I received my PhD from U.C. Berkeley, where I was lucky to be advised by Prasad Raghavendra and Satish Rao. After that I was a postdoc at Harvard and MIT, hosted by the wonderful quadrumvirate of Boaz Barak, Jon Kelner, Ankur Moitra, and Pablo Parrilo.

Here is a tutorial for pronouncing my name.


Teaching:
Spring 2025: Intro to Statistics (precalculus) (STATS 60)
Winter 2025: Theory of Statistics II (STATS 300B)
Fall 2024: Machine Learning Theory (STATS 214 / CS 228M)
Winter 2024: Theory of Statistics II (STATS 300B)
Fall 2023: Machine Learning Theory (STATS 214 / CS 228M)
Spring 2023: Probability Theory (STATS 116)
Winter 2023: Intro to Stochastic Processes 1 (STATS 217)
Fall 2022: Machine Learning Theory (STATS 214 / CS 228M)
Spring 2022: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 314a)
Winter 2022: Random Processes on Graphs and Lattices (STATS 221)
Spring 2021: Probability Theory (STATS 116)
Winter 2021: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 319)


Selected and Recent Papers [all papers]:

Some easy optimization problems have the overlap-gap property [arXiv]
with Shuangping Li, in submission.

Discrepancy algorithms for the binary perceptron [arXiv]
with Shuangping Li and Kangjie Zhou, in submission.

Fast, robust approximate message passing [arXiv]
with Misha Ivkov, in submission.

Semidefinite programs simulate approximate message passing robustly [arXiv]
with Misha Ivkov, in STOC 2024.

Spectral clustering in the Gaussian mixture block model [arXiv]
with Shuangping Li, in submission.

Local and global expansion in random geometric graphs [arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang, in STOC 2023.

Testing thresholds for high-dimensional sparse random geometric graphs [arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang, in STOC 2022.
Invited to the STOC 2022 special issue of SICOMP.

Statistical query algorithms and low-degree tests are almost equivalent [arXiv]
with Matthew Brennan, Guy Bresler, Sam Hopkins, and Jerry Li, in COLT 2021, runner-up for the Best Paper Award.

Computational barriers to estimation from low-degree polynomials [arXiv]
with Alex Wein, in The Annals of Statistics, 2022.

Subexponential LPs approximate max-cut [arXiv]
with Sam Hopkins and Luca Trevisan, in FOCS 2020.

On the power of sum-of-squares for detecting hidden structures [arXiv]
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, Prasad Raghavendra, and David Steurer, in FOCS 2017.

Strongly refuting random CSPs below the spectral threshold [arXiv]
with Prasad Raghavendra and Satish Rao, in STOC 2017.

Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors [arXiv]
with Sam Hopkins, Jonathan Shi, and David Steurer, in STOC 2016.