Tselil Schramm

tselil AT stanford DOT edu

I am an assistant professor at Stanford in the Department of Statistics (and in Computer Science, by courtesy).

My research is at the intersection of theoretical computer science and statistics. My work aims to develop algorithmic tools for high-dimensional estimation problems and to characterize and explain information-computation tradeoffs.

Before joining Stanford, I received my PhD from U.C. Berkeley, where I was lucky to be advised by Prasad Raghavendra and Satish Rao. After that I was a postdoc at Harvard and MIT, hosted by the wonderful quadrumvirate of Boaz Barak, Jon Kelner, Ankur Moitra, and Pablo Parrilo.

Here is a tutorial for pronouncing my name.


Teaching:
Winter 2024: Theory of Statistics II (STATS 300B)
Fall 2023: Machine Learning Theory (STATS 214 / CS 228M)
Spring 2023: Probability Theory (STATS 116)
Winter 2023: Intro to Stochastic Processes 1 (STATS 217)
Fall 2022: Machine Learning Theory (STATS 214 / CS 228M)
Spring 2022: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 314a)
Winter 2022: Random Processes on Graphs and Lattices (STATS 221)
Spring 2021: Probability Theory (STATS 116)
Winter 2021: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 319)


Selected and Recent Papers [all papers]:

Semidefinite programs simulate approximate message passing robustly [arXiv]
with Misha Ivkov, in STOC 2024.

Spectral clustering in the Gaussian mixture block model [arXiv]
with Shuangping Li, in submission.

Local and global expansion in random geometric graphs [arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang, in STOC 2023.

The Franz-Parisi Criterion and Computational Trade-offs in High Dimensional Statistics [arXiv]
with Afonso Bandeira, Ahmed El Alaoui, Sam Hopkins, Alex Wein, and Ilias Zadik, in NeurIPS 2022.

Testing thresholds for high-dimensional sparse random geometric graphs [arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang, in STOC 2022.
Invited to the STOC 2022 special issue of SICOMP.

Statistical query algorithms and low-degree tests are almost equivalent [arXiv]
with Matthew Brennan, Guy Bresler, Sam Hopkins, and Jerry Li, in COLT 2021; runner-up for the Best Paper Award.

Computational barriers to estimation from low-degree polynomials [arXiv]
with Alex Wein, in The Annals of Statistics, 2022.

Subexponential LPs approximate max-cut [arXiv]
with Sam Hopkins and Luca Trevisan, in FOCS 2020.

On the power of sum-of-squares for detecting hidden structures [arXiv]
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, Prasad Raghavendra, and David Steurer, in FOCS 2017.

Strongly refuting random CSPs below the spectral threshold [arXiv]
with Prasad Raghavendra and Satish Rao, in STOC 2017.

Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors [arXiv]
with Sam Hopkins, Jonathan Shi, and David Steurer, in STOC 2016.