Tselil Schramm
tselil AT stanford DOT edu
I am an assistant professor of Statistics at Stanford University.
I am broadly interested in the theory of algorithms, optimization, and computational complexity, especially for problems arising in statistics. My work aims to develop algorithmic tools for high-dimensional estimation problems and to characterize and explain information-computation tradeoffs. Some specific themes in my research are: understanding the algorithmic power of the sum-of-squares hierarchy of semidefinite programs, developing fast spectral methods, and relating the power of different models of computation for high-dimensional estimation tasks.
Before joining Stanford, I received my PhD from U.C. Berkeley, where I was lucky to be advised by Prasad Raghavendra and Satish Rao.
After that I was a postdoc at Harvard and MIT, hosted by the wonderful quadrumvirate of Boaz Barak, Jon Kelner, Ankur Moitra, and Pablo Parrilo.
I also spent two lovely semesters as a research fellow at the Simons Institute, as part of the programs on Optimization and on Probability, Geometry, and Computation in High Dimensions.
Here is a tutorial for pronouncing my name.
Teaching:
Winter 2023: Introduction to Stochastic Processes I (STATS 217)
Fall 2022: Machine Learning Theory (STATS 214 / CS 229M)
Spring 2022: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 314a)
Winter 2022: Random Processes on Graphs and Lattices (STATS 221)
Spring 2021: Probability Theory (STATS 116)
Winter 2021: The Sum-of-Squares Algorithmic Paradigm in Statistics (STATS 319)
Papers:
Local and global expansion in random geometric graphs
[arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang,
in submission.
The Franz-Parisi Criterion and Computational Trade-offs in High Dimensional Statistics
[arXiv]
with Afonso Bandeira, Ahmed El Alaoui, Sam Hopkins, Alex Wein, and Ilias Zadik,
to appear in NeurIPS 2022.
Testing thresholds for high-dimensional sparse random geometric graphs
[arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang,
in STOC 2022.
The SDP value of random 2CSPs
[arXiv]
with Amulya Musipatla, Ryan O'Donnell, and Xinyu Wu,
in ICALP 2022.
Robust Regression Revisited: Acceleration and Improved Estimation Rates
[arXiv]
with Arun Jambulapati, Jerry Li, and Kevin Tian,
in NeurIPS 2021.
Statistical query algorithms and low-degree tests are almost equivalent
[arXiv]
with Matthew Brennan, Guy Bresler, Sam Hopkins, and Jerry Li,
in COLT 2021, runner-up for the Best Paper award.
Non-asymptotic approximations of neural networks by Gaussian processes
[arXiv]
with Dan Mikulincer and Ronen Eldan,
in COLT 2021.
The strongish planted clique hypothesis and its consequences
[arXiv]
with Pasin Manurangsi and Aviad Rubinstein,
in ITCS 2021.
Computational barriers to estimation from low-degree polynomials
[arXiv]
with Alex Wein,
in The Annals of Statistics, 2022.
Playing unique games on certified small-set expanders
[arXiv]
with Mitali Bafna, Boaz Barak, Pravesh Kothari, and David Steurer,
in STOC 2021.
Subexponential LPs approximate max-cut
[arXiv]
with Sam Hopkins and Luca Trevisan,
in FOCS 2020.
(Nearly) efficient algorithms for the graph matching problem on correlated random graphs
[arXiv]
with Boaz Barak, Chi-Ning Chou, Zhixian Lei, and Yueqi Sheng,
in NeurIPS 2019.
Sherali-Adams strikes back
[arXiv]
with Ryan O'Donnell,
in CCC 2019.
Invited to the CCC 2019 special issue of Theory of Computing.
A robust spectral algorithm for overcomplete tensor decomposition
[PMLR]
with Sam Hopkins and Jonathan Shi,
in COLT 2019.
SOS lower bounds with hard constraints: think global, act local
[arXiv]
with Pravesh Kothari and Ryan O'Donnell,
in ITCS 2019.
The threshold for SDP-refutation of random regular NAE-3SAT
[arXiv]
with Yash Deshpande, Andrea Montanari, Ryan O'Donnell, and Subhabrata Sen,
in SODA 2019.
Computing exact minimum cuts without knowing the graph
[arXiv]
with Aviad Rubinstein and Matt Weinberg,
in ITCS 2018.
On the power of sum-of-squares for detecting hidden structures
[arXiv]
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, Prasad Raghavendra, and David Steurer,
in FOCS 2017.
Fast and robust tensor decomposition with applications to dictionary learning
[arXiv]
with David Steurer,
in COLT 2017.
Strongly refuting random CSPs below the spectral threshold
[arXiv]
with Prasad Raghavendra and Satish Rao,
in STOC 2017.
Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
[arXiv]
with Sam Hopkins, Jonathan Shi, and David Steurer,
in STOC 2016.
On the integrality gap of degree-4 sum-of-squares for planted clique
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, and Prasad Raghavendra,
in SODA 2016 (merge of [this] paper and [this] paper).
Invited to the SODA 2016 special issue of ACM Transactions on Algorithms.
Braess's paradox for the spectral gap in random graphs and delocalization of eigenvectors
[arXiv]
with Ronen Eldan and Miklos Racz,
in Random Structures & Algorithms (2016).
Near optimal LP rounding algorithms for correlation clustering in complete and complete k-partite graphs
[arXiv]
with Shuchi Chawla, Konstantin Makarychev, and Grigory Yaroslavtsev,
in STOC 2015.
Gap amplification for small-set expansion via random walks
[arXiv]
with Prasad Raghavendra,
in APPROX 2014.
Global and local information in clustering labeled block models
[arXiv]
with Varun Kanade and Elchanan Mossel,
in RANDOM 2014, and in IEEE Transactions on Information Theory (2016).