Tselil Schramm's papers (by topic):
Information-computation tradeoffs:
Some easy optimization problems have the overlap-gap property
[arXiv]
with Shuangping Li,
in submission.
Fast, robust approximate message passing
[arXiv]
with Misha Ivkov,
in submission.
Semidefinite programs simulate approximate message passing robustly
[arXiv]
with Misha Ivkov,
in STOC 2024.
The Franz-Parisi Criterion and Computational Trade-offs in High Dimensional Statistics
[arXiv]
with Afonso Bandeira, Ahmed El Alaoui, Sam Hopkins, Alex Wein, and Ilias Zadik,
in NeurIPS 2022.
The SDP value of random 2CSPs
[arXiv]
with Amulya Musipatla, Ryan O'Donnell, and Xinyu Wu,
in ICALP 2022.
Statistical query algorithms and low-degree tests are almost equivalent
[arXiv]
with Matthew Brennan, Guy Bresler, Sam Hopkins, and Jerry Li,
in COLT 2021; runner-up for Best Paper.
Computational barriers to estimation from low-degree polynomials
[arXiv]
with Alex Wein,
in The Annals of Statistics (2022).
The strongish planted clique hypothesis and its consequences
[arXiv]
with Pasin Manurangsi and Aviad Rubinstein,
in ITCS 2021.
SOS lower bounds with hard constraints: think global, act local
[arXiv]
with Pravesh Kothari and Ryan O'Donnell,
in ITCS 2019.
The threshold for SDP-refutation of random regular NAE-3SAT
[arXiv]
with Yash Deshpande, Andrea Montanari, Ryan O'Donnell, and Subhabrata Sen,
in SODA 2019.
On the power of sum-of-squares for detecting hidden structures
[arXiv]
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, Prasad Raghavendra, and David Steurer,
in FOCS 2017.
On the integrality gap of degree-4 sum-of-squares for planted clique
with Sam Hopkins, Pravesh Kothari, Aaron Potechin, and Prasad Raghavendra,
in SODA 2016 (merge of [this] paper and [this] paper).
Invited to the SODA 2016 special issue of ACM Transactions on Algorithms.
Algorithms for estimation, hypothesis testing, and refutation:
Robust Regression Revisited: Acceleration and Improved Estimation Rates
[arXiv]
with Arun Jambulapati, Jerry Li, and Kevin Tian,
in NeurIPS 2021.
(Nearly) efficient algorithms for the graph matching problem on correlated random graphs
with Boaz Barak, Chi-Ning Chou, Zhixian Lei, and Yueqi Sheng,
in NeurIPS 2019.
A robust spectral algorithm for overcomplete tensor decomposition
[PMLR]
with Sam Hopkins and Jonathan Shi,
in COLT 2019.
Spectral clustering in the Gaussian mixture block model
[arXiv]
with Shuangping Li,
in submission.
Fast and robust tensor decomposition with applications to dictionary learning
[arXiv]
with David Steurer,
in COLT 2017.
Strongly refuting random CSPs below the spectral threshold
[arXiv]
with Prasad Raghavendra and Satish Rao,
in STOC 2017.
Fast spectral algorithms from sum-of-squares proofs: tensor decomposition and planted sparse vectors
[arXiv]
with Sam Hopkins, Jonathan Shi, and David Steurer,
in STOC 2016.
Global and local information in clustering labeled block models
[arXiv]
with Varun Kanade and Elchanan Mossel,
in RANDOM 2014 and in IEEE Transactions on Information Theory (2016).
Random graphs:
Local and global expansion in random geometric graphs
[arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang,
in STOC 2023.
Testing thresholds for high-dimensional sparse random geometric graphs
[arXiv]
with Siqi Liu, Sidhanth Mohanty, and Elizabeth Yang,
in STOC 2022.
Invited to the STOC 2022 special issue of SICOMP.
Braess's paradox for the spectral gap in random graphs and delocalization of eigenvectors
[arXiv]
with Ronen Eldan and Miklos Racz,
in Random Structures & Algorithms (2016).
Approximation algorithms and convex programs:
Playing unique games on certified small-set expanders
[arXiv]
with Mitali Bafna, Boaz Barak, Pravesh Kothari, and David Steurer,
in STOC 2021.
Subexponential LPs approximate max-cut
[arXiv]
with Sam Hopkins and Luca Trevisan,
in FOCS 2020.
Sherali-Adams strikes back
[arXiv]
with Ryan O'Donnell,
in CCC 2019.
Invited to the CCC 2019 special issue of Theory of Computing.
Near optimal LP rounding algorithms for correlation clustering in complete and complete k-partite graphs
[arXiv]
with Shuchi Chawla, Konstantin Makarychev, and Grigory Yaroslavtsev,
in STOC 2015.
Gap amplification for small-set expansion via random walks
[arXiv]
with Prasad Raghavendra,
in APPROX 2014.
Miscellaneous:
Non-asymptotic approximations of neural networks by Gaussian processes
[arXiv]
with Dan Mikulincer and Ronen Eldan,
in COLT 2021.
Computing exact minimum cuts without knowing the graph
[arXiv]
with Aviad Rubinstein and Matt Weinberg,
in ITCS 2018.