- 05/17/19: I have given a talk “Extension theorems, dimension reduction, and clustering” at the Simons Collaboration on Algorithms & Geometry Annual Meeting
- 04/21/19: One paper accepted to ICML 2019
- 04/18/19: One paper accepted to COLT 2019
- 04/03/19: New paper “SANNS: Scaling Up Secure Approximate $k$-Nearest Neighbors Search”
- 02/11/19: One paper accepted to STOC 2019
- 02/07/19: New paper “On Mean Estimation for General Norms with Statistical Queries”
- 01/26/19: New paper “Learning Sublinear-Time Indexing for Nearest Neighbor Search”
- 12/25/18: The video of my talk “New Algorithms for High-Dimensional Data” is up. It is a coherent overview of my research as of the beginning of 2017
- 12/07/18: I am looking for two interns to work on privacy-preserving machine learning: apply soon!
- 11/29/18: I have given a talk “Hölder Homeomorphisms and Approximate Nearest Neighbors” at the Simons Institute (video).
- 11/15/18: New paper “Adversarial Examples from Cryptographic Pseudo-Random Generators”
- 11/08/18: New paper “Performance of Johnson-Lindenstrauss Transform for k-Means and k-Medians Clustering”; see the talk given by Yury Makarychev at the Simons Institute.
- 09/13/18: I have created a webpage for the class “Algorithms through Geometric Lens” I will be teaching this Fall. (See also the previous offering.)
- 09/10/18: I have given a talk “Hölder Homeomorphisms and Approximate Nearest Neighbors” at the Harvard Theory of Computation Seminar (slides).
- 08/31/18: I have given a tutorial on nearest neighbor search at the Foundations of Data Science Bootcamp at the Simons Institute: part 1, part 2.
- 08/13/18: An article in Quanta Magazine covering some of my recent work.
- S. Bubeck, Y.T. Lee, E. Price and IR, Adversarial Examples from Computational Constraints, ICML, 2019 (arXiv + arXiv).
  Accepted as a long talk
- K. Makarychev, Y. Makarychev and IR, Performance of Johnson-Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering, STOC, 2019 (arXiv).
  Invited to the special issue of Theory of Computing
- A. Andoni, A. Naor, A. Nikolov, IR and E. Waingarten, Hölder Homeomorphisms and Approximate Nearest Neighbors, FOCS, 2018 (local copy).
- A. Andoni, A. Naor, A. Nikolov, IR and E. Waingarten, Data-Dependent Hashing via Nonlinear Spectral Gaps, STOC, 2018 (local copy).
- A. Andoni, T. Laarhoven, IR and E. Waingarten, Optimal Hashing-based Time-Space Trade-offs for Approximate Near Neighbors, SODA, 2017 (arXiv).
  Invited to the special issue of ACM Transactions on Algorithms
- A. Andoni, P. Indyk, T. Laarhoven, IR and L. Schmidt, Practical and Optimal LSH for Angular Distance, NIPS, 2015 (arXiv).
- A. Andoni and IR, Optimal Data-Dependent Hashing for Approximate Near Neighbors, STOC, 2015 (arXiv).
  Invited to Highlights of Algorithms (HALG 2016)
- A. Andoni, R. Krauthgamer and IR, Sketching and Embedding are Equivalent for Norms, STOC, 2015 (arXiv).
  Appeared in the special issue of SIAM Journal on Computing
  Invited to Highlights of Algorithms (HALG 2016)
See the full list of papers (with slides, posters, videos, etc.).
During Fall 2017, I was a Simons Junior Fellow at Columbia University, hosted by Alexandr Andoni. In August 2017 I graduated with a Ph.D. in Computer Science from the MIT Computer Science and AI Laboratory, where I was privileged to be advised by Piotr Indyk. Check out my thesis “High-Dimensional Similarity Search and Sketching: Algorithms and Hardness”. In June 2012 I graduated from Moscow State University with a B.S. in Mathematics; my advisors were the great Maxim Babenko and Sasha Shen.
- Email: email@example.com, firstname.lastname@example.org
- My CV (as of July 2, 2018)
- My Google Scholar profile
- My research statement (as of December 2016)
I do Theoretical Computer Science. My research interests revolve around efficient algorithms for massive datasets with geometric structure.
I help organize TCS+, a series of online seminars in Theory. Watch some of the past talks: the speaker list can easily compete with that of any offline seminar I know of! If you want to suggest a speaker, fill out the form.
My wife does Theory as well!
VIDEOS OF MY TALKS
- “Hölder Homeomorphisms and Approximate Nearest Neighbors” (Simons Institute)
- A tutorial on nearest neighbor search: part 1, part 2 (Foundations of Data Science Bootcamp @ Simons Institute)
- “Beyond John's Ellipsoid” (Learning and Optimization workshop at MSR)
- “Optimal Hashing-based Time-Space Trade-offs for Approximate Near Neighbors” (CMU Theory Lunch)
- “Sketching and Embedding are Equivalent for Norms” (TCS+)
- “Locality-Sensitive Hashing and Beyond” (Microsoft Research)
- FALCONN: a highly optimized C++ library (with Python bindings) for similarity search based on LSH.
- QuadSketch: a simple and fast algorithm for compressing Euclidean metrics.
- pytorch_kmeans: implementation of the k-means algorithm in PyTorch that works for large datasets.
- Fall 2018: The second incarnation of Algorithms through Geometric Lens, University of Washington.
- Fall 2017: co-teaching (with Alexandr Andoni) the class Algorithms through Geometric Lens, Columbia University.
- May 2017: mini-course Algorithms for high-dimensional data at the school Recent Advances in Algorithms, Saint Petersburg.
- January 2016: mini-course From “compact presentations” to algorithms for big data at PDMI CS Club (me and the students who took the class)
- Fall 2015: TA for Introduction to Algorithms (6.006) at MIT (taught by Piotr Indyk and Ron Rivest)
- Spring 2014: TA for Piotr Indyk's course Geometric Computing (6.850) at MIT
- Spring 2011: TA for Andrew Goldberg's mini-course Path and Flow Algorithms at the Yandex School of Data Analysis