During Fall 2017, I was a Simons Junior Fellow at Columbia University, hosted by Alexandr Andoni. In August 2017, I received my Ph.D. in Computer Science from the MIT Computer Science and Artificial Intelligence Laboratory, where I was privileged to be advised by Piotr Indyk; check out my award-winning thesis, “High-Dimensional Similarity Search and Sketching: Algorithms and Hardness”. In June 2012, I graduated from Moscow State University with a B.S. in Mathematics, advised by Maxim Babenko and Sasha Shen.
I am privileged to work with and learn from many great “junior” researchers:
- Bihao Zhang (MS student, Columbia)
- Erik Waingarten (intern, Summer 2018)
- Yihe Dong (AI Resident, 2018–2019)
- Mohammad Samragh (intern, Summer 2019)
- Sameer Wagh (intern, Summer 2019)
- Tal Wagner (intern, Summer 2019)
NEWS
- 06/12/19: New paper “Provably Robust Deep Learning via Adversarially Trained Smoothed Classifiers”
- 06/11/19: I gave a talk, “Adversarial Examples from Computational Constraints”, at ICML 2019
- 05/26/19: I gave a talk, “Hölder Homeomorphisms and Approximate Nearest Neighbors”, at the Workshop on Metric Embeddings and Dimensionality Reduction at Northwestern University
- 05/17/19: I gave a talk, “Extension theorems, dimension reduction, and clustering”, at the Simons Collaboration on Algorithms & Geometry Annual Meeting
- 04/21/19: One paper accepted to ICML 2019
- 04/18/19: One paper accepted to COLT 2019
- 04/03/19: New paper “SANNS: Scaling Up Secure Approximate $k$-Nearest Neighbors Search”
- 02/11/19: One paper accepted to STOC 2019
- 02/07/19: New paper “On Mean Estimation for General Norms with Statistical Queries”
- 01/26/19: New paper “Learning Sublinear-Time Indexing for Nearest Neighbor Search”
- Email: firstname.lastname@example.org, email@example.com
- My CV (as of July 2, 2018)
- My Google Scholar profile
- My research statement (as of December 2016)
My wife does Theory as well!
VIDEOS OF MY TALKS
- “Hölder Homeomorphisms and Approximate Nearest Neighbors” (Simons Institute)
- A tutorial on nearest neighbor search: part 1, part 2 (Foundations of Data Science Bootcamp @ Simons Institute)
- “Beyond John's Ellipsoid” (Learning and Optimization workshop at MSR)
- “Optimal Hashing-based Time-Space Trade-offs for Approximate Near Neighbors” (CMU Theory Lunch)
- “Sketching and Embedding are Equivalent for Norms” (TCS+)
- “Locality-Sensitive Hashing and Beyond” (Microsoft Research)
SOFTWARE
- FALCONN: a highly optimized C++ library (with Python bindings) for similarity search based on LSH.
- QuadSketch: a simple and fast algorithm for compressing Euclidean metrics.
- pytorch_kmeans: implementation of the k-means algorithm in PyTorch that works for large datasets.
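To give a flavor of the LSH idea behind FALCONN, here is a minimal, self-contained Python sketch of hyperplane LSH for cosine similarity. This is purely illustrative and is not FALCONN's API: all names (`random_hyperplanes`, `hash_vector`, etc.) are made up for this example, and a real library would use many hash tables and multiprobing rather than a single bucket lookup.

```python
import math
import random

def random_hyperplanes(dim, num_bits, seed=0):
    """Sample Gaussian vectors; each defines a hyperplane contributing one hash bit."""
    rng = random.Random(seed)
    return [[rng.gauss(0.0, 1.0) for _ in range(dim)] for _ in range(num_bits)]

def hash_vector(planes, v):
    """Hyperplane LSH: the hash is the sign pattern of the inner products <v, r_i>."""
    return tuple(1 if sum(p_j * v_j for p_j, v_j in zip(p, v)) >= 0 else 0
                 for p in planes)

def build_table(planes, dataset):
    """Group dataset indices into buckets by their hash value."""
    table = {}
    for idx, v in enumerate(dataset):
        table.setdefault(hash_vector(planes, v), []).append(idx)
    return table

def query(planes, table, dataset, q):
    """Scan only the query's bucket and return the most cosine-similar candidate."""
    candidates = table.get(hash_vector(planes, q), [])
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.sqrt(sum(a * a for a in u)) *
                      math.sqrt(sum(b * b for b in v)))
    return max(candidates, key=lambda i: cosine(dataset[i], q), default=None)
```

Because similar vectors agree on most sign bits, they tend to collide in the same bucket, so a query inspects only a small fraction of the dataset instead of scanning it all.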
TEACHING
- Fall 2018: The second incarnation of Algorithms through Geometric Lens, University of Washington.
- Fall 2017: co-taught (with Alexandr Andoni) the class Algorithms through Geometric Lens at Columbia University.
- May 2017: mini-course Algorithms for high-dimensional data at the school Recent Advances in Algorithms, Saint Petersburg.
- January 2016: mini-course From “compact presentations” to algorithms for big data at the PDMI CS Club (pictured: me and the students who took the class)
- Fall 2015: TA for Introduction to Algorithms (6.006) at MIT (taught by Piotr Indyk and Ron Rivest)
- Spring 2014: TA for Piotr Indyk's course Geometric Computing (6.850) at MIT
- Spring 2011: TA for Andrew Goldberg's mini-course Path and Flow Algorithms at the Yandex School of Data Analysis