Fields of expertise
Specific topics I have worked on include streaming, sketching, randomized numerical linear algebra, kernel methods, locality sensitive hashing, and sparse recovery.
- Designed the first sublinear-time algorithm with optimal sample complexity for black-box optimization, with applications to hyperparameter tuning of deep networks and learning decision trees.
- Designed several improved algorithms for scaling up kernel-based learning methods to massive high-dimensional datasets. The techniques include adaptive leverage score sampling, locality sensitive hashing, oblivious sketching, and Fourier features sampling. These algorithms are the first to resolve the curse of dimensionality inherent in all prior results for the Gaussian kernel.
- Designed the first sublinear-time algorithm for model-based compressive sensing.
- Designed new fast algorithms for computing sparse Fast Fourier Transforms (FFT) in high dimensions for signal processing applications; this is the first algorithm to resolve the curse of dimensionality for the sparse FFT problem.
- Designed improved algorithms for maximizing submodular functions on a massive stream of data, for machine learning and data mining applications.
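To give a flavor of the Fourier features sampling mentioned above, here is a minimal random Fourier features sketch for approximating a Gaussian kernel. This is a generic textbook-style construction (frequencies drawn from the kernel's spectral density), not the improved sampling schemes from my papers; all names and parameters are illustrative.

```python
import numpy as np

def random_fourier_features(X, num_features, gamma, rng):
    """Map rows of X to features Z such that Z @ Z.T approximates the
    Gaussian kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    d = X.shape[1]
    # Frequencies are sampled from the kernel's spectral density, N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
Z = random_fourier_features(X, num_features=5000, gamma=0.5, rng=rng)
K_approx = Z @ Z.T

# Exact kernel matrix for comparison.
sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K_exact = np.exp(-0.5 * sq_dists)
```

With enough features the entrywise error shrinks like 1/sqrt(num_features), which is exactly the regime where more refined sampling (e.g., leverage-score-based) pays off.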
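For the sparse FFT setting, the following toy sketch shows the problem being solved: a signal whose spectrum has only k nonzero frequencies is recovered by keeping the k largest FFT coefficients. This naive baseline runs the full O(n log n) FFT; the point of sparse FFT algorithms is to do this in time sublinear in n, which this sketch does not attempt.

```python
import numpy as np

n, k = 256, 3
rng = np.random.default_rng(1)

# Build a signal whose spectrum has exactly k nonzero frequencies.
freqs = rng.choice(n, size=k, replace=False)
spectrum = np.zeros(n, dtype=complex)
spectrum[freqs] = rng.normal(size=k) + 1j * rng.normal(size=k)
x = np.fft.ifft(spectrum)

# Naive recovery: compute the full FFT and keep the k largest coefficients.
X_hat = np.fft.fft(x)
top_k = np.argsort(np.abs(X_hat))[-k:]
recovered = np.zeros(n, dtype=complex)
recovered[top_k] = X_hat[top_k]
```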
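As a taste of streaming submodular maximization, here is a single-pass thresholding sketch on a toy coverage instance: each arriving element is kept only if its marginal coverage gain meets a threshold tau and the cardinality budget k is not yet spent. The function and variable names are hypothetical, and this illustrates only the generic thresholding idea behind such streaming algorithms, not the specific algorithm from the ICML 2018 paper.

```python
def threshold_stream(stream, k, tau):
    """One pass over a stream of sets, maximizing coverage f(S) = |union of S|."""
    picked, covered = [], set()
    for item in stream:
        gain = len(set(item) - covered)  # marginal gain of adding this set
        if len(picked) < k and gain >= tau:
            picked.append(item)
            covered |= set(item)
    return picked, covered

stream = [{1, 2}, {2, 3}, {3, 4, 5}, {1}, {5, 6, 7, 8}]
picked, covered = threshold_stream(stream, k=2, tau=3)
```

Running several such thresholds in parallel, for a geometric grid of guesses of the optimum, is the standard way to turn this one-pass rule into an algorithm with a provable approximation guarantee.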
Publications
- Scaling up Kernel Ridge Regression via Locality Sensitive Hashing. International Conference on Artificial Intelligence and Statistics (AISTATS 2020)
- Oblivious Sketching of High-Degree Polynomial Kernels. ACM-SIAM Symposium on Discrete Algorithms (SODA 2020)
- Efficiently Learning Fourier Sparse Set Functions. Conference on Neural Information Processing Systems (NeurIPS 2019)
- A Universal Sampling Method for Reconstructing Signals with Simple Fourier Transforms. ACM SIGACT Symposium on Theory of Computing (STOC 2019)
- Dimension-independent Sparse Fourier Transform. ACM-SIAM Symposium on Discrete Algorithms (SODA 2019)
- Beyond 1/2-Approximation for Submodular Maximization on Massive Data Streams. International Conference on Machine Learning (ICML 2018)
- Random Fourier features for kernel ridge regression: Approximation bounds and statistical guarantees. International Conference on Machine Learning (ICML 2017)
- An adaptive sublinear-time block sparse Fourier transform. ACM SIGACT Symposium on Theory of Computing (STOC 2017)