Welcome

I am Sudeep Salgia, a Postdoctoral Research Associate with Prof. Yuejie Chi in the Department of Electrical and Computer Engineering at Carnegie Mellon University. I obtained my Ph.D. from Cornell University in 2023, where I was fortunate to be advised by Prof. Qing Zhao. I work on sequential learning problems arising in Reinforcement Learning, Stochastic Optimization, Federated and Distributed Learning, and Active Learning. My research focuses on establishing fundamental limits on achievable performance and on developing machine learning algorithms that attain or approach these limits under practical constraints on computational complexity and communication cost. I completed my undergraduate degree in Electrical Engineering at IIT Bombay in 2018.

News and Updates

Sep 2024 Our paper on Federated Q-learning was accepted at NeurIPS 2024 as an oral presentation.
Sep 2024 I will be presenting my work on the Sample-Communication Trade-off in Federated Q-learning at the Allerton Conference.
Aug 2024 New preprint that provides a complete characterization of the Sample-Communication Complexity Trade-off in Federated Q-learning.
May 2024 Our paper on random sampling for Bayesian Optimization was accepted to ICML 2024.
Apr 2024 Our paper on Adaptive Uniformity Testing was accepted to IEEE Transactions on Signal Processing.
Jan 2024 Our paper on communication-efficient Federated Learning was accepted to IEEE Transactions on Signal Processing.
Oct 2023 New preprint that studies random sampling for exploration in Bayesian Optimization. We establish that random sampling is as good as Maximum Posterior Variance sampling and is computationally more efficient to implement. We also propose an algorithm based on random exploration that achieves optimal regret in the noise-free setting, settling an open problem from COLT.
Oct 2023 Our paper on distributed kernel bandits with personalization was accepted to IEEE Transactions on Signal Processing.
Sep 2023 I will be joining CMU as a Postdoctoral Research Associate with Prof. Yuejie Chi.
May 2023 I defended my PhD dissertation!
Apr 2023 Our work on non-asymptotic error bounds for neural nets with smooth activations was accepted to ICML 2023
Apr 2023 Our work on accuracy-communication trade-off frontier in distributed linear bandits was accepted to ICML 2023
Oct 2022 New preprint out that rigorously analyzes the accuracy-communication trade-off frontier in distributed linear bandits at an information-theoretic level. It establishes a new lower bound and proposes a novel algorithm based on progressive learning and sharing that achieves optimal regret while incurring optimal communication cost
Oct 2022 New result that proposes a novel algorithm that achieves order-optimal cumulative regret in distributed stochastic convex optimization with optimal communication cost
May 2022 New result that establishes non-asymptotic error bounds on the difference between an overparameterized neural net and its corresponding Neural Tangent Kernel for smooth activation functions
Mar 2022 New paper in IEEE Transactions on Signal Processing on Disagreement-based Active Learning where we propose a novel algorithm with optimal query complexity and bounded regret
Jan 2022 New paper at ICASSP 2021 on an adaptive test plan for noisy group testing under unknown noise with order-optimal sample complexity
Oct 2021 New preprint out where we propose order-optimal algorithms for the kernel-based federated bandit problem along with strategies to significantly reduce communication while maintaining the regret order
Oct 2021 New preprint out where we propose a sequential strategy for uniformity testing of discrete and continuous distributions whose sample complexity adapts to the distance of the unknown distribution from the uniform distribution
Sep 2021 I passed my candidacy exam
Sep 2021 Our work on an order-optimal algorithm for optimization of RKHS functions was accepted to NeurIPS 2021
May 2021 I will serve as a reviewer for NeurIPS 2021
May 2021 I will be working as an Applied Scientist Intern over the summer at the Machine Learning Solutions Lab, Amazon, hosted by Daniel Horowitz and Emmanuel Salawu
May 2021 New preprint out where we propose the first algorithm for optimization of black-box functions in an RKHS using Gaussian Processes that achieves order-optimal regret with low computational cost (code)
Jan 2021 I will serve as a reviewer for ICML 2021
Jan 2021 Our work on a new adaptive strategy for noisy group testing under general and unknown noise models was accepted to ICASSP 2021
Oct 2020 I will serve as a reviewer for AISTATS 2021
May 2020 New paper on an algorithm for Stochastic Optimization based on Coordinate Minimization was accepted to ICML 2020
Sep 2019 Our paper on a random-walk-based algorithm for stochastic optimization was presented at the Allerton Conference
Aug 2018 Recipient of the Jacobs Scholar Fellowship
Aug 2018 Started my PhD at Cornell University in Electrical Engineering
Aug 2018 Graduated from IIT Bombay as a Silver Medallist (Institute Rank 6)
Apr 2018 Presented our paper on reconstructing a spatiotemporal field using samples from location- and time-unaware sensors at ICASSP 2018
Oct 2017 New preprint on reconstructing bandlimited fields with samples generated from an unknown autoregressive process