⏰ Tutorial time: August 4, 18:30–21:30 Mountain Time
The goal of this tutorial is to provide attendees with an overview of techniques and recipes for distributed estimation and testing under constraints. In recent years, many papers have obtained both upper and lower bounds for statistical estimation under communication, local privacy, and memory constraints: these questions are motivated by applications in machine learning and distributed computing, and lie at the intersection of theoretical computer science, machine learning, statistics, and information theory.
This tutorial will provide a primer on these techniques, aiming to give both an understanding of the underlying challenges and ideas, and "plug-and-play" general recipes that attendees can then apply to problems of their choice. Our focus will be on establishing lower bounds for statistical estimation, in particular for parameter estimation (single- and high-dimensional) and testing. The tutorial will cover several models: the nonadaptive, sequentially adaptive, blackboard, and memory-constrained settings, with applications to high-dimensional parameter estimation and testing.
Presenters: Jayadev Acharya, Clément Canonne, and Himanshu Tyagi
Recitation tutors: Aditya Vikram Singh, Ziteng Sun
📚 Bibliography: available here (BibTeX source)