Aryeh Kontorovich - Learning infinitely many coins simultaneously

Inferring the bias of a single coin from independent flips is a well-understood problem, technically known as estimating the Bernoulli parameter p. In particular, it is known, within tight upper and lower bounds, how the sample size (number of flips) n, the precision ε, and the confidence δ constrain each other. Estimating the biases of d coins simultaneously is likewise well understood, at least in the worst case over the Bernoulli parameters pᵢ. What if we want to estimate infinitely many pᵢ's simultaneously?
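As a reference point for the single-coin case, Hoeffding's inequality gives the standard sufficient sample size n ≥ ln(2/δ)/(2ε²), and a union bound extends this to d coins with only a logarithmic increase, n ≥ ln(2d/δ)/(2ε²) flips per coin. The minimal Python sketch below illustrates the single-coin bound; the particular values of ε, δ, and p are illustrative choices, not taken from the talk.

    import math
    import random

    def sufficient_flips(eps, delta):
        # Hoeffding: P(|p_hat - p| >= eps) <= 2 * exp(-2 * n * eps**2),
        # so n >= ln(2 / delta) / (2 * eps**2) flips suffice.
        return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

    def estimate_bias(p, n, rng):
        # Empirical mean of n independent Bernoulli(p) flips.
        return sum(rng.random() < p for _ in range(n)) / n

    rng = random.Random(0)
    eps, delta = 0.05, 0.01                # illustrative precision/confidence
    n = sufficient_flips(eps, delta)       # 1060 flips for this (eps, delta)
    p_hat = estimate_bias(0.3, n, rng)
    print(n, abs(p_hat - 0.3) <= eps)      # True with probability >= 0.99

Since the d-coin bound grows with ln d, no fixed per-coin sample size of this form covers infinitely many coins at once, which is the regime the talk addresses.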

A simple argument shows that this is impossible in the "worst case" over the pᵢ's; any meaningful guarantee must therefore depend on their actual values. Defining M as the expected maximum deviation between the pᵢ's and their estimates, we ask for which sequences (pᵢ) this quantity decays to zero, and at what rate. We obtain tight, exhaustive answers to these questions.
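Concretely, if each coin is flipped n times and p̂ᵢ denotes its empirical mean, then M = E[supᵢ |p̂ᵢ − pᵢ|]. The Monte Carlo sketch below probes this quantity for one rapidly decaying sequence; the choice pᵢ = 2⁻ⁱ, the truncation to finitely many coins, and the per-coin sample sizes are all illustrative assumptions, not taken from the paper.

    import random

    def max_deviation(ps, n, rng):
        # Largest |p_hat_i - p_i| over the coins when each is flipped n times.
        return max(abs(sum(rng.random() < p for _ in range(n)) / n - p)
                   for p in ps)

    def estimate_M(ps, n, trials=100, seed=0):
        # Monte Carlo estimate of M = E[max_i |p_hat_i - p_i|].
        rng = random.Random(seed)
        return sum(max_deviation(ps, n, rng) for _ in range(trials)) / trials

    # Rapidly decaying biases p_i = 2**-i, truncated purely for simulation.
    ps = [2.0 ** -i for i in range(1, 30)]
    for n in (10, 100, 1000):
        print(n, round(estimate_M(ps, n), 4))

For such a sequence the maximum deviation is dominated by the few coins with non-negligible bias, so the simulated M shrinks as n grows. By contrast, if infinitely many pᵢ are bounded away from 0 and 1, then almost surely some coin's n flips all land the same way, so supᵢ |p̂ᵢ − pᵢ| stays bounded away from zero for every finite n; this is the simple impossibility argument referred to above.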

Presented at COLT 2023, joint work with Doron Cohen.

https://arxiv.org/abs/2209.04054

Date and Time: 
Thursday, May 2, 2024 - 13:30 to 14:30
Speaker: 
Aryeh Kontorovich
Speaker Bio: 

Aryeh Kontorovich received his undergraduate degree in mathematics, with a certificate in applied mathematics, from Princeton University in 2001. He received his M.Sc. and Ph.D. from Carnegie Mellon University, graduating in 2007. After a postdoctoral fellowship at the Weizmann Institute of Science, he joined the Computer Science department at Ben-Gurion University of the Negev in 2009, where he is currently a full professor. His research interests are mainly in machine learning, with a focus on probability, statistics, Markov chains, and metric spaces.

He served as director of the Ben-Gurion University Data Science Research Center in 2021–2022.