Michal Moshkovitz - How to Learn Under Constraints?

The goal of classical machine learning is to learn a high-accuracy model from given examples. The learning process has no constraints beyond running in a reasonable time relative to the input size. However, now that machine learning is used in high-stakes applications like healthcare and law, learning has to obey several new constraints. In this talk, I will focus on two such constraints: explainability and bounded memory. I will present the first explainable algorithm for k-means clustering with provable guarantees. I will then turn to the second constraint, learning with bounded memory, and present a characterization of high-accuracy learning with bounded memory and its equivalence to learning with statistical queries.
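
To give a flavor of explainable clustering, below is a minimal sketch of the general idea of describing clusters with a small axis-aligned threshold tree: run ordinary k-means, then fit a tree with at most k leaves that mimics the k-means assignment, so each cluster is defined by a short sequence of single-feature threshold cuts. This is only an illustrative baseline using scikit-learn, not the specific algorithm with provable guarantees presented in the talk.

```python
# Illustrative sketch: tree-based explainable clustering (simple baseline,
# NOT the provable-guarantee algorithm from the talk).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.tree import DecisionTreeClassifier, export_text

k = 3
X, _ = make_blobs(n_samples=500, centers=k, n_features=2, random_state=0)

# Step 1: reference (unconstrained) k-means clustering.
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)

# Step 2: approximate the k-means partition with a threshold tree
# restricted to k leaves, so every cluster has a human-readable rule.
tree = DecisionTreeClassifier(max_leaf_nodes=k, random_state=0)
tree.fit(X, kmeans.labels_)

# Fraction of points the explainable partition assigns differently
# from the reference k-means solution.
disagreement = np.mean(tree.predict(X) != kmeans.labels_)
print(f"fraction of points reassigned by the tree: {disagreement:.3f}")
print(export_text(tree, feature_names=["x0", "x1"]))
```

The restriction to k leaves is what makes the clustering explainable: each cluster is the set of points satisfying at most a handful of threshold conditions, and the price paid is the (hopefully small) disagreement with the unconstrained k-means solution.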

Date and Time: 
Monday, November 9, 2020 - 19:00 to Tuesday, November 10, 2020 - 19:45
Speaker: 
Michal Moshkovitz
Location: 
Zoom
Speaker Bio: 

Michal is a postdoctoral fellow at the Qualcomm Institute of the University of California, San Diego. Her interests lie in the foundations of AI, exploring how different constraints affect learning. She works on explainable machine learning, bounded memory learning, and online no-substitution clustering. Michal received her PhD from the Hebrew University and an MSc from Tel Aviv University. She was the recipient of an Anita Borg scholarship from Google and a Hoffman scholarship from the Hebrew University.