The influx of data witnessed during the last decade gave rise to groundbreaking applications in data science and machine learning. However, due to hardware constraints, the volume of data grows much faster than the available computational resources. This modern setting poses new challenges for algorithm design, as more efficient methods are needed. One way to obtain such methods is by exploiting the underlying structure of the data.
In this talk, we will discuss three examples, ranging from theory to practice, where structure can be leveraged to improve performance. The first two examples come from sublinear computation, where the combinatorial and geometric structure of the data is used to obtain dramatic time and space savings; the third comes from graph neural networks, where local structure is exploited to obtain superior real-world models.
Amit Levi is a research scientist at Huawei's Noah's Ark Lab in Toronto. He obtained his PhD from the University of Waterloo, where he was advised by Eric Blais. He is interested in developing a rigorous understanding of the interplay between the structure of data and the performance of algorithms, with a particular focus on sublinear computation and graph neural networks.