Colloquium (C)

Gal Amram - Extensions of GR(1): efficient synthesis for meaningful specifications

GR(1) is a popular, practically applicable fragment of Linear Temporal Logic (LTL) for which a feasible reactive synthesis algorithm exists. In this talk, I will present some of my recent work introducing extensions of GR(1): the addition of regular expressions and triggers, energy constraints, and existential guarantees. We support the first two extensions by a reduction to GR(1), and I will present efficient and concise encodings for them.

03/12/2020 - 13:30

Sarah Keren - Better Environments for Better AI

Most AI research focuses exclusively on the AI agent itself: given some input, what improvements to the agent’s reasoning will yield the best possible output? In my research, I take a novel approach to increasing the capabilities of AI agents by using AI to design the environments in which they are intended to act. My methods identify the inherent capabilities and limitations of AI agents and find the best ways to modify their environments in order to maximize performance.

17/12/2020 - 13:30

Ben Lee Volk - Recent Lower Bounds in Algebraic Complexity Theory

Algebraic Complexity Theory studies the complexity of solving algebraic computational tasks using algebraic models of computation.
One major problem in this area is to prove lower bounds on the number of arithmetic operations required for computing explicit polynomials. This natural mathematical problem is the algebraic analog of the P vs. NP problem. It also has tight connections to other classical mathematical areas and to fundamental questions in complexity theory.
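To make the notion of operation counts concrete (an illustration of the setting, not an example from the talk): the same polynomial can be computed by schemes of very different sizes, and lower-bound questions ask how small such counts can possibly be for a given explicit polynomial. For instance, computing x^64 term by term takes 63 multiplications, while repeated squaring takes only 6:

```python
def pow_naive(x, n):
    """x**n via n - 1 successive multiplications by x."""
    ops, r = 0, x
    for _ in range(n - 1):
        r *= x
        ops += 1
    return r, ops

def pow_square(x, k):
    """x**(2**k) via k squarings."""
    ops, r = 0, x
    for _ in range(k):
        r *= r
        ops += 1
    return r, ops

v1, ops1 = pow_naive(3, 64)   # 63 multiplications
v2, ops2 = pow_square(3, 6)   # 6 multiplications, same value
```

Both schemes compute the same polynomial x^64; the gap in operation count is exactly the kind of difference that size lower bounds are about.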

10/12/2020 - 13:30

Shay Deutsch - Spectral Embedding of Graph Networks

We introduce an unsupervised graph embedding that balances local node similarity and connectivity against global structure. The proposed embedding captures both local and global characteristics of a graph by incorporating basis functions derived from its spectral representation. The key idea is to transform the given graph into one whose edge weights measure the centrality of an edge, namely the fraction of shortest paths that pass through it, and to employ the spectral properties of the reweighted graph in the representation.
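As a sketch of this pipeline (an illustration under assumed details, not the speaker's exact construction), the following reweights each edge by its betweenness centrality, the fraction of shortest paths passing through it, and embeds the nodes using the low eigenvectors of the resulting weighted Laplacian:

```python
# Sketch: betweenness-weighted spectral embedding (illustrative only).
import numpy as np
from collections import deque

def edge_betweenness(n, edges):
    """Fraction of shortest paths through each edge (Brandes' algorithm)."""
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    eb = {tuple(sorted(e)): 0.0 for e in edges}
    for s in range(n):
        sigma = [0.0] * n; sigma[s] = 1.0   # shortest-path counts from s
        dist = [-1] * n; dist[s] = 0
        preds = [[] for _ in range(n)]
        order, q = [], deque([s])
        while q:                            # BFS, recording predecessors
            v = q.popleft(); order.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = [0.0] * n                   # dependency accumulation
        for w in reversed(order):
            for v in preds[w]:
                c = sigma[v] / sigma[w] * (1.0 + delta[w])
                eb[tuple(sorted((v, w)))] += c
                delta[v] += c
    npairs = n * (n - 1) / 2
    # Each pair is counted from both endpoints: halve, then normalize.
    return {e: b / 2.0 / npairs for e, b in eb.items()}

def spectral_embedding(n, edges, dim=2):
    """Embed nodes via eigenvectors of the betweenness-weighted Laplacian."""
    w = edge_betweenness(n, edges)
    L = np.zeros((n, n))
    for (u, v), wt in w.items():
        L[u, v] -= wt; L[v, u] -= wt
        L[u, u] += wt; L[v, v] += wt
    vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
    return vecs[:, 1:dim + 1]               # skip the trivial constant vector

# A 6-node "barbell": two triangles joined by a bridge edge (2, 3).
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
emb = spectral_embedding(6, edges)
```

On this barbell graph, the bridge edge receives the largest weight, so the reweighted Laplacian emphasizes the global two-cluster structure.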

30/11/2020 - 19:00

Dean Doron - Randomness in Computation — Extractors and PRGs

Randomness is an incredibly useful, yet expensive, resource. In many important applications, randomness is essential and needs to be as pure as possible. In other cases, most prominently time- and space-bounded algorithms, we can reduce or even eliminate its use. In this talk, I will discuss two, not unrelated, results from the theory of pseudorandomness.

26/11/2020 - 13:30

Reuth Mirsky - The Seeing-Eye Robot: Developing a Human-Aware Artificial Collaborator

In this talk I will present the seeing-eye robot grand challenge and discuss the components required to design and build a service robot that can replace most, if not all, of the functionalities of a seeing-eye dog. This challenge encompasses a variety of research problems that can benefit from human-inspired AI: reasoning about other agents, human-robot interaction, explainability, teaching teammates, and more. For each of these problems, I will present a novel contribution as an example that leverages the bilateral investigation of human and artificial intelligence.

16/11/2020 - 19:00

Michal Moshkovitz - How to Learn Under Constraints?

The goal of classical machine learning is to learn a high-accuracy model from given examples. The learning process has no constraints besides running in a reasonable time relative to the input size. However, as machine learning is now deployed in high-stakes applications like healthcare and law, learning has to obey several new constraints. In this talk, I will focus on two types of constraints: explainability and bounded memory. I will present the first explainable algorithm for k-means clustering that has provable guarantees.
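To convey the flavor of explainable clustering (a toy sketch for k = 2 only; the talk's algorithm builds a threshold tree for general k, with provable approximation guarantees), one can approximate a 2-means clustering by a single axis-aligned threshold cut, so that every cluster assignment is explained by one feature comparison:

```python
# Toy sketch: explain a 2-means clustering with one threshold cut.
import numpy as np

def two_means(X, iters=20):
    """Lloyd's algorithm for k = 2, seeded deterministically by the farthest pair."""
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    i, j = np.unravel_index(np.argmax(d2), d2.shape)
    centers = X[[i, j]]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == c].mean(axis=0) for c in (0, 1)])
    return labels

def best_threshold_cut(X, labels):
    """Axis-aligned cut (feature, threshold) that misassigns the fewest points."""
    best = (0, 0.0, len(X) + 1)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            side = (X[:, f] <= t).astype(int)
            # Take the better of the two matchings of sides to cluster labels.
            mistakes = min((side != labels).sum(), (side == labels).sum())
            if mistakes < best[2]:
                best = (f, t, mistakes)
    return best

# Two well-separated blobs: a single cut recovers the clustering exactly.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
labels = two_means(X)
feat, thr, mistakes = best_threshold_cut(X, labels)
```

Here the whole clustering is explained by one rule of the form "feature `feat` <= `thr`", the depth-one case of the threshold trees used for general k.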

09/11/2020 - 19:00

Evgeny Lipovetsky - Subdivision schemes refining point-normal pairs

Subdivision is a well-known and established method for generating smooth curves and surfaces from discrete data by repeated refinement. The typical input for such a process is a mesh of vertices. In this work we propose to refine data consisting of the vertices of a mesh together with a normal vector at each vertex. This is done by writing a linear subdivision scheme in terms of repeated binary weighted arithmetic means and replacing each such mean by a binary averaging operation on point-normal pairs with the same weight.
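A minimal sketch of the mechanism (illustrative only: the linear point-normal average below, a weighted point mean with a renormalized weighted normal mean, is a simplified stand-in for the geometric average the work actually uses) applies Chaikin's corner-cutting scheme, written as repeated binary weighted means, to point-normal pairs:

```python
# Sketch: Chaikin refinement of point-normal pairs with a binary average.
import numpy as np

def avg(a, b, w):
    """Binary average of point-normal pairs (p, n) with weight w on b.
    Simplified stand-in: linear mean of points, renormalized mean of normals."""
    p = (1 - w) * a[0] + w * b[0]
    n = (1 - w) * a[1] + w * b[1]
    return (p, n / np.linalg.norm(n))

def chaikin_step(pairs):
    """One corner-cutting refinement of a closed polygon:
    each edge contributes the averages at weights 1/4 and 3/4."""
    m = len(pairs)
    out = []
    for i in range(m):
        a, b = pairs[i], pairs[(i + 1) % m]
        out.append(avg(a, b, 0.25))
        out.append(avg(a, b, 0.75))
    return out

# A square with outward unit normals, refined twice toward a smooth curve.
pts = [(1, 1), (-1, 1), (-1, -1), (1, -1)]
pairs = [(np.array(p, float), np.array(p, float) / np.sqrt(2)) for p in pts]
for _ in range(2):
    pairs = chaikin_step(pairs)
```

Swapping `avg` for a different binary averaging operation with the same weights changes the scheme while reusing the same refinement structure, which is the point of the construction.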

05/11/2020 - 13:30