Colloquium (C)

Hadar Frenkel - Verification of Complex Hyperproperties

Hyperproperties are system properties that relate multiple execution traces to one another. They are essential for expressing a wide range of system requirements, such as information-flow and security policies; epistemic properties like knowledge in multi-agent systems; fairness; and robustness.
When aiming to verify program correctness, the two major challenges are (1) providing a specification language that can precisely express the desired properties, and (2) providing scalable verification algorithms.
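
As a generic illustration (my own example, not necessarily one from the talk), the classical information-flow policy of observational determinism can be written in the hyperlogic HyperLTL, which quantifies over multiple traces π, π′:

```latex
% Observational determinism: any two traces that agree on the low-security
% inputs must agree on the low-security outputs at every step.
\forall \pi.\, \forall \pi'.\;
  \bigl(I^{\mathit{low}}_{\pi} = I^{\mathit{low}}_{\pi'}\bigr)
  \;\rightarrow\;
  \mathbf{G}\,\bigl(O^{\mathit{low}}_{\pi} = O^{\mathit{low}}_{\pi'}\bigr)
```

No single trace can satisfy or violate this requirement on its own; it constrains pairs of traces, which is exactly what makes hyperproperties harder to specify and verify than ordinary trace properties.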

11/01/2024 - 13:30

Ariel Kulik - Constrained Resource Allocation via Iterative Randomized Rounding

Constrained resource allocation problems can often be modeled as variants of the classic Bin Packing and Knapsack problems. The study of these problems has had a great impact on the development of algorithmic tools, ranging from simple dynamic programming to involved linear programs and rounding techniques. I will present a new and simple algorithmic approach for obtaining efficient approximations for such problems, based on iterative randomized rounding of Configuration LPs.
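
Below is a minimal sketch of the iterative-randomized-rounding idea on a toy Bin Packing instance, under my own simplifying assumptions (configurations are enumerated explicitly and the Configuration LP is solved with scipy); it is meant to convey the shape of the approach, not the speaker's actual algorithm or its analysis.

```python
import itertools
import random

from scipy.optimize import linprog


def configurations(items, capacity):
    """All inclusion-maximal subsets of the items that fit into one bin."""
    names = list(items)
    feasible = []
    for r in range(1, len(names) + 1):
        for subset in itertools.combinations(names, r):
            if sum(items[i] for i in subset) <= capacity:
                feasible.append(frozenset(subset))
    # Keeping only maximal configurations shrinks the LP without losing coverage.
    return [c for c in feasible if not any(c < d for d in feasible)]


def solve_configuration_lp(items, capacity):
    """Fractional Configuration LP: minimize the number of bins used,
    subject to every item being covered by some chosen configuration."""
    configs = configurations(items, capacity)
    names = list(items)
    # Coverage constraint for item i:  -sum_{C containing i} x_C <= -1.
    A_ub = [[-1.0 if i in c else 0.0 for c in configs] for i in names]
    b_ub = [-1.0] * len(names)
    res = linprog(c=[1.0] * len(configs), A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * len(configs), method="highs")
    return configs, list(res.x)


def iterative_randomized_rounding(items, capacity, rng=None):
    """Each round: re-solve the residual LP, sample one configuration with
    probability proportional to its LP value, pack it, and recurse."""
    rng = rng or random.Random(0)
    remaining = dict(items)
    bins = []
    while remaining:
        configs, x = solve_configuration_lp(remaining, capacity)
        r, acc, chosen = rng.uniform(0, sum(x)), 0.0, configs[-1]
        for c, xc in zip(configs, x):
            acc += xc
            if r <= acc:
                chosen = c
                break
        bins.append(sorted(chosen))
        for i in chosen:
            remaining.pop(i, None)
    return bins


if __name__ == "__main__":
    toy_items = {"a": 0.5, "b": 0.6, "c": 0.4, "d": 0.3, "e": 0.2}
    print(iterative_randomized_rounding(toy_items, capacity=1.0))
```

Each round packs one sampled configuration and re-solves the LP on the residual instance, so the rounds are only loosely coupled; the approximation guarantees of such schemes come from the analysis, which is not reproduced here.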

15/02/2024 - 11:30

Bracha Laufer - Leveraging Structures in Complex Spaces for Robust, Reliable and Efficient Learning

With the great promise of the data-science revolution, a major open question is whether the underlying models are efficient and trustworthy, especially when deployed in complex real-world settings. In this talk, I will show how these challenges can be approached from a data-driven perspective, relying on the geometry and the statistical patterns hidden in the data to establish strong notions of robustness and reliability.

12/01/2023 - 11:30

Moshe Babaioff - Complexity-Performance Tradeoffs in Mechanism Design

Online computational platforms that directly engage with users must account for the strategic behavior of self-interested individuals. The goal of mechanism design is to optimize an objective, such as efficiency or revenue, in such scenarios, i.e., when the agents that participate in the mechanisms act strategically. In many fundamental computational settings, the theoretically optimal mechanisms are highly complex and thus not practical.
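
As a point of contrast with such highly complex optimal mechanisms, here is a sketch of one of the simplest mechanisms there is, the Vickrey second-price auction (a generic textbook example, not taken from the talk; the bidders and values are hypothetical): it is truthful and welfare-efficient, but in general not revenue-optimal.

```python
from typing import Dict, Tuple


def second_price_auction(bids: Dict[str, float]) -> Tuple[str, float]:
    """Single-item sealed-bid auction: the highest bidder wins and pays the
    second-highest bid (or 0 if there is only one bidder)."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price


if __name__ == "__main__":
    # Under this rule, bidding one's true value is a dominant strategy,
    # so the item goes to whoever values it most.
    print(second_price_auction({"alice": 10.0, "bob": 7.0, "carol": 4.0}))
```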

05/01/2023 - 13:30

Amit Levi - On structure and performance in the era of (really) big data

The influx of data witnessed during the last decade has given rise to groundbreaking applications in data science and machine learning. However, due to hardware constraints, the volume of data grows much faster than the available computational resources. This modern setting poses new challenges for algorithm design, as more efficient methods are needed. One way to obtain such methods is by exploiting the underlying structure of the data.

26/01/2023 - 11:30

Amos Korman - An Algorithmic Perspective to Collective Behavior

In this talk, I will present a new interdisciplinary approach that I have been developing in recent years, aiming to build a bridge between the fields of algorithm theory and collective (animal) behavior. Ideally, an algorithmic perspective on biological phenomena can provide a level of fundamental understanding that is difficult to achieve using typical computational tools employed in this area of research (e.g., differential equations or computer simulations).

26/01/2023 - 13:30

Itay Safran - The Interconnection Between Approximation, Optimization and Generalization in Deep Learning Theory

The modern study of deep learning theory can be crudely partitioned into three major aspects: approximation, which is concerned with the ability of a given neural network architecture to approximate various objective functions; optimization, which asks when we can or cannot guarantee that a certain optimization algorithm will converge to a network with small empirical loss; and generalization, which asks how well the trained network generalizes to previously unseen examples.
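
One rough way to see how the three aspects interact (a standard textbook decomposition, not necessarily the framing used in the talk) is to split the excess risk of the trained network over the best possible predictor into three terms:

```latex
% H is the class of functions realizable by the architecture, h_ERM is the
% empirical risk minimizer over H, \hat{h} is the trained network, h^* is the
% best possible predictor, and L denotes the population loss.
L(\hat{h}) - L(h^*)
  = \underbrace{L(\hat{h}) - L(h_{\mathrm{ERM}})}_{\text{optimization}}
  + \underbrace{L(h_{\mathrm{ERM}}) - \inf_{h \in H} L(h)}_{\text{generalization}}
  + \underbrace{\inf_{h \in H} L(h) - L(h^*)}_{\text{approximation}}
```

Bounding each term separately is, roughly, where the three lines of work meet.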

19/01/2023 - 13:30

Or Zamir - Algorithmic Applications of Hypergraph and Partition Containers

We present a general method to convert algorithms into faster algorithms for almost-regular input instances. Informally, an almost-regular input is an input in which the maximum degree is larger than the average degree by at most a constant factor. This family of inputs vastly generalizes several families of inputs for which we commonly have improved algorithms, including bounded-degree inputs and random inputs.
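
To make the definition concrete, here is a tiny check of the almost-regularity condition for a graph given as an edge list (a toy illustration of the definition only, not anything from the talk; the constant C is an arbitrary choice).

```python
from collections import Counter


def is_almost_regular(edges, C=2.0):
    """Return True if the maximum degree is at most C times the average
    degree 2|E| / |V| of the graph given as an edge list."""
    degree = Counter()
    m = 0
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
        m += 1
    if not degree:
        return True
    avg = 2.0 * m / len(degree)
    return max(degree.values()) <= C * avg


if __name__ == "__main__":
    path = [(i, i + 1) for i in range(9)]   # degrees 1 or 2: almost regular
    star = [(0, i) for i in range(1, 10)]   # one hub of degree 9: far from regular
    print(is_almost_regular(path), is_almost_regular(star))
```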

29/12/2022 - 13:30