Yehuda Dar - Generalization in Overparameterized Machine Learning


Deep neural networks are highly overparameterized models: they typically have many more parameters than the number of training examples. Such overparameterized models are usually trained to fit their training data perfectly; yet, in sharp contrast to conventional machine learning guidelines, they still generalize extremely well to inputs outside their training dataset. This practical generalization performance motivates numerous foundational questions that have not yet been answered even for much simpler models, such as linear regression.
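
To make the setting concrete, here is a minimal NumPy sketch of this phenomenon in overparameterized linear regression (not from the talk; the spiked feature covariance, the dimensions, and the noise level below are illustrative assumptions). The minimum-norm solution fits noisy training labels exactly, yet its test error stays far below the label variance:

```python
# Minimal sketch: benign overfitting of the minimum-norm interpolator.
# Assumptions (illustrative only): spiked feature covariance, planted sparse signal.
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 500, 5                      # n training examples, d >> n features
scales = np.r_[np.full(k, 10.0), np.full(d - k, 0.1)]   # k strong directions
w_true = np.r_[rng.normal(size=k), np.zeros(d - k)]     # signal in strong directions

X = rng.normal(size=(n, d)) * scales
y = X @ w_true + 0.5 * rng.normal(size=n)               # noisy training labels

w_hat = np.linalg.pinv(X) @ y             # minimum-l2-norm solution interpolates y
print("train MSE:", np.mean((X @ w_hat - y) ** 2))      # ~0: perfect fit of noisy data

X_te = rng.normal(size=(2000, d)) * scales
y_te = X_te @ w_true
print("test MSE:", np.mean((X_te @ w_hat - y_te) ** 2)) # small vs. the label variance
```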

This talk presents new analyses of the fundamental factors that affect generalization in overparameterized machine learning. Our results characterize the so-called “double descent” phenomenon, which extends the classical bias-variance tradeoff, in several overparameterized learning settings. First, we consider a transfer learning process between related, overparameterized source and target linear regression problems. Our theory characterizes the generalization performance and hence the cases in which transfer learning is beneficial. Second, we consider linear subspace learning problems for dimensionality reduction and data generation. We define semi-supervised and fully supervised extensions of the common unsupervised forms of these subspace learning problems and demonstrate that overparameterization, in conjunction with supervision, can improve generalization performance.
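
As a rough companion illustration of double descent (again an illustrative sketch, not the talk's analysis; the dimensions and the planted linear model are assumptions), sweeping the number of features used by a minimum-norm least-squares fit typically shows the test error peaking near the interpolation threshold d = n and descending again in the overparameterized regime:

```python
# Minimal double-descent sketch: test error of minimum-norm least squares
# as the number of features d grows past the number of samples n.
# Assumptions (illustrative only): isotropic Gaussian features, planted linear model.
import numpy as np

rng = np.random.default_rng(1)
n, d_full, noise = 40, 400, 0.5
w_true = rng.normal(size=d_full) / np.sqrt(d_full)

X = rng.normal(size=(n, d_full))
y = X @ w_true + noise * rng.normal(size=n)
X_te = rng.normal(size=(1000, d_full))
y_te = X_te @ w_true

for d in (10, 20, 35, 40, 45, 80, 200, 400):            # model-size sweep
    w_hat = np.linalg.pinv(X[:, :d]) @ y                # min-norm fit on first d features
    mse = np.mean((X_te[:, :d] @ w_hat - y_te) ** 2)
    print(f"d={d:4d}  test MSE={mse:8.3f}")             # peak near d = n, then descent
```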


Date and Time: 
Thursday, November 18, 2021 - 13:30 to 14:30
Speaker: 
Yehuda Dar
Location: 
C110
Speaker Bio: 

Yehuda Dar is a postdoctoral research associate in the Electrical and Computer Engineering Department at Rice University, working on topics in modern machine learning theory. Before that, he was a postdoctoral fellow in the Computer Science Department of the Technion — Israel Institute of Technology, where he also received his PhD in 2018. Yehuda earned his MSc in Electrical Engineering and his BSc in Computer Engineering, both also from the Technion. His main research interests are in machine learning, signal and image processing, optimization, and data compression.