Elior Sulem - Learning with Less Data and Labels for Language Acquisition and Understanding
Natural Language Processing has attracted considerable interest in recent years and has seen large improvements with the use of contextualized neural language models. However, this progress is limited to specific tasks for which large datasets are available, and models are often brittle outside of these datasets. Moreover, current models are usually pretrained on extremely large unlabeled corpora, which limits our understanding of low-resource scenarios and makes their use unfeasible for many in academia and industry.