_posts/reading-group/fall-2024/2024-10-18-eva-portelance.md
---
title: "Reframing linguistic bootstrapping as joint inference using visually-grounded grammar induction models"
venue: HEC Montreal
names: Eva Portelance
author: Eva Portelance
tags:
- NLP RG
categories:
- Reading-Group
- Fall-2024
layout: archive
classes:
- wide
- no-sidebar
---

*{{ page.names }}*

**{{ page.venue }}**

{% include display-publication-links.html pub=page %}
The [NLP Reading Group]({% link _pages/reading-group.md %}) is excited to host [Eva Portelance](https://evaportelance.github.io/), an assistant professor at HEC Montréal, who will speak **in person** in Auditorium 2 at 11:30 AM on Friday, October 18th about **joint learning in language acquisition**.

## Talk Description

Semantic and syntactic bootstrapping posit that children use their prior knowledge of one linguistic domain, say syntactic relations, to help later acquire another, such as the meanings of new words. Empirical results supporting both theories may tempt us to believe that these are different learning strategies, where one may precede the other. Here, we argue that they are instead both contingent on a more general learning strategy for language acquisition: joint learning. Using a series of neural visually-grounded grammar induction models, we demonstrate that both syntactic and semantic bootstrapping effects are strongest when syntax and semantics are learnt simultaneously. Joint learning results in better grammar induction, realistic lexical category learning, and better interpretations of novel sentence and verb meanings. Joint learning makes language acquisition easier for learners by mutually constraining the hypothesis spaces for both syntax and semantics. Studying the dynamics of joint inference over many input sources and modalities represents an important new direction for language modeling and learning research in both cognitive sciences and AI, as it may help us explain how language can be acquired in more constrained learning settings.

## Speaker Bio

Eva Portelance is an Assistant Professor of machine learning in the Department of Decision Sciences at HEC Montréal and a member of Mila, the Québec Artificial Intelligence Institute. Her research intersects AI and cognitive science: she is interested in understanding how both humans and machines learn to understand language and reason about complex problems. Previously, she was a postdoc at Mila and McGill University with Timothy J. O'Donnell and Siva Reddy. She earned her Ph.D. in computational/cognitive linguistics from Stanford University, where she worked with Dan Jurafsky and Mike C. Frank.

## Logistics

Date: Friday, October 18th<br>
Time: 11:30 AM<br>
Location: Auditorium 2, or via Zoom (see email)