How Linguistics Learned to Stop Worrying and Love the Language Models
Overview
Explore a thought-provoking lecture by Kyle Mahowald of UT Austin that examines the complex relationship between linguistics and language models. Delve into the ongoing debate over whether language models truly learn language and what relevance they have to human learning and processing. Learn how language models can inform fundamental questions about linguistic structure while challenging traditional arguments about language learning. Discover insights from research on grammaticality judgments in language models and from controlled pretraining paradigms, in which small models are trained on systematically manipulated input corpora. Understand why neither extreme position (dismissing language models as irrelevant to linguistics, or claiming they render linguistic theory obsolete) is correct, and gain a balanced perspective on how these technologies can complement and deepen our understanding of linguistic theory and structure.
Syllabus
How Linguistics Learned to Stop Worrying and Love the Language Models
Taught by
Simons Institute