Facing the Curse of Dimensionality in Statistical Language Modeling using Distributed Representations
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Learn how to address the curse of dimensionality in statistical language modeling through distributed representations in this comprehensive lecture by renowned AI researcher Yoshua Bengio. Explore the fundamental challenges that arise when working with high-dimensional data in natural language processing and discover how distributed word representations can provide elegant solutions to these computational and statistical problems. Delve into the mathematical foundations of dimensionality reduction techniques and understand how neural network approaches can create more efficient and effective language models. Examine the theoretical underpinnings of distributed representations and their practical applications in overcoming the exponential growth of parameters that traditionally plague statistical language models. Gain insights into how these techniques revolutionized the field of computational linguistics and laid the groundwork for modern deep learning approaches to language understanding.
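To make the idea concrete, here is a minimal, illustrative sketch (not the lecture's actual implementation) of a neural probabilistic language model in the style Bengio describes: each word in a small hypothetical vocabulary gets a learned embedding vector, the embeddings of the previous n words are concatenated and fed through a hidden layer, and a softmax produces a distribution over the next word. All sizes and weight initializations below are made-up toy values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical): vocabulary of 10 words, 4-dim embeddings,
# a context of 3 previous words, and 8 hidden units.
V, d, n, h = 10, 4, 3, 8

C = rng.normal(scale=0.1, size=(V, d))      # shared embedding matrix (V*d parameters)
H = rng.normal(scale=0.1, size=(h, n * d))  # hidden-layer weights
U = rng.normal(scale=0.1, size=(V, h))      # output weights

def predict_next(context_ids):
    """Return a probability distribution over the next word, given n word ids."""
    x = C[context_ids].reshape(-1)          # concatenate the n context embeddings
    a = np.tanh(H @ x)                      # hidden representation
    logits = U @ a                          # one score per vocabulary word
    e = np.exp(logits - logits.max())       # numerically stable softmax
    return e / e.sum()

p = predict_next([1, 5, 7])                 # distribution over all V words
```

The parameter count here grows as V*d + h*n*d + V*h, i.e. only linearly in the context length n, whereas a traditional n-gram table grows like V^n: this is precisely the curse of dimensionality that distributed representations sidestep.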
Syllabus
Yoshua Bengio: Facing the Curse of Dimensionality in Statistical Language Modeling using Distributed Representations
Taught by
Center for Language & Speech Processing (CLSP), JHU