Overview
Explore fundamental concepts of probabilistic modeling in natural language processing through this comprehensive lecture from Johns Hopkins University's Summer School on Human Language Technology. Delve into the mathematical foundations of probability theory as applied to computational linguistics, examining how statistical methods can be used to model and understand human language patterns. Learn about various types of language models, their construction, and their applications in speech recognition, machine translation, and other NLP tasks. Discover how probabilistic frameworks enable computers to handle the inherent uncertainty and ambiguity in natural language, while gaining insights into the theoretical underpinnings that drive modern language processing systems. Master key concepts including probability distributions, parameter estimation, and model evaluation techniques that form the backbone of contemporary computational linguistics research and applications.
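To make the ideas concrete, here is a minimal sketch of the kind of probabilistic language model the lecture covers: a bigram model trained by counting, with add-one (Laplace) smoothing for parameter estimation. This example is illustrative and not taken from the lecture itself; the corpus, function names, and the choice of add-one smoothing are all assumptions for demonstration.

```python
from collections import Counter

def train_bigram(sentences):
    """Count unigrams and bigrams over sentences padded with <s>/</s> markers."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        unigrams.update(tokens)
        bigrams.update(zip(tokens, tokens[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, prev, word):
    """P(word | prev) with add-one smoothing over the observed vocabulary."""
    vocab_size = len(unigrams)
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

# Toy corpus (hypothetical): two three-word sentences.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram(corpus)
p = bigram_prob(uni, bi, "the", "cat")  # → 0.25: (1 + 1) / (2 + 6)
```

Smoothing matters because a maximum-likelihood estimate assigns zero probability to any bigram unseen in training, which would make every novel sentence impossible; handling that uncertainty is exactly the motivation for the probabilistic frameworks described above.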
Syllabus
Jason Eisner: Probabilities and language models
Taught by
Center for Language & Speech Processing (CLSP), JHU