Overview
Explore fundamental concepts of probabilistic modeling in natural language processing through this comprehensive lecture from Johns Hopkins University's Summer School on Human Language Technology. Delve into the mathematical foundations of probability theory as applied to computational linguistics, examining how statistical methods can be used to model and understand human language patterns. Learn about various types of language models, their construction, and their applications in speech recognition, machine translation, and other NLP tasks. Discover how probabilistic frameworks enable computers to handle the inherent uncertainty and ambiguity in natural language, while gaining insights into the theoretical underpinnings that drive modern language processing systems. Master key concepts including probability distributions, parameter estimation, and model evaluation techniques that form the backbone of contemporary computational linguistics research and applications.
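To make the ideas of probability distributions over sentences and parameter estimation concrete, here is a minimal sketch of a bigram language model trained by maximum likelihood on a toy corpus. This is an illustrative assumption of the simplest model family the lecture covers, not Eisner's own code; the corpus, function names, and `<s>`/`</s>` boundary symbols are all invented for the example.

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Estimate bigram probabilities P(w2 | w1) by maximum likelihood."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        tokens = ["<s>"] + sent.split() + ["</s>"]
        unigrams.update(tokens[:-1])           # count each conditioning context
        bigrams.update(zip(tokens, tokens[1:]))  # count adjacent word pairs
    # MLE: P(w2 | w1) = count(w1, w2) / count(w1)
    return {(w1, w2): c / unigrams[w1] for (w1, w2), c in bigrams.items()}

def sentence_prob(model, sentence):
    """Probability of a sentence as a product of bigram probabilities
    (unseen bigrams get probability 0 -- real systems smooth instead)."""
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    p = 1.0
    for pair in zip(tokens, tokens[1:]):
        p *= model.get(pair, 0.0)
    return p

corpus = ["the dog barks", "the cat meows", "the dog runs"]
lm = train_bigram_lm(corpus)
print(lm[("the", "dog")])                 # 2/3: two of three "the" tokens are followed by "dog"
print(sentence_prob(lm, "the dog barks"))  # 1/3
```

The zero probability assigned to unseen bigrams is exactly the sparsity problem that motivates the smoothing and evaluation techniques discussed in the lecture.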
Syllabus
Jason Eisner: Probabilities and language models
Taught by
Center for Language & Speech Processing (CLSP), JHU