Expectation Maximization - Introduction and Examples
Computational Genomics Summer Institute CGSI via YouTube
Overview
Learn the fundamentals of the Expectation Maximization (EM) algorithm through this 38-minute lecture from the Computational Genomics Summer Institute. Explore the theoretical foundations of EM as a powerful statistical method for finding maximum likelihood estimates when dealing with incomplete data. Discover how this iterative algorithm alternates between expectation and maximization steps to converge on optimal parameter estimates. Examine practical applications and worked examples that demonstrate the algorithm's utility in computational genomics and related fields. Gain insights into the mathematical framework underlying EM, including its relationship to maximum likelihood estimation and its convergence properties. Study key research contributions that shaped the development of EM algorithms, from the foundational 1977 paper by Dempster, Laird, and Rubin to modern extensions and variations. Understand connections to MM (Majorization-Minimization) algorithms and explore real-world applications such as emission tomography reconstruction that showcase the algorithm's versatility across different domains.
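The alternation between expectation and maximization steps described above can be sketched with the classic textbook example (not taken from the lecture itself): EM for a two-component one-dimensional Gaussian mixture. All parameter names and initialization choices here are illustrative assumptions.

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """Illustrative EM sketch for a two-component 1-D Gaussian mixture."""
    # Crude initialization: equal mixing weight, means at the data extremes.
    pi, mu1, mu2, var1, var2 = 0.5, min(data), max(data), 1.0, 1.0
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point.
        resp = []
        for x in data:
            p1 = pi * math.exp(-(x - mu1) ** 2 / (2 * var1)) / math.sqrt(2 * math.pi * var1)
            p2 = (1 - pi) * math.exp(-(x - mu2) ** 2 / (2 * var2)) / math.sqrt(2 * math.pi * var2)
            resp.append(p1 / (p1 + p2))
        # M-step: re-estimate parameters from the soft assignments.
        n1 = sum(resp)
        n2 = len(data) - n1
        mu1 = sum(r * x for r, x in zip(resp, data)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / n2
        var1 = sum(r * (x - mu1) ** 2 for r, x in zip(resp, data)) / n1
        var2 = sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, data)) / n2
        pi = n1 / len(data)
    return pi, mu1, mu2

# Synthetic data from two well-separated Gaussians; EM should recover the means.
random.seed(0)
data = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(5, 1) for _ in range(200)]
pi, mu1, mu2 = em_gmm_1d(data)
```

Each iteration is guaranteed not to decrease the observed-data likelihood, which is the convergence property of EM established in the 1977 Dempster, Laird, and Rubin paper referenced above.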
Syllabus
Saharon Rosset | Expectation Maximization: Intro and Examples | CGSI 2025
Taught by
Computational Genomics Summer Institute CGSI