Deterministic Annealing for Clustering, Classification and Speech Recognition
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Explore the deterministic annealing approach to clustering and its extensions in this comprehensive lecture from the Center for Language & Speech Processing at Johns Hopkins University. Delve into the method's three key features: avoiding poor local optima, applicability to various structures, and ability to minimize complex cost functions. Gain insights into the probabilistic framework and information theoretic principles underlying the approach, including maximum entropy and random coding. Discover the analogy to statistical physics and the connection to rate-distortion theory, providing new perspectives on both the method and theory. Learn how structural constraints are incorporated to optimize popular structures like vector quantizers, decision trees, multilayer perceptrons, radial basis functions, and mixtures of experts. Examine experimental results demonstrating significant performance improvements over standard training methods in applications such as compression, estimation, pattern recognition, classification, and statistical regression. Conclude with a brief overview of ongoing research and extensions to the deterministic annealing method.
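The clustering procedure the lecture covers can be summarized as: associate each point with every cluster center via Gibbs probabilities at a temperature T, update centers as probability-weighted means, and gradually lower T so the solution anneals from one merged cluster toward hard assignments. A minimal Python sketch of this iteration follows; the function name `da_cluster`, the cooling schedule, and the symmetry-breaking perturbation are illustrative assumptions, not details taken from the lecture itself.

```python
import numpy as np

def da_cluster(X, k, T0=10.0, Tmin=0.01, alpha=0.9, iters=50, seed=0):
    """Deterministic-annealing clustering sketch (assumed parameters)."""
    rng = np.random.default_rng(seed)
    # Start all centroids near the data mean; at high T they coincide,
    # then split at critical temperatures as T is lowered.
    Y = X.mean(axis=0) + 1e-3 * rng.standard_normal((k, X.shape[1]))
    T = T0
    while T > Tmin:
        for _ in range(iters):
            # Squared Euclidean distances from every point to every centroid.
            d = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            # Gibbs association probabilities p(j|x) ∝ exp(-d(x, y_j) / T),
            # shifted by the row minimum for numerical stability.
            p = np.exp(-(d - d.min(axis=1, keepdims=True)) / T)
            p /= p.sum(axis=1, keepdims=True)
            # Centroid update: probability-weighted means of the data.
            w = p.sum(axis=0) + 1e-12
            Y = (p.T @ X) / w[:, None]
        T *= alpha  # cool the temperature
        # Tiny perturbation so coinciding centroids can split (phase transition).
        Y += 1e-4 * rng.standard_normal(Y.shape)
    return Y, p.argmax(axis=1)
```

As T approaches zero the associations harden and the update reduces to ordinary k-means, which is why the method tends to avoid the poor local optima that plague standard k-means initialization.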
Syllabus
Deterministic Annealing for Clustering, Classification and Speech Recognition - Kenneth Rose - 2001
Taught by
Center for Language & Speech Processing (CLSP), JHU