Learning Under Differing Training and Test Distributions
Center for Language & Speech Processing (CLSP), JHU via YouTube
Overview
Learn about the fundamental challenges that arise in machine learning when training and test data come from different distributions in this lecture by Tobias Scheffer of the Max Planck Institute for Computer Science. Explore the theoretical foundations and practical implications of distribution shift, a critical problem in real-world applications where models must perform well on data that differs from their training set. Examine approaches to domain adaptation, covariate shift, and concept drift, and see how these distribution mismatches affect model performance and generalization. Discover techniques for detecting and measuring distribution differences, methods for adapting models to new domains, and strategies for building more robust learning systems. Gain insight into the mathematical frameworks used to analyze these problems and their solutions, including importance weighting, domain adaptation algorithms, and transfer learning. This workshop presentation from Johns Hopkins University's Center for Language & Speech Processing offers both theoretical depth and practical guidance for one of machine learning's most pervasive challenges.
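The importance-weighting approach named above can be sketched in a few lines. The following is a minimal illustration, not material from the lecture itself: the synthetic data and all variable names are assumptions, and scikit-learn's LogisticRegression stands in both as a domain classifier (to estimate the density ratio p_test(x)/p_train(x), in the spirit of discriminative covariate-shift correction) and as the reweighted task model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Covariate shift: p_train(x) != p_test(x), but p(y|x) is shared.
# Importance weighting corrects the training loss with w(x) = p_test(x) / p_train(x).
# A common trick estimates w(x) with a classifier that discriminates
# test inputs from training inputs (domain labels, not task labels).

rng = np.random.default_rng(0)
X_train = rng.normal(loc=0.0, scale=1.0, size=(500, 2))    # source inputs
X_test = rng.normal(loc=1.0, scale=1.0, size=(500, 2))     # shifted target inputs
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)  # shared labeling rule

# Domain classifier: 0 = train, 1 = test.
X_domain = np.vstack([X_train, X_test])
d_domain = np.concatenate([np.zeros(len(X_train)), np.ones(len(X_test))])
domain_clf = LogisticRegression().fit(X_domain, d_domain)

# w(x) = p(test | x) / p(train | x), which is proportional to
# p_test(x) / p_train(x) when the two samples are the same size.
p_test_given_x = domain_clf.predict_proba(X_train)[:, 1]
weights = p_test_given_x / (1.0 - p_test_given_x)

# Train the task model on the source data, reweighted toward the target.
task_clf = LogisticRegression().fit(X_train, y_train, sample_weight=weights)
```

If the training and test samples differ in size, the odds ratio above should additionally be scaled by n_train/n_test; here both samples are the same size, so the ratio is used directly.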
Syllabus
Tobias Scheffer: Learning Under Differing Training and Test Distributions
Taught by
Center for Language & Speech Processing (CLSP), JHU