Overview
Learn about research findings on improving machine learning model performance in this AutoML Seminars presentation by Thomas Nagler and Lennart Schneider. Explore how reshuffling resampling splits during hyperparameter optimization can improve model generalization on unseen data, challenging the conventional wisdom of keeping the train-validation split fixed. Dive into a theoretical analysis of the asymptotic behavior of validation loss surfaces and expected regret bounds, connecting the benefits of reshuffling to the signal and noise characteristics of the problem. Examine evidence from both controlled simulations and large-scale experiments showing how reshuffling can make a single train-validation holdout protocol more competitive with cross-validation while reducing computational cost. Master key insights for implementing more effective hyperparameter optimization strategies in machine learning workflows during this 46-minute technical discussion.
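The core idea discussed in the talk, drawing a fresh train-validation split for each hyperparameter evaluation instead of reusing one fixed split, can be sketched in a few lines of NumPy. This is an illustrative toy example (ridge regression with a synthetic dataset), not the authors' implementation; all data, sizes, and seeds here are assumptions for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression problem (illustrative only).
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + rng.normal(scale=1.0, size=n)

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression weights."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(k), X.T @ y)

def holdout_loss(alpha, split_seed):
    """Validation MSE of one candidate alpha on an 80/20 holdout split."""
    idx = np.random.default_rng(split_seed).permutation(n)
    tr, va = idx[:160], idx[160:]
    w = ridge_fit(X[tr], y[tr], alpha)
    resid = y[va] - X[va] @ w
    return float(np.mean(resid ** 2))

alphas = np.logspace(-3, 3, 25)

# Fixed protocol: every candidate is scored on the same split.
fixed = [holdout_loss(a, split_seed=42) for a in alphas]

# Reshuffled protocol: each candidate evaluation draws a new split.
reshuffled = [holdout_loss(a, split_seed=i) for i, a in enumerate(alphas)]

best_fixed = alphas[int(np.argmin(fixed))]
best_reshuffled = alphas[int(np.argmin(reshuffled))]
print(best_fixed, best_reshuffled)
```

With a fixed split, the optimizer can overfit the idiosyncrasies of that one validation set; reshuffling adds noise to each evaluation but, as the talk argues, can improve the generalization of the selected configuration.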
Syllabus
Reshuffling Resampling Splits Can Improve Generalization of Hyperparameter Optimization
Taught by
AutoML Seminars