Early Stopping for Reaching Optimality While Saving Computational Resources
INI Seminar Room 2 via YouTube
Overview
Explore early stopping techniques for achieving optimal performance while conserving computational resources in this seminar by Professor Alain Celisse from Université Paris 1. Learn how to strategically halt iterative algorithms at the right moment to balance computational efficiency with model performance, a crucial skill in modern machine learning and statistical optimization. Discover the theoretical foundations behind early stopping criteria and understand when and why to implement these techniques in various computational scenarios. Examine practical applications where early stopping can significantly reduce computational costs without sacrificing solution quality. Gain insights into the mathematical principles that govern the trade-off between computational resources and algorithmic convergence. This presentation is part of the "Representing, calibrating & leveraging prediction uncertainty from statistics to machine learning" event at the Isaac Newton Institute, offering advanced perspectives on computational optimization strategies that are essential for efficient machine learning implementations.
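To make the trade-off described above concrete, here is a minimal sketch (not taken from the talk) of early stopping for an iterative solver: gradient descent on a least-squares problem is halted once a held-out validation error stops improving for a fixed number of steps, so computation ends near the statistically useful iterate rather than running to full numerical convergence. The function name, the `patience` rule, and all parameter values are illustrative assumptions.

```python
import numpy as np

def gd_early_stop(X_tr, y_tr, X_val, y_val,
                  lr=0.01, max_iter=5000, patience=20):
    """Gradient descent on least squares, halted by validation error.

    Illustrative sketch: stop once the validation MSE has not improved
    for `patience` consecutive iterations, returning the best iterate.
    """
    w = np.zeros(X_tr.shape[1])
    best_err, best_w, since_best = np.inf, w.copy(), 0
    for t in range(max_iter):
        # Gradient of the mean squared error on the training set.
        grad = X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
        w -= lr * grad
        err = np.mean((X_val @ w - y_val) ** 2)
        if err < best_err:
            best_err, best_w, since_best = err, w.copy(), 0
        else:
            since_best += 1
            if since_best >= patience:   # no improvement for `patience` steps
                return best_w, t + 1     # stop early
    return best_w, max_iter

# Synthetic regression problem to exercise the stopping rule.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.standard_normal(200)
w_hat, n_iter = gd_early_stop(X[:150], y[:150], X[150:], y[150:])
```

The stopping rule trades a small amount of optimization accuracy for a potentially large saving in iterations; the seminar's theoretical criteria aim to choose that stopping time in a principled, data-driven way rather than via an ad hoc patience counter.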
Syllabus
Date: 18th Jun 2025 - 17:00 to 18:30
Taught by
Professor Alain Celisse, Université Paris 1