Overview
Explore the principles and applications of parsimony learning in deep neural networks in this 51-minute conference talk from ICBS2025. Delve into techniques for building more efficient and interpretable deep learning models by incorporating parsimony principles that emphasize simplicity and sparsity. Learn how to balance model complexity against performance, understand regularization methods that promote sparse representations, and discover approaches for reducing computational overhead while maintaining accuracy. Examine the theoretical foundations of parsimony in machine learning, practical implementation strategies for various deep network architectures, and real-world applications where parsimonious models offer advantages over traditional dense networks. Gain insight into cutting-edge research on model compression, feature selection, and architectural design principles that lead to more deployable deep learning systems.
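As a concrete illustration of one idea mentioned above, the sketch below (not taken from the talk; all parameter values are assumptions for illustration) shows how an L1 penalty promotes sparse representations: proximal gradient descent (ISTA) on a least-squares problem drives most weights to exactly zero, keeping only the few features that matter.

```python
import numpy as np

# Hypothetical setup: 100 samples, 20 features, only 3 truly relevant.
rng = np.random.default_rng(0)
n, d = 100, 20
X = rng.normal(size=(n, d))
true_w = np.zeros(d)
true_w[:3] = [2.0, -1.5, 1.0]           # sparse ground-truth weights
y = X @ true_w + 0.01 * rng.normal(size=n)

lam = 0.3                               # L1 penalty strength (assumed)
L = np.linalg.norm(X, ord=2) ** 2 / n   # Lipschitz constant of the gradient
step = 1.0 / L
w = np.zeros(d)
for _ in range(500):
    grad = X.T @ (X @ w - y) / n        # gradient of the squared-error loss
    w = w - step * grad
    # Soft-thresholding: the proximal operator of the L1 norm,
    # which zeroes out small weights and yields a sparse solution.
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)

print("nonzero weights:", int(np.count_nonzero(np.abs(w) > 1e-8)), "of", d)
```

The same soft-thresholding idea underlies many of the sparsity-inducing regularizers used for pruning and compressing deep networks, applied there to layer weights rather than to a linear model.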
Syllabus
Quanming Yao: Parsimony Learning from Deep Networks #ICBS2025
Taught by
BIMSA