Overview
Watch a 24-minute conference talk from INSAIT 2022 in which Prof. Martin Jaggi (EPFL) explores the evolution and challenges of decentralized learning systems. The talk introduces key concepts including gradient descent, federated learning, and Byzantine-robust training, then traces the progression from federated to decentralized approaches, examining how momentum and historical gradient information can make training more robust to faulty or malicious workers. It also covers collaborative learning frameworks, with special attention to mean estimation and personalized optimization, moving from theoretical foundations to practical applications, including an analysis of Byzantine workers in the graph topology and the implications for system resilience.
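To give a flavor of the robust-aggregation problem the talk addresses, here is a minimal sketch (not taken from the talk; all numbers are hypothetical) of why plain averaging of worker gradients fails under Byzantine workers, while a coordinate-wise median, one classic robust aggregation rule, does not:

```python
import numpy as np

# Hypothetical toy setup: 7 honest workers report noisy copies of the
# true gradient, 2 Byzantine workers send arbitrary large vectors.
rng = np.random.default_rng(0)
true_grad = np.array([1.0, -2.0, 0.5])

honest = [true_grad + 0.01 * rng.standard_normal(3) for _ in range(7)]
byzantine = [np.full(3, 1e6), np.full(3, 1e6)]
grads = np.stack(honest + byzantine)

mean_est = np.mean(grads, axis=0)      # dominated by the Byzantine vectors
median_est = np.median(grads, axis=0)  # stays close to the true gradient

print(np.linalg.norm(mean_est - true_grad))    # enormous error
print(np.linalg.norm(median_est - true_grad))  # small error
```

With 7 honest out of 9 workers, the per-coordinate median always lands on an honest value, which is the basic intuition behind median-style robust aggregators; the talk discusses more refined rules and their limitations.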
Syllabus
Intro
Gradient Descent
Federated Learning
Evolution
Byzantine Robust Training
Fix: Using history with momentum
Robustness theorem
Robust training: from federated towards decentralized
Byzantine workers in the graph topology
Collaborative Learning
Special Case: Mean Estimation
Personalized learning/optimization
References
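The syllabus item "Fix: Using history with momentum" refers to making robust training tolerate noisy gradients by aggregating each worker's momentum buffer rather than its raw gradient. A minimal sketch of that idea, under assumed details (quadratic objective, median as a stand-in robust aggregator, hypothetical worker counts and constants):

```python
import numpy as np

rng = np.random.default_rng(1)
dim, n_honest = 2, 9
x = np.array([5.0, -3.0])          # parameters of f(x) = ||x||^2 / 2
momenta = [np.zeros(dim) for _ in range(n_honest + 1)]
beta, lr = 0.9, 0.1

for _ in range(300):
    # Honest workers report the true gradient (here simply x) plus noise;
    # one Byzantine worker reports a huge arbitrary vector every round.
    grads = [x + 0.01 * rng.standard_normal(dim) for _ in range(n_honest)]
    grads.append(np.full(dim, 1e6))
    # Each worker smooths its own gradient stream with momentum ...
    momenta = [beta * m + (1 - beta) * g for m, g in zip(momenta, grads)]
    # ... and the server robustly aggregates the momenta (median here).
    x = x - lr * np.median(np.stack(momenta), axis=0)

print(np.linalg.norm(x))  # close to the optimum at 0
```

Momentum averages each worker's history over time, shrinking the variance of honest reports so the robust aggregator can separate them from Byzantine ones more reliably; the talk's robustness theorem makes this precise.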
Taught by
INSAIT Institute