Overview
Watch a 24-minute conference talk from INSAIT 2022 in which Prof. Martin Jaggi from EPFL explores the evolution and challenges of decentralized learning systems. The talk covers key concepts including gradient descent, federated learning, and Byzantine-robust training, and traces the progression from federated to decentralized approaches, examining how momentum and historical gradient information can make training more robust. It also explores collaborative learning frameworks, with special attention to mean estimation and personalized optimization, moving from theoretical foundations to practical applications, including a detailed analysis of Byzantine workers in the graph topology and the implications for system resilience.
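To give a flavor of the ideas the talk combines, here is a minimal, self-contained sketch (not the method presented in the talk) of federated gradient descent where the server replaces the plain average of worker gradients with a coordinate-wise median. The setup, objective, and hyperparameters are illustrative assumptions; the point is that a robust aggregator tolerates a minority of Byzantine workers sending arbitrary gradients, which would badly skew a simple mean.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the talk's algorithm):
# 10 workers minimize the shared quadratic f(w) = ||w - target||^2 / 2.
# 2 of them are Byzantine and report arbitrary gradients; the server
# aggregates with a coordinate-wise median instead of a mean.

rng = np.random.default_rng(0)
d, n_workers, n_byzantine = 5, 10, 2
target = rng.normal(size=d)          # the true minimizer

def local_gradient(w, worker_id):
    if worker_id < n_byzantine:      # Byzantine workers send garbage
        return rng.normal(scale=100.0, size=d)
    return w - target                # honest gradient of the quadratic

w = np.zeros(d)
for step in range(200):
    grads = np.stack([local_gradient(w, i) for i in range(n_workers)])
    robust_grad = np.median(grads, axis=0)   # robust aggregation step
    w -= 0.1 * robust_grad           # standard gradient-descent update

# Despite the attackers, w converges close to the target.
print(np.linalg.norm(w - target))
```

With 8 identical honest gradients out of 10, the per-coordinate median always lands on the honest value, so the update behaves exactly like uncorrupted gradient descent here; averaging instead would inject the attackers' noise into every step.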
Syllabus
Intro
Gradient Descent
Federated Learning
Evolution
Byzantine Robust Training
Fix: Using history with momentum
Robustness theorem
Robust training: from federated towards decentralized
Byzantine workers in the graph topology
Collaborative Learning
Special Case: Mean Estimation
Personalized learning/optimization
References
Taught by
INSAIT Institute