NHR PerfLab Seminar: Scaling and accelerating LLM trainings
Overview
Attend this seminar to explore the fundamental principles and practical techniques for scaling and accelerating large language model (LLM) training processes. Learn about scaling laws and their role in understanding the rationale behind large-scale training initiatives. Discover how to effectively apply parallelization techniques by identifying where they deliver the greatest performance benefits. Explore low-precision training methods designed to maximize cluster performance and computational efficiency. The presentation, delivered by Andrea Pilzer, Ph.D. from NVIDIA AI Technology Center in Italy, provides insights into optimizing LLM training workflows for high-performance computing environments. Gain practical knowledge about balancing computational resources, training speed, and model performance when working with large-scale language models in distributed computing settings.
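The scaling laws mentioned in the overview can be illustrated with a minimal sketch. It assumes a Chinchilla-style parametric loss form; the constants below are illustrative assumptions taken from published scaling-law fits, not values from the seminar itself:

```python
# Hedged sketch of a power-law scaling ansatz for pretraining loss:
#   L(N, D) = E + A / N**alpha + B / D**beta
# where N = model parameters and D = training tokens.
# Constants are illustrative (Chinchilla-style fits), not from the seminar.

def predicted_loss(n_params: float, n_tokens: float,
                   E: float = 1.69, A: float = 406.4, B: float = 410.7,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    """Predicted pretraining loss under a power-law scaling ansatz."""
    return E + A / n_params**alpha + B / n_tokens**beta

# Growing both parameters and tokens lowers the predicted loss,
# which is the rationale behind large-scale training runs.
small = predicted_loss(1e9, 20e9)     # ~1B params, 20B tokens
large = predicted_loss(10e9, 200e9)   # ~10B params, 200B tokens
assert large < small
```

Under this kind of model, the predicted loss decreases smoothly as compute grows, which is why practitioners extrapolate from small pilot runs before committing a full cluster.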
Syllabus
NHR PerfLab Seminar: Scaling and accelerating LLM trainings
Taught by
NHR@FAU