Cache Friendly C++ - CPU Caches, Data Structures, and Performance Optimization
Meeting Cpp via YouTube
Overview
Explore CPU cache optimization techniques in this comprehensive conference talk that builds understanding from fundamental concepts to practical implementation strategies. Learn why std::vector should be your default container choice and discover the underlying mechanisms that make it cache-friendly. Examine how CPU caches work, their performance implications, and the sophisticated tricks processors use to maintain seamless operation. Understand when these optimization techniques fail and how to structure programs to avoid performance bottlenecks. Dive into cache-friendly data structures, data-oriented design principles, and strategies for avoiding common performance pitfalls that can significantly impact application speed. Gain practical insights for writing more efficient C++ code by leveraging cache locality and memory access patterns.
Syllabus
Cache Friendly C++ - Jonathan Müller - Meeting C++ 2025
Taught by
Meeting Cpp