Real-Time Inference of Neural Networks - A Guide for DSP Engineers - Part II
ADC - Audio Developer Conference via YouTube
Overview
Learn advanced techniques for implementing neural network inference in real-time audio applications through this conference talk from ADC 2024. Building on the foundational concepts from Part I, the talk covers the development of a comprehensive library that simplifies neural network deployment and integration in audio systems.

Discover methods for quantifying real-time violations within inference executions and understand why monitoring performance matters in real-time environments. Examine strategies for integrating inference engines in real-time audio systems, particularly when running multiple instances simultaneously using static thread pools and host-provided threads. Master techniques for achieving minimal latency in neural network inference, including controversial approaches that push the boundaries of performance optimization.

Analyze extensive benchmark results comparing different neural network architectures across various inference engines, revealing how factors such as input buffer size, model size, and previously executed inferences affect overall performance. The speakers are researchers at Technische Universität Berlin who specialize in applying neural networks to creative audio effects and synthesis in real-time and mixed-signal domains, combining theoretical knowledge with practical implementation strategies for DSP engineers working with modern AI-powered audio applications.
Syllabus
Real-Time Inference of Neural Networks: A Guide for DSP Engineers - Part II - ADC 2024
Taught by
ADC - Audio Developer Conference