Teaching Audio Developers How to Build AI-Enhanced Audio Plugins
ADC - Audio Developer Conference via YouTube
Overview
In this 24-minute conference talk from ADCx Gather 2024, Professor Matthew Yee-King reflects on his experience as an educator, developer, and musician working with AI and music technology. Explore an educationally focused workflow for developing AI-enhanced audio plugins using C++, JUCE, CMake, PyTorch, and RTNeural—designed specifically to support educators, students, and developers interested in integrating AI and machine learning into real-time audio applications. Discover various example plugins built using this workflow, including a MIDI improviser, a neural network synthesizer controller, and a neural effects unit.

Matthew Yee-King is a professor at Goldsmiths, University of London, programme director for the UoL Worldwide Computer Science Programme, and author of "Build AI-enhanced Audio Plugins with C++." His expertise spans education technology, AI-enhanced systems, and the application of AI to digital signal processing and music performance.
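To give a flavour of the kind of workflow the talk describes, here is a minimal sketch of a CMake configuration wiring a JUCE plugin project to RTNeural for real-time neural inference. All project names, codes, and directory paths below are assumptions for illustration, not taken from the talk; `juce_add_plugin` is JUCE's standard CMake entry point, and RTNeural exposes an `RTNeural` CMake target when added as a subdirectory.

```cmake
# Hypothetical build sketch: a JUCE plugin linking against RTNeural.
# Assumes JUCE and RTNeural are checked out as subdirectories of the project.
cmake_minimum_required(VERSION 3.22)
project(NeuralFxPlugin VERSION 0.1.0)

add_subdirectory(JUCE)      # path assumed; provides juce_add_plugin()
add_subdirectory(RTNeural)  # path assumed; provides the RTNeural target

juce_add_plugin(NeuralFxPlugin
    COMPANY_NAME "Example"          # placeholder
    PLUGIN_MANUFACTURER_CODE Exmp   # placeholder 4-char code
    PLUGIN_CODE Nfx1                # placeholder 4-char code
    FORMATS VST3 Standalone
    PRODUCT_NAME "Neural FX")

target_sources(NeuralFxPlugin PRIVATE
    Source/PluginProcessor.cpp
    Source/PluginEditor.cpp)

target_link_libraries(NeuralFxPlugin PRIVATE
    juce::juce_audio_utils
    RTNeural)
```

In a workflow like the one described, a model would typically be trained in PyTorch, exported to a JSON weights file, and loaded by RTNeural inside the plugin's audio processor for sample-by-sample inference.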
Syllabus
Teaching Audio Developers How to Build AI-Enhanced Audio Plugins - Matthew Yee-King - ADCx Gather
Taught by
ADC - Audio Developer Conference