
Google

Inside Lyria 3 - Google's Music Generation Model

Google via YouTube

Overview

Explore Google's revolutionary music generation model through this 37-minute podcast featuring Jeff Chang, Myriam Hamed Torres, and Jason Baldridge from the Google DeepMind team. Discover how Lyria 3 transcends traditional audio generation to function as a collaborative instrument that provides creators with precise control over mood, instrumentation, and vocals. Learn about the technical challenges of prompt adherence in music generation and understand why "vibe" plays a crucial role in human evaluations of AI-generated music. Examine the transition from simple audio creation to sophisticated tools that bridge the gap for non-musicians while empowering new forms of musical expression. Witness live demonstrations including instrumental funk jams, lyrical compositions with vocals, and experimental sonic landscapes that showcase the model's versatility. Understand the distinction between real-time and song generation models, and explore how iterative co-creation enables users across all expertise levels to articulate their creative vision through natural language. Gain insights into the emotional and communal impact of AI-generated music, evaluation methodologies that incorporate both taste and technical expertise, and the future of AI-first workflows in music composition. Discover opportunities for developers and the broader creative community to engage with these cutting-edge tools that are reshaping how we think about musical creativity and collaboration.

Syllabus

- Intro
- Defining music generation models
- Lyria as a new instrument
- Connecting language and creative intent
- Guest backgrounds and musical journeys
- Demo: Instrumental funk jam
- Bridging the gap for non-musicians
- Demo: Exploring lyrics and vocals
- The magic of iterative co-creation
- Meeting users across the expertise spectrum
- Empowering new musical expressions
- Emotional and communal impact of music
- Opportunities for developers and community
- Real-time vs. song generation models
- Creating experimental sonic landscapes
- Demo: Capturing unexpectedness and energy
- Evaluating music through taste and expertise
- The diligence of music evaluation
- The future of Lyria and AI-first workflows
- Articulating creative vision through language

Taught by

Google Developers

