
Does Equivariance Matter at Scale? A Study of Neural Architectures and Symmetries

Valence Labs via YouTube

Overview

Explore a research presentation investigating how equivariant and non-equivariant neural networks compare at scale in AI drug discovery. Learn how different neural architectures perform given large datasets and substantial compute, with a focus on rigid-body interactions and transformer architectures. Discover key findings on data efficiency, including how non-equivariant models trained with data augmentation can match equivariant models given enough training epochs. Understand the power-law relationship in compute scaling, under which equivariant models consistently outperform their non-equivariant counterparts across compute budgets. Examine optimal strategies for allocating compute between model size and training duration, and how those strategies differ between equivariant and non-equivariant approaches. Access additional insights and connect with speakers through the AI drug discovery community at Portal.
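The data-augmentation alternative to built-in equivariance mentioned above can be illustrated with a minimal sketch: instead of an architecture that is rotation-equivariant by construction, a plain model sees training samples rotated by random 3-D rotations. The function names below (`random_rotation`, `augment`) are illustrative, not from the talk.

```python
import numpy as np

def random_rotation(rng):
    # Sample a uniformly random 3-D rotation: QR-decompose a Gaussian
    # matrix (Haar measure on O(3)), then fix signs to land in SO(3).
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))      # make the decomposition unique
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1             # flip to a proper rotation (det = +1)
    return q

def augment(points, rng):
    # Rotate an (N, 3) point cloud. A rotation-invariant label
    # (e.g. a binding score) is unchanged, so the rotated cloud is a
    # valid extra training sample for a non-equivariant model.
    return points @ random_rotation(rng).T

rng = np.random.default_rng(0)
cloud = rng.normal(size=(8, 3))   # toy molecular coordinates
rotated = augment(cloud, rng)

# Rotations preserve all pairwise distances:
print(np.allclose(
    np.linalg.norm(cloud[:, None] - cloud[None], axis=-1),
    np.linalg.norm(rotated[:, None] - rotated[None], axis=-1)))  # → True
```

With enough such augmented epochs, the talk reports, a non-equivariant model can close the gap to an equivariant one on held-out data.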

Syllabus

Does equivariance matter at scale? | Johann Brehmer

Taught by

Valence Labs

