
Can Transformers Do Enumerative Geometry?

Harvard CMSA via YouTube

Overview

This seminar talk from the New Technologies in Mathematics series features Baran Hashemi of the Technical University of Munich, exploring whether transformer models can effectively tackle problems in enumerative geometry. Learn about the Neural Enumerative Reasoning model developed for computing ψ-class intersection numbers on the moduli space of curves, which reformulates the problem as a continuous optimization task capable of handling values ranging from 10^-45 to 10^45. Discover the Dynamic Range Activator (DRA), a new activation function designed to enhance transformers' ability to model recursive patterns and cope with severe heteroscedasticity. The presentation covers uncertainty quantification using Conformal Prediction with dynamic sliding windows, and presents interpretability findings showing that the network implicitly models Virasoro constraints in a data-driven way. Through abductive hypothesis testing, probing, and causal inference, Hashemi demonstrates evidence of emergent internal representations of large-genus asymptotic behavior, suggesting new possibilities for inferring asymptotic closed-form expressions from limited data. The talk is based on research published at https://openreview.net/pdf?id=4X9RpKH4Ls.
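The DRA mentioned above is a learnable activation with periodic structure, introduced to help the network fit recursive sequences; its exact formula is given in the linked paper and is not reproduced here. As an illustrative sketch of the same general idea (a trainable activation with a periodic component), the following implements the published Snake activation of Ziyin et al., x + sin²(ax)/a, where `a` plays the role of a learnable frequency parameter:

```python
import numpy as np

def snake(x, a=1.0):
    """Snake activation (Ziyin et al., 2020): x + sin^2(a*x)/a.

    Illustrative stand-in for a learnable periodic activation.
    NOTE: this is NOT the talk's Dynamic Range Activator; the DRA's
    exact form is defined in the paper linked above.
    """
    return x + np.sin(a * x) ** 2 / a

# The identity term keeps gradients flowing for large |x|, while the
# bounded periodic term lets the unit capture oscillatory/recursive
# structure that plain ReLU-style activations model poorly.
```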
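The uncertainty-quantification step pairs Conformal Prediction with a dynamic sliding window. A minimal sketch of standard split-conformal intervals over a window of recent calibration residuals (the window mechanics here are generic, not the talk's specific scheme):

```python
import numpy as np

def conformal_interval(residuals_window, y_pred, alpha=0.1):
    """Split-conformal prediction interval for a point prediction.

    residuals_window: recent absolute calibration residuals |y - y_hat|
    alpha: miscoverage level (0.1 -> a 90% interval)

    A sketch of standard split conformal prediction; the talk's exact
    dynamic-window procedure is described in the linked paper.
    """
    n = len(residuals_window)
    # Finite-sample-corrected quantile rank: ceil((n+1)(1-alpha)).
    k = int(np.ceil((n + 1) * (1 - alpha)))
    q = np.sort(np.asarray(residuals_window))[min(k, n) - 1]
    return y_pred - q, y_pred + q
```

With a *sliding* window, `residuals_window` holds only the most recent W residuals, so the interval width adapts as the magnitude of the targets drifts, which is the point of using a dynamic window under severe heteroscedasticity.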

Syllabus

Baran Hashemi | Can Transformers Do Enumerative Geometry?

Taught by

Harvard CMSA

