Layers as Lenses - A Narrative of Feature Learning in Deep Networks
Centre for Networked Intelligence, IISc via YouTube
Overview
Attend this online lecture exploring a novel theoretical framework for understanding why deeper neural networks consistently outperform shallower ones across various architectures and datasets. Learn about the "layers as lenses" narrative, which proposes that each layer acts as an importance-weighting lens for other layers, enabling the emergence of features that would otherwise be overlooked. Discover how this perspective explains feature learning through a positive feedback loop in which individual neuron parameters serve dual roles: as data separators and as lens components for other layers. Examine theoretical analysis and empirical results on Deep Linearly Gated Networks (DLGN), an architecture combining elements of deep linear and ReLU networks. Gain new insights into the effects of architectural modifications, including pruning, skip-connections, and momentum-based optimization.

The presentation will be delivered by Prof. Harish Guruprasad Ramaswamy, Assistant Professor at IIT Madras, whose research focuses on machine learning, statistical learning theory, and optimization. He previously worked as a research scientist at IBM Research Labs and as a postdoctoral researcher at the University of Michigan.
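To make the DLGN idea concrete, here is a minimal sketch of a DLGN-style forward pass, assuming a common formulation in which a purely linear gating path produces soft gates that multiply the features of a linear value path (the function and weight names below are illustrative, not from the lecture):

```python
import numpy as np

def dlgn_forward(x, gate_weights, value_weights, beta=10.0):
    """Sketch of a Deep Linearly Gated Network (DLGN) forward pass.

    Assumed structure: the gating path is a deep *linear* network whose
    pre-activations are squashed into soft gates; the value path is also
    linear, with each layer's features multiplied by the corresponding
    gates, mimicking the sparsity pattern of a ReLU network.
    """
    g = x  # gating-path pre-activations (purely linear)
    v = x  # value-path activations
    for Wg, Wv in zip(gate_weights, value_weights):
        g = g @ Wg                               # linear gating pre-activation
        gate = 1.0 / (1.0 + np.exp(-beta * g))   # soft gate in (0, 1)
        v = (v @ Wv) * gate                      # gated linear value features
    return v

# Tiny usage example with random weights.
rng = np.random.default_rng(0)
dims = [4, 8, 8, 3]
Wg = [rng.normal(size=(a, b)) / np.sqrt(a) for a, b in zip(dims, dims[1:])]
Wv = [rng.normal(size=(a, b)) / np.sqrt(a) for a, b in zip(dims, dims[1:])]
x = rng.normal(size=(2, 4))
out = dlgn_forward(x, Wg, Wv)
print(out.shape)  # (2, 3)
```

In this reading, the gating path plays the "lens" role: its weights decide which value-path features are amplified or suppressed, separately from the weights that compute those features.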
Syllabus
Time: 5:00 PM - 6:00 PM IST
Taught by
Centre for Networked Intelligence, IISc