Mathematical Foundations of Machine Learning

NPTEL via Swayam

Overview

ABOUT THE COURSE: This course introduces the mathematical foundations of machine learning, covering risk minimization, density estimation, regularization, and generalization. Students learn classical methods such as linear models, kernel machines, SVMs, decision trees, and ensemble techniques, as well as modern deep learning approaches including MLPs, CNNs, RNNs, and Transformers. Probabilistic models, clustering, PCA, and the EM algorithm are presented to build a solid grounding in unsupervised learning. The course concludes with an introduction to generative models (GANs, VAEs) as a bridge to advanced topics. Emphasis is placed on both theory and practice, with coding assignments connecting the mathematics to real-world ML applications.

INTENDED AUDIENCE: Senior undergraduates and graduate students from EECS disciplines.

PREREQUISITES: BE/BTech or ME/MTech; basic courses on probability theory and linear algebra; some background in Python programming.

INDUSTRY SUPPORT: Most IT companies, including Google, Microsoft, Amazon, IBM, Flipkart, Oracle, Infosys, Accenture, GE, etc.

Syllabus

Week 1: Introduction to Supervised/Unsupervised/Generative Learning; Learning via Empirical Risk Minimization
Week 2: Bayes Optimality and Density Estimation via Divergence Minimization
Week 3: Maximum Likelihood and MAP Estimates; Non-Parametric Estimates (Nearest Neighbours and Parzen Windows)
Week 4: Linear Models: Linear regression, least squares, Fisher discriminant, logistic regression
Week 5: Regularization & Generalization: Bias–variance decomposition, ridge regression, lasso, probabilistic interpretation of regularization
Week 6: Kernel Machines & SVMs: Maximum margin classifiers, dual form, KKT conditions, kernel trick & RKHS intuition
Week 7: Perceptron, Neural Networks, Gradient-Based Optimization, Error Backpropagation
Week 8: Convolutional Neural Networks: Convolution, pooling, receptive fields, CNN architectures, transfer learning
Week 9: Sequence Models: RNNs, backpropagation through time, vanishing/exploding gradients, GRUs, LSTMs
Week 10: Attention & Transformers: Attention mechanism, self-attention vs. recurrence, encoder–decoder Transformers
Week 11: Ensembles and Decision Trees: Bagging & random forests, boosting (AdaBoost, XGBoost)
Week 12: Unsupervised Learning & EM: Clustering (k-means, Gaussian mixtures), the EM algorithm, dimensionality reduction and PCA; a preview of generative models: GANs, VAEs, and diffusion models (high-level only)

Taught by

Prof. Prathosh A P

