
NPTEL

Foundations of Deep Learning: Concepts and Applications

NPTEL via Swayam

Overview

ABOUT THE COURSE:

1. Deep Learning is a core pillar of modern Artificial Intelligence, powering applications in computer vision, natural language processing, healthcare, and robotics.
2. With the rapid expansion of AI-related programs in AICTE-affiliated institutions, it is essential for students to build a strong foundation in this field.
3. This course is designed to guide learners step by step, from the basics of neural networks to advanced architectures such as CNNs, RNNs, and Autoencoders.
4. Emphasis is placed on both conceptual clarity and practical implementation using Python and Google Colab.
5. By the end of the course, students will be able to apply deep learning algorithms to real-world data to develop intelligent solutions.

INTENDED AUDIENCE: UG and PG students of all AICTE-affiliated institutions.

PREREQUISITES: This course is designed to be self-contained and suitable for learners with no prior background in the subject. However, basic familiarity with Python programming is recommended to facilitate better understanding of the concepts and the hands-on components of the course.

INDUSTRY SUPPORT: Deep Learning is a critical component of the AI and Data Science ecosystem, and this course aligns well with the skill requirements of leading technology and research-driven companies. The following industries and companies are likely to recognize and value this course due to their active engagement in AI and deep learning applications:

• Technology Companies: Google, Microsoft, Amazon, Meta, Apple, IBM
• AI and Data Science Firms: NVIDIA, OpenAI, DeepMind, DataRobot
• IT Services and Consulting: TCS, Infosys, Wipro, Accenture, Cognizant, Capgemini
• Startups in AI/ML: Fractal Analytics, SigTuple, Mad Street Den, InData Labs
• Healthcare and Bioinformatics: Siemens Healthineers, Philips, GE Healthcare, Tata Elxsi
• Automotive and Robotics: Tesla, Bosch, Continental, Qualcomm
• Finance and Banking: JPMorgan Chase, Goldman Sachs, PayPal, Razorpay, HDFC Bank (AI-driven risk modeling and fraud detection)

This course will help learners build strong foundational knowledge and practical skills that are highly sought after in roles such as Machine Learning Engineer, AI Researcher, Data Scientist, Computer Vision Engineer, and NLP Specialist.

Syllabus

Week 1: Overview and motivation for the course: why deep learning matters, its importance, companies working in the field, applications, and future directions. Target audience, week-wise contents, and how this course differs from others (more hands-on work and live classes).

Overview of machine learning and deep learning, the difference between ML and DL illustrated with an example, and the history and evolution of deep learning alongside advances in computational efficiency.

Introduction to neural networks: the perceptron and logistic regression, the single-layer perceptron, a numerical problem on the single-layer perceptron, and limitations of the single-layer perceptron.

Hands-on: building a simple perceptron model in a Colab notebook.
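As an illustration of what this hands-on might cover, here is a minimal perceptron sketch using only NumPy; the AND-gate data, learning rate, and epoch count are illustrative assumptions, not the course's official Colab file.

```python
import numpy as np

# Truth table for the AND function (illustrative training data)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(10):
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0      # step activation
        w += lr * (target - pred) * xi         # perceptron learning rule
        b += lr * (target - pred)

print([1 if xi @ w + b > 0 else 0 for xi in X])  # expected: [0, 0, 0, 1]
```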

Introduction to the multilayer perceptron, the difference between shallow and deep neural networks, worked examples of designing a neural network (5 to 7 examples), activation functions, and loss functions.
Week 2: Gradient Descent (GD) and Backpropagation (MSE)

Optimizers: Momentum-Based GD, Nesterov Accelerated GD, Stochastic GD, AdaDelta, AdaGrad, RMSProp, Adam. Regularization Techniques: L1/L2 regularization, dropout, Early stopping

Hands-on: building artificial neural networks for classification and regression problems, with exposure to hyperparameter tuning. Interpreting the results using simple XAI techniques: LIME and SHAP.
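A minimal sketch of such a network for binary classification, assuming TensorFlow/Keras and scikit-learn (both preinstalled in Google Colab); the dataset, layer sizes, dropout rate, and early-stopping settings are illustrative choices rather than the course notebook.

```python
import tensorflow as tf
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

norm = tf.keras.layers.Normalization()
norm.adapt(X_train)                                      # learn feature means/variances

model = tf.keras.Sequential([
    norm,
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dropout(0.3),                        # dropout regularization
    tf.keras.layers.Dense(1, activation="sigmoid"),      # binary output
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X_train, y_train, validation_split=0.2, epochs=50,
          callbacks=[tf.keras.callbacks.EarlyStopping(patience=5)])  # early stopping
print(model.evaluate(X_test, y_test))                    # [loss, accuracy]
```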
Week 3: CNN: fundamentals of image representation, image preprocessing, and data augmentation

Introduction to Convolutional Neural Networks, Inspiration behind CNN, Key Components of CNN, Types of convolutions

CNN architecture

Hands-on: building a simple CNN model for binary and multiclass classification.
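A minimal sketch of a multiclass CNN of this kind, assuming TensorFlow/Keras and the built-in MNIST dataset; the architecture and training settings are illustrative.

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0          # add channel dim, scale to [0, 1]
x_test = x_test[..., None] / 255.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),   # 10 digit classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=3, validation_data=(x_test, y_test))
```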
Week 4: A typical CNN structure, Standard CNN models: AlexNet, VGGNet 16 and 19

Standard CNN Models: GoogLeNet, ResNet-18 and ResNet-34

Standard CNN Models: Inception, Transfer Learning

Hands-on: transfer learning and building an ensemble model.
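A minimal transfer-learning sketch, assuming TensorFlow/Keras with a pretrained MobileNetV2 backbone; the backbone choice, input size, and binary head are illustrative, and `train_ds`/`val_ds` are placeholder datasets.

```python
import tensorflow as tf

# Pretrained backbone with ImageNet weights; its layers are frozen
base = tf.keras.applications.MobileNetV2(input_shape=(160, 160, 3),
                                         include_top=False, weights="imagenet")
base.trainable = False

model = tf.keras.Sequential([
    tf.keras.Input(shape=(160, 160, 3)),
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),     # new binary classification head
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=5)  # train_ds/val_ds: your image datasets
```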
Week 5: Introduction to XAI: algorithms and their working mechanisms

Hands-on: interpreting the results from a CNN model using simple XAI techniques: Grad-CAM and SmoothGrad
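A minimal Grad-CAM sketch in TensorFlow/Keras, following the standard gradient-weighted class activation mapping recipe; `model`, `conv_layer_name`, and the input `img` (shape (1, H, W, C)) are placeholders for a trained CNN and an image from the hands-on.

```python
import tensorflow as tf

def grad_cam(model, img, conv_layer_name):
    # Map the image to (feature maps of the chosen conv layer, predictions)
    grad_model = tf.keras.Model(
        model.inputs, [model.get_layer(conv_layer_name).output, model.output])
    with tf.GradientTape() as tape:
        conv_out, preds = grad_model(img)
        top_class = tf.argmax(preds[0])
        score = preds[:, top_class]                  # score of the predicted class
    grads = tape.gradient(score, conv_out)           # d(score) / d(feature maps)
    weights = tf.reduce_mean(grads, axis=(0, 1, 2))  # global-average-pool the gradients
    cam = tf.reduce_sum(conv_out[0] * weights, axis=-1)
    cam = tf.nn.relu(cam)                            # keep only positive evidence
    return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()  # normalized heatmap
```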
Week 6: Evaluation metrics for segmentation; CNN-based segmentation algorithms: UNet
Attention-based UNet; introduction to CNN-based object detection models.
Object detection algorithms: YOLO, R-CNN, and Faster R-CNN models.
Hands-on: object detection using YOLO (see the sketch below).
Hands-on: UNet and attention-based UNet.
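For the YOLO hands-on above, a minimal inference sketch assuming the ultralytics package (which may need to be installed in Colab); the model file and image path are placeholders, and the course may use a different YOLO implementation.

```python
# In Colab, install first if needed: !pip install ultralytics
from ultralytics import YOLO

model = YOLO("yolov8n.pt")               # small pretrained model; weights auto-download
results = model("sample.jpg")            # "sample.jpg" is a placeholder image path
for box in results[0].boxes:
    print(results[0].names[int(box.cls)], float(box.conf))  # class label, confidence
```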
Week 7: Sequence-to-sequence models: introduction to Recurrent Neural Networks and their structure; challenges in RNNs (vanishing and exploding gradients)
Numerical problem on RNN
Hands-on: building RNNs on structured and unstructured data (see the sketch below).
Variants of RNNs, with hands-on.
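A minimal sketch of an RNN on structured (time-series) data, assuming TensorFlow/Keras; the synthetic sine-wave series, window length, and layer sizes are illustrative.

```python
import numpy as np
import tensorflow as tf

series = np.sin(np.arange(0, 100, 0.1))                   # synthetic signal
window = 20
X = np.array([series[i:i + window] for i in range(len(series) - window)])
y = series[window:]                                       # next-step target
X = X[..., None]                                          # (samples, window, 1)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(window, 1)),
    tf.keras.layers.SimpleRNN(32),                        # vanilla recurrent layer
    tf.keras.layers.Dense(1),                             # next-value regression
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32)
```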
Week 8: Introduction to the Long Short-Term Memory (LSTM) architecture and why it is needed; bidirectional LSTMs, stacked LSTMs

The GRU architecture; comparing LSTM vs. GRU in terms of speed, accuracy, and complexity; when to use a GRU over an LSTM

Why attention mechanisms are used in RNNs, how the attention mechanism works, and its benefits

Hands-on: building LSTM models for structured and unstructured data.
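A minimal LSTM sketch for unstructured (text) data, assuming TensorFlow/Keras and the built-in IMDB sentiment dataset; the vocabulary size, sequence length, and layer sizes are illustrative.

```python
import tensorflow as tf

vocab, maxlen = 10000, 200
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab)
x_train = tf.keras.utils.pad_sequences(x_train, maxlen=maxlen)
x_test = tf.keras.utils.pad_sequences(x_test, maxlen=maxlen)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(maxlen,)),
    tf.keras.layers.Embedding(vocab, 64),                 # learned word embeddings
    tf.keras.layers.LSTM(64),                             # gated recurrent layer
    tf.keras.layers.Dense(1, activation="sigmoid"),       # positive/negative sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, validation_data=(x_test, y_test))
```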
Week 9: Introduction to NLP tasks; classical NLP vs. deep learning NLP; text preprocessing: tokenization, stopwords, lemmatization
Word Representations: One-hot encoding, Word embeddings: Word2Vec, GloVe, FastText

Sequence modeling in NLP, Recurrent Neural Networks (RNN) basics, Long Short-Term Memory (LSTM) and Gated Recurrent Units (GRU), Word embeddings + RNN for sequence tasks

Hands-on session: RNNs for an NLP task.
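A minimal word-embedding sketch using gensim's Word2Vec (an assumption about tooling; the course may use a different library); the tiny corpus and hyperparameters are illustrative.

```python
from gensim.models import Word2Vec

corpus = [
    ["deep", "learning", "uses", "neural", "networks"],
    ["recurrent", "networks", "model", "sequences"],
    ["word", "embeddings", "map", "words", "to", "vectors"],
]
w2v = Word2Vec(sentences=corpus, vector_size=50, window=3, min_count=1, epochs=50)
print(w2v.wv["networks"][:5])                    # first few dimensions of one vector
print(w2v.wv.most_similar("networks", topn=2))   # nearest neighbours in embedding space
```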
Week 10: Unsupervised learning: introduction to autoencoders, the architecture of an AE and the math behind it
Types of autoencoders (simple, deep, CNN-based); training of autoencoders
Hands-on: building an AE and its variants.
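A minimal autoencoder sketch on MNIST, assuming TensorFlow/Keras; the 32-dimensional bottleneck and training settings are illustrative.

```python
import tensorflow as tf

(x_train, _), (x_test, _) = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0
x_test = x_test.reshape(-1, 784) / 255.0

inputs = tf.keras.Input(shape=(784,))
encoded = tf.keras.layers.Dense(32, activation="relu")(inputs)        # bottleneck code
decoded = tf.keras.layers.Dense(784, activation="sigmoid")(encoded)   # reconstruction
autoencoder = tf.keras.Model(inputs, decoded)

autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
autoencoder.fit(x_train, x_train, epochs=5, batch_size=256,           # input is the target
                validation_data=(x_test, x_test))
```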
Week 11: Transformer architectures: self-attention, encoder-decoder attention, in-context learning, low-rank adaptation. Self-supervised learning: objectives and loss functions, masked language modeling.
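A minimal NumPy sketch of scaled dot-product self-attention, the core operation inside Transformer blocks; the token count, embedding size, and random weights are illustrative.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                   # queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])            # scaled similarity scores
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)  # softmax
    return weights @ V                                 # attention-weighted sum of values

X = np.random.randn(4, 8)                   # 4 tokens, embedding size 8
Wq, Wk, Wv = (np.random.randn(8, 8) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```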
Week 12: Large Language Models: tokenizers (see the sketch after this week's items), pre-training and post-training, multimodal alignment, model compression, reinforcement learning for fine-tuning, Proximal Policy Optimization, benchmarking and evaluation of LLMs.

Diffusion Models: Deep generative models, VAEs and GANs, Forward and reverse diffusion, Denoising Score Matching, Variational Lower Bounds, Stable Diffusion
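For the tokenizer topic above, a minimal sketch using the Hugging Face transformers library (an assumption about tooling; it may need to be installed in Colab).

```python
# In Colab, install first if needed: !pip install transformers
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")        # GPT-2's byte-pair-encoding tokenizer
ids = tok("Deep learning is fun")["input_ids"]
print(ids)                                         # token ids
print(tok.convert_ids_to_tokens(ids))              # the subword pieces behind those ids
```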

Taught by

Prof. Sriram Ganapathy, Prof. Ashwini Kodipalli, Prof. Baishali Garai

