Applications Open now for May 2024 Batch | Applications Close: May 26, 2024 | Exam: Jul 07, 2024

Degree Level Course

Deep Learning

This course covers the basics of Neural Networks and their variants, such as Convolutional Neural Networks and Recurrent Neural Networks, and the different ways in which they can be used to solve problems in domains such as Computer Vision, Speech and NLP.

by Mitesh M. Khapra

Course ID: BSCS3004

Course Credits: 4

Course Type: Core Option II

Pre-requisites: None

What you’ll learn

A brief history of deep learning and its success stories.
Perceptrons, Sigmoid neurons and Multi-Layer Perceptrons (MLPs), with specific emphasis on their representation power and the algorithms used for training them (such as the Perceptron Learning Algorithm and Backpropagation).
The Gradient Descent (GD) algorithm and its variants, such as Momentum-based GD, AdaGrad, Adam, etc.
Principal Component Analysis and its relation to modern Autoencoders.
The bias-variance trade-off and regularisation techniques used in DNNs (such as L2 regularisation, data augmentation, noise injection, dropout, etc.).
Different activation functions and weight initialization strategies.
Convolutional Neural Networks (CNNs) such as AlexNet, ZFNet, VGGNet, InceptionNet and ResNet.
Recurrent Neural Networks (RNNs) and their variants such as LSTMs and GRUs (in particular, understanding the vanishing/exploding gradient problem and how LSTMs overcome the vanishing gradient problem).
Applications of CNN and RNN models to various Computer Vision and Natural Language Processing (NLP) problems.
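To give a flavour of the gradient-descent family mentioned above, here is a minimal NumPy sketch (illustrative only, not course material) that minimises the simple quadratic f(w) = (w - 3)^2 with vanilla GD and with Momentum-based GD; the learning rate and momentum values are arbitrary choices for the example.

```python
import numpy as np

def grad(w):
    # gradient of f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def vanilla_gd(w0, lr=0.1, steps=200):
    # plain gradient descent: step against the gradient
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def momentum_gd(w0, lr=0.1, beta=0.9, steps=200):
    # momentum keeps an exponentially decaying accumulation of past gradients
    w, v = w0, 0.0
    for _ in range(steps):
        v = beta * v + lr * grad(w)
        w -= v
    return w

print(vanilla_gd(0.0))   # both should end up near the minimum at w = 3
print(momentum_gd(0.0))
```

Variants like AdaGrad and Adam covered in the course additionally adapt the step size per parameter using accumulated gradient statistics.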

Course structure & Assessments

12 weeks of coursework, weekly online assignments, 2 in-person invigilated quizzes, and 1 in-person invigilated end-term exam. For details of the standard course structure and assessments, visit the Academics page.

WEEK 1 History of Deep Learning, McCulloch Pitts Neuron, Thresholding Logic, Perceptron Learning Algorithm and Convergence
WEEK 2 Multilayer Perceptrons (MLPs), Representation Power of MLPs, Sigmoid Neurons, Gradient Descent
WEEK 3 Feedforward Neural Networks, Representation Power of Feedforward Neural Networks, Backpropagation
WEEK 4 Gradient Descent (GD), Momentum-based GD, Nesterov Accelerated GD, Stochastic GD, AdaGrad, AdaDelta, RMSProp, Adam, AdaMax, NAdam, learning rate schedulers
WEEK 5 Autoencoders and their relation to PCA, Regularization in autoencoders, Denoising autoencoders, Sparse autoencoders, Contractive autoencoders
WEEK 6 Bias Variance Tradeoff, L2 regularization, Early stopping, Dataset augmentation, Parameter sharing and tying, Injecting noise at input, Ensemble methods, Dropout
WEEK 7 Greedy Layer Wise Pre-training, Better activation functions, Better weight initialization methods, Batch Normalization
WEEK 8 Learning Vectorial Representations Of Words, Convolutional Neural Networks, LeNet, AlexNet, ZF-Net, VGGNet, GoogLeNet, ResNet
WEEK 9 Visualizing Convolutional Neural Networks, Guided Backpropagation, Deep Dream, Deep Art, Fooling Convolutional Neural Networks
WEEK 10 Recurrent Neural Networks, Backpropagation Through Time (BPTT), Vanishing and Exploding Gradients, Truncated BPTT
WEEK 11 Gated Recurrent Units (GRUs), Long Short Term Memory (LSTM) Cells, Solving the vanishing gradient problem with LSTM
WEEK 12 Encoder Decoder Models, Attention Mechanism, Attention over images, Hierarchical Attention, Transformers.
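As a small taste of the Week 1 material, here is an illustrative NumPy sketch (not course material) of the Perceptron Learning Algorithm: weights are updated only on misclassified points, and on linearly separable data such as the AND function the algorithm converges.

```python
import numpy as np

def perceptron_train(X, y, epochs=20):
    # Perceptron Learning Algorithm: update weights only on mistakes
    w = np.zeros(X.shape[1] + 1)                  # last entry acts as the bias
    Xb = np.hstack([X, np.ones((len(X), 1))])     # append a constant-1 input
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            pred = 1 if xi @ w >= 0 else 0
            w += (yi - pred) * xi                 # zero update when correct
    return w

# linearly separable data: the AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w = perceptron_train(X, y)
preds = [1 if np.append(xi, 1) @ w >= 0 else 0 for xi in X]
print(preds)
```

The course then shows why a single perceptron cannot represent non-separable functions like XOR, motivating the move to multilayer networks trained with backpropagation.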

Prescribed Books

The following are the suggested books for the course:

Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016.

Charu C. Aggarwal. Neural Networks and Deep Learning: A Textbook. Springer, 2019.

About the Instructors

Mitesh M. Khapra
Associate Professor, Department of Computer Science and Engineering, IIT Madras

Mitesh M. Khapra is an Associate Professor in the Department of Computer Science and Engineering at IIT Madras and is affiliated with the Robert Bosch Centre for Data Science and AI. He is also a co-founder of One Fourth Labs, a startup whose mission is to design and deliver affordable hands-on courses on AI and related topics, and of AI4Bharat, a voluntary community that aims to provide AI-based solutions to India-specific problems. His research interests span Deep Learning, Multimodal Multilingual Processing, Natural Language Generation, Dialog Systems, Question Answering and Indic Language Processing. Prior to IIT Madras, he was a Researcher at IBM Research India for four and a half years, where he worked on problems in Statistical Machine Translation, Cross-Language Learning, Multimodal Learning, Argument Mining and Deep Learning. He completed his PhD and M.Tech from IIT Bombay in January 2012 and July 2008, respectively. During his PhD he was a recipient of the IBM PhD Fellowship (2011) and the Microsoft Rising Star Award (2011). He is also a recipient of the Google Faculty Research Award (2018), the IITM Young Faculty Recognition Award (2019) and the Prof. B. Yegnanarayana Award for Excellence in Research and Teaching (2020).
