AD3501 Deep Learning Syllabus – Anna University Regulation 2021
COURSE OBJECTIVES:
• To understand the need for and principles of deep neural networks
• To understand CNN and RNN architectures of deep neural networks
• To comprehend advanced deep learning models
• To learn the evaluation metrics for deep learning models
UNIT I DEEP NETWORKS BASICS
Linear Algebra: Scalars – Vectors – Matrices and tensors; Probability Distributions – Gradient-based Optimization; Machine Learning Basics: Capacity – Overfitting and underfitting – Hyperparameters and validation sets – Estimators – Bias and variance – Stochastic gradient descent – Challenges motivating deep learning; Deep Networks: Deep feedforward networks – Regularization – Optimization.
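As a quick illustration of the gradient-based optimization and stochastic gradient descent topics above, here is a minimal numpy sketch that fits a one-dimensional linear model one sample at a time. All data, names, and constants are illustrative, not part of the syllabus.

```python
import numpy as np

# Stochastic gradient descent on y = w*x + b with squared-error loss.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=100)
y = 3.0 * X + 0.5 + rng.normal(scale=0.1, size=100)  # noisy line: w=3.0, b=0.5

w, b = 0.0, 0.0
lr = 0.1
for epoch in range(200):
    for i in rng.permutation(len(X)):     # visit samples in random order
        err = (w * X[i] + b) - y[i]       # prediction error on one sample
        w -= lr * err * X[i]              # gradient of 0.5*err^2 w.r.t. w
        b -= lr * err                     # gradient of 0.5*err^2 w.r.t. b

print(f"w={w:.2f}, b={b:.2f}")            # close to the true w=3.0, b=0.5
```

The per-sample updates are what make the method "stochastic": each step follows a noisy estimate of the full gradient, which is cheap and, in expectation, points downhill.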
UNIT II CONVOLUTIONAL NEURAL NETWORKS
Convolution Operation – Sparse Interactions – Parameter Sharing – Equivariance – Pooling; Convolution Variants: Strided – Tiled – Transposed and dilated convolutions; CNN Learning: Nonlinearity Functions – Loss Functions – Regularization – Optimizers – Gradient Computation.
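A minimal sketch of the convolution operation listed above, written with explicit loops to expose the sparse, weight-shared sliding window (most deep learning libraries actually compute cross-correlation, as here). The input and kernel are illustrative.

```python
import numpy as np

# Valid-mode 2-D convolution (cross-correlation) with a single kernel.
def conv2d(image, kernel):
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1          # output height shrinks by kh-1
    ow = image.shape[1] - kw + 1          # output width shrinks by kw-1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # The same kernel weights are reused at every position
            # (parameter sharing), and each output depends only on a
            # small patch of the input (sparse interactions).
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.arange(16.0).reshape(4, 4)
kernel = np.array([[1.0, 0.0], [0.0, -1.0]])   # simple difference filter
print(conv2d(image, kernel).shape)             # (3, 3) for 4x4 input, 2x2 kernel
```

Strided, transposed, and dilated variants change only how the window moves over the input; the inner weighted sum is the same.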
UNIT III RECURRENT NEURAL NETWORKS
Unfolding Graphs – RNN Design Patterns: Acceptor – Encoder – Transducer; Gradient Computation – Sequence Modeling Conditioned on Contexts – Bidirectional RNN – Sequence-to-Sequence RNN – Deep Recurrent Networks – Recursive Neural Networks – Long-Term Dependencies; Leaky Units: Skip connections and dropouts; Gated Architecture: LSTM.
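To make the gated architecture concrete, here is one forward step of an LSTM cell in numpy. Weight shapes and names are illustrative, and biases are omitted for brevity; this is a sketch of the standard gate equations, not a full trainable implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One forward step of an LSTM cell: gates decide what to forget,
# what to write, and what to expose from the cell state.
def lstm_step(x, h_prev, c_prev, W):
    z = np.concatenate([x, h_prev])   # current input joined with previous hidden state
    f = sigmoid(W["f"] @ z)           # forget gate
    i = sigmoid(W["i"] @ z)           # input gate
    o = sigmoid(W["o"] @ z)           # output gate
    g = np.tanh(W["g"] @ z)           # candidate cell update
    c = f * c_prev + i * g            # new cell state (additive path helps
                                      # gradients survive long sequences)
    h = o * np.tanh(c)                # new hidden state
    return h, c

rng = np.random.default_rng(0)
hidden, inp = 4, 3
W = {k: rng.normal(size=(hidden, hidden + inp)) * 0.1 for k in "fiog"}
h, c = np.zeros(hidden), np.zeros(hidden)
for x in rng.normal(size=(5, inp)):   # unfold over a length-5 sequence
    h, c = lstm_step(x, h, c, W)
print(h.shape)                        # (4,)
```

The additive update of the cell state `c` is the mechanism the syllabus's "long-term dependencies" topic refers to: it gives gradients a path that is not repeatedly squashed by nonlinearities.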
UNIT IV MODEL EVALUATION
Performance metrics – Baseline models – Hyperparameters: Manual hyperparameter tuning – Automatic hyperparameter tuning – Grid search – Random search – Debugging strategies.
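The grid search and random search topics can be sketched with a toy hyperparameter space. The "validation score" below is a stand-in function with a known peak, chosen only so the example is self-contained.

```python
import itertools
import random

# Toy validation score that peaks at lr=0.01, batch=64 (illustrative only).
def score(lr, batch):
    return -(lr - 0.01) ** 2 - (batch - 64) ** 2 / 1e4

lrs = [0.001, 0.01, 0.1]
batches = [32, 64, 128]

# Grid search: evaluate every combination in the discretized space.
best_grid = max(itertools.product(lrs, batches), key=lambda p: score(*p))

# Random search: draw the same number of configurations at random.
random.seed(0)
samples = [(random.choice(lrs), random.choice(batches)) for _ in range(9)]
best_rand = max(samples, key=lambda p: score(*p))

print(best_grid)   # (0.01, 64)
```

Grid search scales exponentially with the number of hyperparameters, which is why random search is often preferred when only a few dimensions actually matter.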
UNIT V AUTOENCODERS AND GENERATIVE MODELS
Autoencoders: Undercomplete autoencoders – Regularized autoencoders – Stochastic encoders and decoders – Learning with autoencoders; Deep Generative Models: Variational autoencoders – Generative adversarial networks.
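A minimal sketch of an undercomplete autoencoder: 4-D inputs are squeezed through a 2-D code and reconstructed. The data, shapes, and training loop are illustrative assumptions, not a prescribed implementation; a linear model is used so the example stays short.

```python
import numpy as np

rng = np.random.default_rng(0)
Z = rng.normal(size=(200, 2))            # hidden factors
X = Z @ rng.normal(size=(2, 4))          # 4-D data with intrinsic dimension 2

W_enc = rng.normal(size=(4, 2)) * 0.1    # encoder weights (bottleneck: 2 units)
W_dec = rng.normal(size=(2, 4)) * 0.1    # decoder weights
lr = 0.01
for _ in range(1000):
    code = X @ W_enc                     # encode into the 2-D bottleneck
    err = code @ W_dec - X               # reconstruction error
    g_dec = code.T @ err / len(X)        # gradient of 0.5 * mean squared error
    g_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

mse = np.mean((X @ W_enc @ W_dec - X) ** 2)
print(mse)  # small: a 2-D code suffices to reconstruct rank-2 data
```

Because the code layer is narrower than the input, the network cannot simply copy its input; it must learn the structure of the data. Variational autoencoders and GANs build on this by making the code (or a noise vector) the input to a generative model.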
TOTAL: 45 PERIODS
COURSE OUTCOMES
After the completion of this course, students will be able to:
CO1: Explain the basics of deep neural networks
CO2: Apply Convolutional Neural Networks for image processing
CO3: Apply Recurrent Neural Networks and their variants for text analysis
CO4: Apply model evaluation techniques for various applications
CO5: Apply autoencoders and generative models for suitable applications
TEXT BOOKS
1. Ian Goodfellow, Yoshua Bengio, Aaron Courville, “Deep Learning”, MIT Press, 2016.
2. Andrew Glassner, “Deep Learning: A Visual Approach”, No Starch Press, 2021.
REFERENCES
1. Salman Khan, Hossein Rahmani, Syed Afaq Ali Shah, Mohammed Bennamoun, “A Guide to Convolutional Neural Networks for Computer Vision”, Synthesis Lectures on Computer Vision, Morgan & Claypool Publishers, 2018.
2. Yoav Goldberg, “Neural Network Methods for Natural Language Processing”, Synthesis Lectures on Human Language Technologies, Morgan & Claypool Publishers, 2017.
3. Francois Chollet, “Deep Learning with Python”, Manning Publications Co, 2018.
4. Charu C. Aggarwal, “Neural Networks and Deep Learning: A Textbook”, Springer International Publishing, 2018.
5. Josh Patterson, Adam Gibson, “Deep Learning: A Practitioner’s Approach”, O’Reilly Media, 2017.
