Neural Network Technologies and Their Usage

Major: Computer Science
Code of subject: 6.122.00.M.079
Credits: 6.00
Department: Artificial Intelligence Systems
Lecturer: O. Gurbych
Semester: 6
Mode of study: full-time
Learning outcomes:
- Software Engineering: languages (Python or R); tools (git, Jupyter Notebook, Colab, Gradient, or similar); implementing functions and classes; refactoring; debugging; importing libraries/modules; documentation.
- Data Manipulation: libraries such as PyTorch, TensorFlow, fast.ai, MXNet, or similar.
- Data Preprocessing: libraries such as pandas, PySpark, Dask, or similar; imputing missing values; dropping duplicates; MinMax scaling, normalization, standardization, etc.; mean, median, mode; correlation analysis.
- Data Visualization: libraries such as matplotlib, seaborn, plotly, ggplot, or similar.
- Linear Algebra (numpy, scipy): scalars, vectors, matrices, tensors; tensor arithmetic; reduction; dot products; matrix-matrix multiplication; norms.
- Calculus (numpy, scipy): differentiation and derivatives; partial derivatives; gradients; the chain rule.
- Probability Theory: basic probability theory; dealing with multiple random variables; expectation and variance; Bayes' theorem.
- Machine Learning (sklearn): linear models, tree ensembles, probabilistic models, etc.
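Several of the prerequisite skills listed above (pandas preprocessing, basic numpy linear algebra, a simple sklearn model) can be shown working together. The following is a minimal sketch on an invented toy dataset, not course material; the column names and values are hypothetical.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

# Toy tabular dataset (invented for illustration only).
df = pd.DataFrame({
    "age":    [25, 32, np.nan, 41, 32, 58],
    "income": [30_000, 52_000, 47_000, np.nan, 52_000, 61_000],
    "bought": [0, 1, 0, 1, 1, 1],
})

df = df.drop_duplicates()                      # drop duplicate rows
df = df.fillna(df.median(numeric_only=True))   # impute missing values with column medians

X = df[["age", "income"]].to_numpy()
y = df["bought"].to_numpy()
X = MinMaxScaler().fit_transform(X)            # MinMax scaling to [0, 1]

# A few linear-algebra basics with numpy.
v = X[0]                  # a vector (one sample)
norm = np.linalg.norm(v)  # its L2 norm
gram = X.T @ X            # matrix-matrix multiplication

# A simple linear classifier from sklearn.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, stratify=y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```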
Required prior and related subjects: software engineering (Python or R), data manipulation and preprocessing, data visualization, linear algebra, calculus, probability theory, and introductory machine learning (the same skill list as in the learning outcomes above).
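As a small illustration of the calculus prerequisites (partial derivatives, gradients, the chain rule), the sketch below compares an analytic gradient with a finite-difference estimate in numpy; the function and the numbers are chosen purely for illustration.

```python
import numpy as np

def f(w, x):
    # f(w, x) = sigmoid(w . x): a composition, so its gradient uses the chain rule.
    z = np.dot(w, x)
    return 1.0 / (1.0 + np.exp(-z))

def grad_f(w, x):
    # Analytic gradient w.r.t. w: sigmoid'(z) * x, with sigmoid'(z) = s * (1 - s).
    s = f(w, x)
    return s * (1.0 - s) * x

def numerical_grad(w, x, eps=1e-6):
    # Central finite differences: one partial derivative per coordinate of w.
    g = np.zeros_like(w)
    for i in range(len(w)):
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[i] += eps
        w_minus[i] -= eps
        g[i] = (f(w_plus, x) - f(w_minus, x)) / (2 * eps)
    return g

w = np.array([0.5, -1.0, 2.0])
x = np.array([1.0, 3.0, -0.5])
print(np.allclose(grad_f(w, x), numerical_grad(w, x), atol=1e-6))  # True
```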
Summary of the subject:
- Introduction to Deep Learning: what deep learning is; kinds of deep learning; deep learning projects and a generalized pipeline; key components; from biological to artificial neurons.
- Tabular data / DNN: linear neural networks; multilayer feed-forward perceptrons; regression and MSE; binary classification and log loss; multiclass classification and categorical cross-entropy.
- Training Neural Networks: model selection; underfitting and overfitting; weight initialization (Glorot/Xavier, He); nonsaturating activation functions; regression, classification, and reconstruction loss functions; optimizers (SGD, Momentum, Nesterov, AdaGrad, RMSProp, Adam); regularization (L1, L2, Dropout, Monte Carlo Dropout, Max-Norm); saving and restoring models; callbacks (early stopping, LR decay on plateau, snapshots); TensorBoard.
- Hyperparameter Optimization: number of hidden layers; number of neurons per hidden layer; learning rate; batch size; other hyperparameters.
- Vision / CNN: 1D, 2D, and 3D convolutional layers, filters, padding, stride, pooling, feature maps; image classification; CNN architectures (ResNetV2, DenseNet, MobileNetV3, Xception, SENet); ImageNet; using pre-trained models: fine-tuning vs. transfer learning; object detection, a 2021 review (YOLOv5, SSD, FRCNN, OFA, Swin Transformer); semantic segmentation, hands-on (UNet, PSPNet).
- Language / NLP / RNN: text preprocessing; tokenization; numericalization; word embeddings (word2vec, GloVe); recurrent neural networks (GRU, LSTM); attention mechanisms; Transformers; BERT; GPT-2, GPT-3.
- Recommender Systems.
- Generative Adversarial Networks (GANs).
- Deep Reinforcement Learning: DQN, A2C.
- MLOps: ML infrastructure and operations; deployment and monitoring.
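Several of the topics listed above (a multilayer feed-forward perceptron, categorical cross-entropy for multiclass classification, the Adam optimizer, dropout regularization, saving and restoring a model) meet in an ordinary PyTorch training loop. The sketch below runs on synthetic data and is only an assumption of how such a loop might look, not an excerpt from the course materials.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Synthetic tabular data: 512 samples, 20 features, 3 classes (invented for illustration).
X = torch.randn(512, 20)
y = torch.randint(0, 3, (512,))
loader = DataLoader(TensorDataset(X, y), batch_size=64, shuffle=True)

model = nn.Sequential(           # a small multilayer feed-forward perceptron
    nn.Linear(20, 64),
    nn.ReLU(),                   # nonsaturating activation
    nn.Dropout(p=0.2),           # dropout regularization
    nn.Linear(64, 3),            # raw logits for 3 classes
)

loss_fn = nn.CrossEntropyLoss()  # categorical cross-entropy (expects logits)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(10):
    for xb, yb in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()          # gradients via backpropagation (the chain rule)
        optimizer.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.4f}")

# Saving and restoring the trained weights.
torch.save(model.state_dict(), "mlp.pt")
model.load_state_dict(torch.load("mlp.pt"))
```

Fine-tuning a pre-trained vision or language model follows the same loop; the difference is that the backbone weights are loaded from a pre-trained checkpoint and only some layers are updated.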
Assessment methods and criteria: project (100%).
Recommended books:
- Dive into Deep Learning, by Aston Zhang, Zachary C. Lipton, Mu Li, and Alexander J. Smola, 2021.
- The Hundred-Page Machine Learning Book, by Andriy Burkov, 2020.
- Machine Learning Engineering, by Andriy Burkov, 2020.
- Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville; MIT Press, 2016.
- Machine Learning Yearning, by Andrew Ng, 2019.
- Stanford CS 329P: Practical Machine Learning.
- Stanford CS 329S: Machine Learning Systems Design, Winter 2021.
- Stanford CS229: Machine Learning, Spring 2021.
- MIT 6.S191: Introduction to Deep Learning, 2020.
- MIT Deep Learning and Artificial Intelligence Lectures, 2019-2020.