PART 2: ARTIFICIAL INTELLIGENCE & DEEP LEARNING
11. A Brief Introduction to Deep Learning and TensorFlow
- Deep Learning at a Glance
- Artificial Neural Networks (ANN)
- Deep Learning Architectures
- Fully Connected Layers
- Convolutional Layers
- Dropout Layers
- A Brief Introduction to TensorFlow
- A Brief Introduction to Keras
- From Biological to Artificial Neurons
- Biological Neurons
- Logical Computations with Neurons
- The Perceptron
- Multi-Layer Perceptron and Backpropagation
- Training a DNN Using Plain TensorFlow
- Construction Phase
- Execution Phase
- Using the Neural Network
- Fine-Tuning Neural Network Hyperparameters
- Number of Hidden Layers
- Number of Neurons per Hidden Layer
- Activation Functions
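As a rough illustration of the kind of model this chapter builds, here is a minimal sketch of a small fully connected network with a dropout layer, written with TensorFlow's Keras API; the dataset, layer sizes, and training settings are illustrative assumptions, not part of the course material (the "plain TensorFlow" construction and execution phases above refer to the lower-level graph-and-session workflow).

    # Minimal sketch only: a small fully connected network with dropout,
    # trained on MNIST. All sizes and hyperparameters are illustrative
    # assumptions, not taken from the syllabus.
    import tensorflow as tf

    (x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),    # flatten image to a vector
        tf.keras.layers.Dense(128, activation="relu"),    # fully connected hidden layer
        tf.keras.layers.Dropout(0.2),                     # dropout layer
        tf.keras.layers.Dense(10, activation="softmax"),  # output layer, one unit per class
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))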

12. Convolutional Neural Networks (CNN) / Computer Vision
- Architecture of the Visual Cortex
- Convolutional Layer
- Filters
- Stacking Multiple Feature Maps
- TensorFlow Implementation
- Memory Requirements
- Pooling Layer
- CNN Architectures
- LeNet-5
- AlexNet
- GoogLeNet
- ResNet
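For illustration, here is a minimal sketch of stacked convolutional and pooling layers of the kind this chapter covers, using TensorFlow's Keras API; the filter counts, kernel sizes, and input shape are assumptions, not taken from the syllabus.

    # Minimal sketch only: convolution and pooling layers followed by a
    # classifier head. All shapes and sizes are illustrative assumptions.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, kernel_size=3, activation="relu",
                               input_shape=(28, 28, 1)),  # 32 feature maps from 3x3 filters
        tf.keras.layers.MaxPooling2D(pool_size=2),        # pooling layer halves spatial size
        tf.keras.layers.Conv2D(64, kernel_size=3, activation="relu"),
        tf.keras.layers.MaxPooling2D(pool_size=2),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.summary()  # prints layer output shapes and parameter counts (memory requirements)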

13. Recurrent Neural Networks (RNN)
- Recurrent Neurons
- Memory Cells
- Input and Output Sequences
- Basic RNNs in TensorFlow
- Static Unrolling Through Time
- Dynamic Unrolling Through Time
- Handling Variable-Length Input Sequences
- Handling Variable-Length Output Sequences
- Training RNNs
- Training a Sequence Classifier
- Training to Predict Time Series
- Creative RNN
- Distributing a Deep RNN Across Multiple GPUs
- Applying Dropout
- The Difficulty of Training over Many Time Steps
- LSTM Cell (Long Short-Term Memory)
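For illustration, here is a minimal sketch of an LSTM-based sequence classifier of the kind this chapter covers, using TensorFlow's Keras API; the vocabulary size, embedding width, and layer sizes are assumptions, not taken from the syllabus.

    # Minimal sketch only: an LSTM sequence classifier. All sizes are
    # illustrative assumptions.
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(input_dim=10000, output_dim=32),  # token embeddings
        tf.keras.layers.LSTM(64),                        # LSTM memory cells read the sequence
        tf.keras.layers.Dense(1, activation="sigmoid"),  # one label per input sequence
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])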