11. So Many Applications
1. How Facebook clusters images - OpenFace
2. Google Neural Machine Translation - Seq2Seq
3. Google Assistant / Siri - Seq2Seq, Attention Mechanisms
4. Self-Driving Cars - Detection, Deep Reinforcement Learning
5. Signal Processing - Discriminative and separable feature extractors
13. Yes… Even on Wall Street!
Algorithmic Trading With Bots
25. Again, Why Deep Learning?
❏ Traditional ML works with a more rule-based structure
❏ Not deep enough to extract complex patterns
❏ Needs carefully selected input features: PCA, handcrafted features such as Haar and SIFT (see the sketch below)
Haar Features
SIFT Features
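A minimal sketch of that kind of hand-engineered pipeline, assuming scikit-learn is available: raw pixels are compressed with PCA and passed to a traditional classifier. The dummy data, the 50-component size, and the SVM choice are illustrative assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((200, 32 * 32))          # 200 flattened 32x32 grayscale images (dummy data)
y = rng.integers(0, 2, size=200)        # dummy binary labels

pca = PCA(n_components=50)              # the hand-chosen feature-compression step
X_feat = pca.fit_transform(X)           # each image becomes a 50-dim feature vector

clf = SVC(kernel="rbf").fit(X_feat, y)  # traditional ML model on the engineered features
print(clf.score(pca.transform(X), y))   # accuracy on the same dummy data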
31. Nonlinearity to learn complex boundaries
Basic Idea:
These units squash the information, which means they have a working range (see the sketch below).
Ex: Activation range of sigmoid (a) & tanh (b) neurons; the approximately linear region of each curve is marked.
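A tiny NumPy sketch of that squashing behaviour: however large the input gets, sigmoid stays inside (0, 1) and tanh stays inside (-1, 1).

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-10, 10, 5)                  # inputs spanning a wide range
print("x      :", x)
print("sigmoid:", np.round(sigmoid(x), 3))   # squashed into (0, 1)
print("tanh   :", np.round(np.tanh(x), 3))   # squashed into (-1, 1)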
32. Understanding the Neural Model
● Training Phase: supervised learning on labeled data
● Inference / Testing Phase: checking the model on unseen data (a minimal sketch of both phases follows)
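A minimal sketch of the two phases, assuming PyTorch. The tiny model, fake data, and hyperparameters are placeholders; the point is the split between training on labeled data and gradient-free inference on unseen data.

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 3))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

X_train = torch.randn(64, 4)                 # labeled training data (fake)
y_train = torch.randint(0, 3, (64,))

model.train()                                # training phase: supervised learning on labeled data
for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(X_train), y_train)
    loss.backward()
    optimizer.step()

model.eval()                                 # inference / testing phase: unseen data, no gradients
with torch.no_grad():
    X_unseen = torch.randn(5, 4)
    print(model(X_unseen).argmax(dim=1))     # predicted class per unseen sample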
50. Algorithms Fail To Generalize
❏ When predicting, the algorithm should make decisions based on the patterns it has learned.
❏ Otherwise it will eventually fail on unseen data.
❏ As networks grow bigger and bigger, the overfitting issue appears more and more (see the sketch below).
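A small NumPy sketch of the overfitting effect: a high-capacity model (a degree-9 polynomial) typically fits its noisy training points almost perfectly but does worse on held-out points, while a lower-capacity model generalizes better. The synthetic sine data and the two degrees are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(20)   # noisy underlying pattern

x_tr, y_tr = x[::2], y[::2]       # training half
x_te, y_te = x[1::2], y[1::2]     # held-out ("unseen") half

for degree in (3, 9):
    coeffs = np.polyfit(x_tr, y_tr, degree)
    mse_tr = np.mean((np.polyval(coeffs, x_tr) - y_tr) ** 2)
    mse_te = np.mean((np.polyval(coeffs, x_te) - y_te) ** 2)
    print(f"degree {degree}: train MSE {mse_tr:.4f}, unseen MSE {mse_te:.4f}")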
56. A Neural Net should have more Unsaturated Neurons
● Backpropagation is basically applying the chain rule
● In order to make that happen, we need to calculate a local gradient at each node connection (sketched below)
Network with a ReLU activation function
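A sketch of that idea in plain NumPy: backprop multiplies local gradients along the path (chain rule), so a saturated sigmoid contributes a factor near zero and kills the signal, while an active ReLU contributes a factor of exactly one.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):                     # local gradient of a sigmoid unit
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):                        # local gradient of a ReLU unit
    return np.where(x > 0, 1.0, 0.0)

pre_acts = np.array([6.0, 7.0, 8.0])     # large pre-activations -> saturated sigmoids

# Chain rule: the upstream gradient gets multiplied by every local gradient on the path.
print("sigmoid path factor:", np.prod(sigmoid_grad(pre_acts)))  # vanishingly small
print("ReLU path factor   :", np.prod(relu_grad(pre_acts)))     # 1.0, gradient survives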
60. Can we design a set of features for machines?
No way!
We may design some high-level features!
But our machines deal with PIXELS!
(The same holds in other domains, like NLP.)
61. What if we let the machine extract its own features?
62. Deep Learning is all about End-to-End Training
Automated Feature Extraction
Visualization of the self-learned filters of a CNN: each layer learns different features (a minimal training sketch follows).
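A minimal sketch of end-to-end training, assuming PyTorch: the convolutional layers learn their own filters directly from pixels, and one backward pass updates every layer jointly. The 3x32x32 input size and the 10-class head are illustrative assumptions.

import torch
import torch.nn as nn

cnn = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # early layer: learns edge/blob-like filters
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # deeper layer: learns higher-level features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 8 * 8, 10),                    # classifier head on the learned features
)

images = torch.randn(4, 3, 32, 32)                # a fake batch of RGB images
labels = torch.randint(0, 10, (4,))

loss = nn.CrossEntropyLoss()(cnn(images), labels)
loss.backward()                                   # one end-to-end backprop step through all layers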