Recurrent Neural Network
What’s in it for you?
RNN
What is a Neural Network?
Why Recurrent Neural Network?
Popular Neural Networks
How does a RNN work?
Vanishing and Exploding Gradient Problem
Long Short Term Memory (LSTM)
Use case implementation of LSTM
What is a Recurrent Neural Network?
Introduction to RNN
Do you know how Google’s autocomplete feature predicts the rest of the words a user is typing?

[Diagram: the partial query “What is the best food to eat in Las Vegas” is typed into Google search, fed to a Recurrent Neural Network, and the search is autocompleted.]

A collection of large volumes of the most frequently occurring consecutive words is fed to a Recurrent Neural Network. The network analyses the data by finding the sequences of words occurring frequently and builds a model to predict the next word in the sentence.
What is a Neural Network?
Neural networks used in deep learning consist of different layers connected to each other and are modeled on the structure and functions of the human brain. They learn from huge volumes of data and use complex algorithms to train a neural net.

[Diagram: image pixels of 2 different breeds of dog pass through the input layer, hidden layers, and output layer, which identifies the dog’s breed as German Shepherd or Labrador.]

Such networks do not require memorizing the past output.
Popular Neural Networks
Feed Forward Neural Network: used in general regression and classification problems
Convolutional Neural Network: used for image recognition
Deep Neural Network: used for acoustic modeling
Deep Belief Network: used for cancer detection
Recurrent Neural Network: used for speech recognition
Feed Forward Neural Network

[Simplified diagram: an input x flows through the input layer, hidden layers, and output layer to produce the predicted output y^.]

In a Feed Forward Network, information flows only in the forward direction: from the input nodes, through the hidden layers (if any), to the output nodes. There are no cycles or loops in the network.
• Decisions are based on the current input
• No memory about the past
• No future scope
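As a concrete illustration (a minimal sketch, not code from the slides), a feed forward network for a classification problem can be written in Keras as a stack of Dense layers; the input size, layer widths, and two-class output here are illustrative assumptions:

```python
# Minimal feed forward (fully connected) network sketch in Keras.
# Input size, layer widths and the two-class output are illustrative choices.
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(64, activation='relu', input_shape=(784,)))  # input layer -> first hidden layer
model.add(Dense(64, activation='relu'))                      # second hidden layer
model.add(Dense(2, activation='softmax'))                    # output layer: 2 classes
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
```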
Why Recurrent Neural Network?
Issues in the Feed Forward Neural Network:
01: cannot handle sequential data
02: cannot memorize previous inputs
03: considers only the current input
Why Recurrent Neural Network?
The Recurrent Neural Network solves these issues:
01: can handle sequential data
02: can memorize previous inputs due to its internal memory
03: considers the current input and also the previously received inputs
Applications of RNN
Image captioning
RNN is used to caption an image by analyzing the activities present in it
“A Dog catching a ball in mid air”
Applications of RNN
Time series prediction
Any time series problem like predicting the prices of stocks in a particular month can
be solved using RNN
Applications of RNN
Natural Language Processing
Text mining and Sentiment analysis can be carried out using RNN for Natural
Language Processing
When it rains, look for rainbows. When
it’s dark, look for stars.
Positive Sentiment
Applications of RNN
Machine Translation
Given an input in one language, RNN can be used to translate the input into
different languages as output
Here the person is speaking in
English and it is getting translated into
Chinese, Italian, French, German and
Spanish languages
What is a Recurrent Neural Network?
A Recurrent Neural Network works on the principle of saving the output of a layer and feeding it back to the input in order to predict the output of the layer.

[Diagram: a network with input layer x, hidden layers h, and output layer y is rolled into a Recurrent Neural Network whose hidden state loops back on itself, with parameters A, B, and C.]
What does a RNN look like?
A, B and C are the parameters.
How does a RNN work?

h(t) = fc(h(t-1), x(t))

where:
h(t) = new state
h(t-1) = old state
x(t) = input vector at time step t
fc = function with parameter C

[Diagram: the RNN unrolled over time steps t-1, t, and t+1, with inputs x(t-1), x(t), x(t+1), hidden states h(t-1), h(t), h(t+1), outputs y(t-1), y(t), y(t+1), and shared parameters A, B, and C.]
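To make the recurrence concrete, here is a minimal NumPy sketch of the same update, h(t) = fc(h(t-1), x(t)); the tanh activation and the weight matrices Wxh, Whh, Why are illustrative assumptions rather than notation from the slides:

```python
import numpy as np

def rnn_forward(inputs, Wxh, Whh, Why, h0):
    """Unroll a simple RNN over a sequence of input vectors."""
    h = h0
    outputs = []
    for x_t in inputs:                    # one step per element of the sequence
        h = np.tanh(Wxh @ x_t + Whh @ h)  # new state from current input and old state
        outputs.append(Why @ h)           # output produced at this time step
    return outputs, h
```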
Types of Recurrent Neural Network

One to one (single input, single output): a one-to-one network is known as the Vanilla Neural Network and is used for regular machine learning problems.

One to many (single input, multiple outputs): a one-to-many network generates a sequence of outputs. Example: image captioning.

Many to one (multiple inputs, single output): a many-to-one network takes in a sequence of inputs. Example: sentiment analysis, where a given sentence can be classified as expressing positive or negative sentiment.

Many to many (multiple inputs, multiple outputs): a many-to-many network takes in a sequence of inputs and generates a sequence of outputs. Example: machine translation.
Vanishing Gradient Problem
While training a RNN, your slope (gradient) can be either too small or very large, and this makes training difficult. When the slope is too small, the problem is known as the vanishing gradient.

[Diagram: a nearly flat curve of Y against X (small change in Y for a change in X), and an unrolled RNN with states s0…s3, inputs x0…x3, and errors E0…E3; as the error is backpropagated, information is lost through time.]
Exploding Gradient Problem
When the slope tends to grow exponentially instead of decaying, the problem is called the exploding gradient.

[Diagram: a steeply rising curve of Y against X, and the same unrolled RNN with states s0…s3, inputs x0…x3, and errors E0…E3 backpropagated through time.]

Issues caused by the gradient problem:
• Long training time
• Poor performance
• Bad accuracy
Explaining the Gradient Problem

[Diagram: an unrolled RNN with states s0, s1, s2, s3, …, st, inputs x1…xt, and outputs y1…yt.]

Consider the following 2 examples to understand what should be the next word in the sequence:

The person who took my bike and…………………………………………., ____ a thief. (was)
The students who got into Engineering with …………………………………………., ____ from Asia. (were)

In order to predict the next word in the sequence, the RNN must memorize the previous context: whether the subject was a singular noun or a plural noun.

It might sometimes be difficult for the error to backpropagate to the beginning of the sequence in order to predict what the output should be.
Solutions to the Gradient Problem

Exploding gradient:
1. Identity initialization
2. Truncated backpropagation
3. Gradient clipping

Vanishing gradient:
1. Weight initialization
2. Choosing the right activation function
3. Long Short-Term Memory networks (LSTMs)
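As an illustration of gradient clipping (a sketch, not code from the slides), Keras optimizers accept clipping arguments that cap the gradients before each weight update:

```python
# Gradient clipping sketch: cap the gradients so they cannot explode during
# backpropagation through time. Learning rate and thresholds are illustrative.
from keras.optimizers import SGD

optimizer = SGD(learning_rate=0.01, clipnorm=1.0)      # rescale gradients whose norm exceeds 1.0
# optimizer = SGD(learning_rate=0.01, clipvalue=0.5)   # or clip each component to [-0.5, 0.5]
```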
Long-Term Dependencies

Suppose we try to predict the last word in the text “The clouds are in the ___”. Here we do not need any further context: it’s pretty clear the last word is going to be “sky”.

Now suppose we try to predict the last word in the text “I have been staying in Spain for the last 10 years… I can speak fluent ______.” The word we predict will depend on the previous few words in context.
• Here we need the context of Spain to predict the last word, “Spanish”.
• It is possible for the gap between the relevant information and the point where it is needed to become very large.
• LSTMs help us solve this problem.
Long Short-Term Memory Networks
LSTMs are a special kind of Recurrent Neural Network, capable of learning long-term dependencies. Remembering information for long periods of time is their default behavior.

All recurrent neural networks have the form of a chain of repeating modules of neural network. In standard RNNs, this repeating module has a very simple structure, such as a single tanh layer.

[Diagram: a chain of repeating modules A over time steps t-1, t, and t+1, each containing a single tanh layer, taking input xt and producing output ht.]

LSTMs also have a chain-like structure, but the repeating module is different. Instead of having a single neural network layer, there are four interacting layers communicating in a very special way.

[Diagram: the same chain of modules, but each LSTM module contains four interacting layers combined through pointwise multiplication and addition operations.]
Long Short-Term Memory Networks
The 3-step process of LSTMs (the cell state is carried from Ct-1 to Ct):
Step 1: Forget irrelevant parts of the previous state
Step 2: Selectively update the cell state values
Step 3: Output certain parts of the cell state
Working of LSTMs

Step 1: Decides how much of the past it should remember.

The first step in the LSTM is to decide which information is to be omitted from the cell in that particular time step. This is decided by the sigmoid function. It looks at the previous state (ht-1) and the current input xt and computes the function.

ft = forget gate: decides which information from the previous time step is not important and should be deleted.

[Diagram: the LSTM cell, with the forget gate ft acting on the previous cell state Ct-1, given the previous output ht-1 and the current input xt.]

Consider an LSTM fed with the following inputs from the previous and present time step:

Previous output (ht-1): “Alice is good in Physics. John on the other hand is good in Chemistry.”
Current input (xt): “John plays football well. He told me yesterday over phone that he had served as the captain of his college football team.”

The forget gate realizes there might be a change in context after encountering the first full stop. It compares the previous output with the current input sentence at xt. The next sentence talks about John, so the information on Alice is deleted. The position of the subject is vacated and assigned to John.
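For reference, in the standard LSTM formulation (the slide names the quantities but does not write out the equation), the forget gate is computed as

$$ f_t = \sigma\left(W_f \cdot [h_{t-1}, x_t] + b_f\right) $$

where $\sigma$ is the sigmoid function, $W_f$ and $b_f$ are the forget gate’s weight matrix and bias, and $[h_{t-1}, x_t]$ denotes the concatenation of the previous output and the current input.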
Working of LSTMs

Step 2: Decides how much this unit should add to the current state.

In the second layer there are 2 parts: one is the sigmoid function and the other is the tanh function. The sigmoid function decides which values to let through (0 or 1). The tanh function gives weightage to the values that are passed, deciding their level of importance (-1 to 1).

it = input gate: determines which information to let through based on its significance in the current time step.

[Diagram: the LSTM cell, with the input gate it and the candidate values combining with the forget gate ft to update the cell state from Ct-1 to Ct.]

Consider the current input at xt:

“John plays football well. He told me yesterday over phone that he had served as the captain of his college football team.”

The input gate analyses the important information: “John plays football” and “he was the captain of his college team” is important; “he told me over phone yesterday” is less important, hence it is forgotten. This process of adding some new information can be done via the input gate.
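In the standard LSTM formulation (added here for reference, not written out on the slides), this second step corresponds to

$$ i_t = \sigma\left(W_i \cdot [h_{t-1}, x_t] + b_i\right), \qquad \tilde{C}_t = \tanh\left(W_C \cdot [h_{t-1}, x_t] + b_C\right) $$

$$ C_t = f_t * C_{t-1} + i_t * \tilde{C}_t $$

where the input gate $i_t$ scales the candidate values $\tilde{C}_t$, and the updated cell state $C_t$ combines what is kept from the old state with what is newly added.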
Working of LSTMs

Step 3: Decides what part of the current cell state makes it to the output.

The third step is to decide what the output will be. First, we run a sigmoid layer which decides what parts of the cell state make it to the output. Then, we put the cell state through tanh to push the values to be between -1 and 1 and multiply it by the output of the sigmoid gate.

ot = output gate: allows the passed-in information to impact the output in the current time step.

[Diagram: the LSTM cell, with the output gate ot combining with tanh of the cell state Ct to produce the output ht.]

Let’s consider this example of predicting the next word in the sentence:

“John played tremendously well against the opponent and won for his team. For his contributions, brave ____ was awarded player of the match.”

There could be a lot of choices for the empty space. The current input “brave” is an adjective, and adjectives describe a noun, so “John” could be the best output after “brave”.
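Again in the standard formulation (added for reference, not from the slides), the output step is

$$ o_t = \sigma\left(W_o \cdot [h_{t-1}, x_t] + b_o\right), \qquad h_t = o_t * \tanh(C_t) $$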
Use case implementation of LSTM
Let’s predict the prices of stocks using an LSTM network: based on the stock price data from 2012 to 2016, predict the stock prices of 2017.
Use case implementation of LSTM
1. Import the Libraries
2. Import the training dataset
3. Feature Scaling
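A minimal sketch of steps 1-3, assuming the training prices sit in a CSV file with an ‘Open’ price column (the file name and column position are hypothetical; the slides only show screenshots):

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

# Step 2: import the training dataset (daily stock prices for 2012-2016)
dataset_train = pd.read_csv('stock_price_train.csv')  # hypothetical file name
training_set = dataset_train.iloc[:, 1:2].values      # keep the 'Open' price column as a 2-D array

# Step 3: feature scaling - normalise the prices to the range [0, 1]
sc = MinMaxScaler(feature_range=(0, 1))
training_set_scaled = sc.fit_transform(training_set)
```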
Use case implementation of LSTM
4. Create a data structure with 60 timesteps and 1 output
5. Import keras libraries and packages
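A sketch of steps 4 and 5 under the same assumptions; each training sample is the previous 60 scaled prices and the target is the price at the next time step:

```python
# Step 4: create a data structure with 60 timesteps and 1 output
X_train, y_train = [], []
for i in range(60, len(training_set_scaled)):
    X_train.append(training_set_scaled[i-60:i, 0])  # the past 60 prices
    y_train.append(training_set_scaled[i, 0])       # the price at the next step
X_train, y_train = np.array(X_train), np.array(y_train)
# reshape to (samples, timesteps, features), the input shape Keras LSTM layers expect
X_train = np.reshape(X_train, (X_train.shape[0], X_train.shape[1], 1))

# Step 5: import the Keras classes used to build the network
from keras.models import Sequential
from keras.layers import Dense, LSTM, Dropout
```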
Use case implementation of LSTM
6. Initialize the RNN
7. Adding the LSTM layers and some Dropout regularization
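A sketch of steps 6 and 7; the number of LSTM units, the number of layers and the dropout rate are illustrative choices rather than values taken from the slides:

```python
# Step 6: initialize the RNN as a sequential stack of layers
regressor = Sequential()

# Step 7: add LSTM layers with Dropout regularization
regressor.add(LSTM(units=50, return_sequences=True, input_shape=(X_train.shape[1], 1)))
regressor.add(Dropout(0.2))
regressor.add(LSTM(units=50, return_sequences=True))
regressor.add(Dropout(0.2))
regressor.add(LSTM(units=50))  # final LSTM layer returns only its last hidden state
regressor.add(Dropout(0.2))
```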
Use case implementation of LSTM
8. Adding the output layer
9. Compile the RNN
10. Fit the RNN to the training set
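A sketch of steps 8-10; the optimizer, loss, number of epochs and batch size are typical choices for this kind of regression, not values confirmed by the slides:

```python
# Step 8: add the output layer - one unit predicting the next price
regressor.add(Dense(units=1))

# Step 9: compile the RNN with an optimizer and a regression loss
regressor.compile(optimizer='adam', loss='mean_squared_error')

# Step 10: fit the RNN to the training set
regressor.fit(X_train, y_train, epochs=100, batch_size=32)
```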
Use case implementation of LSTM
11. Load the stock price test data for 2017
12. Get the predicted stock price of 2017
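A sketch of steps 11 and 12, assuming a test CSV with the same hypothetical ‘Open’ column; each 2017 prediction is built from the 60 prices preceding that day:

```python
# Step 11: load the stock price test data for 2017
dataset_test = pd.read_csv('stock_price_test.csv')  # hypothetical file name
real_stock_price = dataset_test.iloc[:, 1:2].values

# Step 12: get the predicted stock price of 2017
dataset_total = pd.concat((dataset_train['Open'], dataset_test['Open']), axis=0)  # 'Open' column assumed
inputs = dataset_total.values[len(dataset_total) - len(dataset_test) - 60:]
inputs = sc.transform(inputs.reshape(-1, 1))  # scale with the scaler fitted on the training data
X_test = np.array([inputs[i-60:i, 0] for i in range(60, 60 + len(dataset_test))])
X_test = np.reshape(X_test, (X_test.shape[0], X_test.shape[1], 1))
predicted_stock_price = sc.inverse_transform(regressor.predict(X_test))
```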
Use case implementation of LSTM
13. Visualize the results of predicted and real stock price
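A sketch of step 13, plotting the real versus predicted 2017 prices with matplotlib:

```python
# Step 13: visualize the real and predicted stock prices
import matplotlib.pyplot as plt

plt.plot(real_stock_price, color='red', label='Real Stock Price')
plt.plot(predicted_stock_price, color='blue', label='Predicted Stock Price')
plt.title('Stock Price Prediction (2017)')
plt.xlabel('Time')
plt.ylabel('Stock Price')
plt.legend()
plt.show()
```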
Key Takeaways
Recurrent Neural Network (RNN) | RNN LSTM Tutorial | Deep Learning Course | Simplilearn

Más contenido relacionado

La actualidad más candente

Recurrent Neural Networks. Part 1: Theory
Recurrent Neural Networks. Part 1: TheoryRecurrent Neural Networks. Part 1: Theory
Recurrent Neural Networks. Part 1: TheoryAndrii Gakhov
 
Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...
Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...
Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...Edureka!
 
Autoencoders in Deep Learning
Autoencoders in Deep LearningAutoencoders in Deep Learning
Autoencoders in Deep Learningmilad abbasi
 
Deep neural networks
Deep neural networksDeep neural networks
Deep neural networksSi Haem
 
Introduction to Deep Learning
Introduction to Deep LearningIntroduction to Deep Learning
Introduction to Deep LearningOswald Campesato
 
Perceptron (neural network)
Perceptron (neural network)Perceptron (neural network)
Perceptron (neural network)EdutechLearners
 
Recurrent neural networks rnn
Recurrent neural networks   rnnRecurrent neural networks   rnn
Recurrent neural networks rnnKuppusamy P
 
Multilayer perceptron
Multilayer perceptronMultilayer perceptron
Multilayer perceptronomaraldabash
 
Natural Language Processing (NLP)
Natural Language Processing (NLP)Natural Language Processing (NLP)
Natural Language Processing (NLP)Yuriy Guts
 
Deep learning - A Visual Introduction
Deep learning - A Visual IntroductionDeep learning - A Visual Introduction
Deep learning - A Visual IntroductionLukas Masuch
 
Natural language processing (nlp)
Natural language processing (nlp)Natural language processing (nlp)
Natural language processing (nlp)Kuppusamy P
 
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...Simplilearn
 
MACHINE LEARNING - GENETIC ALGORITHM
MACHINE LEARNING - GENETIC ALGORITHMMACHINE LEARNING - GENETIC ALGORITHM
MACHINE LEARNING - GENETIC ALGORITHMPuneet Kulyana
 
Convolution Neural Network (CNN)
Convolution Neural Network (CNN)Convolution Neural Network (CNN)
Convolution Neural Network (CNN)Suraj Aavula
 

La actualidad más candente (20)

Recurrent Neural Networks. Part 1: Theory
Recurrent Neural Networks. Part 1: TheoryRecurrent Neural Networks. Part 1: Theory
Recurrent Neural Networks. Part 1: Theory
 
Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...
Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...
Recurrent Neural Networks (RNN) | RNN LSTM | Deep Learning Tutorial | Tensorf...
 
Autoencoders in Deep Learning
Autoencoders in Deep LearningAutoencoders in Deep Learning
Autoencoders in Deep Learning
 
Deep neural networks
Deep neural networksDeep neural networks
Deep neural networks
 
Introduction to Deep Learning
Introduction to Deep LearningIntroduction to Deep Learning
Introduction to Deep Learning
 
K Nearest Neighbors
K Nearest NeighborsK Nearest Neighbors
K Nearest Neighbors
 
Perceptron (neural network)
Perceptron (neural network)Perceptron (neural network)
Perceptron (neural network)
 
LSTM Basics
LSTM BasicsLSTM Basics
LSTM Basics
 
Deep learning
Deep learningDeep learning
Deep learning
 
LSTM Tutorial
LSTM TutorialLSTM Tutorial
LSTM Tutorial
 
Deep learning presentation
Deep learning presentationDeep learning presentation
Deep learning presentation
 
Recurrent neural networks rnn
Recurrent neural networks   rnnRecurrent neural networks   rnn
Recurrent neural networks rnn
 
Multilayer perceptron
Multilayer perceptronMultilayer perceptron
Multilayer perceptron
 
Natural Language Processing (NLP)
Natural Language Processing (NLP)Natural Language Processing (NLP)
Natural Language Processing (NLP)
 
Deep learning - A Visual Introduction
Deep learning - A Visual IntroductionDeep learning - A Visual Introduction
Deep learning - A Visual Introduction
 
Natural language processing (nlp)
Natural language processing (nlp)Natural language processing (nlp)
Natural language processing (nlp)
 
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
Convolutional Neural Network - CNN | How CNN Works | Deep Learning Course | S...
 
MACHINE LEARNING - GENETIC ALGORITHM
MACHINE LEARNING - GENETIC ALGORITHMMACHINE LEARNING - GENETIC ALGORITHM
MACHINE LEARNING - GENETIC ALGORITHM
 
Recurrent Neural Network
Recurrent Neural NetworkRecurrent Neural Network
Recurrent Neural Network
 
Convolution Neural Network (CNN)
Convolution Neural Network (CNN)Convolution Neural Network (CNN)
Convolution Neural Network (CNN)
 

Similar a Recurrent Neural Network (RNN) | RNN LSTM Tutorial | Deep Learning Course | Simplilearn

Alberto Massidda - Images and words: mechanics of automated captioning with n...
Alberto Massidda - Images and words: mechanics of automated captioning with n...Alberto Massidda - Images and words: mechanics of automated captioning with n...
Alberto Massidda - Images and words: mechanics of automated captioning with n...Codemotion
 
lepibwp74jd2rz.pdf
lepibwp74jd2rz.pdflepibwp74jd2rz.pdf
lepibwp74jd2rz.pdfSajalTyagi6
 
Deep Learning: Application & Opportunity
Deep Learning: Application & OpportunityDeep Learning: Application & Opportunity
Deep Learning: Application & OpportunityiTrain
 
Introduction to deep learning
Introduction to deep learningIntroduction to deep learning
Introduction to deep learningJunaid Bhat
 
Deep Learning and Watson Studio
Deep Learning and Watson StudioDeep Learning and Watson Studio
Deep Learning and Watson StudioSasha Lazarevic
 
Recurrent Neural Networks for Text Analysis
Recurrent Neural Networks for Text AnalysisRecurrent Neural Networks for Text Analysis
Recurrent Neural Networks for Text Analysisodsc
 
Deep learning from a novice perspective
Deep learning from a novice perspectiveDeep learning from a novice perspective
Deep learning from a novice perspectiveAnirban Santara
 
TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...
TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...
TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...Simplilearn
 
Top 10 deep learning algorithms you should know in
Top 10 deep learning algorithms you should know inTop 10 deep learning algorithms you should know in
Top 10 deep learning algorithms you should know inAmanKumarSingh97
 
Deep Learning Overview Classification Types Examples And Limitations
Deep Learning Overview Classification Types Examples And LimitationsDeep Learning Overview Classification Types Examples And Limitations
Deep Learning Overview Classification Types Examples And LimitationsSlideTeam
 
Introduction to k-Nearest Neighbors and Amazon SageMaker
Introduction to k-Nearest Neighbors and Amazon SageMaker Introduction to k-Nearest Neighbors and Amazon SageMaker
Introduction to k-Nearest Neighbors and Amazon SageMaker Suman Debnath
 
Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)SungminYou
 
CSCE181 Big ideas in NLP
CSCE181 Big ideas in NLPCSCE181 Big ideas in NLP
CSCE181 Big ideas in NLPInsoo Chung
 
Introduction to Neural Networks in Tensorflow
Introduction to Neural Networks in TensorflowIntroduction to Neural Networks in Tensorflow
Introduction to Neural Networks in TensorflowNicholas McClure
 
Deep Learning Tutorial
Deep Learning Tutorial Deep Learning Tutorial
Deep Learning Tutorial Ligeng Zhu
 
Document Analysis with Deep Learning
Document Analysis with Deep LearningDocument Analysis with Deep Learning
Document Analysis with Deep Learningaiaioo
 
Intro to deep learning_ matteo alberti
Intro to deep learning_ matteo albertiIntro to deep learning_ matteo alberti
Intro to deep learning_ matteo albertiDeep Learning Italia
 
DEEPLEARNING recurrent neural networs.pdf
DEEPLEARNING recurrent neural networs.pdfDEEPLEARNING recurrent neural networs.pdf
DEEPLEARNING recurrent neural networs.pdfAamirMaqsood8
 

Similar a Recurrent Neural Network (RNN) | RNN LSTM Tutorial | Deep Learning Course | Simplilearn (20)

Alberto Massidda - Images and words: mechanics of automated captioning with n...
Alberto Massidda - Images and words: mechanics of automated captioning with n...Alberto Massidda - Images and words: mechanics of automated captioning with n...
Alberto Massidda - Images and words: mechanics of automated captioning with n...
 
lepibwp74jd2rz.pdf
lepibwp74jd2rz.pdflepibwp74jd2rz.pdf
lepibwp74jd2rz.pdf
 
Deep Learning: Application & Opportunity
Deep Learning: Application & OpportunityDeep Learning: Application & Opportunity
Deep Learning: Application & Opportunity
 
Introduction to deep learning
Introduction to deep learningIntroduction to deep learning
Introduction to deep learning
 
Deep Learning and Watson Studio
Deep Learning and Watson StudioDeep Learning and Watson Studio
Deep Learning and Watson Studio
 
Recurrent Neural Networks for Text Analysis
Recurrent Neural Networks for Text AnalysisRecurrent Neural Networks for Text Analysis
Recurrent Neural Networks for Text Analysis
 
Deep learning from a novice perspective
Deep learning from a novice perspectiveDeep learning from a novice perspective
Deep learning from a novice perspective
 
TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...
TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...
TensorFlow Tutorial | Deep Learning With TensorFlow | TensorFlow Tutorial For...
 
Top 10 deep learning algorithms you should know in
Top 10 deep learning algorithms you should know inTop 10 deep learning algorithms you should know in
Top 10 deep learning algorithms you should know in
 
Deep Learning Overview Classification Types Examples And Limitations
Deep Learning Overview Classification Types Examples And LimitationsDeep Learning Overview Classification Types Examples And Limitations
Deep Learning Overview Classification Types Examples And Limitations
 
Introduction to k-Nearest Neighbors and Amazon SageMaker
Introduction to k-Nearest Neighbors and Amazon SageMaker Introduction to k-Nearest Neighbors and Amazon SageMaker
Introduction to k-Nearest Neighbors and Amazon SageMaker
 
Cnn
CnnCnn
Cnn
 
Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)Deep learning lecture - part 1 (basics, CNN)
Deep learning lecture - part 1 (basics, CNN)
 
CSCE181 Big ideas in NLP
CSCE181 Big ideas in NLPCSCE181 Big ideas in NLP
CSCE181 Big ideas in NLP
 
Introduction to Neural Networks in Tensorflow
Introduction to Neural Networks in TensorflowIntroduction to Neural Networks in Tensorflow
Introduction to Neural Networks in Tensorflow
 
Rnn & Lstm
Rnn & LstmRnn & Lstm
Rnn & Lstm
 
Deep Learning Tutorial
Deep Learning Tutorial Deep Learning Tutorial
Deep Learning Tutorial
 
Document Analysis with Deep Learning
Document Analysis with Deep LearningDocument Analysis with Deep Learning
Document Analysis with Deep Learning
 
Intro to deep learning_ matteo alberti
Intro to deep learning_ matteo albertiIntro to deep learning_ matteo alberti
Intro to deep learning_ matteo alberti
 
DEEPLEARNING recurrent neural networs.pdf
DEEPLEARNING recurrent neural networs.pdfDEEPLEARNING recurrent neural networs.pdf
DEEPLEARNING recurrent neural networs.pdf
 

Más de Simplilearn

ChatGPT in Cybersecurity
ChatGPT in CybersecurityChatGPT in Cybersecurity
ChatGPT in CybersecuritySimplilearn
 
Whatis SQL Injection.pptx
Whatis SQL Injection.pptxWhatis SQL Injection.pptx
Whatis SQL Injection.pptxSimplilearn
 
Top 5 High Paying Cloud Computing Jobs in 2023
 Top 5 High Paying Cloud Computing Jobs in 2023  Top 5 High Paying Cloud Computing Jobs in 2023
Top 5 High Paying Cloud Computing Jobs in 2023 Simplilearn
 
Types Of Cloud Jobs In 2024
Types Of Cloud Jobs In 2024Types Of Cloud Jobs In 2024
Types Of Cloud Jobs In 2024Simplilearn
 
Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...
Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...
Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...Simplilearn
 
What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...
What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...
What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...Simplilearn
 
Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...
Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...
Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...Simplilearn
 
React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...
React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...
React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...Simplilearn
 
Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...
Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...
Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...Simplilearn
 
How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...
How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...
How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...Simplilearn
 
Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...
Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...
Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...Simplilearn
 
Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...
Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...
Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...Simplilearn
 
Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...
Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...
Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...Simplilearn
 
React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...
React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...
React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...Simplilearn
 
What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...
What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...
What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...Simplilearn
 
How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...
How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...
How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...Simplilearn
 
WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...
WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...
WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...Simplilearn
 
Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...
Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...
Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...Simplilearn
 
How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...
How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...
How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...Simplilearn
 
How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...
How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...
How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...Simplilearn
 

Más de Simplilearn (20)

ChatGPT in Cybersecurity
ChatGPT in CybersecurityChatGPT in Cybersecurity
ChatGPT in Cybersecurity
 
Whatis SQL Injection.pptx
Whatis SQL Injection.pptxWhatis SQL Injection.pptx
Whatis SQL Injection.pptx
 
Top 5 High Paying Cloud Computing Jobs in 2023
 Top 5 High Paying Cloud Computing Jobs in 2023  Top 5 High Paying Cloud Computing Jobs in 2023
Top 5 High Paying Cloud Computing Jobs in 2023
 
Types Of Cloud Jobs In 2024
Types Of Cloud Jobs In 2024Types Of Cloud Jobs In 2024
Types Of Cloud Jobs In 2024
 
Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...
Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...
Top 12 AI Technologies To Learn 2024 | Top AI Technologies in 2024 | AI Trend...
 
What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...
What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...
What is LSTM ?| Long Short Term Memory Explained with Example | Deep Learning...
 
Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...
Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...
Top 10 Chat GPT Use Cases | ChatGPT Applications | ChatGPT Tutorial For Begin...
 
React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...
React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...
React JS Vs Next JS - What's The Difference | Next JS Tutorial For Beginners ...
 
Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...
Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...
Backpropagation in Neural Networks | Back Propagation Algorithm with Examples...
 
How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...
How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...
How to Become a Business Analyst ?| Roadmap to Become Business Analyst | Simp...
 
Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...
Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...
Career Opportunities In Artificial Intelligence 2023 | AI Job Opportunities |...
 
Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...
Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...
Programming for Beginners | How to Start Coding in 2023? | Introduction to Pr...
 
Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...
Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...
Best IDE for Programming in 2023 | Top 8 Programming IDE You Should Know | Si...
 
React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...
React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...
React 18 Overview | React 18 New Features and Changes | React 18 Tutorial 202...
 
What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...
What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...
What Is Next JS ? | Introduction to Next JS | Basics of Next JS | Next JS Tut...
 
How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...
How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...
How To Become an SEO Expert In 2023 | SEO Expert Tutorial | SEO For Beginners...
 
WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...
WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...
WordPress Tutorial for Beginners 2023 | What Is WordPress and How Does It Wor...
 
Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...
Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...
Blogging For Beginners 2023 | How To Create A Blog | Blogging Tutorial | Simp...
 
How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...
How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...
How To Start A Blog In 2023 | Pros And Cons Of Blogging | Blogging Tutorial |...
 
How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...
How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...
How to Increase Website Traffic ? | 10 Ways To Increase Website Traffic in 20...
 

Último

Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)lakshayb543
 
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQ-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQuiz Club NITW
 
Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfPatidar M
 
How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17Celine George
 
Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1GloryAnnCastre1
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Seán Kennedy
 
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...DhatriParmar
 
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxMan or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxDhatriParmar
 
week 1 cookery 8 fourth - quarter .pptx
week 1 cookery 8  fourth  -  quarter .pptxweek 1 cookery 8  fourth  -  quarter .pptx
week 1 cookery 8 fourth - quarter .pptxJonalynLegaspi2
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvRicaMaeCastro1
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptxmary850239
 
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptxDecoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptxDhatriParmar
 
Measures of Position DECILES for ungrouped data
Measures of Position DECILES for ungrouped dataMeasures of Position DECILES for ungrouped data
Measures of Position DECILES for ungrouped dataBabyAnnMotar
 
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...DhatriParmar
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxHumphrey A Beña
 
4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptx4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptxmary850239
 
How to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseHow to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseCeline George
 
Using Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea DevelopmentUsing Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea Developmentchesterberbo7
 
Expanded definition: technical and operational
Expanded definition: technical and operationalExpanded definition: technical and operational
Expanded definition: technical and operationalssuser3e220a
 

Último (20)

Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
Visit to a blind student's school🧑‍🦯🧑‍🦯(community medicine)
 
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITWQ-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
Q-Factor HISPOL Quiz-6th April 2024, Quiz Club NITW
 
Active Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdfActive Learning Strategies (in short ALS).pdf
Active Learning Strategies (in short ALS).pdf
 
How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17How to Fix XML SyntaxError in Odoo the 17
How to Fix XML SyntaxError in Odoo the 17
 
Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1Reading and Writing Skills 11 quarter 4 melc 1
Reading and Writing Skills 11 quarter 4 melc 1
 
Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...Student Profile Sample - We help schools to connect the data they have, with ...
Student Profile Sample - We help schools to connect the data they have, with ...
 
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
Beauty Amidst the Bytes_ Unearthing Unexpected Advantages of the Digital Wast...
 
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptxMan or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
Man or Manufactured_ Redefining Humanity Through Biopunk Narratives.pptx
 
Paradigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTAParadigm shift in nursing research by RS MEHTA
Paradigm shift in nursing research by RS MEHTA
 
week 1 cookery 8 fourth - quarter .pptx
week 1 cookery 8  fourth  -  quarter .pptxweek 1 cookery 8  fourth  -  quarter .pptx
week 1 cookery 8 fourth - quarter .pptx
 
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnvESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
ESP 4-EDITED.pdfmmcncncncmcmmnmnmncnmncmnnjvnnv
 
4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx4.16.24 Poverty and Precarity--Desmond.pptx
4.16.24 Poverty and Precarity--Desmond.pptx
 
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptxDecoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
Decoding the Tweet _ Practical Criticism in the Age of Hashtag.pptx
 
Measures of Position DECILES for ungrouped data
Measures of Position DECILES for ungrouped dataMeasures of Position DECILES for ungrouped data
Measures of Position DECILES for ungrouped data
 
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
Blowin' in the Wind of Caste_ Bob Dylan's Song as a Catalyst for Social Justi...
 
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptxINTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
INTRODUCTION TO CATHOLIC CHRISTOLOGY.pptx
 
4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptx4.11.24 Mass Incarceration and the New Jim Crow.pptx
4.11.24 Mass Incarceration and the New Jim Crow.pptx
 
How to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 DatabaseHow to Make a Duplicate of Your Odoo 17 Database
How to Make a Duplicate of Your Odoo 17 Database
 
Using Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea DevelopmentUsing Grammatical Signals Suitable to Patterns of Idea Development
Using Grammatical Signals Suitable to Patterns of Idea Development
 
Expanded definition: technical and operational
Expanded definition: technical and operationalExpanded definition: technical and operational
Expanded definition: technical and operational
 

Recurrent Neural Network (RNN) | RNN LSTM Tutorial | Deep Learning Course | Simplilearn

  • 2. What’s in it for you? RNN What is a Neural Network? Why Recurrent Neural Network? Popular Neural Networks How does a RNN work? Vanishing and Exploding Gradient Problem Long Short Term Memory (LSTM) Use case implementation of LSTM What is a Recurrent Neural Network?
  • 3. Introduction to RNN Do you know how Google’s autocomplete feature predicts the rest of the words a user is typing? h y x A B C h y x A B C h y x A B C Fed to a Recurrent Neural Network What is the best food to eat in Las Vegas Google search Autocompletes the search
  • 4. Introduction to RNN Do you know how Google’s autocomplete feature predicts the rest of the words a user is typing? h y x A B C h y x A B C h y x A B C Fed to a Recurrent Neural Network What is the best food to eat in Las Vegas Google search Autocompletes the search Collection of large volumes of most frequently occurring consecutive words
  • 5. Introduction to RNN Do you know how Google’s autocomplete feature predicts the rest of the words a user is typing? Collection of large volumes of most frequently occurring consecutive words h y x A B C h y x A B C h y x A B C Fed to a Recurrent Neural Network What is the best food to eat in Las Vegas Google search Autocompletes the search Analyses the data by finding the sequence of words occurring frequently and builds a model to predict the next word in the sentence
  • 6. Introduction to RNN Do you know how Google’s autocomplete feature predicts the rest of the words a user is typing? Collection of large volumes of most frequently occurring consecutive words h y x A B C h y x A B C h y x A B C Fed to a Recurrent Neural Network What is the best food to eat in Las Vegas Google search Autocompletes the search
  • 7. Introduction to RNN Do you know how Google’s autocomplete feature predicts the rest of the words a user is typing? Collection of large volumes of most frequently occurring consecutive words h y x A B C h y x A B C h y x A B C Fed to a Recurrent Neural Network What is the best food to eat in Las Vegas Google search Autocompletes the search
  • 8. What is a Neural Network? Neural Networks used in Deep Learning, consists of different layers connected to each other and work on the structure and functions of a human brain. It learns from huge volumes of data and uses complex algorithms to train a neural net. Input Layer Hidden Layers Output Layer German Shepherd Labrador Image pixels of 2 different breeds of dog Identifies the dogs breed
  • 9. What is a Neural Network? Neural Networks used in Deep Learning, consists of different layers connected to each other and work on the structure and functions of a human brain. It learns from huge volumes of data and uses complex algorithms to train a neural net. Hidden Layers Output Layer Input Layer German Shepherd Labrador Image pixels of 2 different breeds of dog Identifies the dogs breed Such networks do not require memorizing the past output
  • 10. Popular Neural Networks Feed Forward Neural Network Used in general Regression and Classification problems Convolution Neural Network Used for Image Recognition Deep Neural Network Used for Acoustic Modeling Deep Belief Network Used for Cancer Detection Recurrent Neural Network Used for Speech Recognition
  • 11. Feed Forward Neural Network Input Layers Hidden Layers Output Layer x y^ input Predicted output Simplified presentation In a Feed-Forward Network , information flows only in forward direction, from the input nodes, through the hidden layers (if any) and to the output nodes. There are no cycles or loops in the network. Input Layer Hidden Layers Output Layer • Decisions are based on current input • No memory about the past • No future scope
  • 12. Why Recurrent Neural Network? Issues in a Feed Forward Neural Network: 01 cannot handle sequential data 02 cannot memorize previous inputs 03 considers only the current input
  • 13. Why Recurrent Neural Network? A Recurrent Neural Network solves these issues: 01 can handle sequential data 02 can memorize previous inputs due to its internal memory 03 considers the current input as well as the previously received inputs
  • 14. Applications of RNN Image captioning An RNN can be used to caption an image by analyzing the activities present in it, e.g. "A dog catching a ball in mid-air"
  • 15. Applications of RNN Time series prediction Any time series problem, like predicting the prices of stocks in a particular month, can be solved using an RNN
  • 16. Applications of RNN Natural Language Processing Text mining and sentiment analysis can be carried out using an RNN for Natural Language Processing. "When it rains, look for rainbows. When it's dark, look for stars." is classified as a Positive Sentiment
  • 17. Applications of RNN Machine Translation Given an input in one language, an RNN can be used to translate it into different languages as output. Here the person is speaking in English and it is translated into Chinese, Italian, French, German and Spanish
  • 18. What is a Recurrent Neural Network? A Recurrent Neural Network works on the principle of saving the output of a layer and feeding it back to the input in order to predict the output of the layer. Input Layer (x) Hidden Layers (h) Output Layer (y)
  • 19. What does an RNN look like? A, B and C are the parameters of the network
  • 20. What does an RNN look like?
  • 21. How does an RNN work? h(t) = f_C(h(t-1), x(t)), where h(t) = new state, h(t-1) = old state, x(t) = input vector at time step t, and f_C = the function with parameter C. The same parameters A, B and C are applied at every time step: x(t-1), x(t), x(t+1) are the inputs, h(t-1), h(t), h(t+1) the hidden states, and y(t-1), y(t), y(t+1) the outputs.
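For concreteness, here is a minimal NumPy sketch of a single recurrent step, assuming a simple tanh cell; the weight names W_xh, W_hh and W_hy are illustrative stand-ins for the parameters A, B and C above.

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, W_hy, b_h, b_y):
    # New hidden state combines the old state h(t-1) with the current input x(t)
    h_t = np.tanh(W_hh @ h_prev + W_xh @ x_t + b_h)
    # The output y(t) is a simple linear readout of the new hidden state
    y_t = W_hy @ h_t + b_y
    return h_t, y_t
```

Running this function in a loop over the time steps, reusing the same weights at every step, is exactly the weight sharing that distinguishes an RNN from a plain feed-forward network.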
  • 22. Types of Recurrent Neural Network one to one: single input, single output. A one-to-one network is known as a Vanilla Neural Network and is used for regular machine learning problems.
  • 23. Types of Recurrent Neural Network one to many: single input, multiple outputs. A one-to-many network generates a sequence of outputs. Example: image captioning.
  • 24. Types of Recurrent Neural Network many to one: multiple inputs, single output. A many-to-one network takes in a sequence of inputs. Example: sentiment analysis, where a given sentence is classified as expressing a positive or negative sentiment.
  • 25. Types of Recurrent Neural Network many to many: multiple inputs, multiple outputs. A many-to-many network takes in a sequence of inputs and generates a sequence of outputs. Example: machine translation.
  • 26. Vanishing Gradient Problem While training an RNN, the gradient (slope) can become either too small or too large, and this makes training difficult. When the gradient is too small, the problem is known as the vanishing gradient. (Diagram: slope = change in Y / change in X; errors E0…E3 are backpropagated through states s0…s3, losing information through time.)
  • 27. Exploding Gradient Problem When the gradient tends to grow exponentially instead of decaying, the problem is called the exploding gradient. Issues caused by gradient problems: • Long training time • Poor performance • Bad accuracy (Diagram: errors E0…E3 backpropagated through states s0…s3, losing information through time.)
  • 28. Explaining the Gradient Problem Consider the following 2 examples to understand what the next word in the sequence should be: The person who took my bike and …………., ____ a thief. (was) The students who got into Engineering with …………., ____ from Asia. (were) (Diagram: unrolled RNN with states s0…st, inputs x1…xt and outputs y1…yt.)
  • 29. Explaining the Gradient Problem Consider the following 2 examples: The person who took my bike and …………., was a thief. The students who got into Engineering with …………., were from Asia. In order to predict the next word in the sequence, the RNN must memorize the previous context: whether the subject was a singular noun or a plural noun. (Diagram: unrolled RNN with states s0…st.)
  • 30. Explaining the Gradient Problem Consider the following 2 examples: The person who took my bike and …………., was a thief. The students who got into Engineering with …………., were from Asia. In order to predict the next word in the sequence, the RNN must memorize the previous context: whether the subject was a singular noun or a plural noun. It can sometimes be difficult for the error to backpropagate all the way to the beginning of the sequence to predict what the output should be. (Diagram: unrolled RNN with states s0…st.)
  • 31. Solutions to the Gradient Problem Exploding Gradient: 1 Identity Initialization 2 Truncated Backpropagation 3 Gradient Clipping. Vanishing Gradient: 1 Weight Initialization 2 Choosing the right Activation Function 3 Long Short-Term Memory Networks (LSTMs).
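As a concrete illustration of gradient clipping, here is a minimal Keras sketch, assuming a TensorFlow backend; the learning rate and clip value are illustrative choices, not values from the slides.

```python
from tensorflow.keras.optimizers import SGD

# Clip every gradient value into [-1.0, 1.0] so the parameter updates
# stay bounded even when the raw gradients start to explode.
optimizer = SGD(learning_rate=0.01, clipvalue=1.0)

# The clipped optimizer is then passed to model.compile(), e.g.:
# model.compile(optimizer=optimizer, loss='mean_squared_error')
```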
  • 32. Long-Term Dependencies Suppose we try to predict the last word in the text "The clouds are in the ___". Here we do not need any further context; it's pretty clear the last word is going to be "sky".
  • 33. Long-Term Dependencies Suppose we try to predict the last word in the text "I have been staying in Spain for the last 10 years… I can speak fluent ______." The word we predict (Spanish) will depend on words that appeared much earlier in the context. • Here we need the context of Spain to predict the last word in the text. • It is possible for the gap between the relevant information and the point where it is needed to become very large. • LSTMs help us solve this problem.
  • 34. Long Short-Term Memory Networks LSTMs are a special kind of Recurrent Neural Network, capable of learning long-term dependencies. Remembering information for long periods of time is their default behavior. All recurrent neural networks have the form of a chain of repeating modules of neural network. In standard RNNs, this repeating module has a very simple structure, such as a single tanh layer. (Diagram: chain of modules A over inputs xt-1, xt, xt+1 producing ht-1, ht, ht+1, each module containing one tanh layer.)
  • 35. Long Short-Term Memory Networks LSTMs are a special kind of Recurrent Neural Network, capable of learning long-term dependencies. Remembering information for long periods of time is their default behavior. LSTMs also have a chain-like structure, but the repeating module is different: instead of a single neural network layer, there are four interacting layers communicating in a very special way. (Diagram: chain of LSTM modules with sigmoid and tanh layers and elementwise × and + operations.)
  • 36. Long Short-Term Memory Networks The 3-step process of LSTMs (taking the cell state from Ct-1 to Ct): Step 1 Forget irrelevant parts of the previous state. Step 2 Selectively update the cell state values. Step 3 Output certain parts of the cell state.
  • 39. Working of LSTMs Step 1 decides how much of the past the cell should remember. The first step in the LSTM is to decide which information should be omitted from the cell in that particular time step. This is decided by a sigmoid function: it looks at the previous state ht-1 and the current input xt and computes the forget gate ft = σ(Wf · [ht-1, xt] + bf), which decides which information from the previous time step to delete because it is no longer important.
  • 40. Working of LSTMs Consider an LSTM fed with the following inputs from the previous and the present time step. Previous output ht-1: "Alice is good in Physics. John on the other hand is good in Chemistry." Current input xt: "John plays football well. He told me yesterday over phone that he had served as the captain of his college football team." The forget gate realizes there might be a change in context after encountering the first full stop and compares it with the current input sentence at xt. The next sentence talks about John, so the information on Alice is deleted; the position of the subject is vacated and assigned to John.
  • 45. Working of LSTMs Step 2 decides how much this unit should add to the current state. The second layer has 2 parts: a sigmoid function and a tanh function. The sigmoid decides which values to let through (0 or 1); this is the input gate it = σ(Wi · [ht-1, xt] + bi), which determines which information to let through based on its significance in the current time step. The tanh gives weightage to the values that are passed, deciding their level of importance (-1 to 1), producing the candidate values C̃t = tanh(WC · [ht-1, xt] + bC). Together with the forget gate, they update the cell state: Ct = ft * Ct-1 + it * C̃t.
  • 46. Working of LSTMs Consider the current input at xt: "John plays football well. He told me yesterday over phone that he had served as the captain of his college football team." The input gate analyses the important information.
  • 47. Working of LSTMs "John plays football" and "he had served as the captain of his college football team" are important.
  • 48. Working of LSTMs "He told me yesterday over phone" is less important, hence it is forgotten.
  • 49. Working of LSTMs This process of adding some new information is done via the input gate.
  • 50. Working of LSTMs Step 3 decides what part of the current cell state makes it to the output. The third step is to decide what our output will be. First, we run a sigmoid layer which decides what parts of the cell state make it to the output; this is the output gate ot = σ(Wo · [ht-1, xt] + bo), which allows the passed-in information to impact the output in the current time step. Then we put the cell state through tanh to push the values to be between -1 and 1 and multiply it by the output of the sigmoid gate: ht = ot * tanh(Ct).
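Putting the three steps together, here is a minimal NumPy sketch of one LSTM time step. It assumes a single stacked weight matrix W (of shape 4*hidden by hidden+input) mapping [ht-1, xt] to the four internal layers; the variable names are illustrative, not taken from the slides.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, b):
    # One affine map produces all four internal layers at once
    z = W @ np.concatenate([h_prev, x_t]) + b
    f_t, i_t, o_t, g_t = np.split(z, 4)
    f_t = sigmoid(f_t)   # forget gate: what to drop from the old cell state (step 1)
    i_t = sigmoid(i_t)   # input gate: which new values to let through (step 2)
    o_t = sigmoid(o_t)   # output gate: what part of the cell state to expose (step 3)
    g_t = np.tanh(g_t)   # candidate values, scaled to (-1, 1)
    c_t = f_t * c_prev + i_t * g_t   # forget, then selectively update the cell state
    h_t = o_t * np.tanh(c_t)         # output certain parts of the cell state
    return h_t, c_t
```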
  • 51. Working of LSTMs Let's consider this example of predicting the next word in the sentence: "John played tremendously well against the opponent and won for his team. For his contributions, brave ____ was awarded player of the match." There could be a lot of choices for the empty space.
  • 52. Working of LSTMs The current input "brave" is an adjective.
  • 53. Working of LSTMs Adjectives describe a noun.
  • 54. Working of LSTMs "John" could be the best output after "brave".
  • 55. Use case implementation of LSTM Let's predict stock prices using an LSTM network: based on the stock price data from 2012 to 2016, predict the stock prices of 2017.
  • 56. Use case implementation of LSTM 1. Import the Libraries 2. Import the training dataset 3. Feature Scaling
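The code for these steps lives in the slide images and does not survive in the transcript, so here is a minimal sketch of steps 1 to 3. It assumes a daily stock price CSV with an "Open" column; the file name is hypothetical.

```python
# Steps 1-3: import the libraries, load the 2012-2016 training data, scale it
import numpy as np
import pandas as pd
from sklearn.preprocessing import MinMaxScaler

dataset_train = pd.read_csv('stock_price_train.csv')   # hypothetical file name
training_set = dataset_train[['Open']].values           # use the opening price

# Feature scaling: squash prices into [0, 1] so the LSTM trains smoothly
sc = MinMaxScaler(feature_range=(0, 1))
training_set_scaled = sc.fit_transform(training_set)
```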
  • 57. Use case implementation of LSTM 4. Create a data structure with 60 timesteps and 1 output 5. Import keras libraries and packages
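Continuing the same sketch, steps 4 and 5 window the scaled series into 60-step input sequences and import the Keras building blocks (assuming the tf.keras API).

```python
# Step 4: for each day, use the previous 60 scaled prices as input (X)
# and that day's scaled price as the target (y)
X_train, y_train = [], []
for i in range(60, len(training_set_scaled)):
    X_train.append(training_set_scaled[i - 60:i, 0])
    y_train.append(training_set_scaled[i, 0])
X_train, y_train = np.array(X_train), np.array(y_train)

# Reshape to (samples, timesteps, features), the shape Keras LSTM layers expect
X_train = np.reshape(X_train, (X_train.shape[0], X_train.shape[1], 1))

# Step 5: import the Keras libraries and packages
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, LSTM, Dropout
```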
  • 58. Use case implementation of LSTM 6. Initialize the RNN 7. Adding the LSTM layers and some Dropout regularization
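A possible sketch for steps 6 and 7, continuing from above; the number of stacked layers, the 50 units per layer and the 20% dropout rate are illustrative choices rather than values recoverable from the transcript.

```python
# Step 6: initialize the RNN as a sequential stack of layers
regressor = Sequential()

# Step 7: stacked LSTM layers with Dropout regularization in between
regressor.add(LSTM(units=50, return_sequences=True,
                   input_shape=(X_train.shape[1], 1)))
regressor.add(Dropout(0.2))
regressor.add(LSTM(units=50, return_sequences=True))
regressor.add(Dropout(0.2))
regressor.add(LSTM(units=50, return_sequences=True))
regressor.add(Dropout(0.2))
regressor.add(LSTM(units=50))   # final LSTM layer returns only the last state
regressor.add(Dropout(0.2))
```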
  • 59. Use case implementation of LSTM 8. Adding the output layer 9. Compile the RNN 10. Fit the RNN to the training set
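Steps 8 to 10 in the same sketch; the optimizer, epoch count and batch size are illustrative.

```python
# Step 8: a single output neuron for the predicted (scaled) price
regressor.add(Dense(units=1))

# Step 9: compile with a standard regression loss
regressor.compile(optimizer='adam', loss='mean_squared_error')

# Step 10: fit the RNN to the 2012-2016 training sequences
regressor.fit(X_train, y_train, epochs=100, batch_size=32)
```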
  • 60. Use case implementation of LSTM 11. Load the stock price test data for 2017 12. Get the predicted stock price of 2017
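Steps 11 and 12, again as a hedged sketch; the test file name is hypothetical, and the 60-day window mirrors the training setup.

```python
# Step 11: load the real 2017 prices for comparison
dataset_test = pd.read_csv('stock_price_test.csv')      # hypothetical file name
real_stock_price = dataset_test[['Open']].values

# Step 12: for each 2017 day, feed the previous 60 known prices to the model
dataset_total = pd.concat((dataset_train['Open'], dataset_test['Open']), axis=0)
inputs = dataset_total.iloc[len(dataset_total) - len(dataset_test) - 60:].values
inputs = sc.transform(inputs.reshape(-1, 1))   # reuse the scaler fitted on training data

X_test = []
for i in range(60, 60 + len(dataset_test)):
    X_test.append(inputs[i - 60:i, 0])
X_test = np.array(X_test)
X_test = np.reshape(X_test, (X_test.shape[0], X_test.shape[1], 1))

predicted_stock_price = regressor.predict(X_test)
predicted_stock_price = sc.inverse_transform(predicted_stock_price)  # back to price scale
```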
  • 61. Use case implementation of LSTM 13. Visualize the results of predicted and real stock price
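Finally, a sketch for step 13, plotting the real 2017 prices against the predictions; colours and labels are arbitrary.

```python
# Step 13: visualize real vs. predicted 2017 stock prices
import matplotlib.pyplot as plt

plt.plot(real_stock_price, color='red', label='Real Stock Price')
plt.plot(predicted_stock_price, color='blue', label='Predicted Stock Price')
plt.title('Stock Price Prediction')
plt.xlabel('Time')
plt.ylabel('Stock Price')
plt.legend()
plt.show()
```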
