
Recurrent Neural Networks (RNNs) are among the most widely used neural networks in Natural Language Processing because of the strong results they deliver. An RNN is a type of neural network in which the output from the previous step is fed as input to the current step.
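The loop below is a minimal sketch of that recurrence in plain NumPy (the sizes, weights, and data are made up for illustration, and NumPy itself is an assumption): at every time step the hidden state produced at the previous step is fed back in alongside the current input.

import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 8, 16, 5                   # illustrative sizes
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1   # input-to-hidden weights
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden-to-hidden (recurrent) weights
b_h = np.zeros(hidden_size)

x_seq = rng.standard_normal((seq_len, input_size))   # a toy input sequence
h = np.zeros(hidden_size)                            # initial hidden state

for x_t in x_seq:
    # h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b): the previous step's output re-enters as input
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)

print(h.shape)  # (16,) - the state summarizing everything seen so far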

PRIVATE

Course Access: Unlimited Duration
Last Updated: July 29, 2021
Students Enrolled: 20

A Recurrent Neural Network is a type of neural network that contains loops, allowing information to persist within the network. An RNN uses what it has learned from earlier inputs to inform how it handles upcoming ones, and it can be thought of as a series of copies of the same network linked together, each passing its state on to the next. Because it carries this memory of previous inputs through time, an RNN is well suited to time-series prediction. Training an RNN is a difficult task, however, and a simple RNN cannot process very long sequences. A common application of Recurrent Neural Networks is machine translation: for example, a network may take an input sentence in French and translate it into English. The network determines the probability of each word in the output sentence based on the input sentence and the output sequence generated so far.
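As a rough illustration of that last point, the sketch below (TensorFlow/Keras, the vocabulary size, and the layer sizes are all assumptions, not the course's code) builds a small recurrent model whose softmax layer assigns a probability to every word in the vocabulary at each step, conditioned on the words processed so far. A full French-to-English translator would pair two such networks in an encoder-decoder arrangement.

import numpy as np
from tensorflow.keras import layers, models

vocab_size, embed_dim, hidden_units, seq_len = 5000, 64, 128, 20  # illustrative sizes

model = models.Sequential([
    layers.Embedding(vocab_size, embed_dim),                # map word ids to vectors
    layers.SimpleRNN(hidden_units, return_sequences=True),  # carries a hidden state across time steps
    layers.Dense(vocab_size, activation="softmax"),         # P(word at step t | words seen so far)
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Toy data: predict the next word id at every position of a random sequence.
x = np.random.randint(0, vocab_size, size=(32, seq_len))
y = np.random.randint(0, vocab_size, size=(32, seq_len))
model.fit(x, y, epochs=1, verbose=0)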

Course Curriculum

    • A Simple Perceptron
    • Neural Network Overview and Its Use Cases
    • Overview of Various Neural Network Architectures
    • Multilayer Networks
    • Loss Functions
    • The Learning Mechanism
    • Optimizers
    • Forward and Backward Propagation
    • Gradient Descent
    • Labs
    • Introduction to Recurrent Neural Networks
    • RNN Use Cases
    • RNN Architecture
    • Types of RNN Architectures
    • Why Use RNNs and Their Applications
    • RNN Forward Propagation
    • RNN Backward Propagation
    • Problems with Simple RNNs
    • Understanding LSTM Networks
    • LSTM Networks
    • Core Idea Behind LSTMs
    • Step-by-Step LSTM Walk-Through
    • Labs
    • What Are Exploding Gradients
    • How to Know Whether a Model Is Suffering from Exploding Gradients
    • How LSTMs Solve the Exploding Gradient Problem
    • What Is the Vanishing Gradient Problem
    • How to Solve the Vanishing Gradient Problem
    • Import the Necessary Packages (a sketch of this lab workflow appears after the curriculum)
    • Import the Data Set
    • Data Set Preprocessing
    • Create a Model Object
    • Train/Test Split
    • Train the Model
    • Regularizing the Model with Dropout
    • Validating the Model
    • Save the Model for Final Prediction
    • Introduction to GRU
    • Architecture of GRU
    • Advantages of GRU
    • Practical Implementation of GRU
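
The lab items above (import the packages, load and preprocess the data set, split, create the model, train, regularize with Dropout, validate, save) map onto a workflow like the following sketch. It is not the course notebook: the IMDB sentiment data set, TensorFlow/Keras, the layer sizes, and the epoch count are stand-ins chosen for illustration.

import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Import the data set (IMDB reviews ship already split into train and test sets)
vocab_size, max_len = 10000, 200
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=vocab_size)

# Data set preprocessing: pad every review to a fixed length
x_train = pad_sequences(x_train, maxlen=max_len)
x_test = pad_sequences(x_test, maxlen=max_len)

# Create a model object: the LSTM layer keeps long-range context, Dropout regularizes it
model = models.Sequential([
    layers.Embedding(vocab_size, 64),
    layers.LSTM(64),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Train the model, holding out part of the training data for validation
model.fit(x_train, y_train, epochs=2, batch_size=128, validation_split=0.2)

# Validate the model on the test split
loss, acc = model.evaluate(x_test, y_test, verbose=0)
print(f"test accuracy: {acc:.3f}")

# Save the model for final prediction
model.save("rnn_sentiment_model.keras")

If exploding gradients show up during training, gradient clipping (for example, passing clipnorm to the Keras optimizer) is the usual first remedy, alongside the LSTM/GRU architectures covered in the curriculum.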

Course Reviews

Average Rating: 4

1937 Students

About Instructor

Course Events


More Courses by Instructor

© 2021 Ernesto.  All rights reserved.  