Deep Learning – Introduction to Recurrent Neural Networks

Recurrent Neural Networks – The main use of RNNs shows up when you type into Google or Facebook: these interfaces can predict the next word you are about to type. RNNs have loops that allow information to persist, which makes them well suited to modeling sequence data. Architecturally, a recurrent neural network can be seen as a linear-chain variant of a recursive network.
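The "loop" is just a hidden state that is fed back into the network at every step. Here is a minimal NumPy sketch of that recurrence; the sizes and random weights are illustrative assumptions, not anything from a real system:

```python
import numpy as np

# Toy sizes, chosen only for illustration.
input_size, hidden_size = 4, 3
rng = np.random.default_rng(0)

W_xh = rng.normal(size=(hidden_size, input_size)) * 0.1   # input -> hidden
W_hh = rng.normal(size=(hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the loop)
b_h = np.zeros(hidden_size)

h = np.zeros(hidden_size)  # hidden state: the "memory" that persists
for x in rng.normal(size=(5, input_size)):   # 5 time steps of input
    # Each new state depends on the current input AND the previous state,
    # so information from earlier steps can persist.
    h = np.tanh(W_xh @ x + W_hh @ h + b_h)

print(h.shape)  # (3,)
```

The `W_hh @ h` term is what distinguishes an RNN from an ordinary feed-forward layer: it carries information forward through time.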

This post is a high-level overview meant to build a basic understanding. If you are a PhD or master's student, don't expect too much; we focus only on the intuition behind RNNs. The goal is to give you enough comfort to start digging deeper into RNNs.


Recurrent Neural Networks – Introduction

Recurrent neural networks are not a new idea; they were developed in the 1980s.

  • RNNs take input as a time series and produce output as a time series,
  • They have at least one connection cycle.
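These two points can be sketched together: the connection cycle is the hidden state feeding back into itself, and an output can be emitted at every time step, so a time series goes in and a time series comes out. Everything below (sizes, weights, sequence length) is a made-up toy example:

```python
import numpy as np

input_size, hidden_size, output_size = 4, 3, 2
rng = np.random.default_rng(1)

W_xh = rng.normal(size=(hidden_size, input_size)) * 0.1
W_hh = rng.normal(size=(hidden_size, hidden_size)) * 0.1  # the connection cycle
W_hy = rng.normal(size=(output_size, hidden_size)) * 0.1

h = np.zeros(hidden_size)
xs = rng.normal(size=(6, input_size))  # input time series: 6 steps
ys = []
for x in xs:
    h = np.tanh(W_xh @ x + W_hh @ h)   # hidden state feeds back into itself
    ys.append(W_hy @ h)                # emit an output at every step
ys = np.stack(ys)

print(ys.shape)  # (6, 2) -- the output is also a time series
```

One output per input step is just one wiring; in practice the same recurrence also supports many-to-one setups (e.g. reading a whole sentence and emitting a single label at the end).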

One of the most distinctive properties of RNNs is the Universal Approximation Property (UAP): they can approximate virtually any dynamical system. This property is what tempts people to say there is something magical about recurrent neural networks.