Part I: Basics of Recurrent Neural Networks

kimleang

The term Recurrent Neural Network is a combination of the words “Recurrent” and “Neural Network”. Before we go deep into the concept of RNNs, I first want you to understand these:

  • What does Recurrent mean? In simple terms, something that happens again and again over a period of time. In terms of data, this refers to sequential or time-series data.
  • What does Neural Network mean? It is an interconnection of nodes inspired by the human brain.

What is a Recurrent Neural Network?

If we combine the meanings above, it describes something that works repeatedly through a connection of nodes and then produces a result. More formally, a Recurrent Neural Network (RNN) is a deep learning model that is usually used to process sequence data, where the computation at a given state depends on the output of the previous state.

RNN Architecture
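To make the recurrence concrete, here is a minimal sketch of a single RNN step in NumPy. The sizes, the weight names, and the rnn_step helper are made up purely for illustration; the point is only that the new hidden state is computed from the current input and the previous hidden state, with the same weights reused at every timestep.

```python
import numpy as np

# Hypothetical sizes chosen only for illustration.
input_size, hidden_size, seq_len = 3, 4, 5

rng = np.random.default_rng(0)
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))  # hidden -> hidden (the recurrence)
b_h = np.zeros(hidden_size)

def rnn_step(x_t, h_prev):
    """One RNN step: the new state depends on the current input and the previous state."""
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)

h = np.zeros(hidden_size)                     # initial state
xs = rng.normal(size=(seq_len, input_size))   # a toy input sequence
for x_t in xs:
    h = rnn_step(x_t, h)                      # the same weights are reused at every timestep
print(h.shape)  # (4,)
```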

Types of RNNs:

  • One-to-one: One input produces one output at the same timestep. It is usually used for classification, for example, text classification or image classification.
one-to-one architecture
  • One-to-many: One input at time x produces an output at time x, and that output then becomes the input of the next state. It is used, for example, for text generation.
one-to-many architecture
  • Many-to-one: A sequence of inputs over a period of time produces one output at the final state. Commonly used for sentiment analysis (see the sketch after this list).
many-to-one architecture
  • Many-to-many: An input at time x produces an output at time x; both input and output are sequences of the same length. It is used, for example, for named-entity recognition.
many-to-many architecture
  • Many-to-many: An input sequence of length x produces an output sequence of a different length y. It is used, for example, for machine translation and text summarization.
many-to-many architecture
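As a rough sketch of how many-to-one and same-length many-to-many differ, the toy code below runs the same recurrence over a sequence and then reads out either only the final hidden state or every hidden state. All dimensions and names here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
input_size, hidden_size, output_size, seq_len = 3, 4, 2, 6

W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
W_hy = rng.normal(scale=0.1, size=(output_size, hidden_size))

def run_rnn(xs):
    """Run the recurrence and return the hidden state at every timestep."""
    h, states = np.zeros(hidden_size), []
    for x_t in xs:
        h = np.tanh(W_xh @ x_t + W_hh @ h)
        states.append(h)
    return np.stack(states)

xs = rng.normal(size=(seq_len, input_size))
states = run_rnn(xs)

# Many-to-one (e.g. sentiment analysis): read out only the final state.
y_last = W_hy @ states[-1]        # shape (output_size,)

# Many-to-many, same length (e.g. named-entity recognition): read out every state.
y_all = states @ W_hy.T           # shape (seq_len, output_size)
print(y_last.shape, y_all.shape)
```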

Benefits & Drawbacks of RNNs:

Benefits:

  • can process sequence data
  • fixed model size (it does not grow with the input length)
  • can use the current input and the previous result to compute the new result
  • the weights are shared across time

Drawbacks:

  • computation is slow, because timesteps must be processed one after another
  • cannot take future inputs into account when computing the current state
  • unable to remember information from long sequences (the vanishing-gradient problem) when using tanh or ReLU as the activation function; a small numerical sketch follows this list
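Here is a tiny numerical sketch of why long sequences are hard. The recurrent weight of 0.5, the starting state, and the sequence length of 100 are assumed values just for illustration: multiplying one chain-rule factor per tanh step drives the gradient contribution from the first step toward zero.

```python
import math

# Toy illustration of the vanishing-gradient problem with tanh.
w, h, grad = 0.5, 0.7, 1.0      # assumed recurrent weight and initial state
for t in range(100):            # pretend sequence of length 100
    z = w * h                   # pre-activation at this step
    h = math.tanh(z)            # new hidden state
    grad *= w * (1.0 - h ** 2)  # chain rule: d h_t / d h_{t-1} = w * tanh'(z), each factor < 1
print(grad)  # roughly 1e-30: the signal from the first step has effectively vanished
```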

Bonus:

What is sequential data? Time series data?

  • Sequential Data is discrete data in which the order matters.
example of sequential data
  • Time Series Data can be discrete or continuous data observed over a period of time, and the order also matters.
Example of time series data over the time

If there is anything wrong, or you have something you would like to add, please don’t hesitate to message me 😊.

The next story will be released soon, so don’t forget to follow me 🤭

Thanks for reading 🙏
