Sigmoid and Tanh Activations

In this chapter, we explore the Sigmoid and Tanh activation functions, which are central to how RNNs work. These nonlinearities transform a unit's inputs into bounded outputs, enabling the model to learn and make predictions.

  • Sigmoid Activation: the sigmoid function, σ(x) = 1 / (1 + e^(-x)), maps input values to an output range between 0 and 1. It is commonly used in binary classification tasks, since its output can be interpreted as a probability. However, its gradient approaches zero when the input is very large or very small, causing the vanishing gradient problem;
  • Tanh Activation: the tanh function is similar to the sigmoid but maps input values to an output range between -1 and 1. Because its outputs are centered around zero, it can help learning converge faster. Like the sigmoid, it suffers from the vanishing gradient problem for inputs of large magnitude;
  • How Sigmoid and Tanh Work: both functions squash their inputs into a bounded range. The primary difference lies in that range, sigmoid (0 to 1) versus tanh (-1 to 1), which affects how the network processes and updates information (see the sketch after this list).
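To make the output ranges and the vanishing gradient problem concrete, here is a minimal NumPy sketch. The function names and sample inputs are illustrative, not part of the course material:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1); outputs are zero-centered
    return np.tanh(x)

x = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
print("sigmoid:", sigmoid(x))  # approaches 0 or 1 at the extremes
print("tanh:   ", tanh(x))     # approaches -1 or 1 at the extremes

# Gradients shrink toward zero for large |x|: the vanishing gradient problem
sigmoid_grad = sigmoid(x) * (1 - sigmoid(x))  # sigma'(x) = sigma(x) * (1 - sigma(x))
tanh_grad = 1 - tanh(x) ** 2                  # tanh'(x) = 1 - tanh(x)^2
print("sigmoid':", sigmoid_grad)
print("tanh':   ", tanh_grad)
```

Running this shows both gradients are largest near x = 0 and nearly zero at x = ±10, which is why repeated application of these functions across many time steps can make gradients vanish in standard RNNs.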

In the next chapter, we will look at how these activation functions play a role in LSTM networks and how they help overcome some of the limitations of standard RNNs.

