Challenge: Creating a Perceptron | Neural Network from Scratch
Introduction to Neural Networks

Challenge: Creating a Perceptron

Since our goal is to implement a multilayer perceptron, creating a Perceptron class will simplify model initialization. Its only attribute, layers, is essentially a list of the Layer objects that define the structure of the network:
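A minimal sketch of what this might look like is shown below; the constructor parameters and the idea of filling the list inside __init__ are assumptions based on the task description, not code taken from the course:

    class Perceptron:
        def __init__(self, input_size, hidden_size, output_size):
            # Its only attribute: a list of Layer objects that defines
            # the structure of the network (filled in as part of the task)
            self.layers = []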

The variables used to initialize the layers are the following:

  • input_size: the number of input features;
  • hidden_size: the number of neurons in each hidden layer (both hidden layers will have the same number of neurons in this case);
  • output_size: the number of neurons in the output layer.

The resulting perceptron takes input_size features, passes them through two hidden layers with hidden_size neurons each, and produces its prediction from an output layer with output_size neurons.
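For illustration, here is a hypothetical instantiation with example sizes; the values 2, 4, and 1 are arbitrary and only show how the three arguments map onto the layer dimensions:

    # 2 input features, 4 neurons per hidden layer, 1 output neuron
    model = Perceptron(input_size=2, hidden_size=4, output_size=1)
    # Resulting layer dimensions:
    #   hidden layer 1: 2 inputs -> 4 neurons
    #   hidden layer 2: 4 inputs -> 4 neurons
    #   output layer:   4 inputs -> 1 neuron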

Task


Your goal is to set up the basic structure of the perceptron by implementing its layers:

  1. Initialize the weights (a matrix) and biases (a vector) with random values drawn from a uniform distribution over the range [-1, 1) using NumPy.
  2. Compute the raw output values of the neurons in the forward() method of the Layer class.
  3. Apply the activation function to the raw outputs in the forward() method of the Layer class and return the result.
  4. Define three layers in the Perceptron class: two hidden layers with the same number of neurons and one output layer. Both hidden layers should use the relu activation function, while the output layer should use sigmoid.

Solution
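The worked solution itself is only available in the interactive environment. Below is a minimal sketch of one possible implementation of the four steps; the relu and sigmoid helpers, the Layer constructor signature, and storing the weight matrix as (n_inputs, n_neurons) are assumptions rather than details taken from the course:

    import numpy as np

    def relu(z):
        return np.maximum(0, z)

    def sigmoid(z):
        return 1 / (1 + np.exp(-z))

    class Layer:
        def __init__(self, n_inputs, n_neurons, activation):
            # Step 1: weights and biases drawn from a uniform distribution in [-1, 1)
            self.weights = np.random.uniform(-1, 1, (n_inputs, n_neurons))
            self.biases = np.random.uniform(-1, 1, n_neurons)
            self.activation = activation

        def forward(self, inputs):
            # Step 2: raw (pre-activation) outputs of the neurons
            raw_outputs = np.dot(inputs, self.weights) + self.biases
            # Step 3: apply the activation function and return the result
            return self.activation(raw_outputs)

    class Perceptron:
        def __init__(self, input_size, hidden_size, output_size):
            # Step 4: two hidden layers with relu and one output layer with sigmoid
            self.layers = [
                Layer(input_size, hidden_size, relu),
                Layer(hidden_size, hidden_size, relu),
                Layer(hidden_size, output_size, sigmoid),
            ]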

