Introduction to Neural Networks
Challenge: Creating a Perceptron
Since our goal is to implement a multilayer perceptron, creating a Perceptron class will simplify model initialization. Its only attribute, layers, is essentially a list of the Layer objects that define the structure of the network.
The variables used to initialize the layers are the following:
- input_size: the number of input features;
- hidden_size: the number of neurons in each hidden layer (both hidden layers have the same number of neurons in this case);
- output_size: the number of neurons in the output layer.
The structure of the resulting perceptron should be as follows:
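In code, a Perceptron with this structure might look like the following sketch. It assumes a Layer(n_inputs, n_neurons, activation) constructor together with relu and sigmoid functions; one possible implementation of these is sketched after the task list further down.

```python
class Perceptron:
    def __init__(self, input_size, hidden_size, output_size):
        # 'layers' is the only attribute: the Layer objects, ordered from the
        # first hidden layer to the output layer. Layer, relu and sigmoid are
        # assumed to be defined elsewhere (see the sketch after the task list).
        self.layers = [
            Layer(input_size, hidden_size, relu),      # hidden layer 1
            Layer(hidden_size, hidden_size, relu),     # hidden layer 2
            Layer(hidden_size, output_size, sigmoid),  # output layer
        ]
```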
Your goal is to set up the basic structure of the perceptron by implementing its layers (one possible implementation is sketched after this list):
- Initialize the weights (a matrix) and biases (a vector) with random values drawn from a uniform distribution over the range [-1, 1) using NumPy.
- Compute the raw output values of the neurons in the forward() method of the Layer class.
- Apply the activation function to the raw outputs in the forward() method of the Layer class and return the result.
- Define three layers in the Perceptron class: two hidden layers with the same number of neurons and one output layer. Both hidden layers should use the relu activation function, while the output layer should use sigmoid.
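As a reference point, here is one possible sketch of the Layer class described above. The weight-matrix shape and the column-vector input convention are assumptions, not the course's required layout; a row-vector convention with an (n_inputs, n_neurons) weight matrix would work just as well.

```python
import numpy as np

# Activation functions assumed by the layers
def relu(x):
    return np.maximum(0, x)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

class Layer:
    def __init__(self, n_inputs, n_neurons, activation_function):
        # Weights (matrix) and biases (vector) drawn uniformly from [-1, 1)
        self.weights = np.random.uniform(-1, 1, (n_neurons, n_inputs))
        self.biases = np.random.uniform(-1, 1, (n_neurons, 1))
        self.activation = activation_function

    def forward(self, inputs):
        # Raw (pre-activation) neuron outputs: weighted sum plus bias
        raw_output = np.dot(self.weights, inputs) + self.biases
        # Apply the activation function and return the result
        return self.activation(raw_output)
```

Combined with the Perceptron class sketched earlier, a forward pass can then be run layer by layer (the sizes below are arbitrary example values):

```python
model = Perceptron(input_size=2, hidden_size=4, output_size=1)
x = np.random.uniform(-1, 1, (2, 1))  # one example as a column vector
for layer in model.layers:
    x = layer.forward(x)
print(x)  # sigmoid output of the final layer, values in (0, 1)
```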