Introduction to Neural Networks
Single Neuron Implementation
For now, we want to build a neural network with a single neuron. As an example, we'll use it for a binary classification task, such as spam detection, where 0 represents a ham (non-spam) email and 1 represents a spam email.
The neuron will take numerical features related to emails as inputs and produce an output between 0 and 1, representing the probability that an email is spam.
Here's what happens step by step:
- Each input is multiplied by a corresponding weight. The weights are learnable parameters that determine the importance of each input;
- All the weighted inputs are summed together;
- An additional parameter called bias is added to the input sum. The bias allows the neuron to shift its output up or down, providing flexibility to the model;
- This sum, with the bias included, is then passed through an activation function. Since we have only a single neuron, which directly produces the final output (a probability), we'll use the sigmoid function, which compresses values into the range (0, 1).
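Putting these steps together, for inputs x_1, ..., x_n with weights w_1, ..., w_n and bias b, the neuron computes:

output = \sigma\left(\sum_{i=1}^{n} w_i x_i + b\right)

where \sigma denotes the sigmoid activation function.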
Neuron Class
A neuron needs to store its weights and bias, making a class a natural way to group these related properties:
- weights: a list of randomly initialized values that determine how important each input is to the neuron (n_inputs is the number of inputs);
- bias: a randomly initialized value that helps the neuron make flexible decisions.
Weights and bias should be randomly initialized with small values between -1 and 1, drawn from a uniform distribution, to ensure proper training and allow them to be correctly adjusted at each iteration.
To recap, NumPy provides the random.uniform() function to generate a random number or an array of random numbers (by specifying the size argument) from a uniform distribution within the [low, high) range.
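As a rough sketch of how this might look (the attribute names follow the description above; the exact layout in your implementation may differ):

```python
import numpy as np

class Neuron:
    def __init__(self, n_inputs):
        # One weight per input, drawn uniformly from [-1, 1)
        self.weights = np.random.uniform(-1, 1, size=n_inputs)
        # A single bias value from the same range
        self.bias = np.random.uniform(-1, 1)
```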
Forward Propagation
Additionally, the Neuron class should include an activate() method, which computes the weighted sum of the inputs and applies the activation function (sigmoid in our case).
In fact, if we have two vectors of equal length (weights and inputs), the weighted sum can be computed using the dot product of these vectors:
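\text{weights} \cdot \text{inputs} = \sum_{i=1}^{n} w_i x_i

where w_i is the i-th weight and x_i is the i-th input.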
This allows us to compute the weighted sum in a single line of code using the numpy.dot() function, eliminating the need for a loop. The bias can then be directly added to the result to get input_sum_with_bias. The output is then computed by applying the sigmoid activation function:
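A minimal sketch of such a method, assuming it is added to the Neuron class from the previous section and that a sigmoid() helper (shown in the next section) is available:

```python
    def activate(self, inputs):
        # Weighted sum of the inputs via a dot product, plus the bias
        input_sum_with_bias = np.dot(self.weights, inputs) + self.bias
        # Sigmoid maps the raw value into the (0, 1) range
        output = sigmoid(input_sum_with_bias)
        return output
```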
Activation Functions
The formula for the sigmoid function is as follows, given that z represents the weighted sum of inputs with the bias added (the raw output value) for this particular neuron:
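\sigma(z) = \frac{1}{1 + e^{-z}}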
Using this formula, sigmoid can be implemented as a simple function in Python:
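One straightforward NumPy-based version (the exact implementation may differ slightly):

```python
import numpy as np

def sigmoid(z):
    # Compresses z into the (0, 1) range; np.exp works on scalars and arrays alike
    return 1 / (1 + np.exp(-z))
```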
The formula for the ReLU function is as follows; it basically sets the output equal to z if z is positive and to 0 otherwise:
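\mathrm{ReLU}(z) = \max(0, z)

A NumPy-based implementation could be just as short, for example:

```python
import numpy as np

def relu(z):
    # Element-wise: keep z where it is positive, use 0 otherwise
    return np.maximum(0, z)
```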