Neural Networks with TensorFlow
Common Layers
Input Layer
The Input layer is used to specify the shape of the input data. It does not have any weights and does not perform any operations on the data. Nevertheless, it is essential because it defines the structure of the input data that the model will receive, ensuring that the data is correctly formatted for processing by subsequent layers.
We'll create the layers for the following neural network, starting with the Input layer:
The shape of the input data is specified via the shape parameter:
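Since the exact values from the original example aren't shown, the snippet below is a minimal sketch; the input shape (4,), i.e. vectors with four features, is an assumption:

```python
from tensorflow.keras.layers import Input

# Input layer expecting vectors with 4 features each (the shape is an assumed example)
input_layer = Input(shape=(4,))
```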
Dense Layer
The Dense layer, also known as a fully connected layer, is a fundamental component in neural networks. Each neuron in a dense layer receives input from all neurons of the previous layer, making it "fully connected", and performs a weighted sum of the inputs.
To create a single Dense layer in Keras, you only need to specify the number of neurons in this layer:
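A minimal sketch of such a layer; the choice of 16 neurons is an assumption, not a value from the original example:

```python
from tensorflow.keras.layers import Dense

# Fully connected layer with 16 neurons; no activation applied yet
dense_layer = Dense(16)
```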
You can also incorporate an activation function directly into this layer by specifying the activation parameter. Common activation functions include 'relu', 'sigmoid', and 'tanh'.
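For illustration, assuming the same 16-neuron layer as above with a ReLU activation:

```python
from tensorflow.keras.layers import Dense

# Dense layer with 16 neurons and ReLU applied to its outputs
dense_relu = Dense(16, activation='relu')
```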
Activation Layer
Keras also has a separate Activation layer, which applies an activation function to its input. To initialize this layer, simply pass the activation function as a string (e.g., 'relu', 'sigmoid', or 'tanh').
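A minimal sketch using 'relu' as the example activation (any of the strings mentioned above works the same way):

```python
from tensorflow.keras.layers import Activation

# Standalone activation layer; applies ReLU to whatever input it receives
relu_activation = Activation('relu')
```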
The Activation layer applies the activation function element-wise, transforming each input value independently.
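To illustrate the element-wise behavior, here is a quick sketch with made-up tensor values:

```python
import tensorflow as tf
from tensorflow.keras.layers import Activation

relu = Activation('relu')
# Each value is transformed independently: negatives become 0, positives pass through
print(relu(tf.constant([-1.0, 0.0, 2.5])))  # -> [0.0, 0.0, 2.5]
```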
1. Which layer is essential for specifying the shape of the input data?
2. Which of these layers can be used to apply a non-linear transformation to the data?
3. What is the primary function of the Dense layer in Keras?