Neural Networks with TensorFlow
Layers
Introduction
Keras, a high-level neural networks API, is widely used for deep learning projects. It simplifies the process of building and training neural networks, making it accessible to both beginners and experts. In this chapter, we'll delve into the core of Keras: its layers. Layers are the fundamental building blocks of neural networks in Keras.
Note
Keras can be utilized as an independent library (from Keras 3.0 onwards) or as a component within TensorFlow or other deep learning frameworks. In this course, we will use it as a component within TensorFlow.
What Are Layers?
In the context of neural networks, a layer is a structured arrangement of neurons (nodes), typically organized in rows. Each neuron in a layer is connected to neurons in the previous and next layers. These connections are characterized by weights, which are adjusted during training.
Keras Layers
Keras builds on this idea with a more modular and flexible approach. In Keras:
- Layers are more granular and specialized. Each layer typically performs a specific type of transformation or computation.
- Separation of concerns. Instead of a single layer handling both weighted sums and activations, Keras often separates these into different layer types. For instance, a `Dense` layer handles weighted sums, and an `Activation` layer handles the activation functions.
Input Layer
The `Input` layer is used to specify the shape of the input data.
Creating an Input Layer
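A minimal sketch of declaring an input shape (the 784-feature size is an illustrative choice, e.g. a flattened 28x28 image):

```python
from tensorflow.keras.layers import Input

# Declare the shape of one sample: 784 features.
# The batch dimension is omitted; Keras adds it automatically as None.
inputs = Input(shape=(784,))
```

The resulting symbolic tensor has shape `(None, 784)`, where `None` stands for a batch size that is decided at training time.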
Note
The `Input` layer doesn't process or modify data; it's a way to specify the input data shape.
Dense Layer
The `Dense` layer, also known as a fully connected layer, is a basic building block in neural networks. Each neuron in a dense layer receives input from all neurons of the previous layer, making it "fully connected."
Creating a Dense Layer
Here's how you create a single `Dense` layer in Keras:
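A minimal sketch (the unit count of 64 is an arbitrary choice for illustration):

```python
from tensorflow.keras.layers import Dense

# A fully connected layer with 64 neurons; no activation is attached yet,
# so it only computes the weighted sums plus biases.
dense_layer = Dense(64)
```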
Note
It's worth noting that the `activation` parameter can be used to directly attach an activation function to a `Dense` layer in your code. However, in many cases, it is advisable to treat the activation function as an independent layer. This approach offers greater flexibility in the design of your neural network.
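A sketch of the inline form the note refers to, attaching ReLU through the `activation` parameter (64 units is again an illustrative choice):

```python
from tensorflow.keras.layers import Dense

# Weighted sums and ReLU activation combined in a single layer
dense_relu = Dense(64, activation='relu')
```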
Activation Layer
The `Activation` layer applies an activation function to its input. Common activation functions include `'relu'`, `'sigmoid'`, and `'tanh'`.
Creating an Activation Layer
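A minimal sketch, applying ReLU to a small sample input (the values are illustrative):

```python
import numpy as np
from tensorflow.keras.layers import Activation

relu_layer = Activation('relu')

# ReLU is applied element-wise: negative values become zero,
# positive values pass through unchanged.
out = relu_layer(np.array([[-1.0, 2.0]]))
```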
Note
The `Activation` layer independently applies an activation function to each individual input.