Neural Networks with TensorFlow
Non-Sequential Models
Non-Sequential models in neural networks are those that do not follow a linear sequence of layers, where each layer has exactly one input and one output. These models are more flexible and can handle complex architectures, which is especially useful in scenarios where you need to process multiple types of data or when the model needs to output multiple predictions.
Key Advantages
- Multiple Inputs/Outputs: Non-Sequential models can have multiple input and output layers, making them suitable for a wide range of applications like multi-task learning, where the model performs several tasks simultaneously.
- Complex Layer Connectivity: They allow layers to be connected in various ways, not just stacked sequentially as we did before. This includes skip connections, shared layers, and branches (see the sketch after this list).
- Flexibility: They offer the flexibility to design custom architectures that can handle complex data relationships and task requirements.
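For instance, a skip connection can be expressed with the Keras functional API roughly as in the minimal sketch below; the input shape, layer sizes, and the use of the `Add` layer are assumptions chosen purely for illustration.

```python
from tensorflow.keras.layers import Input, Dense, Add
from tensorflow.keras.models import Model

# Illustrative sketch of a tiny skip connection: the input bypasses
# one Dense layer and is added back to that layer's output.
# The shapes and sizes here are arbitrary assumptions.
inputs = Input(shape=(16,))
hidden = Dense(16, activation='relu')(inputs)
skip = Add()([inputs, hidden])   # merge the branch with the original input
outputs = Dense(1)(skip)

model = Model(inputs=inputs, outputs=outputs)
model.summary()
```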
Example of Model with Multiple Inputs
Imagine a model designed to predict a property's price based on both textual descriptions and image data:
- Textual Description Branch: Processes textual features (like property descriptions) using layers suitable for natural language processing.
- Image Data Branch: Processes image data of the property using convolutional layers.
- Merging: The outputs of these two branches are then merged using a layer like `Concatenate` and further processed to make the final prediction.
Note
- For now, we assume that simple `Dense` layers are sufficient in the context of image and textual data. In further courses you will discover more advanced and sophisticated methods of processing such data.
- The `Concatenate` layer does not change the data in any way; it simply unites the outputs of several layers into a single one.
Implementation in Keras
The `Model` class in Keras is used to create a model from your defined layers, allowing the creation of non-sequential models. It encapsulates the entire model and provides methods for training, evaluation, and prediction.
Its parameters:
- `inputs`: Specifies the input(s) of the model, which can be a list if the model has multiple inputs.
- `outputs`: Specifies the output(s) of the model, which can also be a list for multiple outputs.
But before constructing the model, you must first create and interconnect the layers. This is done by defining a layer and assigning it to a variable, while simultaneously connecting it to its preceding layer by specifying it in additional parentheses (for example, `layer = Dense(20)(previous_layer)`).
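As a minimal illustration of this call syntax (the input shape and layer sizes below are assumed purely for demonstration):

```python
from tensorflow.keras.layers import Input, Dense

# Each layer is called on the layer (or tensor) that feeds into it.
input_layer = Input(shape=(10,))                    # 10 input features (assumed)
hidden = Dense(20, activation='relu')(input_layer)  # connected to the input
output = Dense(1)(hidden)                           # connected to the previous Dense layer
```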
To merge the outputs of multiple layers into a single layer, the `Concatenate` layer can be utilized. It combines the outputs of the specified layers into one unified layer. To achieve this, instead of connecting a single layer, pass a list of layers within the parentheses, like `concat = Concatenate()([layer_1, layer_2, layer_3])`.
Here's an implementation example of the model we discussed earlier:
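One possible way to build it is sketched below; the feature counts, layer sizes, and the mean squared error loss are illustrative assumptions, and both branches use simple `Dense` layers, as mentioned in the note above.

```python
from tensorflow.keras.layers import Input, Dense, Concatenate
from tensorflow.keras.models import Model

# Two separate inputs: textual features and image features.
# The feature counts (100 and 4096) are assumptions for illustration.
text_input = Input(shape=(100,), name='text_features')
image_input = Input(shape=(4096,), name='image_features')

# Textual description branch (simple Dense layers for now).
text_branch = Dense(64, activation='relu')(text_input)
text_branch = Dense(32, activation='relu')(text_branch)

# Image data branch (also Dense layers here, as noted above).
image_branch = Dense(128, activation='relu')(image_input)
image_branch = Dense(32, activation='relu')(image_branch)

# Merge the two branches into a single tensor.
merged = Concatenate()([text_branch, image_branch])

# Further processing and the final price prediction.
hidden = Dense(16, activation='relu')(merged)
output = Dense(1, name='price')(hidden)

model = Model(inputs=[text_input, image_input], outputs=output)
model.compile(optimizer='adam', loss='mse')
model.summary()
```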
Note
Once a layer is passed into another layer, it's not necessary to retain the previous layer in a separate variable. Therefore, you can reassign the same variable for subsequent layers if you're not going to reference the earlier layer again (as demonstrated with `text_branch` and `image_branch` in the example).