Introduction to TensorFlow
Batches in Data Processing
When training a machine learning model, it's common to feed the data in small chunks rather than all at once. These chunks are called "batches". Instead of showing the model a single data item (like one image or one sentence), we might feed it a batch of, say, 32 items together. This approach can make training more stable and faster.

In terms of tensors, batching means adding an extra dimension at the beginning. If a single item's data is represented by a tensor of shape (height, width), a batch of these items has the shape (batch_size, height, width). With a batch size of 32, the shape becomes (32, height, width).
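As a small sketch of this idea, here is how stacking items adds a leading batch dimension. NumPy is used for simplicity; the height and width values (28 x 28) are arbitrary placeholders, and the same shape semantics apply to TensorFlow tensors:

```python
import numpy as np

# A single item, e.g. one grayscale image: shape (height, width)
height, width = 28, 28  # placeholder dimensions
single_item = np.zeros((height, width))
print(single_item.shape)  # (28, 28)

# A batch of 32 such items: the batch dimension comes first
batch = np.stack([np.zeros((height, width)) for _ in range(32)])
print(batch.shape)  # (32, 28, 28)
```

Note that the per-item shape is unchanged; the batch simply wraps 32 items in one extra leading axis.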
Let's say we have 2048 data samples, each with a shape of (base shape). Together they form a tensor of shape (2048, base shape). If we break this data into batches of 32 samples, we end up with 64 batches, since 64 * 32 = 2048, and the new shape becomes (64, 32, base shape).
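The reshaping above can be sketched as follows. Again NumPy is used for illustration, with a hypothetical base shape of (28, 28) standing in for "(base shape)":

```python
import numpy as np

# 2048 samples, each with an assumed base shape of (28, 28)
data = np.zeros((2048, 28, 28))

batch_size = 32
num_batches = data.shape[0] // batch_size  # 2048 // 32 = 64

# Group the samples so the leading axes are (num_batches, batch_size)
batched = data.reshape(num_batches, batch_size, 28, 28)
print(batched.shape)  # (64, 32, 28, 28)
```

In practice, TensorFlow's data pipelines perform this grouping for you (e.g. via dataset batching), but the resulting shapes follow exactly this pattern.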
When designing your own neural network or other model, you are free to arrange your data in different shapes. However, the batch-first conventions described above are standard in TensorFlow: they organize data logically and hierarchically, and learning algorithms are optimized around them.