Introduction to Neural Networks
Challenge: Automatic Hyperparameter Tuning
Task
Rather than manually selecting specific values for our model's hyperparameters, employing Random Search (RandomizedSearchCV) can be a more efficient strategy to discover the optimal settings. The concept is somewhat akin to GridSearchCV, yet it comes with a significant difference.
In the realm of neural networks, exhaustively searching through every possible combination of parameters, as GridSearchCV does, can be prohibitively time-consuming.
This is where Random Search shines. Instead of evaluating all parameter combinations, it samples a random subset of them, which often leads to faster and surprisingly effective results.
Here is an example of Random Search usage:
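The sketch below assumes scikit-learn's MLPClassifier as a stand-in estimator and the Iris dataset as placeholder data; the parameter names and candidate values are illustrative only, not the ones used in this course.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Placeholder data so the snippet runs end to end
X, y = load_iris(return_X_y=True)

# Lists/arrays of candidate hyperparameter values to sample from
param_distributions = {
    "hidden_layer_sizes": [(16, 16), (32, 16), (32, 32), (64, 32)],
    "learning_rate_init": [0.1, 0.01, 0.001],
    "max_iter": np.arange(50, 550, 50),  # rough analogue of an epoch budget
}

search = RandomizedSearchCV(
    estimator=MLPClassifier(solver="adam", random_state=42),
    param_distributions=param_distributions,
    n_iter=5,        # only 5 random combinations are evaluated, not the full grid
    cv=3,
    random_state=42,
)
search.fit(X, y)

print("Best parameters:", search.best_params_)
print("Best cross-validated accuracy:", round(search.best_score_, 3))
```

Note how n_iter caps the number of sampled combinations, which is exactly what makes Random Search cheaper than an exhaustive grid search.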
Your task is:
- Generate values for the two hidden layers, with the number of neurons ranging from 20 to 30 with a step of 2 (one way to express these ranges is sketched after this list).
- Set the values for the learning rate to choose from. As we saw in the previous chapter, the model performs well with a learning rate of around 0.01, so we can narrow the search space to the values 0.02, 0.01, and 0.005.
- Generate 10 random values for the number of epochs in the range from 10 to 50.
- Apply random search for 4 models (iterations).
- Evaluate the model.
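As a starting point, here is one way the ranges above could be expressed with NumPy; the dictionary keys are hypothetical, since the exact parameter names depend on how your model is wrapped for RandomizedSearchCV.

```python
import numpy as np

# Neuron counts for each of the two hidden layers: 20 to 30, step 2
# (assuming the upper bound 30 is meant to be included)
neurons = np.arange(20, 31, 2)        # [20 22 24 26 28 30]

# Learning rates narrowed around 0.01, as observed in the previous chapter
learning_rates = [0.02, 0.01, 0.005]

# 10 random epoch counts drawn from the range 10 to 50
epochs = np.random.randint(10, 50, 10)

# Hypothetical keys: match them to your model wrapper's parameter names
param_distributions = {
    "neurons_layer_1": neurons,
    "neurons_layer_2": neurons,
    "learning_rate": learning_rates,
    "epochs": epochs,
}
print(param_distributions)
```

With these distributions in place, passing them to RandomizedSearchCV with n_iter=4 evaluates exactly four random configurations before the best one is refit and scored.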