Challenge: Automatic Hyperparameter Tuning | Conclusion
Introduction to Neural Networks

Course Content

1. Concept of Neural Network
2. Neural Network from Scratch
3. Conclusion

Challenge: Automatic Hyperparameter Tuning


Rather than manually selecting specific values for our model's hyperparameters, employing Random Search (RandomizedSearchCV) can be a more efficient strategy for discovering the optimal settings. The concept is similar to GridSearchCV, but with one significant difference.

In the realm of neural networks, exhaustively searching through every possible combination of parameters, as GridSearchCV does, can be prohibitively time-consuming.

This is where Random Search shines. Instead of evaluating all parameter combinations, it samples a random subset of them, which often leads to faster and surprisingly effective results.

Here is an example of Random Search usage:
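A minimal sketch of how RandomizedSearchCV is typically used, here with scikit-learn's MLPClassifier on the Iris dataset; the dataset, model, and parameter values are illustrative and may differ from the course's setup.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

# Candidate hyperparameter values to sample from
params = {
    "hidden_layer_sizes": [(10,), (20,), (30,)],
    "learning_rate_init": [0.1, 0.01, 0.001],
}

# Instead of trying all 9 combinations, sample only 3 of them at random
search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    params,
    n_iter=3,
    cv=3,
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```

The key parameter is `n_iter`: it caps how many randomly sampled parameter combinations are actually trained, which is what makes Random Search cheaper than an exhaustive grid.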

Your task is:

  1. Generate values for two hidden layers, with the number of neurons ranging from 20 to 30 in steps of 2.
  2. Set the learning rate values to choose from. As we saw in the previous chapter, the model performs well with a learning rate around 0.01, so we can narrow the search to the values 0.02, 0.01, and 0.005.
  3. Generate 10 random values for the number of epochs, in the range from 10 to 50.
  4. Apply random search across 4 models (iterations).
  5. Evaluate the model.
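The steps above could be sketched as follows; this assumes scikit-learn's MLPClassifier on synthetic data as a stand-in for the course's model and dataset, with `max_iter` playing the role of the epoch count.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Illustrative dataset; the course exercise uses its own data
X, y = make_classification(n_samples=300, n_features=10, random_state=42)

rng = np.random.default_rng(42)
param_distributions = {
    # Step 1: two hidden layers with 20..30 neurons in steps of 2
    "hidden_layer_sizes": [(n, n) for n in range(20, 31, 2)],
    # Step 2: learning rates narrowed around 0.01
    "learning_rate_init": [0.02, 0.01, 0.005],
    # Step 3: 10 random epoch counts between 10 and 50
    "max_iter": rng.integers(10, 50, size=10).tolist(),
}

# Step 4: random search over only 4 sampled parameter settings
search = RandomizedSearchCV(
    MLPClassifier(random_state=42),
    param_distributions,
    n_iter=4,
    cv=3,
    random_state=42,
)
search.fit(X, y)

# Step 5: evaluate the best model found
print("Best parameters:", search.best_params_)
print("Best CV accuracy:", search.best_score_)
```

With few epochs the network may raise convergence warnings; that is expected here, since the point of the exercise is comparing hyperparameter settings cheaply, not training each candidate to convergence.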

Section 3. Chapter 4
