Challenge: Automatic Hyperparameter Tuning | Conclusion
Introduction to Neural Networks
Course Content

1. Concept of Neural Network
2. Neural Network from Scratch
3. Conclusion

Challenge: Automatic Hyperparameter Tuning

Task

Rather than manually selecting specific values for our model's hyperparameters, employing Random Search (RandomizedSearchCV) can be a more efficient strategy for discovering the optimal settings. The concept is similar to GridSearchCV, but with one significant difference.

In the realm of neural networks, exhaustively searching through every possible combination of parameters, as GridSearchCV does, can be prohibitively time-consuming.

This is where Random Search shines. Instead of evaluating all parameter combinations, it samples a random subset of them, which often leads to faster and surprisingly effective results.

Here is an example of Random Search usage:
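The original code snippet is not shown here, so the following is a minimal sketch of how `RandomizedSearchCV` can be applied to a small neural network. The dataset (`load_breast_cancer`) and the estimator (scikit-learn's `MLPClassifier`) are illustrative assumptions; the parameter grid mirrors the task steps below, with `max_iter` standing in for the number of epochs.

```python
import warnings

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier

# Small epoch budgets will trigger convergence warnings; silence them here
warnings.filterwarnings("ignore")

# Illustrative dataset (assumption; the course uses its own data)
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

param_distributions = {
    # Two hidden layers with 20..28 neurons, in steps of 2
    "hidden_layer_sizes": [(n, n) for n in range(20, 30, 2)],
    # Narrowed search area around the previously good value of 0.01
    "learning_rate_init": [0.02, 0.01, 0.005],
    # 10 random epoch values in the range from 10 to 50
    "max_iter": list(np.random.randint(10, 50, 10)),
}

# n_iter=4: only 4 random parameter combinations are trained and compared
search = RandomizedSearchCV(
    MLPClassifier(random_state=42),
    param_distributions=param_distributions,
    n_iter=4,
    cv=3,
    random_state=42,
)
search.fit(X_train, y_train)

print("Best parameters:", search.best_params_)
print("Test accuracy:", search.score(X_test, y_test))
```

Unlike `GridSearchCV`, which would fit every one of the 5 × 3 × 10 combinations, this run trains only the 4 sampled ones (times the 3 cross-validation folds).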

Your task is:

  1. Generate values for two hidden layers with the number of neurons ranging from 20 to 30 in steps of 2.
  2. Set the values for the learning rate to choose from. As we saw in the previous chapter, the model performs well with a learning rate of around 0.01, so we can narrow the search area to the values 0.02, 0.01, and 0.005.
  3. Generate 10 random values for the number of epochs in the range from 10 to 50.
  4. Apply random search with 4 iterations (models).
  5. Evaluate the model.


Section 3. Chapter 4
