Introduction to Neural Networks
Challenge: Automatic Hyperparameter Tuning

Task

Rather than manually selecting specific values for our model's hyperparameters, we can use Random Search (RandomizedSearchCV) as a more efficient strategy for discovering the optimal settings. The idea is similar to GridSearchCV, but with one significant difference.

In the realm of neural networks, exhaustively searching through every possible combination of parameters, as GridSearchCV does, can be prohibitively time-consuming.

This is where Random Search shines. Instead of evaluating all parameter combinations, it samples a random subset of them, which often leads to faster and surprisingly effective results.

Here is an example of Random Search usage:
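Below is a minimal, self-contained sketch. It assumes scikit-learn's MLPClassifier as the estimator and a toy dataset from make_classification; the same pattern applies to any estimator with a scikit-learn interface.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Toy data so the example is self-contained
X, y = make_classification(n_samples=200, n_features=10, random_state=42)

# Each key maps a model parameter to the list of values to sample from
param_dist = {
    'hidden_layer_sizes': [(10,), (20,), (30,)],
    'learning_rate_init': [0.01, 0.005, 0.001],
}

search = RandomizedSearchCV(
    estimator=MLPClassifier(max_iter=100, random_state=42),
    param_distributions=param_dist,
    n_iter=4,         # try only 4 random combinations instead of all 9
    cv=3,             # 3-fold cross-validation for each combination
    random_state=42,
)
search.fit(X, y)

print(search.best_params_)  # the best combination found
print(search.best_score_)   # its mean cross-validated accuracy
```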

Your task is:

  1. Generate the values for two hidden layers, with the number of neurons ranging from 20 to 30 in steps of 2.
  2. Set the learning rate values to choose from. As we saw in the previous chapter, the model performs well with a learning rate of around 0.01, so we can narrow the search to the values 0.02, 0.01, and 0.005.
  3. Generate 10 random values for the number of epochs in the range from 10 to 50.
  4. Apply random search over 4 models (iterations), as in the sketch after this list.
  5. Evaluate the model.
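One way the search space from these steps could be written, again assuming MLPClassifier, where hidden_layer_sizes encodes both hidden layers and max_iter stands in for the number of epochs (the exact parameter names depend on the model wrapper the course uses):

```python
import numpy as np

# Step 1: neuron counts 20, 22, ..., 30 for each of the two hidden layers
neurons = np.arange(20, 31, 2)

param_dist = {
    # Every pairing of neuron counts for the two hidden layers (36 pairs)
    'hidden_layer_sizes': [(int(n1), int(n2)) for n1 in neurons for n2 in neurons],
    # Step 2: learning rates narrowed around 0.01
    'learning_rate_init': [0.02, 0.01, 0.005],
    # Step 3: 10 random epoch values in the range from 10 to 50
    'max_iter': np.random.randint(10, 51, size=10).tolist(),
}

# Steps 4-5: pass param_dist to RandomizedSearchCV with n_iter=4,
# as in the example above, then inspect best_params_ and best_score_
```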
