Introduction to Neural Networks
Challenge: Automatic Hyperparameter Tuning
Rather than manually selecting specific values for our model's hyperparameters, randomized search (RandomizedSearchCV) offers a more efficient way to find an optimal configuration. Unlike grid search (GridSearchCV), which systematically evaluates all possible combinations of hyperparameters, randomized search evaluates a random subset of these combinations. This approach significantly reduces computational cost while still yielding strong results.
For neural networks, where the number of possible hyperparameter combinations can be immense, exhaustively testing every option is often impractical. Randomized search circumvents this issue by randomly sampling a defined number of hyperparameter sets, balancing exploration and efficiency.
Key parameters of RandomizedSearchCV:
- estimator: the model to optimize (e.g., MLPClassifier);
- param_distributions: a dictionary where keys are hyperparameter names and values are lists from which to sample;
- n_iter: specifies how many random combinations should be tested; a higher value increases the chances of finding an optimal combination but requires more computation;
- scoring: defines the evaluation metric (e.g., 'accuracy' for classification).
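As an illustration, here is a minimal sketch of how these four parameters fit together; the dataset, candidate values, and MLPClassifier settings below are assumptions made for the example, not part of the challenge.

```python
import warnings

from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# A small synthetic classification dataset (assumption for the example)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Candidate hyperparameter values to sample from (illustrative choices)
param_distributions = {
    "hidden_layer_sizes": [(20, 20), (24, 24), (28, 28)],
    "learning_rate_init": [0.02, 0.01, 0.005],
}

search = RandomizedSearchCV(
    estimator=MLPClassifier(max_iter=50, random_state=0),  # model to optimize
    param_distributions=param_distributions,               # values to sample
    n_iter=4,                                              # combinations tried
    scoring="accuracy",                                    # evaluation metric
    cv=3,
    random_state=0,
)

# Short max_iter triggers convergence warnings; silence them for the demo
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    search.fit(X, y)

print(search.best_params_)
print(search.best_score_)
```

After fitting, best_params_ holds the best sampled combination and best_score_ its mean cross-validated accuracy.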
Swipe to start coding
- In param_distributions, generate values for two hidden layers, where each layer has the same number of neurons, ranging from 20 to 30 (exclusive) with a step of 2.
- In param_distributions, set the learning rate values to 0.02, 0.01, and 0.005.
- In param_distributions, generate 10 random values for the number of training epochs, ensuring they are within the range 10 to 50 (inclusive).
- Apply randomized search with 4 iterations (number of hyperparameter combinations to evaluate) and use accuracy as the evaluation metric.
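The steps above can be sketched as the following param_distributions dictionary. This is one possible construction, assuming max_iter plays the role of training epochs in MLPClassifier; the key names would need to match the estimator you actually tune.

```python
import random

random.seed(0)  # fixed seed so the sampled epoch values are reproducible

param_distributions = {
    # Two hidden layers of equal size: 20 to 30 (exclusive) with a step of 2
    "hidden_layer_sizes": [(n, n) for n in range(20, 30, 2)],
    # The three candidate learning rates
    "learning_rate_init": [0.02, 0.01, 0.005],
    # 10 random epoch counts within 10 to 50 (inclusive);
    # max_iter is assumed to act as the epoch count here
    "max_iter": [random.randint(10, 50) for _ in range(10)],
}

print(param_distributions["hidden_layer_sizes"])
# [(20, 20), (22, 22), (24, 24), (26, 26), (28, 28)]
```

This dictionary would then be passed to RandomizedSearchCV together with n_iter=4 and scoring='accuracy', as described above.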
Solution