Like a change of chord (hyperparameter tuning)

What is hyperparameter tuning or optimization?

First, we need to know what a hyperparameter is.

A hyperparameter is a parameter whose value is used to control the learning process.

According to Wikipedia, “Hyperparameter tuning or optimization is the problem of choosing a set of optimal hyperparameters for a learning algorithm.”

Some differences between hyperparameters and parameters:

The two terms are often used interchangeably, but there is a difference between them.

·         Parameters are learned from the training data during training; they are properties of the fitted model.

·         Hyperparameters cannot be learned by the estimator directly; they control the behavior of the training algorithm and can have a large impact on model performance.

 

from sklearn import svm
svm.SVC(C=0.01, kernel='rbf', random_state=33)

 

Here, C and kernel are hyperparameters. The model's parameters (for an SVM, the support vectors, their coefficients and the intercept) are learned when fit() is called; random_state only controls the random number generation.
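A minimal sketch of that split, assuming the Iris toy dataset purely for illustration: the hyperparameters go into the constructor, while the learned parameters only appear as fitted attributes after training.

from sklearn import svm
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

# Hyperparameters are chosen by us, up front.
model = svm.SVC(C=0.01, kernel='rbf', random_state=33)

# Parameters are learned from the data during fit().
model.fit(X, y)
print(model.support_vectors_.shape)  # learned: the support vectors
print(model.dual_coef_.shape)        # learned: their coefficients
print(model.intercept_)              # learned: the intercept(s)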

In deep learning, hyperparameters include things like the number of layers, the hidden layer sizes, the input features, the learning rate, and the activation function.

Parameters are the weights and biases.
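A minimal sketch of the same idea using scikit-learn's MLPClassifier (chosen here only because the rest of the article uses scikit-learn; any deep learning framework shows the same split): the architecture, activation function and learning rate are hyperparameters, while the weights and biases are learned by fit().

from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier

X, y = load_iris(return_X_y=True)

mlp = MLPClassifier(
    hidden_layer_sizes=(16, 8),  # hyperparameter: number and width of hidden layers
    activation='relu',           # hyperparameter: activation function
    learning_rate_init=0.01,     # hyperparameter: learning rate
    max_iter=1000,
    random_state=33,
)
mlp.fit(X, y)

# Parameters: one weight matrix and one bias vector per layer, learned during fit().
print([w.shape for w in mlp.coefs_])       # weights
print([b.shape for b in mlp.intercepts_])  # biases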

 

Process of finding optimal hyperparameters:

The most commonly used approaches are:

1. Grid search (GridSearchCV)

2. Random search (RandomizedSearchCV)

Grid search uses cross-validation to measure performance and trains the algorithm for every combination of hyperparameter values. For example, with the two lists below it fits all 4 × 4 = 16 models (a GridSearchCV sketch follows the list of combinations):

n_estimators = [10, 50, 100, 200]  
max_depth = [3, 10, 20, 40]
RandomForestClassifier(n_estimators=10, max_depth=3)
RandomForestClassifier(n_estimators=10, max_depth=10)
RandomForestClassifier(n_estimators=10, max_depth=20)
RandomForestClassifier(n_estimators=10, max_depth=40)

RandomForestClassifier(n_estimators=50, max_depth=3)
RandomForestClassifier(n_estimators=50, max_depth=10)
RandomForestClassifier(n_estimators=50, max_depth=20)
RandomForestClassifier(n_estimators=50, max_depth=40)

RandomForestClassifier(n_estimators=100, max_depth=3)
RandomForestClassifier(n_estimators=100, max_depth=10)
RandomForestClassifier(n_estimators=100, max_depth=20)
RandomForestClassifier(n_estimators=100, max_depth=40)

RandomForestClassifier(n_estimators=200, max_depth=3)
RandomForestClassifier(n_estimators=200, max_depth=10)
RandomForestClassifier(n_estimators=200, max_depth=20)
RandomForestClassifier(n_estimators=200, max_depth=40)
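
A minimal GridSearchCV sketch for the grid above (the Iris dataset, cv=5 and accuracy scoring are assumptions made just for illustration): every one of the 16 combinations is trained and scored with cross-validation, and the best one is kept.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_iris(return_X_y=True)

param_grid = {
    'n_estimators': [10, 50, 100, 200],
    'max_depth': [3, 10, 20, 40],
}

grid = GridSearchCV(
    RandomForestClassifier(random_state=33),
    param_grid=param_grid,
    cv=5,                  # cross-validation for every combination
    scoring='accuracy',
)
grid.fit(X, y)

print(grid.best_params_)   # the combination with the best mean CV score
print(grid.best_score_)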
 

Instead of searching over the entire grid, random search evaluates only a random sample of points on the grid, which makes it much cheaper than grid search.
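
A minimal RandomizedSearchCV sketch over the same grid (n_iter=5, the Iris dataset and cv=5 are assumptions): only 5 of the 16 combinations are sampled and evaluated, which is where the saving comes from.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

param_distributions = {
    'n_estimators': [10, 50, 100, 200],
    'max_depth': [3, 10, 20, 40],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=33),
    param_distributions=param_distributions,
    n_iter=5,              # sample only 5 random combinations instead of all 16
    cv=5,
    scoring='accuracy',
    random_state=33,
)
search.fit(X, y)

print(search.best_params_)
print(search.best_score_)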
