Hyperparameter tuning using a random search scheme.
Given a set of hyperparameters, the random search trainer provides a faster way of tuning than an exhaustive grid search: instead of training a model for every possible parameter combination, it trains only a user-defined number of randomly sampled combinations.
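The sampling idea behind this class can be sketched in base R (a minimal illustration only; the variable names `grid`, `n_iter`, and `sampled` are hypothetical and not part of the superml API):

```r
# Random search: build the full grid of candidate combinations,
# then draw only n_iter of them at random instead of trying all.
set.seed(42)
grid <- expand.grid(n_estimators = c(100, 500),
                    max_depth    = c(5, 2, 10, 14))
n_iter  <- 4
sampled <- grid[sample(nrow(grid), n_iter), ]
print(sampled)  # 4 of the 8 possible combinations
```

With 2 x 4 = 8 combinations in the grid, `n_iter = 4` means only half of the models are trained, which is where the speed-up over grid search comes from.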
Super class: superml::GridSearchCV -> RandomSearchTrainer
Public fields
n_iter
    integer, number of models to be trained
Method new()

Usage:
RandomSearchCV$new(
    trainer = NA,
    parameters = NA,
    n_folds = NA,
    scoring = NA,
    n_iter = NA
)
Arguments:
trainer
    superml trainer object; must be one of XGBTrainer, LMTrainer, RFTrainer, NBTrainer
parameters
    list, hyperparameters with their candidate values to sample from
n_folds
    integer, number of folds used to split the training data
scoring
    character, scoring metric(s) used to evaluate the best model; multiple values can be provided. Currently supports: auc, accuracy, mse, rmse, logloss, mae, f1, precision, recall
n_iter
    integer, number of models to be trained
## ------------------------------------------------
## Method `RandomSearchCV$new`
## ------------------------------------------------
rf <- RFTrainer$new()
rst <- RandomSearchCV$new(trainer = rf,
                          parameters = list(n_estimators = c(100, 500),
                                            max_depth = c(5, 2, 10, 14)),
                          n_folds = 3,
                          scoring = c('accuracy', 'auc'),
                          n_iter = 4)
## ------------------------------------------------
## Method `RandomSearchCV$fit`
## ------------------------------------------------
rf <- RFTrainer$new()
rst <- RandomSearchCV$new(trainer = rf,
                          parameters = list(n_estimators = c(100, 500),
                                            max_depth = c(5, 2, 10, 14)),
                          n_folds = 3,
                          scoring = c('accuracy', 'auc'),
                          n_iter = 4)
data("iris")
rst$fit(iris, "Species")
#> [1] "In total, 4 models will be trained"
rst$best_iteration()
#> $n_estimators
#> [1] 500
#>
#> $max_depth
#> [1] 10
#>
#> $accuracy_avg
#> [1] 0.96
#>
#> $accuracy_sd
#> [1] 0.02
#>
#> $auc_avg
#> [1] NaN
#>
#> $auc_sd
#> [1] NA
#>
Note that auc_avg and auc_sd come back as NaN/NA here because AUC is computed for binary targets, while the iris Species column has three classes.
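As a hedged follow-up sketch (assuming the superml API shown above, and taking the field names `n_estimators` and `max_depth` from the best_iteration() output), the best parameters found by the search could be passed back to a fresh trainer to fit a final model:

```r
library(superml)
data("iris")

# Re-run the search from the example above.
rf  <- RFTrainer$new()
rst <- RandomSearchCV$new(trainer = rf,
                          parameters = list(n_estimators = c(100, 500),
                                            max_depth = c(5, 2, 10, 14)),
                          n_folds = 3,
                          scoring = c('accuracy', 'auc'),
                          n_iter = 4)
rst$fit(iris, "Species")

# Refit a single model on the full data using the best parameters found.
best <- rst$best_iteration()
final_rf <- RFTrainer$new(n_estimators = best$n_estimators,
                          max_depth    = best$max_depth)
final_rf$fit(iris, "Species")
```

Refitting on the full training data is a common final step after cross-validated tuning, since the search itself only trains on n_folds - 1 folds at a time.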