If evaluations are moderately inexpensive (i.e., a budget on the order of tens of thousands of evaluations is feasible), go with a genetic algorithm via e.g. sklearn-genetic-opt or TPOT.
If evaluations are very expensive (i.e., only on the order of hundreds of evaluations are feasible), go with Bayesian optimization via e.g. skopt.BayesSearchCV or Ax. BayesSearchCV is the more lightweight option and requires that the models being optimized follow the scikit-learn estimator API. Ax has much more sophisticated Bayesian models, including automatic relevance determination (ARD) and corresponding feature importances, advanced handling of noise, and capabilities for handling high-dimensional search spaces. It also offers several interfaces ranging from easy-to-use to heavily customizable, and it is a tool that we recommend.
Factors other than the expense of model evaluation, such as interpretability and ease of use, can also guide the choice of hyperparameter optimization scheme.
In our case, due to [inexpensive/moderately expensive/expensive] model evaluations for sklearn models and to maintain a lightweight environment, we choose to use [GridSearchCV/sklearn-genetic-opt/skopt.BayesSearchCV]; however, other options could have been used instead.
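If the lightweight GridSearchCV route is chosen, the wording above could be backed by a short sketch like the following (the random forest model and the small grid are illustrative assumptions, not the defaults actually used in the project):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Toy data standing in for the real dataset (illustrative only)
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Exhaustive cross-validated search over a small, hand-picked grid;
# practical when each fit is cheap and the grid stays small
param_grid = {"n_estimators": [50, 100], "max_depth": [None, 5]}
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

Note that grid search scales poorly with the number of hyperparameters, which is exactly why the evaluation-budget guidance above matters when choosing between these tools.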
Add a section on hyperparameter tuning, since the classical models were used with default hyperparameters.