Proposal: Independent Local hyperparameter optimization module

Right now HyperOptArgumentParser contains much of the logic for doing a hyperparameter search on a local machine: https://github.com/williamFalcon/test-tube/blob/master/test_tube/argparse_hopt.py#L259

Why this is not great:
- This is the opposite of the SLURM code, where the HyperOptArgumentParser object is passed into SlurmCluster.
- It causes code duplication and entanglement.
- It makes HyperOptArgumentParser hard to test independently of the mechanism of deployment.
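For context, this is roughly what the current, entangled flow looks like (a sketch based on the test-tube README; the exact opt_list and optimize_parallel_cpu signatures may vary between versions, and train stands in for any user training function):

from test_tube import HyperOptArgumentParser

def train(hparams, *args):
    # User-supplied training function; called once per sampled
    # hyperparameter combination.
    print(hparams.learning_rate)

parser = HyperOptArgumentParser(strategy='random_search')
parser.opt_list('--learning_rate', default=0.001, type=float,
                options=[0.0001, 0.001, 0.01], tunable=True)
hyperparams = parser.parse_args()

# The search runs off the parsed namespace itself, so argument
# parsing and local deployment are tangled together:
hyperparams.optimize_parallel_cpu(train, nb_trials=3, nb_workers=2)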
Proposed change:
Add something like a Local or LocalSystem object that, similar to SlurmCluster, accepts a HyperOptArgumentParser and can be used to optimize hyperparameters locally:
hyperparams = parser.parse_args()

# Enable local training.
system = LocalSystem(
    hyperparam_optimizer=hyperparams,
    log_path=hyperparams.log_path,
    python_cmd='python3',
    test_tube_exp_name=hyperparams.test_tube_exp_name
)

system.max_cpus = 100
system.max_gpus = 5

# Search hyperparameter combinations in parallel on the local CPUs.
system.optimize_parallel_cpu(
    # Function to execute:
    train,
    # Number of hyperparameter combinations to search:
    nb_trials=24
)
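For illustration, here is a minimal sketch of what such a LocalSystem might look like internally. All of this is hypothetical: the class does not exist in test-tube, the constructor simply mirrors the snippet above, and hyperparam_optimizer.trials() is assumed to be the namespace's trial-sampling API:

from multiprocessing import Pool

class LocalSystem(object):
    # Hypothetical local counterpart to SlurmCluster: it receives the
    # parsed HyperOptArgumentParser output instead of owning the logic.
    def __init__(self, hyperparam_optimizer, log_path,
                 python_cmd='python3', test_tube_exp_name=None):
        self.hyperparam_optimizer = hyperparam_optimizer
        self.log_path = log_path
        self.python_cmd = python_cmd
        self.test_tube_exp_name = test_tube_exp_name
        self.max_cpus = 1
        self.max_gpus = 0

    def optimize_parallel_cpu(self, train_function, nb_trials):
        # Sample nb_trials hyperparameter combinations (assumed API)
        # and fan them out over a local process pool.
        trials = self.hyperparam_optimizer.trials(nb_trials)
        with Pool(processes=self.max_cpus) as pool:
            pool.map(train_function, trials)

This keeps deployment concerns (process pools, worker counts) out of the parser, which is the testability win the proposal is after.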
Downsides:
- Probably breaks backward compatibility.
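One possible mitigation (hypothetical, not part of the proposal): keep the existing method on the parsed namespace as a thin deprecated wrapper that delegates to the new object, e.g.:

import warnings

def optimize_parallel_cpu(self, train_function, nb_trials, nb_workers=4):
    # Hypothetical shim: preserve the old entry point but delegate to
    # LocalSystem, warning callers to migrate. The nb_workers default
    # here is illustrative, not the current signature.
    warnings.warn(
        'Namespace.optimize_parallel_cpu is deprecated; '
        'use LocalSystem.optimize_parallel_cpu instead.',
        DeprecationWarning,
    )
    system = LocalSystem(hyperparam_optimizer=self, log_path=self.log_path)
    system.max_cpus = nb_workers
    return system.optimize_parallel_cpu(train_function, nb_trials=nb_trials)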