Commit: HPO.md link fix
Jethro Gaglione committed Mar 10, 2024
1 parent 22874f7 · commit 1b9ed5f
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/tutorials/HPO.md
@@ -8,7 +8,7 @@ Hyperparameter Optimization with Optuna
======================
In place of grid or random search approaches to HPO, we recommend the Optuna framework for Bayesian hyperparameter sampling and trial pruning (in models where intermediate results are available). Optuna can also integrate with MLflow for convenient logging of optimal parameters.

-In this tutorial, we build our HPO on the model and training approach detailed in the [Single-GPU Training (Custom Mlflow)](({% link pytorch_singlGPU_customMLflow.md %}) tutorial.
+In this tutorial, we build our HPO on the model and training approach detailed in the [Single-GPU Training (Custom Mlflow)](https://docs.mltf.vu/tutorials/pytorch_singlGPU_customMLflow.html) tutorial.

First, we install the Optuna package:
```bash
pip install optuna
```
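
The tutorial text above describes Bayesian sampling, trial pruning, and MLflow logging; below is a minimal sketch of what such an Optuna study can look like. The toy objective, the `lr` and `hidden_size` search space, and the trial/epoch counts are illustrative assumptions, not the tutorial's actual training code:

```python
import optuna

def objective(trial: optuna.Trial) -> float:
    # Hypothetical search space for illustration; the real tutorial would
    # suggest its own model and optimizer hyperparameters here.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    hidden_size = trial.suggest_int("hidden_size", 32, 256)

    val_loss = float("inf")
    for epoch in range(10):
        # Stand-in for one epoch of training; swap in the tutorial's loop.
        val_loss = (lr - 1e-3) ** 2 + 1.0 / hidden_size + 0.01 / (epoch + 1)

        # Reporting intermediate results is what enables trial pruning.
        trial.report(val_loss, epoch)
        if trial.should_prune():
            raise optuna.TrialPruned()
    return val_loss

study = optuna.create_study(
    direction="minimize",
    sampler=optuna.samplers.TPESampler(),  # Bayesian (TPE) sampling
    pruner=optuna.pruners.MedianPruner(),  # stops underperforming trials early
)
study.optimize(objective, n_trials=20)
print("Best parameters:", study.best_params)
```

For the MLflow integration mentioned above, Optuna provides an `MLflowCallback` (in the `optuna-integration` package) that can be passed via `study.optimize(..., callbacks=[...])` to log each trial's parameters and objective value.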
