Logging of hyperparameters for the sampler #11

Open
jlperla opened this issue Sep 25, 2020 · 3 comments
@jlperla
Contributor

jlperla commented Sep 25, 2020

See JuliaLogging/TensorBoardLogger.jl#77

In one form or another, it would be great to have hyperparameters visible in the TensorBoard output. See https://pytorch.org/docs/stable/tensorboard.html#torch.utils.tensorboard.writer.SummaryWriter.add_hparams

Maybe add support so the samplers themselves could do this, e.g. via a dispatch that logs the hyperparameters for a given sampler:

log_hparams(lg::TBLogger, sampler) = nothing  # no default logging

# Assumes a low-level `log_hparams(lg, dict)` method that writes the pairs.
function log_hparams(lg::TBLogger, sampler::NUTS, prefix = "")
    log_hparams(lg, Dict("$(prefix)n_adapts" => sampler.n_adapts, "$(prefix)δ" => sampler.δ))
    return nothing
end
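
Hypothetical usage, assuming Turing's NUTS(n_adapts, δ) constructor and a TBLogger:

lg = TBLogger("logs/run")
spl = NUTS(1000, 0.65)  # n_adapts = 1000, δ = 0.65
log_hparams(lg, spl)    # logs "n_adapts" and "δ"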

And until hparams are supported, maybe it could be done with custom scalars? https://philipvinc.github.io/TensorBoardLogger.jl/dev/explicit_interface/#Custom-Scalars-plugin Though I am not sure how that stuff works

log_hparams(lg::TBLogger, sampler) = nothing  # no default logging

function log_hparams(lg::TBLogger, sampler::NUTS, prefix = "")
    log_custom_scalar(lg, Dict("$(prefix)n_adapts" => sampler.n_adapts, "$(prefix)δ" => sampler.δ))
    return nothing
end

I put in the "prefix" argument so that you could handle Gibbs recursively while keeping the hyperparameters of each component distinguishable. e.g. something like

function log_hparams(lg::TBLogger, sampler::Gibbs, prefix = "")
    for (i, alg) in enumerate(sampler.algs)
        # Trailing underscore so names come out as e.g. "component_1_n_adapts".
        log_hparams(lg, alg, "$(prefix)component_$(i)_")
    end
    return nothing
end
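
Again purely illustrative, assuming Turing's Gibbs constructor with per-variable component samplers:

spl = Gibbs(NUTS(1000, 0.65, :θ), HMC(0.1, 5, :z))
log_hparams(lg, spl)  # the NUTS component logs "component_1_n_adapts" and "component_1_δ";
                      # the HMC component falls through to the `nothing` default above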

But with all that said, it is not immediately clear to me where you could hook this into the current Turing callback structure. One possibility is to hack it with a toggle tracking whether the hparams have been logged: e.g. make TensorBoardCallback mutable with a has_logged_hparams field, and check the toggle in the callback https://github.com/torfjelde/TuringCallbacks.jl/blob/master/src/callbacks/tensorboard.jl#L100 ...
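
Something like this sketch (the struct fields and the callback signature here are just assumptions for illustration, not the actual TuringCallbacks.jl internals):

mutable struct TensorBoardCallback
    logger::TBLogger
    has_logged_hparams::Bool  # flipped after the first iteration
end

function (cb::TensorBoardCallback)(rng, model, sampler, transition, iteration)
    # Log the sampler's hyperparameters exactly once, on the first call.
    if !cb.has_logged_hparams
        log_hparams(cb.logger, sampler)
        cb.has_logged_hparams = true
    end
    # ... existing per-iteration logging of transition statistics ...
end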

@PhilipVinc

Custom scalars are just a way to define, ad hoc, how to visualise scalars.
Basically you log scalars (log_scalar) as you want, and then you specify with log_custom_scalar the layout of your plots and (especially useful) the x and y axes.

log_custom_scalar must be called only once, because it defines the layout; it doesn't actually log anything (maybe we should rename it...)
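
A rough sketch of that flow (the layout form and the tb_multiline chart type follow the Custom Scalars docs linked above; check there for the exact names):

using TensorBoardLogger

lg = TBLogger("logs/run")

# Log the raw scalar values under tags of your choosing.
for step in 1:100
    log_value(lg, "sampler/δ", 0.65, step = step)
    log_value(lg, "sampler/n_adapts", 1000, step = step)
end

# Then, once, describe how those tags should be arranged in the
# Custom Scalars tab: here, one chart overlaying the two series.
layout = Dict("Sampler" => Dict(
    "hyperparameters" => (tb_multiline, ["sampler/δ", "sampler/n_adapts"]),
))
log_custom_scalar(lg, layout)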

The code was contributed by @fissoreg who knows much more about them than me.

@torfjelde
Member

I 100% agree this would be super-nice, so thank you for the suggestion! But I think it would be better to put the effort towards implementing the plugin on TensorBoardLogger.jl's side, since there's no good way of doing it with the current functionality (AFAIK).

@jlperla
Contributor Author

jlperla commented Sep 26, 2020

@torfjelde I agree completely. Better to wait and do it right. It sounds like @PhilipVinc doesn't have capacity right now to add in that plugin support. It is outside my skillset at this point, but if you have any bored grad students, maybe they would like to play with it?
