How does BAT differ from Turing? #321
Comments
Hi, BAT differs from Turing & friends in that it is not a PPL, but purely a Bayesian inference/analysis framework. BAT is focused on applications in which the user specifies prior and likelihood separately and explicitly; the (log-)likelihood function may be arbitrary code. BAT is both higher- and lower-level than Turing, in a way: providing the likelihood is the user's task, there is no DSL. On the other hand, BAT provides higher-level plotting recipes, posterior integration algorithms, and so on.

BAT and PPLs like Turing are not mutually exclusive: I would like to support using Turing models as densities in BAT in the future. Also, we use functionality like AdvancedHMC from the Turing "universe". We plan to split several parts of BAT (like the prior-transformation code) out as separate packages soon, to make things more modular and to offer them to the rest of the ecosystem without depending on BAT.jl itself (which is currently a very heavy dependency).

BAT is meant to be a toolbox, offering operations relevant to Bayesian analysis (and different algorithms for each of those) under a common API, so users can quickly switch between them and try out different ways to tackle their problem.
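(For concreteness, a minimal sketch of that workflow, assuming a BAT v2-style API (`PosteriorDensity`, `bat_sample`, `MCMCSampling`, `NamedTupleDist` from ValueShapes.jl) together with `logfuncdensity` from DensityInterface.jl; exact names and defaults may differ between BAT versions:)

```julia
using BAT, Distributions, ValueShapes, DensityInterface

# Prior: an ordinary Distribution over named parameters
prior = NamedTupleDist(p = Uniform(0, 1))

# Likelihood: arbitrary user code returning a log-likelihood value,
# wrapped via DensityInterface.logfuncdensity
data = [1, 0, 1, 1, 0, 1]
likelihood = logfuncdensity(params -> sum(logpdf.(Bernoulli(params.p), data)))

posterior = PosteriorDensity(likelihood, prior)

# Common API: the algorithm (here MetropolisHastings) can be swapped out,
# e.g. for HamiltonianMC(), without changing the rest of the call
samples = bat_sample(posterior, MCMCSampling(mcalg = MetropolisHastings())).result
```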
I want to point out that you can also provide a custom likelihood in Turing.jl.
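(For illustration, a minimal sketch of one way to do this, using Turing's `@addlogprob!` macro; the model and the `custom_loglik` function below are made up for this example:)

```julia
using Turing

# Arbitrary user code computing a log-likelihood:
custom_loglik(μ, σ, data) = sum(logpdf.(Normal(μ, σ), data))

@model function custom_demo(data)
    μ ~ Normal(0, 10)
    σ ~ truncated(Normal(0, 5), 0, Inf)
    # Add the custom log-likelihood term to the model's log density:
    Turing.@addlogprob! custom_loglik(μ, σ, data)
end
```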
Oh, right, that's a nice feature! I guess another important difference is that BAT explicitly distinguishes between prior and likelihood, and can support likelihoods that are not (explicitly) tied to data. The separation between prior and likelihood becomes important for algorithms like MGVI.jl (still to be integrated into BAT), which will also need a separation between forward model and data (we'll add a special likelihood type for that).
I don't know how hard that would be in Turing, but surely if you don't use the DSL macro you have more control. Might be worth asking on Slack/Discourse?
Sure, you can of course use the samplers that power Turing directly - which BAT does, e.g. AdvancedHMC. BAT then adds a nice likelihood/prior-oriented interface on top of it, including its parameter space transformation capabilities (which we're currently extending).
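(As an aside, a minimal sketch of driving AdvancedHMC directly with a hand-written log density, following its README-style API; exact constructor names may differ between versions:)

```julia
using AdvancedHMC, ForwardDiff

D = 2
ℓπ(θ) = -0.5 * sum(abs2, θ)   # standard-normal log density (up to a constant)

initial_θ = rand(D)
n_samples, n_adapts = 2_000, 1_000

# Hamiltonian system with a diagonal metric and ForwardDiff-based gradients
metric = DiagEuclideanMetric(D)
hamiltonian = Hamiltonian(metric, ℓπ, ForwardDiff)

# Leapfrog integrator with a heuristically chosen initial step size
initial_ϵ = find_good_stepsize(hamiltonian, initial_θ)
integrator = Leapfrog(initial_ϵ)

# NUTS sampler with Stan-style mass-matrix and step-size adaptation
proposal = NUTS{MultinomialTS, GeneralisedNoUTurn}(integrator)
adaptor = StanHMCAdaptor(MassMatrixAdaptor(metric), StepSizeAdaptor(0.8, integrator))

samples, stats = sample(hamiltonian, proposal, initial_θ, n_samples, adaptor, n_adapts)
```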
Not sure I understand how that differs from Turing -- in DynamicPPL you can distinguish between the prior and the likelihood by working with the corresponding evaluation contexts.
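(For reference, a minimal sketch of evaluating the prior and the likelihood of a DynamicPPL model separately, assuming a recent DynamicPPL/Turing version in which `logprior`, `loglikelihood`, and `logjoint` accept a NamedTuple of parameter values:)

```julia
using Turing, DynamicPPL

@model function demo(x)
    μ ~ Normal(0, 1)
    x ~ Normal(μ, 1)
end

m = demo(0.5)

logprior(m, (μ = 0.3,))       # log prior density at μ = 0.3
loglikelihood(m, (μ = 0.3,))  # log-likelihood of x = 0.5 given μ = 0.3
logjoint(m, (μ = 0.3,))       # sum of the two
```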
Sure, I didn't mean that DynamicPPL doesn't have a concept of prior and likelihood, but you don't get them as separate objects, right? Though I guess you can realize separate priors and likelihoods via DynamicPPL submodels? In BAT, priors are actual Distributions (we'd like to add compatibility with MeasureBase as well), typically via a NamedTupleDist. And for MGVI, we'll need models whose outputs are distributions, not data directly, since we need to compute the Fisher information.
I actually would like to add support for DynamicPPL in BAT, as they're kind of orthogonal: DynamicPPL is a way to express models, whereas BAT is more concerned with taking models from the user and offering a toolset for sampling, integration, and posterior plotting/analysis.

BAT also handles parameter transformations differently: it transforms the prior to uniform/normal exactly, unlike Bijectors. We're in the process of releasing this as a separate package, so that both transformation mechanisms are generally available. This is why we established InverseFunctions.jl and ChangesOfVariables.jl together with the Bijectors developers, to have common APIs. I hope DynamicPPL will be able to use it as an option at some point (we have to get the package out first).

I don't see BAT and Turing as competing efforts, really - BAT uses quite a few functions that evolved within Turing, and I hope Turing can make use of functionality that is well split out as separate packages in the future as well. BAT is currently still too monolithic; the plan is to split it apart and then release BAT v3.0 - hopefully quite soon.
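(For context, a minimal sketch of those two shared APIs; the `squarepos` transformation is made up for illustration:)

```julia
using InverseFunctions, ChangesOfVariables

# Both packages ship implementations for common Base functions:
inverse(exp)                       # returns log
with_logabsdet_jacobian(exp, 1.0)  # returns (exp(1.0), 1.0)

# Opting a custom scalar transformation (assumed bijective on x > 0) into the same APIs:
squarepos(x) = x^2
invsquarepos(y) = sqrt(y)

InverseFunctions.inverse(::typeof(squarepos)) = invsquarepos
InverseFunctions.inverse(::typeof(invsquarepos)) = squarepos

# d(x^2)/dx = 2x, so log|J| = log(2x) for x > 0:
ChangesOfVariables.with_logabsdet_jacobian(::typeof(squarepos), x) =
    (squarepos(x), log(2x))
```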
Is my understanding correct that in BAT.jl you have to define the likelihood manually? For example, what's the equivalent of:

```julia
using Turing

@model function coinflip(y)
    p ~ Uniform(0, 1)
    @. y ~ Bernoulli(p)
end
```
Yes. However, now that we finally have DensityInterface.jl in place, I would like to establish packages that provide likelihoods for standard use cases (e.g. histogram fits, etc.) that can then be used with BAT, MeasureTheory, and hopefully other packages as well (Turing could support DensityInterface too via its "external likelihood" macro, I think). We'll also add support to BAT for forward models expressed as Markov kernels. I'm not sure yet which package to put this in; it should ideally be somewhere more lightweight than BAT.jl itself. These will be important for the MGVI integration into BAT, but also for automated posterior predictive checks.

Currently, users still have to provide the full likelihood.
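(For illustration, a sketch of what such a reusable, standalone likelihood might look like when expressed purely through DensityInterface.jl; the `BernoulliLikelihood` type and its combination with BAT below are hypothetical, not an existing package API:)

```julia
using DensityInterface, Distributions

# A self-contained likelihood object carrying its own data:
struct BernoulliLikelihood{T<:AbstractVector{<:Integer}}
    y::T  # observed 0/1 outcomes
end

# Opt into the DensityInterface API:
@inline DensityInterface.DensityKind(::BernoulliLikelihood) = IsDensity()

DensityInterface.logdensityof(ℓ::BernoulliLikelihood, params) =
    sum(logpdf.(Bernoulli(params.p), ℓ.y))

# Such an object could then be paired with a prior in any package that understands
# DensityInterface densities, e.g. (hypothetically) in BAT:
# posterior = PosteriorDensity(BernoulliLikelihood(y), NamedTupleDist(p = Uniform(0, 1)))
```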
CC @cscherrer
Hi! I found this on GitHub and I'd like to say that BAT.jl looks really interesting; I'm always happy to see new tools for Bayesian analysis in Julia! I have a quick question -- how does BAT.jl differ from other PPLs in Julia, like Turing, Soss, or Gen?