PDMat error when sampling from prior of model with LKJCholesky #2316
Comments
Thanks @simonsteiger, looking into this. A smaller repro:

```julia
DynamicPPL.@model function test()
    x ~ Distributions.product_distribution([Distributions.Uniform();;])
    mat = x * [1.0;;]
    return PDMats.PDMat(mat)
end

m = test()
vi = DynamicPPL.VarInfo()
m(vi)
m(vi)
```

On the second pass the eltype of […] The issue isn't specific to the distribution; this will happen with any combination of taking a matrix product of a random variable and calling […]

I think I understand the problem, but I'm unsure how to best fix it. In the above example, the second time we call […] Some options that I could think of but didn't like: […]

Any thoughts @yebai, @sunxd3, @willtebbutt?
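For what it's worth, the promotion at the heart of this can be reproduced without DynamicPPL at all. The following is a sketch, not taken from the issue: the `Real`-typed matrix stands in for whatever abstractly typed container `VarInfo` hands back on the second pass.

```julia
x = Real[0.5;;]        # 1×1 Matrix{Real}: an abstractly typed container
mat = x * [1.0;;]      # LinearAlgebra picks the output eltype via promote_op

# Inference cannot narrow Real * Float64, so the result is abstractly typed,
# and PDMats.PDMat(mat) would then fail when it calls oneunit(eltype(mat)).
eltype(mat)            # an abstract type (Any on the Julia versions in this thread)

# Narrowing the container first keeps everything concrete:
eltype(map(identity, x) * [1.0;;])   # Float64
```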
Thank you for taking the time to look into this! :)

Cc @devmotion, who might have encountered this before.
Could this be changed? In my opinion, matrix multiplication shouldn't promote containers of […]

Such a […]

```julia
julia> using Turing, PDMats

julia> @model function test()
           x ~ product_distribution([Distributions.Uniform();;])
           mat = (Base.isconcretetype(eltype(x)) ? x : map(identity, x)) * [1.0;;]
           return PDMat(mat)
       end;

julia> m = test();

julia> vi = DynamicPPL.VarInfo();

julia> m(vi)
1×1 PDMat{Float64, Matrix{Float64}}:
 0.871459128598002

julia> m(vi)
1×1 PDMat{Float64, Matrix{Float64}}:
 0.871459128598002

julia> m(vi)
1×1 PDMat{Float64, Matrix{Float64}}:
 0.871459128598002
```

Maybe we could make use of that internally?
Fully agree with both points.
Yeah, I wonder this as well. It could be part of the transform that turns […]

This touches on the whole […]

Note that […]

cc @torfjelde too
One possible […]
On my phone so will reply more extensively later, but I'd highly recommend against automatic expansion of containers. It adds a lot of complexity that can be captured in a simple "convert from abstract type to specialized" step after constructing the container.
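The "convert from abstract to specialized after construction" step being suggested could be as small as a one-line helper. The name `concretize` below is hypothetical, chosen purely for illustration; it is not an existing DynamicPPL or Base function:

```julia
# Hypothetical helper: narrow an array's eltype only when it is abstract,
# leaving already-concrete arrays untouched (no copy on the fast path).
concretize(x::AbstractArray) =
    isconcretetype(eltype(x)) ? x : map(identity, x)

concretize(Real[1.0, 2.0])   # Vector{Float64}
concretize([1.0, 2.0])       # returned as-is, already Vector{Float64}
```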
Wouldn't this complexity be nicely hidden away behind the […]
Sure, but […]
IMO it doesn't seem worth it only to save us from writing a single line of code to convert from type-"unstable" to "stable", given that right now there's really no end-user making use of […]
I'm thinking of this for the future when […]
I think it saves us much more than one line of code. I find it quite confusing which operations return a […]

I dream of a world where […]
Code internal to […]
Haha, fair enough 👍 I'm very much in agreement that it'll make things better if we can do it in a nice way :) But there are also realistic scenarios where you don't actually want the […]
Yeah, this is the part where my stretch goal 4 might fail. I don't understand well enough when, and how badly, we want to avoid […]

Another scenario that I fear a bit is that if we incrementally build up a […]
Hi,

I have encountered an error while trying to sample from the prior of a Turing model with an `LKJCholesky` prior. The error seems to stem from PDMats calling `oneunit(Any)` at some point. The code example below follows this blog post.

The error raised after the last line is

```
MethodError: no method matching oneunit(::Type{Any})
```

with `PDMat` somewhere further down in the stack trace:

```
PDMats.PDMat(fac::LinearAlgebra.Cholesky{Any, Matrix{Any}}) @ pdmat.jl:20
```

Swapping out the above model for one where `Ω` is distributed as `LKJ` no longer raises an error during prior sampling. Sampling with `sample(m1, NUTS(), 1000)` works normally.

EDIT

I ran the above examples in Pluto with Distributions v0.25.111, Turing v0.33.3, and PDMats v0.11.31. When running them in the REPL (Turing v0.28.3 there) I could sample from the prior of `m1` without error.