
Bump DynamicPPL compat to 0.30 #2376

Merged · 2 commits merged from py/dppl-0.30 into master on Oct 25, 2024
Conversation

@penelopeysm (Member) commented on Oct 24, 2024:

Do not merge – seeing where CI breaks

Tests fixed via TuringLang/DynamicPPL.jl#699.

Closes #2377

Closes #2378

codecov bot commented on Oct 24, 2024:

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 86.39%. Comparing base (08331c5) to head (d5d0eea).
Report is 2 commits behind head on master.

Additional details and impacted files
```
@@           Coverage Diff           @@
##           master    #2376   +/-   ##
=======================================
  Coverage   86.39%   86.39%
=======================================
  Files          22       22
  Lines        1573     1573
=======================================
  Hits         1359     1359
  Misses        214      214
```

☔ View full report in Codecov by Sentry.

@penelopeysm (Member, Author) commented on Oct 24, 2024:

Two of the failing tests:

```julia
using Turing
import ReverseDiff

alg = HMC(0.01, 5; adtype=AutoReverseDiff())

# Broadcasted tilde over a Vector{Real}.
@model function vdemo6()
    x = Vector{Real}(undef, 10)
    @. x ~ InverseGamma(2, 3)
end
sample(vdemo6(), alg, 10)

# Broadcasted tilde over a matrix. N was undefined in the original
# snippet; any small size reproduces the failure.
N = 3
@model function vdemo7()
    x = Array{Real}(undef, N, N)
    @. x ~ [InverseGamma(2, 3) for i in 1:N]
end
sample(vdemo7(), alg, 1000)
```

This bisects to TuringLang/DynamicPPL.jl@c38e65f, i.e. TuringLang/DynamicPPL.jl#555.


It seems to happen only with InverseGamma; if it's replaced with Normal() or Beta(2, 2), this doesn't error.


`from_maybe_linked_internal` shows up in the stacktrace, so I tracked the result of `from_maybe_linked_internal_transform(vi, vn, dist)` for a variety of dists:

| Dist | DPPL 0.29 | DPPL 0.30 |
| --- | --- | --- |
| `InverseGamma(2, 3)` | `Base.Fix1{typeof(broadcast), typeof(exp)}(broadcast, exp) ∘ DynamicPPL.FromVec{Tuple{}}(())` | `DynamicPPL.UnwrapSingletonTransform{Tuple{}}(()) ∘ (Base.Fix1{typeof(broadcast), typeof(exp)}(broadcast, exp) ∘ DynamicPPL.ReshapeTransform{Tuple{Int64}, Tuple{}}((1,), ()))` |
| `Normal()` | `identity ∘ DynamicPPL.FromVec{Tuple{}}(())` | `DynamicPPL.UnwrapSingletonTransform{Tuple{}}(()) ∘ (identity ∘ DynamicPPL.ReshapeTransform{Tuple{Int64}, Tuple{}}((1,), ()))` |
| `Beta(2, 2)` | `Inverse{Bijectors.Logit{Float64, Float64}}(Bijectors.Logit{Float64, Float64}(0.0, 1.0)) ∘ DynamicPPL.FromVec{Tuple{}}(())` | `DynamicPPL.UnwrapSingletonTransform{Tuple{}}(()) ∘ (Inverse{Bijectors.Logit{Float64, Float64}}(Bijectors.Logit{Float64, Float64}(0.0, 1.0)) ∘ DynamicPPL.ReshapeTransform{Tuple{Int64}, Tuple{}}((1,), ()))` |

Only the top-right entry (InverseGamma under DPPL 0.30) fails.
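
For reference, a minimal sketch of how these transforms can be inspected (my own reconstruction, not code from the thread; it leans on DynamicPPL internals such as `link!!` and `from_maybe_linked_internal_transform`, which may differ between versions):

```julia
using Distributions
using DynamicPPL

# Mirrors vdemo6 with a single element: one broadcasted tilde over a
# Vector{Real}, parameterised by the distribution under test.
@model function singleton(dist)
    x = Vector{Real}(undef, 1)
    @. x ~ dist
end

for dist in (InverseGamma(2, 3), Normal(), Beta(2, 2))
    model = singleton(dist)
    vi = link!!(VarInfo(model), model)  # link, so the "maybe linked" branch applies
    @show DynamicPPL.from_maybe_linked_internal_transform(vi, @varname(x[1]), dist)
end
```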


I commented out these lines:

https://github.com/TuringLang/DynamicPPL.jl/blob/bd9f465d174a631c862645739af6748ac0abbefd/src/utils.jl#L378-L384

After rerunning, `from_maybe_linked_internal_transform(vi, vn, InverseGamma(2, 3))` gives

```julia
Base.Fix1{typeof(broadcast), typeof(exp)}(broadcast, exp) ∘ DynamicPPL.ReshapeTransform{Tuple{Int64}, Tuple{}}((1,), ())
```

but it still errors. So the problem is with `ReshapeTransform` rather than `UnwrapSingletonTransform` – but apparently only when it's composed with `Base.Fix1{typeof(broadcast), typeof(exp)}(broadcast, exp)`.

@coveralls commented on Oct 24, 2024:

Pull Request Test Coverage Report for Build 11520126094

Warning: This coverage report may be inaccurate.

This pull request's base commit is no longer the HEAD commit of its target branch. This means it includes changes from outside the original pull request, including, potentially, unrelated coverage changes.

Details

  • 0 of 0 changed or added relevant lines in 0 files are covered.
  • No unchanged relevant lines lost coverage.
  • Overall coverage remained the same at 86.395%

Totals (Coverage Status):

  • Change from base Build 11497545739: 0.0%
  • Covered Lines: 1359
  • Relevant Lines: 1573

💛 - Coveralls

@penelopeysm (Member, Author) commented on Oct 25, 2024:

MWE that doesn't involve DynamicPPL at all:

```julia
import ReverseDiff

f(x) = exp.(reshape(vec(x), ()))  # broadcast over a zero-dimensional array
f([1.0])                          # this is fine
ReverseDiff.gradient(f, [1.0])    # this errors
```

Here `exp.()` is a simplified version of `invlink_transform(InverseGamma(...))`, and `reshape(vec(x), ())` is a simplified version of `DynamicPPL.ReshapeTransform(())`.
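
For contrast, a variant of my own (not from the thread): avoiding the zero-dimensional reshape lets the gradient go through, consistent with the failure being specific to broadcasting over 0-d tracked arrays.

```julia
import ReverseDiff

# Hypothetical contrast case, not from the thread: reshape to a 1-element
# 1-d array instead of 0-d, and sum so ReverseDiff.gradient gets the
# scalar output it expects.
g(x) = sum(exp.(reshape(vec(x), (1,))))
ReverseDiff.gradient(g, [1.0])  # works: returns [2.718...]
```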

Reported upstream as JuliaDiff/ReverseDiff.jl#265.

@penelopeysm marked this pull request as ready for review on October 25, 2024, 15:18.
@yebai (Member) left a comment:

Thanks @penelopeysm!

@penelopeysm merged commit 42189fd into master on Oct 25, 2024 (62 checks passed)
@penelopeysm deleted the py/dppl-0.30 branch on October 25, 2024, 16:31