Replace old Gibbs sampler with the experimental one. #2328
base: master
Conversation
Codecov Report (Attention: patch coverage):

```
@@            Coverage Diff             @@
##           master    #2328      +/-   ##
==========================================
- Coverage   86.80%   82.32%   -4.49%
==========================================
  Files          24       21       -3
  Lines        1599     1533      -66
==========================================
- Hits         1388     1262     -126
- Misses        211      271      +60
```

View full report in Codecov by Sentry.

Pull Request Test Coverage Report for Build 11016717132 (Coveralls).
The old Gibbs constructor relied on being called with several subsamplers, and each of the constructors of the subsamplers would take as arguments the symbols for the variables that they are to sample, e.g. `Gibbs(HMC(:x), MH(:y))`. This constructor has been deprecated, and will be removed in the future. The new constructor works by assigning samplers to either symbols or `VarNames`, e.g. `Gibbs(; x=HMC(), y=MH())` or `Gibbs(@varname(x) => HMC(), @varname(y) => MH())`. This allows more granular specification of which sampler to use for which variable.
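A minimal sketch of the new constructor in use. The `demo` model and the concrete sampler settings here are hypothetical, chosen only for illustration; the `HMC(step_size, n_leapfrog)` argument shape follows the test further down in this thread:

```julia
using Turing

# Hypothetical two-variable model for illustration.
@model function demo()
    x ~ Normal(0, 1)
    y ~ Normal(x, 1)
end

# New-style Gibbs: each variable is explicitly paired with its sampler.
chain = sample(demo(), Gibbs(@varname(x) => HMC(0.01, 4), @varname(y) => MH()), 1000)

# Equivalent keyword form:
chain = sample(demo(), Gibbs(; x=HMC(0.01, 4), y=MH()), 1000)
```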
|
||
Likewise, the old constructor for calling one subsampler more often than another, `Gibbs((HMC(:x), 2), (MH(:y), 1))`, has been deprecated. The new way to achieve this effect is to list the same sampler multiple times, e.g. as `hmc = HMC(); mh = MH(); Gibbs(@varname(x) => hmc, @varname(x) => hmc, @varname(y) => mh)`.
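The repetition-count migration can be sketched as follows (sampler settings are hypothetical, assuming the constructors described above):

```julia
# Old, deprecated form: sample x twice per Gibbs sweep, y once.
# Gibbs((HMC(0.01, 4, :x), 2), (MH(:y), 1))

# New form: list the same sampler instance once per desired update.
hmc = HMC(0.01, 4)
mh = MH()
sampler = Gibbs(@varname(x) => hmc, @varname(x) => hmc, @varname(y) => mh)
```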
Yes. We had a chat about a closely related issue with @torfjelde too, I'll rework the interface around this a bit.
@torfjelde, if you have a moment to take a look at the one remaining test failure, I'd be interested in your thoughts. We are sampling from a model with two vector variables:

```julia
@testset "dynamic model" begin
    @model function imm(y, alpha, ::Type{M}=Vector{Float64}) where {M}
        N = length(y)
        rpm = DirichletProcess(alpha)
        z = zeros(Int, N)
        cluster_counts = zeros(Int, N)
        for i in 1:N
            z[i] ~ ChineseRestaurantProcess(rpm, cluster_counts)
            cluster_counts[z[i]] += 1
        end
        Kmax = findlast(!iszero, cluster_counts)
        m = M(undef, Kmax)
        for k in 1:Kmax
            m[k] ~ Normal(1.0, 1.0)
        end
    end
    model = imm(Random.randn(100), 1.0)
    # https://github.com/TuringLang/Turing.jl/issues/1725
    # sample(model, Gibbs(MH(:z), HMC(0.01, 4, :m)), 100);
    sample(model, Gibbs(; z=PG(10), m=HMC(0.01, 4; adtype=adbackend)), 100)
end
```
Will have a look at this in a bit @mhauru (just need to do some grocery shopping 😬 )
Think I found the error: if the number of … (see Lines 57 to 65 in 6f9679a) … doesn't hit the …
I'm a bit uncertain how we should best handle this, @yebai @mhauru.

The first partially viable idea that comes to mind is to … But this would not quite be equivalent to the current implementation of …

Another way is to explicitly add the …

Thoughts?
I lean towards the above approach and (maybe later) providing explicit APIs to inference algorithms. This will enable us to handle reversible jumps (varying model dimensions) in MCMC more flexibly. At the moment, this is only possible in particle Gibbs; if it happens in HMC/MH, inference will likely fail (silently).

EDIT: we can keep …
This does, however, complicate the new Gibbs sampling procedure quite drastically 😕 And it makes me bring up a question I really didn't think I'd be asking: is it then actually preferable to the current …?

I guess we should first have a go at implementing this for the new …

Another point to add to the conversation, which @mhauru brought to my attention the other day: we also want to support stuff like …

So all in all, immediate things we need to address with Gibbs: …
Closes #2318.
Work in progress.