In #428 I've refactored the boundary update rules, but it's not 100% clear if everything there is optimal.
First, the boundary update is something that is mildly important for non-uniform sampling, but critically important for uniform sampling.
Right now the boundary is updated (see `dynesty/py/dynesty/sampler.py`, line 294 at ab78939):

- at the beginning of the run, once ncall exceeds some threshold and the efficiency drops below a threshold;
- during the run, once a threshold number of ncall has passed since the previous boundary update.
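The two triggers above can be sketched as a single predicate. This is a hypothetical illustration, not dynesty's actual code: the names `first_bound_update_ncall`, `first_bound_update_eff`, and `update_interval`, and the default values, are all assumptions made up for the sketch.

```python
# Hypothetical sketch of the two bound-update triggers; names and thresholds
# are illustrative, not dynesty's actual attributes.

def should_update_bound(ncall_total, ncall_since_update, naccept_total,
                        bound_exists,
                        first_bound_update_ncall=2000,
                        first_bound_update_eff=0.1,
                        update_interval=1500):
    """Return True if the sampler should (re)build its bounding distribution."""
    # Cumulative sampling efficiency: accepted points per likelihood call.
    eff = naccept_total / max(ncall_total, 1)
    if not bound_exists:
        # First bound: only after enough calls AND once efficiency has dropped.
        return (ncall_total > first_bound_update_ncall
                and eff < first_bound_update_eff)
    # Subsequent bounds: after a fixed number of calls since the last update.
    return ncall_since_update > update_interval
```

The point of the sketch is that the first bound is gated on both a call count and an efficiency cut, while later updates depend only on the calls elapsed since the previous update.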
The bound update is also relevant when starting a new batch. There a new sampler is created, and the bound is either taken from the base run, or the run starts without a bound if logl < logl_first_bound_update. That is where things can be potentially inefficient, as in the latter case the whole set of nlive points has to be generated from the unit cube.
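A toy example of why generating points from the unit cube is the expensive path: with rejection sampling, the expected number of likelihood calls per accepted point is the inverse of the prior volume above the threshold, which shrinks rapidly as the threshold tightens. This is not dynesty code; the likelihood and thresholds are invented for illustration.

```python
# Toy illustration (not dynesty code) of the cost of drawing live points from
# the unit cube subject to a likelihood cut: the expected number of calls per
# accepted point is 1 / (prior volume above the threshold).

import random

def draw_above_threshold(logl, threshold, rng, ndim=2):
    """Rejection-sample one point from the unit cube with logl(x) > threshold."""
    ncall = 0
    while True:
        ncall += 1
        x = [rng.random() for _ in range(ndim)]
        if logl(x) > threshold:
            return x, ncall

# Example: likelihood concentrated near the centre of the cube.
logl = lambda x: -sum((xi - 0.5) ** 2 for xi in x)

rng = random.Random(42)
_, cheap = draw_above_threshold(logl, threshold=-0.25, rng=rng)    # loose cut
_, costly = draw_above_threshold(logl, threshold=-0.0005, rng=rng)  # tight cut
```

For the loose cut the accepted region covers most of the cube, so a point is found almost immediately; for the tight cut the region's volume is roughly 0.16% of the cube, so hundreds of calls per point are typical, which is the inefficiency described above when a batch starts without a bound.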
It is also still not clear to me whether:

- The boundary update needs to be delayed like it is now. I remember a previous discussion about shredding of the posterior, but I don't know whether that is a real problem and whether delaying the bound creation helps with it.
- Decisions should be based on 'efficiency', which is a cumulative efficiency. I don't particularly like that: it is unclear whether the cumulative efficiency is ever useful, while the efficiency over the last nlive samples probably is.
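The cumulative-versus-windowed distinction can be made concrete with a small sketch (again not dynesty's implementation, just an assumed tracker class): after an easy early phase, the cumulative number stays inflated long after the run has tightened, while a window over the last nlive proposals reflects the current state.

```python
# Sketch (not dynesty's implementation) contrasting cumulative efficiency with
# efficiency measured over only the last `nlive` proposals.

from collections import deque

class EfficiencyTracker:
    def __init__(self, nlive):
        self.ncall = 0
        self.naccept = 0
        self.recent = deque(maxlen=nlive)  # 1 = accepted, 0 = rejected

    def record(self, accepted):
        self.ncall += 1
        self.naccept += int(accepted)
        self.recent.append(int(accepted))

    @property
    def cumulative(self):
        return self.naccept / self.ncall

    @property
    def windowed(self):
        return sum(self.recent) / len(self.recent)

tracker = EfficiencyTracker(nlive=100)
for i in range(1000):
    tracker.record(True)          # easy early phase: every proposal accepted
for i in range(1000):
    tracker.record(i % 10 == 0)   # hard late phase: 1-in-10 acceptance

# cumulative is 0.55, still dominated by the early phase;
# windowed is 0.10, the current acceptance rate.
```

With these numbers the cumulative efficiency reports 55% even though the sampler is currently accepting only 10% of proposals, which is why a trailing-window estimate seems like the more meaningful trigger.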