Comments (6)
Yes, I can see the problem now, thanks.
I'll fix it later today/tomorrow.
from dynesty.
Hi,
I am not 100% convinced there is necessarily an issue here (or at least I'm not sure there is a better behaviour).
The rationale I've built in is that the code should either run normally or crash early, rather than sit in an infinite loop.
Also, ideally I'd prefer not to have too many tuning parameters.
The behaviour you are observing is caused by sampling ~10000 times and failing to get at least 10 finite logl values. Taking this to the extreme, if the logl is -np.inf everywhere except in a tiny volume, I think it is reasonable for the code to bail out early.
The only modification I can think of is this: if, after say the current 1000 sampling attempts, we have managed to get at least one valid sample, a warning can be displayed and sampling can continue until the minimum number of points is achieved.
But if none have been found after 1000 iterations, I think it is right to quit.
I am not sure I fully understand your proposed options a and b, to be honest.
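The warn-and-continue behaviour proposed above could be sketched roughly like this (hypothetical names and thresholds for illustration; this is not dynesty's actual implementation):

```python
import warnings
import numpy as np

def draw_initial_live_points(sample_logl, nlive, min_npoints=10,
                             check_interval=1000, max_attempts=10000):
    """Collect finite log-likelihood values for the initial live points.

    If, after check_interval attempts, at least one finite value has been
    found, warn and keep sampling; if none have been found, quit early.
    (A sketch of the proposal, not dynesty's real internals.)
    """
    good = []
    for attempt in range(1, max_attempts + 1):
        logl = sample_logl()
        if np.isfinite(logl):
            good.append(logl)
        # Stop once we have enough finite values (capped at nlive).
        if len(good) >= min(min_npoints, nlive):
            return good
        if attempt == check_interval:
            if not good:
                raise RuntimeError("no finite logl values found; the "
                                   "likelihood may be -inf almost everywhere")
            warnings.warn("few finite logl values so far; continuing to sample")
    raise RuntimeError("failed to collect enough finite logl values")
```

With a well-behaved likelihood this returns quickly; with a likelihood that is -inf everywhere it bails out after the first checkpoint instead of looping forever.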
from dynesty.
Sorry, I wasn't clear - I'm not referring to a case where the likelihood is pathological. It crashes even when the likelihood is completely fine.
from dynesty.
Can you give a reproducible example, please?
I don't quite see how what you describe can happen if logl is finite (almost) everywhere.
Because even if you have nlive=2, you should have 2000 valid points after _initialize_livepoints().
(unless I am missing some bug there)
from dynesty.
Sure, here's an adaptation of the test code from http://mattpitkin.github.io/samplers-demo/pages/dynesty/, but you can try it out on any example:
```python
from scipy.special import ndtri
from dynesty import DynamicNestedSampler

def prior_transform(theta):
    """
    A function defining the transform between the parameterisation in the
    unit hypercube and the true parameters.

    Args:
        theta (tuple): a tuple containing the parameters.

    Returns:
        tuple: a new tuple or array with the transformed parameters.
    """
    mprime, cprime = theta  # unpack the parameters (in their unit hypercube form)
    cmin = -10.   # lower bound on uniform prior on c
    cmax = 10.    # upper bound on uniform prior on c
    mmu = 0.      # mean of Gaussian prior on m
    msigma = 10.  # standard deviation of Gaussian prior on m
    m = mmu + msigma * ndtri(mprime)   # convert back to m
    c = cprime * (cmax - cmin) + cmin  # convert back to c
    return (m, c)

def loglikelihood_dynesty(theta):
    """
    The log-likelihood function.
    """
    m, c = theta
    return m + c

nlive = 8         # number of (initial) live points. Fails when less than 11
bound = 'multi'   # use the MultiNest-style multi-ellipsoid bounding
sample = 'rwalk'  # use random walks to draw new samples
ndims = 2         # two parameters

dsampler = DynamicNestedSampler(loglikelihood_dynesty, prior_transform, ndims,
                                bound=bound, sample=sample)
dsampler.run_nested(nlive_init=nlive)
```
I believe the issue lies here:
```python
# Check to make sure there are enough finite
# log-likelihood value within the initial set of live
# points.
if ngoods > min_npoints:
```
I believe "ngoods" has a maximum value of n_live_points (as the comment suggests) while min_npoints is 10. So it will always fail here for n_live_points < 11.
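One way to avoid this failure mode, sketched here as a hypothetical helper (the actual fix in the library may differ), is to cap the threshold at the number of live points:

```python
def enough_good_points(ngoods, nlive, min_npoints=10):
    # ngoods can never exceed the number of live points, so requiring
    # more than nlive finite values guarantees failure for small nlive.
    # Capping the threshold at nlive keeps the check meaningful.
    return ngoods >= min(min_npoints, nlive)
```

Under this check, nlive=8 with all 8 live points finite passes, whereas the original `ngoods > min_npoints` comparison can never succeed for nlive < 11.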
from dynesty.
I think commit bc3f251 should fix the issue. Thanks for reporting!
from dynesty.