Look at Bayesian nonparametric learning #1616
Replies: 8 comments
- Intended for when the model is wrong. Should be good for multimodal distributions.
-
- I would like to look into this too. And here is another paper that the above paper extends from:
- Cool! We should also remember to check when Chris Holmes's talk goes online on the Newton website. Would be good to save a link :-)
-
- Seems to require multiple optimisation problems to be solved every iteration?
- One optimisation problem per 'sample' (like the normal bootstrap), if I understood it correctly...
- Yes, "one optimisation problem per 'sample'", but if we want N = 5000 samples, then we have to run the optimisation 5000 times, which sounds heavy to me. But I remember Chris once said this method does not lose computational time compared to normal MCMC; I might have got that wrong, and I don't quite understand it.
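For concreteness, here is a minimal sketch of the "one optimisation per sample" idea, in the style of the weighted likelihood bootstrap that the posterior bootstrap builds on (the full method also mixes in prior pseudo-samples, which is omitted here). The Gaussian toy model, function names, and optimiser choice are illustrative assumptions, not the paper's setup.

```python
# Sketch of the "one optimisation per posterior draw" idea
# (weighted likelihood bootstrap flavour; toy Gaussian model is assumed).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data: fit the mean and log-std of a Gaussian.
data = rng.normal(loc=2.0, scale=1.5, size=200)

def neg_weighted_loglik(params, x, w):
    """Negative weighted Gaussian log-likelihood: -sum_i w_i * log p(x_i | params)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    loglik_i = -0.5 * np.log(2 * np.pi) - log_sigma - 0.5 * ((x - mu) / sigma) ** 2
    return -np.sum(w * loglik_i)

def posterior_bootstrap(x, n_samples=500):
    """Each draw: sample Dirichlet weights over the data, then solve one
    weighted maximum-likelihood problem. Draws are mutually independent."""
    n = len(x)
    draws = np.empty((n_samples, 2))
    for b in range(n_samples):
        w = rng.dirichlet(np.ones(n)) * n        # random data weights
        res = minimize(neg_weighted_loglik, x0=np.array([0.0, 0.0]),
                       args=(x, w), method="L-BFGS-B")
        draws[b] = res.x                         # one optimisation = one 'sample'
    return draws

samples = posterior_bootstrap(data)
print("posterior mean of mu:", samples[:, 0].mean())
```

Since each draw depends only on its own random weights, the N optimisations are independent and embarrassingly parallel, unlike sequential MCMC iterations; that may be what the remark about not losing computational time relative to MCMC was getting at.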
- "Bayesian posterior bootstrap", Lyddon, Holmes, Walker 2018 I think. Looks cool. Based on Ferguson 1974?