General Bayesian updating and the loss-likelihood bootstrap
S. P. Lyddon, C. C. Holmes and S. G. Walker
© 2019 Biometrika Trust. In this paper we revisit the weighted likelihood bootstrap, a method that generates samples from an approximate Bayesian posterior of a parametric model. We show that the same method can be derived, without approximation, under a Bayesian nonparametric model with the parameter of interest defined through minimizing an expected negative log-likelihood under an unknown sampling distribution. This interpretation enables us to extend the weighted likelihood bootstrap to posterior sampling for parameters minimizing an expected loss. We call this method the loss-likelihood bootstrap, and we make a connection between it and general Bayesian updating, which is a way of updating prior belief distributions that does not require the construction of a global probability model, yet requires the calibration of two forms of loss function. The loss-likelihood bootstrap is used to calibrate the general Bayesian posterior by matching asymptotic Fisher information. We demonstrate the proposed method on a number of examples.
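As a rough illustration of the mechanism the abstract describes (not the paper's own code), the loss-likelihood bootstrap repeatedly draws random Dirichlet(1, ..., 1) weights over the data and minimizes the weighted empirical loss; each minimizer is one posterior sample. The sketch below assumes squared-error loss for a scalar mean, where the weighted-loss minimizer has the closed form of a weighted mean. The function name and toy data are illustrative, not from the paper.

```python
import random

random.seed(0)

def loss_likelihood_bootstrap_mean(data, n_samples=200):
    """Posterior samples for a mean under squared-error loss.

    Each draw: Dirichlet(1, ..., 1) weights (generated as
    normalized Exp(1) variates), then the minimizer of the
    weighted empirical loss, which for squared error is the
    weighted mean of the data.
    """
    samples = []
    for _ in range(n_samples):
        e = [random.expovariate(1.0) for _ in data]
        total = sum(e)
        w = [ei / total for ei in e]          # Dirichlet weights
        samples.append(sum(wi * xi for wi, xi in zip(w, data)))
    return samples

# Toy data set; the samples spread around the empirical mean.
data = [1.2, 0.8, 2.1, 1.5, 0.9, 1.7, 1.1, 1.4]
posterior = loss_likelihood_bootstrap_mean(data)
```

For a general loss, the closed-form weighted mean would be replaced by a numerical minimization of the weighted loss at each draw; the weighting scheme is unchanged.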