Posterior prediction vs. parametric bootstrap
The Posterior Predictive Distribution and the Parametric Bootstrap are very similar in how you compute them. Both use repeated simulation from a model fitted to the observed data to build a distribution of new data.
The main difference is that the parametric bootstrap plugs a point estimate (typically the MLE $\hat{\theta}$) into the model and pulls realizations from it. So you are sampling $y^{\text{rep}} \sim f(y \mid \hat{\theta})$, using the same $\hat{\theta}$ every time.
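A minimal sketch of the parametric bootstrap, assuming a Poisson model whose rate MLE is the sample mean; the data, model, and variable names here are illustrative choices, not taken from the text above.

```python
# Parametric bootstrap sketch: one fixed point estimate, reused for every draw.
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.poisson(lam=4.0, size=50)   # stand-in for the observed data

theta_hat = y_obs.mean()                # point estimate (MLE), computed once

# Every bootstrap replicate is drawn from f(y | theta_hat):
# the same theta_hat is used for all 1000 simulated data sets.
boot_reps = rng.poisson(lam=theta_hat, size=(1000, y_obs.size))
```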
The Bayesian prediction algorithm, on the other hand, samples a value $\theta^{(s)}$ from the posterior $p(\theta \mid y)$ and uses it to generate the realizations $y^{\text{rep}} \sim f(y \mid \theta^{(s)})$. So you are using a different $\theta^{(s)}$ for each new sample of $y^{\text{rep}}$ you simulate. This takes the uncertainty of the parameter into account: instead of fixing it at a point estimate, it uses the prior updated through Bayes' rule, i.e., the posterior.
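For comparison, a minimal sketch of posterior predictive sampling for the same hypothetical Poisson model, assuming a conjugate Gamma prior on the rate (my choice of prior, not specified in the text); each simulated data set uses its own posterior draw of the parameter.

```python
# Posterior predictive sketch: a new theta from the posterior for every replicate.
import numpy as np

rng = np.random.default_rng(0)
y_obs = rng.poisson(lam=4.0, size=50)   # same stand-in observed data

a, b = 2.0, 1.0                         # Gamma(shape=a, rate=b) prior on the rate
a_post = a + y_obs.sum()                # conjugate update via Bayes' rule
b_post = b + y_obs.size

# Draw one theta per replicate from the posterior Gamma(a_post, b_post) ...
theta_draws = rng.gamma(shape=a_post, scale=1.0 / b_post, size=1000)

# ... then simulate a new data set from f(y | theta) for each draw, so the
# parameter uncertainty propagates into the predictive distribution.
post_pred = rng.poisson(lam=theta_draws[:, None], size=(1000, y_obs.size))
```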