Ryan Martin, NC State University


Talk Title: All Bayesian Inference is Approximate

Abstract: No, I don’t mean approximate in a numerical sense! In applications, there’s virtually never a single “correct” Bayesian solution. Instead, there’s unavoidable ambiguity in various steps of the modeling process which, if taken seriously, would produce a collection of acceptable posteriors. A “correct” Bayesian solution would therefore have to consider all the acceptable posteriors simultaneously, but that’s not what’s done in practice. Just one posterior is chosen to carry out the analysis, and it’s in this sense that I mean all Bayesian inference is approximate. This realization might seem discouraging, but it doesn’t have to be. I think it sheds new light on modern Bayesian inference and, in particular, on the various recent efforts that fall under the umbrella of “approximate Bayes”. Following some setup, this talk will focus on the approximation of a collection of posteriors by a single posterior and the connections this has to recent work (by myself and others) on Gibbs posterior distributions.