
Generalization of Jeffreys Divergence-Based Priors for Bayesian Hypothesis Testing

Maria J. Bayarri, Gonzalo García-Donato

subject

Statistics and Probability; Kullback–Leibler divergence; Markov chain; Markov chain Monte Carlo; Bayes factor; Mixture model; Prior probability; Econometrics; Applied mathematics; Statistics, Probability and Uncertainty; Divergence (statistics); Statistical hypothesis testing; Mathematics

description

Summary. We introduce objective proper prior distributions for hypothesis testing and model selection that are based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency, and are often similar to existing proposals such as intrinsic priors. Moreover, in normal linear model scenarios they exactly reproduce the Jeffreys–Zellner–Siow priors. Most importantly, in challenging scenarios such as irregular and mixture models, DB priors are well defined and very reasonable, whereas alternative proposals are not. We derive approximations to the DB priors, as well as Markov chain Monte Carlo and asymptotic expressions for the associated Bayes factors.
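As a rough illustration of the construction the summary describes, the sketch below (in Python; the setting, all names, and the specific prior form are assumptions for illustration, not taken from this record) builds a divergence-based prior for the simplest normal-mean test. It assumes the DB prior is a decreasing power of the symmetrized Kullback–Leibler (Jeffreys) divergence, pi(theta) proportional to (1 + D(theta))^(-q); with q = 1 in the normal case this yields a Cauchy prior, consistent with the summary's remark that DB priors reproduce the Jeffreys–Zellner–Siow priors, and the Bayes factor is then estimated by plain Monte Carlo.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative setting (an assumption, not from the record): test
# H0: theta = 0 against H1: theta != 0 for X_i ~ N(theta, sigma^2),
# with sigma known.
sigma = 1.0
x = rng.normal(0.5, sigma, size=20)  # simulated data

# The Jeffreys (symmetrized Kullback-Leibler) divergence per observation
# between N(theta, sigma^2) and the null N(0, sigma^2) is
# D(theta) = theta^2 / sigma^2. Assuming a DB prior of the form
# pi(theta) ~ (1 + D(theta))^(-q), the choice q = 1 gives a
# Cauchy(0, sigma) prior, i.e. the Jeffreys-Zellner-Siow prior that the
# summary says DB priors reproduce in normal settings.
prior = stats.cauchy(loc=0.0, scale=sigma)

def log_lik(theta, x, sigma):
    """Normal log-likelihood of the sample x at each mean in theta."""
    return stats.norm(theta, sigma).logpdf(x[:, None]).sum(axis=0)

# Plain Monte Carlo estimate of the marginal likelihood under H1:
# m1 = E_prior[f(x | theta)], averaged over prior draws on the log scale.
draws = prior.rvs(size=50_000, random_state=rng)
log_m1 = np.logaddexp.reduce(log_lik(draws, x, sigma)) - np.log(draws.size)

# Under the point null H0, the marginal is just the likelihood at theta = 0.
log_m0 = log_lik(np.array([0.0]), x, sigma)[0]

print("log Bayes factor B10 ~", log_m1 - log_m0)
```

A positive log Bayes factor favours H1 here; the paper itself develops Markov chain Monte Carlo and asymptotic expressions for such Bayes factors, of which this naive prior-sampling average is only the simplest stand-in.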

https://doi.org/10.1111/j.1467-9868.2008.00667.x