
AUTHOR

M. J. Bayarri

Generalization of Jeffreys' divergence based priors for Bayesian hypothesis testing

In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency; they are often similar to other existing proposals, such as the intrinsic priors; moreover, in normal linear model scenarios they exactly reproduce Jeffreys-Zellner-Siow priors. Most importantly, in challenging scenarios such as irregular models and mixture models, the DB priors are well defined and very reasonable, while alternative proposals are not. We derive approximations to the D…
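The Bayes-factor machinery the abstract refers to can be sketched numerically. The example below is ours, not the paper's DB priors: it tests a point null on a normal mean with a standard Cauchy prior under the alternative (the Jeffreys/Zellner-Siow flavour of prior mentioned above), computing the Bayes factor by integrating the likelihood against the prior.

```python
import math
from scipy.integrate import quad

def bayes_factor_01(xbar, n):
    """Bayes factor B01 for H0: mu = 0 vs H1: mu != 0, given the sample
    mean xbar of n observations from N(mu, 1), with a standard Cauchy
    prior on mu under H1. Illustrative sketch only."""
    def norm_pdf(x, mean, var):
        return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    m0 = norm_pdf(xbar, 0.0, 1.0 / n)  # marginal of xbar under the point null
    # Under H1, average the likelihood over the Cauchy prior density.
    integrand = lambda mu: norm_pdf(xbar, mu, 1.0 / n) / (math.pi * (1.0 + mu ** 2))
    m1, _ = quad(integrand, -math.inf, math.inf)
    return m0 / m1

b01 = bayes_factor_01(xbar=0.0, n=10)  # data fully consistent with H0
```

With the sample mean exactly at the null value, the Bayes factor favours H0 (B01 > 1), as one would expect.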

research product

Truncation, Information, and the Coefficient of Variation

The Fisher information in a random sample from the truncated version of a distribution that belongs to an exponential family is compared with the Fisher information in a random sample from the untruncated distribution. Conditions under which there is more information in the selection sample are given. Examples involving the normal and gamma distributions with various selection sets, and the zero-truncated binomial, Poisson, and negative binomial distributions are discussed. A property pertaining to the coefficient of variation of certain discrete distributions on the non-negative integers is introduced and shown to be satisfied by all binomial, Poisson, and negative binomial distributions.

research product

The relation between theory and application in statistics

General comments are made on the relation between theory and application in statistics, with emphasis on issues and principles of model formulation. Three examples are described in outline. Criteria for the choice of models are discussed.

research product

MCMC methods to approximate conditional predictive distributions

Sampling from conditional distributions is a problem often encountered in statistics when inferences are based on conditional distributions that are not available in closed form. Several Markov chain Monte Carlo (MCMC) algorithms for simulating from them are proposed. Potential problems are pointed out and suitable modifications are suggested. Approximations based on conditioning sets are also explored. The issues are illustrated within a specific statistical tool for Bayesian model checking and compared in an example. An example in frequentist conditional testing is also given.
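A minimal sketch of the kind of sampler the abstract discusses: random-walk Metropolis targeting a conditional distribution known only up to a normalizing constant, here X | X > 1 with X ~ N(0, 1). The target, step size, and chain length are ours for illustration, not the paper's algorithms.

```python
import math
import random

def unnormalized_target(x):
    # N(0, 1) density (up to a constant) restricted to the conditioning set x > 1
    return math.exp(-0.5 * x * x) if x > 1.0 else 0.0

def rw_metropolis(n_iter=20000, start=1.5, step=1.0, seed=42):
    rng = random.Random(seed)
    x, out = start, []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, step)
        # Usual Metropolis ratio; proposals outside the conditioning set
        # have density 0 and are always rejected.
        if rng.random() < unnormalized_target(prop) / unnormalized_target(x):
            x = prop
        out.append(x)
    return out

draws = rw_metropolis()
mean_est = sum(draws) / len(draws)  # true E[X | X > 1] = phi(1)/(1 - Phi(1)) ~ 1.525
```

Every retained draw respects the conditioning event, and the chain's mean approximates the truncated-normal mean without ever evaluating the normalizing constant.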

research product