0000000000105876

AUTHOR

M. J. Bayarri

4 related works by this author

Generalization of Jeffreys' divergence based priors for Bayesian hypothesis testing

2008

In this paper we introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties, such as information (finite-sample) consistency; they are often similar to other existing proposals, such as the intrinsic priors, and in normal linear model scenarios they exactly reproduce the Jeffreys-Zellner-Siow priors. Most importantly, in challenging scenarios such as irregular models and mixture models, the DB priors are well defined and very reasonable, while alternative proposals are not. We derive approximations to the D…
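
As an illustration of the general idea only, and not the paper's exact definition, one hedged way to turn a divergence into a proper prior is to take a nonnegative per-observation divergence D(theta) between the null and the alternative model and downweight parameter values far from the null; in LaTeX,

    % Hedged sketch of a generic divergence-based prior construction;
    % D(\theta) is an assumed nonnegative per-observation divergence
    % (e.g. a symmetrised Kullback--Leibler divergence) between the
    % competing models, and q > 0 is taken large enough that the
    % prior is proper.
    \pi^{D}(\theta) \;\propto\; \bigl[\,1 + D(\theta)\,\bigr]^{-q},
    \qquad
    \int \bigl[\,1 + D(\theta)\,\bigr]^{-q}\, d\theta \;<\; \infty .

Properness of such a prior depends both on q and on how fast D(theta) grows as theta moves away from the null.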

Methodology (stat.ME); FOS: Computer and information sciences; Statistics - Methodology

Truncation, Information, and the Coefficient of Variation

1989

The Fisher information in a random sample from the truncated version of a distribution that belongs to an exponential family is compared with the Fisher information in a random sample from the untruncated distribution. Conditions under which there is more information in the selection sample are given. Examples involving the normal and gamma distributions with various selection sets, and the zero-truncated binomial, Poisson, and negative binomial distributions, are discussed. A property pertaining to the coefficient of variation of certain discrete distributions on the non-negative integers is introduced and shown to be satisfied by all binomial, Poisson, and negative binomial distributions.
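
As a hedged illustration of this kind of comparison (the zero-truncated Poisson is one of the examples mentioned above), the following Python sketch estimates the per-observation Fisher information about the Poisson mean by the Monte Carlo variance of the score, for both the untruncated and the zero-truncated distribution; the function names, sample sizes, and parameter values are illustrative and not taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def fisher_info_poisson(lam, n=200_000):
        """Monte Carlo estimate of the Fisher information about lam in one
        observation from Poisson(lam); the score is x/lam - 1."""
        x = rng.poisson(lam, size=n)
        return (x / lam - 1.0).var()

    def fisher_info_zero_truncated_poisson(lam, n=200_000):
        """Monte Carlo estimate of the Fisher information about lam in one
        observation from Poisson(lam) truncated to x >= 1; sampling is by
        rejection of zeros, and the score gains the constant term
        -exp(-lam) / (1 - exp(-lam)) from the normalising constant."""
        x = rng.poisson(lam, size=4 * n)
        x = x[x >= 1][:n]
        score = x / lam - 1.0 - np.exp(-lam) / (1.0 - np.exp(-lam))
        return score.var()

    for lam in (0.5, 1.0, 3.0):
        print(lam,
              fisher_info_poisson(lam),                 # theory: 1 / lam
              fisher_info_zero_truncated_poisson(lam))

Because the extra term in the truncated score is a constant in x, the comparison reduces to comparing Var(X)/lam^2 under the two sampling schemes.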

Exponential family; Binomial (polynomial); Negative binomial distribution; Gamma distribution; Applied mathematics; Probability distribution; Truncation (statistics); Poisson distribution; Mathematics; Truncated distribution

The relation between theory and application in statistics

1995

General comments are made on the relation between theory and application in statistics, with emphasis on issues and principles of model formulation. Three examples are described in outline. Criteria for the choice of models are discussed.

Statistics and Probability; Relation (database); Statistics; Statistics, Probability and Uncertainty; Mathematics; Test

MCMC methods to approximate conditional predictive distributions

2006

Sampling from conditional distributions is a problem often encountered in statistics when inferences are based on conditional distributions that are not available in closed form. Several Markov chain Monte Carlo (MCMC) algorithms to simulate from them are proposed. Potential problems are pointed out and some suitable modifications are suggested. Approximations based on conditioning sets are also explored. The issues are illustrated within a specific statistical tool for Bayesian model checking and compared in an example. An example in frequentist conditional testing is also given.
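
For the general flavour only, and not the specific algorithms or modifications studied in the paper, a minimal random-walk Metropolis-Hastings sketch in Python for drawing from a density known only up to a normalising constant, such as a conditional density restricted to a conditioning set, might look as follows; the placeholder target (a standard normal conditioned on x > 1), the proposal scale, and the burn-in length are assumptions for illustration.

    import numpy as np

    rng = np.random.default_rng(1)

    def metropolis_hastings(log_target, x0, n_iter=20_000, prop_sd=0.5):
        """Random-walk Metropolis-Hastings for a univariate target known only
        up to a normalising constant; log_target may return -inf outside the
        support (e.g. outside a conditioning set)."""
        x, lp = x0, log_target(x0)
        draws = np.empty(n_iter)
        for i in range(n_iter):
            x_new = x + prop_sd * rng.normal()
            lp_new = log_target(x_new)
            # Accept with probability min(1, target(x_new) / target(x)).
            if np.log(rng.uniform()) < lp_new - lp:
                x, lp = x_new, lp_new
            draws[i] = x
        return draws

    # Placeholder target: a standard normal density conditioned on {x > 1},
    # known only up to a constant.
    def log_conditional(x):
        return -0.5 * x ** 2 if x > 1.0 else -np.inf

    samples = metropolis_hastings(log_conditional, x0=1.5)
    print(samples[5000:].mean())   # discard an initial burn-in period

Returning -inf outside the conditioning set is one simple way to restrict a sampler to a conditional support; proposals that land outside are then always rejected.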

Statistics and Probability; Markov chain; Applied Mathematics; Markov chain Monte Carlo; Conditional probability distribution; Bayesian inference; Computational Mathematics; Metropolis–Hastings algorithm; Computational Theory and Mathematics; Sampling distribution; Frequentist inference; Econometrics; Algorithm; Mathematics; Gibbs sampling; Computational Statistics & Data Analysis