Search results for "bayesian"
Showing 10 of 604 documents
Exponential and Bayesian conjugate families: Review and extensions
1997
The notion of a conjugate family of distributions plays a very important role in the Bayesian approach to parametric inference. One of the main features of such a family is that it is closed under sampling, but a conjugate family often provides prior distributions which are tractable in various other respects. This paper is concerned with the properties of conjugate families for exponential family models. Special attention is given to the class of natural exponential families having a quadratic variance function, for which the theory is particularly fruitful. Several classes of conjugate families have been considered in the literature and here we describe some of their most interesting feat…
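A minimal sketch of the closure-under-sampling property described above, using the standard Gamma–Poisson conjugate pair (the function name and numbers are illustrative, not taken from the paper): observing Poisson counts updates a Gamma(a, b) prior to another Gamma distribution in closed form.

```python
import numpy as np

def gamma_poisson_update(a, b, counts):
    """Conjugate update: a Gamma(a, b) prior on a Poisson rate plus observed
    counts yields a Gamma(a + sum(counts), b + n) posterior."""
    counts = np.asarray(counts)
    return a + counts.sum(), b + counts.size

# Illustrative values only.
a_post, b_post = gamma_poisson_update(a=2.0, b=1.0, counts=[3, 5, 4, 2])
print("posterior mean of the Poisson rate:", a_post / b_post)  # 16.0 / 5.0
```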
Model comparison and selection for stationary space–time models
2007
An intensive simulation study comparing the spatio-temporal prediction performance of various space-time models is presented. Models with separable spatio-temporal covariance functions and nonseparable ones are considered under various scenarios. The computational performance of the selected models is compared. The issue of how to select an appropriate space-time model by accounting for the tradeoff between goodness-of-fit and model complexity is addressed. The performance of the two commonly used model-selection criteria, the Akaike information criterion and the Bayesian information criterion, is examined. Furthermore, a practical application based on the statistical ana…
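For reference, a minimal sketch of how the two criteria named above are computed from a fitted model's maximized log-likelihood; the model names and numbers are hypothetical, not results from the study.

```python
import numpy as np

def aic(loglik, k):
    """Akaike information criterion: 2k - 2*loglik (smaller is better)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion: k*log(n) - 2*loglik (smaller is better)."""
    return k * np.log(n) - 2 * loglik

# Hypothetical fits: (maximized log-likelihood, number of parameters).
models = {"separable": (-512.3, 4), "nonseparable": (-498.7, 9)}
n_obs = 400
for name, (ll, k) in models.items():
    print(f"{name}: AIC={aic(ll, k):.1f}, BIC={bic(ll, k, n_obs):.1f}")
```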
Adaptive Metropolis algorithm using variational Bayesian adaptive Kalman filter
2013
Markov chain Monte Carlo (MCMC) methods are powerful computational tools for analysis of complex statistical problems. However, their computational efficiency is highly dependent on the chosen proposal distribution, which is generally difficult to find. One way to solve this problem is to use adaptive MCMC algorithms which automatically tune the statistics of a proposal distribution during the MCMC run. A new adaptive MCMC algorithm, called the variational Bayesian adaptive Metropolis (VBAM) algorithm, is developed. The VBAM algorithm updates the proposal covariance matrix using the variational Bayesian adaptive Kalman filter (VB-AKF). A strong law of large numbers for the VBAM algorithm is…
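The VB-AKF update itself is not reproduced here; the following is a hedged sketch of the plain (Haario-style) adaptive Metropolis idea the abstract builds on, where the proposal covariance is tuned from the empirical covariance of past samples.

```python
import numpy as np

def adaptive_metropolis(log_post, x0, n_iter=5000, adapt_start=500, eps=1e-6):
    """Adaptive Metropolis with an empirically estimated proposal covariance.
    This is the generic scheme, not the VBAM/VB-AKF update from the paper."""
    rng = np.random.default_rng(0)
    x = np.asarray(x0, dtype=float)
    d = x.size
    lp = log_post(x)
    cov = np.eye(d)
    scale = 2.38 ** 2 / d                        # standard dimension-dependent scaling
    samples = np.empty((n_iter, d))
    for i in range(n_iter):
        prop = rng.multivariate_normal(x, scale * cov + eps * np.eye(d))
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
        if i >= adapt_start:                     # adapt after an initial period
            cov = np.cov(samples[: i + 1].T)
    return samples

# Example target: a correlated 2-D Gaussian (log-density up to a constant).
draws = adaptive_metropolis(lambda z: -0.5 * (z[0] ** 2 + (z[1] - z[0]) ** 2),
                            x0=[0.0, 0.0])
```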
Optimal Reporting of Predictions
1989
Consider a problem in which you and a group of other experts must report your individual predictive distributions for an observable random variable X to some decision maker. Suppose that the report of each expert is assigned a prior weight by the decision maker and that these weights are then updated based on the observed value of X. In this situation you will try to maximize your updated, or posterior, weight by appropriately choosing the distribution that you report, rather than necessarily simply reporting your honest predictive distribution. We study optimal reporting strategies under various conditions regarding your knowledge and beliefs about X and the reports of the other e…
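A minimal sketch of the weight-updating mechanism the abstract describes (the numbers are hypothetical): the decision maker multiplies each expert's prior weight by that expert's reported predictive density evaluated at the observed x, then renormalizes.

```python
import numpy as np

def update_weights(prior_weights, reported_densities_at_x):
    """Posterior weight of expert i is proportional to
    prior_weight_i * p_i(x), where p_i is the density expert i reported."""
    w = np.asarray(prior_weights) * np.asarray(reported_densities_at_x)
    return w / w.sum()

# Hypothetical: three experts with equal prior weight; the observed x has
# these densities under their reported predictive distributions.
print(update_weights([1/3, 1/3, 1/3], [0.10, 0.25, 0.05]))  # expert 2 gains weight
```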
Bayesian analysis of a Gibbs hard-core point pattern model with varying repulsion range
2014
A Bayesian solution is suggested for the modelling of spatial point patterns with inhomogeneous hard-core radius using Gaussian processes in the regularization. The key observation is that a straightforward use of the finite Gibbs hard-core process likelihood together with a log-Gaussian random field prior does not work without penalisation towards high local packing density. Instead, a nearest neighbour Gibbs process likelihood is used. This approach to hard-core inhomogeneity is an alternative to the transformation inhomogeneous hard-core modelling. The computations are based on recent Markovian approximation results for Gaussian fields. As an application, data on the nest locations of Sa…
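As a rough illustration of what a location-dependent hard-core radius means (the pairwise rule below is one possible convention, not necessarily the one used in the paper), a pattern satisfies the hard-core condition if every pair of points is at least as far apart as the larger of their two radii.

```python
import numpy as np

def satisfies_hard_core(points, radius_fn):
    """Check a hard-core condition with a location-dependent radius:
    every pair must be at least max(R(x_i), R(x_j)) apart.
    (Illustrative convention only.)"""
    pts = np.asarray(points, dtype=float)
    radii = np.array([radius_fn(p) for p in pts])
    for i in range(len(pts)):
        for j in range(i + 1, len(pts)):
            if np.linalg.norm(pts[i] - pts[j]) < max(radii[i], radii[j]):
                return False
    return True

# Hypothetical radius field: the hard-core radius grows toward the right edge.
print(satisfies_hard_core([[0.1, 0.2], [0.5, 0.5], [0.9, 0.8]],
                          lambda p: 0.05 + 0.1 * p[0]))
```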
Deriving Reference Decisions
1998
To solve a statistical decision problem from a Bayesian viewpoint, the decision maker must specify a probability distribution on the parameter space, his prior distribution. In order to analyze the influence of this prior distribution on the solution of the problem, Bernardo (1981) proposed to compare the results with those that one would obtain by using that prior distribution which maximizes the useful experimental information, thus introducing the concept of reference decision. This definition is too involved for most of the problems usually found in practice. Here we analyze situations in which it is possible to simplify the definition of the reference decision, and we provide condition…
Bayesian Smoothing in the Estimation of the Pair Potential Function of Gibbs Point Processes
1999
A flexible Bayesian method is suggested for the pair potential estimation with a high-dimensional parameter space. The method is based on a Bayesian smoothing technique, commonly applied in statistical image analysis. For the calculation of the posterior mode estimator a new Monte Carlo algorithm is developed. The method is illustrated through examples with both real and simulated data, and its extension into truly nonparametric pair potential estimation is discussed.
Rejoinder: Bayesian Checking of the Second Levels of Hierarchical Models
2008
Rejoinder: Bayesian Checking of the Second Levels of Hierarchical Models [arXiv:0802.0743]
P Values for Composite Null Models
2000
The problem of assessing the compatibility of an assumed model with the data is investigated in the situation where the assumed model has unknown parameters. The most frequently used measures of compatibility are p values, based on statistics T for which large values are deemed to indicate incompatibility of the data and the model. When the null model has unknown parameters, p values are not uniquely defined. The proposals for computing a p value in such a situation include the plug-in and similar p values on the frequentist side, and the predictive and posterior predictive p values on the Bayesian side. We propose two alternatives, the conditional predictive p value and the partial…
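A minimal Monte Carlo sketch of the posterior predictive p value mentioned above (the normal model, the stand-in posterior draws, and the choice of T as the sample maximum are all illustrative assumptions, not from the paper): replicate data from the posterior predictive and count how often the replicated T is at least as large as the observed T.

```python
import numpy as np

def posterior_predictive_pvalue(t_obs, posterior_draws, simulate, statistic, rng):
    """Fraction of replicated data sets, drawn from the posterior predictive,
    whose statistic is at least as large as the observed one."""
    t_rep = np.array([statistic(simulate(theta, rng)) for theta in posterior_draws])
    return np.mean(t_rep >= t_obs)

rng = np.random.default_rng(0)
data = np.array([1.2, 0.7, 2.9, 1.5])
# Stand-in posterior for the mean of a N(mu, 1) model (illustrative only).
post_mu = rng.normal(data.mean(), 0.5, size=2000)
p = posterior_predictive_pvalue(
    t_obs=data.max(),
    posterior_draws=post_mu,
    simulate=lambda mu, r: r.normal(mu, 1.0, size=data.size),
    statistic=np.max,
    rng=rng,
)
print("posterior predictive p value:", p)
```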
Gaussian component mixtures and CAR models in Bayesian disease mapping
2012
Hierarchical Bayesian models involving conditional autoregression (CAR) components are commonly used in disease mapping. An alternative model to the proper or improper CAR is the Gaussian component mixture (GCM) model. A review of CAR and GCM models is provided in univariate settings, where only one disease is considered, and also in multivariate settings, where the dependence among multiple diseases is analyzed in addition to the spatial dependence between regions. A performance comparison between the models, using a set of simulated data to help illustrate their respective properties, is reported. The results show that in both univariate and multivariate settings, both models perform in a comp…
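As a small, hedged illustration of the CAR component the abstract refers to (the standard intrinsic-CAR construction; the paper's exact parameterisation may differ), the precision matrix of the spatial random effects is built from the region adjacency structure.

```python
import numpy as np

def icar_precision(adjacency):
    """Intrinsic CAR precision matrix Q = D - W, where W is the 0/1
    neighbourhood matrix and D is diagonal with each region's neighbour count."""
    W = np.asarray(adjacency, dtype=float)
    return np.diag(W.sum(axis=1)) - W

# Hypothetical map with four regions in a line: 1-2, 2-3, 3-4 are neighbours.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(icar_precision(W))
```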