Search results for "probability"
Showing 10 of 3,417 documents
Adaptive Metropolis algorithm using variational Bayesian adaptive Kalman filter
2013
Markov chain Monte Carlo (MCMC) methods are powerful computational tools for analysis of complex statistical problems. However, their computational efficiency is highly dependent on the chosen proposal distribution, which is generally difficult to find. One way to solve this problem is to use adaptive MCMC algorithms which automatically tune the statistics of a proposal distribution during the MCMC run. A new adaptive MCMC algorithm, called the variational Bayesian adaptive Metropolis (VBAM) algorithm, is developed. The VBAM algorithm updates the proposal covariance matrix using the variational Bayesian adaptive Kalman filter (VB-AKF). A strong law of large numbers for the VBAM algorithm is…
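The adaptation idea can be sketched with the classical Haario-style adaptive Metropolis sampler, where the proposal covariance is tuned from the chain's running sample covariance. This is a generic sketch only: the paper's VBAM algorithm replaces this empirical update with a variational Bayesian adaptive Kalman filter update, which is not reproduced here.

```python
import numpy as np

def adaptive_metropolis(log_target, x0, n_iter=5000, eps=1e-6, seed=0):
    """Adaptive Metropolis: the proposal covariance is tuned from the
    running sample covariance of the chain (Haario-style adaptation).
    VBAM replaces this empirical update with a VB-AKF update."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    sd = 2.4**2 / d                       # standard scaling factor
    chain = np.empty((n_iter, d))
    x = np.asarray(x0, float)
    lp = log_target(x)
    mean, cov = x.copy(), np.eye(d)
    for t in range(n_iter):
        prop_cov = sd * (cov + eps * np.eye(d))
        y = rng.multivariate_normal(x, prop_cov)
        lpy = log_target(y)
        if np.log(rng.random()) < lpy - lp:
            x, lp = y, lpy
        chain[t] = x
        # recursive update of the running mean and covariance
        delta = x - mean
        mean += delta / (t + 2)
        cov += (np.outer(delta, x - mean) - cov) / (t + 2)
    return chain

# Example: sample a 2-D standard normal target.
chain = adaptive_metropolis(lambda v: -0.5 * v @ v, np.zeros(2))
```

The `eps` regularisation keeps the proposal covariance positive definite even before the chain has explored the target.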
Quantile regression via iterative least squares computations
2012
We present an estimating framework for quantile regression where the usual L1-norm objective function is replaced by its smooth parametric approximation. An exact path-following algorithm is derived, leading to the well-known ‘basic’ solutions interpolating exactly a number of observations equal to the number of parameters being estimated. We discuss briefly possible practical implications of the proposed approach, such as early stopping for large data sets, confidence intervals, and additional topics for future research.
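The connection between the non-smooth check loss and least-squares computations can be illustrated with a generic iteratively reweighted least squares (IRLS) scheme; this is a common smoothing device and not the exact path-following algorithm the paper derives.

```python
import numpy as np

def quantile_irls(X, y, tau=0.5, eps=1e-4, n_iter=200):
    """Quantile regression via iteratively reweighted least squares.
    The non-smooth check loss rho_tau(r) = r * (tau - 1{r < 0}) is
    handled by weighting residual r_i with
    w_i = |tau - 1{r_i < 0}| / max(|r_i|, eps), so each step solves a
    weighted least-squares problem.  A generic smoothing/IRLS sketch,
    not the paper's exact path-following algorithm."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - X @ beta
        w = np.where(r >= 0, tau, 1 - tau) / np.maximum(np.abs(r), eps)
        sw = np.sqrt(w)
        beta_new = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < 1e-8:
            beta = beta_new
            break
        beta = beta_new
    return beta

# Median regression (tau = 0.5) on a toy intercept-only problem:
# the fit approximates the sample median, ignoring the outlier 100.
y = np.array([1.0, 2.0, 3.0, 4.0, 100.0])
X = np.ones((5, 1))
beta = quantile_irls(X, y, tau=0.5)
```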
Forward likelihood-based predictive approach for space-time point processes
2011
When dealing with data from a space–time point process, the estimation of the conditional intensity function is a crucial issue, even when a complete definition of a parametric model is not available. In particular, in exploratory contexts, or when we want to assess the adequacy of a specific parametric model, some kind of nonparametric estimation procedure can be useful. Often, kernel estimators are used for these purposes, and the estimation of the intensity function depends on the estimation of bandwidth parameters. In some fields, such as seismology, predictive properties of the estimated intensity function are pursued. Since a direct ML approach cannot be used, we pr…
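The nonparametric estimator mentioned above can be sketched as a plain Gaussian kernel intensity estimate for a spatial pattern; this is a generic, edge-uncorrected illustration in which the bandwidth is the tuning parameter the abstract identifies as crucial.

```python
import numpy as np

def kernel_intensity(points, grid, bandwidth):
    """Kernel estimate of a 2-D point-process intensity:
    lambda_hat(s) = sum_i K_h(s - x_i), with a Gaussian kernel K_h.
    A generic, edge-uncorrected sketch; bandwidth choice is the
    crucial tuning step."""
    diffs = grid[:, None, :] - points[None, :, :]           # (G, N, 2)
    sq = np.sum(diffs**2, axis=-1) / bandwidth**2
    k = np.exp(-0.5 * sq) / (2 * np.pi * bandwidth**2)
    return k.sum(axis=1)                                     # (G,)

# A homogeneous pattern of 200 points on the unit square: the
# estimated intensity at interior locations should be roughly 200.
rng = np.random.default_rng(1)
pts = rng.uniform(0, 1, size=(200, 2))
grid = np.column_stack([np.full(5, 0.5), np.linspace(0.2, 0.8, 5)])
lam = kernel_intensity(pts, grid, bandwidth=0.1)
```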
Equivalence Testing With Particle Size Distribution Data: Methods and Applications in the Development of Inhalative Drugs
2017
Abstract Key criteria of the quality of inhalative drugs are assessed in experiments generating so-called particle size distributions as data. Many experiments of that kind are carried out to demonstrate that necessary modifications to whatever part of the manufacturing process do not substantially change basic characteristics of an inhalable drug product. The equivalence testing procedures we derive for that purpose rely on different models accommodating the specific structure of such data and on different ways of specifying the region of nonrelevant differences. For each hypothesis formulation, three different tests are derived (two parametric and one asymptotically distribution-free proce…
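The basic logic of equivalence testing can be illustrated with the generic two one-sided tests (TOST) procedure for a mean difference within a margin ±delta; the paper derives tests tailored to particle size distribution data, which this large-sample sketch does not attempt to reproduce.

```python
import numpy as np
from math import sqrt
from statistics import NormalDist

def tost_z(x, y, delta):
    """Generic two one-sided tests (TOST) for equivalence of two means
    within +/- delta, using a large-sample z approximation.  This only
    illustrates the basic TOST idea, not the paper's tailored tests."""
    diff = np.mean(x) - np.mean(y)
    se = sqrt(np.var(x, ddof=1) / len(x) + np.var(y, ddof=1) / len(y))
    z_lower = (diff + delta) / se       # H0: diff <= -delta
    z_upper = (diff - delta) / se       # H0: diff >= +delta
    p_lower = 1 - NormalDist().cdf(z_lower)
    p_upper = NormalDist().cdf(z_upper)
    return max(p_lower, p_upper)        # equivalence claimed if small

# Two samples whose true means differ by 0.1, well inside a +/- 0.5
# equivalence margin, so the TOST p-value should be small.
rng = np.random.default_rng(3)
x = rng.normal(10.0, 1.0, 100)
y = rng.normal(10.1, 1.0, 100)
p = tost_z(x, y, delta=0.5)
```

Equivalence is concluded only when *both* one-sided nulls are rejected, which is why the larger of the two p-values is reported.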
Linear Recursive Equations, Covariance Selection, and Path Analysis
1980
Abstract By defining a reducible zero pattern and by using the concept of multiplicative models, we relate linear recursive equations that have been introduced by econometrician Herman Wold (1954) and path analysis as it was proposed by geneticist Sewall Wright (1923) to the statistical theory of covariance selection formulated by Arthur Dempster (1972). We show that a reducible zero pattern is the condition under which parameters as well as least squares estimates in recursive equations are one-to-one transformations of parameters and of maximum likelihood estimates, respectively, in a decomposable covariance selection model. As a consequence, (a) we can give a closed-form expression for t…
Robustifying principal component analysis with spatial sign vectors
2012
Abstract In this paper, we apply orthogonally equivariant spatial sign covariance matrices as well as their affine equivariant counterparts in principal component analysis. The influence functions and asymptotic covariance matrices of eigenvectors based on robust covariance estimators are derived in order to compare the robustness and efficiency properties. We show in particular that the estimators that use pairwise differences of the observed data have very good efficiency properties, providing practical robust alternatives to classical sample covariance matrix based methods.
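The orthogonally equivariant variant can be sketched directly: project each centered observation onto the unit sphere, form the covariance of these spatial signs, and take its eigenvectors. The sketch below centers with the coordinatewise median for simplicity (the spatial median is the more common robust choice), so it is an illustration rather than the paper's exact estimator.

```python
import numpy as np

def spatial_sign_pca(X):
    """PCA based on the spatial sign covariance matrix: each centered
    observation is scaled to the unit sphere before forming the
    covariance, so gross outliers cannot dominate the eigenvectors.
    Centering uses the coordinatewise median for simplicity."""
    centered = X - np.median(X, axis=0)
    norms = np.linalg.norm(centered, axis=1, keepdims=True)
    signs = centered / np.where(norms == 0, 1.0, norms)   # unit vectors
    scov = signs.T @ signs / len(X)                       # sign covariance
    vals, vecs = np.linalg.eigh(scov)
    order = np.argsort(vals)[::-1]
    return vals[order], vecs[:, order]

# A strongly correlated 2-D cloud plus one gross outlier: the leading
# eigenvector still points along the (1, 1) direction.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = np.hstack([z, z + 0.1 * rng.normal(size=(500, 1))])
X[0] = [100.0, -100.0]                                    # outlier
vals, vecs = spatial_sign_pca(X)
```

Because the outlier is mapped to a unit vector, it contributes the same weight as any other point to the sign covariance.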
Boolean Models: Maximum Likelihood Estimation from Circular Clumps
1990
This paper deals with the problem of making inferences on the maximum radius and the intensity of the Poisson point process associated with a Boolean model of circular primary grains with uniformly distributed random radii. The only sample information used is the observed radii of circular clumps (DUPAC, 1980). The behaviour of maximum likelihood estimation has been evaluated by means of Monte Carlo methods.
Optimal Reporting of Predictions
1989
Abstract Consider a problem in which you and a group of other experts must report your individual predictive distributions for an observable random variable X to some decision maker. Suppose that the report of each expert is assigned a prior weight by the decision maker and that these weights are then updated based on the observed value of X. In this situation you will try to maximize your updated, or posterior, weight by appropriately choosing the distribution that you report, rather than necessarily simply reporting your honest predictive distribution. We study optimal reporting strategies under various conditions regarding your knowledge and beliefs about X and the reports of the other e…
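The weight-update rule described in the abstract is the usual Bayesian one: each expert's prior weight is multiplied by the predictive density they reported, evaluated at the observed value, and renormalised. A minimal sketch of that update (the paper's question of how an expert should game this rule is not addressed here):

```python
import numpy as np

def update_weights(prior_weights, densities_at_x):
    """Posterior weight of each expert after observing X = x:
    w_i' = w_i * f_i(x) / sum_j w_j * f_j(x), where f_i is the
    predictive density expert i reported.  The paper studies how an
    expert should choose f_i to maximize the resulting w_i'."""
    w = np.asarray(prior_weights, float) * np.asarray(densities_at_x, float)
    return w / w.sum()

# Two equally weighted experts; the observed value is four times more
# likely under expert 2's reported density, so expert 2 gains weight.
post = update_weights([0.5, 0.5], [0.1, 0.4])
```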
Bayesian analysis of a Gibbs hard-core point pattern model with varying repulsion range
2014
A Bayesian solution is suggested for the modelling of spatial point patterns with inhomogeneous hard-core radius using Gaussian processes in the regularization. The key observation is that a straightforward use of the finite Gibbs hard-core process likelihood together with a log-Gaussian random field prior does not work without penalisation towards high local packing density. Instead, a nearest neighbour Gibbs process likelihood is used. This approach to hard-core inhomogeneity is an alternative to the transformation inhomogeneous hard-core modelling. The computations are based on recent Markovian approximation results for Gaussian fields. As an application, data on the nest locations of Sa…
Sample Size Requirements of a Mixture Analysis Method with Applications in Systematic Biology
1999
The available information on sample size requirements of mixture analysis methods is insufficient to permit a precise evaluation of the potential problems facing practical applications of mixture analysis. We use results from Monte Carlo simulation to assess the sample size requirements of a simple mixture analysis method under conditions relevant to biological applications of mixture analysis. The mixture model used includes two univariate normal components with equal variances but assumes that the researcher is ignorant as to the equality of the variances. The method used relies on the EM algorithm to compute the maximum likelihood estimates of the mixture parameters, and the likelihood r…
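The method described, EM-based maximum likelihood for a two-component univariate normal mixture, can be sketched as follows. Matching the abstract's setup, the simulated data have equal component variances but the fitted model does not assume them equal; the initialisation and sample sizes below are illustrative choices, not the paper's simulation design.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def em_two_normal(x, n_iter=200):
    """EM for a two-component univariate normal mixture.  The fitted
    model allows unequal variances, matching a researcher who is
    'ignorant as to the equality of the variances'."""
    mu1, mu2 = x.min(), x.max()              # crude initial split
    s1 = s2 = x.std()
    pi = 0.5
    for _ in range(n_iter):
        # E-step: posterior probability of component 1 for each point
        d1 = pi * normal_pdf(x, mu1, s1)
        d2 = (1 - pi) * normal_pdf(x, mu2, s2)
        g = d1 / (d1 + d2)
        # M-step: weighted maximum likelihood updates
        pi = g.mean()
        mu1 = np.sum(g * x) / g.sum()
        mu2 = np.sum((1 - g) * x) / (1 - g).sum()
        s1 = np.sqrt(np.sum(g * (x - mu1) ** 2) / g.sum())
        s2 = np.sqrt(np.sum((1 - g) * (x - mu2) ** 2) / (1 - g).sum())
    return pi, (mu1, s1), (mu2, s2)

# Well-separated equal-variance components: EM recovers both means.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 300), rng.normal(5, 1, 300)])
pi, (mu1, s1), (mu2, s2) = em_two_normal(x)
```

With poorly separated components or small samples, the likelihood surface flattens, which is exactly the sample-size question the paper investigates by simulation.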