Search results for "Prior probability"

Showing 10 of 47 documents

An introduction to Bayesian reference analysis: inference on the ratio of multinomial parameters

1998

This paper offers an introduction to Bayesian reference analysis, often described as the most successful method for producing non-subjective, model-based posterior distributions. The ideas are illustrated in detail with an interesting problem, the ratio of multinomial parameters, for which no model-based Bayesian analysis has been proposed. Signposts are provided to the extensive related literature.
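As a hedged, illustrative aside (this is the standard conjugate Dirichlet route, not the reference analysis the paper develops): with multinomial counts and a Jeffreys-style Dirichlet(1/2, …, 1/2) prior, the posterior is again Dirichlet, and the posterior of a ratio of two cell probabilities can be simulated directly. The counts below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical counts for a three-category multinomial (made up for illustration).
counts = np.array([18, 7, 25])

# Jeffreys-style Dirichlet(1/2, ..., 1/2) prior => posterior is Dirichlet(counts + 1/2).
draws = rng.dirichlet(counts + 0.5, size=100_000)

# Monte Carlo draws from the posterior of the ratio theta_1 / theta_2.
ratio = draws[:, 0] / draws[:, 1]
lo, hi = np.quantile(ratio, [0.025, 0.975])
print(f"posterior median of theta1/theta2: {np.median(ratio):.2f}")
print(f"95% credible interval: ({lo:.2f}, {hi:.2f})")
```

Simulation sidesteps the awkward closed-form density of a ratio of Dirichlet components; the reference-prior treatment in the paper replaces the conjugate prior, not this Monte Carlo step.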

Keywords: Statistics and Probability, Bayesian probability, Posterior probability, Inference, Bayesian inference, Bayesian statistics, Prior probability, Econometrics, Data mining, Bayesian linear regression, Bayesian average, Mathematics
Journal: Journal of the Royal Statistical Society: Series D (The Statistician)

An overview of robust Bayesian analysis

1994

Robust Bayesian analysis is the study of the sensitivity of Bayesian answers to uncertain inputs. This paper seeks to provide an overview of the subject, one that is accessible to statisticians outside the field. Recent developments in the area are also reviewed, though with very uneven emphasis. © 1994 SEIO.

Keywords: Statistics and Probability, Computer science, Bayesian probability, Data science, Bayesian robustness, Robust Bayesian analysis, Prior probability, Data mining, Sensitivity (control systems), Statistics Probability and Uncertainty

Overall Objective Priors

2015

In multi-parameter models, reference priors typically depend on the parameter or quantity of interest, and it is well known that this is necessary to produce objective posterior distributions with optimal properties. There are, however, many situations where one is simultaneously interested in all the parameters of the model or, more realistically, in functions of them that include aspects such as prediction, and it would then be useful to have a single objective prior that could safely be used to produce reasonable posterior inferences for all the quantities of interest. In this paper, we consider three methods for selecting a single objective prior and study, in a variety of problems incl…

Keywords: Statistics and Probability, Applied Mathematics, Statistics Theory (math.ST), Joint Reference Prior, Reference Analysis, Machine learning, Logarithmic Divergence, Objective Priors, Multinomial Model, Prior probability, Multinomial distribution, Artificial intelligence
Journal: Bayesian Analysis

Extending conventional priors for testing general hypotheses in linear models

2007

We consider that observations come from a general normal linear model and that it is desirable to test a simplifying null hypothesis about the parameters. We approach this problem from an objective Bayesian, model-selection perspective. Crucial ingredients for this approach are 'proper objective priors' to be used for deriving the Bayes factors. Jeffreys-Zellner-Siow priors have good properties for testing null hypotheses defined by specific values of the parameters in full-rank linear models. We extend these priors to deal with general hypotheses in general linear models, not necessarily of full rank. The resulting priors, which we call 'conventional priors', are expressed as a generalizat…

Keywords: Statistics and Probability, Generalization, Applied Mathematics, General Mathematics, Model selection, Bayesian probability, Linear model, Bayes factor, Prior probability, Econometrics, Statistics Probability and Uncertainty, Null hypothesis, Statistical hypothesis testing, Mathematics
Journal: Biometrika

Generalization of Jeffreys Divergence-Based Priors for Bayesian Hypothesis Testing

2008

Summary We introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties like information (finite sample) consistency and are often similar to other existing proposals like intrinsic priors. Moreover, in normal linear model scenarios, they reproduce the Jeffreys–Zellner–Siow priors exactly. Most importantly, in challenging scenarios such as irregular models and mixture models, DB priors are well defined and very reasonable, whereas alternative proposals are not. We derive approximations to the DB priors as w…
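For context (standard background, not the paper's DB construction itself): the Kullback–Leibler divergence between two normal distributions has a closed form, reducing to (μ₁ − μ₂)²/(2σ²) when the variances are equal — the kind of divergence measure such priors are built from.

```python
import numpy as np

def kl_normal(mu1, sigma1, mu2, sigma2):
    """KL(N(mu1, sigma1^2) || N(mu2, sigma2^2)) in closed form."""
    return (np.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * sigma2 ** 2)
            - 0.5)

# Equal variances: the divergence reduces to (mu1 - mu2)^2 / (2 sigma^2).
print(kl_normal(0.0, 1.0, 1.0, 1.0))  # 0.5
print(kl_normal(0.0, 1.0, 0.0, 1.0))  # 0.0
```

Note the asymmetry in general: kl_normal(a, b) need not equal kl_normal(b, a), which is why symmetrized divergences (as in Jeffreys' work) are often preferred for prior construction.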

Keywords: Statistics and Probability, Kullback–Leibler divergence, Markov chain, Markov chain Monte Carlo, Bayes factor, Mixture model, Prior probability, Econometrics, Applied mathematics, Statistics Probability and Uncertainty, Divergence (statistics), Statistical hypothesis testing, Mathematics
Journal: Journal of the Royal Statistical Society Series B: Statistical Methodology

Prior-based Bayesian information criterion

2019

We present a new approach to model selection and Bayes factor determination, based on Laplace expansions (as in BIC), which we call Prior-based Bayes Information Criterion (PBIC). In this approach, the Laplace expansion is only done with the likelihood function, and then a suitable prior distribution is chosen to allow exact computation of the (approximate) marginal likelihood arising from the Laplace approximation and the prior. The result is a closed-form expression similar to BIC, but now involves a term arising from the prior distribution (which BIC ignores) and also incorporates the idea that different parameters can have different effective sample sizes (whereas BIC only allows one ov…
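PBIC itself is not reproduced here, but the ordinary BIC it refines is simply k·ln(n) − 2·ln(L̂), with one overall sample size n for every parameter — the limitation the abstract notes. A minimal sketch on made-up regression data:

```python
import numpy as np

def bic(log_likelihood, k, n):
    """Standard BIC: a single overall sample size n for all k parameters."""
    return k * np.log(n) - 2.0 * log_likelihood

def gaussian_loglik(residuals):
    """Maximized log-likelihood of a normal linear model (sigma^2 at its MLE)."""
    m = residuals.size
    sigma2 = np.mean(residuals ** 2)
    return -0.5 * m * (np.log(2.0 * np.pi * sigma2) + 1.0)

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)       # data genuinely depend on x

# Model 0: intercept only.  Model 1: intercept + slope.
r0 = y - y.mean()
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
r1 = y - X @ beta

bic0 = bic(gaussian_loglik(r0), k=2, n=n)    # mean + variance
bic1 = bic(gaussian_loglik(r1), k=3, n=n)    # intercept + slope + variance
print(f"BIC intercept-only: {bic0:.1f}   BIC with slope: {bic1:.1f}")
```

Lower BIC wins; with a true slope of 2 the fuller model should win decisively. PBIC, per the abstract, replaces the single n·ln term with parameter-specific effective sample sizes and an explicit prior contribution.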

Keywords: Statistics and Probability, Laplace expansion, Applied Mathematics, Bayes factor, Marginal likelihood, Computational Theory and Mathematics, Laplace's method, Bayesian information criterion, Prior probability, Statistics Probability and Uncertainty, Likelihood function, Fisher information, Analysis, Mathematics

Bayesian analysis of a disability model for lung cancer survival

2016

Bayesian reasoning, survival analysis and multi-state models are used to assess survival times for Stage IV non-small-cell lung cancer patients and the evolution of the disease over time. Bayesian estimation is done using minimum informative priors for the Weibull regression survival model, leading to an automatic inferential procedure. Markov chain Monte Carlo methods have been used for approximating posterior distributions and the Bayesian information criterion has been considered for covariate selection. In particular, the posterior distribution of the transition probabilities, resulting from the multi-state model, constitutes a very interesting tool which could be useful to help oncolog…
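As background only (the paper's multi-state model and minimum informative priors are not reproduced here): the Weibull building block used in such survival regressions has survival function S(t) = exp(−(t/λ)^k). A small sketch with hypothetical parameters:

```python
import numpy as np

def weibull_survival(t, scale, shape):
    """Weibull survival function S(t) = exp(-(t/scale)**shape)."""
    t = np.asarray(t, dtype=float)
    return np.exp(-(t / scale) ** shape)

# Hypothetical parameters, chosen purely for illustration:
# scale (lambda) = 12 months, shape (k) = 1.3 (mildly increasing hazard).
scale, shape = 12.0, 1.3
months = np.array([3.0, 6.0, 12.0, 24.0])
surv = weibull_survival(months, scale, shape)
for t, s in zip(months, surv):
    print(f"P(survive beyond {t:4.0f} months) = {s:.3f}")
```

At t = scale the survival probability is exactly exp(−1) ≈ 0.368, whatever the shape — a convenient sanity check. In the paper's setting the regression covariates enter through these parameters, with posteriors approximated by MCMC.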

Keywords: Statistics and Probability, Lung Neoplasms, Epidemiology, Computer science, Mathematics, Posterior probability, Bayesian probability, Statistics, Biostatistics, Accelerated failure time models, Bayesian inference, Bayes' theorem, Health Information Management, Bayesian information criterion, Carcinoma Non-Small-Cell Lung, Prior probability, Humans, Biology and Biomedicine, Neoplasm Staging, Bayes estimator, Markov chain Monte Carlo, Survival Analysis, Markov Chains, Minimum informative prior, Multi-state models, Regression Analysis, Weibull distribution, Monte Carlo Method

Criteria for Bayesian model choice with application to variable selection

2012

In objective Bayesian model selection, no single criterion has emerged as dominant in defining objective prior distributions. Indeed, many criteria have been separately proposed and utilized to propose differing prior choices. We first formalize the most general and compelling of the various criteria that have been suggested, together with a new criterion. We then illustrate the potential of these criteria in determining objective model selection priors by considering their application to the problem of variable selection in normal linear models. This results in a new model selection objective prior with a number of compelling properties.
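One classical objective prior in exactly this setting — predating the criteria the paper proposes — is Zellner's g-prior, whose Bayes factor against the intercept-only model depends on the data only through the model's R². A hedged sketch with the common unit-information choice g = n (illustrative numbers only):

```python
def g_prior_bayes_factor(r2, n, k, g):
    """Bayes factor of a normal linear model with k covariates against the
    intercept-only null, under Zellner's g-prior with fixed g:
    BF = (1+g)^((n-1-k)/2) * (1 + g*(1-R^2))^(-(n-1)/2)."""
    return (1.0 + g) ** ((n - 1 - k) / 2.0) * (1.0 + g * (1.0 - r2)) ** (-(n - 1) / 2.0)

n = 100
g = float(n)  # unit-information g-prior
for r2 in (0.0, 0.1, 0.3):
    bf = g_prior_bayes_factor(r2, n=n, k=3, g=g)
    print(f"R^2 = {r2:.1f}: BF vs null = {bf:.3g}")
```

With R² = 0 the extra covariates explain nothing and the Bayes factor correctly favours the null; it grows monotonically in R². Criteria like those in the paper constrain how g (and its mixing distribution) should be chosen.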

Keywords: Statistics and Probability, Mathematical optimization, Model selection, g-prior, Linear model, Feature selection, Statistics Theory (math.ST), Bayesian inference, Prior probability, Statistics Probability and Uncertainty, objective Bayes, variable selection, Mathematics
MSC: 62C10, 62J05, 62J15
Journal: The Annals of Statistics

Objective Priors for Discrete Parameter Spaces

2012

This article considers the development of objective prior distributions for discrete parameter spaces. Formal approaches to such development—such as the reference prior approach—often result in a constant prior for a discrete parameter, which is questionable for problems that exhibit certain types of structure. To take advantage of structure, this article proposes embedding the original problem in a continuous problem that preserves the structure, and then using standard reference prior theory to determine the appropriate objective prior. Four different possibilities for this embedding are explored, and applied to a population-size model, the hypergeometric distribution, the multivariate hy…
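To make the hypergeometric case concrete (this shows the constant prior the abstract questions, not the continuous embedding it proposes): under a flat prior on the discrete parameter N₁ — the number of successes in a finite population — the posterior is just the normalized hypergeometric likelihood. All numbers below are made up.

```python
from math import comb

def hypergeom_pmf(x, N, N1, n):
    """P(X = x): probability of x successes in a sample of size n drawn
    without replacement from a population of N containing N1 successes."""
    return comb(N1, x) * comb(N - N1, n - x) / comb(N, n)

# Flat (constant) prior over the feasible values of N1, given x = 3
# successes in n = 10 draws from a population of N = 50.
N, n, x = 50, 10, 3
likelihood = {N1: hypergeom_pmf(x, N, N1, n)
              for N1 in range(x, N - (n - x) + 1)}
total = sum(likelihood.values())
posterior = {N1: p / total for N1, p in likelihood.items()}
mode = max(posterior, key=posterior.get)
print(f"posterior mode of N1: {mode}")
```

The flat prior makes the posterior mode coincide with the maximum-likelihood value (here near N·x/n = 15); the paper's point is that this constant prior ignores the structure a continuous-embedding reference prior would exploit.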

Keywords: Statistics and Probability, Mathematical optimization, Negative hypergeometric distribution, Geometric distribution, Hypergeometric distribution, Dirichlet distribution, Binomial distribution, Beta-binomial distribution, Prior probability, Statistics Probability and Uncertainty, Compound probability distribution, Mathematics
Journal: Journal of the American Statistical Association

Deriving Reference Decisions

1998

To solve a statistical decision problem from a Bayesian viewpoint, the decision maker must specify a probability distribution on the parameter space, his prior distribution. In order to analyze the influence of this prior distribution on the solution of the problem, Bernardo (1981) proposed to compare the results with those that one would obtain by using that prior distribution which maximizes the useful experimental information, thus introducing the concept of reference decision. This definition is too involved for most of the problems usually found in practice. Here we analyze situations in which it is possible to simplify the definition of the reference decision, and we provide condition…

Keywords: Statistics and Probability, Mathematical optimization, Weak topology, Prior probability, Bayesian probability, Probability distribution, Statistics Probability and Uncertainty, Decision problem, Parameter space, Optimal decision, Mathematics
Journal: Test