Search results for "Statistics::Computation"

Showing 10 of 48 documents.

Robust estimation and inference for bivariate line-fitting in allometry.

2011

In allometry, bivariate techniques related to principal component analysis are often used in place of linear regression, and primary interest is in making inferences about the slope. We demonstrate that the current inferential methods are not robust to bivariate contamination, and consider four robust alternatives to the current methods -- a novel sandwich estimator approach, using robust covariance matrices derived via an influence function approach, Huber's M-estimator and the fast-and-robust bootstrap. Simulations demonstrate that Huber's M-estimators are highly efficient and robust against bivariate contamination, and when combined with the fast-and-robust bootstrap, we can make accurat…
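The Huber M-estimator mentioned above can be illustrated with a minimal sketch. The function below computes a Huber M-estimate of *location* via iteratively reweighted averaging, using only NumPy; it is an illustrative toy (the name `huber_location`, the tuning constant c = 1.345, and the contaminated sample are assumptions here), not the paper's bivariate line-fitting procedure.

```python
import numpy as np

def huber_location(x, c=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted averaging."""
    mu = np.median(x)                                # robust starting value
    scale = np.median(np.abs(x - mu)) / 0.6745       # MAD-based scale estimate
    for _ in range(max_iter):
        r = (x - mu) / scale                         # standardized residuals
        w = np.minimum(1.0, c / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), np.full(20, 50.0)])  # ~10% gross outliers
print(round(huber_location(x), 2))   # stays near 0; the sample mean is pulled to ~4.5
```

The weights downweight observations whose standardized residual exceeds c, which is what gives the estimator its robustness to contamination.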

Statistics and Probability; Heteroscedasticity; Analysis of Variance; Covariance matrix; Robust statistics; Estimator; General Medicine; Bivariate analysis; Covariance; Biostatistics; Statistics::Computation; Efficient estimator; Principal component analysis; Statistics; Econometrics; Statistics::Methodology; Body Size; Statistics Probability and Uncertainty; Mathematics; Probability
Biometrical journal. Biometrische Zeitschrift

Importance sampling type estimators based on approximate marginal Markov chain Monte Carlo

2020

We consider importance sampling (IS) type weighted estimators based on Markov chain Monte Carlo (MCMC) targeting an approximate marginal of the target distribution. In the context of Bayesian latent variable models, the MCMC typically operates on the hyperparameters, and the subsequent weighting may be based on IS or sequential Monte Carlo (SMC), but allows for multilevel techniques as well. The IS approach provides a natural alternative to delayed acceptance (DA) pseudo-marginal/particle MCMC, and has many advantages over DA, including a straightforward parallelisation and additional flexibility in MCMC implementation. We detail minimal conditions which ensure strong consistency of the sug…
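As a toy illustration of the IS-type weighting described above (with assumed one-dimensional unnormalized densities standing in for the latent-variable setting), one can run MCMC on an approximate target and then correct the resulting averages with self-normalized importance weights:

```python
import numpy as np

def log_target(x):        # exact target density (toy stand-in): standard normal
    return -0.5 * x ** 2

def log_approx(x):        # approximate marginal actually targeted by the MCMC
    return -0.5 * ((x - 0.5) / 1.2) ** 2

def rw_metropolis(logpdf, n, step=1.0, seed=1):
    """Plain random-walk Metropolis chain targeting exp(logpdf)."""
    rng = np.random.default_rng(seed)
    x, out = 0.0, np.empty(n)
    for i in range(n):
        prop = x + step * rng.standard_normal()
        if np.log(rng.uniform()) < logpdf(prop) - logpdf(x):
            x = prop
        out[i] = x
    return out

xs = rw_metropolis(log_approx, 50_000)           # chain targets the approximation
logw = log_target(xs) - log_approx(xs)           # IS correction (log-)weights
w = np.exp(logw - logw.max())                    # numerically stabilized weights
est = np.sum(w * xs) / np.sum(w)                 # self-normalized IS estimate of E[x]
naive = xs.mean()                                # uncorrected estimate, biased toward 0.5
print(round(est, 2), round(naive, 2))
```

The corrected estimate recovers the mean under the exact target (zero here), while the uncorrected chain average stays near the approximate target's mean; self-normalization means neither density needs its normalizing constant.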

Statistics and Probability; Hyperparameter; Bayesian probability; Strong consistency; Estimator; Markov chain Monte Carlo; Statistics::Computation; Statistics Probability and Uncertainty; Particle filter; Algorithm; Importance sampling; Mathematics
Scandinavian Journal of Statistics

Prior-based Bayesian information criterion

2019

We present a new approach to model selection and Bayes factor determination, based on Laplace expansions (as in BIC), which we call Prior-based Bayes Information Criterion (PBIC). In this approach, the Laplace expansion is only done with the likelihood function, and then a suitable prior distribution is chosen to allow exact computation of the (approximate) marginal likelihood arising from the Laplace approximation and the prior. The result is a closed-form expression similar to BIC, but now involves a term arising from the prior distribution (which BIC ignores) and also incorporates the idea that different parameters can have different effective sample sizes (whereas BIC only allows one ov…
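For context, the standard Laplace-approximation route from the marginal likelihood to BIC (which, per the abstract, PBIC refines by retaining an explicit prior term and allowing parameter-specific effective sample sizes) is:

```latex
\log m(y) = \log \int p(y \mid \theta)\,\pi(\theta)\,d\theta
\approx \log p(y \mid \hat\theta) - \frac{k}{2}\log n,
\qquad
\mathrm{BIC} = -2\log p(y \mid \hat\theta) + k \log n,
```

where \(\hat\theta\) is the maximum likelihood estimate, \(k\) the number of parameters, and \(n\) the sample size; the PBIC expression itself is given in the paper.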

Statistics and Probability; Laplace expansion; Applied Mathematics; Bayes factor; Marginal likelihood; Statistics::Computation; Computational Theory and Mathematics; Laplace's method; Bayesian information criterion; Prior probability; Statistics::Methodology; Statistics Probability and Uncertainty; Likelihood function; Fisher information; Analysis; Mathematics

Posterior moments and quantiles for the normal location model with Laplace prior

2021

We derive explicit expressions for arbitrary moments and quantiles of the posterior distribution of the location parameter η in the normal location model with Laplace prior, and use the results to approximate the posterior distribution of sums of independent copies of η.
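The closed-form expressions are in the paper; a grid-based numerical sketch (all constants below are assumed toy values) shows the qualitative behaviour, namely shrinkage of the posterior toward zero relative to the observation:

```python
import numpy as np

# Grid-based numerical sketch (not the paper's closed-form expressions):
# posterior of eta in y ~ N(eta, sigma^2) with a Laplace(0, b) prior on eta.
y, sigma, b = 2.0, 1.0, 1.0
eta = np.linspace(-10.0, 14.0, 200_001)
log_post = -0.5 * ((y - eta) / sigma) ** 2 - np.abs(eta) / b
p = np.exp(log_post - log_post.max())
p /= p.sum()                                      # normalize to probability masses

mean = (eta * p).sum()                            # posterior mean, shrunk toward 0
median = eta[np.searchsorted(np.cumsum(p), 0.5)]  # posterior 0.5 quantile
print(round(mean, 2), round(median, 2))
```

For these values the posterior mean and median both fall well below the observation y = 2, reflecting the shrinkage induced by the Laplace prior.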

Statistics and Probability; Laplace prior; Location parameter; Reflected generalized gamma prior; Settore SECS-P/05 - Econometria; Settore SECS-S/01 - Statistica; Posterior probability; Cornish–Fisher approximation; Statistics::Methodology; Posterior quantiles; Posterior moments and cumulants; Laplace transform; Location model; Mathematical analysis; Statistics::Computation; Normal location model; Quantile; Mathematics
Communications in Statistics - Theory and Methods

Componentwise adaptation for high dimensional MCMC

2005

We introduce a new adaptive MCMC algorithm based on the traditional single-component Metropolis-Hastings algorithm and on our earlier adaptive Metropolis algorithm (AM). In the new algorithm the adaptation is performed component by component. The chain is no longer Markovian, but it remains ergodic. The algorithm is demonstrated to work well in a variety of test cases of up to 1000 dimensions.
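A minimal sketch of componentwise adaptation follows (not the authors' exact update rule; the scaling constant 2.4 follows the usual AM heuristic, and all names and constants are assumptions). Each coordinate is updated one at a time, with its proposal scale periodically refreshed from the chain's empirical standard deviation in that coordinate:

```python
import numpy as np

def componentwise_mh(logpdf, x0, n_iter=10_000, seed=0):
    """Single-component Metropolis with per-component proposal scales adapted
    from the chain's empirical standard deviations (sketch, not the paper's
    exact rule)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    d = len(x)
    chain = np.empty((n_iter, d))
    scales = np.ones(d)
    for t in range(n_iter):
        for j in range(d):                       # update one coordinate at a time
            prop = x.copy()
            prop[j] += scales[j] * rng.standard_normal()
            if np.log(rng.uniform()) < logpdf(prop) - logpdf(x):
                x = prop
        chain[t] = x
        if t >= 100 and t % 100 == 0:            # periodic componentwise adaptation
            scales = 2.4 * chain[:t + 1].std(axis=0) + 1e-6
    return chain

# Toy target: independent normals with very different scales per component.
log_target = lambda z: -0.5 * (z[0] ** 2 + (z[1] / 10.0) ** 2)
chain = componentwise_mh(log_target, [1.0, 1.0])
print(chain[2000:].std(axis=0).round(1))         # roughly [1, 10]
```

Because each component gets its own scale, the sampler copes with targets whose coordinates have very different variances, which is exactly the situation where a single global proposal scale fails.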

Statistics and Probability; Mathematical optimization; Monte Carlo method; Markov process; Markov chain Monte Carlo; Statistics::Computation; Computational Mathematics; Metropolis–Hastings algorithm; Test case; Statistics::Methodology; Ergodic theory; Statistics Probability and Uncertainty; Mathematics
Computational Statistics

Adaptive Metropolis algorithm using variational Bayesian adaptive Kalman filter

2013

Markov chain Monte Carlo (MCMC) methods are powerful computational tools for analysis of complex statistical problems. However, their computational efficiency is highly dependent on the chosen proposal distribution, which is generally difficult to find. One way to solve this problem is to use adaptive MCMC algorithms which automatically tune the statistics of a proposal distribution during the MCMC run. A new adaptive MCMC algorithm, called the variational Bayesian adaptive Metropolis (VBAM) algorithm, is developed. The VBAM algorithm updates the proposal covariance matrix using the variational Bayesian adaptive Kalman filter (VB-AKF). A strong law of large numbers for the VBAM algorithm is…
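The VB-AKF covariance update itself is not reproduced here, but the classical adaptive Metropolis recursion it builds on (proposal covariance estimated from the chain history, in the style of Haario et al.; all names and constants below are assumptions) can be sketched as:

```python
import numpy as np

def adaptive_metropolis(logpdf, x0, n_iter=20_000, seed=0):
    """Classical adaptive Metropolis: the full proposal covariance is refreshed
    from the chain history. A baseline sketch of the idea VBAM extends; the
    VB-AKF update itself is not reproduced here."""
    rng = np.random.default_rng(seed)
    d = len(x0)
    sd = 2.4 ** 2 / d                               # standard AM scaling
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_iter, d))
    cov = np.eye(d)
    for t in range(n_iter):
        prop = rng.multivariate_normal(x, cov)
        if np.log(rng.uniform()) < logpdf(prop) - logpdf(x):
            x = prop
        chain[t] = x
        if t >= 500 and t % 200 == 0:               # periodic covariance refresh
            cov = sd * np.cov(chain[:t + 1].T) + sd * 1e-6 * np.eye(d)
    return chain

# Strongly correlated Gaussian target.
prec = np.linalg.inv(np.array([[1.0, 0.9], [0.9, 1.0]]))
log_target = lambda z: -0.5 * z @ prec @ z
chain = adaptive_metropolis(log_target, [0.0, 0.0])
print(np.corrcoef(chain[5000:].T)[0, 1].round(2))   # close to the target's 0.9
```

The adapted proposal learns the target's correlation structure, which is the property VBAM aims to obtain more robustly via the variational Bayesian adaptive Kalman filter.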

Statistics and Probability; Mathematical optimization; Covariance matrix; Applied Mathematics; Bayesian probability; Rejection sampling; Statistics Theory (math.ST); Markov chain Monte Carlo; Kalman filter; Statistics::Computation; Computational Mathematics; Metropolis–Hastings algorithm; Computational Theory and Mathematics; Kernel adaptive filter; Mathematics
Computational Statistics & Data Analysis

Rejoinder: Bayesian Checking of the Second Levels of Hierarchical Models

2008

Rejoinder: Bayesian Checking of the Second Levels of Hierarchical Models [arXiv:0802.0743]

Statistics and Probability; Model checking; Statistics::Theory; Computer science; General Mathematics; Bayesian probability; Probability and statistics; Machine learning; Computer Science::Digital Libraries; Statistics::Computation; Test statistic; Statistics::Methodology; Artificial intelligence; Statistics Probability and Uncertainty; Statistics - Methodology (stat.ME)

Varying-coefficient functional linear regression models

2008

This article considers a generalization of functional linear regression in which an additional real variable smoothly influences the functional coefficient. We thus define a varying-coefficient regression model for functional data. We propose two estimators, based respectively on conditional functional principal regression and on local penalized regression splines, and prove their pointwise consistency. Using one-day-ahead prediction of ozone concentration in the city of Toulouse, we assess the ability of such nonlinear functional approaches to produce competitive estimates.
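As a one-dimensional illustration of the penalized-regression-spline ingredient (the basis, knot count, and penalty weight below are assumptions; the paper applies this machinery to functional covariates), one can fit a P-spline-style smoother with a ridge penalty on the knot coefficients:

```python
import numpy as np

def penalized_spline_fit(x, y, n_knots=20, lam=1.0):
    """Penalized regression spline sketch: truncated-line basis with a ridge
    penalty on the knot coefficients only."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    B = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - k, 0.0) for k in knots])
    D = np.diag([0.0, 0.0] + [1.0] * n_knots)    # leave intercept/slope unpenalized
    beta = np.linalg.solve(B.T @ B + lam * D, B.T @ y)
    return B @ beta

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 300))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, 300)   # noisy nonlinear signal
fit = penalized_spline_fit(x, y)
print(round(float(np.mean((fit - np.sin(2 * np.pi * x)) ** 2)), 3))
```

The penalty matrix D shrinks only the knot terms, so the fitted curve stays flexible where the data demand it while avoiding the wiggliness of an unpenalized spline.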

Statistics and Probability; Polynomial regression; Statistics::Theory; Proper linear model; Multivariate adaptive regression splines; Local regression; 62G05 (62G20, 62M20); Statistics::Computation; Nonparametric regression; Statistics::Machine Learning; Linear regression; Statistics; Statistics::Methodology; Segmented regression; Regression diagnostic; Mathematics

Estimating growth charts via nonparametric quantile regression: a practical framework with application in ecology.

2013

We discuss a practical and effective framework for estimating reference growth charts via regression quantiles. Inequality constraints are used to ensure both monotonicity and non-crossing of the estimated quantile curves, and penalized splines are employed to model the nonlinear growth patterns with respect to age. A companion R package is presented and the relevant code is discussed, to encourage dissemination and application of the proposed methods.
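A minimal sketch of estimating regression quantiles via the pinball loss (subgradient descent on a linear basis; the paper's penalized splines and monotonicity/non-crossing constraints are omitted here, and all names and constants are assumptions):

```python
import numpy as np

def quantile_fit(X, y, tau, n_iter=6000, c=1.0):
    """Linear quantile regression by subgradient descent on the pinball loss,
    warm-started at the least-squares fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]      # warm start
    for t in range(1, n_iter + 1):
        r = y - X @ beta
        psi = np.where(r > 0, tau, tau - 1.0)        # pinball subgradient
        beta += (c / np.sqrt(t)) * X.T @ psi / len(y)
    return beta

rng = np.random.default_rng(1)
age = np.sort(rng.uniform(0, 10, 500))
height = 80 + 6 * age + rng.normal(0, 5, 500)        # toy "growth chart" data
X = np.column_stack([np.ones_like(age), (age - 5) / 5])  # scaled linear basis
b10, b90 = quantile_fit(X, height, 0.10), quantile_fit(X, height, 0.90)
print(bool((X @ b90 > X @ b10).all()))               # upper curve stays above lower
```

With a linear basis and well-separated quantile levels the two fitted curves do not cross here, but in general nothing enforces that: this is precisely why the paper imposes explicit non-crossing and monotonicity constraints.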

Statistics and Probability; Settore BIO/07 - Ecologia; Settore SECS-S/01 - Statistica; Statistics::Theory; Ecology; Nonparametric statistics; Monotonic function; Regression; Statistics::Computation; Quantile regression; Nonlinear system; R package; Statistics; Econometrics; Statistics::Methodology; Growth charts; Nonparametric regression quantiles; Penalized splines; P. oceanica modelling; R software; Statistics Probability and Uncertainty; General Environmental Science; Mathematics; Quantile

The Induced Smoothed lasso: A practical framework for hypothesis testing in high dimensional regression.

2020

This paper focuses on hypothesis testing in lasso regression, when one is interested in judging the statistical significance of the regression coefficients in a regression equation involving many covariates. To obtain reliable p-values, we propose a new lasso-type estimator relying on the idea of induced smoothing, which allows an appropriate covariance matrix and Wald statistic to be obtained relatively easily. Simulation experiments show that our approach performs well when contrasted with recent inferential tools in the lasso framework. Two real data analyses are presented to illustrate the proposed framework in practice.
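The key device, induced smoothing, replaces a non-differentiable function by its expectation under a small Gaussian perturbation. A sketch for the absolute-value kink at the heart of the lasso penalty (the paper's full estimator and sandwich formula are not reproduced; function names here are assumptions):

```python
import math

def smoothed_abs(b, h):
    """Induced-smoothed |b|: E|b + h*Z| for Z ~ N(0,1), which has the closed
    form b*(2*Phi(b/h) - 1) + 2*h*phi(b/h) and is differentiable everywhere."""
    Phi = 0.5 * (1.0 + math.erf(b / (h * math.sqrt(2.0))))   # normal cdf at b/h
    phi = math.exp(-0.5 * (b / h) ** 2) / math.sqrt(2.0 * math.pi)  # normal pdf
    return b * (2.0 * Phi - 1.0) + 2.0 * h * phi

def smoothed_abs_grad(b, h):
    """Derivative of the smoothed penalty: 2*Phi(b/h) - 1, a smooth sign."""
    return math.erf(b / (h * math.sqrt(2.0)))

print(round(smoothed_abs(3.0, 0.01), 4))   # ~3.0: matches |b| away from the kink
print(smoothed_abs_grad(0.0, 0.01))        # 0.0: differentiable at b = 0
```

Because the smoothed penalty is everywhere differentiable, standard Taylor-expansion arguments yield a sandwich covariance matrix and hence Wald statistics, which is what makes p-values for the lasso coefficients tractable in this framework.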

Statistics and Probability; Statistics::Theory; Induced smoothing; Epidemiology; Computer science; Feature selection; Wald test; Asthma research; Statistics::Machine Learning; Health Information Management; Lasso (statistics); Linear regression; Sparse models; Statistics::Methodology; Computer Simulation; Sandwich formula; Statistical hypothesis testing; Covariance matrix; Lung function; Regression analysis; Statistics::Computation; Research Design; Algorithm; Smoothing; Variable selection
Statistical methods in medical research