Search results for "ESTIMATOR"

Showing 10 of 313 documents

B-Spline Estimation in a Survey Sampling Framework

2021

Nonparametric regression models have been used increasingly in recent years to model survey data and to incorporate auxiliary information efficiently, in order to improve the estimation of totals, means, and other study parameters such as the Gini index or the poverty rate. B-spline nonparametric regression has the benefit of being very flexible for modeling nonlinear survey data while retaining many similarities to, and properties of, classical linear regression. The method has proved efficient for deriving a single system of weights that allows many study parameters to be estimated efficiently and simultaneously. Applications to real and simulated survey data have shown its high efficiency. This …
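The weighting idea in the abstract above can be illustrated with a short model-assisted sketch: fit a least-squares B-spline on a probability sample, then correct its population-level predictions with design-weighted residuals. Everything below (the synthetic population, the simple random sample, the knot placement) is an illustrative assumption, not the paper's actual setup.

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(42)

# Hypothetical finite population with a nonlinear mean function.
N, n = 2000, 200
x_pop = rng.uniform(0.0, 1.0, N)
y_pop = np.sin(2 * np.pi * x_pop) + 2.0 + rng.normal(0.0, 0.2, N)
true_total = y_pop.sum()

# Simple random sample without replacement; spline fitting needs sorted x.
s = rng.choice(N, size=n, replace=False)
order = np.argsort(x_pop[s])
xs, ys = x_pop[s][order], y_pop[s][order]

# Least-squares cubic B-spline with interior knots at sample quantiles.
knots = np.quantile(xs, [0.2, 0.4, 0.6, 0.8])
spline = LSQUnivariateSpline(xs, ys, knots, k=3)

# Model-assisted (difference) estimator of the population total:
# spline predictions summed over the population, plus design-weighted
# sample residuals (weights N/n under simple random sampling).
residuals = ys - spline(xs)
t_hat = spline(x_pop).sum() + (N / n) * residuals.sum()
```

Because the spline absorbs the nonlinear trend, the residual correction is small and the estimator is far more stable than the pure expansion estimator would be.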

Estimation; Statistics::Theory; Computer science; Consistency (statistics); B-spline; Linear regression; Statistics; Statistics::Methodology; Survey data collection; Estimator; Survey sampling; Nonparametric regression
researchProduct

Inclusion ratio based estimator for the mean length of the Boolean line segment model with an application to nanocrystalline cellulose

2014

A novel estimator of the mean length of fibres is proposed for censored data observed in square-shaped windows. Instead of observing the fibre lengths, we observe the ratio between the intensity estimates of minus-sampling and plus-sampling. It is well known that both intensity estimators are biased. In the present work, we derive the ratio of these biases as a function of the mean length, assuming a Boolean line segment model with exponentially distributed lengths and uniformly distributed directions. Given the observed ratio of the intensity estimators, the inverse of the derived function is proposed as a new estimator of the mean length. For this estimator, an approximation…

Exponential distribution; Acoustics and Ultrasonics; Materials Science (miscellaneous); General Mathematics; Inverse; variance; Square (algebra); exponential length distribution; fibres; Line segment; Statistics; Radiology, Nuclear Medicine and Imaging; crystalline nanocellulose; ratio of estimates; Instrumentation; nanocellulose; Mathematics; plus-sampling; Mathematical analysis; Estimator; Boolean model; Function (mathematics); mean length; simulation; Efficient estimator; minus-sampling; Signal Processing; length distribution; Computer Vision and Pattern Recognition; Intensity (heat transfer); line segments; Biotechnology; Image Analysis and Stereology
researchProduct

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: a low tolerance leads to poor mixing, while a large tolerance entails excess bias. We consider an approach that uses a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure sufficient mixing, and post-processes the output to obtain estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
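A minimal sketch of the inflated-tolerance idea, under toy assumptions (Gaussian model and prior, sample-mean summary; none of this comes from the paper): run ABC-MCMC at a deliberately large tolerance, store each state's distance, and then read off estimators for finer tolerances from the same output by conditioning on the stored distances.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: data are N(theta, 1), prior is N(0, 1), and the
# observed sample mean is 1.0. The distance is the gap between simulated
# and observed means.
y_obs_mean, n_data = 1.0, 50

def simulate_distance(theta):
    return abs(rng.normal(theta, 1.0, n_data).mean() - y_obs_mean)

# ABC-MCMC with an inflated tolerance for decent mixing; the distance of
# the current state is stored alongside the chain.
eps_big = 0.5
theta, d = 1.0, simulate_distance(1.0)
thetas, dists = [], []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.5)
    d_prop = simulate_distance(prop)
    log_prior_ratio = -0.5 * (prop**2 - theta**2)
    # Accept only if the pseudo-data is within tolerance AND the usual
    # Metropolis-Hastings prior ratio allows it.
    if d_prop <= eps_big and np.log(rng.uniform()) < log_prior_ratio:
        theta, d = prop, d_prop
    thetas.append(theta)
    dists.append(d)
thetas, dists = np.array(thetas[1000:]), np.array(dists[1000:])

# Post-correction: the same chain yields an estimator for any finer
# tolerance by conditioning on the stored distances.
post_means = {eps: thetas[dists <= eps].mean() for eps in (0.5, 0.3, 0.1)}
```

The paper's post-correction additionally provides approximate confidence intervals and an adaptive tolerance choice; the sketch shows only the conditioning step shared across tolerances.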

FOS: Computer and information sciences; Statistics and Probability; tolerance choice; General Mathematics; Markov chains; Inference; Statistics - Computation; approximate Bayesian computation; Mixing (mathematics); adaptive algorithm; algorithms; Computation (stat.CO); Mathematics; Adaptive algorithm; Markov chain; Bayesian methods; Applied Mathematics; Probabilistic logic; Estimator; Markov chain Monte Carlo; Agricultural and Biological Sciences (miscellaneous); Monte Carlo methods; importance sampling; confidence interval; Statistics, Probability and Uncertainty; Approximate Bayesian computation; General Agricultural and Biological Sciences; Algorithm
researchProduct

A Review of Multiple Try MCMC algorithms for Signal Processing

2018

Many applications in signal processing require the estimation of some parameters of interest given a set of observed data. More specifically, Bayesian inference needs the computation of a posteriori estimators, which are often expressed as complicated multi-dimensional integrals. Unfortunately, analytical expressions for these estimators cannot be found in most real-world applications, and Monte Carlo methods are the only feasible approach. A very powerful class of Monte Carlo techniques is formed by the Markov chain Monte Carlo (MCMC) algorithms. They generate a Markov chain such that its stationary distribution coincides with the target posterior density. In this work, we perform a t…
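As a concrete member of the family the review surveys, one multiple-try Metropolis (MTM) step with a symmetric random-walk proposal can be sketched as below. The standard-normal target and the weight choice w(x) proportional to the target density (the usual simplification for symmetric proposals) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy target (not from the review): unnormalized standard normal.
def log_target(x):
    return -0.5 * x**2

def mtm_step(x, k=5, scale=2.0):
    # Draw k candidates from a symmetric random-walk proposal around x.
    cands = x + rng.normal(0.0, scale, k)
    w = np.exp(log_target(cands))
    y = rng.choice(cands, p=w / w.sum())          # select one candidate
    # Reference set: k - 1 fresh points around y, plus the current state.
    refs = np.append(y + rng.normal(0.0, scale, k - 1), x)
    w_ref = np.exp(log_target(refs))
    # Generalized Metropolis-Hastings acceptance ratio for MTM.
    if rng.uniform() < w.sum() / w_ref.sum():
        return y
    return x

x, chain = 0.0, []
for _ in range(20000):
    x = mtm_step(x)
    chain.append(x)
chain = np.array(chain)
```

Trying several candidates per iteration lets the sampler use a large proposal scale while keeping a reasonable acceptance rate, which is the main appeal of MTM schemes.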

FOS: Computer and information sciences; Computer science; Monte Carlo method; Machine Learning (stat.ML); Multiple-try Metropolis; Bayesian inference; Statistics - Computation; Artificial Intelligence; Statistics - Machine Learning; Electrical and Electronic Engineering; Computation (stat.CO); Signal processing; Markov chain; Applied Mathematics; Estimator; Markov chain Monte Carlo; Statistics::Computation; Computational Theory and Mathematics; Signal Processing; Sample space; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Algorithm
researchProduct

CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration

2017

In this paper, we propose a new framework to remove parts of the systematic errors affecting popular restoration algorithms, with a special focus on image processing tasks. Generalizing ideas that emerged for $\ell_1$ regularization, we develop an approach that re-fits the results of standard methods towards the input data. Total variation regularizations and non-local means are special cases of interest. We identify important covariant information that should be preserved by the re-fitting method, and emphasize the importance of preserving the Jacobian (w.r.t. the observed signal) of the original estimator. Then, we provide an approach that has a "twicing" flavor a…

FOS: Computer and information sciences; Inverse problems; Mathematical optimization; Computer Vision and Pattern Recognition (cs.CV); General Mathematics; Computer Science - Computer Vision and Pattern Recognition; Machine Learning (stat.ML); Mathematics - Statistics Theory; Image processing; Statistics Theory (math.ST); Debiasing; Regularization (mathematics); Boosting; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; [STAT.ML] Statistics [stat]/Machine Learning [stat.ML]; Variational methods; [MATH.MATH-ST] Mathematics [math]/Statistics [math.ST]; Statistics - Machine Learning; Refitting; MSC: 49N45 65K10 68U10; [INFO.INFO-TI] Computer Science [cs]/Image Processing; FOS: Mathematics; Covariant transformation; Image restoration; Mathematics; Applied Mathematics; [INFO.INFO-CV] Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV]; Estimator; Inverse problem; Jacobian matrix and determinant; Twicing; Affine transformation; Algorithm
researchProduct

Heretical Multiple Importance Sampling

2016

Multiple Importance Sampling (MIS) methods approximate moments of complicated distributions by drawing samples from a set of proposal distributions. Several ways to compute the importance weights assigned to each sample have recently been proposed, with the so-called deterministic mixture (DM) weights providing the best performance in terms of variance, at the expense of an increased computational cost. Recent work has shown that a trade-off between variance reduction and computational effort can be achieved by performing an a priori random clustering of the proposals (partial DM algorithm). In this paper, we propose a novel "heretical" MIS framework, where the clustering …
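The contrast between standard and deterministic-mixture weights can be sketched in a few lines; the target, the two Gaussian proposals, and the sample sizes below are toy assumptions, not the paper's experiments.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Toy setup: target N(0, 1), two Gaussian proposals straddling it,
# equal sample counts per proposal.
target = norm(0.0, 1.0)
proposals = [norm(-1.0, 1.0), norm(1.0, 1.0)]
n_per = 5000

chunks = [p.rvs(n_per, random_state=rng) for p in proposals]
samples = np.concatenate(chunks)

# Standard MIS weights: each sample is weighted by its own proposal only.
q_own = np.concatenate([p.pdf(x) for p, x in zip(proposals, chunks)])
w_std = target.pdf(samples) / q_own

# Deterministic-mixture (DM) weights: the denominator is the full
# equal-weight mixture of all proposals, evaluated at every sample.
q_mix = 0.5 * (proposals[0].pdf(samples) + proposals[1].pdf(samples))
w_dm = target.pdf(samples) / q_mix
```

Both weight sets average to one (each unbiasedly estimates the target's normalizing constant), but the DM weights are bounded in this setup, which is exactly where their variance reduction comes from; the price is evaluating every proposal density at every sample.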

FOS: Computer and information sciences; Mean squared error; Computer science; Applied Mathematics; Estimator; Variance (accounting); Statistics - Computation; Reduction (complexity); [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Signal Processing; A priori and a posteriori; Variance reduction; Electrical and Electronic Engineering; Cluster analysis; Algorithm; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image Processing; Importance sampling; Computation (stat.CO)
researchProduct

Unbiased Estimators and Multilevel Monte Carlo

2018

Multilevel Monte Carlo (MLMC) and unbiased estimators recently proposed by McLeish (Monte Carlo Methods Appl., 2011) and Rhee and Glynn (Oper. Res., 2015) are closely related. This connection is elaborated by presenting a new general class of unbiased estimators, which admits previous debiasing schemes as special cases. New lower variance estimators are proposed, which are stratified versions of earlier unbiased schemes. Under general conditions, essentially when MLMC admits the canonical square root Monte Carlo error rate, the proposed new schemes are shown to be asymptotically as efficient as MLMC, both in terms of variance and cost. The experiments demonstrate that the variance reduction…
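The debiasing idea can be sketched with the single-term estimator of Rhee and Glynn on a toy level sequence: approximate E[exp(Z)] = sqrt(e) for Z ~ N(0, 1) by truncated Taylor series, so the level-n increment is simply Z^n / n!. This toy sequence and the geometric level distribution are illustrative assumptions, not the paper's stratified schemes.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Geometric level probabilities p_n = (1 - r) * r**n, n = 0, 1, 2, ...
r = 0.5

def single_term_estimate():
    # Draw a random level N with P(N = n) = (1 - r) r^n.
    n = int(rng.geometric(1.0 - r)) - 1
    p_n = (1.0 - r) * r**n
    z = rng.normal()
    # Level increment: difference between the n-term and (n-1)-term
    # Taylor approximations of exp(z), which is just z^n / n!.
    delta = z**n / math.factorial(n)
    # Dividing by p_n makes the estimator unbiased for the telescoping
    # sum of increments, i.e. for E[exp(Z)] itself.
    return delta / p_n

estimates = [single_term_estimate() for _ in range(20000)]
est = float(np.mean(estimates))
```

Averaging independent copies gives an estimator with no discretization bias at all; the design question the paper addresses is how to choose (and stratify) the level distribution so that both variance and expected cost stay finite.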

FOS: Computer and information sciences; Monte Carlo method; Word error rate; stochastic differential equation; Management Science and Operations Research; Statistics - Computation; stratification; Square root; FOS: Mathematics; Applied mathematics; Computation (stat.CO); stochastic processes; Mathematics; Probability (math.PR); Estimator; Variance (accounting); unbiased estimators; Computer Science Applications; Monte Carlo methods; 65C05 (Primary) 65C30 (Secondary); efficiency; Variance reduction; multilevel Monte Carlo; differential equations; Mathematics - Probability; Operations Research
researchProduct

Multispectral image denoising with optimized vector non-local mean filter

2016

Many applications rely on high-quality images to perform their tasks well. Noise, however, is an unavoidable issue in most applications and works against this objective. It is therefore essential to develop techniques that attenuate the impact of noise while maintaining the integrity of the relevant information in images. In this work, we propose to extend the Non-Local Means (NLM) filter to the vector case and apply it to denoising multispectral images. The objective is to benefit from the additional information brought by multispectral imaging systems. The NLM filter exploits the redundancy of information in an image to remove noise. A …
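The scalar (grayscale) NLM filter that this work extends to the vector case can be sketched as follows; the patch and search-window sizes, the bandwidth h, and the piecewise-constant test image are illustrative choices, and the loop-based implementation favors clarity over speed.

```python
import numpy as np

rng = np.random.default_rng(4)

def nlm_denoise(img, patch=3, search=7, h=0.15):
    """Basic grayscale non-local means (unoptimized sketch)."""
    half_p, half_s = patch // 2, search // 2
    padded = np.pad(img, half_p + half_s, mode="reflect")
    out = np.zeros_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + half_p + half_s, j + half_p + half_s
            ref = padded[ci - half_p: ci + half_p + 1,
                         cj - half_p: cj + half_p + 1]
            weights, values = [], []
            for di in range(-half_s, half_s + 1):
                for dj in range(-half_s, half_s + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - half_p: ni + half_p + 1,
                                  nj - half_p: nj + half_p + 1]
                    # Patch similarity drives the averaging weight:
                    # similar patches anywhere in the window contribute.
                    d2 = ((ref - cand) ** 2).mean()
                    weights.append(np.exp(-d2 / h**2))
                    values.append(padded[ni, nj])
            weights = np.array(weights)
            out[i, j] = (weights * np.array(values)).sum() / weights.sum()
    return out

# Piecewise-constant test image plus additive white Gaussian noise.
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0
noisy = clean + rng.normal(0.0, 0.1, clean.shape)
denoised = nlm_denoise(noisy)
```

The vector extension in the paper replaces the scalar patch distance with one computed jointly over all spectral bands, so inter-band redundancy also contributes to the weights.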

FOS: Computer and information sciences; Multi-spectral imaging systems; Computer Vision and Pattern Recognition (cs.CV); Optimization framework; Multispectral image; Computer Science - Computer Vision and Pattern Recognition; White noise; Pixels; [SPI] Engineering Sciences [physics]; Computer vision; Unbiased risk estimator; Mathematics; Applied Mathematics; Bilateral filter; Numerical Analysis (math.NA); Non-local means; Additive white Gaussian noise; Stein's unbiased risk estimator (SURE); Illumination; Computational Theory and Mathematics; Restoration; Image denoising; Non-local mean filters; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Gaussian noise (electronic); Noise reduction; Face recognition; Noise removal; Artificial Intelligence; FOS: Mathematics; Parameter estimation; Median filter; Mathematics - Numerical Analysis; Electrical and Electronic Engineering; Fusion; Pixel; Vector non-local mean filter; Pattern recognition; Filter (signal processing); Bandpass filters; [SPI.TRON] Engineering Sciences [physics]/Electronics; Noise; Signal Processing; Artificial intelligence; Reconstruction; Model; Algorithms
researchProduct

Estimation of causal effects with small data in the presence of trapdoor variables

2021

We consider the problem of estimating causal effects of interventions from observational data when well-known back-door and front-door adjustments are not applicable. We show that when an identifiable causal effect is subject to an implicit functional constraint that is not deducible from conditional independence relations, the estimator of the causal effect can exhibit bias in small samples. This bias is related to variables that we call trapdoor variables. We use simulated data to study different strategies to account for trapdoor variables and suggest how the related trapdoor bias might be minimized. The importance of trapdoor variables in causal effect estimation is illustrated with rea…

FOS: Computer and information sciences; Statistics and Probability; Economics and Econometrics; bias; causality; Computer science; Bayesian probability; Context (language use); Statistics - Computation; Methodology (stat.ME); Econometrics; Computation (stat.CO); Statistics - Methodology; estimation; Estimation; Small data; Bayesian methods; Estimator; Bayesian estimation; identifiability; Constraint (information theory); functional constraint; Conditional independence; Observational study; Statistics, Probability and Uncertainty; Social Sciences (miscellaneous)
researchProduct

Thresholding projection estimators in functional linear models

2008

We consider the problem of estimating the regression function in functional linear regression models by proposing a new type of projection estimator that combines dimension reduction and thresholding. The introduction of a threshold rule allows us to obtain consistency under broad assumptions, as well as minimax rates of convergence under additional regularity hypotheses. We also consider the particular case of Sobolev spaces generated by the trigonometric basis, which makes it easy to derive the mean squared error of prediction as well as estimators of the derivatives of the regression function. We prove that these estimators are minimax, and rates of convergence are given for some particular cases.
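The combination of projection and thresholding can be sketched directly in the coefficient domain; the decaying basis scores, the sparse true slope, and the use of the known noise level in the cutoff are toy assumptions made for illustration, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy functional linear model expressed in a trigonometric coefficient
# basis: Y = sum_j b_j * xi_j + eps, with decaying basis scores xi_j and
# a slope function whose coefficients vanish beyond the first three.
n, J = 400, 30
b_true = np.concatenate([[1.0, 0.5, 0.25], np.zeros(J - 3)])
xi = rng.normal(0.0, 1.0, (n, J)) / np.arange(1, J + 1)
y = xi @ b_true + rng.normal(0.0, 0.1, n)

# Projection (least-squares) estimator on the first J basis directions.
b_hat, *_ = np.linalg.lstsq(xi, y, rcond=None)

# Hard-threshold rule: zero out coefficients indistinguishable from
# noise (the known noise level 0.1 is used here for simplicity).
se = 0.1 * np.sqrt(np.diag(np.linalg.inv(xi.T @ xi)))
b_thr = np.where(np.abs(b_hat) > 2.0 * se, b_hat, 0.0)
```

The thresholding step is what adapts the effective dimension to the data: high-frequency coefficients, whose scores carry little signal, are discarded automatically rather than chosen via a fixed truncation level.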

FOS: Computer and information sciences; Statistics and Probability; Mathematical optimization; Statistics::Theory; Mean squared error of prediction; Mean squared error; Mathematics - Statistics Theory; Statistics Theory (math.ST); Projection (linear algebra); Methodology (stat.ME); FOS: Mathematics; Applied mathematics; Statistics - Methodology; Mathematics; Linear inverse problem; Numerical Analysis; Linear model; Estimator; Regression analysis; Minimax; Sobolev space; Thresholding; Optimal rate of convergence; Derivatives estimation; Rate of convergence; Hilbert scale; Statistics, Probability and Uncertainty; Galerkin method; Journal of Multivariate Analysis
researchProduct