Search results for "Density estimation"

Showing 10 of 61 documents

An Introduction to Kernel Methods

2009

Machine learning experienced great advances in the eighties and nineties due to active research on artificial neural networks and adaptive systems. These tools have demonstrated good results in many real applications, since they require no a priori assumptions about the distribution of the available data or about the relationships among the independent variables. Overfitting due to small training data sets is controlled by means of a regularized functional which minimizes the complexity of the machine. Working with high-dimensional input spaces is no longer a problem thanks to the use of kernel methods. Such methods also provide us with new ways to interpret the cl…
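The regularized functional mentioned in the abstract can be illustrated with kernel ridge regression, a simple kernel machine: fit a function in the kernel-induced feature space while penalizing its complexity. The RBF kernel, the toy data, and all parameter values below are illustrative choices, not taken from the book.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel k(x, y) = exp(-gamma * ||x - y||^2)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, lam=1e-2, gamma=1.0):
    # Minimise ||y - K a||^2 + lam * a' K a  ->  a = (K + lam I)^{-1} y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.05 * rng.normal(size=40)
alpha = kernel_ridge_fit(X, y, lam=1e-2, gamma=10.0)
pred = kernel_ridge_predict(X, alpha, X, gamma=10.0)
```

The regularization weight `lam` is exactly the overfitting control the abstract describes: larger values shrink the coefficient vector and give a smoother, less complex machine.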

Mathematical optimization; Machine learning; Kernel principal component analysis; Kernel method; Variable kernel density estimation; Polynomial kernel; Kernel embedding of distributions; Kernel (statistics); Radial basis function kernel; Kernel smoother; Artificial intelligence; Mathematics
researchProduct

Transport equations and quasi-invariant flows on the Wiener space

2010

Abstract We investigate vector fields of low regularity on the Wiener space, with divergence having low exponential integrability. We prove that such a vector field generates a flow of quasi-invariant measurable maps with density belonging to the space L log L. An explicit expression for the density is also given.

Mathematics (all); General Mathematics; Mathematical analysis; Integral representation theorem for classical Wiener space; Malliavin calculus; Density estimation; Space (mathematics); Quasi-invariant flows; Divergence; Commutator estimate; Flow (mathematics); Transport equations; Wiener space; Classical Wiener space; Vector field; Invariant (mathematics)
Bulletin des Sciences Mathématiques

A Robust Wrap Reduction Algorithm for Fringe Projection Profilometry and Applications in Magnetic Resonance Imaging.

2017

In this paper, we present an effective algorithm to reduce the number of wraps in a 2D phase signal provided as input. The technique is based on an accurate estimate of the fundamental frequency of a 2D complex signal with the phase given by the input, and the removal of a dependent additive term from the phase map. Unlike existing methods based on the discrete Fourier transform (DFT), the frequency is computed by using noise-robust estimates that are not restricted to integer values. Then, to deal with the problem of a non-integer shift in the frequency domain, an equivalent operation is carried out on the original phase signal. This consists of the subtraction of a tilted plane whose slop…
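The frequency-estimate-and-subtract step described above can be sketched in a simplified form: locate the dominant carrier frequency of the 2D complex signal from its DFT peak, then subtract the corresponding tilted plane from the phase map. The paper's noise-robust, non-integer frequency estimate is replaced here by a coarse integer DFT-peak estimate, so this is only an illustration of the principle, not the proposed algorithm.

```python
import numpy as np

def remove_phase_ramp(phase):
    M, N = phase.shape
    z = np.exp(1j * phase)                     # complex signal with the given phase
    spec = np.abs(np.fft.fft2(z))
    ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
    # map DFT bin indices to signed frequencies
    fy = ky if ky <= M // 2 else ky - M
    fx = kx if kx <= N // 2 else kx - N
    yy, xx = np.mgrid[0:M, 0:N]
    ramp = 2 * np.pi * (fy * yy / M + fx * xx / N)   # tilted plane to subtract
    return phase - ramp, (fy, fx)

# synthetic example: a pure phase ramp is removed exactly
yy, xx = np.mgrid[0:64, 0:64]
true = 2 * np.pi * (3 * yy / 64 + 5 * xx / 64)
residual, (fy, fx) = remove_phase_ramp(true)
```

After subtraction, the residual phase varies much more slowly, which is what reduces the number of wraps in the signal handed to a subsequent unwrapping stage.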

Non-uniform discrete Fourier transform; Spectral density estimation; Networking & telecommunications; k-space; Engineering and technology; Fundamental frequency; Natural sciences; Computer Graphics and Computer-Aided Design; Signal; Discrete Fourier transform; Optics; Frequency domain; Physical sciences; Discrete frequency domain; Electrical engineering, electronic engineering, information engineering; Algorithm; Software; Mathematics
IEEE Transactions on Image Processing: a publication of the IEEE Signal Processing Society

Adaptive design optimization: a mutual information-based approach to model discrimination in cognitive science.

2010

Discriminating among competing statistical models is a pressing issue for many experimentalists in the field of cognitive science. Resolving this issue begins with designing maximally informative experiments. To this end, the problem to be solved in adaptive design optimization is identifying experimental designs under which one can infer the underlying model in the fewest possible steps. When the models under consideration are nonlinear, as is often the case in cognitive science, this problem can be impossible to solve analytically without simplifying assumptions. However, as we show in this letter, a full solution can be found numerically with the help of a Bayesian computational trick d…
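The core quantity in this kind of design optimization can be sketched for a toy discrete case: choose the design that maximises the mutual information between the model indicator and the outcome. The two Bernoulli response models, the candidate designs, and the uniform prior below are hypothetical illustrations, not the letter's models or its numerical method.

```python
import math

def entropy(p):
    return -sum(q * math.log(q) for q in p if q > 0)

def mutual_information(design, models, prior):
    # models: functions mapping a design to P(Y = 1 | design, model)
    p1 = [m(design) for m in models]               # P(Y=1 | model)
    marg1 = sum(w * p for w, p in zip(prior, p1))  # P(Y=1) under the prior
    h_marg = entropy([marg1, 1 - marg1])
    h_cond = sum(w * entropy([p, 1 - p]) for w, p in zip(prior, p1))
    return h_marg - h_cond                         # I(M; Y | design)

model_a = lambda d: 1 / (1 + math.exp(-d))              # logistic response
model_b = lambda d: min(1.0, max(0.0, 0.5 + 0.1 * d))   # linear response
designs = [-2, -1, 0, 1, 2]
prior = [0.5, 0.5]
best = max(designs, key=lambda d: mutual_information(d, [model_a, model_b], prior))
```

At design 0 both hypothetical models predict the same outcome probability, so the experiment carries no discriminative information there; the extreme designs, where the predictions diverge most, are the informative ones.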

Optimal design; Cognitive science; Models, Statistical; Artificial neural network; Cognitive Neuroscience; Design of experiments; Bayesian probability; Posterior probability; Statistical model; Bayes Theorem; Mutual information; Density estimation; Cognition; Discrimination, Psychological; Arts and Humanities (miscellaneous); Nonlinear Dynamics; Research Design; Humans; Computer Simulation; Artificial intelligence; Mathematics
Neural Computation

Probabilistic Selection Approaches in Decomposition-based Evolutionary Algorithms for Offline Data-Driven Multiobjective Optimization

2022

In offline data-driven multiobjective optimization, no new data is available during the optimization process. Approximation models, also known as surrogates, are built using the provided offline data. A multiobjective evolutionary algorithm can then be utilized to find solutions by using these surrogates. The accuracy of the approximated solutions depends on the surrogates, and the approximations typically involve uncertainties. In this paper, we propose probabilistic selection approaches that utilize the uncertainty information of the Kriging models (used as surrogates) to improve the solution process in offline data-driven multiobjective optimization. These approaches are designed for decomposition-base…
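One way such uncertainty information can feed into selection is sketched below: under a Gaussian predictive distribution N(mu, sigma^2) from a Kriging model, the probability that a candidate's true (minimised) objective value beats a reference value is a closed-form normal tail. The function name and the example numbers are illustrative; this is not the paper's selection rule, only the kind of quantity it builds on.

```python
import math

def prob_better(mu, sigma, ref):
    # P(f(x) < ref) when the surrogate predicts f(x) ~ N(mu, sigma^2)
    if sigma == 0:
        return 1.0 if mu < ref else 0.0
    z = (ref - mu) / sigma
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# candidate A: slightly worse mean but very uncertain;
# candidate B: better mean, almost certain
p_a = prob_better(mu=1.2, sigma=1.0, ref=1.0)
p_b = prob_better(mu=0.9, sigma=0.05, ref=1.0)
```

Ranking by this probability rather than by the predicted mean alone lets an uncertain candidate retain a non-trivial chance of selection, which is what makes the selection probabilistic.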

Pareto optimality; Pareto efficiency; Gaussian processes; evolutionary computation; multiobjective optimization; Theoretical Computer Science; Kriging; Computational Theory and Mathematics; metamodelling; surrogate; kernel density estimation; Software

Regularization operators for natural images based on nonlinear perception models.

2006

Image restoration requires some a priori knowledge of the solution. Some of the conventional regularization techniques are based on the estimation of the power spectrum density. Simple statistical models for spectral estimation just take into account second-order relations between the pixels of the image. However, natural images exhibit additional features, such as particular relationships between local Fourier or wavelet transform coefficients. Biological visual systems have evolved to capture these relations. We propose the use of this biological behavior to build regularization operators as an alternative to simple statistical models. The results suggest that if the penalty operator take…
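The conventional, spectrum-based regularization the abstract contrasts itself with can be sketched as a 1-D Wiener-style deconvolution, where the penalty comes from an assumed signal power spectrum (a generic 1/f^2-type prior here). The kernel, the prior, and all constants are illustrative assumptions; the paper's contribution is precisely to replace such second-order statistical models with perceptually motivated operators.

```python
import numpy as np

def wiener_deconv(blurred, h, noise_power, signal_power):
    # frequency-domain regularised inverse: H* S / (|H|^2 S + N)
    H = np.fft.rfft(h, n=len(blurred))
    B = np.fft.rfft(blurred)
    G = H.conj() * signal_power / (np.abs(H) ** 2 * signal_power + noise_power)
    return np.fft.irfft(G * B, n=len(blurred))

n = 256
x = np.sin(2 * np.pi * 5 * np.arange(n) / n)   # clean test signal
h = np.zeros(n)
h[:5] = 1 / 5                                  # moving-average blur kernel
blurred = np.fft.irfft(np.fft.rfft(h) * np.fft.rfft(x), n=n)

f = np.fft.rfftfreq(n)
s_prior = 1.0 / (f ** 2 + 1e-2)                # assumed 1/f^2-type power spectrum
restored = wiener_deconv(blurred, h, noise_power=1e-3, signal_power=s_prior)
```

The prior `s_prior` only encodes second-order (power-spectral) structure between pixels; it is exactly this limitation that motivates penalty operators built on relations between wavelet or Fourier coefficients instead.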

Regularization perspectives on support vector machines; Information Storage and Retrieval; Image processing; Regularization (mathematics); Pattern Recognition, Automated; Operator (computer programming); Artificial Intelligence; Image Interpretation, Computer-Assisted; Cluster Analysis; Computer Simulation; Image restoration; Models, Statistical; Wavelet transform; Spectral density estimation; Statistical model; Pattern recognition; Numerical Analysis, Computer-Assisted; Signal Processing, Computer-Assisted; Image Enhancement; Computer Graphics and Computer-Aided Design; Nonlinear Dynamics; Artificial intelligence; Software; Algorithms; Mathematics
IEEE Transactions on Image Processing: a publication of the IEEE Signal Processing Society

Bickel–Rosenblatt Test for Weakly Dependent Data

2012

The aim of this paper is to analyze the Bickel–Rosenblatt test for a simple hypothesis in the case of weakly dependent data. Although the test has nice theoretical properties, it is not clear how to implement it in practice. Choosing different bandwidth sequences, we first analyze the percentage of rejections of the test statistic under H0 in an empirical simulation study. This can serve as an approximate rule for choosing the bandwidth for a simple hypothesis in practical implementations of the test. In the recent paper [12] a version of the Neyman goodness-of-fit test was established for weakly dependent data in the case of simple hypotheses. In this paper we also aim to compare and discuss the …
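The statistic behind the Bickel–Rosenblatt test is an L2-type distance between a kernel density estimate and the hypothesised density. The simplified version below integrates the squared difference on a grid against an N(0,1) null; it omits the smoothing of the null density and the critical values an actual test needs, and the sample sizes and bandwidth are illustrative, which is exactly the bandwidth-choice question the paper studies.

```python
import numpy as np

def kde(grid, data, h):
    # Gaussian kernel density estimate evaluated at the grid points
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def l2_statistic(data, h):
    grid = np.linspace(-5.0, 5.0, 1001)
    f_hat = kde(grid, data, h)
    f0 = np.exp(-0.5 * grid ** 2) / np.sqrt(2 * np.pi)   # hypothesised N(0,1) density
    # Riemann-sum approximation of the integrated squared distance
    return ((f_hat - f0) ** 2).sum() * (grid[1] - grid[0])

rng = np.random.default_rng(1)
t_null = l2_statistic(rng.normal(size=500), h=0.3)           # data drawn under H0
t_alt = l2_statistic(rng.normal(loc=2.0, size=500), h=0.3)   # clearly shifted data
```

Data generated under the null give a small statistic, while a shifted alternative inflates it; the open practical question is which bandwidth sequence `h` makes the null distribution usable for setting rejection thresholds.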

Score test; goodness-of-fit; Bickel–Rosenblatt test; nonparametric density estimation; weak dependence; Exact test; Goodness of fit; Modeling and Simulation; Neyman's smooth test; Econometrics; Chi-square test; Test statistic; Applied mathematics; Goldfeld–Quandt test; Analysis; Mathematics
Mathematical Modelling and Analysis

Experimental approach for testing the uncoupling between cardiovascular variability series

2002

In cardiovascular variability analysis, the significance of the coupling between two series is commonly assessed by defining a zero level on the magnitude-squared coherence (MSC). Although the use of the conventional value of 0.5 does not consider the dependence of MSC estimates on the analysis parameters, a theoretical threshold Tt is available only for the weighted covariance (WC) estimator. In this study, an experimental threshold for zero coherence Te was derived by a statistical test from the sampling distribution of MSC estimated on completely uncoupled time series. MSC was estimated by the WC method (Parzen window, spectral bandwidth B = 0.015, 0.02, 0.025, 0.03 Hz) and by the parame…

Series (mathematics); Kernel density estimation; Models, Cardiovascular; Myocardial Infarction; Biomedical Engineering; Estimator; Computer Science Applications; Computer Vision and Pattern Recognition; Signal Processing, Computer-Assisted; Coherence (statistics); Covariance; Feedback; Spectral analysis; Electrocardiography; Sampling distribution; Autoregressive model; Cardiovascular variability series; Statistics; Humans; Magnitude-squared coherence; Parametric statistics; Mathematics
Medical & Biological Engineering & Computing

Robust refinement of initial prototypes for partitioning-based clustering algorithms

2007

Non-uniqueness of solutions and sensitivity to erroneous data are common problems in large-scale data clustering tasks. In order to avoid poor-quality solutions with partitioning-based clustering methods, robust estimates (which are highly insensitive to erroneous data values) are needed, and initial cluster prototypes should be determined properly. In this paper, a robust density-estimation initialization method that applies the spatial median estimate to the prototype update is presented. Besides being insensitive to noise and outliers, the new method is also computationally comparable to other traditional methods. The methods are compared by numerical experiments on a set of syntheti…
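The spatial median estimate at the heart of the method can be sketched with the classical Weiszfeld iteration for the geometric median of a point set. This standalone single-cluster version, with illustrative data and tolerances, only demonstrates the robustness property; the paper's full prototype-refinement procedure is more involved.

```python
import numpy as np

def spatial_median(X, iters=100, eps=1e-9):
    # Weiszfeld iteration: fixed point of the inverse-distance weighted mean
    m = X.mean(axis=0)                 # start from the (non-robust) mean
    for _ in range(iters):
        d = np.maximum(np.linalg.norm(X - m, axis=1), eps)  # avoid divide-by-zero
        w = 1.0 / d
        m_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(m_new - m) < eps:
            return m_new
        m = m_new
    return m

rng = np.random.default_rng(3)
X = rng.normal(size=(100, 2))            # cluster centred at the origin
X_out = np.vstack([X, [[50.0, 50.0]]])   # one gross outlier
mean_shift = np.linalg.norm(X_out.mean(axis=0))
median_shift = np.linalg.norm(spatial_median(X_out))
```

A single gross outlier drags the mean far from the cluster centre while the spatial median barely moves, which is why it is the preferred estimate for refining initial prototypes from noisy data.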

Set (abstract data type); Computer science; Correlation clustering; Outlier; Initialization; Sensitivity (control systems); Density estimation; Noise (video); Data mining; Cluster analysis
Recent Advances in Stochastic Modeling and Data Analysis

Support Vector Machines Framework for Linear Signal Processing

2005

This paper presents a support vector machines (SVM) framework to deal with linear signal processing (LSP) problems. The approach relies on three basic steps for model building: (1) identifying the suitable base of the Hilbert signal space in the model, (2) using a robust cost function, and (3) minimizing a constrained, regularized functional by means of the method of Lagrange multipliers. Recently, autoregressive moving average (ARMA) system identification and non-parametric spectral analysis have been formulated under this framework. The generalized, yet simple, formulation of SVM LSP problems is particularized here for three different issues: parametric spectral estimation, stability of I…
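Step (1) of the framework, choosing a suitable signal base, can be sketched for AR system identification: write the model y[n] = a1·y[n-1] + a2·y[n-2] + e[n] in linear-regression form over lagged samples and estimate the coefficients. Plain least squares is used below for brevity; the SVM framework instead minimises a robust epsilon-insensitive cost with regularization via Lagrange multipliers. The coefficients and sample size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
a_true = np.array([1.5, -0.7])        # stable AR(2) coefficients
n = 2000
y = np.zeros(n)
e = 0.1 * rng.normal(size=n)
for t in range(2, n):
    # simulate the AR(2) process driven by white noise
    y[t] = a_true[0] * y[t - 1] + a_true[1] * y[t - 2] + e[t]

# design matrix of lagged samples; each row is the "base" for one output sample
Phi = np.column_stack([y[1:-1], y[:-2]])
a_hat, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
```

Keeping this regression structure fixed while swapping the quadratic loss for a robust cost is what lets the same SVM formulation cover ARMA identification, spectral estimation, and filter design.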

Signal processing; Telecommunications; Support vector machines; System identification; Linear signal processing; Spectral density estimation; Spectral estimation; Support vector machine; Gamma filter; Control and Systems Engineering; Control theory; Complex ARMA; Signal Processing; Autoregressive–moving-average model; Telecommunications technology; Computer Vision and Pattern Recognition; Electrical and Electronic Engineering; Infinite impulse response; Digital filter; Algorithm; Software; Parametric statistics; Mathematics