Search results for "dimensionality"

Showing 10 of 231 documents

Gaussianizing the Earth: Multidimensional Information Measures for Earth Data Analysis

2021

Information theory is an excellent framework for analyzing Earth system data because it allows us to characterize uncertainty and redundancy, and is universally interpretable. However, accurately estimating information content is challenging because spatio-temporal data is high-dimensional, heterogeneous and has non-linear characteristics. In this paper, we apply multivariate Gaussianization for probability density estimation which is robust to dimensionality, comes with statistical guarantees, and is easy to apply. In addition, this methodology allows us to estimate information-theoretic measures to characterize multivariate densities: information, entropy, total correlation, and mutual in…
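A hedged sketch of the Gaussianization idea described above, assuming a single marginal-Gaussianization step and a Gaussian copula (the paper's iterative multivariate Gaussianization is more general); the function names and the toy data are chosen here for illustration:

```python
import numpy as np
from scipy.stats import norm, rankdata

def marginal_gaussianize(X):
    """Map each column to standard-normal marginals via its empirical CDF."""
    n = X.shape[0]
    U = np.apply_along_axis(rankdata, 0, X) / (n + 1.0)  # approximately uniform marginals
    return norm.ppf(U)                                   # Gaussian marginals

def total_correlation_gaussian_copula(X):
    """T(x) = sum_i H(x_i) - H(x); after Gaussianizing the marginals this
    reduces to -0.5 * log det of the correlation matrix (in nats)."""
    Z = marginal_gaussianize(np.asarray(X, dtype=float))
    R = np.corrcoef(Z, rowvar=False)
    return -0.5 * np.log(np.linalg.det(R))

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.8], [0.8, 1.0]], size=5000)
print(total_correlation_gaussian_copula(X))  # close to -0.5*log(1 - 0.8**2) ≈ 0.51
```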

Subjects: FOS: Computer and information sciences; Multivariate statistics; General Computer Science; Computer science; Machine Learning (stat.ML); Mutual information; Information theory; Statistics - Applications (stat.AP); Earth system science; Redundancy (information theory); 13. Climate action; Statistics - Machine Learning; General Earth and Planetary Sciences; Entropy (information theory); Total correlation; Data mining; Electrical and Electronic Engineering; Instrumentation; Curse of dimensionality
researchProduct

PRINCIPAL POLYNOMIAL ANALYSIS

2014

© 2014 World Scientific Publishing Company. This paper presents a new framework for manifold learning based on a sequence of principal polynomials that capture the possibly nonlinear nature of the data. The proposed Principal Polynomial Analysis (PPA) generalizes PCA by modeling the directions of maximal variance by means of curves instead of straight lines. Contrary to previous approaches, PPA reduces to performing simple univariate regressions, which makes it computationally feasible and robust. Moreover, PPA shows a number of interesting analytical properties. First, PPA is a volume-preserving map, which in turn guarantees the existence of the inverse. Second, such an inverse can be obtained…
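As a rough, hedged illustration of the sequential construction (not the authors' implementation), the sketch below performs a single PPA-style deflation step: project onto the leading principal direction, then fit univariate polynomials that predict the orthogonal residual from that projection; the function name and the fixed polynomial degree are choices made here.

```python
import numpy as np

def ppa_step(X, degree=2):
    """One deflation step: leading linear direction plus a polynomial curve."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    v = Vt[0]                               # leading principal direction
    score = Xc @ v                          # 1-D projection (the curve parameter)
    resid = Xc - np.outer(score, v)         # orthogonal residual
    # simple univariate regressions: each residual coordinate against the score
    coeffs = [np.polyfit(score, resid[:, j], degree) for j in range(X.shape[1])]
    curve = np.column_stack([np.polyval(c, score) for c in coeffs])
    return score, resid - curve             # score plus deflated data for the next step
```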

Subjects: FOS: Computer and information sciences; Polynomial; Computer Networks and Communications; Computer science; Machine Learning (stat.ML); Statistics - Machine Learning; Reduction (complexity); Artificial Intelligence; Principal Polynomial Analysis; Principal Component Analysis; Mahalanobis distance; Models, Statistical; Coding; Dimensionality reduction; Nonlinear dimensionality reduction; General Medicine; Classification; Manifold learning; Nonlinear Dynamics; Metric (mathematics); Jacobian matrix and determinant; Regression Analysis; Neural Networks, Computer; Algorithms; Curse of dimensionality
Source: International Journal of Neural Systems
researchProduct

Asymptotic and bootstrap tests for subspace dimension

2022

Most linear dimension reduction methods proposed in the literature can be formulated using an appropriate pair of scatter matrices, see e.g. Ye and Weiss (2003), Tyler et al. (2009), Bura and Yang (2011), Liski et al. (2014) and Luo and Li (2016). The eigen-decomposition of one scatter matrix with respect to another is then often used to determine the dimension of the signal subspace and to separate signal and noise parts of the data. Three popular dimension reduction methods, namely principal component analysis (PCA), fourth order blind identification (FOBI) and sliced inverse regression (SIR) are considered in detail and the first two moments of subsets of the eigenvalues are used to test…
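For intuition only, here is a minimal sketch of the "eigen-decomposition of one scatter matrix with respect to another" that such tests are built on, using the covariance matrix and the FOBI fourth-order scatter: for Gaussian noise directions the corresponding eigenvalues concentrate around p + 2, and the asymptotic and bootstrap tests examine the variability of the trailing eigenvalues around that value (the testing machinery itself is omitted here).

```python
import numpy as np

def fobi_eigenvalues(X):
    """Eigenvalues of the fourth-order scatter computed on cov-whitened data."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    mu = X.mean(axis=0)
    S1 = np.cov(X, rowvar=False)
    L = np.linalg.cholesky(np.linalg.inv(S1))     # whitening transform
    Z = (X - mu) @ L                              # cov(Z) is (approximately) the identity
    r2 = np.sum(Z**2, axis=1)
    S2 = (Z * r2[:, None]).T @ Z / n              # E[||z||^2 z z^T]
    return np.sort(np.linalg.eigvalsh(S2))[::-1]  # compare trailing values with p + 2
```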

Subjects: FOS: Computer and information sciences; FOS: Mathematics; Statistics and Probability; Principal component analysis; Mathematics - Statistics Theory (math.ST); Statistics - Methodology (stat.ME); Dimension (vector space); Scatter matrix; Sliced inverse regression; Applied mathematics; Eigenvalues and eigenvectors; Econometrics; Mathematics; Estimation; Numerical Analysis; Order determination; Dimensionality reduction; Independent component analysis; Multivariate methods; Statistics, Probability and Uncertainty; Subspace topology; Signal subspace
researchProduct

Dimensionality Reduction via Regression in Hyperspectral Imagery

2015

This paper introduces a new unsupervised method for dimensionality reduction via regression (DRR). The algorithm belongs to the family of invertible transforms that generalize Principal Component Analysis (PCA) by using curvilinear instead of linear features. DRR identifies the nonlinear features through multivariate regression to ensure the reduction in redundancy between the PCA coefficients, the reduction of the variance of the scores, and the reduction in the reconstruction error. More importantly, unlike other nonlinear dimensionality reduction methods, the invertibility, volume preservation, and straightforward out-of-sample extension make DRR interpretable and easy to apply. The pro…
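A hedged sketch of the DRR principle (not the published algorithm, which specifies its own multivariate regression): after PCA, each lower-variance coefficient is predicted from the leading ones with a nonlinear regression and only the residual is kept, so redundancy between the components is reduced. The quadratic regressor, scikit-learn pipeline and function name are choices made here.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

def drr_transform(X, degree=2):
    """PCA followed by nonlinear deflation of each component by its predecessors."""
    S = PCA().fit_transform(X)          # PCA scores, ordered by decreasing variance
    Y = S.copy()
    for i in range(1, S.shape[1]):
        reg = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        reg.fit(S[:, :i], S[:, i])
        Y[:, i] = S[:, i] - reg.predict(S[:, :i])   # keep only the unpredictable part
    return Y
```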

Subjects: FOS: Computer and information sciences; Dimensionality reduction; Computer Science - Computer Vision and Pattern Recognition (cs.CV); Feature extraction; Nonlinear dimensionality reduction; Diffusion map; Pattern recognition; Machine Learning (stat.ML); Statistics - Machine Learning; Collinearity; Reduction (complexity); Signal Processing; Principal component analysis; Artificial intelligence; Electrical and Electronic Engineering; Mathematics; Curse of dimensionality
researchProduct

Transfer of arbitrary two-qubit states via a spin chain

2015

We investigate the fidelity of the quantum state transfer (QST) of two qubits by means of an arbitrary spin-1/2 network, on a lattice of any dimensionality. Under the assumptions that the network Hamiltonian preserves the magnetization and that a fully polarized initial state is taken for the lattice, we obtain a general formula for the average fidelity of the two-qubit QST, linking it to the one- and two-particle transfer amplitudes of the spin excitations among the sites of the lattice. We then apply this formalism to a 1D spin chain with XX-Heisenberg-type nearest-neighbour interactions, adopting a protocol that is a generalization of the single-qubit one proposed in Ref. [Phys. Rev. A 8…
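As a toy illustration only (a single-excitation quantity, not the paper's two-qubit fidelity formula), the sketch below computes the one-particle transfer amplitude f_{N,1}(t) = <N| exp(-iHt) |1> for a uniform XX chain; this is the kind of transfer amplitude the average-fidelity expression is built from, and the function name, N, J and t are all chosen here.

```python
import numpy as np
from scipy.linalg import expm

def transfer_amplitude(N, J, t):
    """Amplitude for a single spin excitation to travel from site 1 to site N."""
    # single-excitation block of the XX Hamiltonian: a tridiagonal hopping matrix
    H = J * (np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1))
    U = expm(-1j * H * t)
    return U[N - 1, 0]

print(abs(transfer_amplitude(N=10, J=1.0, t=5.0)))
```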

Subjects: FOS: Physical sciences; Settore FIS/03 - Fisica della Materia; Magnetization; Quantum mechanics; Lattice (order); Two-qubit states; Quantum information; Quantum information science; Spin chain; Quantum state transfer; Quantum communication; Physics; Quantum Physics (quant-ph); Spin quantum number; Atomic and Molecular Physics, and Optics; Condensed Matter - Other Condensed Matter (cond-mat.other); Qubit; Hamiltonian (quantum mechanics); Curse of dimensionality
researchProduct

From Bi-Dimensionality to Uni-Dimensionality in Self-Report Questionnaires

2021

Abstract. The common factor model – by far the most widely used model for factor analysis – assumes equal item intercepts across respondents. Due to idiosyncratic ways of understanding and answering items of a questionnaire, this assumption is often violated, leading to an underestimation of model fit. Maydeu-Olivares and Coffman (2006) suggested the introduction of a random intercept into the model to address this concern. The present study applies this method to six established instruments (measuring depression, procrastination, optimism, self-esteem, core self-evaluations, and self-regulation) with ambiguous factor structures, using data from representative general population samples. I…
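For orientation, a minimal statement of the random-intercept extension of the common factor model in the spirit of Maydeu-Olivares and Coffman (2006); the notation below is chosen here, not taken from the paper.

```latex
% Common factor model with a respondent-specific random intercept \alpha_i
% (respondent i, item j); \alpha_i absorbs idiosyncratic response styles.
x_{ij} = \nu_j + \lambda_j \eta_i + \alpha_i + \varepsilon_{ij},
\qquad
\eta_i \sim \mathcal{N}(0, 1), \quad
\alpha_i \sim \mathcal{N}(0, \sigma_\alpha^2), \quad
\varepsilon_{ij} \sim \mathcal{N}(0, \theta_j).
```

Setting the random-intercept variance to zero recovers the standard model with equal item intercepts across respondents, which is the assumption the study relaxes.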

Subjects: Factor (chord); Statistics; Procrastination; Construct validity; Psychological testing; Personality Assessment Inventory; Self report; Psychology; Applied Psychology; Random intercept; Curse of dimensionality
Source: European Journal of Psychological Assessment
researchProduct

Combining feature extraction and expansion to improve classification based similarity learning

2017

Abstract Metric learning has been shown to outperform standard classification based similarity learning in a number of different contexts. In this paper, we show that the performance of classification similarity learning strongly depends on the data format used to learn the model. We then present an Enriched Classification Similarity Learning method that follows a hybrid approach that combines both feature extraction and feature expansion. In particular, we propose a data transformation and the use of a set of standard distances to supplement the information provided by the feature vectors of the training samples. The method is compared to state-of-the-art feature extraction and metric lear…
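A hedged sketch of the general recipe of classification-based similarity learning with feature expansion (not the paper's exact data transformation): each pair is encoded by element-wise absolute differences supplemented with a few standard distances, and a classifier predicts whether the two samples share a class; all names, the chosen distances and the toy data are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def pair_features(x1, x2):
    """Element-wise |x1 - x2| expanded with Euclidean, Manhattan and cosine distances."""
    diff = np.abs(x1 - x2)
    cos = 1.0 - np.dot(x1, x2) / (np.linalg.norm(x1) * np.linalg.norm(x2) + 1e-12)
    return np.concatenate([diff, [np.linalg.norm(x1 - x2), diff.sum(), cos]])

def make_pairs(X, y, rng, n_pairs=2000):
    idx = rng.integers(0, len(X), size=(n_pairs, 2))
    F = np.array([pair_features(X[i], X[j]) for i, j in idx])
    same = (y[idx[:, 0]] == y[idx[:, 1]]).astype(int)   # 1 if the pair shares a label
    return F, same

rng = np.random.default_rng(0)
X, y = rng.normal(size=(300, 5)), rng.integers(0, 3, size=300)
F, same = make_pairs(X, y, rng)
clf = LogisticRegression(max_iter=1000).fit(F, same)    # similarity as a classifier score
```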

Subjects: Feature extraction; Linear classifier; Semi-supervised learning; Machine learning; k-nearest neighbors algorithm; Artificial intelligence; Mathematics; Dimensionality reduction; Pattern recognition; Statistical classification; Signal Processing; Computer Vision and Pattern Recognition; Feature learning; Software; Similarity learning
Source: Pattern Recognition Letters
researchProduct

The EDTA Family of Molecular Based Ferromagnets

1991

The bimetallic compounds of the EDTA family offer a large variety of ferrimagnetic model systems in which the dimensionality as well as the exchange anisotropy can be controlled with ease. This paper deals with the magneto-structural chemistry of this kind of material, paying particular attention to both the low-dimensional magnetic behavior and the three-dimensional magnetic ordering.

Subjects: Ferromagnetism; Ferrimagnetism; Chemistry; Nanotechnology; Bimetallic strip; Curse of dimensionality
researchProduct

Functional principal component analysis for multivariate multidimensional environmental data

2015

Data with spatio-temporal structure can arise in many contexts, so considerable interest in modelling these data has been generated, but the complexity of spatio-temporal models, together with the size of the dataset, results in a challenging task. The modelling is even more complex in the presence of multivariate data. Since some modelling problems are more natural to think through in functional terms, even if only a finite number of observations is available, treating the data as functional can be useful (Berrendero et al. in Comput Stat Data Anal 55:2619–2634, 2011). Although in Ramsay and Silverman (Functional data analysis, 2nd edn. Springer, New York, 2005) the case of multiva…
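To make the multivariate functional PCA idea concrete, here is a minimal sketch under strong simplifying assumptions (curves already smoothed and sampled on a common grid, no roughness penalty); the paper's approach is more refined, and the function name and array shapes are chosen here.

```python
import numpy as np
from sklearn.decomposition import PCA

def multivariate_fpca(curves, n_components=3):
    """curves: array of shape (n_samples, n_variables, n_gridpoints)."""
    n, q, m = curves.shape
    X = curves.reshape(n, q * m)                 # concatenate the variables per sample
    pca = PCA(n_components=n_components).fit(X)
    scores = pca.transform(X)                    # functional principal component scores
    harmonics = pca.components_.reshape(n_components, q, m)  # eigenfunctions, per variable
    return scores, harmonics
```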

Subjects: Functional principal component analysis; Statistics and Probability; Multivariate statistics; Generalization; Dimensionality reduction; Generalized additive model; Functional data analysis; Context (language use); Multivariate spatio-temporal data; Covariate; P-spline; Data mining; Statistics, Probability and Uncertainty; Smoothing; General Environmental Science; Mathematics
researchProduct

Semisupervised nonlinear feature extraction for image classification

2012

Feature extraction is of paramount importance for an accurate classification of remote sensing images. Techniques based on data transformations are widely used in this context. However, linear feature extraction algorithms, such as the principal component analysis and partial least squares, can address this problem in a suboptimal way because the data relations are often nonlinear. Kernel methods may alleviate this problem only when the structure of the data manifold is properly captured. However, this is difficult to achieve when small-size training sets are available. In these cases, exploiting the information contained in unlabeled samples together with the available training data can si…
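As a hedged illustration of the semisupervised strategy (not the specific kernel feature extractor proposed in the paper), the sketch below fits a kernel PCA on labeled plus unlabeled samples so that the manifold structure is estimated from all the data, and then trains a classifier on the labeled projections only; the RBF kernel, gamma value and function name are arbitrary choices.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import SVC

def semisupervised_kpca_classify(X_lab, y_lab, X_unlab, X_test, n_components=10):
    """Unsupervised kernel feature extraction on all data, supervised step on labels only."""
    kpca = KernelPCA(n_components=n_components, kernel="rbf", gamma=0.1)
    kpca.fit(np.vstack([X_lab, X_unlab]))            # manifold learned from all samples
    clf = SVC().fit(kpca.transform(X_lab), y_lab)    # classifier sees labeled data only
    return clf.predict(kpca.transform(X_test))
```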

Subjects: Graph kernel; Computer science; Feature extraction; Kernel principal component analysis; k-nearest neighbors algorithm; Kernel (linear algebra); Polynomial kernel; Partial least squares regression; Least squares support vector machine; Cluster analysis; Training set; Contextual image classification; Dimensionality reduction; Pattern recognition; Manifold; Kernel method; Kernel embedding of distributions; Kernel (statistics); Principal component analysis; Radial basis function kernel; Principal component regression; Data mining; Artificial intelligence
Source: 2012 IEEE International Geoscience and Remote Sensing Symposium
researchProduct