Search results for "Entropy"

Showing 10 of 496 documents

Kernel methods and their derivatives: Concept and perspectives for the earth system sciences.

2020

Kernel methods are powerful machine learning techniques which implement generic non-linear functions to solve complex tasks in a simple way. They have a solid mathematical background and exhibit excellent performance in practice. However, kernel machines are still considered black-box models, as the feature mapping is not directly accessible and is difficult to interpret. The aim of this work is to show that it is indeed possible to interpret the functions learned by various kernel methods in an intuitive way despite their complexity. Specifically, we show that derivatives of these functions have a simple mathematical formulation, are easy to compute, and can be applied to many different problems. We n…
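The core claim — that derivatives of kernel-machine functions have a simple closed form — can be illustrated with a minimal sketch, assuming a Gaussian RBF kernel and made-up expansion coefficients (the names `rbf`, `alphas`, and the data are illustrative, not from the paper):

```python
import math

def rbf(x, y, sigma=1.0):
    """Gaussian (RBF) kernel between scalars x and y."""
    return math.exp(-(x - y) ** 2 / (2 * sigma ** 2))

def rbf_dx(x, y, sigma=1.0):
    """Analytic derivative of the RBF kernel with respect to x."""
    return -(x - y) / sigma ** 2 * rbf(x, y, sigma)

# A kernel machine f(x) = sum_i alpha_i k(x, x_i) is differentiated
# term by term, so f'(x) = sum_i alpha_i dk/dx(x, x_i).
xs = [0.0, 1.0, 2.5]        # hypothetical support points
alphas = [0.5, -1.0, 0.8]   # hypothetical dual coefficients

def f(x):
    return sum(a * rbf(x, xi) for a, xi in zip(alphas, xs))

def df(x):
    return sum(a * rbf_dx(x, xi) for a, xi in zip(alphas, xs))

# Sanity check against a central finite difference.
h = 1e-6
x0 = 0.7
numeric = (f(x0 + h) - f(x0 - h)) / (2 * h)
print(abs(df(x0) - numeric) < 1e-6)
```

The analytic derivative costs no more than evaluating the kernel expansion itself, which is what makes derivative-based interpretation cheap in practice.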

Keywords: kernel methods; kernel functions; support vector machines; Gaussian processes; density estimation; entropy (information theory); normal distribution; random variables; eigenvectors; linear algebra; operator theory; applied mathematics; simulation and modeling; data visualization; Earth systems; machine learning (cs.LG, stat.ML)
Published in: PLoS ONE
researchProduct

A survey of active learning algorithms for supervised remote sensing image classification

2011

Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership and then the user is asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active …
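The ranking step described above — scoring unlabeled pixels by the uncertainty of their class membership — can be sketched with entropy-based uncertainty sampling, one common heuristic family; the probability vectors below are hypothetical, not from the survey:

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a discrete class-probability vector."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def rank_by_uncertainty(probs):
    """Return unlabeled-sample indices sorted from most to least uncertain."""
    return sorted(range(len(probs)), key=lambda i: entropy(probs[i]), reverse=True)

# Hypothetical class-membership probabilities for four unlabeled pixels.
probs = [
    [0.98, 0.01, 0.01],   # confident prediction
    [0.34, 0.33, 0.33],   # nearly uniform: most uncertain
    [0.70, 0.20, 0.10],
    [0.50, 0.50, 0.00],
]
print(rank_by_uncertainty(probs)[0])  # → 1: the most uncertain pixel is queried first
```

The user would then label the top-ranked pixels, retrain, and repeat, which is the iterative loop the survey evaluates.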

Keywords: active learning; remote sensing; image classification; hyperspectral imaging; very high resolution (VHR); support vector machine (SVM); training set definition; entropy (information theory); heuristics; machine learning; computer vision and pattern recognition (cs.CV); signal processing; electrical and electronic engineering
researchProduct

Fractional generalized cumulative entropy and its dynamic version

2021

Following the theory of information measures based on the cumulative distribution function, we propose the fractional generalized cumulative entropy and its dynamic version. These entropies are particularly suitable for dealing with distributions satisfying the proportional reversed hazard model. We study the connection with fractional integrals, and some bounds and comparisons based on stochastic orderings, which allow us to show that the proposed measure is actually a variability measure. The investigation also involves various notions of reliability theory, since the considered dynamic measure is a suitable extension of the mean inactivity time. We also introduce the empirical generalized fract…
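A numerical sketch of the empirical measure, under the assumption that the fractional generalized cumulative entropy has the form ∫ F(x)·(−log F(x))^ν dx / Γ(ν+1), approximated from the empirical CDF via order statistics (the functional form is an assumption for illustration; consult the paper for the exact definition):

```python
import math

def empirical_fgce(sample, nu=1.0):
    """Empirical fractional generalized cumulative entropy (assumed form):
    integral of F(x) * (-log F(x))**nu / Gamma(nu + 1), with F the empirical
    CDF, which equals i/n on the interval [x_(i), x_(i+1))."""
    xs = sorted(sample)
    n = len(xs)
    total = 0.0
    for i in range(1, n):
        p = i / n
        total += (xs[i] - xs[i - 1]) * p * (-math.log(p)) ** nu
    return total / math.gamma(nu + 1)

data = [0.2, 0.5, 0.9, 1.4, 2.0]
print(empirical_fgce(data, nu=1.0))   # nu = 1 recovers the classical cumulative entropy
print(empirical_fgce(data, nu=0.5))   # a fractional order between 0 and 1
```

For ν = 1 the Γ factor is 1 and the sum reduces to the standard empirical cumulative entropy, which is the consistency check one would expect of a fractional generalization.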

Keywords: cumulative entropy; fractional calculus; stochastic orderings; estimation; cumulative distribution function; empirical measure; exponential distribution; central limit theorem; applied mathematics; information theory (cs.IT); probability (math.PR); statistics theory (math.ST)
Published in: Communications in Nonlinear Science and Numerical Simulation
researchProduct

Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes

2017

Exploiting the theory of state space models, we derive the exact expressions of the information transfer, as well as redundant and synergistic transfer, for coupled Gaussian processes observed at multiple temporal scales. All of the terms, constituting the frameworks known as interaction information decomposition and partial information decomposition, can thus be analytically obtained for different time scales from the parameters of the VAR model that fits the processes. We report the application of the proposed methodology firstly to benchmark Gaussian systems, showing that this class of systems may generate patterns of information decomposition characterized by prevalently redundant or sy…
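For jointly Gaussian variables, the interaction information mentioned above is exactly computable from covariance log-determinants, since a k-variate Gaussian has differential entropy ½·ln((2πe)^k·det Σ). A minimal sketch with a made-up covariance matrix in which Z is a common driver of X and Y (sign conventions for interaction information vary; here positive means redundancy-dominated):

```python
import math

def h_gauss(cov):
    """Differential entropy (nats) of a zero-mean Gaussian with covariance `cov`."""
    k = len(cov)
    if k == 1:
        det = cov[0][0]
    elif k == 2:
        det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    else:  # 3x3 via cofactor expansion
        a, b, c = cov[0]; d, e, f = cov[1]; g, h, i = cov[2]
        det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    return 0.5 * math.log((2 * math.pi * math.e) ** k * det)

def sub(S, idx):
    """Submatrix of S restricted to the variables in idx."""
    return [[S[i][j] for j in idx] for i in idx]

# Hypothetical covariance for (X, Y, Z): Z drives both X and Y.
S = [[1.0, 0.5, 0.7],
     [0.5, 1.0, 0.7],
     [0.7, 0.7, 1.0]]

hx, hy, hz = (h_gauss(sub(S, [i])) for i in range(3))
hxz, hyz = h_gauss(sub(S, [0, 2])), h_gauss(sub(S, [1, 2]))
hxy, hxyz = h_gauss(sub(S, [0, 1])), h_gauss(sub(S, [0, 1, 2]))

i_xy = hx + hy - hxy                      # I(X;Y)
i_xy_given_z = hxz + hyz - hz - hxyz      # I(X;Y|Z)
ii = i_xy - i_xy_given_z                  # interaction information
print(ii > 0)  # common-driver structure yields redundancy (positive ii)
```

Conditioning on the common driver Z removes almost all of the X–Y dependence, so I(X;Y|Z) is near zero and the decomposition is dominated by redundancy, matching the paper's observation that some system classes generate prevalently redundant patterns.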

Keywords: information dynamics; information transfer; transfer entropy; multiscale entropy; multivariate time series analysis; redundancy and synergy; state space models; vector autoregressive models; Gaussian processes; interaction information; information theory; entropy
researchProduct

Local Granger causality

2021

Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. For Gaussian variables it is equivalent to transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes. We exploit such equivalence and calculate exactly the 'local Granger causality', i.e. the profile of the information transfer at each discrete time point in Gaussian processes; in this frame Granger causality is the average of its local version. Our approach offers a robust and computationally fast method to follow the information transfer along the time history of linear stochastic processes, as well as of nonlinear …
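The equivalence exploited here can be demonstrated on simulated data: fit restricted and full autoregressions by least squares, take Granger causality as the log-ratio of residual variances, and define local values as pointwise log-ratios of the two Gaussian predictive densities. This is a minimal sketch with a made-up bivariate VAR(1), not the paper's implementation:

```python
import math, random

random.seed(42)

# Simulate a bivariate Gaussian process in which y drives x with lag 1.
n = 5000
x, y = [0.0], [0.0]
for _ in range(n):
    x_new = 0.5 * x[-1] + 0.8 * y[-1] + random.gauss(0, 1)
    y_new = 0.7 * y[-1] + random.gauss(0, 1)
    x.append(x_new)
    y.append(y_new)

t, xl, yl = x[1:], x[:-1], y[:-1]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Restricted model: x_t regressed on x_{t-1} only.
a_r = dot(t, xl) / dot(xl, xl)
res_r = [ti - a_r * xi for ti, xi in zip(t, xl)]
s2_r = dot(res_r, res_r) / n

# Full model: x_t regressed on (x_{t-1}, y_{t-1}); 2x2 normal equations.
sxx, syy, sxy = dot(xl, xl), dot(yl, yl), dot(xl, yl)
stx, sty = dot(t, xl), dot(t, yl)
det = sxx * syy - sxy * sxy
a_f = (stx * syy - sty * sxy) / det
b_f = (sxx * sty - sxy * stx) / det
res_f = [ti - a_f * xi - b_f * yi for ti, xi, yi in zip(t, xl, yl)]
s2_f = dot(res_f, res_f) / n

gc = math.log(s2_r / s2_f)  # global Granger causality y -> x

def log_norm(e, s2):
    """Log-density of a residual e under N(0, s2)."""
    return -0.5 * math.log(2 * math.pi * s2) - e * e / (2 * s2)

# Local Granger causality: pointwise log-ratio of the predictive densities.
local = [2 * (log_norm(ef, s2_f) - log_norm(er, s2_r))
         for ef, er in zip(res_f, res_r)]

print(round(gc, 3))
print(abs(sum(local) / n - gc) < 1e-9)  # the local profile averages to gc
```

With maximum-likelihood residual variances the quadratic terms in the two log-densities average out exactly, so the sample mean of the local values equals the global log-variance-ratio, illustrating the statement that Granger causality is the average of its local version.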

Keywords: Granger causality; transfer entropy; information theory; vector autoregression; time series; stochastic processes; Gaussian processes; applied mathematics; machine learning (stat.ML); computational physics (physics.comp-ph); disordered systems and neural networks (cond-mat.dis-nn); quantitative methods (q-bio.QM)
researchProduct

Multiscale analysis of information dynamics for linear multivariate processes.

2016

In the study of complex physical and physiological systems represented by multivariate time series, an issue of great interest is the description of the system dynamics over a range of different temporal scales. While information-theoretic approaches to the multiscale analysis of complex dynamics are being increasingly used, the theoretical properties of the applied measures are poorly understood. This study introduces for the first time a framework for the analytical computation of information dynamics for linear multivariate stochastic processes explored at different time scales. After showing that the multiscale processing of a vector autoregressive (VAR) process introduces a moving aver…

Keywords: information transfer; entropy; multivariate analysis; stochastic processes; time series; autoregressive models; moving average; complex dynamics; probability and statistics; biomedical engineering; signal processing; information theory (cs.IT); data analysis, statistics and probability (physics.data-an)
Published in: Annual International Conference of the IEEE Engineering in Medicine and Biology Society
researchProduct

Gaussianizing the Earth: Multidimensional Information Measures for Earth Data Analysis

2021

Information theory is an excellent framework for analyzing Earth system data because it allows us to characterize uncertainty and redundancy, and is universally interpretable. However, accurately estimating information content is challenging because spatio-temporal data is high-dimensional, heterogeneous and has non-linear characteristics. In this paper, we apply multivariate Gaussianization for probability density estimation which is robust to dimensionality, comes with statistical guarantees, and is easy to apply. In addition, this methodology allows us to estimate information-theoretic measures to characterize multivariate densities: information, entropy, total correlation, and mutual in…
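The estimation idea can be sketched in a simplified, marginal-only form: Gaussianize each variable through its empirical ranks, then read mutual information off the Gaussian formula −½·ln(1−ρ²). This is a Gaussian-copula shortcut for illustration, not the paper's full iterative (rotation-based) multivariate Gaussianization, and the data below are synthetic:

```python
import math, random
from statistics import NormalDist

random.seed(0)
nd = NormalDist()

def gaussianize(v):
    """Map each value to a standard-normal quantile via its empirical rank."""
    n = len(v)
    order = sorted(range(n), key=lambda i: v[i])
    g = [0.0] * n
    for rank, i in enumerate(order, start=1):
        g[i] = nd.inv_cdf(rank / (n + 1))  # n+1 keeps quantiles inside (0, 1)
    return g

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = math.sqrt(sum((a - mu) ** 2 for a in u))
    sv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return cov / (su * sv)

# Dependent, non-Gaussian pair: y is a noisy monotone function of x.
x = [random.expovariate(1.0) for _ in range(2000)]
y = [math.log(xi + 1) + random.gauss(0, 0.3) for xi in x]

rho = pearson(gaussianize(x), gaussianize(y))
mi = -0.5 * math.log(1 - rho * rho)  # Gaussian mutual information, in nats
print(mi > 0.2)  # dependence detected despite heavily non-Gaussian marginals
```

The transform makes the heavy-tailed exponential marginal harmless, which is the appeal of Gaussianization for heterogeneous Earth data; capturing dependence beyond the monotone case is what the full multivariate procedure adds.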

Keywords: information theory; entropy (information theory); mutual information; total correlation; redundancy (information theory); multivariate statistics; Earth system science; climate action; data mining; curse of dimensionality; machine learning (stat.ML); applications (stat.AP)
researchProduct

Multiscale partial information decomposition of dynamic processes with short and long-range correlations: theory and application to cardiovascular co…

2022

Abstract Objective. In this work, an analytical framework for the multiscale analysis of multivariate Gaussian processes is presented, whereby the computation of Partial Information Decomposition measures is achieved accounting for the simultaneous presence of short-term dynamics and long-range correlations. Approach. We consider physiological time series mapping the activity of the cardiac, vascular and respiratory systems in the field of Network Physiology. In this context, the multiscale representation of transfer entropy within the network of interactions among Systolic arterial pressure (S), respiration (R) and heart period (H), as well as the decomposition into unique, redundant and s…

Keywords: multivariate time series; transfer entropy; redundancy and synergy; heart rate variability; cardiovascular control; blood pressure; heart rate; respiration; vector autoregressive fractionally integrated (VARFI) models; physiological measurement; physiology; biophysics; quantitative methods (q-bio.QM)
researchProduct

Fractional quantum Hall effect in the interacting Hofstadter model via tensor networks

2017

We show via tensor network methods that the Harper-Hofstadter Hamiltonian for hard-core bosons on a square geometry supports a topological phase realizing the $\nu=1/2$ fractional quantum Hall effect on the lattice. We address the robustness of the ground state degeneracy and of the energy gap, measure the many-body Chern number, and characterize the system using Green functions, showing that they decay algebraically at the edges of open geometries, indicating the presence of gapless edge modes. Moreover, we estimate the topological entanglement entropy by taking a combination of lattice bipartitions that reproduces the topological structure of the original proposals by Kitaev and Preskill,…

Keywords: fractional quantum Hall effect; quantum Hall effect; quantum entanglement; Chern class; bosons; strongly correlated electrons (cond-mat.str-el); quantum gases (cond-mat.quant-gas); quantum physics (quant-ph); condensed matter physics; electronic, optical and magnetic materials
Published in: Physical Review B
researchProduct

Risk Assessment and Analysis

2014

Once threats are identified, they must be assessed or evaluated, which is the objective of this chapter. There is usually a large number of threats, making it impossible or unprofitable to analyze them all, so selecting the threats to be addressed is important. This selection is a decision-making process; an example is proposed and solved with one of the many available techniques. The chapter presents a commonly requested study, the assessment of the economic and financial risks of a project. This is done through a real-life example, followed by another appraisal devoted to economic issues, and another for transportation, introducing the important concept of…

Keywords: risk assessment; probabilistic risk assessment; risk analysis (engineering); fault tree analysis; failure mode and effects analysis; financial risk; conditional probability; entropy (information theory)
researchProduct