Search results for "Statistics - Machine Learning"

Showing 10 of 90 documents

Ignorance-Aware Approaches and Algorithms for Prototype Selection in Machine Learning

2019

Operating with ignorance is an important concern of Machine Learning research, especially when the objective is to discover knowledge from imperfect data. Data mining (driven by appropriate knowledge discovery tools) is about processing available (observed, known and understood) samples of data, aiming to build a model (e.g., a classifier) to handle data samples which are not yet observed, known or understood. These tools traditionally take samples of the available data (known facts) as an input for learning. We want to challenge the indispensability of this approach and suggest considering things the other way around. What if the task were as follows: how to learn a mode…

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG)

Information Theory in Density Destructors

2020

Density destructors are differentiable and invertible transforms that map multivariate PDFs of arbitrary structure (low entropy) into non-structured PDFs (maximum entropy). Multivariate Gaussianization and multivariate equalization are specific examples of this family, which break down the complexity of the original PDF through a set of elementary transforms that progressively remove the structure of the data. We demonstrate how this property of density destructive flows is connected to classical information theory, and how density destructors can be used to get more accurate estimates of information theoretic quantities. Experiments with total correlation and mutual information in multivari…
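
The equalization destructor mentioned in the abstract has a simple one-dimensional instance: mapping samples through their empirical CDF sends any input distribution to an approximately uniform (maximum-entropy) output on [0, 1]. The sketch below illustrates that idea only; it is not the authors' implementation, and the function name is illustrative.

```python
import random
import statistics

def empirical_cdf_transform(samples):
    """1-D 'equalization' destructor: push each sample through the
    empirical CDF, yielding values approximately uniform on (0, 1)."""
    ranked = sorted(samples)
    n = len(samples)
    rank = {}
    for i, v in enumerate(ranked):
        rank.setdefault(v, i + 1)  # first rank wins for ties
    # rank / (n + 1) keeps outputs strictly inside (0, 1)
    return [rank[v] / (n + 1) for v in samples]

random.seed(0)
# Structured (low-entropy) input: a narrow Gaussian
data = [random.gauss(5.0, 0.1) for _ in range(2000)]
u = empirical_cdf_transform(data)
print(round(statistics.mean(u), 2), round(statistics.pstdev(u), 2))
```

After the transform the mean is 0.5 and the spread matches a uniform distribution (standard deviation near 1/√12 ≈ 0.289), regardless of the input's original shape.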

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG)

Information Theory Measures via Multidimensional Gaussianization

2020

Information theory is an outstanding framework to measure uncertainty, dependence and relevance in data and systems. It has several desirable properties for real world applications: it naturally deals with multivariate data, it can handle heterogeneous data types, and the measures can be interpreted in physical units. However, it has not been adopted by a wider audience because obtaining information from multidimensional data is a challenging problem due to the curse of dimensionality. Here we propose an indirect way of computing information based on a multivariate Gaussianization transform. Our proposal mitigates the difficulty of multivariate density estimation by reducing it to a composi…
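
The building block of the multivariate Gaussianization transform described above is a marginal step: empirical CDF followed by the inverse Gaussian CDF, which maps any one-dimensional distribution to approximately N(0, 1); the full transform composes such steps with rotations. A minimal stdlib-only sketch of one marginal step, under the assumption of distinct samples:

```python
import random
from statistics import NormalDist, mean, pstdev

def marginal_gaussianize(samples):
    """One marginal Gaussianization step: empirical CDF, then the
    inverse standard-normal CDF, mapping any 1-D data to ~N(0, 1)."""
    n = len(samples)
    ranked = sorted(samples)
    rank = {}
    for i, v in enumerate(ranked):
        rank.setdefault(v, i + 1)
    phi_inv = NormalDist().inv_cdf
    return [phi_inv(rank[v] / (n + 1)) for v in samples]

random.seed(1)
# Heavily non-Gaussian input: exponential samples
x = [random.expovariate(1.0) for _ in range(5000)]
z = marginal_gaussianize(x)
print(round(mean(z), 2), round(pstdev(z), 2))
```

Because the transform's log-Jacobian is computable, entropies of the original data can be read off from the (known) entropy of the Gaussianized output, which is the indirect estimation route the abstract proposes.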

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG)

Scalable Initialization Methods for Large-Scale Clustering

2020

In this work, two new initialization methods for K-means clustering are proposed. Both proposals apply a divide-and-conquer approach to the K-means|| initialization strategy. The second proposal also utilizes multiple lower-dimensional subspaces produced by the random projection method for the initialization. The proposed methods are scalable and can be run in parallel, which makes them suitable for initializing large-scale problems. In the experiments, the proposed methods are compared to the K-means++ and K-means|| methods using an extensive set of reference and synthetic large-scale datasets. Concerning the latter, a novel high-dimensional cluster…
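
For reference, K-means++ (one of the baselines named above) seeds each new center with probability proportional to the squared distance from the nearest already-chosen center. A 1-D sketch of that seeding rule, not the paper's proposed methods:

```python
import random

def kmeans_pp_init(points, k, rng):
    """K-means++ seeding: each new center is drawn with probability
    proportional to the squared distance to the nearest chosen center."""
    centers = [rng.choice(points)]
    while len(centers) < k:
        d2 = [min((p - c) ** 2 for c in centers) for p in points]  # 1-D distances
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centers.append(p)
                break
        else:  # guard against float round-off at the end of the scan
            centers.append(points[-1])
    return centers

rng = random.Random(0)
# Three well-separated 1-D clusters around 0, 10 and 20
pts = [rng.gauss(m, 0.1) for m in (0.0, 10.0, 20.0) for _ in range(100)]
centers = sorted(kmeans_pp_init(pts, 3, rng))
print(centers)
```

With well-separated clusters the d² weighting makes the seeding land one center in each cluster with overwhelming probability, which is exactly the property the scalable K-means|| variant preserves while batching the sampling.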

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG)

Kernel methods and their derivatives: Concept and perspectives for the earth system sciences.

2020

Kernel methods are powerful machine learning techniques which implement generic non-linear functions to solve complex tasks in a simple way. They have a solid mathematical background and exhibit excellent performance in practice. However, kernel machines are still considered black-box models, as the feature mapping is not directly accessible and difficult to interpret. The aim of this work is to show that it is indeed possible to interpret the functions learned by various kernel methods in an intuitive way, despite their complexity. Specifically, we show that derivatives of these functions have a simple mathematical formulation, are easy to compute, and can be applied to many different problems. We n…
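
The "simple mathematical formulation" of the derivative is easy to see for an RBF kernel expansion f(x) = Σᵢ αᵢ k(x, xᵢ): since k(x, xᵢ) = exp(−(x − xᵢ)²/2σ²), the derivative is Σᵢ αᵢ k(x, xᵢ)(xᵢ − x)/σ². A 1-D sketch with made-up coefficients, checked against a finite difference:

```python
import math

def rbf(x, xi, sigma):
    """Gaussian (RBF) kernel in one dimension."""
    return math.exp(-(x - xi) ** 2 / (2 * sigma ** 2))

def f(x, xs, alphas, sigma):
    """Kernel expansion f(x) = sum_i alpha_i * k(x, x_i)."""
    return sum(a * rbf(x, xi, sigma) for a, xi in zip(alphas, xs))

def df(x, xs, alphas, sigma):
    """Analytic derivative: d/dx k(x, x_i) = ((x_i - x) / sigma^2) * k(x, x_i)."""
    return sum(a * rbf(x, xi, sigma) * (xi - x) / sigma ** 2
               for a, xi in zip(alphas, xs))

# Illustrative support points and dual coefficients (not from the paper)
xs = [0.0, 1.0, 2.5]
alphas = [0.7, -0.3, 1.1]
sigma = 0.8
x0 = 0.9

# Central finite difference as an independent check
h = 1e-6
numeric = (f(x0 + h, xs, alphas, sigma) - f(x0 - h, xs, alphas, sigma)) / (2 * h)
analytic = df(x0, xs, alphas, sigma)
print(analytic, numeric)
```

The derivative has the same cost as evaluating the function itself, which is what makes derivative-based interpretation of kernel machines practical.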

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG); Kernel methods; Support vector machines; Density estimation; Gaussian processes; Entropy; Earth system sciences; PLoS ONE

Forecasting : theory and practice

2022

Forecasting has always been at the forefront of decision making and planning. The uncertainty that surrounds the future is both exciting and challenging, with individuals and organisations seeking to minimise risks and maximise utilities. The large number of forecasting applications calls for a diverse set of forecasting methods to tackle real-life challenges. This article provides a non-systematic review of the theory and the practice of forecasting. We provide an overview of a wide range of theoretical, state-of-the-art models, methods, principles, and approaches to prepare, produce, organise, and evaluate forecasts. We then demonstrate how such theoretical concepts are applied in a varie…

FOS: Computer and information sciences; FOS: Economics and business; Computer Science - Machine Learning; Statistics - Machine Learning; Statistics - Applications; Statistics - Other Statistics; Economics - Econometrics; Machine Learning (cs.LG); Machine Learning (stat.ML); Applications (stat.AP); Econometrics (econ.EM); Business and Economics; Business Administration; Forecasting; Time series; Prediction; Review

PerceptNet: A Human Visual System Inspired Neural Network for Estimating Perceptual Distance

2019

Traditionally, the vision community has devised algorithms to estimate the distance between an original image and images that have been subjected to perturbations. Inspiration was usually taken from the human visual system and how it processes different perturbations, in order to replicate the extent to which they determine our ability to judge image quality. While recent works have presented deep neural networks trained to predict human perceptual quality, very few borrow any intuitions from the human visual system. To address this, we present PerceptNet, a convolutional neural network where the architecture has been chosen to reflect the structure and various stages in the human…

FOS: Computer and information sciences; FOS: Electrical engineering, electronic engineering, information engineering; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG); Image and Video Processing (eess.IV); Visual perception; Image quality; Convolutional neural networks; Human visual system; Perceptual distance; Deep learning

Implicit differentiation of Lasso-type models for hyperparameter optimization

2020

Setting regularization parameters for Lasso-type estimators is notoriously difficult, though crucial in practice. The most popular hyperparameter optimization approach is grid-search using held-out validation data. Grid-search, however, requires choosing a predefined grid for each parameter, which scales exponentially in the number of parameters. Another approach is to cast hyperparameter optimization as a bi-level optimization problem that can be solved by gradient descent. The key challenge for these methods is the estimation of the gradient w.r.t. the hyperparameters. Computing this gradient via forward or backward automatic differentiation is possible yet usually s…
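
The bi-level idea can be made concrete in closed form. The abstract targets Lasso; the sketch below swaps in a scalar ridge problem instead, because its inner solution w(λ) = Sxy/(Sxx + λ) is differentiable by hand, which exposes the implicit-function-theorem step (chain dL_val/dw with dw/dλ) without any solver. All data values are made up for illustration.

```python
def ridge_w(xs, ys, lam):
    """Closed-form minimiser of sum_i (y_i - w*x_i)^2 + lam * w^2."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

def val_loss(w, xv, yv):
    """Held-out validation loss, the outer objective of the bi-level problem."""
    return sum((y - w * x) ** 2 for x, y in zip(xv, yv))

def hypergrad(xs, ys, xv, yv, lam):
    """dL_val/d_lambda via implicit differentiation:
    dw/dlam = -Sxy/(Sxx + lam)^2, chained with dL_val/dw."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    w = sxy / (sxx + lam)
    dw_dlam = -sxy / (sxx + lam) ** 2
    dL_dw = sum(-2 * x * (y - w * x) for x, y in zip(xv, yv))
    return dL_dw * dw_dlam

# Tiny illustrative train/validation split
xs, ys = [1.0, 2.0, 3.0], [2.1, 3.9, 6.2]
xv, yv = [1.5, 2.5], [3.0, 5.1]
lam = 0.5

# Check the implicit gradient against a central finite difference in lambda
h = 1e-6
numeric = (val_loss(ridge_w(xs, ys, lam + h), xv, yv)
           - val_loss(ridge_w(xs, ys, lam - h), xv, yv)) / (2 * h)
print(hypergrad(xs, ys, xv, yv, lam), numeric)
```

For Lasso the inner problem has no closed form and the implicit equation involves the active set, which is precisely the difficulty the paper addresses; the chain-rule structure, however, is the same as in this sketch.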

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG); [STAT.ML] Statistics [stat]/Machine Learning [stat.ML]

On the Universality of Graph Neural Networks on Large Random Graphs

2021

We study the approximation power of Graph Neural Networks (GNNs) on latent position random graphs. In the large graph limit, GNNs are known to converge to certain "continuous" models known as c-GNNs, which directly enables a study of their approximation power on random graph models. In the absence of input node features however, just as GNNs are limited by the Weisfeiler-Lehman isomorphism test, c-GNNs will be severely limited on simple random graph models. For instance, they will fail to distinguish the communities of a well-separated Stochastic Block Model (SBM) with constant degree function. Thus, we consider recently proposed architectures that augment GNNs with …

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG); [STAT.ML] Statistics [stat]/Machine Learning [stat.ML]; [MATH.MATH-ST] Mathematics [math]/Statistics [math.ST]

Convergence and Stability of Graph Convolutional Networks on Large Random Graphs

2020

We study properties of Graph Convolutional Networks (GCNs) by analyzing their behavior on standard models of random graphs, where nodes are represented by random latent variables and edges are drawn according to a similarity kernel. This allows us to overcome the difficulties of dealing with discrete notions such as isomorphisms on very large graphs, by considering instead more natural geometric aspects. We first study the convergence of GCNs to their continuous counterpart as the number of nodes grows. Our results are fully non-asymptotic and are valid for relatively sparse graphs with an average degree that grows logarithmically with the number of nodes. We then an…
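
The random graph model described above (latent node variables, edges drawn from a similarity kernel) is easy to simulate, together with one GCN-style mean-aggregation step. A stdlib-only sketch, with the kernel bandwidth and graph size chosen arbitrarily for illustration:

```python
import math
import random

random.seed(2)
n = 60
# Latent positions on [0, 1]; edges drawn from a Gaussian similarity kernel
pos = [random.uniform(0.0, 1.0) for _ in range(n)]

def edge_prob(u, v, bandwidth=0.2):
    """Similarity kernel: nearby latent positions connect more often."""
    return math.exp(-(pos[u] - pos[v]) ** 2 / (2 * bandwidth ** 2))

adj = [[False] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        if random.random() < edge_prob(i, j):
            adj[i][j] = adj[j][i] = True

# One GCN-style layer: mean aggregation over neighbours plus a self-loop,
# applied to a scalar signal (here, the latent coordinate itself)
feat = pos[:]
out = []
for i in range(n):
    neigh = [feat[j] for j in range(n) if adj[i][j]] + [feat[i]]
    out.append(sum(neigh) / len(neigh))

def spread(v):
    return max(v) - min(v)

# Aggregation smooths the signal: the output varies less than the input
print(spread(feat), spread(out))
```

As n grows, each node's aggregated value concentrates around a kernel-weighted average over the latent space, which is the continuous counterpart whose convergence the paper quantifies non-asymptotically.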

FOS: Computer and information sciences; Computer Science - Machine Learning; Statistics - Machine Learning; Machine Learning (stat.ML); Machine Learning (cs.LG); [STAT.ML] Statistics [stat]/Machine Learning [stat.ML]; [INFO.INFO-LG] Computer Science [cs]/Machine Learning [cs.LG]