Search results for "optimization"

Showing 10 of 2,824 documents

Improving table compression with combinatorial optimization

2002

We study the problem of compressing massive tables within the partition-training paradigm introduced by Buchsbaum et al. [SODA'00], in which a table is partitioned by an off-line training procedure into disjoint intervals of columns, each of which is compressed separately by a standard, on-line compressor like gzip. We provide a new theory that unifies previous experimental observations on partitioning and heuristic observations on column permutation, all of which are used to improve compression rates. Based on the theory, we devise the first on-line training algorithms for table compression, which can be applied to individual files, not just continuously operating sources; and also a new, …
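As a toy illustration of the partition-then-compress idea described above, the sketch below compares compressing a table as one unit against compressing two disjoint column intervals separately. The table, the field contents, and the chosen partition are all hypothetical, and Python's zlib stands in for gzip:

```python
import zlib

# Hypothetical toy table: rows of fixed-width string fields.
rows = [
    ("2024-01-0%d" % (i % 9 + 1), "sensor-A", "%05d" % (i * 37 % 100))
    for i in range(200)
]

def compressed_size(columns):
    """Serialize a contiguous interval of columns and compress it."""
    blob = "\n".join("|".join(row[c] for c in columns) for row in rows).encode()
    return len(zlib.compress(blob, 9))

# Compress the whole table as one unit...
whole = compressed_size([0, 1, 2])
# ...versus partitioning into the intervals [0] and [1, 2],
# each compressed separately, as in the partition-training paradigm.
partitioned = compressed_size([0]) + compressed_size([1, 2])

print(whole, partitioned)
```

Which partition wins depends on the data; the paper's training procedure is precisely about searching for a good one.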

Subjects: combinatorial optimization; table (database); heuristic (computer science); disjoint sets; travelling salesman problem; permutation; dynamic programming; data structures and algorithms (cs.DS); coding and information theory; ACM classes E.4, F.1.3, F.2.2, G.2.1, H.1.1, H.1.8, H.2.7
Published in: Journal of the ACM

On the Power of Non-adaptive Learning Graphs

2012

We introduce a notion of the quantum query complexity of a certificate structure. This is a formalisation of a well-known observation that many quantum query algorithms only require the knowledge of the disposition of possible certificates in the input string, not the precise values therein. Next, we derive a dual formulation of the complexity of a non-adaptive learning graph, and use it to show that non-adaptive learning graphs are tight for all certificate structures. By this, we mean that there exists a function possessing the certificate structure and such that a learning graph gives an optimal quantum query algorithm for it. For a special case of certificate structures generated by cer…

Subjects: computational complexity (cs.CC); quantum physics (quant-ph); quantum computing; graph theory; string searching algorithms; upper and lower bounds; adaptive learning; theoretical computer science; discrete mathematics
Published in: 2013 IEEE Conference on Computational Complexity

CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration

2017

In this paper, we propose a new framework to remove parts of the systematic errors affecting popular restoration algorithms, with a special focus on image processing tasks. Generalizing ideas that emerged for $\ell_1$ regularization, we develop an approach that re-fits the results of standard methods towards the input data. Total variation regularizations and non-local means are special cases of interest. We identify important covariant information that should be preserved by the re-fitting method, and emphasize the importance of preserving the Jacobian (w.r.t. the observed signal) of the original estimator. Then, we provide an approach that has a "twicing" flavor a…
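For intuition only, the classical "twicing" idea (adding back a smoothed version of the residual) can be sketched with a simple moving-average smoother standing in for a restoration method. CLEAR's actual covariant refitting, which preserves the estimator's Jacobian, is considerably more elaborate than this:

```python
def smooth(y, k=2):
    """Simple moving-average smoother (a stand-in for a restoration method)."""
    n = len(y)
    return [sum(y[max(0, i - k):min(n, i + k + 1)]) /
            len(y[max(0, i - k):min(n, i + k + 1)]) for i in range(n)]

def twicing(y):
    """Twicing: smooth the residual of the first fit and add it back."""
    first = smooth(y)
    residual = [yi - fi for yi, fi in zip(y, first)]
    return [fi + ri for fi, ri in zip(first, smooth(residual))]

y = [0.0, 0.1, 1.2, 0.9, 1.1, 2.2, 1.8, 2.1]   # illustrative noisy signal
refit = twicing(y)
print(refit)
```

The refit pulls the over-smoothed first estimate back towards the data, which is the behaviour the paper's re-fitting framework generalizes.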

Subjects: image restoration; inverse problems; debiasing; regularization; boosting; twicing; refitting; variational methods; covariant transformations; Jacobian matrix; estimators; affine transformations; mathematical optimization; signal and image processing; computer vision and pattern recognition (cs.CV); machine learning (stat.ML); statistics theory (math.ST); MSC 49N45, 65K10, 68U10

Randomized Block Frank–Wolfe for Convergent Large-Scale Learning

2017

Owing to their low-complexity iterations, Frank-Wolfe (FW) solvers are well suited for various large-scale learning tasks. When block-separable constraints are present, randomized block FW (RB-FW) has been shown to further reduce complexity by updating only a fraction of coordinate blocks per iteration. To circumvent the limitations of existing methods, the present work develops step sizes for RB-FW that enable a flexible selection of the number of blocks to update per iteration while ensuring convergence and feasibility of the iterates. To this end, convergence rates of RB-FW are established through computational bounds on a primal sub-optimality measure and on the duality gap. The novel b…
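For background, a minimal plain (non-block) Frank-Wolfe iteration over the probability simplex looks like the sketch below. The objective ||x - b||^2 is an illustrative stand-in, and gamma = 2/(t+2) is the standard step size; the paper's contribution concerns step sizes for the randomized block variant, which is not shown:

```python
def frank_wolfe_simplex(b, iters=200):
    """Minimize ||x - b||^2 over the probability simplex with Frank-Wolfe.
    The linear minimization oracle over the simplex just picks the vertex
    (coordinate) with the most negative gradient entry, so every iterate
    is a convex combination of vertices and stays feasible."""
    n = len(b)
    x = [1.0 / n] * n                            # feasible start
    for t in range(iters):
        grad = [2.0 * (xi - bi) for xi, bi in zip(x, b)]
        s = min(range(n), key=lambda i: grad[i])  # LMO: best vertex e_s
        gamma = 2.0 / (t + 2.0)                   # standard FW step size
        x = [(1 - gamma) * xi for xi in x]        # move towards e_s
        x[s] += gamma
    return x

x = frank_wolfe_simplex([0.5, 0.3, 0.2])
print(x)   # approaches b, which lies on the simplex
```

Feasibility of every iterate without projection is exactly the property that makes FW attractive for the constrained large-scale problems the abstract mentions.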

Subjects: mathematical optimization; duality gap; convergence; stationary points; support vector machines; machine learning (cs.LG); numerical analysis (math.NA); optimization and control (math.OC); signal processing
Published in: IEEE Transactions on Signal Processing

Adaptive independent sticky MCMC algorithms

2018

In this work, we introduce a novel class of adaptive Monte Carlo methods, called adaptive independent sticky MCMC algorithms, for efficient sampling from a generic target probability density function (pdf). The new class of algorithms employs adaptive non-parametric proposal densities which become closer and closer to the target as the number of iterations increases. The proposal pdf is built using interpolation procedures based on a set of support points which is constructed iteratively based on previously drawn samples. The algorithm's efficiency is ensured by a test that controls the evolution of the set of support points. This extra stage controls the computational cost and the converge…
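The accept/reject skeleton that sticky MCMC builds on is ordinary independent Metropolis-Hastings. The sketch below uses a fixed uniform proposal and an illustrative Laplace-like target; the paper's contribution, the adaptively constructed interpolation proposal over a growing set of support points, is deliberately omitted:

```python
import math
import random

random.seed(0)

def target(x):
    """Unnormalized target density (Laplace-like, for illustration only)."""
    return math.exp(-abs(x))

def independent_mh(n=20000, width=5.0):
    """Independent Metropolis-Hastings with a fixed uniform proposal.
    Sticky MCMC instead builds the proposal by interpolation over an
    adaptively grown set of support points, so it tracks the target."""
    x, samples = 0.0, []
    for _ in range(n):
        cand = random.uniform(-width, width)
        # The uniform proposal density cancels in the acceptance ratio.
        if random.random() < target(cand) / target(x):
            x = cand
        samples.append(x)
    return samples

s = independent_mh()
print(sum(s) / len(s))   # sample mean, near 0 for this symmetric target
```

The closer the proposal is to the target, the higher the acceptance rate, which is why the adaptive proposal construction drives the method's efficiency.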

Subjects: adaptive Markov chain Monte Carlo (MCMC); adaptive rejection Metropolis sampling (ARMS); Bayesian inference; Gibbs sampling; Metropolis-within-Gibbs; hit-and-run algorithm; Monte Carlo methods; computational statistics; Gaussian processes; statistical signal processing; computation (stat.CO); machine learning (stat.ML)
Published in: EURASIP Journal on Advances in Signal Processing

Consistent Regression of Biophysical Parameters with Kernel Methods

2020

This paper introduces a novel statistical regression framework that allows the incorporation of consistency constraints. Linear and nonlinear (kernel-based) formulations are introduced, and both admit closed-form analytical solutions. The models exploit all the information from a set of drivers while being maximally independent of a set of auxiliary, protected variables. We successfully illustrate the performance in the estimation of chlorophyll content.
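As background for the linear formulation, ordinary least squares already admits a closed-form solution; in one dimension it reduces to two sample moments. The consistency constraints and the kernel version that the paper adds on top are not shown here:

```python
def linear_fit(xs, ys):
    """Closed-form ordinary least squares for y ~ a*x + b (1-D case):
    slope = sample covariance / sample variance, intercept from the means."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # exactly y = 2x + 1
a, b = linear_fit(xs, ys)
print(a, b)   # -> 2.0 1.0
```

The appeal of such closed forms is that adding linear constraints, as the paper does, keeps the solution analytical instead of requiring iterative solvers.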

Subjects: regression analysis; kernel methods; consistency (statistics); nonlinear systems; data modeling; machine learning (cs.LG, stat.ML); methodology (stat.ME)
Published in: IGARSS 2018 - 2018 IEEE International Geoscience and Remote Sensing Symposium

Bayesian Unification of Gradient and Bandit-based Learning for Accelerated Global Optimisation

2017

Bandit-based optimisation has a remarkable advantage over gradient-based approaches due to its global perspective, which eliminates the danger of getting stuck at local optima. However, for continuous optimisation problems or problems with a large number of actions, bandit-based approaches can be hindered by slow learning. Gradient-based approaches, on the other hand, navigate quickly in high-dimensional continuous spaces through local optimisation, following the gradient in fine-grained steps. Yet, apart from being susceptible to local optima, these schemes are less suited for online learning due to their reliance on extensive trial-and-error before the optimum can be identified. In this…

Subjects: mathematical optimization; Bayesian probability; Gaussian processes; local optima; linear approximation; algorithm design; machine learning (cs.LG); artificial intelligence (cs.AI)

Online shortest paths with confidence intervals for routing in a time varying random network

2018

The increase in the world's population and rising standards of living are leading to an ever-increasing number of vehicles on the roads, and with it ever-increasing difficulties in traffic management. Traffic management in transport networks can be clearly optimized by using information and communication technologies, referred to as Intelligent Transport Systems (ITS). This management problem is usually reformulated as finding the shortest path in a time-varying random graph. In this article, an online shortest path computation using stochastic gradient descent is proposed. This routing algorithm for ITS traffic management is based on the online Frank-Wolfe approach.…
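The static version of the underlying problem is the classical shortest-path computation; a standard Dijkstra sketch over a hypothetical road network shows the baseline that the paper's online, time-varying random formulation generalizes:

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path distances from src over fixed non-negative edge weights.
    The paper's setting replaces the fixed weights with time-varying random
    travel times and updates routes online via stochastic gradients."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Hypothetical road network: node -> [(neighbor, travel time)]
roads = {"A": [("B", 4.0), ("C", 1.0)], "C": [("B", 2.0)], "B": [("D", 1.0)]}
dists = dijkstra(roads, "A")
print(dists)   # A reaches B via C at cost 3.0, D at 4.0
```

When travel times are random and revealed over time, recomputing this from scratch is wasteful, which motivates the online approach in the abstract.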

Subjects: shortest path problem; random graphs; stochastic gradient descent; approximation algorithms; intelligent transportation systems; routing; stochastic processes; data structures and algorithms (cs.DS); optimization and control (math.OC); modeling and simulation; distributed, parallel, and cluster computing

Scalability of using Restricted Boltzmann Machines for Combinatorial Optimization

2014

Estimation of Distribution Algorithms (EDAs) require flexible probability models that can be efficiently learned and sampled. Restricted Boltzmann Machines (RBMs) are generative neural networks with these desired properties. We integrate an RBM into an EDA and evaluate the performance of this system in solving combinatorial optimization problems with a single objective. We assess how the number of fitness evaluations and the CPU time scale with problem size and complexity. The results are compared to the Bayesian Optimization Algorithm (BOA), a state-of-the-art multivariate EDA, and the Dependency Tree Algorithm (DTA), which uses a simpler probability model requiring less computati…
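A minimal EDA with the simplest possible probability model, independent per-bit marginals (UMDA), illustrates the sample/select/re-estimate loop into which the paper plugs an RBM. OneMax and all the parameter values below are illustrative choices, not the paper's benchmark setup:

```python
import random

random.seed(1)

def umda_onemax(n_bits=20, pop=60, elite=20, gens=40):
    """Univariate EDA (UMDA) maximizing OneMax (number of ones).
    The paper replaces this independent-bit model with an RBM; the
    outer loop, sample -> select -> re-estimate the model, is the same."""
    p = [0.5] * n_bits                       # independent Bernoulli marginals
    for _ in range(gens):
        popn = [[1 if random.random() < pi else 0 for pi in p]
                for _ in range(pop)]
        popn.sort(key=sum, reverse=True)     # fitness = number of ones
        best = popn[:elite]
        # Re-estimate marginals from the elite, clamped away from 0 and 1
        # so the model never loses the ability to flip a bit.
        p = [min(0.95, max(0.05, sum(ind[i] for ind in best) / elite))
             for i in range(n_bits)]
    return max(sum(ind) for ind in popn)

best_fitness = umda_onemax()
print(best_fitness)
```

Richer models such as RBMs capture dependencies between bits, which is what the paper's scalability study trades against the extra learning cost.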

Subjects: estimation of distribution algorithms; restricted Boltzmann machines; combinatorial optimization; evolutionary computation; artificial neural networks; scalability; optimization problems; neural and evolutionary computing (cs.NE); ACM classes I.2.6, I.2.8; management science and operations research; modeling and simulation

An LP-based hyperparameter optimization model for language modeling

2018

In order to find hyperparameters for a machine learning model, algorithms such as grid search or random search are used over the space of possible values of the model's hyperparameters. These search algorithms select the solution that minimizes a specific cost function. In language models, perplexity is one of the most popular cost functions. In this study, we propose a fractional nonlinear programming model that finds the optimal perplexity value. The special structure of the model allows us to approximate it by a linear programming model that can be solved using the well-known simplex algorithm. To the best of our knowledge, this is the first attempt to use optimization techniques to find per…
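The baseline the paper improves on is plain grid search. A generic sketch follows, with a hypothetical surrogate cost standing in for validation perplexity (the parameter names `lr` and `dim` are illustrative):

```python
import itertools

def grid_search(cost, grid):
    """Exhaustive grid search: evaluate the cost function at every point
    of the Cartesian product of the per-hyperparameter value lists."""
    best_params, best_cost = None, float("inf")
    for point in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), point))
        c = cost(params)
        if c < best_cost:
            best_params, best_cost = params, c
    return best_params, best_cost

# Hypothetical surrogate cost standing in for validation perplexity.
def cost(p):
    return (p["lr"] - 0.1) ** 2 + (p["dim"] - 256) ** 2 / 1e6

grid = {"lr": [0.01, 0.1, 1.0], "dim": [128, 256, 512]}
best, best_cost = grid_search(cost, grid)
print(best, best_cost)   # -> {'lr': 0.1, 'dim': 256} 0.0
```

Grid search evaluates the cost exhaustively; the paper's LP formulation instead exploits the structure of the perplexity objective to avoid that brute force.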

Subjects: hyperparameter optimization; language models; perplexity; linear programming; nonlinear programming; simplex algorithm; random search; search algorithms; computation and language; machine learning (cs.LG, stat.ML); optimization and control (math.OC)