Search results for "ML"

Showing 10 of 1,465 documents

Does the GATT/WTO promote trade? After all, Rose was right

2019

This paper re-examines the effect of the GATT/WTO on trade using recent econometric developments that allow us to estimate structural gravity equations with the Poisson pseudo-maximum likelihood (PPML) estimator on a large dataset that requires computing high-dimensional fixed effects. By doing so, we overcome computational limitations present in previous studies. In line with Rose’s (Am Econ Rev 94:98–114, 2004) seminal work, we find that, unlike regional trade agreements and currency unions, GATT/WTO accession has not generated positive trade effects. This result is robust to the use of alternative measures of trade flows, across periods and country groups, to changes in the p…
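The PPML estimator named in the abstract fits a multiplicative model by solving the Poisson score equations. A minimal sketch, assuming a single illustrative regressor and synthetic data (the paper's actual setting adds high-dimensional fixed effects, which this toy version omits):

```python
import math
import random

# Hedged sketch: Poisson pseudo-maximum likelihood (PPML) for a
# one-regressor gravity-style model E[y | x] = exp(b0 + b1*x),
# fitted by Newton's method on the pseudo-likelihood score.
# The data and variable names are illustrative, not from the paper.

def ppml_fit(x, y, iters=50):
    # start b0 at log(mean(y)) so the first Newton step is well scaled
    b0, b1 = math.log(sum(y) / len(y)), 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # score of the Poisson pseudo-log-likelihood
        g0 = sum(yi - mi for yi, mi in zip(y, mu))
        g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        # Hessian (negative definite, so Newton climbs the likelihood)
        h00 = -sum(mu)
        h01 = -sum(mi * xi for mi, xi in zip(mu, x))
        h11 = -sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = h00 * h11 - h01 * h01
        b0 -= (h11 * g0 - h01 * g1) / det
        b1 -= (h00 * g1 - h01 * g0) / det
    return b0, b1

random.seed(0)
x = [random.random() for _ in range(500)]
# true model: log E[y] = 1.0 + 2.0*x, with multiplicative log-normal noise
y = [math.exp(1.0 + 2.0 * xi) * random.lognormvariate(0.0, 0.1) for xi in x]
b0, b1 = ppml_fit(x, y)
# b0 and b1 should land close to the true values 1.0 and 2.0
```

Because PPML works on levels rather than logs, zero trade flows pose no problem, which is one reason it is standard for gravity equations.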

Keywords: Trade (Comerç); Regulation (Regulació); Variables; International economics; PPML; Accession; Regional trade; Gravity model of trade; Currency; European integration; Economics; Statistical methods (Mètodes estadístics); General Economics, Econometrics and Finance
researchProduct

Pure Component Pricing in a Duopoly

2002

In this paper, we focus on price competition between several multiproduct firms which produce differentiated systems, each consisting of two complementary products. It is shown that when the components produced are compatible and firms are restricted to pure component pricing (bundling is not allowed), a pure strategy equilibrium may not exist. With the use of bundling strategies, a pure strategy equilibrium always exists. For the pure component pricing case we provide a full characterization of the existence of a pure strategy equilibrium.

Keywords: Competition (economics); Economics and Econometrics; Strategy; Component (UML); Economics; Duopoly; Mathematical economics; Complementary good; The Manchester School

Predictive and Evolutive Cross-Referencing for Web Textual Sources

2017

One of the main challenges in the domain of competitive intelligence is to harness large volumes of information from the web and extract the most valuable pieces of information. As the amount of information available on the web grows rapidly and is very heterogeneous, this process becomes overwhelming for experts. To address this challenge, this paper presents a vision for a novel process that performs cross-referencing at web scale. This process uses a focused crawler and a semantic-based classifier to cross-reference textual items without expert intervention, based on Big Data and Semantic Web technologies. The system is described thoroughly, and interests of…

Keywords: Competitive intelligence; Computer science; Big data; Reasoning; Focused crawler; Discovery; World Wide Web; Knowledge-based systems; Networking and Internet Architecture (cs.NI); Engineering Sciences (physics); Electronics; Leverage (statistics); Semantic Web; Ontology; Work in process; Classification; Adaptive; Cross-Referencing; Classifier (UML); Model

Data-Centric and Multimedia Components

2011

The content of XML documents is often primarily plain text, interspersed with various headers and perhaps some lists and tables. However, there are many applications for which the content of documents is not primarily narrative in nature, but instead includes (portions of) data records that are subject to storage and computational manipulation. The latter documents are sometimes referred to as data-centric or record-like, and they rely extensively on precise descriptions of the forms of data that can appear. In this chapter we first introduce the data type definition capabilities in XML Schema. We then consider the types of data very common in traditional databases: numeric data, dates, and…

Keywords: Complex data type; Multimedia; Computer science; Plain text; Data type; Database-centric architecture; XML Schema; Graphics; XML

On the Online Classification of Data Streams Using Weak Estimators

2016

In this paper, we propose a novel online classifier for complex data streams generated by non-stationary stochastic processes. Instead of using a single training model and counters to keep important data statistics, the introduced online classifier scheme provides a real-time self-adjusting learning model. The learning model utilizes the multiplication-based update algorithm of the Stochastic Learning Weak Estimator (SLWE) at each time instant as a new labeled instance arrives. In this way, the data statistics are updated every time a new element is inserted, without requiring the model to be rebuilt when changes occur in the data distributions. Finally, and most impo…
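The multiplicative update the abstract refers to can be illustrated for the binomial case. A minimal sketch, assuming one common presentation of the SLWE update with learning parameter `lam` (not necessarily the paper's exact scheme):

```python
import random

# Hedged sketch of a multiplicative weak-estimator update in the spirit
# of the Stochastic Learning Weak Estimator (SLWE). The parameter lam
# (0 < lam < 1) controls how fast old observations are forgotten;
# the stream below and the value lam=0.95 are illustrative.

def slwe_update(p, x, lam=0.95):
    # p: current estimate of P(x == 1); shrink toward 0 or stretch
    # toward 1 multiplicatively, so old data decays geometrically
    if x == 1:
        return 1.0 - lam * (1.0 - p)
    return lam * p

random.seed(1)
p = 0.5
# non-stationary stream: P(1) = 0.8 for 2000 steps, then drops to 0.2
for t in range(4000):
    s = 0.8 if t < 2000 else 0.2
    x = 1 if random.random() < s else 0
    p = slwe_update(p, x)
# after the switch, p should track the new probability 0.2
```

In expectation the estimate converges to the true symbol probability, while the geometric forgetting lets it re-adapt after a distribution change without rebuilding any model, which is the behavior the abstract describes.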

Keywords: Complex data type; Training set; Learning automata; Computer science; Data stream mining; Estimator; Machine learning; Data mining; Artificial intelligence; Classifier (UML); Juncture

Approximation of functions over manifolds : A Moving Least-Squares approach

2021

We present an algorithm for approximating a function defined over a $d$-dimensional manifold utilizing only noisy function values at locations sampled from the manifold with noise. To produce the approximation we do not require any knowledge regarding the manifold other than its dimension $d$. We use the Manifold Moving Least-Squares approach of (Sober and Levin 2016) to reconstruct the atlas of charts and the approximation is built on top of those charts. The resulting approximant is shown to be a function defined over a neighborhood of a manifold, approximating the originally sampled manifold. In other words, given a new point, located near the manifold, the approximation can be evaluated…
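The moving least-squares idea underlying the paper is easiest to see in one dimension: at each query point, fit a low-degree polynomial by least squares with weights that decay with distance. A minimal 1-D sketch, assuming a Gaussian weight and a local linear fit (the paper generalizes this to charts over a $d$-dimensional manifold):

```python
import math

# Hedged sketch: 1-D moving least-squares (MLS) with a Gaussian weight,
# fitting a local linear polynomial a + b*(x - q) at each query point q.
# The bandwidth h and the sample data are illustrative.

def mls(xs, ys, q, h=0.3):
    w = [math.exp(-((xi - q) / h) ** 2) for xi in xs]
    # weighted normal equations for a + b*t, with t = x - q;
    # the fitted value at q is the intercept a
    sw = sum(w)
    sx = sum(wi * (xi - q) for wi, xi in zip(w, xs))
    sxx = sum(wi * (xi - q) ** 2 for wi, xi in zip(w, xs))
    sy = sum(wi * yi for wi, yi in zip(w, ys))
    sxy = sum(wi * (xi - q) * yi for wi, xi, yi in zip(w, xs, ys))
    det = sw * sxx - sx * sx
    return (sxx * sy - sx * sxy) / det

xs = [i / 20 for i in range(21)]
ys = [math.sin(x) for x in xs]
val = mls(xs, ys, 0.5)
# val should be close to sin(0.5) ≈ 0.479
```

Because the fit is recomputed at every query point, the approximant is smooth and can be evaluated at new points near the data, which is the out-of-sample behavior the abstract highlights.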

Keywords: Computational Geometry (cs.CG); Machine Learning (cs.LG); Machine Learning (stat.ML); Graphics (cs.GR); Closed manifold; dimension reduction; Topology; Volume form; manifold learning; Applied mathematics; Complex dimension; Manifold alignment; Atlas (topology); high-dimensional approximation; Manifold; Statistical manifold; regression over manifolds; out-of-sample extension; moving least-squares; Center manifold; Computational Mathematics; numerical analysis (numeerinen analyysi); manifolds (monistot); approximation (approksimointi); functions (funktiot)

Computational Modelling of Public Policy: Reflections on Practice

2018

Computational models are increasingly being used to assist in developing, implementing and evaluating public policy. This paper reports on the experience of the authors in designing and using computational models of public policy (‘policy models’, for short). The paper considers the role of computational models in policy making, and some of the challenges that need to be overcome if policy models are to make an effective contribution. It suggests that policy models can have an important place in the policy process because they could allow policy makers to experiment in a virtual world, and have many advantages compared with randomised control trials and policy pilots. The paper then summari…

Keywords: Computational model; Calibration and validation; Process (engineering); Computer science; Control (management); General Social Sciences; Public policy; Domain (software engineering); Risk analysis (engineering); Component (UML); Computer Science (miscellaneous); Abstraction (linguistics)

Dual-model approach for safety-critical embedded systems

2020

The paper presents the design of digital controllers based on two models: the Petri net model and the UML state machine. These two approaches differ in many aspects of the design flow, such as conceptual modelling, analysis, and synthesis. Each approach can be used individually to design an efficient logic controller, and such solutions are well known, but their interoperability can contribute to a much better understanding of logic controller design and validation. This is especially important for safety- or life-critical embedded systems; moreover, a dual-model controller design can constitute a redundant system, increasing its reliability.

Keywords: Computer Networks and Communications; Dual model; Computer science; Reliability (computer networking); Interoperability; Design flow; Petri net; UML state machine; Artificial Intelligence; Hardware and Architecture; Control theory; Embedded system; Software; Microprocessors and Microsystems

Average Performance Analysis of the Stochastic Gradient Method for Online PCA

2019

This paper studies the complexity of the stochastic gradient algorithm for PCA when the data are observed in a streaming setting. We also propose an online approach for selecting the learning rate. Simulation experiments confirm the practical relevance of the plain stochastic gradient approach and show that drastic improvements can be achieved by learning the learning rate.
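A standard form of the stochastic gradient method for streaming PCA is Oja's update, which nudges a weight vector toward the leading principal component one sample at a time. A minimal sketch, assuming a fixed learning rate `eta` and a toy 2-D stream (the paper's contribution concerns the analysis and the online choice of the rate, not this basic recursion):

```python
import math
import random

# Hedged sketch: Oja-style stochastic gradient step for the leading
# principal component of a data stream. The step size eta and the
# synthetic 2-D source are illustrative.

def oja_step(w, x, eta):
    y = sum(wi * xi for wi, xi in zip(w, x))          # projection onto w
    w = [wi + eta * y * (xi - y * wi) for wi, xi in zip(w, x)]
    norm = math.sqrt(sum(wi * wi for wi in w))        # keep w unit length
    return [wi / norm for wi in w]

random.seed(2)
w = [1.0, 0.0]
for t in range(5000):
    # stream whose dominant variance lies along (1, 1)/sqrt(2)
    z = random.gauss(0.0, 2.0)
    x = [z / math.sqrt(2) + random.gauss(0.0, 0.3),
         z / math.sqrt(2) + random.gauss(0.0, 0.3)]
    w = oja_step(w, x, eta=0.01)
# w should align with the top eigenvector (1, 1)/sqrt(2)
```

Each step costs O(d) and needs only the current sample, which is what makes the stochastic gradient approach attractive in the streaming setting the abstract describes.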

Keywords: Machine Learning (stat.ML); Computer science; Relevance (information retrieval); Stochastic gradient method; Algorithm

Extension of luminance component based demosaicking algorithm to 4- and 5-band multispectral images

2021

Multispectral imaging systems are currently expanding, with a variety of multispectral demosaicking algorithms. These algorithms are limited, however, by the notable presence of artifacts in the reconstructed image. In this paper, we propose a powerful multispectral image demosaicking method that focuses on the G band and the luminance component. We first identified a relevant 4- and 5-band multispectral filter array (MSFA) with a dominant G band, and then proposed an algorithm that consistently estimates the missing G values and other missing components using a convolution operator and a weighted bilinear interpolation algorithm based on the luminance component. Using the cons…
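The weighted bilinear interpolation mentioned in the abstract can be sketched for a single missing G sample: average the four axial neighbours, but weight each direction by the inverse of its local gradient so that interpolation follows edges rather than crossing them. The weighting rule below is an illustrative choice, not the paper's exact operator:

```python
# Hedged sketch: filling one missing G value by gradient-weighted
# bilinear interpolation from the four axial neighbours. The inverse-
# gradient weights are illustrative, not the paper's exact scheme.

def interp_g(img, r, c):
    # known G samples above, below, left of, and right of (r, c)
    neigh = [img[r - 1][c], img[r + 1][c], img[r][c - 1], img[r][c + 1]]
    v_grad = abs(img[r - 1][c] - img[r + 1][c]) + 1.0   # vertical gradient
    h_grad = abs(img[r][c - 1] - img[r][c + 1]) + 1.0   # horizontal gradient
    # flatter directions get larger weights, so edges are preserved
    w = [1.0 / v_grad, 1.0 / v_grad, 1.0 / h_grad, 1.0 / h_grad]
    return sum(wi * ni for wi, ni in zip(w, neigh)) / sum(w)

# vertical edge: top/bottom neighbours agree, left/right differ sharply
img = [[0, 10, 0],
       [10, 0, 90],
       [0, 10, 0]]
g = interp_g(img, 1, 1)
# g ≈ 10.5, pulled toward the consistent vertical neighbours;
# a plain unweighted average of the four would give 30
```

Plain bilinear interpolation blurs across edges, producing exactly the reconstruction artifacts the abstract complains about; the gradient weighting is the standard remedy.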

Keywords: Demosaicking; Computer science; Multispectral image; Bilinear interpolation; Extension (predicate logic); Filter (signal processing); Multispectral filter array; Luminance; Convolution; G band; Component (UML); Weighted bilinear interpolation; Luminance component; Algorithm; Array