Search results for "Probabilistic"

Showing 10 of 380 documents

Remote Sensing Image Classification with Large Scale Gaussian Processes

2017

Current remote sensing image classification problems have to deal with an unprecedented amount of heterogeneous and complex data sources. Upcoming missions will soon provide large data streams that will make land cover/use classification difficult. Machine learning classifiers can help here, and many methods are currently available. A popular kernel classifier is the Gaussian process classifier (GPC), since it approaches the classification problem with a solid probabilistic treatment, yielding confidence intervals for the predictions as well as results that are very competitive with state-of-the-art neural networks and support vector machines. However, its computational cost is prohibitive for…
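
The confidence estimates the abstract highlights can be illustrated with an off-the-shelf GP classifier. This is a minimal sketch, not the paper's large-scale method: the two synthetic clusters standing in for pixel spectra, and the RBF kernel, are illustrative assumptions.

```python
# Minimal sketch (not the paper's method): Gaussian process classification
# with scikit-learn. The synthetic 2-D "spectral" data below is an assumption.
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# Two well-separated synthetic clusters standing in for land-cover classes.
X = np.vstack([rng.normal(0.0, 0.5, (40, 2)), rng.normal(2.0, 0.5, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0)
gpc.fit(X, y)

# Unlike a bare decision rule, predict_proba returns class probabilities,
# which can be read as per-pixel confidence estimates.
proba = gpc.predict_proba(np.array([[0.0, 0.0], [2.0, 2.0]]))
```

The probabilistic output is the point of the GPC here: each prediction carries a class probability rather than only a label.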

Subjects: FOS: Computer and information sciences; meteorology & atmospheric sciences; Computer science; Multispectral image; other engineering and technologies; Machine Learning (stat.ML); engineering and technology; Land cover; natural sciences; Statistics - Applications; Machine Learning (cs.LG); Kernel (linear algebra); Bayes' theorem; Statistics - Machine Learning; Applications (stat.AP); Electrical and Electronic Engineering; Gaussian process; geological & geomatics engineering; earth and related environmental sciences; Remote sensing; Contextual image classification; Artificial neural network; Data stream mining; Probabilistic logic; Support vector machine; Computer Science - Learning; Kernel (image processing); General Earth and Planetary Sciences
researchProduct

On the use of approximate Bayesian computation Markov chain Monte Carlo with inflated tolerance and post-correction

2020

Approximate Bayesian computation allows for inference of complicated probabilistic models with intractable likelihoods using model simulations. The Markov chain Monte Carlo implementation of approximate Bayesian computation is often sensitive to the tolerance parameter: a low tolerance leads to poor mixing, while a large tolerance entails excess bias. We consider an approach that uses a relatively large tolerance for the Markov chain Monte Carlo sampler to ensure sufficient mixing, and then post-processes the output to obtain estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators, and propose an adaptive approximate Bayesi…
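
The large-tolerance-plus-post-correction idea can be sketched in a few lines. This is a hedged toy example, not the paper's algorithm: the model (a normal mean with a nearly flat normal prior), the tolerances, and the random-walk proposal are all illustrative assumptions.

```python
# Toy ABC-MCMC with a deliberately inflated tolerance, followed by
# post-correction: estimates for a finer tolerance are recomputed from the
# recorded simulation distances of the same chain.
import numpy as np

rng = np.random.default_rng(1)
y_obs = 1.0                      # observed summary statistic
eps_big = 2.0                    # inflated tolerance keeps the chain mixing

def prior_logpdf(theta):         # N(0, 10^2) prior, nearly flat here
    return -0.5 * (theta / 10.0) ** 2

theta, dist = 0.0, abs(rng.normal(0.0, 1.0) - y_obs)
samples, dists = [], []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 1.0)        # random-walk proposal
    d = abs(rng.normal(prop, 1.0) - y_obs)     # distance of simulated data
    log_a = prior_logpdf(prop) - prior_logpdf(theta)
    if d < eps_big and np.log(rng.uniform()) < log_a:
        theta, dist = prop, d                  # accept proposal
    samples.append(theta)
    dists.append(dist)

samples, dists = np.array(samples), np.array(dists)
# Post-correction: condition the recorded output on a finer tolerance.
eps_fine = 0.5
est_fine = samples[dists < eps_fine].mean()
```

The expensive sampling happens once at the well-mixing tolerance; the recorded distances then yield estimators for any finer tolerance.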

Subjects: FOS: Computer and information sciences; basic medicine; Statistics and Probability; tolerance choice; General Mathematics; Markov chains; Inference; natural sciences; Statistics - Computation; approximate Bayesian computation; statistics & probability; medical and health sciences; Mixing (mathematics); adaptive algorithm; algorithms; mathematics; Computation (stat.CO); Mathematics; Markov chain; Bayesian methods; Applied Mathematics; Probabilistic logic; Estimator; Markov chain Monte Carlo; Agricultural and Biological Sciences (miscellaneous); Monte Carlo methods; importance sampling; developmental biology; confidence interval; Statistics, Probability and Uncertainty; General Agricultural and Biological Sciences; Algorithm

A probabilistic estimation and prediction technique for dynamic continuous social science models: The evolution of the attitude of the Basque Country…

2015

In this paper, a computational technique to deal with uncertainty in dynamic continuous models in Social Sciences is presented. Considering data from surveys, the method consists of determining the probability distribution of the survey output; this allows us to sample data and fit the model to the sampled data using a goodness-of-fit criterion based on the χ2-test. Taking the fitted parameters that were not rejected by the χ2-test, substituting them into the model, and computing their outputs, 95% confidence intervals capturing the uncertainty of the survey data at each time instant (probabilistic estimation) are built. Using the same set of obtained model parameters, a prediction over …

Subjects: FOS: Computer and information sciences; Attitude dynamics; Probabilistic prediction; Computer science; Population; Divergence-from-randomness model; Sample (statistics); Machine Learning (cs.LG); Probabilistic estimation; Social science; education; Probabilistic relevance model; Applied Mathematics; Probabilistic logic; Confidence interval; Computer Science - Learning; Computational Mathematics; Social dynamic models; Probability distribution; Survey data collection; Data mining; Applied Mathematics and Computation

Probabilistic entailment in the setting of coherence: The role of quasi conjunction and inclusion relation

2013

In this paper, adopting a coherence-based probabilistic approach to default reasoning, we focus on the logical operation of quasi conjunction and the Goodman-Nguyen inclusion relation for conditional events. We recall that quasi conjunction is a basic notion for defining consistency of conditional knowledge bases. Deepening results given in a previous paper, we show that, given any finite family of conditional events F and any nonempty subset S of F, the family F p-entails the quasi conjunction C(S); then, given any conditional event E|H, we analyze the equivalence between p-entailment of E|H from F and p-entailment of E|H from C(S), where S is some nonempty subset of F.…

Subjects: FOS: Computer and information sciences; Class (set theory); Goodman-Nguyen inclusion relation; QAND rule; Settore MAT/06 - Probabilità e Statistica Matematica; Computer Science - Artificial Intelligence; Statistics Theory (math.ST); Logical consequence; Theoretical Computer Science; Artificial Intelligence; Quasi conjunction; FOS: Mathematics; Equivalence (measure theory); Mathematics; Event (probability theory); Discrete mathematics; Settore INF/01 - Informatica; Applied Mathematics; Probability (math.PR); coherence; probabilistic default reasoning; p-entailment; Probabilistic logic; Coherence (statistics); Conjunction (grammar); Greatest element; Artificial Intelligence (cs.AI); Software

Uncommon Suffix Tries

2011

Common assumptions on the source producing the words inserted in a suffix trie with $n$ leaves lead to a $\log n$ height and saturation level. We provide an example of a suffix trie whose height increases faster than a power of $n$ and another one whose saturation level is negligible with respect to $\log n$. Both are built from VLMC (Variable Length Markov Chain) probabilistic sources; they are easily extended to families of sources having the same properties. The first example corresponds to a "logarithmic infinite comb" and enjoys non-uniform polynomial mixing. The second one corresponds to a "factorial infinite comb" for which mixing is uniform and exponential.
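
For concreteness, a suffix trie and its height (the quantity the abstract studies) can be computed in a few lines. This sketch inserts the suffixes of a fixed English word rather than words drawn from the VLMC sources studied in the paper.

```python
# Build the suffix trie of a word as nested dicts and measure its height,
# i.e. the number of edges on the longest root-to-leaf path.
def suffix_trie(word):
    root = {}
    for i in range(len(word)):        # insert every suffix word[i:]
        node = root
        for ch in word[i:]:
            node = node.setdefault(ch, {})
    return root

def height(node):
    return 0 if not node else 1 + max(height(child) for child in node.values())

trie = suffix_trie("abracadabra")
h = height(trie)   # the longest suffix is the word itself, 11 characters
```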

Subjects: FOS: Computer and information sciences; Compressed suffix array; Polynomial; Logarithm; General Mathematics; Suffix tree; variable length Markov chain; Computer Science [cs]/Data Structures and Algorithms [cs.DS]; Generalized suffix tree; probabilistic source; computer and information sciences; engineering and technology; suffix trie; natural sciences; Combinatorics; Computer Science - Data Structures and Algorithms; Trie; FOS: Mathematics; Data Structures and Algorithms (cs.DS); Mixing (physics); Mathematics; Discrete mathematics; Applied Mathematics; Probability (math.PR); networking & telecommunications; Computer Graphics and Computer-Aided Design; Mathematics [math]/Probability [math.PR]; computation theory & mathematics; mixing properties; 60J05; 37E05; Suffix; Mathematics - Probability; Software

Simplifying Probabilistic Expressions in Causal Inference

2018

Obtaining a non-parametric expression for an interventional distribution is one of the most fundamental tasks in causal inference. Such an expression can be obtained for an identifiable causal effect by an algorithm or by manual application of do-calculus. Often we are left with a complicated expression which can lead to biased or inefficient estimates when missing data or measurement errors are involved. We present an automatic simplification algorithm that seeks to eliminate symbolically unnecessary variables from these expressions by taking advantage of the structure of the underlying graphical model. Our method is applicable to all causal effect formulas and is readily available in the …

Subjects: FOS: Computer and information sciences; Computer Science - Artificial Intelligence; graph theory; simplicity; simplification; graphical model; Machine Learning (stat.ML); Machine Learning (cs.LG); Computer Science - Learning; probabilistic expression; Artificial Intelligence (cs.AI); Statistics - Machine Learning; causality; graphical symbols; causal inference; graphs

Probabilistic and team PFIN-type learning: General properties

2008

We consider the probability hierarchy for Popperian FINite learning and study the general properties of this hierarchy. We prove that the probability hierarchy is decidable, i.e. there exists an algorithm that receives p_1 and p_2 and answers whether PFIN-type learning with probability of success p_1 is equivalent to PFIN-type learning with probability of success p_2. To prove our result, we analyze the topological structure of the probability hierarchy. We prove that it is well-ordered under the descending ordering and order-equivalent to the ordinal epsilon_0. This shows that the structure of the hierarchy is very complicated. Using similar methods, we also prove that, for PFIN-type learning…

Subjects: FOS: Computer and information sciences; Computer Science::Machine Learning; Theoretical computer science; Computer Networks and Communications; Existential quantification; Structure (category theory); Decidability; Type (model theory); Learning in the limit; Theoretical Computer Science; Machine Learning (cs.LG); Probability of success; Finite limits; Mathematics; Ordinals; Discrete mathematics; Hierarchy; Applied Mathematics; Algorithmic learning theory; Probabilistic logic; F.1.1; I.2.6; Inductive inference; Inductive reasoning; Computer Science - Learning; Team learning; Computational Theory and Mathematics; Artificial intelligence; Journal of Computer and System Sciences

Optimal one-shot quantum algorithm for EQUALITY and AND

2017

We study the computational complexity of Boolean functions in the quantum black box model. In this model our task is to compute a function $f:\{0,1\}^n\to\{0,1\}$ on an input $x\in\{0,1\}^n$ that can be accessed by querying the black box. Quantum algorithms are inherently probabilistic; we are interested in the lowest possible probability that the algorithm outputs an incorrect answer (the error probability) for a fixed number of queries. We show that the lowest possible error probability for $AND_n$ and $EQUALITY_{n+1}$ is $1/2-n/(n^2+1)$.

Subjects: FOS: Computer and information sciences; Discrete mathematics; One shot; Quantum Physics; General Computer Science; Probabilistic logic; FOS: Physical sciences; Function (mathematics); Computational Complexity (cs.CC); Computer Science - Computational Complexity; Probability of error; Computational complexity; Quantum algorithm; Quantum Physics (quant-ph); Boolean function; Quantum; Mathematics

Quantum, stochastic, and pseudo stochastic languages with few states

2014

Stochastic languages are the languages recognized by probabilistic finite automata (PFAs) with cutpoint over the field of real numbers. More general computational models over the same field, such as generalized finite automata (GFAs) and quantum finite automata (QFAs), define the same class. In 1963, Rabin proved the set of stochastic languages to be uncountable by presenting a single 2-state PFA over the binary alphabet that recognizes uncountably many languages depending on the cutpoint. In this paper, we show the same result for unary stochastic languages. Namely, we exhibit a 2-state unary GFA, a 2-state unary QFA, and a family of 3-state unary PFAs recognizing uncountably many languages; all th…
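
Recognition with a cutpoint can be made concrete with Rabin's classic 2-state binary construction (used here purely as an illustration; the paper's unary automata are different). With the transition matrices below, the acceptance probability of a word equals its reversed binary expansion, so distinct cutpoints carve out distinct languages.

```python
# A 2-state PFA over the binary alphabet: one row-stochastic transition
# matrix per symbol, an initial distribution, and an accepting state.
import numpy as np

M = {
    "0": np.array([[1.0, 0.0], [0.5, 0.5]]),
    "1": np.array([[0.5, 0.5], [0.0, 1.0]]),
}
start = np.array([1.0, 0.0])   # start in state 0
accept = np.array([0.0, 1.0])  # state 1 is accepting

def accept_prob(word):
    dist = start
    for symbol in word:        # push the distribution through each symbol
        dist = dist @ M[symbol]
    return float(dist @ accept)

def in_language(word, cutpoint):
    # the cutpoint language: words accepted with probability above cutpoint
    return accept_prob(word) > cutpoint
```

For example, `accept_prob("10")` is 0.25 (the reversed word "01" read as the binary fraction 0.01), so "10" is in the language for cutpoint 0.2 but not for cutpoint 0.6.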

Subjects: FOS: Computer and information sciences; finite automata; Class (set theory); Unary operation; Formal Languages and Automata Theory (cs.FL); quantum finite automata; computational model; binary alphabets; FOS: Physical sciences; Computer Science - Formal Languages and Automata Theory; probabilistic finite automata; real number; unary languages; cut-point; Mathematics; Discrete mathematics; Quantum Physics; Finite-state machine; generalized finite automata; stochastic systems; Automaton; stochastic languages; Probabilistic automaton; quantum theory; Uncountable set; Quantum Physics (quant-ph)

Modeling Networks of Probabilistic Memristors in SPICE

2021

Efficient simulation of stochastic memristors and their networks requires novel modeling approaches. Utilizing a master equation to find occupation probabilities of network states is a recent major departure from typical memristor modeling [Chaos, Solitons & Fractals 142, 110385 (2021)]. In the present article we show how to implement such master equations in SPICE – a general-purpose circuit simulation program. In the case studies we simulate the dynamics of ac-driven probabilistic binary and multi-state memristors, and dc-driven networks of probabilistic binary and multi-state memristors. Our SPICE results are in perfect agreement with known analytical solutions. Examples of LTspice code are…
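
The master-equation approach can be previewed outside SPICE. This is a hedged sketch for a single two-state (binary) probabilistic memristor with made-up switching rates, not the article's SPICE code; the article maps equations of this form onto circuit elements.

```python
# Forward-Euler integration of the master equation for a binary memristor:
#   dP1/dt = k01*P0 - k10*P1,  with P0 + P1 = 1.
# The switching rates k01 (off->on) and k10 (on->off) are hypothetical.
import numpy as np

k01, k10 = 2.0, 1.0          # assumed switching rates (1/s)
dt, steps = 1e-3, 10000      # integrate up to t = 10 s
P = np.array([1.0, 0.0])     # start fully in the "off" state

for _ in range(steps):
    dP1 = k01 * P[0] - k10 * P[1]
    P = P + dt * np.array([-dP1, dP1])   # probability is conserved exactly

# The occupation probabilities relax toward the analytical steady state
# P1 = k01 / (k01 + k10), which is what a correct SPICE mapping must match.
```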

Subjects: FOS: Computer and information sciences; Hardware_MEMORYSTRUCTURES; Condensed Matter - Mesoscale and Nanoscale Physics; FOS: Physical sciences; Computer Science - Emerging Technologies; Emerging Technologies (cs.ET); memristors; SPICE; networks; Mesoscale and Nanoscale Physics (cond-mat.mes-hall); Electrical engineering. Electronics. Nuclear engineering (LCSH TK1-9971); probabilistic computing; Radioengineering