Search results for "Probabilistic"
Showing 10 of 380 documents
Context Trees, Variable Length Markov Chains and Dynamical Sources
2012
Infinite random sequences of letters can be viewed as stochastic chains or as strings produced by a source, in the sense of information theory. The relationship between Variable Length Markov Chains (VLMC) and probabilistic dynamical sources is studied. We establish a probabilistic frame for context trees and VLMC and we prove that any VLMC is a dynamical source for which we explicitly build the mapping. On two examples, the "comb" and the "bamboo blossom", we find a necessary and sufficient condition for the existence and the uniqueness of a stationary probability measure for the VLMC. These two examples are detailed in order to provide the associated Dirichlet series as well as the genera…
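To make the VLMC mechanism concrete, here is a minimal simulation sketch, assuming a toy complete context tree rather than the paper's "comb" or "bamboo blossom"; the names CONTEXT_TREE, context_of and simulate_vlmc are hypothetical and only illustrate the general principle that the law of the next letter depends on the longest matching suffix (the context) of the past.

```python
import random

# Toy complete context tree over the alphabet {"0", "1"} (illustration only,
# not the paper's construction). Each key is a context, read as a suffix of
# the past sequence; each value is the law of the next letter given that context.
CONTEXT_TREE = {
    "1":  {"0": 0.3, "1": 0.7},   # last letter was 1
    "00": {"0": 0.9, "1": 0.1},   # last two letters were 0, 0
    "10": {"0": 0.6, "1": 0.4},   # last two letters were 1, 0
}
MAX_CONTEXT_LEN = max(len(c) for c in CONTEXT_TREE)

def context_of(past: str) -> str:
    """Longest suffix of `past` that is a leaf of the context tree."""
    for length in range(min(len(past), MAX_CONTEXT_LEN), 0, -1):
        if past[-length:] in CONTEXT_TREE:
            return past[-length:]
    raise ValueError(f"no context matches the past {past!r}")

def simulate_vlmc(past: str, n_steps: int) -> str:
    """Extend `past` by `n_steps` letters drawn according to the VLMC."""
    for _ in range(n_steps):
        law = CONTEXT_TREE[context_of(past)]
        letters, probs = zip(*law.items())
        past += random.choices(letters, weights=probs)[0]
    return past

print(simulate_vlmc("10", 30))
```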
Error-Free Affine, Unitary, and Probabilistic OBDDs
2018
We introduce the affine OBDD model and show that zero-error affine OBDDs can be exponentially narrower than bounded-error unitary and probabilistic OBDDs on certain problems. Moreover, we show that Las Vegas unitary and probabilistic OBDDs can be quadratically narrower than deterministic OBDDs. We also obtain the same results for the automata versions of these models.
A probabilistic meaning of certain quasinormal subgroups
2007
The role of cyclic quasinormal subgroups in both finite and infinite groups has recently been described by S. Stonehewer and G. Zacher. This role can be analyzed more closely in the class of compact groups, where it yields restrictions on the probability that two randomly chosen elements commute. Mathematics Subject Classification: 20D60, 20P05, 20D08
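For orientation, a classical reference point (not necessarily the paper's own bound): the commuting probability of a finite group $G$ is $d(G) = |\{(x,y) \in G \times G : xy = yx\}| / |G|^2$, and Gustafson's bound gives $d(G) \le 5/8$ whenever $G$ is non-abelian; in the compact setting the pair $(x,y)$ is presumably drawn with respect to normalized Haar measure.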
Probability Propagation in Selected Aristotelian Syllogisms
2019
This paper continues our work on a coherence-based probability semantics for Aristotelian syllogisms (Gilio, Pfeifer, and Sanfilippo, 2016; Pfeifer and Sanfilippo, 2018) by studying Figure III under coherence. We interpret the syllogistic sentence types by suitable conditional probability assessments. Since the probabilistic inference of $P|S$ from the premise set $\{P|M, S|M\}$ is not informative, we add $p(M|(S \vee M))>0$ as a probabilistic constraint (i.e., an "existential import assumption") to obtain probabilistic informativeness. We show how to propagate the assigned premise probabilities to the conclusion. Thereby, we give a probabilistic meaning to all syllogisms of Figure III. We…
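To spell out the non-informativeness claim in the standard coherence sense (a restatement, not the paper's propagation rule): if only $p(P|M)=x$ and $p(S|M)=y$ are assessed, every value $z \in [0,1]$ for $p(P|S)$ is coherent, so the premises alone put no constraint on the conclusion; adding the existential import assumption $p(M|(S \vee M))>0$ is what allows a nontrivial interval for $p(P|S)$ to be propagated.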
Conditional Random Quantities and Compounds of Conditionals
2013
In this paper we consider finite conditional random quantities and conditional prevision assessments in the setting of coherence. We use a suitable representation for conditional random quantities; in particular the indicator of a conditional event $E|H$ is regarded as a three-valued quantity with values 1, or 0, or $p$, where $p$ is the probability of $E|H$. We introduce a notion of iterated conditional random quantity of the form $(X|H)|K$ defined as a suitable conditional random quantity, which coincides with $X|HK$ when $H \subseteq K$. Based on a recent paper by S. Kaufmann, we introduce a notion of conjunction of two conditional events and then we analyze it in the setting of cohere…
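Restating the three-valued indicator above as a single formula (identifying events with their indicators, with $\bar{H}$ the negation of $H$): $X_{E|H} = 1 \cdot EH + 0 \cdot \bar{E}H + p \cdot \bar{H} = EH + p\,\bar{H}$, where $p = P(E|H)$.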
Compound conditionals, Fréchet-Hoeffding bounds, and Frank t-norms
2021
In this paper we consider compound conditionals, Fréchet-Hoeffding bounds and the probabilistic interpretation of Frank t-norms. By studying the solvability of suitable linear systems, we show under logical independence the sharpness of the Fréchet-Hoeffding bounds for the prevision of conjunctions and disjunctions of n conditional events. In addition, we illustrate some details in the case of three conditional events. We study the set of all coherent prevision assessments on a family containing n conditional events and their conjunction, by verifying that it is convex. We discuss the case where the prevision of conjunctions is assessed by Łukasiewicz t-norms and we give explicit s…
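For comparison, in the classical unconditional case with two logically independent events the Fréchet-Hoeffding bounds read $\max(0, x+y-1) \le P(A \wedge B) \le \min(x, y)$ for $x = P(A)$, $y = P(B)$, and sharpness means every value in this interval is attained by some coherent assessment; the paper establishes the analogous sharpness for previsions of conjunctions and disjunctions of $n$ conditional events.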
Generalized probabilistic modus ponens
2017
Modus ponens (from A and “if A then C” infer C) is one of the most basic inference rules. The probabilistic modus ponens allows for managing uncertainty by transmitting assigned uncertainties from the premises to the conclusion (i.e., from P(A) and P(C|A) infer P(C)). In this paper, we generalize the probabilistic modus ponens by replacing A by the conditional event A|H. The resulting inference rule involves iterated conditionals (formalized by conditional random quantities) and propagates previsions from the premises to the conclusion. Interestingly, the propagation rules for the lower and the upper bounds on the conclusion of the generalized probabilistic modus ponens coincide with the re…
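For the non-generalized rule, the coherent propagation is the classical one (a standard fact recalled for orientation, not a result specific to this paper): from $P(A)=a$ and $P(C|A)=b$, coherence forces $ab \le P(C) \le ab + 1 - a$, and these bounds are sharp; the generalization replaces $A$ by the conditional event $A|H$, so premises and conclusion become previsions of iterated conditionals.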
Error-Free Affine, Unitary, and Probabilistic OBDDs
2021
We introduce the affine OBDD model and show that zero-error affine OBDDs can be exponentially narrower than bounded-error unitary and probabilistic OBDDs on certain problems. Moreover, we show that Las Vegas unitary and probabilistic OBDDs can be quadratically narrower than deterministic OBDDs. We also obtain the same results for the automata counterparts of these models.
Probabilities to Accept Languages by Quantum Finite Automata
1999
We construct a hierarchy of regular languages such that the current language in the hierarchy can be accepted by 1-way quantum finite automata with a probability smaller than the corresponding probability for the preceding language in the hierarchy. These probabilities converge to 1/2.
Finite State Transducers with Intuition
2010
Finite automata that take advice have been studied from the point of view of how much advice is needed to recognize nonregular languages. It turns out that there are at least two different types of advice. In this paper we concentrate on cases where the given advice contains zero information about the input word and the language to be recognized. Nonetheless some nonregular languages can be recognized in this way. The help-word is merely a sufficiently long word with nearly maximum Kolmogorov complexity. Moreover, any sufficiently long word with nearly maximum Kolmogorov complexity can serve as a help-word. Finite automata with such help can recognize languages not recognizable …