Search results for "Reason."
Showing 10 of 462 documents
Inductive Inference with Procrastination: Back to Definitions
1999
In this paper, we reconsider the definition of procrastinating learning machines. In the original definition of Freivalds and Smith [FS93], constructive ordinals are used to bound mindchanges. We investigate the possibility of using arbitrary linearly ordered sets to bound mindchanges in a similar way. It turns out that using certain ordered sets it is possible to define inductive inference types different from the previously known ones. We investigate properties of the new inductive inference types and compare them to other types.
Derived sets and inductive inference
1994
The paper deals with using topological concepts in studies of the Gold paradigm of inductive inference, namely accumulation points, derived sets of order α (where α is a constructive ordinal), and compactness. Identifiability of a class U of total recursive functions with a bound α on the number of mindchanges implies \(U^{(\alpha + 1)} = \emptyset\). This makes it possible to construct counter-examples, namely recursively enumerable classes of functions witnessing the proper inclusion between identification types: \(EX_{\alpha} \subset EX_{\alpha+1}\).
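For orientation, the derived-set hierarchy referred to here is the standard one (restated for convenience, not quoted from the abstract): \(U^{(0)} = U\), \(U^{(\alpha+1)} = (U^{(\alpha)})'\), the set of accumulation points of \(U^{(\alpha)}\), and \(U^{(\lambda)} = \bigcap_{\alpha < \lambda} U^{(\alpha)}\) for limit ordinals \(\lambda\). The quoted result then says that identifiability with at most \(\alpha\) mindchanges forces this hierarchy to reach the empty set by stage \(\alpha + 1\).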
Enumerable classes of total recursive functions: Complexity of inductive inference
1994
This paper presents some results on the complexity of inductive inference for enumerable classes of total recursive functions, where enumeration is understood in a more general sense than the usual recursive enumeration. The complexity is measured as the worst-case mindchange (error) number over the first n functions of the given class. Three generalizations are considered.
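In notation the abstract leaves implicit (the enumeration \(f_0, f_1, \ldots\) of the class and the function mc are hypothetical names introduced here for illustration), such a measure can be sketched as \(MC(n) = \max_{0 \le i \le n} mc(f_i)\), where \(mc(f_i)\) counts the mindchanges (or errors) the learner makes before converging on \(f_i\).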
Quasi Conjunction and Inclusion Relation in Probabilistic Default Reasoning
2011
We study the quasi conjunction and the Goodman & Nguyen inclusion relation for conditional events, in the setting of probabilistic default reasoning under coherence. We deepen two recent results given in (Gilio and Sanfilippo, 2010): the first result concerns p-entailment from a family F of conditional events to the quasi conjunction C(S) associated with each nonempty subset S of F; the second result, among other aspects, analyzes the equivalence between p-entailment from F and p-entailment from C(S), where S is some nonempty subset of F. We also characterize p-entailment by some alternative theorems. Finally, we deepen the connections between p-entailment and the Goodman & Nguyen inclusion…
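For orientation, the quasi conjunction used in this line of work is the standard notion going back to Adams (restated here for convenience, not quoted from the abstract): for two conditional events, \(\mathcal{C}(A|H, B|K) = ((A \wedge H) \vee \neg H) \wedge ((B \wedge K) \vee \neg K) \mid (H \vee K)\); for a nonempty finite family S, \(\mathcal{C}(S)\) conjoins the corresponding terms of its members, conditional on the disjunction of all their conditioning events.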
Application of kolmogorov complexity to inductive inference with limited memory
1995
We consider inductive inference with limited memory [1]. We show that there exists a set U of total recursive functions such that: U can be learned with linear long-term memory (and no short-term memory); U can be learned with logarithmic long-term memory (and some amount of short-term memory); and if U is learned with sublinear long-term memory, then the short-term memory exceeds any recursive function. Thus an open problem posed by Freivalds, Kinber and Smith [1] is solved. To prove our result, we use Kolmogorov complexity.
Kolmogorov numberings and minimal identification
1995
Identification of programs for computable functions from their graphs by algorithmic devices is a well studied problem in learning theory. Freivalds and Chen consider identification of ‘minimal’ and ‘nearly minimal’ programs for functions from their graphs. To address certain problems in minimal identification for Gödel numberings, Freivalds later considered minimal identification in Kolmogorov numberings. Kolmogorov numberings are in some sense optimal numberings and have some nice properties. We prove certain hierarchy results for minimal identification in every Kolmogorov numbering. In addition we also compare minimal identification in Gödel numbering versus minimal identification in Kol…
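For orientation (a standard characterization recalled here, not quoted from the abstract): a Kolmogorov numbering is a Gödel numbering into which every computable numbering can be translated by a computable function whose output index is bounded by a linear function of the input index; this is the sense in which such numberings are optimal.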
Parsimony hierarchies for inductive inference
2004
Freivalds defined an acceptable programming system independent criterion for learning programs for functions in which the final programs were required to be both correct and “nearly” minimal size, i.e., within a computable function of being purely minimal size. Kinber showed that this parsimony requirement on final programs limits learning power. However, in scientific inference, parsimony is considered highly desirable. A lim-computable function is (by definition) one calculable by a total procedure allowed to change its mind finitely many times about its output. Investigated is the possibility of assuaging somewhat the limitation on learning power resulting from requiring parsimonio…
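For orientation, lim-computability as described here can be recalled in the standard way (restated for convenience, not quoted from the abstract): a function f is lim-computable if there is a total computable g with \(f(x) = \lim_{t \to \infty} g(x, t)\) for every x, i.e., for each x the value g(x, t) changes only finitely often as t grows and eventually stabilizes on f(x).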
General inductive inference types based on linearly-ordered sets
1996
In this paper, we reconsider the definitions of procrastinating learning machines. In the original definition of Freivalds and Smith [FS93], constructive ordinals are used to bound mindchanges. We investigate the possibility of using arbitrary linearly ordered sets to bound mindchanges in a similar way. It turns out that using certain ordered sets it is possible to define inductive inference types more general than the previously known ones. We investigate properties of the new inductive inference types and compare them to other types.
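As an informal illustration of bounding mindchanges by an ordered set, the sketch below pictures a learner that must strictly decrease a counter drawn from the ordered set whenever it changes its conjecture. Everything in it is hypothetical and introduced only for illustration: the learner, its conjectures, and the encoding of ordinals below ω·2 as lexicographically ordered pairs are not taken from the paper.

class ProcrastinatingLearner:
    def __init__(self, start=(1, 0)):            # (1, 0) encodes omega
        self.counter = start                      # remaining mindchange allowance
        self.conjecture = None

    def guess(self, new_conjecture, new_counter=None):
        if self.conjecture is not None and new_conjecture != self.conjecture:
            # every mindchange must strictly decrease the counter
            assert new_counter is not None and new_counter < self.counter, \
                "a mindchange must strictly decrease the counter"
            self.counter = new_counter
        self.conjecture = new_conjecture
        return new_conjecture

learner = ProcrastinatingLearner()                # counter starts at omega
learner.guess("h0")                               # first conjecture, nothing charged
learner.guess("h1", new_counter=(0, 3))           # mindchange: drop below omega to 3
learner.guess("h2", new_counter=(0, 2))           # at most 2 further mindchanges remain

Because ordinals admit no infinite strictly descending sequence, such a learner can change its mind only finitely often, yet starting at a limit element like omega lets it postpone committing to a concrete finite bound; this is the procrastination referred to in this line of work.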
Learning with confidence
1996
Herein we investigate learning in the limit where confidence in the current conjecture accrues with time. Confidence levels are given by rational numbers between 0 and 1. The traditional requirement for learning in the limit is that a device must converge (in the limit) to a correct answer. We further demand that the associated confidence in the answer (monotonically) approach 1 in the limit. In addition to being a more realistic model of learning, our new notion turns out to be more powerful as well. In addition, we give precise characterizations of the classes of functions that are learnable in our new model(s).
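The following toy sketch only illustrates the shape of such a model; the learner, its placeholder conjectures, and the particular confidence schedule n/(n+1) are hypothetical and not taken from the paper, whose definitions are more careful about how confidence may be revised.

from fractions import Fraction

def confident_learner(data_stream):
    # Emit a (conjecture, confidence) pair after each observed value.
    # The conjecture here is just the finite table seen so far, standing in
    # for a program; the rational confidence n/(n+1) is monotone and tends to 1.
    table = []
    for n, value in enumerate(data_stream, start=1):
        table.append(value)
        yield tuple(table), Fraction(n, n + 1)

for guess, confidence in confident_learner([0, 1, 4, 9, 16]):
    print(len(guess), "data points seen, confidence", confidence)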
Probability Propagation in Selected Aristotelian Syllogisms
2019
This paper continues our work on a coherence-based probability semantics for Aristotelian syllogisms (Gilio, Pfeifer, and Sanfilippo, 2016; Pfeifer and Sanfilippo, 2018) by studying Figure III under coherence. We interpret the syllogistic sentence types by suitable conditional probability assessments. Since the probabilistic inference of $P|S$ from the premise set $\{P|M, S|M\}$ is not informative, we add $p(M|(S \vee M))>0$ as a probabilistic constraint (i.e., an “existential import assumption”) to obtain probabilistic informativeness. We show how to propagate the assigned premise probabilities to the conclusion. Thereby, we give a probabilistic meaning to all syllogisms of Figure III. We…
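In general terms (a restatement of the coherence-based reading, not of the paper's specific bounds, which the abstract does not give): propagating the premise probabilities means computing, for given assessments $p(P|M) = x$ and $p(S|M) = y$ together with the constraint $p(M|(S \vee M)) > 0$, the tightest interval $[z', z'']$ such that the extension $p(P|S) = z$ is coherent exactly when $z' \le z \le z''$.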