AUTHOR

Thomas Zeugmann

On the Size Complexity of Deterministic Frequency Automata

Austinat, Diekert, Hertrampf, and Petersen [2] proved that every language L that is (m,n)-recognizable by a deterministic frequency automaton with m > n/2 can also be recognized by a deterministic finite automaton. First, the sizes of deterministic frequency automata and of deterministic finite automata recognizing the same language are compared. Then approximations of a language are considered, where a language L′ is called an approximation of a language L if L′ differs from L in only a finite number of strings. We prove that if a deterministic frequency automaton has k states and (m,n)-recognizes a language L, where m > n/2, then there is a language L′ approximating L such that L′ c…
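The notion of (m,n)-recognition used above can be made explicit; the following is a hedged reconstruction of the definition along the lines of Austinat et al. [2], with χ_L denoting the characteristic function of L:

```latex
% A deterministic frequency automaton A of frequency (m,n) reads any n
% pairwise distinct input words w_1, ..., w_n in parallel and emits a
% vector of answers (y_1, ..., y_n) in {0,1}^n.  It (m,n)-recognizes L
% iff the answers are correct in at least m of the n positions:
\bigl|\{\, i \in \{1,\dots,n\} : y_i = \chi_L(w_i) \,\}\bigr| \;\ge\; m .
```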

Co-learning of recursive languages from positive data

The present paper deals with the co-learnability of enumerable families L of uniformly recursive languages from positive data. This refers to the following scenario. A family L of target languages as well as a hypothesis space for it are specified. The co-learner is eventually fed all positive examples of an unknown target language L chosen from L. The target language L is successfully co-learned iff the co-learner can definitively delete all but one possible hypothesis, and the remaining one has to correctly describe L.
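The deletion discipline behind this scenario can be illustrated with a toy sketch (the hypothesis names, the finite hypothesis space, and the example stream are all hypothetical; actual co-learning concerns infinite enumerable families of recursive languages):

```python
def co_learn(hypotheses, stream):
    """Toy co-learner: definitively deletes hypotheses until one survives.

    hypotheses: dict mapping a name to a (finite, toy) language given as a set
    stream:     iterable of positive examples of the unknown target language
    """
    alive = dict(hypotheses)
    for w in stream:
        # a hypothesis not containing the observed word w can never be the
        # target language, so it is deleted once and for all
        alive = {name: lang for name, lang in alive.items() if w in lang}
        if len(alive) == 1:
            return next(iter(alive))  # the single remaining hypothesis
    return None  # the finite prefix seen so far does not yet suffice

# hypothetical family and positive data stream
H = {"L1": {"a"}, "L2": {"a", "b"}, "L3": {"b", "c"}}
print(co_learn(H, ["a", "b"]))  # -> L2
```

The finite dictionary merely illustrates the once-and-for-all deletions; in the actual model the co-learner works over an infinite hypothesis space.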

On the Influence of Technology on Learning Processes

Probabilistic computations and frequency computations were invented for the same purpose, namely, to study the possible advantages of technology involving random choices. Recently, several authors have discovered close relationships between these generalizations of deterministic computations and computations taking advice. Various forms of computation taking advice were studied by Karp and Lipton [1], Damm and Holzer [2], and Freivalds [3]. In the present paper, we apply the nonconstructive, probabilistic, and frequency methods to an inductive inference paradigm originally due to Gold [4] and investigate their impact on the resulting learning models. Several trade-offs with respect to the resulting l…

On the Amount of Nonconstructivity in Learning Recursive Functions

Nonconstructive proofs are a powerful mechanism in mathematics. Furthermore, nonconstructive computations by various types of machines and automata have been considered by, e.g., Karp and Lipton [17] and Freivalds [11]. They allow one to regard more complicated algorithms from the viewpoint of much more primitive computational devices. The amount of nonconstructivity is a quantitative characterization of the distance between types of computational devices with respect to solving a specific problem. In the present paper, the amount of nonconstructivity in the learning of recursive functions is studied. Different learning types are compared with respect to the amount of nonconstructivity needed to lea…

Active Learning of Recursive Functions by Ultrametric Algorithms

We study active learning of classes of recursive functions by asking value queries about the target function f, where f is from the target class. That is, the query is a natural number x, and the answer to the query is f(x). The complexity measure in this paper is the worst-case number of queries asked. We prove that for some classes of recursive functions ultrametric active learning algorithms can achieve the learning goal by asking significantly fewer queries than deterministic, probabilistic, and even nondeterministic active learning algorithms. This is the first known example of a problem for which ultrametric algorithms have an advantage over nondeterministic algorithms.
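The query model can be sketched as follows (the target class, the oracle, and all names here are hypothetical illustrations chosen for simplicity, not the classes studied in the paper):

```python
queries_asked = 0

def oracle(x):
    """Answers a value query: returns f(x) for a hidden target, here f(x) = 7."""
    global queries_asked
    queries_asked += 1  # the complexity measure counts these calls
    return 7

def learn_constant(query):
    """Learns a target from the toy class of constant functions f_c(x) = c.

    For this class a single value query f(0) already determines the
    target exactly, so the worst-case query count is 1."""
    c = query(0)
    return lambda x: c

h = learn_constant(oracle)
print(h(5), queries_asked)  # -> 7 1
```

For richer classes the learner must adapt later queries to earlier answers, and the worst-case number of such rounds is precisely the measure compared across deterministic, probabilistic, nondeterministic, and ultrametric algorithms.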
