0000000000520246

AUTHOR

Marek Karpinski

showing 8 related works from this author

Weak and strong recognition by 2-way randomized automata

1997

Languages weakly recognized by a Monte Carlo 2-way finite automaton with n states are proved to be strongly recognized by a Monte Carlo 2-way finite automaton with n^{o(n)} states. This dramatically improves on the previously known result by M. Karpinski and R. Verbeek [10], which is also nontrivial, since these languages can be nonregular [5]. For tally languages the increase in the number of states is proved to be only polynomial, and these languages are regular.
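
As general background only (not a reconstruction of the paper's 2-way Monte Carlo model), the minimal Python sketch below shows how the acceptance probability of an ordinary one-way probabilistic finite automaton is computed from row-stochastic transition matrices; the two-state automaton and its matrices are invented for illustration.

    import numpy as np

    # Background sketch: a one-way probabilistic finite automaton over {0, 1}.
    # The acceptance probability of a word is obtained by multiplying the
    # start distribution by one row-stochastic matrix per input symbol.

    # Hypothetical 2-state example; each row sums to 1.
    TRANSITIONS = {
        "0": np.array([[0.9, 0.1],
                       [0.2, 0.8]]),
        "1": np.array([[0.5, 0.5],
                       [0.0, 1.0]]),
    }
    START = np.array([1.0, 0.0])      # start in state 0 with probability 1
    ACCEPTING = np.array([0.0, 1.0])  # state 1 is the accepting state

    def acceptance_probability(word: str) -> float:
        """Probability that the automaton ends in an accepting state on `word`."""
        distribution = START
        for symbol in word:
            distribution = distribution @ TRANSITIONS[symbol]
        return float(distribution @ ACCEPTING)

    if __name__ == "__main__":
        for w in ["", "0", "01", "110"]:
            print(w or "(empty)", acceptance_probability(w))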

Deterministic pushdown automaton; Combinatorics; Deterministic automaton; Probabilistic automaton; Pushdown automaton; Quantum finite automata; Büchi automaton; Two-way deterministic finite automaton; Nondeterministic finite automaton; Computer Science::Computational Complexity; Computer Science::Formal Languages and Automata Theory; Mathematics
researchProduct

Co-learnability and FIN-identifiability of enumerable classes of total recursive functions

1994

Co-learnability is an inference process in which, instead of producing the final result, the strategy produces all the natural numbers but one, and the omitted number is an encoding of the correct result. It has been proved in [1] that co-learnability of Gödel numbers is equivalent to EX-identifiability. We consider co-learnability of indices in recursively enumerable (r.e.) numberings. The power of co-learnability depends on the numberings used. Every r.e. class of total recursive functions is co-learnable in some r.e. numbering. FIN-identifiable classes are co-learnable in all r.e. numberings, and classes containing a function that is an accumulation point are not co-learnable in some r.e. number…
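
A minimal sketch of the output convention described above, using a hypothetical finite list of total functions in place of an r.e. numbering and assuming the target occurs in the list exactly once: the strategy emits every index it can rule out, and the single index it never emits encodes the result.

    from typing import Callable, Iterator, List

    # Toy illustration of co-learning over a finite "numbering" of total
    # functions.  The real setting uses r.e. numberings of recursive
    # functions; this finite version only illustrates the output convention.

    NUMBERING: List[Callable[[int], int]] = [
        lambda x: 0,          # index 0: constant zero
        lambda x: x,          # index 1: identity
        lambda x: x * x,      # index 2: squaring
        lambda x: x + 1,      # index 3: successor
    ]

    def co_learn(target: Callable[[int], int]) -> Iterator[int]:
        """Yield the indices the strategy rules out, one by one."""
        alive = set(range(len(NUMBERING)))
        x = 0
        while len(alive) > 1:
            value = target(x)                     # next point of the graph
            for i in sorted(alive):
                if NUMBERING[i](x) != value:      # disagreement: emit index i
                    alive.discard(i)
                    yield i
            x += 1

    if __name__ == "__main__":
        eliminated = list(co_learn(lambda x: x * x))
        answer = (set(range(len(NUMBERING))) - set(eliminated)).pop()
        print("emitted:", eliminated, "-> omitted index encodes result:", answer)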

Combinatorics; Class (set theory); TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES; TheoryofComputation_COMPUTATIONBYABSTRACTDEVICES; Conjecture; Recursively enumerable language; Limit point; Identifiability; Natural number; Function (mathematics); Numbering; Mathematics
researchProduct

Learning by the Process of Elimination

2002

Elimination of potential hypotheses is a fundamental component of many learning processes. In order to understand the nature of elimination, herein we study the following model of learning recursive functions from examples. On any target function, the learning machine has to eliminate all possible hypotheses save one, such that the missing one correctly describes the target function. It turns out that this type of learning by the process of elimination (elm-learning, for short) can be stronger than, weaker than, or of the same power as usual Gold-style learning. While for usual learning any r.e. class of recursive functions can be learned in all of its numberings, this is no longer true for el…
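
For contrast with elimination, here is a toy sketch of the usual Gold-style learning mentioned above, in the form of identification by enumeration over a hypothetical finite numbering: after each data point the learner outputs the least index still consistent with the data, and its guesses stabilize on a correct index in the limit. This only illustrates the convergence criterion, not the paper's construction.

    from typing import Callable, Iterator, List

    # Toy Gold-style learner ("identification by enumeration") over a
    # hypothetical finite numbering of total functions.

    NUMBERING: List[Callable[[int], int]] = [
        lambda x: 0,
        lambda x: x,
        lambda x: 2 * x,
        lambda x: x * x,
    ]

    def gold_learner(target: Callable[[int], int], steps: int) -> Iterator[int]:
        """Yield the learner's current hypothesis after each observed value."""
        observed = []
        for x in range(steps):
            observed.append((x, target(x)))
            for i, candidate in enumerate(NUMBERING):
                if all(candidate(a) == v for a, v in observed):
                    yield i          # least index consistent with the data so far
                    break

    if __name__ == "__main__":
        print(list(gold_learner(lambda x: 2 * x, steps=5)))  # stabilizes on index 2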

Computer Science::Machine Learning; Process of elimination; Generalization; 0102 computer and information sciences; 02 engineering and technology; 01 natural sciences; Numbering; Computer Science Applications; Theoretical Computer Science; Decidability; Algebra; Computational Theory and Mathematics; 010201 computation theory & mathematics; Physics::Plasma Physics; 0202 electrical engineering, electronic engineering, information engineering; Recursive functions; 020201 artificial intelligence & image processing; Equivalence (formal languages); Information Systems; Mathematics; Information and Computation
researchProduct

Lower space bounds for randomized computation

1994

It is a fundamental problem in randomized computation how to separate different randomized time or randomized space classes (cf., e.g., [KV87, KV88]). We separated randomized space classes below log n in [FK94]. Now we have succeeded in separating small randomized time classes for multi-tape 2-way Turing machines. Surprisingly, these “small” bounds are of the form n+f(n), with f(n) not exceeding linear functions. This new approach to “sublinear” time complexity is a natural counterpart to sublinear space complexity. The latter was introduced by considering the input tape and the work tape as separate devices and distinguishing between the space used for processing information and the spa…

Discrete mathematics; Combinatorics; Turing machine; Sublinear function; Kolmogorov complexity; Type (model theory); Binary logarithm; Space (mathematics); Time complexity; Word (computer architecture); Mathematics
researchProduct

Quantum finite multitape automata

1999

Quantum finite automata were introduced by C. Moore and J. P. Crutchfield, and by A. Kondacs and J. Watrous. This notion is not a generalization of deterministic finite automata. Moreover, it was proved that not all regular languages can be recognized by quantum finite automata. A. Ambainis and R. Freivalds proved that for some languages quantum finite automata may be exponentially more concise than both deterministic and probabilistic finite automata. In this paper we introduce the notion of quantum finite multitape automata and prove that there is a language recognized by a quantum finite automaton but not by any deterministic or probabilistic finite automaton. This is the first result on …
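
As a point of reference for the single-tape case, the sketch below simulates a measure-once quantum finite automaton in the style of Moore and Crutchfield: one unitary per input letter, with acceptance probability given by the squared amplitude on the accepting states at the end of the word. The two-dimensional rotation example is an arbitrary choice, and the multitape model introduced in the paper is not reproduced here.

    import numpy as np

    # Measure-once quantum finite automaton over the one-letter alphabet {a}:
    # each letter applies a fixed unitary (here a rotation), and the word is
    # accepted with the squared amplitude on the accepting basis state.

    THETA = np.pi / 8                       # rotation angle per letter (arbitrary)
    UNITARIES = {"a": np.array([[np.cos(THETA), -np.sin(THETA)],
                                [np.sin(THETA),  np.cos(THETA)]])}
    START = np.array([1.0, 0.0])            # initial state |q0>
    ACCEPT_PROJECTION = np.diag([1.0, 0.0]) # accept iff measured in q0

    def acceptance_probability(word: str) -> float:
        state = START
        for symbol in word:
            state = UNITARIES[symbol] @ state   # unitary evolution, no intermediate measurement
        return float(np.linalg.norm(ACCEPT_PROJECTION @ state) ** 2)

    if __name__ == "__main__":
        for n in range(9):
            print("a" * n or "(empty)", round(acceptance_probability("a" * n), 3))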

FOS: Computer and information sciences; Quantum Physics; Computer Science - Computational Complexity; TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES; TheoryofComputation_COMPUTATIONBYABSTRACTDEVICES; Formal Languages and Automata Theory (cs.FL); FOS: Physical sciences; Computer Science - Formal Languages and Automata Theory; Computational Complexity (cs.CC); Quantum Physics (quant-ph); Nonlinear Sciences::Cellular Automata and Lattice Gases; Computer Science::Formal Languages and Automata Theory
researchProduct

Effects of Kolmogorov complexity present in inductive inference as well

1997

For all complexity measures in Kolmogorov complexity, the effect discovered by P. Martin-Löf holds: for every infinite binary sequence there is a wide gap between the supremum and the infimum of the complexity of initial fragments of the sequence. It is assumed that this inevitable gap is characteristic of Kolmogorov complexity, and that it is caused by the highly abstract nature of the unrestricted Kolmogorov complexity.
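
A crude numerical illustration of measuring the complexity of initial fragments of a sequence: zlib's compressed size is used as a rough stand-in for Kolmogorov complexity (it is only an upper-bound proxy, so it does not exhibit the Martin-Löf gap itself), and the pseudorandom sequence and prefix lengths are arbitrary.

    import zlib
    import random

    # Prefix-by-prefix measurement of a proxy for Kolmogorov complexity.
    random.seed(0)
    SEQUENCE = "".join(random.choice("01") for _ in range(4096))  # stand-in for an infinite sequence

    def proxy_complexity(prefix: str) -> int:
        """Length in bytes of the zlib-compressed prefix (upper-bound proxy only)."""
        return len(zlib.compress(prefix.encode("ascii"), level=9))

    if __name__ == "__main__":
        for n in (64, 256, 1024, 4096):
            print(f"n={n:5d}  proxy complexity = {proxy_complexity(SEQUENCE[:n])} bytes")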

PH; Average-case complexity; Discrete mathematics; Structural complexity theory; Kolmogorov complexity; Kolmogorov structure function; Chain rule for Kolmogorov complexity; Descriptive complexity theory; Mathematics; Quantum complexity theory
researchProduct

Quantum Finite Multitape Automata

1999

Quantum finite automata were introduced by C. Moore and J. P. Crutchfield [4], and by A. Kondacs and J. Watrous [3]. This notion is not a generalization of deterministic finite automata. Moreover, in [3] it was proved that not all regular languages can be recognized by quantum finite automata. A. Ambainis and R. Freivalds [1] proved that for some languages quantum finite automata may be exponentially more concise than both deterministic and probabilistic finite automata. In this paper we introduce the notion of quantum finite multitape automata and prove that there is a language recognized by a quantum finite automaton but not by deterministic or probabilistic finite automata. This …

Discrete mathematics; Probabilistic finite automata; Finite-state machine; Nested word; Computer science; Deterministic context-free grammar; Timed automaton; ω-automaton; Nonlinear Sciences::Cellular Automata and Lattice Gases; Automaton; Mobile automaton; Nondeterministic finite automaton with ε-moves; Deterministic finite automaton; DFA minimization; Regular language; Deterministic automaton; Probabilistic automaton; Continuous spatial automaton; Automata theory; Quantum finite automata; Two-way deterministic finite automaton; Nondeterministic finite automaton; Computer Science::Formal Languages and Automata Theory; Quantum cellular automaton
researchProduct

Co-learning of total recursive functions

1994

Algebra; Computer science; Recursive functions; Proceedings of the seventh annual conference on Computational learning theory - COLT '94
researchProduct