Search results for "Complexity"

Showing 10 of 1,094 documents

Inducing the Lyndon Array

2019

In this paper we propose a variant of the induced suffix sorting algorithm by Nong (TOIS, 2013) that simultaneously computes the Lyndon array and the suffix array of a text in $O(n)$ time using $\sigma + O(1)$ words of working space, where $n$ is the length of the text and $\sigma$ is the alphabet size. Our result improves on the previous best space requirement for linear-time computation of the Lyndon array. In fact, all known linear-time algorithms for Lyndon array computation use suffix sorting as a preprocessing step and require $O(n)$ words of working space in addition to the Lyndon array and suffix array. Experimental results with real and synthetic datasets show that our algorithm is not onl…
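
As a concrete reminder of what the Lyndon array is, here is a minimal brute-force sketch (not the paper's linear-time, $\sigma + O(1)$-word algorithm): it uses the next-smaller-suffix characterization, where the entry at position i is the distance to the nearest following suffix that is lexicographically smaller. Function names and the quadratic approach are purely illustrative.

```python
def lyndon_array(s):
    """Naive Lyndon array: lam[i] = length of the longest Lyndon word starting at i.
    Uses the next-smaller-suffix characterization; brute-force slicing, illustration only."""
    n = len(s)
    lam = [0] * n
    for i in range(n):
        j = i + 1
        while j < n and s[j:] > s[i:]:   # scan until a lexicographically smaller suffix
            j += 1
        lam[i] = j - i                   # distance to that suffix (or to the end of the text)
    return lam

print(lyndon_array("banana"))  # [1, 2, 1, 2, 1, 1]
```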

FOS: Computer and information sciences; 050101 languages & linguistics; Computer science; Computation; Induced suffix sorting; 02 engineering and technology; Space (mathematics); law.invention; Suffix sorting; law; Suffix array; Computer Science - Data Structures and Algorithms; 0202 electrical engineering electronic engineering information engineering; Data: Files; Preprocessor; Data Structures and Algorithms (cs.DS); 0501 psychology and cognitive sciences; Computer Science::Data Structures and Algorithms; Time complexity; Settore ING-INF/05 - Sistemi Di Elaborazione Delle Informazioni; Settore INF/01 - Informatica; 05 social sciences; Lightweight algorithms; Sigma; Computer Science::Computation and Language (Computational Linguistics and Natural Language and Speech Processing); Working space; Lyndon array; 020201 artificial intelligence & image processing; Algorithm; Computer Science::Formal Languages and Automata Theory

On the Inner Product Predicate and a Generalization of Matching Vector Families

2018

Motivated by cryptographic applications such as predicate encryption, we consider the problem of representing an arbitrary predicate as the inner product predicate on two vectors. Concretely, fix a Boolean function $P$ and some modulus $q$. We are interested in encoding $x$ to $\vec x$ and $y$ to $\vec y$ so that $$P(x,y) = 1 \Longleftrightarrow \langle\vec x,\vec y\rangle= 0 \bmod q,$$ where the vectors should be as short as possible. This problem can also be viewed as a generalization of matching vector families, which corresponds to the equality predicate. Matching vector families have been used in the constructions of Ramsey graphs, private information retrieval (PIR) protocols, and mor…
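
To make the encoding format concrete, here is a toy instance for the equality predicate over $\mathbb{Z}_q$, the case corresponding to matching vector families: x and y map to length-2 vectors whose inner product vanishes mod q exactly when x = y. The modulus and helper names are arbitrary choices for the example; this only illustrates the problem statement, not the paper's constructions or parameters.

```python
q = 7  # toy modulus, chosen arbitrarily for the example

def encode_x(x):
    return (1, x % q)

def encode_y(y):
    return (y % q, q - 1)          # q - 1 plays the role of -1 modulo q

def predicate(vx, vy):
    """P(x, y) = 1  <=>  <vx, vy> = 0 (mod q)."""
    return sum(a * b for a, b in zip(vx, vy)) % q == 0

# <(1, x), (y, -1)> = y - x (mod q), so the predicate holds exactly when x = y mod q.
assert predicate(encode_x(3), encode_y(3))
assert not predicate(encode_x(3), encode_y(5))
```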

FOS: Computer and information sciences; 060201 languages & linguistics; 000 Computer science knowledge general works; Computer Science - Cryptography and Security; 06 humanities and the arts; 02 engineering and technology; Computational Complexity (cs.CC); Computer Science - Computational Complexity; 0602 languages and literature; Computer Science; FOS: Mathematics; 0202 electrical engineering electronic engineering information engineering; Mathematics - Combinatorics; 020201 artificial intelligence & image processing; Combinatorics (math.CO); Cryptography and Security (cs.CR)

Combinatorial proofs of two theorems of Lutz and Stull

2021

Recently, Lutz and Stull used methods from algorithmic information theory to prove two new Marstrand-type projection theorems, concerning subsets of Euclidean space which are not assumed to be Borel, or even analytic. One of the theorems states that if $K \subset \mathbb{R}^{n}$ is any set with equal Hausdorff and packing dimensions, then $$ \dim_{\mathrm{H}} \pi_{e}(K) = \min\{\dim_{\mathrm{H}} K,1\} $$ for almost every $e \in S^{n - 1}$. Here $\pi_{e}$ stands for orthogonal projection to $\mathrm{span}(e)$. The primary purpose of this paper is to present proofs for Lutz and Stull's projection theorems which do not refer to information theoretic concepts. Instead, they will rely on combinatori…

FOS: Computer and information sciences; 28A80 (primary), 28A78 (secondary); General Mathematics; Combinatorial proof; Computational Complexity (cs.CC); 01 natural sciences; Combinatorics; Mathematics - Metric Geometry; Hausdorff and packing measures; 0103 physical sciences; Classical Analysis and ODEs (math.CA); FOS: Mathematics; 0101 mathematics; Mathematics; Algorithmic information theory; Lemma (mathematics); Euclidean space; Pigeonhole principle; 010102 general mathematics; Orthographic projection; Hausdorff space; Metric Geometry (math.MG); Projection (relational algebra); Computer Science - Computational Complexity; Mathematics - Classical Analysis and ODEs; fractals; 010307 mathematical physics; measure theory

Unit contradiction versus unit propagation

2012

Some aspects of the result of applying unit resolution to a CNF formula can be formalized as functions whose domain is a set of partial truth assignments. We are interested in two ways of computing such functions, depending on whether the result is the production of the empty clause or the assignment of a variable to a given truth value. We show that these two models can compute the same functions with formulae of polynomially related sizes, and we explain how this result relates to the CNF encoding of Boolean constraints.
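
For readers less familiar with the two behaviours being compared, the sketch below is a plain unit-propagation loop over a CNF formula (clauses as sets of signed integers): it either produces the empty clause (a contradiction) or extends the partial assignment with forced truth values. It is a generic illustration of unit resolution, not the formal models studied in the paper.

```python
def unit_propagate(clauses, assignment):
    """Propagate unit clauses. Returns (None, assignment) if the empty clause is produced,
    otherwise (remaining clauses, extended assignment). Literals are signed ints."""
    clauses = [set(c) for c in clauses]
    assignment = dict(assignment)             # variable -> bool
    changed = True
    while changed:
        changed = False
        remaining = []
        for clause in clauses:
            # skip clauses already satisfied by the current assignment
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue
            # drop falsified literals
            reduced = {l for l in clause if abs(l) not in assignment}
            if not reduced:
                return None, assignment       # empty clause: unit contradiction
            if len(reduced) == 1:             # unit clause: force the remaining literal
                lit = reduced.pop()
                assignment[abs(lit)] = lit > 0
                changed = True
            else:
                remaining.append(reduced)
        clauses = remaining
    return clauses, assignment

# (x1 or x2) and (not x1) propagates x2 = True; adding (not x2) yields a contradiction instead.
print(unit_propagate([{1, 2}, {-1}], {}))        # ([], {1: False, 2: True})
print(unit_propagate([{1, 2}, {-1}, {-2}], {}))  # (None, {1: False, 2: False})
```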

FOS: Computer and information sciences; Artificial Intelligence (cs.AI); Theory of Computation: Mathematical Logic and Formal Languages; Computer Science - Artificial Intelligence; Computer Science::Logic in Computer Science; Computer Science::Computational Complexity

Adaptive learning of compressible strings

2020

Suppose an oracle knows a string $S$ that is unknown to us and that we want to determine. The oracle can answer queries of the form "Is $s$ a substring of $S$?". In 1995, Skiena and Sundaram showed that, in the worst case, any algorithm needs to ask the oracle $\sigma n/4 - O(n)$ queries in order to reconstruct the hidden string, where $\sigma$ is the size of the alphabet of $S$ and $n$ is its length, and gave an algorithm that spends $(\sigma-1)n+O(\sigma \sqrt{n})$ queries to reconstruct $S$. The main contribution of our paper is to improve this upper bound in the setting where the string is compressible. We first present a universal algorithm that, given a (computable) compre…
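
To make the query model concrete, here is a naive reconstruction baseline (nothing like the paper's query-optimal or compression-aware algorithms): any substring of the hidden string that is shorter than $n$ can be extended on at least one side, so greedy extension with substring queries always terminates with $S$. The oracle callback and function names are illustrative assumptions.

```python
def reconstruct(is_substring, alphabet, n):
    """Reconstruct the hidden string of known length n using substring queries only:
    greedily extend any occurring substring left or right until it reaches length n.
    Spends O(sigma * n) queries and completely ignores compressibility."""
    w = ""
    for a in alphabet:                     # find some character occurring in S
        if is_substring(a):
            w = a
            break
    while len(w) < n:                      # a proper substring always extends on some side
        for a in alphabet:
            if is_substring(w + a):
                w = w + a
                break
            if is_substring(a + w):
                w = a + w
                break
    return w

secret = "abracadabra"
oracle = lambda s: s in secret             # stands in for the substring oracle
print(reconstruct(oracle, "abcdr", len(secret)))   # abracadabra
```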

FOS: Computer and information sciences; Centroid decomposition; General Computer Science; String compression; Adaptive learning; Kolmogorov complexity; Context (language use); Data: Coding and Information Theory; String reconstruction; Theoretical Computer Science; Combinatorics; String learning; Lempel-Ziv; Suffix tree; Integer; Computer Science - Data Structures and Algorithms; Order (group theory); Data Structures and Algorithms (cs.DS); Time complexity; Computer Science::Databases; Mathematics; Settore INF/01 - Informatica; Linear space; String (computer science); Substring; Bounded function

New separation between $s(f)$ and $bs(f)$

2011

In this note we give a new separation between sensitivity and block sensitivity of Boolean functions, exhibiting a function $f$ with $bs(f)=(2/3)s(f)^2-(1/3)s(f)$.
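
For readers who want the two measures side by side, the sketch below evaluates $s(f)$ and $bs(f)$ by brute force straight from their definitions (only feasible for very small $n$); the separating function behind the stated bound is more involved and is not reproduced here, so the 3-bit majority example is just a sanity check of the definitions.

```python
def sensitivity(f, n):
    """s(f): max over inputs x of the number of single-bit flips that change f(x)."""
    return max(sum(f(x) != f(x ^ (1 << i)) for i in range(n)) for x in range(1 << n))

def block_sensitivity(f, n):
    """bs(f): max over x of the largest number of disjoint blocks whose flip changes f(x)."""
    def flip(x, block):
        for i in block:
            x ^= 1 << i
        return x

    def best(x, free):
        result = 0
        for mask in range(1, 1 << len(free)):                 # every nonempty block of free bits
            block = [free[i] for i in range(len(free)) if (mask >> i) & 1]
            if f(flip(x, block)) != f(x):
                rest = tuple(c for c in free if c not in block)
                result = max(result, 1 + best(x, rest))
        return result

    return max(best(x, tuple(range(n))) for x in range(1 << n))

maj3 = lambda x: int(bin(x).count("1") >= 2)   # 3-bit majority, a small sanity check
print(sensitivity(maj3, 3), block_sensitivity(maj3, 3))   # 2 2
```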

FOS: Computer and information sciences; Computational Complexity (cs.CC)

Adaptive Lower Bound for Testing Monotonicity on the Line

2018

In the property testing model, the task is to distinguish objects possessing some property from objects that are far from it. One such property is monotonicity, where the objects are functions from one poset to another. This is an active area of research. In this paper we study the query complexity of $\epsilon$-testing monotonicity of a function $f\colon [n]\to[r]$. All our lower bounds are for adaptive two-sided testers.
* We prove a nearly tight lower bound for this problem in terms of $r$. The bound is $\Omega(\frac{\log r}{\log \log r})$ when $\epsilon = 1/2$. No previous satisfactory lower bound in terms of $r$ was known.
* We completely characterise the query complexity of this probl…
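
For orientation, the sketch below is a classical binary-search ("spot-checker") style one-sided tester for monotonicity on the line, using O((log n)/ε) queries in total; it is included only to make the query-complexity setting concrete and is not one of the adaptive two-sided testers to which the paper's lower bounds apply. Function names are illustrative.

```python
import random

def bsearch_reaches(f, n, i):
    """Binary search for the key (f(i), i) over positions, comparing probes as (f(m), m)."""
    lo, hi, key = 0, n - 1, (f(i), i)
    while lo <= hi:
        mid = (lo + hi) // 2
        probe = (f(mid), mid)
        if probe == key:
            return True
        if probe < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

def test_monotone(f, n, eps):
    """One-sided tester: always accepts monotone f; rejects functions that are eps-far
    from monotone with probability > 2/3."""
    for _ in range(int(2 / eps) + 1):
        i = random.randrange(n)
        if not bsearch_reaches(f, n, i):
            return False                 # witnessed an index inconsistent with sortedness
    return True

print(test_monotone(lambda i: i // 3, 1000, 0.25))    # True: the function is monotone
print(test_monotone(lambda i: 1000 - i, 1000, 0.25))  # False with high probability: far from monotone
```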

FOS: Computer and information sciences; Computer Science - Computational Complexity; 000 Computer science knowledge general works; Computer Science - Data Structures and Algorithms; Computer Science; Data Structures and Algorithms (cs.DS); Computational Complexity (cs.CC)

Testing convexity of functions over finite domains

2019

We establish new upper and lower bounds on the number of queries required to test convexity of functions over various discrete domains.
1. We provide a simplified version of the non-adaptive convexity tester on the line. We re-prove the upper bound $O(\frac{\log(\epsilon n)}{\epsilon})$ in the usual uniform model, and prove an $O(\frac{\log n}{\epsilon})$ upper bound in the distribution-free setting.
2. We show a tight lower bound of $\Omega(\frac{\log(\epsilon n)}{\epsilon})$ queries for testing convexity of functions $f: [n] \rightarrow \mathbb{R}$ on the line. This lower bound applies to both adaptive and non-adaptive algorithms, and matches the upper bound from item 1, showing that adap…
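
As a reminder of the property being tested, a function on the line is convex exactly when its consecutive differences are non-decreasing; the snippet below is the trivial exact check that reads all n values, i.e. the benchmark that sublinear testers like those above try to beat. It is only a definitional illustration, not a tester from the paper.

```python
def is_convex_on_line(values):
    """Exact (non-sublinear) convexity check: consecutive differences must be non-decreasing."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return all(d1 <= d2 for d1, d2 in zip(diffs, diffs[1:]))

print(is_convex_on_line([5, 3, 2, 2, 3, 5]))   # True  (differences -2, -1, 0, 1, 2)
print(is_convex_on_line([0, 2, 1, 3]))         # False (differences 2, -1, 2)
```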

FOS: Computer and information sciences; Computer Science - Computational Complexity; Computational Complexity (cs.CC)

Sensitivity versus Certificate Complexity of Boolean Functions

2015

Sensitivity, block sensitivity and certificate complexity are basic complexity measures of Boolean functions. The famous sensitivity conjecture claims that sensitivity is polynomially related to block sensitivity. However, it has been notoriously hard to obtain upper bounds better than exponential. Since block sensitivity is known to be polynomially related to certificate complexity, proving this conjecture is equivalent to showing that certificate complexity is polynomially related to sensitivity. Previously, it has been shown that $bs(f) \leq C(f) \leq 2^{s(f)-1} s(f) - (s(f)-1)$. In this work, we give a better upper bound of $bs(f) \leq C(f) \leq \max\left(2^{s(f)-1}\left(s(f)-\frac 1 3\righ…
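
To make the third measure concrete, the snippet below computes certificate complexity by brute force from its definition (the smallest set of coordinates of an input that forces the function's value), feasible only for tiny n; it is a definitional illustration and has no connection to the construction behind the new bound.

```python
from itertools import combinations

def certificate_complexity(f, n):
    """C(f): max over inputs x of the smallest set of coordinates whose values in x
    already force f to equal f(x) on every consistent input."""
    def cert_size(x):
        for k in range(n + 1):
            for coords in combinations(range(n), k):
                if all(f(y) == f(x)
                       for y in range(1 << n)
                       if all((y >> i) & 1 == (x >> i) & 1 for i in coords)):
                    return k
        return n
    return max(cert_size(x) for x in range(1 << n))

or3 = lambda x: int(x != 0)
print(certificate_complexity(or3, 3))   # 3: the all-zero input needs every coordinate fixed
```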

FOS: Computer and information sciences; Computer Science - Computational Complexity; Computational Complexity (cs.CC)

Probabilistic verifiers for asymmetric debates

2012

We examine the power of silent constant-space probabilistic verifiers that watch asymmetric debates (where one side is unable to see some of the messages of the other) between two deterministic provers, and try to determine who is right. We prove that probabilistic verifiers outperform their deterministic counterparts as asymmetric debate checkers. It is shown that the membership problem for every language in NSPACE(s(n)) has a $2^{s(n)}$-time debate where one prover is completely blind to the other one, for polynomially bounded, space-constructible s(n). When partial information is allowed to be seen by the handicapped prover, the class of languages debatable in $2^{s(n)}$ time contains TIME(2^…

FOS: Computer and information sciences; Computer Science - Computational Complexity; Computational Complexity (cs.CC)