Search results for "Analysis of algorithms"
Showing 9 of 29 documents
How to simulate free will in a computational device
1999
Since we believe that the human brain is not a purely deterministic device merely reacting to the environment, but rather is capable of free will, Theoretical Computer Science has also tried to develop a system of notions generalizing determinism. Nondeterministic and probabilistic algorithms were the first such generalizations. Nondeterministic machines constitute an important part of the Theory of Computation. Nondeterminism is a useful way to describe possible choices. In real life there are many regulations restricting our behavior. These regulations nearly always leave us some freedom in how to react. Such regulations are best described in terms of nondeterministic algorithms. Nondetermini…
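To make the contrast concrete (an editorial sketch, not from the paper, with made-up states and actions): a regulation that only lists the permitted actions can be read nondeterministically, by enumerating every allowed behaviour, or probabilistically, by resolving each free choice at random.

```python
import random

# Hypothetical "regulation": for each state (named after the last action taken),
# it lists the permitted next actions but does not say which one to take.
PERMITTED = {
    "start": ["walk", "wait"],
    "walk":  ["walk", "stop"],
    "wait":  ["walk"],
    "stop":  [],
}

def nondeterministic_runs(state, depth):
    """Nondeterministic reading: enumerate every behaviour the regulation allows."""
    if depth == 0 or not PERMITTED[state]:
        yield [state]
        return
    for action in PERMITTED[state]:
        for rest in nondeterministic_runs(action, depth - 1):
            yield [state] + rest

def probabilistic_run(state, depth):
    """Probabilistic reading: resolve each free choice by a coin flip."""
    run = [state]
    for _ in range(depth):
        options = PERMITTED[run[-1]]
        if not options:
            break
        run.append(random.choice(options))
    return run

print(list(nondeterministic_runs("start", 2)))
print(probabilistic_run("start", 2))
```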
Centering and Compound Conditionals under Coherence
2016
There is wide support in logic, philosophy, and psychology for the hypothesis that the probability of the indicative conditional of natural language, \(P(\textit{if } A \textit{ then } B)\), is the conditional probability of B given A, P(B|A). We identify a conditional which is such that \(P(\textit{if } A \textit{ then } B)= P(B|A)\) with de Finetti’s conditional event, B|A. An objection to making this identification in the past was that it appeared unclear how to form compounds and iterations of conditional events. In this paper, we illustrate how to overcome this objection with a probabilistic analysis, based on coherence, of these compounds and iterations. We interpret the compounds a…
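For readers who have not seen de Finetti's conditional event, a brief recap of the three-valued object the abstract identifies with "if A then B" (notation ours, not taken from the paper):

```latex
% De Finetti's conditional event B|A is a three-valued object:
\[
  B \mid A \;=\;
  \begin{cases}
    \text{true}  & \text{if } A \wedge B \text{ holds},\\
    \text{false} & \text{if } A \wedge \neg B \text{ holds},\\
    \text{void}  & \text{if } \neg A \text{ holds}.
  \end{cases}
\]
% In the coherence-based approach, the void case is evaluated by the
% conditional probability itself, which yields P(if A then B) = P(B|A).
```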
Communication complexity in a 3-computer model
1996
It is proved that the probabilistic communication complexity of the identity function in a 3-computer model is O(√n).
On the Greedy Algorithm for the Shortest Common Superstring Problem with Reversals
2015
We study a variation of the classical Shortest Common Superstring (SCS) problem in which a shortest superstring of a finite set of strings $S$ is sought containing as a factor every string of $S$ or its reversal. We call this problem Shortest Common Superstring with Reversals (SCS-R). This problem was introduced by Jiang et al., who designed a greedy-like algorithm with length approximation ratio $4$. In this paper, we show that a natural adaptation of the classical greedy algorithm for SCS has (optimal) compression ratio $\frac12$, i.e., the sum of the overlaps in the output string is at least half the sum of the overlaps in an optimal solution. We also provide a linear-time implement…
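A toy, quadratic-time sketch of this natural greedy adaptation (ours; the paper's contribution is the ratio-1/2 analysis and a linear-time implementation): repeatedly merge the pair of remaining strings with the largest overlap, allowing either orientation of each piece. Since reversing a merged string also reverses every string it contains, each input still occurs in the output either as given or reversed.

```python
from itertools import permutations

def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_scs_r(strings):
    """Greedy superstring with reversals (toy version): merge the pair with the
    largest overlap, trying both orientations of both pieces, until one string
    remains.  Each input string, or its reversal, is a factor of the result."""
    pool = list(strings)
    while len(pool) > 1:
        best = None  # (overlap length, merged string, index i, index j)
        for i, j in permutations(range(len(pool)), 2):
            for a in (pool[i], pool[i][::-1]):
                for b in (pool[j], pool[j][::-1]):
                    k = overlap(a, b)
                    if best is None or k > best[0]:
                        best = (k, a + b[k:], i, j)
        _, merged, i, j = best
        pool = [s for idx, s in enumerate(pool) if idx not in (i, j)] + [merged]
    return pool[0]

print(greedy_scs_r(["abc", "bcd", "fed"]))  # -> "fedcba": each input occurs as given or reversed
```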
Decremental 2- and 3-connectivity on planar graphs
1996
We study the problem of maintaining the 2-edge-, 2-vertex-, and 3-edge-connected components of a dynamic planar graph subject to edge deletions. The 2-edge-connected components can be maintained in a total of O(n log n) time under any sequence of at most O(n) deletions. This gives O(log n) amortized time per deletion. The 2-vertex- and 3-edge-connected components can be maintained in a total of O(n log² n) time. This gives O(log² n) amortized time per deletion. The space required by all our data structures is O(n). All our time bounds improve previous bounds.
Case-studies on average-case analysis for an elementary course on algorithms
1999
Average-case algorithm analysis is usually viewed as a tough subject by students in their first courses in computer science. Traditionally, these topics are fully developed in advanced courses with a clear mathematical orientation. The work presented here is not an alternative to this; rather, it presents the analysis of algorithms (and average-case analysis in particular) adapted to the mathematical background of students in an elementary course on algorithms or programming, using two selected case studies.
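One classroom-style case study of the kind the abstract describes (our own toy example, not necessarily one of the paper's two): compare the analytical average number of comparisons of a successful linear search, (n + 1)/2 when the target is equally likely to sit at any position, against a simulation.

```python
import random

def linear_search_comparisons(xs, target):
    """Count the comparisons a plain linear search makes until it finds target."""
    comparisons = 0
    for x in xs:
        comparisons += 1
        if x == target:
            break
    return comparisons

def empirical_average(n, trials=100_000):
    """Average comparisons when the target is uniformly one of the n elements."""
    xs = list(range(n))
    return sum(linear_search_comparisons(xs, random.choice(xs))
               for _ in range(trials)) / trials

n = 20
print("empirical :", empirical_average(n))  # close to 10.5
print("analytical:", (n + 1) / 2)           # (n + 1) / 2 = 10.5
```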
Complexity of probabilistic versus deterministic automata
2005
Three-circle method in the investigations of shapes of gas bubble clusters in two-phase flow
1991
We present an attempt at formulating a quantitative criterion for the division into homogeneous and heterogeneous flow patterns, based on a probabilistic analysis of the gas bubble distribution in the liquid.
Frequency Prediction of Functions
2012
Prediction of functions is one of the processes considered in inductive inference. There is a "black box" with a given total function f in it. The result of the inductive inference machine F(f(0), f(1), …, f(n)) is expected to be f(n+1). Deterministic and probabilistic prediction of functions has been widely studied. Frequency computation is a mechanism used to combine features of deterministic and probabilistic algorithms. Frequency computation has been used for several types of inductive inference, especially for learning via queries. We study frequency prediction of functions and show that there exists an interesting hierarchy of predictable classes of functions.
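As a small, standard illustration of deterministic next-value prediction (our example, not from the paper): the class of integer polynomials can be predicted by extending the finite-difference table of the observed values f(0), …, f(n), once n exceeds the (unknown) degree.

```python
def predict_next(values):
    """Predict f(n+1) from the observed values f(0), ..., f(n), assuming f is a
    polynomial: build the finite-difference table and extend each row by one."""
    table = [list(values)]
    while len(table[-1]) > 1 and any(table[-1]):
        row = table[-1]
        table.append([b - a for a, b in zip(row, row[1:])])
    table[-1].append(table[-1][-1])          # bottom row: repeat the last value
    for upper, lower in zip(reversed(table[:-1]), reversed(table[1:])):
        upper.append(upper[-1] + lower[-1])  # extend each row bottom-up
    return table[0][-1]

# f(x) = x^2 + 1 observed on 0..4 gives 1, 2, 5, 10, 17 -> predicts f(5) = 26
print(predict_next([1, 2, 5, 10, 17]))
```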