Search results for "Analysis of algorithm"
Showing 9 of 29 documents
Quantum query algorithms for certain functions and general algorithm construction techniques
2007
Quantum algorithms can be analyzed in the query model, where the input to a Boolean function is given in a black box and the aim is to compute the function's value for an arbitrary input using as few queries as possible. In this paper we concentrate on the design of quantum query algorithms. The main aim of our research was to find new efficient algorithms and to develop general algorithm design techniques. We present several exact quantum query algorithms for certain problems that are better than their classical counterparts. Next we introduce algorithm transformation methods that significantly enlarge the set of exactly computable functions. Finally, we propose quantum algorithm design methods. G…
Probabilistic analysis of truss structures with uncertain parameters (virtual distortion method approach)
2004
A new approach is presented for the probabilistic characterization of linear elastic redundant trusses with uncertainty in the various members, subjected to deterministic loads acting on the nodes of the structure. The method is based on the simple observation that variations of structural parameters are equivalent to strains superimposed on a reference structure, depending on the axial forces and the elastic modulus of the original structure as well as on the uncertainty (the virtual distortion method approach). The superposition principle may then be applied to separate the contributions to the mechanical response due to external loads and to parameter variations. Statically determinate trusses dealt with the proposed m…
A probabilistic condensed representation of data for stream mining
2014
Data mining and machine learning algorithms usually operate directly on the data. However, if the data is not available all at once, or consists of billions of instances, these algorithms easily become infeasible with respect to memory and run-time. As a solution to this problem, we propose a framework, called MiDEO (Mining Density Estimates inferred Online), in which algorithms are designed to operate on a condensed representation of the data. In particular, we propose to use density estimates, which are able to represent billions of instances in a compact form and can be updated when new instances arrive. As an example of an algorithm that operates on density estimates, we consider t…
Communication complexity in a 3-computer model
1996
It is proved that the probabilistic communication complexity of the identity function in a 3-computer model is O(√n).
Algorithmics for the Life Sciences
2013
The life sciences, in particular molecular biology and medicine, have witnessed fundamental progress since the discovery of the “Double Helix”. A relevant part of this incredible advancement in knowledge has been possible thanks to synergies with the mathematical sciences, on the one hand, and computer science, on the other. Here we review some of the most relevant aspects of this cooperation, focusing on contributions made by the design, analysis and engineering of fast algorithms for the life sciences.
How to simulate free will in a computational device
1999
Since we believe that the human brain is not a purely deterministic device merely reacting to its environment, but is rather capable of free will, Theoretical Computer Science has also tried to develop a system of notions generalizing determinism. Nondeterministic and probabilistic algorithms were the first such generalizations. Nondeterministic machines constitute an important part of the Theory of Computation. Nondeterminism is a useful way to describe possible choices. In real life there are many regulations restricting our behavior. These regulations nearly always leave us some freedom in how to react. Such regulations are best described in terms of nondeterministic algorithms. Nondetermini…
Decremental 2- and 3-connectivity on planar graphs
1996
We study the problem of maintaining the 2-edge-, 2-vertex-, and 3-edge-connected components of a dynamic planar graph subject to edge deletions. The 2-edge-connected components can be maintained in a total of O(n log n) time under any sequence of at most O(n) deletions. This gives O(log n) amortized time per deletion. The 2-vertex- and 3-edge-connected components can be maintained in a total of O(n log² n) time. This gives O(log² n) amortized time per deletion. The space required by all our data structures is O(n). All our time bounds improve previous bounds.
CN2-R: Faster CN2 with randomly generated complexes
2011
Among rule induction algorithms, the classic CN2 is still one of the most popular; the great number of enhancements and improvements proposed for it witnesses this. Despite the growth in computing capacity since the algorithm was proposed, resource demand remains one of its main issues. The proposed modification, CN2-R, substitutes the star concept of the original algorithm with a technique of randomly generated complexes, in order to substantially improve running times without significant loss in accuracy.
Probabilistic Logic under Coherence: Complexity and Algorithms
2005
In previous work [V. Biazzo, A. Gilio, T. Lukasiewicz and G. Sanfilippo, Probabilistic logic under coherence, model-theoretic probabilistic logic, and default reasoning in System P, Journal of Applied Non-Classical Logics 12(2) (2002) 189–213], we have explored the relationship between probabilistic reasoning under coherence and model-theoretic probabilistic reasoning. In particular, we have shown that the notions of g-coherence and of g-coherent entailment in probabilistic reasoning under coherence can be expressed by combining notions in model-theoretic probabilistic reasoning with concepts from default reasoning. In this paper, we continue this line of research. Based on the above sem…