Search results for "Theoretical Computer Science"
Showing 10 of 1151 documents
MAD+. Introducing Misconceptions in the Temporal Analysis of the Mathematical Modelling Process of a Fermi Problem
2021
This work describes how combining the mistakes committed by a group of pre-service teachers when solving a Fermi problem with a representation of the temporal analysis of their resolutions can offer more in-depth information about their conceptual misconceptions regarding mathematical and modelling concepts. The combined representation shows when mistakes occur and provides a powerful tool for instructors to adapt the teaching–learning processes of mathematics at all levels of education. Our study is based on a recent categorisation of students’ mistakes, together with the creation of a new representation tool, called MAD+, that combines all this information. The ma…
Diffusive neural network
2002
A non-connectionist model of a neuronal network based on passive diffusion of neurotransmitters is presented as an alternative to hard-wired artificial neural networks. A classical thermodynamic approach shows that the diffusive network is capable of exhibiting asymptotic stability and dynamics resembling those of a chaotic system. Basic computational capabilities of the net are discussed on the basis of its equivalence with a Turing machine. The model offers a way to represent mass-sustained brain functions in terms of recurrent behaviors in phase space.
Statistical analysis of RaptorQ failure probability applied to a data recovery software
2014
In this work, we have implemented data recovery software integrating the most recent rateless codes, i.e., RaptorQ codes. Using this software, it is possible to recover data losses occurring under several kinds of network conditions. We have performed a statistical analysis of failure probabilities for several configurations of the RaptorQ parameters. We have found good agreement with the theoretical values of a random linear fountain code over the Galois field GF(256). Moreover, we have shown that the probability of having a certain number of failed decoded source blocks, when sending a fixed-size file, follows a Poisson distribution.
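To make the quoted agreement concrete, a minimal sketch follows; it is not the paper's software, and the values of k, overhead, q and the block count B are illustrative, not the study's configurations. It computes the theoretical decoding-failure probability of a random linear fountain code over GF(q) and the Poisson approximation for the number of failed source blocks in a fixed-size file.

```python
# Sketch only: theoretical failure probability of a random linear fountain code over GF(q),
# plus the Poisson approximation for the count of failed source blocks in a multi-block file.
# All numeric parameters below are hypothetical, chosen for illustration.
from math import exp, factorial

def fountain_failure_prob(k: int, overhead: int, q: int = 256) -> float:
    """P(decoding fails) = P(a random k x (k+overhead) matrix over GF(q) has rank < k)."""
    n = k + overhead
    p_success = 1.0
    for i in range(k):
        p_success *= 1.0 - float(q) ** (i - n)   # i-th row independent of the previous ones
    return 1.0 - p_success

def poisson_pmf(lam: float, j: int) -> float:
    """P(exactly j failed blocks) under the Poisson approximation with mean lam."""
    return exp(-lam) * lam ** j / factorial(j)

if __name__ == "__main__":
    p = fountain_failure_prob(k=1000, overhead=2)   # per-block failure probability
    B = 500                                         # source blocks in the file (illustrative)
    lam = B * p                                     # expected number of failed blocks
    print(f"per-block failure prob ~ {p:.3e}")
    for j in range(3):
        print(f"P({j} failed blocks) ~ {poisson_pmf(lam, j):.6f}")
```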
Multiple SIP strategies and bottom-up adorning in logic query optimization
1990
Preprocessing methods called “readorning” and “bottom-up adorning” are introduced as means of enlarging the application domain of magic sets and related query optimization strategies for logic databases. Readorning tries to make possible the simultaneous use of multiple sideways information passing (sip) strategies defined for a rule, thus yielding an optimization effect that may not be achieved by any single choice of sip strategy. Bottom-up adorning makes magic sets applicable to cases in which potential optimizations can be derived from bindings passed upwards from rule bodies to rule heads during bottom-up evaluation. These include the cases in which we know that some base r…
Representation theory treatment of measurement semantics for ratio, ordinal and nominal scales
1997
Within the scope of the representational theory, a formal framework for describing the semantic aspects of measurement on different scales is proposed. This is done by means of a first-order formal logical system consisting of a set of empirical predicates, which play the part of a data structure in the framework; a set of operations by means of which syntactically correct statements can be formed; a set of axioms, which are true statements; and a set of numerical statements, which is an aggregation of potential measurement results carrying a meaningful load. On this basis the notion of semantic information on various scales is introduced and some common claims about the measurement semantic infor…
A Representation of Relational Systems
2003
In this paper elements of a theory of multistructures are formulated. The theory of multistructures is used to define a binary representation of relational systems.
The Burrows-Wheeler Transform between Data Compression and Combinatorics on Words
2013
The Burrows-Wheeler Transform (BWT) is a tool of fundamental importance in Data Compression and, recently, has found many applications well beyond its original purpose. The main goal of this paper is to highlight the mathematical and combinatorial properties on which the outstanding versatility of the BWT is based, i.e. its reversibility and the clustering effect on the output. Such properties have aroused curiosity and fervent interest in the scientific world, both for their theoretical aspects and for their practical effects. In particular, in this paper we are interested both in surveying the theoretical research issues which, taking their cue from Data Compression, have been developed in the conte…
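As an illustration of the two properties the abstract highlights, here is a minimal, didactic Python sketch (not taken from the paper) of the transform and its inverse. It uses a naive sorted-rotations construction and assumes the sentinel '$' does not occur in the input text; practical implementations build the BWT from a suffix array instead.

```python
# Didactic BWT and inverse BWT; O(n^2 log n) construction, for illustration only.

def bwt(s: str) -> str:
    s += "$"  # unique end-of-string sentinel (assumed absent from s)
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)   # last column of the sorted rotations

def inverse_bwt(last: str) -> str:
    # Repeatedly prepend the last column and re-sort; after len(last) rounds the table
    # holds all sorted rotations, and the row ending in '$' is the original text.
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    row = next(r for r in table if r.endswith("$"))
    return row.rstrip("$")

if __name__ == "__main__":
    text = "banana"
    transformed = bwt(text)          # 'annb$aa'
    assert inverse_bwt(transformed) == text
    print(transformed)
```

Even on this toy input the output "annb$aa" groups the two n's and two of the a's together, a small instance of the clustering effect that compressors built on the BWT exploit.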
Correlation Analysis of Node and Edge Centrality Measures in Artificial Complex Networks
2021
The role of an actor in a social network is identified through a set of measures called centrality. Degree centrality, betweenness centrality, closeness centrality, and the clustering coefficient are the most frequently used metrics to compute node centrality. In some cases, their computational complexity makes computing them unfeasible, if not practically impossible. For this reason, we focused on two alternative measures, WERW-Kpath and Game of Thieves, which are at the same time highly descriptive and computationally affordable. Our experiments show that a strong correlation exists between WERW-Kpath and Game of Thieves and the classical centrality measures. This may suggest the po…
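A minimal sketch of this kind of correlation analysis, assuming networkx and scipy are available; WERW-Kpath and Game of Thieves themselves are not implemented here, only the four classical node measures on an artificial Erdős–Rényi graph and the Spearman correlations between them.

```python
# Sketch: classical node measures on an artificial network and their pairwise
# Spearman rank correlations. Graph size and edge probability are illustrative.
import networkx as nx
from scipy.stats import spearmanr

G = nx.erdos_renyi_graph(n=500, p=0.02, seed=42)

measures = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "clustering": nx.clustering(G),
}

nodes = sorted(G.nodes())
names = list(measures)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        rho, _ = spearmanr([measures[a][v] for v in nodes],
                           [measures[b][v] for v in nodes])
        print(f"Spearman({a}, {b}) = {rho:.3f}")
```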
Game of Thieves and WERW-Kpath: Two Novel Measures of Node and Edge Centrality for Mafia Networks
2021
Real-world complex systems can be modeled as homogeneous or heterogeneous graphs composed of nodes connected by edges. The importance of nodes and edges is formally described by a set of measures called centralities, which are typically studied for graphs of small size. The proliferation of digitally collected data has led to huge graphs with billions of nodes and edges. For this reason, we focus on two new algorithms, Game of Thieves and WERW-Kpath, which are computationally light alternatives to the canonical centrality measures such as degree, node and edge betweenness, closeness, and clustering. We explore the correlation among these measures using Spearman’s correlation coefficient …
Algorithmics for the Life Sciences
2013
The life sciences, in particular molecular biology and medicine, have witnessed fundamental progress since the discovery of the “Double Helix”. A relevant part of such an incredible advancement in knowledge has been possible thanks to synergies with the mathematical sciences, on the one hand, and computer science, on the other. Here we review some of the most relevant aspects of this cooperation, focusing on contributions given by the design, analysis, and engineering of fast algorithms for the life sciences.