Search results for "Theoretical Computer Science"
Showing 10 of 1151 documents
epiModel: A system to build automatically systems of differential equations of compartmental type-epidemiological models
2011
In this paper we describe epiModel, a code developed in Mathematica that facilitates the construction of systems of differential equations corresponding to linear or quadratic compartmental epidemiological models whose characteristics are defined in text files following a simple syntax. It also supports obtaining the equations of models involving age and/or sex groups.
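The simplest instance of the compartmental systems such a tool generates is the classic SIR model. The sketch below is our own Python illustration (not epiModel's Mathematica output), integrating the quadratic S·I interaction with forward Euler; all parameter values are made up for illustration.

```python
# Illustrative SIR compartmental model (not epiModel output):
#   dS/dt = -beta*S*I,  dI/dt = beta*S*I - gamma*I,  dR/dt = gamma*I

def sir_step(s, i, r, beta, gamma, dt):
    """One forward-Euler step of the SIR system."""
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return s + ds * dt, i + di * dt, r + dr * dt

def simulate(s0=0.99, i0=0.01, r0=0.0, beta=0.3, gamma=0.1, dt=0.1, steps=1000):
    s, i, r = s0, i0, r0
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
    return s, i, r

s, i, r = simulate()
```

Because the compartment derivatives sum to zero, the total population is conserved up to floating-point rounding, which is a quick sanity check on any generated system of this type.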
Genetic Algorithms Applied to the Design of 3D Photonic Crystals
2011
We aim at determining the optimal configuration of photonic crystal structures capable of carrying out a certain optical task. Since an exhaustive search would require a prohibitive computational cost, in this work we show how genetic algorithms can be applied to reliably find an optimal topology of three-dimensional photonic crystals. The fitness, representing the performance of each candidate configuration, is calculated by means of finite element analysis. Different experiments are presented to illustrate the potential of this 3D design approach.
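The GA workflow described above can be sketched in a few lines. This is a generic skeleton, not the paper's implementation: the bit-string genome stands in for a 3D crystal topology, and a toy one-max fitness replaces the finite-element evaluation.

```python
import random

# Generic GA skeleton (toy fitness, not the paper's FEM evaluation):
# each genome is a bit-string encoding a candidate crystal topology.

def fitness(genome):
    # Stand-in for the finite-element performance evaluation: one-max.
    return sum(genome)

def evolve(n_bits=20, pop_size=30, generations=60, p_mut=0.02, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_bits)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (rng.random() < p_mut) for bit in child]  # bit-flip mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

In the paper's setting the expensive part is the fitness call (one FEM solve per candidate), which is why reducing the number of evaluated configurations matters more than the GA bookkeeping itself.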
On the power of inductive inference from good examples
1993
The usual information available in inductive inference for identifying an unknown recursive function f is the set of all input/output examples (x, f(x)), x ∈ N. In contrast to this approach, we show that it is considerably more powerful to work with finite sets of "good" examples, even when these good examples are required to be effectively computable. The influence of the underlying numberings, with respect to which the identification has to be realized, on the capabilities of inference from good examples is also investigated. It turns out that nonstandard numberings can be much more powerful than Gödel numberings.
3D high definition video coding on a GPU-based heterogeneous system
2013
H.264/MVC is a standard for supporting the sensation of 3D, based on coding from 2 (stereo) to N views. H.264/MVC adopts many coding options inherited from single-view H.264/AVC, and thus its complexity is even higher, mainly because the number of views to process is higher. In this manuscript, we aim at an efficient parallelization of the most computationally intensive video encoding module for stereo sequences, namely inter prediction, and its collaborative execution on a heterogeneous platform. The proposal is based on an efficient dynamic load balancing algorithm and on breaking encoding dependencies. Experimental results demonstrate the proposed algorithm's ability to reduce the…
Graph Clustering with Local Density-Cut
2018
In this paper, we introduce a new graph clustering algorithm, called Dcut. The basic idea is to view graph clustering as a local density-cut problem. To identify meaningful communities in a graph, a density-connected tree is first constructed in a local fashion. Building upon this intuitive local density-connected tree, Dcut allows partitioning a graph into multiple densely tight-knit clusters effectively and efficiently. We demonstrate that our method has several attractive benefits: (a) Dcut provides an intuitive criterion to evaluate the goodness of a graph clustering in a more precise way; (b) building upon the density-connected tree, Dcut allows identifying high-quality cl…
A loop-free two-close Gray-code algorithm for listing k-ary Dyck words
2006
P. Chase and F. Ruskey each published a Gray code for length n binary strings with m occurrences of 1, coding m-combinations of n objects, which is two-close—that is, in passing from one binary string to its successor a single 1 exchanges positions with a 0 which is either adjacent to the 1 or separated from it by a single 0. If we impose the restriction that any suffix of a string contains at least k−1 times as many 0's as 1's, we obtain k-suffixes: suffixes of k-ary Dyck words. Combinations are retrieved as a special case by setting k=1, and k-ary Dyck words are retrieved as a special case by imposing the additional condition that the entire string has exactly k−1 times as many 0's a…
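The suffix restriction above is easy to state as a predicate. The helper below is our own sketch (not the paper's loop-free listing algorithm): it tests the k-suffix property by scanning from the right, and combines it with the global count condition to recognize full k-ary Dyck words.

```python
# Sketch (our own helper, not the paper's algorithm): recognize k-suffixes
# and k-ary Dyck words over the alphabet {'0', '1'}.

def is_k_suffix(word, k):
    """Every suffix must contain at least (k-1) times as many 0's as 1's."""
    zeros = ones = 0
    for bit in reversed(word):   # growing suffixes, scanned from the right
        if bit == '0':
            zeros += 1
        else:
            ones += 1
        if zeros < (k - 1) * ones:
            return False
    return True

def is_k_ary_dyck(word, k):
    """k-suffix property plus exactly (k-1) zeros for every one overall."""
    return is_k_suffix(word, k) and word.count('0') == (k - 1) * word.count('1')
```

For k=2 this reduces to the familiar balanced-sequence condition (reading 1 as "open" and 0 as "close"), and for k=1 the suffix condition is vacuous, matching the combinations special case mentioned in the abstract.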
MuLiMs-MCoMPAs: A Novel Multiplatform Framework to Compute Tensor Algebra-Based Three-Dimensional Protein Descriptors
2019
This report introduces the MuLiMs-MCoMPAs software (acronym for Multi-Linear Maps based on N-Metric and Contact Matrices of 3D Protein and Amino-acid weightings), designed to compute tensor-based 3D protein structural descriptors by applying two- and three-linear algebraic forms. Moreover, these descriptors incorporate generalized components such as novel 3D protein structural representations, (dis)similarity metrics and multimetrics to extract geometry-related information between two and three amino acids, weighting schemes based on amino acid properties, matrix normalization procedures that consider simple-stochastic and mutual probability transformations, topological and geometrical…
Information dynamics: Temporal behavior of uncertainty measures
2008
We carry out a systematic study of uncertainty measures that are generic to dynamical processes of varied origins, provided they induce suitable continuous probability distributions. The major technical tools are information-theoretic methods and the inequalities satisfied by the Fisher and Shannon information measures. We focus on the compatibility of these inequalities with the prescribed (deterministic, random or quantum) temporal behavior of the pertinent probability densities.
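A standard example of the kind of Fisher–Shannon inequality referred to here is Stam's inequality, F · N ≥ 1, where N = exp(2H)/(2πe) is the entropy power of a 1D density with Shannon entropy H and Fisher information F, with equality for Gaussians. The sketch below (our own illustration, not taken from the paper) verifies the saturation analytically for a Gaussian of width σ.

```python
import math

# For a 1D Gaussian: H = 0.5*ln(2*pi*e*sigma^2), F = 1/sigma^2.
# Stam's inequality F * exp(2H)/(2*pi*e) >= 1 holds with equality.

def gaussian_entropy(sigma):
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def gaussian_fisher(sigma):
    return 1.0 / sigma**2

def stam_product(sigma):
    h = gaussian_entropy(sigma)
    n = math.exp(2 * h) / (2 * math.pi * math.e)   # entropy power
    return gaussian_fisher(sigma) * n
```

Tracking how products like this evolve in time for non-Gaussian densities is one concrete way such compatibility questions can be posed.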
Stit Frames as Action Systems
2015
Stit semantics gives an account of action from a certain perspective: actions are seen not as operations performed in action systems and yielding new states of affairs, but rather as selections of preexistent trajectories of the system in time. The main problems of stit semantics are recapitulated, and the interrelations between stit semantics and the approach based on ordered action systems are discussed more fully.
On the impact of forgetting on learning machines
1995
People tend not to have perfect memories when it comes to learning, or to anything else for that matter. Most formal studies of learning, however, assume a perfect memory. Some approaches have restricted the number of items that could be retained. We introduce a complexity theoretic accounting of memory utilization by learning machines. In our new model, memory is measured in bits as a function of the size of the input. There is a hierarchy of learnability based on increasing memory allotment. The lower bound results are proved using an unusual combination of pumping and mutual recursion theorem arguments. For technical reasons, it was necessary to consider two types of memory: long and sh…