Search results for "Software"
Showing 10 of 7,396 documents
Towards a method to generate GUI prototypes from BPMN
2018
Business Process Model and Notation (BPMN) provides organizations with a standard that facilitates comprehension of business processes. BPMN focuses on the functional processes, leaving interface development aside; as a result, interface design usually depends on the subjective experience of the analyst. This article proposes a new method to generate user interfaces from BPMN models and class diagrams. The proposed method is based on the identification of different rules and uses stereotypes to extend the BPMN notation. The rules have been extracted from seven existing projects in the Bizagi repository. Specifically, the proposal is based on the extraction of ru…
Modeling user preferences in content-based image retrieval: A novel attempt to bridge the semantic gap
2015
This paper is concerned with content-based image retrieval from a stochastic point of view. The semantic gap problem is addressed in two ways. First, a dimensional reduction is applied using the (pre-calculated) distances among images. The dimension of the reduced vector is the number of preference levels that we allow the user to choose from (three, in this case). Second, the conditional probability distribution of the random user preference, given this reduced feature vector, is modeled using a proportional odds model. A new model is fitted at each iteration. The score used to rank the image database is based on the estimated probability function of the random preference. Additionally, so…
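The proportional-odds ranking described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cut-points, coefficients, and example feature vectors below are assumed values (the paper refits the model at each feedback iteration), and the score is taken here as the expected preference level.

```python
import math

# Hypothetical fitted parameters of a proportional odds model with
# three ordered preference levels (1 < 2 < 3): two cut-points and
# one coefficient per reduced feature (all values assumed).
THETA = [-0.5, 1.0]       # cut-points theta_1 < theta_2
BETA = [0.8, -0.3, 0.5]   # one coefficient per reduced feature

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def preference_probs(x):
    """P(pref <= k) = sigmoid(theta_k - beta . x); successive
    differences of the cumulative probabilities give P(pref = k)."""
    eta = sum(b * xi for b, xi in zip(BETA, x))
    cum = [sigmoid(t - eta) for t in THETA] + [1.0]
    return [cum[0]] + [cum[k] - cum[k - 1] for k in range(1, 3)]

def score(x):
    """Expected preference level, used here to rank the image database."""
    return sum((k + 1) * p for k, p in enumerate(preference_probs(x)))

# Two hypothetical images with pre-computed reduced feature vectors.
images = {"img_a": [0.2, 0.1, 0.0], "img_b": [1.0, 0.0, 0.9]}
ranked = sorted(images, key=lambda name: score(images[name]), reverse=True)
```

In this sketch a larger linear predictor shifts probability mass toward the higher preference levels, so `img_b` ranks above `img_a`.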
A principled approach to network-based classification and data representation
2013
Measures of similarity are fundamental in pattern recognition and data mining. Typically the Euclidean metric is used in this context, weighting all variables equally and therefore assuming equal relevance, an assumption that rarely holds in real applications. In contrast, given an estimate of a conditional density function, the Fisher information calculated in primary data space implicitly measures the relevance of variables in a principled way by reference to auxiliary data such as class labels. This paper proposes a framework that uses a distance metric based on Fisher information to construct similarity networks that achieve a more informative and principled representation of data. The framework ena…
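The idea of a Fisher-information-based distance can be sketched for a two-class logistic model. This is a simplified illustration under assumed parameters, not the paper's framework: the conditional model, its weights, and the first-order midpoint approximation of the distance are all choices made here for brevity.

```python
import numpy as np

# Assumed conditional model: p(c=1|x) = sigmoid(w.x + b), with fixed
# (not fitted) parameters chosen for illustration.
w = np.array([1.5, -0.5])
b = 0.1

def p1(x):
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def fisher_matrix(x):
    """G(x) = sum_c p(c|x) grad log p(c|x) grad log p(c|x)^T.
    For a logistic model this reduces to p(1-p) * w w^T."""
    p = p1(x)
    return p * (1 - p) * np.outer(w, w)

def local_fisher_distance(x, y):
    """First-order approximation: evaluate G at the midpoint of x and y."""
    g = fisher_matrix((x + y) / 2.0)
    d = x - y
    return float(np.sqrt(d @ g @ d))

# Pairwise distances, from which a similarity (e.g. kNN) network
# could be built as the abstract describes.
points = np.array([[0.0, 0.0], [0.1, 0.0], [2.0, 2.0]])
D = np.array([[local_fisher_distance(a, p) for p in points] for a in points])
```

Note how the metric stretches distances along directions where the class posterior changes quickly and shrinks them where it is flat, which is what makes the resulting network class-aware.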
Use of hierarchical Bayesian framework in MTS studies to model different causes and novel possible forms of acquired MTS
2015
An integrative account of MTS could be cast in terms of hierarchical Bayesian inference. It may help to highlight the central role that sensory (tactile) precision could play in MTS. We suggest that anosognosic patients with an anesthetic hemisoma can also be interpreted as a form of acquired MTS, providing additional data for the model.
Normalization 2.0: A longitudinal analysis of German online campaigns in the national elections 2002–9
2011
This article examines the functional, relational and discursive dimensions of the normalization thesis in one study, for both Web 1.0 and Web 2.0 features, in a longitudinal design. It is based on a quantitative content and structural analysis of German party websites in the national elections between 2002 and 2009. The results show that the normalization thesis holds true in all its dimensions over time and in the Web 2.0 era: parties still focus on the top-down elements of information provision and delivery while interactive options are scarce. The digital divide between parliamentary and non-parliamentary parties has narrowed over time, but remains visible for all online functions in 200…
Visualization of Memory Map Information in Embedded System Design
2018
Data compression is a common requirement when displaying large amounts of information, the goal being to reduce visual clutter. The approach given in this paper uses an analysis of a data set to construct a visual representation. The visualization is compressed using the address ranges of the memory structure. This method produces a compressed version of the initial visualization, retaining the same information as the original. The presented method has been implemented as a Memory Designer tool for ASIC, FPGA and embedded systems using IP-XACT. The Memory Designer is a user-friendly tool for model-based embedded system design, providing access to and adjustment of the memory layout from a single v…
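The address-range compression mentioned in this abstract can be illustrated with a toy memory map. This sketch is an assumption about the general idea, not the Memory Designer's algorithm: the region tuples, field names, and example map are invented for illustration.

```python
# Hypothetical memory map: (start, end, kind) per region.
regions = [
    (0x0000, 0x0FFF, "RAM"),
    (0x1000, 0x1FFF, "RAM"),
    (0x2000, 0x2FFF, "peripheral"),
    (0x3000, 0x3FFF, "RAM"),
]

def compress(regions):
    """Merge adjacent regions of the same kind into one address range,
    so the visualization draws one block instead of many."""
    merged = []
    for start, end, kind in regions:
        if merged and merged[-1][2] == kind and merged[-1][1] + 1 == start:
            merged[-1] = (merged[-1][0], end, kind)
        else:
            merged.append((start, end, kind))
    return merged
```

Here the two contiguous RAM regions collapse into a single 0x0000-0x1FFF block, while the non-adjacent RAM region at 0x3000 stays separate, preserving the information in the original map.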
Automating statistical diagrammatic representations with data characterization
2017
The search for an efficient method to enhance data cognition is especially important when managing data from multidimensional databases. Open data policies have dramatically increased not only the volume of data available to the public, but also the need to automate the translation of data into efficient graphical representations. Graphic automation involves producing an algorithm that necessarily contains inputs derived from the type of data. A set of rules is then applied to combine the input variables and produce a graphical representation. Automated systems, however, fail to provide an efficient graphical representation because they only consider either a one-dimensional characterizat…
Domain knowledge integration and semantical quality management - A biology case study
2008
The management of semantical quality is a major challenge in the context of knowledge integration. In this paper, we describe a new approach to constraint management that emphasizes constraint traceability when moving from the semantical level to the operational one. Our strategy for management of semantical quality is related to a metamodeling-based approach to knowledge integration. We carry out knowledge integration “on the fly” by using transformations applied to models belonging to our metamodeling architecture. The resulting integrated models access available resources through web services whose input and output parameters are guarded by constraints. Integrated…
Mesh Visual Quality based on the combination of convolutional neural networks
2019
Blind quality assessment is a challenging issue since the evaluation is done without access to the reference or to any information about the distortion. In this work, we propose an objective blind method for the visual quality assessment of 3D meshes. The method estimates the perceived visual quality using only information from the distorted mesh to feed pre-trained deep convolutional neural networks. The input data is prepared by rendering 2D views from the 3D mesh and the corresponding saliency map. The views are split into small patches of fixed size that are filtered using a saliency threshold. Only the salient patches are selected as input data. After that, three pre-trained deep convolu…
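The patch-selection step described in this abstract (split a rendered view into fixed-size patches, keep only the salient ones) can be sketched as follows. The patch size, threshold, and synthetic saliency map are assumed values for illustration, not the paper's settings.

```python
import numpy as np

PATCH = 8     # assumed patch size
THRESH = 0.5  # assumed saliency threshold

def salient_patches(view, saliency, patch=PATCH, thresh=THRESH):
    """view and saliency are 2D arrays of equal shape; return the
    non-overlapping patches whose mean saliency exceeds the threshold."""
    h, w = view.shape
    out = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            if saliency[i:i + patch, j:j + patch].mean() > thresh:
                out.append(view[i:i + patch, j:j + patch])
    return out

# Toy example: a 32x32 rendered view whose top half is marked salient.
rng = np.random.default_rng(0)
view = rng.random((32, 32))
saliency = np.zeros((32, 32))
saliency[:16, :] = 1.0
patches = salient_patches(view, saliency)
```

Only the patches in the salient top half survive the filter; these would then be fed to the pre-trained CNNs.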
Facilitating IP deployment in a MARTE-based MDE methodology using IP-XACT: a XILINX EDK case study
2012
In this paper we present a framework for the deployment of hardware IPs at high levels of abstraction. It is based on a model-driven approach that aims at the automatic generation of Dynamic Partial Reconfiguration designs created in Xilinx Platform Studio (XPS). Contrary to previous approaches, we make use of the IP-XACT standard to facilitate the deployment of hardware IPs, their parameterization and subsequent integration. We propose an extension to the MARTE profile for IP deployment, and we introduce the necessary model transformations to obtain a high-level representation from an IP-XACT component library. These models are then used to create a platform in MART…