Search results for "computer.software_genre"
Showing 10 of 3858 documents
A novel dynamic multi-model relevance feedback procedure for content-based image retrieval
2016
This paper deals with the problem of image retrieval in large databases with a large semantic gap through a relevance feedback procedure. We present a novel algorithm for modelling the user's preferences in a content-based image retrieval system. The proposed algorithm considers the probability of an image belonging to the set of those sought by the user, and estimates the parameters of several local logistic regression models whose inputs are the low-level image features. Principal Component Analysis is applied to the original feature vector to reduce its high dimensionality. The relevance probabilities predicted by these local models are combined by means of a weighted average. These weights …
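The abstract outlines a pipeline of PCA reduction, several local logistic regression models, and a weighted average of their relevance probabilities. A minimal NumPy sketch on synthetic data, where the sizes, the combination weights, and the subset-based notion of "local" models are illustrative assumptions rather than the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_reduce(X, k):
    """Project X onto its top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def fit_logistic(X, y, iters=500, lr=0.1):
    """Plain gradient-descent logistic regression (weights incl. bias)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# Toy data: 200 images x 50 low-level features, plus relevance feedback labels.
X = rng.normal(size=(200, 50))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

Z = pca_reduce(X, k=5)  # dimensionality reduction before modelling

# Several "local" models, here simply trained on random subsets of the feedback.
models = [fit_logistic(Z[idx], y[idx])
          for idx in (rng.choice(200, 120, replace=False) for _ in range(3))]

# Combine the local relevance probabilities with a weighted average.
weights = np.array([0.5, 0.3, 0.2])   # illustrative weights summing to 1
probs = np.stack([predict_proba(Z, w) for w in models])
relevance = weights @ probs           # per-image relevance probability

ranking = np.argsort(-relevance)      # most-relevant images first
print(relevance.shape)                # → (200,)
```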
Project thesaurus 2020 — Linguistic and ontological aspects
2011
Structures and linguistic concepts of thesauri are analyzed and compared. Proposals for the improvement of thesauri are developed.
Evaluation of an Educational Video Production Environment
2019
In this paper, a cloud-based solution for the easy production and meaningful distribution of video-based learning material is presented and evaluated. Using this system, lecturers can produce video material with their own computers anywhere, at any time. The system enables the production of longer lecture videos as well as short videos, each containing only one of the course's topics. The system also handles video sharing for students automatically in a meaningful way through the course's table of contents. An evaluation of the first production version of the video production and distribution system was carried out by collecting qualitative material from the lecturers and students.
Identification of Clusters of Investors from Their Real Trading Activity in a Financial Market
2011
We use statistically validated networks, a recently introduced method for validating links in a bipartite system, to identify clusters of investors trading in a financial market. Specifically, we investigate a special database that allows us to track the trading activity of individual investors in the Nokia stock. We find that many statistically detected clusters of investors show a very high degree of synchronization in the time at which they decide to trade and in the trading action taken. We investigate the composition of these clusters and find that several of them show an over-expression of specific categories of investors.
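The link-validation idea behind statistically validated networks can be sketched as a right-tail hypergeometric test on the overlap of two investors' active trading days; the traders, day sets, and p-value threshold below are illustrative assumptions, not data from the study:

```python
from math import comb

def validated_link(days_a, days_b, total_days, p_threshold=0.01):
    """Right-tail hypergeometric test: is the number of common active
    days larger than expected for two independently trading investors?"""
    overlap = len(days_a & days_b)
    n_a, n_b = len(days_a), len(days_b)
    # P(X >= overlap) when n_a active days are drawn from total_days,
    # of which n_b are the other investor's active days.
    p = sum(comb(n_b, k) * comb(total_days - n_b, n_a - k)
            for k in range(overlap, min(n_a, n_b) + 1)) / comb(total_days, n_a)
    return p < p_threshold

trader1 = {1, 2, 3, 5, 8, 9}
trader2 = {1, 2, 3, 5, 9, 10}   # trades on many of the same days as trader1
trader3 = {4, 6, 7, 11, 12}     # activity disjoint from trader1

print(validated_link(trader1, trader2, total_days=250))  # → True
print(validated_link(trader1, trader3, total_days=250))  # → False
```

Only the links that survive this test are kept, and clusters are then read off the resulting validated network.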
Inter-Model Consistency and Complementarity: Learning from ex-vivo Imaging and Electrophysiological Data towards an Integrated Understanding of Cardi…
2011
Computational models of the heart at various scales and levels of complexity have been independently developed, parameterised and validated using a wide range of experimental data for over four decades. However, despite remarkable progress, the lack of coordinated efforts to compare and combine these computational models has limited their impact on the numerous open questions in cardiac physiology. To address this issue, a comprehensive dataset has previously been made available to the community that contains the cardiac anatomy and fibre orientations from magnetic resonance imaging as well as epicardial transmembrane potentials from optical mapping measured on a per…
Comparison of basis functions for 3D PET reconstruction using a Monte Carlo system matrix.
2012
In emission tomography, iterative statistical methods are accepted as the reconstruction algorithms that achieve the best image quality. The accuracy of these methods relies partly on the quality of the system response matrix (SRM) that characterizes the scanner. The more physical phenomena included in the SRM, the higher the SRM quality, and therefore the higher the image quality obtained from the reconstruction process. High-resolution small-animal scanners contain as many as 10³–10⁴ small crystal pairs, while the field of view (FOV) is divided into hundreds of thousands of small voxels. These two characteristics have a significant impact on the number of elements to be calculated in the SRM. …
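To make the role of the SRM concrete, here is a toy MLEM (maximum-likelihood expectation-maximization) reconstruction with an explicit system matrix; the random matrix below merely stands in for the Monte Carlo-estimated SRM the paper studies, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n_lors, n_voxels = 40, 16              # toy detector pairs x image voxels
A = rng.random((n_lors, n_voxels))     # system response matrix (SRM)

x_true = rng.random(n_voxels)          # "true" activity distribution
y = A @ x_true                         # noiseless projection data

x = np.ones(n_voxels)                  # MLEM starts from a uniform image
sens = A.sum(axis=0)                   # per-voxel sensitivity (column sums)
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / sens  # multiplicative MLEM update

# Relative data-fit residual after iterating.
residual = np.linalg.norm(A @ x - y) / np.linalg.norm(y)
print(x.shape)                         # → (16,)
```

Each update forward-projects the current image, compares it with the measured data, and back-projects the ratio through the SRM, so any physics encoded in `A` directly shapes the reconstruction.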
Application of the group method of data handling (GMDH) approach for landslide susceptibility zonation using readily available spatial covariates
2022
Landslide susceptibility (LS) mapping is an essential tool for landslide risk assessment. This study aimed to provide a new approach with better performance for landslide mapping that adopts readily available variables. In addition, it investigates the capability of a state-of-the-art model developed using the group method of data handling (GMDH) to spatially model LS. Furthermore, hybridized models of GMDH were developed using different metaheuristic algorithms. The study area was the Bonghwa region of South Korea, for which an accurate landslide inventory dataset is available. We considered a total of 13 spatial covariates (altitude, slope, aspect, topographic wetness index, val…
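One GMDH selection step can be sketched as fitting a quadratic "partial description" to every pair of covariates by least squares and keeping the pairs that generalize best to held-out data; the synthetic covariates below are a stand-in for the paper's spatial variables:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)

# Synthetic stand-in for spatial covariates (altitude, slope, aspect, ...).
X = rng.normal(size=(200, 5))
# Target driven by an interaction of the first two covariates plus the third.
y = 1.5 * X[:, 0] * X[:, 1] + X[:, 2] + 0.1 * rng.normal(size=200)

train, valid = slice(0, 150), slice(150, 200)

def quad_features(a, b):
    """GMDH partial description: quadratic polynomial in two inputs."""
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

results = []
for i, j in combinations(range(X.shape[1]), 2):
    F = quad_features(X[train, i], X[train, j])
    coef, *_ = np.linalg.lstsq(F, y[train], rcond=None)
    Fv = quad_features(X[valid, i], X[valid, j])
    rmse = np.sqrt(np.mean((Fv @ coef - y[valid]) ** 2))
    results.append((rmse, (i, j)))

results.sort()                     # best-generalizing pairs first
best_rmse, best_pair = results[0]
print(best_pair)                   # the interaction pair should rank first
```

In a full GMDH network the surviving pairs' outputs become inputs to the next layer, and layers are stacked until the validation error stops improving.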
Retrospection in interpreting and translation: explaining the process?
2014
Retrospection is one of the few research methods equally suitable for studying the processes involved in both translation and interpreting. At the first workshop on research methods in process-oriented research (Graz 2009), we presented the results of a pilot study of retrospection as a research method, published as Englund Dimitrova & Tiselius (2009). The study involved data from two groups (15 years of professional experience vs. no professional experience), each with 3+3 subjects (interpreter subjects vs. translator subjects, all with Swedish as their L1). The source text was a 10-minute plenary speech in English from the European Parliament, interpreted simultaneously into Swedish. …
Open-source software tools for measuring resources consumption and DASH metrics
2020
When designing and deploying multimedia systems, it is essential to know accurately the resource requirements and the Quality of Service (QoS) offered to customers. This paper presents two open-source software tools that address these key needs. The first tool measures and registers resource-consumption metrics for any Windows program (i.e. process id), such as CPU, GPU, and RAM usage. Unlike the Task Manager, which requires manual visual inspection of just a subset of these metrics, the developed tool runs on top of PowerShell to periodically measure these metrics, calculate statistics, and register them in log files. The second tool is able to measure QoS m…
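The first tool's periodic-sampling idea can be sketched in a few lines. This stdlib version samples the current process on Unix via `getrusage` and logs to an in-memory CSV, whereas the paper's tool targets arbitrary Windows processes through PowerShell performance counters:

```python
import csv, io, resource, time

def sample():
    """One resource-consumption reading for the current process."""
    ru = resource.getrusage(resource.RUSAGE_SELF)
    return {"t": time.time(),
            "cpu_s": ru.ru_utime + ru.ru_stime,  # CPU seconds consumed so far
            "rss": ru.ru_maxrss}                 # peak RSS (KiB on Linux, bytes on macOS)

log = io.StringIO()                              # stands in for the tool's log file
writer = csv.DictWriter(log, fieldnames=["t", "cpu_s", "rss"])
writer.writeheader()
for _ in range(3):
    writer.writerow(sample())
    time.sleep(0.05)                             # sampling period

print(log.getvalue().count("\n"))  # → 4 (header plus three samples)
```

Statistics (means, peaks) can then be computed offline from the log, which is the workflow the abstract describes.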
Computer-Aided Diagnosis System with Backpropagation Artificial Neural Network—Improving Human Readers Performance
2016
This article presents the results of a study into the ability of artificial neural networks (ANNs) to classify cancerous changes in mammographic images. Today's Computer-Aided Detection (CAD) systems cannot detect 100% of pathological changes. One of the properties of an ANN is generalization: it can identify not only learned data but also data similar to the training set. The combination of CAD and ANN could give better results and help radiologists make the right decision.
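A minimal backpropagation network of the kind the study describes can be sketched as follows; it is trained on synthetic stand-in features rather than real mammographic data, and the architecture and hyperparameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for extracted mammographic image features and labels.
X = rng.normal(size=(300, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

W1 = rng.normal(scale=0.5, size=(8, 16)); b1 = np.zeros(16)   # hidden layer
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)    # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):                     # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)              # hidden activations
    p = sigmoid(h @ W2 + b2)              # predicted class probability
    d2 = (p - y) / len(X)                 # cross-entropy gradient at output
    d1 = (d2 @ W2.T) * (1 - h ** 2)       # backpropagate through tanh
    W2 -= 1.0 * (h.T @ d2); b2 -= 1.0 * d2.sum(axis=0)
    W1 -= 1.0 * (X.T @ d1); b1 -= 1.0 * d1.sum(axis=0)

h = np.tanh(X @ W1 + b1)
p = sigmoid(h @ W2 + b2)
acc = float(((p > 0.5) == (y > 0.5)).mean())
print(round(acc, 2))
```

In a CAD setting the trained network's probability would be shown alongside the detector's output as a second opinion for the radiologist.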