Search results for "computer.software_genre"
Showing 10 of 3858 documents
Incremental Gaussian Discriminant Analysis based on Graybill and Deal weighted combination of estimators for brain tumour diagnosis
2011
In the last decade, machine learning (ML) techniques have been used to develop classifiers for automatic brain tumour diagnosis. However, the development of these ML models relies on a unique training set, and learning stops once this set has been processed. Training these classifiers requires a representative amount of data, but gathering, preprocessing, and validating samples is expensive and time-consuming. Therefore, a classical, non-incremental approach to ML must wait long enough to collect all the required data. In contrast, an incremental learning approach may allow us to build an initial classifier with a smaller number of samples and update it incrementally…
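The Graybill-Deal combination named in the title weights each independent estimate of a common parameter by the inverse of its estimated variance. As a minimal illustrative sketch of how that could drive incremental updates (hypothetical class and function names; not the authors' implementation), a diagonal-Gaussian classifier might fold each new batch into its stored per-class statistics like this:

    import numpy as np

    def graybill_deal(mean_a, var_a, mean_b, var_b):
        """Combine two independent estimates of the same mean, weighting
        each by the inverse of its estimated variance (Graybill-Deal)."""
        w_a, w_b = 1.0 / var_a, 1.0 / var_b
        return (w_a * mean_a + w_b * mean_b) / (w_a + w_b)

    class IncrementalGaussianClassifier:
        """Per-class Gaussian model refreshed batch by batch
        instead of being retrained from scratch."""
        def __init__(self):
            self.stats = {}  # label -> (mean, var, n)

        def partial_fit(self, X, y):
            for label in np.unique(y):
                Xc = X[y == label]
                m_new, v_new, n_new = Xc.mean(0), Xc.var(0) + 1e-9, len(Xc)
                if label in self.stats:
                    m_old, v_old, n_old = self.stats[label]
                    # variance of each sample-mean estimate is var / n
                    m = graybill_deal(m_old, v_old / n_old,
                                      m_new, v_new / n_new)
                    # simplified pooled variance (ignores the mean shift)
                    v = (n_old * v_old + n_new * v_new) / (n_old + n_new)
                    self.stats[label] = (m, v, n_old + n_new)
                else:
                    self.stats[label] = (m_new, v_new, n_new)

        def predict(self, X):
            labels = list(self.stats)
            # diagonal-covariance Gaussian log-likelihood per class
            ll = np.stack([
                -0.5 * (np.log(2 * np.pi * v) + (X - m) ** 2 / v).sum(1)
                for m, v, _ in (self.stats[l] for l in labels)
            ], axis=1)
            return np.array(labels)[ll.argmax(1)]

Each call to partial_fit only touches the affected class statistics, which is what lets such a classifier start small and grow as new samples are validated.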
Review of detection, assessment and mitigation of security risk in smart grid
2017
The integration of Information and Communication Technology (ICT) into the existing power grid has created new problems for the grid. The grid network has become more vulnerable to the security threats and risks that accompany modern data networks. The Smart Grid has strict latency requirements for data communication, and violating them is very costly. This paper assesses the threats and vulnerabilities associated with the Smart Grid network and reviews methods to mitigate these security risks.
Domain Adaptation of Landsat-8 and Proba-V Data Using Generative Adversarial Networks for Cloud Detection
2019
Training machine learning algorithms for new satellites requires collecting new data. This is a critical drawback for most remote sensing applications, and especially for cloud detection. A sensible strategy to mitigate this problem is to exploit available data from a similar sensor, which involves transforming that data to resemble the new sensor's data. However, even when the technical characteristics of both sensors are taken into account to transform the images, statistical differences between the data distributions remain. This results in poor performance of methods trained on one sensor and applied to the new one. In this work, we propose to use generative adversarial networks (G…
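As a rough sketch of the adversarial idea only (hypothetical architecture and band counts; not the paper's model), a generator can learn a sensor-to-sensor mapping while a discriminator pushes translated samples toward the target sensor's distribution:

    import torch
    import torch.nn as nn

    # Per-pixel spectral vectors from each sensor, flattened to
    # feature vectors; a generic adversarial-adaptation sketch.
    SRC_BANDS, TGT_BANDS = 11, 4  # e.g. Landsat-8 -> Proba-V band counts

    G = nn.Sequential(nn.Linear(SRC_BANDS, 64), nn.ReLU(),
                      nn.Linear(64, TGT_BANDS))   # sensor-to-sensor mapping
    D = nn.Sequential(nn.Linear(TGT_BANDS, 64), nn.ReLU(),
                      nn.Linear(64, 1))           # real-vs-translated critic

    bce = nn.BCEWithLogitsLoss()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)

    def train_step(src_batch, tgt_batch):
        # src_batch: (B, SRC_BANDS) floats; tgt_batch: (B, TGT_BANDS) floats
        # 1) discriminator: target-sensor samples are "real",
        #    translated source samples are "fake"
        fake = G(src_batch).detach()
        loss_d = bce(D(tgt_batch), torch.ones(len(tgt_batch), 1)) + \
                 bce(D(fake), torch.zeros(len(fake), 1))
        opt_d.zero_grad(); loss_d.backward(); opt_d.step()
        # 2) generator: fool the discriminator so translated data
        #    matches the target sensor's distribution
        loss_g = bce(D(G(src_batch)), torch.ones(len(src_batch), 1))
        opt_g.zero_grad(); loss_g.backward(); opt_g.step()
        return loss_d.item(), loss_g.item()

The point of the adversarial loss is exactly the gap the abstract describes: it penalizes the residual statistical differences that a purely physical band-to-band transformation leaves behind.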
Robust Principal Component Analysis of Data with Missing Values
2015
Principal component analysis is one of the most popular machine learning and data mining techniques. Having its origins in statistics, principal component analysis is used in numerous applications. However, there has been little systematic testing and assessment of principal component analysis for cases with erroneous and incomplete data. The purpose of this article is to propose multiple robust approaches for carrying out principal component analysis and, especially, for estimating the relative importance of the principal components in explaining the data variability. Computational experiments first focus on carefully designed simulated tests where the ground truth is known and can b…
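For orientation, here is a minimal sketch of one family of approaches (iterative imputation around a low-rank fit, seeded with robust median fills); it is one possible approach under those assumptions, not the article's method:

    import numpy as np

    def pca_with_missing(X, n_components=2, n_iter=50):
        """Alternate between a rank-k SVD fit and re-imputing the
        missing cells from the fit, starting from median fills."""
        X = np.array(X, dtype=float)
        missing = np.isnan(X)
        col_median = np.nanmedian(X, axis=0)
        X[missing] = np.take(col_median, np.where(missing)[1])
        for _ in range(n_iter):
            mu = X.mean(axis=0)
            U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
            recon = mu + (U[:, :n_components] * s[:n_components]) \
                         @ Vt[:n_components]
            X[missing] = recon[missing]   # refine only the unknown cells
        explained = s**2 / np.sum(s**2)   # relative importance of components
        return Vt[:n_components], explained[:n_components], X

The returned explained-variance ratios correspond to the quantity the abstract highlights: how much each principal component contributes to the data variability.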
Policing and Site Protection, Guard Posts, and Enclosure Walls
2015
IT Security in Medical Networks - Current Problems and Approaches to Solutions
2000
Designers and users of medical networks face strong requirements for data protection and security. Professional discretion and data protection laws allow the transfer of, or access to, patient data only in a therapeutic context. These data should also be protected from the network provider. Patients should be safe from any harm caused by faulty data or buggy procedures. On the other hand, the security of the most widely used software products keeps getting worse. Internet use increasingly endangers the integrity of the user's computer. The security requirements can be met only through strict care in planning, building, and configuring the infrastructure. Some concrete recommendations an…
A generic TG-186 shielded applicator for commissioning model-based dose calculation algorithms for high-dose-rate Ir-192 brachytherapy
2017
Purpose: A joint working group was created by the American Association of Physicists in Medicine (AAPM), the European Society for Radiotherapy and Oncology (ESTRO), and the Australasian Brachytherapy Group (ABG) with the charge, among others, to develop a set of well-defined test case plans and to perform calculations and comparisons with model-based dose calculation algorithms (MBDCAs). Its main goal is to facilitate a smooth transition from the AAPM Task Group No. 43 (TG-43) dose calculation formalism, widely used in clinical practice for brachytherapy, to the one proposed by Task Group No. 186 (TG-186) for MBDCAs. To do so, in this work a hypothetical, generic high-dose-rate (HDR) Ir-19…
Communication Interface Generation For HW/SW Architecture In The STARSoC Environment
2006
Mapping application functionality to software and hardware requires automated methods to specify, generate, and optimize the hardware, the software, and the interface architectures between them. In this paper, we present a methodology flow for hardware-software communication synthesis in system-on-a-chip (SoC) design through the STARSoC (Synthesis Tool for Adaptive and Reconfigurable System-on-a-Chip) tool for rapid prototyping. Our concept consists of a set of hardware and software processes, described in C code, that communicate through stream channels. The methodology analyzes the data dependences between processes and synthesizes a custom architecture to interface them. Firstly, we…
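To make the stream-channel idea concrete, here is a toy software model (hypothetical names; not STARSoC output) in which three processes communicate only through bounded FIFO channels, the kind of blocking read/write structure a generated HW/SW interface would have to synchronize:

    import threading, queue

    def producer(out_stream):
        for i in range(8):
            out_stream.put(i)          # write() on the stream channel
        out_stream.put(None)           # end-of-stream token

    def filter_proc(in_stream, out_stream):
        while (v := in_stream.get()) is not None:
            out_stream.put(v * v)      # data-dependent computation
        out_stream.put(None)

    def consumer(in_stream):
        while (v := in_stream.get()) is not None:
            print("received", v)

    # bounded channels model back-pressure between processes
    c1, c2 = queue.Queue(maxsize=4), queue.Queue(maxsize=4)
    threads = [threading.Thread(target=producer, args=(c1,)),
               threading.Thread(target=filter_proc, args=(c1, c2)),
               threading.Thread(target=consumer, args=(c2,))]
    for t in threads: t.start()
    for t in threads: t.join()

Because each process touches only its own channels, the data dependences between processes are exactly the channel graph, which is what an interface-synthesis flow can analyze.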
AMCAS: Advanced Methods for the Co-Design of Complex Adaptive Systems
2006
This work proposes a new approach to designing and programming Complex Adaptive Systems (CAS); such systems comprise neural networks, intelligent agents, genetic algorithms, support vector machines, and artificial intelligence systems in general. Due to the complexity of such systems, it is necessary to build a design environment that eases the design work, allowing reusability and easy migration to hardware and/or software. Ptolemy II is used as the base system to simulate and evaluate the designs with different Models of Computation, so that an optimal decision about the hardware or software implementation platform can be made.
FADaC
2019
Solid state drives (SSDs) implement a log-structured write pattern, where obsolete data remains stored on flash pages until the flash translation layer (FTL) erases them. erase() operations, however, cannot erase a single page; they target entire flash blocks. Since these victim blocks typically store a mix of valid and obsolete pages, FTLs have to copy the valid data to a new block before issuing an erase() operation. This copying increases the latencies of concurrent I/Os and reduces the lifetime of the flash memory. Data classification schemes identify data pages with similar update frequencies and group them together. FTLs can use this grouping to design garbage collection strategi…
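As an illustrative toy model of the grouping idea (hypothetical structures and thresholds; not the FADaC scheme itself), an FTL can steer pages with similar update frequencies into the same blocks so that greedy victim selection tends to find mostly obsolete blocks:

    PAGES_PER_BLOCK = 4

    class Block:
        def __init__(self):
            self.pages = []                  # list of (lpn, valid) pairs

        def valid_lpns(self):
            return [lpn for lpn, valid in self.pages if valid]

    class GroupingFTL:
        def __init__(self):
            self.open_blocks = {"hot": Block(), "cold": Block()}
            self.full_blocks = []
            self.update_count = {}           # lpn -> host update frequency

        def classify(self, lpn):
            # crude frequency threshold; real schemes estimate this online
            return "hot" if self.update_count.get(lpn, 0) >= 2 else "cold"

        def _append(self, lpn):
            group = self.classify(lpn)
            block = self.open_blocks[group]
            block.pages.append((lpn, True))
            if len(block.pages) == PAGES_PER_BLOCK:  # block full: retire it
                self.full_blocks.append(block)
                self.open_blocks[group] = Block()

        def invalidate(self, lpn):
            for block in self.full_blocks + list(self.open_blocks.values()):
                block.pages = [(p, v and p != lpn) for p, v in block.pages]

        def write(self, lpn):                # host write/update
            self.invalidate(lpn)             # prior copy becomes obsolete
            self.update_count[lpn] = self.update_count.get(lpn, 0) + 1
            self._append(lpn)

        def garbage_collect(self):
            if not self.full_blocks:
                return 0
            # greedy victim selection: fewest valid pages to relocate
            victim = min(self.full_blocks, key=lambda b: len(b.valid_lpns()))
            self.full_blocks.remove(victim)
            moved = victim.valid_lpns()
            for lpn in moved:
                self._append(lpn)            # relocation, not a host update
            return len(moved)                # pages copied per erase

The return value of garbage_collect() is the per-erase copy cost; grouping hot pages together drives it toward zero for hot blocks, which is precisely the latency and lifetime benefit the abstract describes.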