Search results for "computer.software_genre"

Showing 10 of 3,858 documents

Incremental Gaussian Discriminant Analysis based on Graybill and Deal weighted combination of estimators for brain tumour diagnosis

2011

In the last decade, machine learning (ML) techniques have been used to develop classifiers for automatic brain tumour diagnosis. However, the development of these ML models relies on a single training set, and learning stops once this set has been processed. Training these classifiers requires a representative amount of data, but gathering, preprocessing, and validating samples is expensive and time-consuming. Therefore, a classical, non-incremental approach to ML must wait long enough to collect all the required data. In contrast, an incremental learning approach may allow us to build an initial classifier with a smaller number of samples and update it incrementally…
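The Graybill–Deal estimator named in the title combines two independent estimates of a common mean, weighting each by the inverse of its estimated variance. A minimal stdlib-only sketch of that weighting (variable names are illustrative, not taken from the paper):

```python
def graybill_deal(mean1, var1, n1, mean2, var2, n2):
    """Combine two sample means of a common quantity.

    Each mean is weighted by the inverse of its estimated variance
    of the mean (s^2 / n), so the more precise estimate dominates.
    """
    w1 = n1 / var1  # inverse estimated variance of mean1
    w2 = n2 / var2  # inverse estimated variance of mean2
    return (w1 * mean1 + w2 * mean2) / (w1 + w2)


# Equal sample sizes and variances -> plain average of the two means.
combined = graybill_deal(5.0, 2.0, 10, 6.0, 2.0, 10)  # → 5.5
```

With unequal precision the combined estimate is pulled toward the better-estimated mean, which is what makes the weighting useful for merging an old classifier's statistics with those of newly arrived samples.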

Keywords: Graybill–Deal estimator; Databases, Factual; Computer science; Population-based incremental learning; Gaussian; Training sets; Health Informatics; Machine learning; computer.software_genre; Incremental algorithm; Personalization; Automatic brain tumour diagnosis; Artificial Intelligence; Number of samples; Magnetic resonance spectroscopy; Humans; Preprocessing; Incremental learning; Brain Neoplasms; Brain tumours; Estimator; Computational Biology; Pattern recognition; Linear discriminant analysis; Magnetic Resonance Imaging; Discriminant analysis; Translational research, tissue engineering and pathology [ONCOL 3]; Computer Science Applications; Magnetic resonance; FISICA APLICADA; Empirical results; Classifier (UML); Estimation; Algorithms
Journal of Biomedical Informatics

Review of detection, assessment and mitigation of security risk in smart grid

2017

The integration of Information and Communication Technology (ICT) into the existing power grid has created new problems for the grid. The grid network has become more vulnerable to security threats and risks, a corollary of its resemblance to a modern data network. The Smart Grid has strict latency requirements for data communication, and violating these latency bounds is very costly. This paper assesses the threats and vulnerabilities associated with the Smart Grid network and reviews methods to mitigate these security risks.

Keywords: Grid network; Computer science; Access control; Computer security; computer.software_genre; Grid; Smart grid; Information and Communications Technology; Malware; Risk assessment; Risk management
2017 2nd International Conference on Power and Renewable Energy (ICPRE)

Domain Adaptation of Landsat-8 and Proba-V Data Using Generative Adversarial Networks for Cloud Detection

2019

Training machine learning algorithms for new satellites requires collecting new data. This is a critical drawback for most remote sensing applications and especially for cloud detection. A sensible strategy to mitigate this problem is to exploit available data from a similar sensor, which involves transforming that data to resemble the new sensor's data. However, even when the technical characteristics of both sensors are taken into account to transform the images, statistical differences between the data distributions still remain. This results in poor performance of methods trained on one sensor and applied to the new one. In this work, we propose to use generative adversarial networks (G…
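A GAN is beyond a short sketch, but the underlying problem the abstract describes, residual statistical differences between two sensors' data distributions, can be illustrated with simple per-band moment matching. This is a much weaker baseline than the adversarial approach proposed in the paper; all names here are illustrative:

```python
from statistics import mean, stdev


def match_moments(source, target):
    """Shift and scale source values so their mean and standard
    deviation match the target sensor's statistics.

    A crude, non-adversarial baseline for cross-sensor adaptation:
    it aligns first and second moments only, leaving higher-order
    distribution differences untouched.
    """
    mu_s, sd_s = mean(source), stdev(source)
    mu_t, sd_t = mean(target), stdev(target)
    return [(x - mu_s) / sd_s * sd_t + mu_t for x in source]


# Band values from "sensor A" remapped onto "sensor B" statistics.
adapted = match_moments([1.0, 2.0, 3.0, 4.0], [10.0, 20.0, 30.0, 40.0])
```

After the transform, a model trained on the target sensor sees inputs with familiar first- and second-order statistics, which is exactly the gap (beyond sensor specifications) that the GAN in the paper is meant to close more completely.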

Keywords: Ground truth; Computer science; Remote sensing application; computer.software_genre; Convolutional neural network; Data mining; Adaptation (computer science); Generative grammar
IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium

Robust Principal Component Analysis of Data with Missing Values

2015

Principal component analysis is one of the most popular machine learning and data mining techniques. Having its origins in statistics, principal component analysis is used in numerous applications. However, there seems to have been little systematic testing and assessment of principal component analysis for cases with erroneous and incomplete data. The purpose of this article is to propose multiple robust approaches for carrying out principal component analysis and, especially, for estimating the relative importance of the principal components in explaining the data variability. Computational experiments first focus on carefully designed simulated tests where the ground truth is known and can b…
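The excerpt does not spell out the article's specific robust variants, but the two ingredients it names, handling incomplete data and extracting principal components, can be sketched with the stdlib alone: median imputation (a simple robust alternative to mean imputation) followed by power iteration for the leading component. This is illustrative only, not the authors' method:

```python
from statistics import median


def impute_medians(rows):
    """Replace None entries with the column median of observed values."""
    cols = list(zip(*rows))
    meds = [median([v for v in col if v is not None]) for col in cols]
    return [[m if v is None else v for v, m in zip(row, meds)]
            for row in rows]


def leading_component(rows, iters=200):
    """First principal component via power iteration on the covariance."""
    n, d = len(rows), len(rows[0])
    means = [sum(col) / n for col in zip(*rows)]
    x = [[v - m for v, m in zip(row, means)] for row in rows]
    # Biased sample covariance as a d x d matrix.
    cov = [[sum(x[k][i] * x[k][j] for k in range(n)) / n
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(c * c for c in w) ** 0.5
        v = [c / norm for c in w]
    return v


filled = impute_medians([[1, None], [2, 2], [3, 4]])
pc = leading_component([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [4.0, 4.0]])
```

For data lying along the diagonal, the leading component comes out close to (0.707, 0.707), which is the kind of ground-truth check the simulated tests in the article rely on.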

Keywords: Ground truth; PCA; Computer science; Robust statistics; Missing data; computer.software_genre; Set (abstract data type); Multiple correspondence analysis; Principal component analysis; Data mining; Robust principal component analysis

Policing and Site Protection, Guard Posts, and Enclosure Walls

2015

Keywords: Guard (information security); Engineering; Enclosure; Forensic engineering; Computer security; computer.software_genre

IT-Sicherheit in medizinischen Netzen - aktuelle Probleme und Lösungsansätze (IT Security in Medical Networks: Current Problems and Proposed Solutions)

2000

Designers and users of medical networks face strong requirements for data protection and security. Professional discretion and data protection laws allow the transfer of, or access to, patient data only in a therapeutic context. These data should also be protected from the network provider, and patients should be safe from any harm caused by faulty data or buggy procedures. At the same time, the security of the most widely used software products keeps deteriorating, and use of the internet increasingly endangers the integrity of the user's computer. The security requirements can be met only through strict care in planning, building, and configuring the infrastructure. Some concrete recommendations an…

Keywords: Guiding Principles; Obstetrics and Gynecology; Context (language use); Discretion; Computer security; computer.software_genre; Software; Harm; Health care; Medicine; Data Protection Act 1998; The Internet
Zentralblatt für Gynäkologie

A generic TG-186 shielded applicator for commissioning model-based dose calculation algorithms for high-dose-rate Ir-192 brachytherapy

2017

Purpose: A joint working group was created by the American Association of Physicists in Medicine (AAPM), the European Society for Radiotherapy and Oncology (ESTRO), and the Australasian Brachytherapy Group (ABG) with the charge, among others, to develop a set of well-defined test case plans and to perform calculations and comparisons with model-based dose calculation algorithms (MBDCAs). Its main goal is to facilitate a smooth transition from the AAPM Task Group No. 43 (TG-43) dose calculation formalism, widely used in clinical brachytherapy practice, to the one proposed by Task Group No. 186 (TG-186) for MBDCAs. To that end, this work presents a hypothetical, generic high-dose-rate (HDR) Ir-19…
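For orientation, the TG-43 formalism being transitioned away from reduces, in its point-source approximation, to an inverse-square law scaled by the air-kerma strength S_k, the dose-rate constant Λ, and a tabulated radial dose function g(r). The sketch below uses an invented g(r) table purely for illustration; it is not clinical data and must not be used clinically:

```python
def tg43_point_dose_rate(sk, dose_rate_const, r, g_table, r0=1.0):
    """TG-43 point-source approximation:

        D(r) = S_k * Lambda * (r0 / r)**2 * g(r)

    g(r) is linearly interpolated from (radius_cm, value) pairs.
    Illustrative only -- not for clinical use.
    """
    pts = sorted(g_table)
    for (ra, ga), (rb, gb) in zip(pts, pts[1:]):
        if ra <= r <= rb:
            g = ga + (gb - ga) * (r - ra) / (rb - ra)
            break
    else:
        raise ValueError("r outside g(r) table")
    return sk * dose_rate_const * (r0 / r) ** 2 * g


# Hypothetical g(r) table (values invented for illustration).
g_table = [(0.5, 1.00), (1.0, 1.00), (2.0, 0.98), (5.0, 0.90)]
d_ref = tg43_point_dose_rate(1000.0, 1.1, 1.0, g_table)  # at r0 = 1 cm
d_2cm = tg43_point_dose_rate(1000.0, 1.1, 2.0, g_table)
```

The MBDCAs addressed by TG-186 replace exactly this water-only, tabulated picture with explicit radiation transport in the actual geometry, which is why shielded applicators such as the one in this paper make good commissioning test cases.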

Keywords: HDR brachytherapy; Computer science; Brachytherapy; Equation solver; Ir-192; computer.software_genre; GEC-ESTRO; Imaging phantom; Monte Carlo; Voxel; Geometries; Shielded cable; AAPM; Medical physics; Radiation treatment planning; GEANT4; Monte Carlo methods; General Medicine; Dosimetric accuracy; Transport; Model-based dose calculation; Radiation therapy; TG-186; Absorbed dose; Simulation; Shielded applicator; TG-43 formalism; Nuclear medicine; Dose rate; Algorithm

Communication Interface Generation For HW/SW Architecture In The STARSoC Environment

2006

Mapping application functionality to software and hardware requires automated methods to specify, generate, and optimize the hardware, the software, and the interface architectures between them. In this paper, we present a methodology flow for hardware-software communication synthesis in system-on-a-chip (SoC) design through the STARSoC (Synthesis Tool for Adaptive and Reconfigurable System-on-a-Chip) tool for rapid prototyping. Our concept consists of a set of hardware and software processes, described in C code, that communicate through stream channels. The methodology analyzes the data dependences between processes and synthesizes a custom architecture to interface them. Firstly, we…
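The stream-channel communication the abstract describes can be pictured with a toy model: two concurrent "processes" exchanging data over a bounded FIFO channel. Python threads stand in here for the C-described hardware and software processes; all names are illustrative, not STARSoC APIs:

```python
from queue import Queue
from threading import Thread


def producer(channel, data):
    """'Hardware' process: writes values into the stream channel."""
    for v in data:
        channel.put(v)          # blocks when the FIFO is full
    channel.put(None)           # end-of-stream marker


def consumer(channel, out):
    """'Software' process: reads from the channel until end-of-stream."""
    while (v := channel.get()) is not None:
        out.append(v * 2)       # stand-in for real processing


channel, out = Queue(maxsize=4), []
t1 = Thread(target=producer, args=(channel, [1, 2, 3]))
t2 = Thread(target=consumer, args=(channel, out))
t1.start(); t2.start(); t1.join(); t2.join()
# out → [2, 4, 6]
```

The bounded queue captures the essential property of a stream channel: the producer and consumer are decoupled in time but synchronized by the FIFO's capacity, which is what the generated interface hardware must implement between the FPGA logic and the software side.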

Keywords: Hardware architecture; Resource-oriented architecture; Computer science; Interface (computing); Software prototyping; computer.software_genre; Software framework; Computer architecture; Embedded system; Component-based software engineering; Reference architecture; FPGA prototype
2006 IEEE International Conference on Reconfigurable Computing and FPGA's (ReConFig 2006)

AMCAS: Advanced Methods for the Co-Design of Complex Adaptive Systems

2006

Abstract: This work proposes a new approach to designing and programming Complex Adaptive Systems (CAS); these systems comprise neural networks, intelligent agents, genetic algorithms, support vector machines, and artificial-intelligence systems in general. Due to the complexity of such systems, it is necessary to build a design environment able to ease the design work, allowing reusability and easy migration to hardware and/or software. Ptolemy II is used as the base system to simulate and evaluate the designs under different Models of Computation, so that an optimal decision about the hardware or software implementation platform can be made.

Keywords: Hardware architecture; System of systems; Computer science; Model of computation; Distributed computing; computer.software_genre; Intelligent agent; Software; Computer engineering; Systems development life cycle; Systems design; Hardware compatibility list; Reusability

FADaC

2019

Solid state drives (SSDs) implement a log-structured write pattern, where obsolete data remains stored on flash pages until the flash translation layer (FTL) erases them. erase() operations, however, cannot erase a single page, but target entire flash blocks. Since these victim blocks typically store a mix of valid and obsolete pages, FTLs have to copy the valid data to a new block before issuing an erase() operation. This process therefore increases the latencies of concurrent I/Os and reduces the lifetime of flash memory. Data classification schemes identify data pages with similar update frequencies and group them together. FTLs can use this grouping to design garbage collection strategi…
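The two mechanisms the abstract connects, frequency-based data grouping and garbage collection, can be sketched in a few lines. This is not FADaC's actual scheme (the excerpt does not give it), just the general idea: group pages by update frequency so hot pages invalidate together, and greedily pick the victim block with the fewest valid pages to minimize copying before erase():

```python
def classify_page(update_count, hot_threshold=8):
    """Toy data classifier: group pages by update frequency so that
    frequently updated ('hot') pages end up in the same blocks and
    invalidate together. Threshold is illustrative."""
    return "hot" if update_count >= hot_threshold else "cold"


def pick_victim(blocks):
    """Greedy garbage collection: choose the block with the fewest
    valid pages, minimizing data copied before the erase()."""
    return min(blocks, key=lambda b: sum(b["valid"]))


blocks = [
    {"id": 0, "valid": [1, 1, 1, 0]},
    {"id": 1, "valid": [0, 1, 0, 0]},  # cheapest victim: 1 valid page
    {"id": 2, "valid": [1, 1, 0, 1]},
]
victim = pick_victim(blocks)  # → block 1
```

When grouping works well, victim blocks tend toward all-obsolete, so the valid-page copying that inflates I/O latency and wears out the flash largely disappears.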

Keywords: Hardware_MEMORYSTRUCTURES; Computer science; Operating system; computer.software_genre; Classifier (UML); Flash memory; Flash file system; Garbage collection
Proceedings of the 12th ACM International Conference on Systems and Storage