Search results for "Terabyte"

Showing 3 of 3 documents

Next-generation sequencing: big data meets high performance computing

2017

Progress in next-generation sequencing has had a major impact on medical and genomic research. This high-throughput technology can now produce billions of short DNA or RNA fragments, amounting to several terabytes of data, in a single run. This leads to massive datasets used by a wide range of applications, including personalized cancer treatment and precision medicine. In addition to the hugely increased throughput, the cost of using high-throughput technologies has been decreasing dramatically. A sequencing cost of around US$1000 per genome has now rendered large population-scale projects feasible. However, to make effective use of the produced data, the design of big data algorithms and t…
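The volume quoted in the abstract can be sanity-checked with a rough calculation. Below is a minimal Python sketch; the run parameters (read count, read length, bytes per base) are illustrative assumptions, not figures taken from the paper.

# Back-of-the-envelope estimate of raw data volume for one sequencing run.
# All parameters are illustrative assumptions, not values from the paper.
reads_per_run = 5e9       # "billions of short fragments" per run (assumed)
read_length_bp = 150      # typical short-read length in base pairs (assumed)
bytes_per_base = 2        # FASTQ: one byte for the base call, one for quality

raw_bytes = reads_per_run * read_length_bp * bytes_per_base
print(f"~{raw_bytes / 1e12:.1f} TB per run")  # ~1.5 TB, before record headers

With record headers and paired-end reads, the raw output plausibly reaches the "several terabytes" scale the abstract describes.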

Basic medicine; Computer science; Distributed computing; Genomic research; Big data; Terabyte; Computing Methodologies; DNA sequencing; Medical and health sciences; Clinical medicine; Databases, Genetic; Drug Discovery; Humans; Throughput; Pharmacology; Genome; High-Throughput Nucleotide Sequencing; Genomics; Sequence Analysis, DNA; Precision medicine; Supercomputer; Data science; Cancer treatment; Developmental biology; Oncology & carcinogenesis; Algorithms

Drug Discovery Today

Big Data in Medical Science–a Biostatistical View

2015

"Big data" is a universal buzzword in business and science, referring to the retrieval and handling of ever-growing amounts of information. It can be assumed, for example, that a typical hospital generates hundreds of terabytes (1 TB = 10¹² bytes) of data annually in the course of patient care (1). For instance, exome sequencing, which yields about 5 gigabytes (1 GB = 10⁹ bytes) of data per patient, is on the way to becoming routine (2). The analysis of such enormous volumes of information, i.e., the organization and description of the data and the drawing of (scientifically valid) conclusions, can hardly be accomplished any longer with the traditional tools of computer science and statistics. For ex…
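To make the orders of magnitude concrete, a minimal sketch in Python: the 5 GB per exome figure is from the article, while the annual patient count is an assumed illustration.

# Decimal (SI) storage units as defined in the article.
GB = 10**9   # 1 gigabyte = 10^9 bytes
TB = 10**12  # 1 terabyte = 10^12 bytes

exome_bytes = 5 * GB        # ~5 GB per patient (figure from the article)
patients_per_year = 40_000  # assumed hospital volume, for illustration only

total_bytes = exome_bytes * patients_per_year
print(f"{total_bytes / TB:.0f} TB/year from exome sequencing alone")  # 200 TB/year

Under these assumptions, routine exome sequencing alone would account for a sizeable share of the "hundreds of terabytes" per year cited above.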

Gigabyte; Big data; Byte; Cloud computing; General Medicine; Terabyte; Bioinformatics; Data science; Data analysis; Medicine; Datasets as Topic

Deutsches Ärzteblatt International

The NA48 event-building PC farm

2003

The NA48 experiment at the CERN SPS aims to measure the parameter $\Re(\epsilon'/\epsilon)$ of direct CP violation in the neutral kaon system with an accuracy of $2 \times 10^{-4}$. Based on the requirements of:

* high event rates (up to 10 kHz) with negligible dead time;
* support for a variety of detectors with very wide variation in the number of readout channels;
* data rates of up to 150 MByte/s sustained over the beam burst;
* level-3 filtering and remote data logging in the CERN computer center;

the collaboration has designed and built a modular pipelined data flow system with a 40 MHz sampling rate. The architecture combines custom-designed components with commerciall…
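The two rate requirements above together fix the average event size the system must sustain. A quick derived check in Python; the result is an implication of the quoted figures, not a number stated in the abstract.

# Implied average event size at the quoted maximum rates.
event_rate_hz = 10e3   # up to 10 kHz event rate (from the abstract)
data_rate_bps = 150e6  # 150 MByte/s sustained over the beam burst

avg_event_bytes = data_rate_bps / event_rate_hz
print(f"~{avg_event_bytes / 1e3:.0f} kB per event")  # ~15 kB per event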

Nuclear and High Energy Physics; Cost effectiveness; Computer science; Pentium; NA48 experiment; Dead time; Terabyte; Data flow diagram; Data acquisition; Nuclear Energy and Engineering; Data logger; Detectors and Experimental Techniques; Electrical and Electronic Engineering; Computer hardware

IEEE Transactions on Nuclear Science