Search results for "Algorithm"
Showing 10 of 4887 documents
Worldwide burden of LDL cholesterol: Implications in cardiovascular disease
2020
Abstract: Background and aim: An increased value of low-density lipoprotein cholesterol (LDL-C) is now universally considered a major cardiovascular disease (CVD) risk factor. LDL-C is included in the vast majority of worldwide cardiovascular risk prediction algorithms, as well as in the guidelines for cardiovascular risk prevention. We aimed to provide an overview of the worldwide adverse healthcare impact of LDL-C. Methods and results: Data on the epidemiologic burden of LDL-C >1.3 mmol/L were retrieved from the Global Health Data Exchange (GHDx) registry. The current burden is 94.92 million disability-adjusted life years (DALYs), with an exponential increas…
Introduction to coronary imaging with 64-slice computed tomography
2005
The aim of this article is to illustrate the main technical improvements in the last generation of 64-row CT scanners and the possible applications in coronary angiography. In particular, we describe the new physical components (X-ray tube-detectors system) and the general scan and reconstruction parameters. We then define the scan protocols for coronary angiography with the new generation of 64-row CT scanners to enable radiologists to perform a CT study on the basis of the diagnostic possibilities.
Equilibrium coverage fluctuations: a new approach to quantify reversible adsorption of proteins.
2005
Comparison of basis functions for 3D PET reconstruction using a Monte Carlo system matrix.
2012
In emission tomography, iterative statistical methods are accepted as the reconstruction algorithms that achieve the best image quality. The accuracy of these methods relies partly on the quality of the system response matrix (SRM) that characterizes the scanner. The more physical phenomena included in the SRM, the higher the SRM quality, and therefore the higher the image quality obtained from the reconstruction process. High-resolution small-animal scanners contain as many as 10³–10⁴ small crystal pairs, while the field of view (FOV) is divided into hundreds of thousands of small voxels. These two characteristics have a significant impact on the number of elements to be calculated in the SRM. …
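For context, a minimal sketch of one widely used iterative statistical reconstruction method, MLEM; the abstract does not say which methods or basis functions the paper evaluates, so MLEM and the toy problem sizes below are illustrative assumptions:

```python
# Minimal MLEM (maximum-likelihood expectation maximization) sketch.
# A is a toy system response matrix (SRM): A[i, j] ~ probability that
# an emission in voxel j is detected in crystal pair i.
import numpy as np

rng = np.random.default_rng(0)
n_pairs, n_voxels = 200, 50          # tiny stand-ins for a real scanner
A = rng.random((n_pairs, n_voxels))  # hypothetical SRM
x_true = rng.random(n_voxels)
y = rng.poisson(A @ x_true)          # simulated detector counts

x = np.ones(n_voxels)                # uniform initial image
sens = A.sum(axis=0)                 # sensitivity image A^T 1
for _ in range(50):
    proj = A @ x                     # forward projection
    ratio = y / np.maximum(proj, 1e-12)
    x *= (A.T @ ratio) / sens        # multiplicative MLEM update
```

Each iteration multiplies the current image by the back-projected ratio of measured to predicted counts, which is why the quality of A (the SRM) feeds directly into image quality.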
Global retention models and their application to the prediction of chromatographic fingerprints
2020
Abstract: The resolution of samples containing unknown compounds of different nature, or for which no standards are available, as is the case for chromatographic fingerprints, is still a challenge. Possibly the most problematic aspect preventing systematic method development is finding models that describe the retention behaviour of the compounds in the samples without bias. In this work, the use of global models (able to describe the whole sample) is proposed as an alternative to the use of individual models for each solute. Global models contain parameters that are specific to each solute, while other parameters (related to the column and solvent) are common to all solutes. A special regressio…
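As an illustration of the shared/solute-specific parameter split, here is a minimal least-squares fit of a hypothetical global model; the functional form and data below are invented for the example, not taken from the paper:

```python
# Illustrative "global model" fit: solute-specific parameters (a_i, S_i)
# plus one parameter n shared by all solutes, standing in for the
# column/solvent-dependent part. The form log k = a_i - S_i * phi**n
# is an assumption for illustration only.
import numpy as np
from scipy.optimize import least_squares

phi = np.array([0.2, 0.3, 0.4, 0.5])            # organic modifier fractions
logk = np.array([[1.80, 1.21, 0.72, 0.15],      # synthetic data, solute 1
                 [2.31, 1.58, 0.95, 0.31]])     # synthetic data, solute 2

def residuals(p):
    a, S = p[0:2], p[2:4]                       # per-solute parameters
    n = p[4]                                    # shared (global) parameter
    pred = a[:, None] - S[:, None] * phi[None, :] ** n
    return (pred - logk).ravel()

fit = least_squares(residuals, x0=[2.0, 2.0, 3.0, 3.0, 1.0])
a1, a2, S1, S2, n = fit.x
```

The point of the shared parameter is that every solute's data constrains it, which is what makes a global model usable when individual models cannot be fitted reliably.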
Stiffness-Adaptive Taylor method for the integration of non-stiff and stiff kinetic models
1992
A systematic derivation procedure that greatly facilitates the application of the Taylor method to the integration of kinetic models is developed. In addition, an algorithm that gives the integration step as a function of the required level of accuracy is proposed. With the Taylor method, application of this algorithm is immediate and greatly reduces the integration time. Furthermore, a new method for the integration of kinetic models is developed, whose most important feature is its self-adaptability to the stiffness of the system over the course of the integration. This “stiffness-adaptive” Taylor method (SAT method) makes use of several algorithms, combining them to meet the particular requireme…
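A minimal sketch of a Taylor-method step with an accuracy-driven step size, for the simplest kinetic model dy/dt = -k*y; the derivative recursion and the step-size heuristic below are illustrative assumptions, not the paper's SAT method:

```python
# For dy/dt = -k*y the higher derivatives follow y^(n) = -k * y^(n-1),
# so the Taylor terms can be built incrementally.
import math

def taylor_step(y, k, h, order=8):
    """Advance y(t) to y(t + h) with a truncated Taylor series."""
    term, total = y, y
    for n in range(1, order + 1):
        term *= -k * h / n          # h^n/n! * y^(n), built incrementally
        total += term
    return total

def integrate(y0, k, t_end, tol=1e-10, order=8):
    # Assumed heuristic: choose h so the order-th Taylor term is below tol,
    # i.e. |(k*h)^order / order!| * |y| <= tol.
    t, y = 0.0, y0
    while t < t_end:
        h = (tol * math.factorial(order) / (abs(y) * k**order)) ** (1 / order)
        h = min(h, t_end - t)
        y = taylor_step(y, k, h, order)
        t += h
    return y

print(integrate(1.0, 2.0, 1.0), math.exp(-2.0))  # should agree to ~tol
```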
Performance comparison of residual related algorithms for ToA positioning in wireless terrestrial and sensor networks
2009
Article also available from the publisher: http://dx.doi.org/10.1109/WIRELESSVITAE.2009.5172462 Time of Arrival (ToA) is a popular technique for terrestrial positioning. This paper presents a comparison of ToA-based residual-related positioning algorithms in wireless terrestrial and sensor networks in both long-range outdoor and short-range indoor environments. Us…
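A minimal sketch of ToA positioning by nonlinear least squares, using the fit residual as a quality measure; residual-related algorithms such as residual weighting go further (combining subset solutions weighted by their residuals), and the anchor layout and noise model here are invented for the example:

```python
# ToA positioning: each anchor reports a range estimate (time of flight
# times propagation speed); the position is the point whose distances
# best match the measured ranges, and the leftover residual indicates
# fit quality (e.g. NLOS contamination).
import numpy as np
from scipy.optimize import least_squares

anchors = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
target = np.array([18.0, 27.0])                       # unknown in practice
rng = np.random.default_rng(1)
ranges = np.linalg.norm(anchors - target, axis=1) + rng.normal(0, 0.5, 4)

def residuals(p):
    return np.linalg.norm(anchors - p, axis=1) - ranges

sol = least_squares(residuals, x0=np.array([25.0, 25.0]))
estimate, res_norm = sol.x, np.linalg.norm(sol.fun)   # residual = quality
```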
Discrete Tomography Reconstruction Through a New Memetic Algorithm
2008
Discrete tomography is a particular case of computerized tomography that deals with the reconstruction of objects made of just one homogeneous material, where it is sometimes possible to reduce the number of projections to no more than four. Most methods for standard computerized tomography cannot be applied in this case, and ad hoc techniques must be developed to handle so few projections.
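A toy memetic-style reconstruction from just two projections (row and column sums), combining an evolutionary loop with greedy local refinement; this sketches the general approach, not the algorithm proposed in the paper:

```python
# Binary image reconstruction from row/column sums. Fitness counts the
# projection mismatch; "memetic" = evolutionary search + local search.
import numpy as np

rng = np.random.default_rng(0)
truth = (rng.random((8, 8)) < 0.4).astype(int)
row_sums, col_sums = truth.sum(1), truth.sum(0)

def fitness(img):                      # projection mismatch (0 is perfect)
    return (np.abs(img.sum(1) - row_sums).sum()
            + np.abs(img.sum(0) - col_sums).sum())

def local_search(img, steps=300):      # greedy bit flips, keep improvements
    img, f = img.copy(), fitness(img)
    for _ in range(steps):
        i, j = rng.integers(8), rng.integers(8)
        img[i, j] ^= 1
        f_new = fitness(img)
        if f_new <= f:
            f = f_new
        else:
            img[i, j] ^= 1             # revert the flip
    return img

# Evolutionary loop: uniform crossover of the two best, then refinement.
pop = [local_search((rng.random((8, 8)) < 0.4).astype(int)) for _ in range(10)]
for _ in range(30):
    pop.sort(key=fitness)
    mask = rng.random((8, 8)) < 0.5
    child = local_search(np.where(mask, pop[0], pop[1]))
    pop[-1] = child                    # replace the worst individual
best = min(pop, key=fitness)
```

Note that with only two projections many binary images can share the same sums, which is exactly why discrete tomography needs such ad hoc search techniques.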
Blind Radio Tomography
2018
From the attenuation measurements collected by a network of spatially distributed sensors, radio tomography constructs spatial loss fields (SLFs) that quantify the absorption of radio-frequency waves at each location. These SLFs can be used for interference prediction in (possibly cognitive) wireless communication networks, for environmental monitoring or intrusion detection in surveillance applications, for through-the-wall imaging, for survivor localization after earthquakes or fires, etc. The cornerstone of radio tomography is to model attenuation as the two-dimensional integral of the SLF of interest scaled by a weight function. Unfortunately, existing approaches (i) rely on heuristic assumpti…
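A minimal discretized version of that model: each link measurement becomes a weighted sum of the SLF over grid voxels, y = W f + noise, and the SLF is recovered by regularized least squares; the ellipse-based weight function below is one common heuristic, assumed here for illustration:

```python
# Discretized radio tomography sketch: W encodes the weight function,
# f is the vectorized SLF, y are per-link attenuation measurements.
import numpy as np

n = 16                                          # n x n SLF grid
grid = np.stack(np.meshgrid(np.arange(n), np.arange(n)), -1).reshape(-1, 2)

def link_weights(tx, rx, lam=0.8):
    """Weight ~ 1/sqrt(link length) inside an ellipse around the link."""
    d = np.linalg.norm(rx - tx)
    through = (np.linalg.norm(grid - tx, axis=1)
               + np.linalg.norm(grid - rx, axis=1))
    return (through <= d + lam) / np.sqrt(d)

rng = np.random.default_rng(0)
sensors = rng.uniform(0, n - 1, (12, 2))
pairs = [(i, j) for i in range(12) for j in range(i + 1, 12)]
W = np.array([link_weights(sensors[i], sensors[j]) for i, j in pairs])

f_true = np.zeros(n * n)
f_true[n * 8 + 5] = 3.0                         # one absorbing voxel
y = W @ f_true + rng.normal(0, 0.05, len(pairs))

f_hat = np.linalg.solve(W.T @ W + 0.1 * np.eye(n * n), W.T @ y)  # ridge
```

The heuristic choice of weight function is precisely the modeling assumption the abstract criticizes in existing approaches.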
Multi-label Classification Using Stacked Hierarchical Dirichlet Processes with Reduced Sampling Complexity
2018
Nonparametric topic models based on hierarchical Dirichlet processes (HDPs) allow for the number of topics to be automatically discovered from the data. The computational complexity of standard Gibbs sampling techniques for model training is linear in the number of topics. Recently, it was reduced to be linear in the number of topics per word using a technique called alias sampling combined with Metropolis Hastings (MH) sampling. We propose a different proposal distribution for the MH step based on the observation that distributions on the upper hierarchy level change slower than the document-specific distributions at the lower level. This reduces the sampling complexity, making it linear i…
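A minimal sketch of the Metropolis-Hastings correction idea the abstract builds on: draw cheap proposals from a cached (stale) distribution and accept or reject against the current target; the toy distributions below stand in for the HDP-specific quantities:

```python
# MH topic resampling: q is a stale proposal distribution (cheap to
# sample, e.g. via the alias method and reusable across many words
# because the upper-level distribution changes slowly); p is the
# current target. The MH ratio corrects for the staleness.
import numpy as np

rng = np.random.default_rng(0)
K = 20                                          # number of topics
p = rng.dirichlet(np.ones(K))                   # current target
q = 0.9 * p + 0.1 * rng.dirichlet(np.ones(K))   # stale proposal, close to p

topic = rng.integers(K)
for _ in range(100):
    new = rng.choice(K, p=q)                    # cheap proposal draw
    accept = min(1.0, (p[new] * q[topic]) / (p[topic] * q[new]))
    if rng.random() < accept:
        topic = new
```

Because q tracks p closely, the acceptance rate stays high, so the expensive exact sampling is avoided without biasing the chain's stationary distribution.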