Search results for "Algorithm"

Showing 10 of 4887 documents

Worldwide burden of LDL cholesterol: Implications in cardiovascular disease

2020

Abstract Background and aims: An increased value of low-density lipoprotein cholesterol (LDL-C) is now universally considered a major cardiovascular disease (CVD) risk factor. LDL-C is included in the vast majority of worldwide cardiovascular risk prediction algorithms, as well as in the guidelines for cardiovascular risk prevention. We aimed to provide an overview of the worldwide adverse healthcare impact of LDL-C. Methods and results: Data on the epidemiologic burden of LDL-C >1.3 mmol/L were retrieved from the Global Health Data Exchange (GHDx) registry. The current burden is 94.92 million disability-adjusted life years (DALYs), with an exponential increas…

Subjects: Cardiovascular disease; Atherosclerosis; Cholesterol, LDL; Low-density lipoproteins; Dyslipidemias; Risk Factors; Risk Assessment; Prediction algorithms; Global Health; Epidemics; Registries; Databases, Factual; Health Status; Quality-Adjusted Life Years; Biomarkers; Environmental health; Humans; Time Factors
Source: Nutrition, Metabolism and Cardiovascular Diseases

Introduction to coronary imaging with 64-slice computed tomography

2005

The aim of this article is to illustrate the main technical improvements in the last generation of 64-row CT scanners and the possible applications in coronary angiography. In particular, we describe the new physical components (X-ray tube-detectors system) and the general scan and reconstruction parameters. We then define the scan protocols for coronary angiography with the new generation of 64-row CT scanners to enable radiologists to perform a CT study on the basis of the diagnostic possibilities.

Subjects: 64-row CT; coronary angiography; technical improvements; Tomography, X-Ray Computed; Tomography, Spiral Computed; Coronary Angiography (instrumentation/methods); Coronary Stenosis (radiography); Coronary Artery Bypass; Stents; Contrast Media; Artifacts; Electrocardiography; Heart Rate; Image Processing, Computer-Assisted; Patient Selection; Phantoms, Imaging; Sensitivity and Specificity; Algorithms; Time Factors; Humans

Equilibrium coverage fluctuations: a new approach to quantify reversible adsorption of proteins.

2005

Subjects: Reversible adsorption; Adsorption; Proteins; Protein Conformation; Protein Binding; Peptides; Kinetics; Biophysics; Biosensing Techniques; Surface Plasmon Resonance; Spectrophotometry; Electrochemistry; Computational chemistry; Models, Biological; Computer Simulation; Monte Carlo Method; Nanostructures; Stress, Mechanical; Algorithms; Time Factors
Source: ChemPhysChem: A European Journal of Chemical Physics and Physical Chemistry

Comparison of basis functions for 3D PET reconstruction using a Monte Carlo system matrix.

2012

In emission tomography, iterative statistical methods are accepted as the reconstruction algorithms that achieve the best image quality. The accuracy of these methods relies partly on the quality of the system response matrix (SRM) that characterizes the scanner. The more physical phenomena included in the SRM, the higher the SRM quality, and therefore higher image quality is obtained from the reconstruction process. High-resolution small animal scanners contain as many as 10³–10⁴ small crystal pairs, while the field of view (FOV) is divided into hundreds of thousands of small voxels. These two characteristics have a significant impact on the number of elements to be calculated in the SRM. …
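The iterative statistical reconstruction this abstract refers to is typically MLEM-style: a minimal sketch, assuming a tiny hand-made system response matrix rather than any real scanner geometry (the matrix, phantom, and counts below are purely illustrative).

```python
# Minimal MLEM (maximum-likelihood expectation-maximization) sketch for
# emission tomography. The system response matrix A, the phantom, and the
# counts are illustrative stand-ins, not data from any real scanner.

def mlem(A, y, n_iter=200):
    """Iterative update: x_j <- (x_j / s_j) * sum_i A_ij * y_i / (Ax)_i."""
    n_det, n_vox = len(A), len(A[0])
    sens = [sum(A[i][j] for i in range(n_det)) for j in range(n_vox)]
    x = [1.0] * n_vox                     # flat initial estimate
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n_vox)) for i in range(n_det)]
        for j in range(n_vox):
            back = sum(A[i][j] * y[i] / proj[i]
                       for i in range(n_det) if proj[i] > 0)
            x[j] *= back / sens[j]
    return x

# Toy 3-detector / 2-voxel geometry (hypothetical weights).
A = [[1.0, 0.0],
     [0.0, 1.0],
     [0.5, 0.5]]
true_x = [4.0, 2.0]
y = [sum(A[i][j] * true_x[j] for j in range(2)) for i in range(3)]  # noiseless
est = mlem(A, y)
```

With noiseless, consistent data the iterates converge to the true activity; the quality of `A` is exactly the SRM-quality issue the abstract discusses.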

Subjects: Positron-Emission Tomography; Imaging, Three-Dimensional; Monte Carlo Method; Image quality; Image resolution; Statistical noise; Basis function; Voxel; Smoothing; Rotation; Phantoms, Imaging; Computer vision; Artificial intelligence; Algorithms; Mathematics
Source: Physics in Medicine and Biology

Global retention models and their application to the prediction of chromatographic fingerprints

2020

Abstract The resolution of samples containing unknown compounds of different nature, or without standards available, as is the case of chromatographic fingerprints, is still a challenge. Perhaps the most problematic aspect that prevents systematic method development is finding models that describe without bias the retention behaviour of the compounds in the samples. In this work, the use of global models (able to describe the whole sample) is proposed as an alternative to the use of individual models for each solute. Global models contain parameters that are specific for each solute, while other parameters, related to the column and solvent, are common for all solutes. A special regressio…
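The shared-plus-specific parameter structure can be illustrated with a pooled least-squares fit: one slope common to all solutes, one intercept per solute. The linear log k = a_s + b·φ form and the data below are synthetic stand-ins, not the retention models from the paper.

```python
# Hedged illustration of a "global model": a slope b shared by all solutes
# (standing in for column/solvent parameters) plus a solute-specific
# intercept a_s, fitted jointly by pooled least squares. Synthetic data.

def fit_global(phis, logk_per_solute):
    phi_mean = sum(phis) / len(phis)
    num = den = 0.0
    y_means = []
    for ys in logk_per_solute:
        y_mean = sum(ys) / len(ys)
        y_means.append(y_mean)
        for phi, y in zip(phis, ys):
            num += (phi - phi_mean) * (y - y_mean)
            den += (phi - phi_mean) ** 2
    b = num / den                              # shared (global) slope
    a = [ym - b * phi_mean for ym in y_means]  # solute-specific intercepts
    return a, b

phis = [0.2, 0.3, 0.4, 0.5]                    # organic solvent fractions
# Synthetic log k generated with common slope -4 and intercepts 2 and 3.
logk = [[2 - 4 * p for p in phis],
        [3 - 4 * p for p in phis]]
a, b = fit_global(phis, logk)
```

Fitting all solutes jointly is what makes the column/solvent parameters identifiable even when individual solutes have few data points.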

Subjects: High-performance liquid chromatography; Gradient elution; Chromatography; Elution; Method development; Predictive capability; Regression Analysis; Models, Theoretical; Reference Standards; Computer Simulation; Plant Extracts; Chamomile; Sulfonamides; Water; Resolution (mass spectrometry); Algorithms; Time Factors
Source: Journal of Chromatography A

Stiffness-Adaptive Taylor method for the integration of non-stiff and stiff kinetic models

1992

A systematic derivation procedure that greatly facilitates the application of the Taylor method to the integration of kinetic models is developed. In addition, an algorithm that gives the integration step as a function of the required level of accuracy is proposed. Using the Taylor method, application of this algorithm is immediate and largely reduces the integration time. In addition, a new method of integration of kinetic models, whose most important feature is the self-adaptability to the stiffness of the system along the integration process, is developed. This “stiffness-adaptive” Taylor method (SAT method) makes use of several algorithms, combining them to meet the particular requireme…
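A minimal sketch of the Taylor-method idea with an accuracy-driven step, for the linear test problem y′ = −k·y (whose n-th derivative is (−k)ⁿ·y, so the Taylor terms follow a simple recurrence). The shrink/grow step rule below is an illustrative choice, not the SAT method from the paper.

```python
import math

# Taylor-series step for y' = -k*y: y(t+h) = sum_n ((-k*h)^n / n!) * y.
# The step is accepted only when the last retained term is below tol,
# an illustrative accuracy rule (not the paper's algorithm).

def taylor_step(y, k, h, order=8):
    term = y
    total = y
    for n in range(1, order + 1):
        term *= (-k * h) / n          # recurrence for the next Taylor term
        total += term
    return total, abs(term)

def integrate(y0, k, t_end, tol=1e-10, order=8):
    t, y, h = 0.0, y0, t_end / 10
    while t < t_end:
        h = min(h, t_end - t)
        y_new, last = taylor_step(y, k, h, order)
        if last > tol * max(abs(y_new), 1.0):
            h *= 0.5                  # accuracy not met: shrink the step
            continue
        t += h
        y = y_new
        h *= 1.5                      # accepted: try a larger step next
    return y

y_num = integrate(1.0, 2.0, 1.0)      # exact solution: exp(-2)
```

The step automatically shrinks where the series converges slowly (stiff behaviour, large k·h) and grows where it converges fast, which is the self-adaptability the abstract describes.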

Subjects: Taylor method; Stiffness; Time delay and integration; Derivation procedure; Function (mathematics); Process (engineering); Computational Mathematics; General Chemistry; Algorithm; Mathematics
Source: Journal of Computational Chemistry

Performance comparison of residual related algorithms for ToA positioning in wireless terrestrial and sensor networks

2009

©2009 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes, for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works, must be obtained from the IEEE. Article also available from the publisher: http://dx.doi.org/10.1109/WIRELESSVITAE.2009.5172462

Time of Arrival (ToA) is a popular technique for terrestrial positioning. This paper presents a comparison of ToA-based residual-related positioning algorithms in wireless terrestrial and sensor networks, in both long-range outdoor and short-range indoor environments. Us…
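The common core of such ToA algorithms is a least-squares position estimate whose range residuals are then examined. A minimal Gauss-Newton sketch, with made-up anchor layout and noiseless measurements; this is not the specific residual-weighting scheme compared in the paper.

```python
import math

# Least-squares ToA positioning sketch: anchors measure range
# d_i = ||p - a_i||, and we minimize the squared range residuals with a
# Gauss-Newton iteration on the 2-D position. Illustrative data only.

def toa_least_squares(anchors, dists, p0=(0.0, 0.0), n_iter=50):
    x, y = p0
    for _ in range(n_iter):
        gx = gy = hxx = hyy = hxy = 0.0
        for (ax, ay), d in zip(anchors, dists):
            r = math.hypot(x - ax, y - ay)
            if r == 0:
                continue
            res = r - d                          # range residual
            ux, uy = (x - ax) / r, (y - ay) / r  # Jacobian row (unit vector)
            gx += res * ux;  gy += res * uy
            hxx += ux * ux;  hyy += uy * uy;  hxy += ux * uy
        det = hxx * hyy - hxy * hxy
        if abs(det) < 1e-12:
            break
        # Gauss-Newton step: solve (J^T J) dp = -J^T res for dp.
        dx = (-gx * hyy + gy * hxy) / det
        dy = (gx * hxy - gy * hxx) / det
        x, y = x + dx, y + dy
    return x, y

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
true_p = (3.0, 4.0)
dists = [math.hypot(true_p[0] - ax, true_p[1] - ay) for ax, ay in anchors]
est = toa_least_squares(anchors, dists, p0=(5.0, 5.0))
```

Residual-related variants then use the per-anchor residuals (large when a link is non-line-of-sight) to weight or discard measurements before re-solving.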

Subjects: Time of arrival; Iterative method; Residual; Weighting; Wireless; Wireless sensor network; Computational complexity theory; Communication complexity; Computer science; Algorithm
Classification: VDP::Technology (500)::Information and communication technology (550)::Telecommunication (552)

Discrete Tomography Reconstruction Through a New Memetic Algorithm

2008

Discrete tomography is a particular case of computerized tomography that deals with the reconstruction of objects made of just one homogeneous material, where it is sometimes possible to reduce the number of projections to no more than four. Most methods for standard computerized tomography cannot be applied in this setting, and ad hoc techniques must be developed to handle so few projections.
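The quantity an evolutionary or memetic reconstruction optimizes is the discrepancy between a candidate binary image's projections and the measured ones. A toy sketch, assuming a 4×4 image and only horizontal/vertical projections (both simplifications for illustration):

```python
# Toy fitness function for discrete tomography: L1 discrepancy between the
# row/column projections of a candidate binary image and the measured
# projections. The 4x4 image and two projection directions are illustrative.

def projections(img):
    rows = [sum(r) for r in img]
    cols = [sum(r[j] for r in img) for j in range(len(img[0]))]
    return rows, cols

def fitness(candidate, target_rows, target_cols):
    r, c = projections(candidate)
    return (sum(abs(a - b) for a, b in zip(r, target_rows)) +
            sum(abs(a - b) for a, b in zip(c, target_cols)))

truth = [[1, 1, 0, 0],
         [1, 0, 0, 0],
         [0, 0, 1, 1],
         [0, 0, 0, 1]]
t_rows, t_cols = projections(truth)
# A perfect candidate has fitness 0; flipping any pixel raises it by 2.
```

A memetic algorithm would evolve a population of such binary images, applying local pixel-flip refinement to each offspring before selection.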

Subjects: Discrete tomography; Tomographic reconstruction; Memetic algorithms; Genetic algorithm; Evolutionary methods; Binary image; Instrumental noise; Tomography; Computer vision; Artificial intelligence; Settore INF/01 - Informatica; Mathematics

Blind Radio Tomography

2018

From the attenuation measurements collected by a network of spatially distributed sensors, radio tomography constructs spatial loss fields (SLFs) that quantify absorption of radiofrequency waves at each location. These SLFs can be used for interference prediction in (possibly cognitive) wireless communication networks, for environmental monitoring or intrusion detection in surveillance applications, for through-the-wall imaging, for survivor localization after earthquakes or fires, etc. The cornerstone of radio tomography is to model attenuation as the bidimensional integral of the SLF of interest scaled by a weight function. Unfortunately, existing approaches (i) rely on heuristic assumpti…
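The weighted-integral measurement model can be sketched with the widely used ellipse weight model, in which a voxel contributes to a link's attenuation when it lies inside an ellipse with transmitter and receiver as foci. Grid size, λ, node positions, and the 1/√d normalization are illustrative choices, not the blind approach the paper develops.

```python
import math

# Ellipse weight model sketch for radio tomography: attenuation on a
# tx-rx link is the weighted sum (discretized integral) of the spatial
# loss field (SLF) over voxels inside the tx/rx ellipse. Illustrative only.

def link_weights(tx, rx, grid, lam=0.5):
    d = math.dist(tx, rx)
    w = {}
    for v in grid:
        if math.dist(tx, v) + math.dist(v, rx) <= d + lam:
            w[v] = 1.0 / math.sqrt(d)     # a common normalization choice
    return w

def predicted_attenuation(weights, slf):
    # Attenuation modeled as the weighted sum of the SLF over voxels.
    return sum(wt * slf.get(v, 0.0) for v, wt in weights.items())

grid = [(x + 0.5, y + 0.5) for x in range(4) for y in range(4)]
slf = {(1.5, 1.5): 3.0}                   # one absorbing voxel
w = link_weights((0.0, 1.5), (4.0, 1.5), grid)
att = predicted_attenuation(w, slf)
```

The heuristic weight function here is exactly the kind of assumption the abstract criticizes; a "blind" approach would instead learn the weights from the measurements.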

Subjects: Radio tomography; Tomographic reconstruction; Attenuation; Interference (wave propagation); Wireless; Signal Processing; Tomography; Electrical and Electronic Engineering; Computer science; Algorithm

Multi-label Classification Using Stacked Hierarchical Dirichlet Processes with Reduced Sampling Complexity

2018

Nonparametric topic models based on hierarchical Dirichlet processes (HDPs) allow for the number of topics to be automatically discovered from the data. The computational complexity of standard Gibbs sampling techniques for model training is linear in the number of topics. Recently, it was reduced to be linear in the number of topics per word using a technique called alias sampling combined with Metropolis Hastings (MH) sampling. We propose a different proposal distribution for the MH step based on the observation that distributions on the upper hierarchy level change slower than the document-specific distributions at the lower level. This reduces the sampling complexity, making it linear i…
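The Metropolis-Hastings correction the abstract relies on can be shown in miniature: topics are drawn from a cheap (possibly stale) proposal q, and the acceptance ratio corrects back to the target p, so p remains the stationary distribution no matter how approximate q is. The 3-topic distributions are made up; a real sampler would draw from q via an alias table in O(1).

```python
import random

# Independence Metropolis-Hastings sketch: sample from proposal q, accept
# with min(1, p[new]*q[cur] / (p[cur]*q[new])), leaving the target p
# invariant. Toy 3-topic distributions, not an actual HDP posterior.

def mh_topic_sampler(p, q, n_samples=200_000, seed=0):
    rng = random.Random(seed)
    topics = list(range(len(p)))
    cur = 0
    counts = [0] * len(p)
    for _ in range(n_samples):
        new = rng.choices(topics, weights=q)[0]   # cheap proposal draw
        accept = min(1.0, (p[new] * q[cur]) / (p[cur] * q[new]))
        if rng.random() < accept:
            cur = new
        counts[cur] += 1                          # count the current state
    return [c / n_samples for c in counts]

p = [0.6, 0.3, 0.1]        # "true" topic posterior for one word
q = [1 / 3, 1 / 3, 1 / 3]  # stale/uniform proposal
freq = mh_topic_sampler(p, q)
```

Because correctness only needs the acceptance ratio, the proposal can be refreshed lazily, which is what makes slowly changing upper-level distributions so attractive as proposals.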

Subjects: Topic model; Multi-label classification; Latent Dirichlet allocation; Dirichlet process; Dirichlet distribution; Gibbs sampling; Metropolis–Hastings algorithm; Sampling (statistics); Pattern recognition; Computational complexity theory; Artificial Intelligence; Test set; Hardware and Architecture; Human-Computer Interaction; Software; Information Systems; Mathematics
Source: 2017 IEEE International Conference on Big Knowledge (ICBK)