Search results for "normalization"

Showing 10 of 632 documents

Statistical modelling of non-stationary processes of atmospheric pollution from natural sources: example of birch pollen

2016

Abstract A statistical model for predicting daily mean pollen concentrations during the flowering season is constructed and its parameterization and application to birch pollen in Riga (Latvia) are discussed. The model involves several steps of transformations of both meteorological data and pollen observations, aiming at a normally distributed homogeneous stationary dataset with linearized dependencies between the transformed meteorological predictors and pollen concentrations. The data transformation includes normalization of daily mean birch pollen concentrations, a switch of the independent axis from time to heat sum, a projection of governing parameters to pollen concentrations, and a …
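The snippet mentions normalization of daily mean pollen concentrations toward a normally distributed dataset but does not give the exact transform; a minimal sketch, assuming a shifted log transform followed by standardization stands in for it (the function name and `eps` shift are illustrative, not from the paper):

```python
import numpy as np

def normalize_concentrations(pollen, eps=1.0):
    """Hypothetical sketch: log-transform daily mean pollen counts to damp
    right skew, then standardize to zero mean and unit variance."""
    x = np.log(np.asarray(pollen, dtype=float) + eps)
    return (x - x.mean()) / x.std()

z = normalize_concentrations([3, 10, 45, 120, 800, 15])
```

The standardized series can then be regressed on the transformed meteorological predictors along the heat-sum axis described above.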

Normalization (statistics); Atmospheric Science; Global and Planetary Change; Pollen season; Meteorology; Forestry; Atmospheric pollution; Statistical model; Atmospheric sciences; Regression; Birch pollen; Flowering season; Pollen; Environmental science; Agronomy and Crop Science; Agricultural and Forest Meteorology

Propagation pattern analysis during atrial fibrillation based on sparse modeling.

2012

In this study, sparse modeling is introduced for the estimation of propagation patterns in intracardiac atrial fibrillation (AF) signals. The estimation is based on the partial directed coherence function, derived from fitting a multivariate autoregressive model to the observed signal using least-squares (LS) estimation. The propagation pattern analysis incorporates prior information on sparse coupling as well as the distance between the recording sites. Two optimization methods are employed for estimation of the model parameters, namely, the adaptive group least absolute selection and shrinkage operator (aLASSO), and a novel method named the distance-adaptive group LASSO (dLASSO). Using si…
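The partial directed coherence underlying this analysis has a standard closed form once the VAR coefficients are fitted: with spectral transfer matrix Ā(f) = I − Σ_r A_r e^(−i2πfr), the PDC from channel j to i is |Ā_ij(f)| normalized by the column norm. A small sketch (array layout and names are illustrative; the paper's aLASSO/dLASSO estimation step is not reproduced here):

```python
import numpy as np

def pdc(coeffs, freqs):
    """Partial directed coherence magnitudes from VAR coefficient matrices.

    coeffs: array (p, n, n) holding A_r for lags r = 1..p.
    freqs:  normalized frequencies in [0, 0.5].
    Returns array (len(freqs), n, n); squared entries sum to 1 per column.
    """
    p, n, _ = coeffs.shape
    out = np.empty((len(freqs), n, n))
    for fi, f in enumerate(freqs):
        A = np.eye(n, dtype=complex)
        for r in range(p):
            A -= coeffs[r] * np.exp(-2j * np.pi * f * (r + 1))
        denom = np.sqrt((np.abs(A) ** 2).sum(axis=0))  # column-wise norm
        out[fi] = np.abs(A) / denom
    return out

# toy 2-channel VAR(1): channel 0 drives channel 1
A1 = np.array([[[0.5, 0.0],
                [0.4, 0.3]]])
P = pdc(A1, freqs=[0.1, 0.25])
```

The nonzero off-diagonal entry P[:, 1, 0] reflects the 0→1 coupling; sparse penalties such as aLASSO act on the coefficients A_r before this step.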

Normalization (statistics); Computer science; Atrial fibrillation (AF); Biomedical Engineering; Signal; Pattern Recognition, Automated; Electrocardiography; electrogram; group least absolute selection and shrinkage operator (LASSO); Operator (computer programming); Statistics; Humans; Computer Simulation; Selection (genetic algorithm); Shrinkage; Signal processing; Noise (signal processing); partial directed coherence (PDC); Models, Cardiovascular; Signal Processing, Computer-Assisted; propagation pattern analysis; Frequency domain; Settore ING-INF/06 - Bioingegneria Elettronica E Informatica; Pattern recognition (psychology); Algorithms; IEEE Transactions on Bio-Medical Engineering

Propagation pattern analysis during atrial fibrillation based on the adaptive group LASSO.

2012

The present study introduces sparse modeling for the estimation of propagation patterns in intracardiac atrial fibrillation (AF) signals. The estimation is based on the partial directed coherence (PDC) function, derived from fitting a multivariate autoregressive model to the observed signals. A sparse optimization method is proposed for estimation of the model parameters, namely, the adaptive group least absolute selection and shrinkage operator (aLASSO). In simulations aLASSO was found superior to the commonly used least-squares (LS) estimation with respect to estimation performance. The normalized error between the true and estimated model parameters dropped from 0.20 ± 0.04 for LS estimatio…

Normalization (statistics); Computer science; Biomedical Engineering; Health Informatics; Group lasso; Sensitivity and Specificity; Pattern Recognition, Automated; Heart Conduction System; Statistics; Atrial Fibrillation; Coherence (signal processing); Animals; Humans; Computer Simulation; Diagnosis, Computer-Assisted; Time series; Shrinkage; Sparse matrix; Propagation pattern; Models, Cardiovascular; Reproducibility of Results; Electroencephalography; Signal Processing; Settore ING-INF/06 - Bioingegneria Elettronica E Informatica; Algorithms; Annual International Conference of the IEEE Engineering in Medicine and Biology Society

Normalization of T2W-MRI Prostate Images using Rician a priori

2016

Prostate cancer is reported to be the second most frequently diagnosed cancer in men worldwide. In practice, diagnosis can be affected by multiple factors that reduce the chance of detecting potential lesions. In recent decades, new imaging techniques, mainly based on MRI, have been developed in conjunction with Computer-Aided Diagnosis (CAD) systems to help radiologists with such diagnoses. CAD systems are usually designed as a sequential process consisting of four stages: pre-processing, segmentation, registration and classification. As a pre-processing step, image normalization is a critical and important part of the chain in order to design a robust classifier and over…
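The paper's exact procedure is not given in the snippet; as a hedged sketch of normalization under a Rician prior, one can fit a Rician distribution to the T2W intensities (location fixed at zero) and rescale so the fitted scale parameter is one, making intensity ranges comparable across acquisitions. Function name and parameter choices are illustrative:

```python
import numpy as np
from scipy.stats import rice

def rician_normalize(intensities):
    """Hypothetical sketch: fit a Rician distribution to the intensities
    (loc fixed at 0) and divide by the fitted scale parameter."""
    b, loc, scale = rice.fit(intensities, floc=0)
    return intensities / scale, (b, scale)

# synthetic "image" drawn from a Rician with shape 2.0 and scale 30.0
img = rice.rvs(2.0, scale=30.0, size=5000, random_state=0)
norm_img, (b_hat, scale_hat) = rician_normalize(img)
```

After this step, intensity statistics from different patients or scanners live on a common scale, which is what the downstream classifier needs.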

Normalization (statistics); Computer science; Normalization (image processing); T2W-MRI; Engineering Sciences [physics]/Signal and Image processing; Prostate cancer; Prostate; Rician fading; Computer vision; Segmentation; pre-processing; Cancer; Magnetic resonance imaging; Image segmentation; normalization; Computer-aided diagnosis; A priori and a posteriori; Artificial intelligence

Data-independent acquisition strategies for quantitative proteomics

2013

In shotgun proteomics, data-dependent precursor acquisition (DDA) is widely used to profile protein components in complex samples. Although very popular, there are some inherent limitations to the DDA approach, such as irreproducible precursor ion selection, under-sampling and long instrument cycle times. Unbiased ‘data-independent acquisition’ (DIA) strategies try to overcome those limitations. In MSE, which is supported by Waters Q-TOF instrument platforms, such as the Synapt G2-S, a wide band pass filter is used for precursor selection. During acquisition, alternating MS scans are collected at low and high collision energy (CE), providing precursor and fragment ion information, respectiv…

Normalization (statistics); Computer science; Pipeline (computing); Quantitative proteomics; Data-independent acquisition; Filter (signal processing); Shotgun proteomics; Cluster analysis; Proteomics; Biological system

Affine compensation of illumination in hyperspectral remote sensing images

2009

A problem when working with optical satellite or airborne images is the need to compensate for changes in the illumination conditions at the time of acquisition. This is particularly critical when working with time series of data. Atmospheric correction strategies based on radiative transfer codes may provide a rigorous solution but it may not be the best solution for situations where a huge amount of hyperspectral images may need to be processed and computational time is a critical factor. The GMES ("Global Monitoring for Environment and Security") initiative has promoted the creation of a new generation of satellites (the SENTINEL series) with "ultra-high resolution" and "superspectral im…
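An affine compensation of this kind reduces, per band, to fitting a gain and offset between co-registered radiance samples by least squares. A minimal sketch (the variable names and the per-band flattened layout are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

def affine_compensate(band, reference):
    """Fit (a, b) minimizing ||a*band + b - reference||^2 and apply it.

    band, reference: co-registered radiance samples from the same scene,
    e.g. one spectral band from two acquisition dates.
    """
    x = np.asarray(band, float).ravel()
    y = np.asarray(reference, float).ravel()
    A = np.column_stack([x, np.ones_like(x)])       # design matrix [x, 1]
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares gain/offset
    return a * x + b, (a, b)

x = np.array([10.0, 20.0, 30.0, 40.0])
y = 2.0 * x + 5.0                      # reference differs by gain 2, offset 5
corrected, (a, b) = affine_compensate(x, y)
```

This closed-form fit is why the affine approach scales to large image archives where full radiative-transfer correction is too costly.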

Normalization (statistics); Computer science; Multispectral image; Normalization (image processing); Atmospheric correction; Hyperspectral imaging; Data acquisition; Radiance; Radiative transfer; Computer vision; Artificial intelligence; Affine transformation; Image resolution; Remote sensing; 2009 IEEE International Geoscience and Remote Sensing Symposium

Combining Inter-Subject Modeling with a Subject-Based Data Transformation to Improve Affect Recognition from EEG Signals

2019

Existing correlations between features extracted from Electroencephalography (EEG) signals and emotional aspects have motivated the development of a diversity of EEG-based affect detection methods. Both intra-subject and inter-subject approaches have been used in this context. Intra-subject approaches generally suffer from the small sample problem, and require the collection of exhaustive data for each new user before the detection system is usable. On the contrary, inter-subject models do not account for the personality and physiological influence of how the individual is feeling and expressing emotions. In this paper, we analyze both modeling approaches, using three public repositories. T…
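A subject-based data transformation of the kind named in the title can be sketched as z-scoring each feature within each subject's own data before pooling it into an inter-subject model; the paper's exact transform is not given in the snippet, so the function and layout below are illustrative:

```python
import numpy as np

def per_subject_standardize(features, subject_ids):
    """Hypothetical sketch: z-score every feature column within each
    subject, removing per-subject offsets and scales before pooling."""
    X = np.asarray(features, float).copy()
    for s in np.unique(subject_ids):
        m = subject_ids == s
        X[m] = (X[m] - X[m].mean(axis=0)) / (X[m].std(axis=0) + 1e-12)
    return X

# two subjects with very different feature baselines
X = np.array([[1.0, 10.0], [3.0, 14.0], [100.0, 0.0], [104.0, 2.0]])
subj = np.array([0, 0, 1, 1])
Z = per_subject_standardize(X, subj)
```

After the transform, both subjects' features share a common scale, which is the precondition for an inter-subject classifier to generalize.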

Normalization (statistics); Data Analysis; Support Vector Machine; Databases, Factual; Computer science; Emotions; Data transformation (statistics); Context (language use); valence detection; Electroencephalography; Affect (psychology); Machine learning; Biochemistry; Models, Biological; Analytical Chemistry; data transformation; Personality; Humans; EEG; Electrical and Electronic Engineering; Instrumentation; arousal detection; Subject (documents); Signal Processing, Computer-Assisted; Atomic and Molecular Physics and Optics; normalization; Artificial intelligence; Arousal; Sensors

A revised model for lipid-normalizing δ13C values from aquatic organisms, with implications for isotope mixing models

2006

1. Stable isotope analyses coupled with mixing models are being used increasingly to evaluate ecological management issues and questions. Such applications of stable isotope analyses often require simultaneous carbon and nitrogen analyses from the same sample. Correction of the carbon isotope values to take account of the varying content of 13C-depleted lipids is then frequently achieved by a lipid-normalization procedure using a model describing the relationship between change in δ13C following lipid removal and the original C:N ratio of a sample. 2. We evaluated the applicability of two widely used normalization models using empirical data for muscle tissue from a wide range of fish an…
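C:N-based lipid normalization of the kind evaluated here has the general shape "estimate lipid content from C:N, then add a correction to δ13C that grows with lipid content". A hedged sketch of one published form of this model family; the constants below (D, I, and the lipid-content coefficients) are illustrative values quoted from the literature and should be verified against the original source before use:

```python
def lipid_normalize_d13c(d13c, cn_ratio, D=7.018, I=0.048):
    """Sketch of a C:N-based lipid-normalization model.

    Estimates % lipid content L from the C:N ratio, then corrects the
    measured d13C upward (lipids are 13C-depleted). Constants are
    illustrative, not confirmed from this paper. Valid only for C:N
    large enough that the estimated L is positive.
    """
    L = 93.0 / (1.0 + (0.246 * cn_ratio - 0.775) ** -1)  # % lipid estimate
    return d13c + D * (I + 3.90 / (1.0 + 287.0 / L))

low = lipid_normalize_d13c(-25.0, 4.0)   # lean tissue: small correction
high = lipid_normalize_d13c(-25.0, 8.0)  # lipid-rich tissue: larger correction
```

The key behavior for mixing-model applications is that the correction is monotone in C:N: lipid-rich samples receive the larger upward shift.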

Normalization (statistics); Ecology; Isotope; δ13C; Isotopes of carbon; Stable isotope ratio; Normalization model; Freshwater fish; Biology; Biological system; Aquatic organisms; Journal of Applied Ecology

How do normalization schemes affect net spillovers? A replication of the Diebold and Yilmaz (2012) study

2019

Abstract This paper replicates the Diebold and Yilmaz (2012) study on the connectedness of the commodity market and three other financial markets: the stock market, the bond market, and the FX market, based on the Generalized Forecast Error Variance Decomposition, GEFVD. We show that the net spillover indices (of directional connectedness), used to assess the net contribution of one market to overall risk in the system, are sensitive to the normalization scheme applied to the GEFVD. We show that, considering data generating processes characterized by different degrees of persistence and covariance, a scalar-based normalization of the Generalized Forecast Error Variance Decomposition is pref…
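The normalization choice at issue can be shown on a toy GFEVD matrix: Diebold–Yilmaz (2012) divide each row by its own sum, while a scalar-based alternative divides every entry by one common factor. The exact scalar the replication paper prefers is not given in the snippet, so the mean row sum below is an illustrative choice:

```python
import numpy as np

def row_normalize(theta):
    """Diebold-Yilmaz (2012) style: each row of the GFEVD matrix is
    divided by its own sum, so every row sums to one."""
    theta = np.asarray(theta, float)
    return theta / theta.sum(axis=1, keepdims=True)

def scalar_normalize(theta):
    """Illustrative scalar-based alternative: divide every entry by one
    common factor (here the mean row sum), preserving relative
    magnitudes across rows."""
    theta = np.asarray(theta, float)
    return theta / theta.sum(axis=1).mean()

theta = np.array([[0.8, 0.3],
                  [0.1, 0.6]])   # toy GFEVD: rows need not sum to one
R = row_normalize(theta)
S = scalar_normalize(theta)
```

Because the two schemes rescale rows differently, the off-diagonal entries, and hence the net spillover indices built from them, differ, which is the sensitivity the replication documents.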

Normalization (statistics); Economics and Econometrics; Social connectedness; Settore SECS-P/05 - Econometria; Normalization scheme; connectedness; Spillover effect; Econometrics; Mathematics; Vector autoregression models; Financial market; Covariance; Causality; Spillover; General Energy; normalization; Generalized forecast error variance decomposition; Commodity price fluctuations; Driving forces; Nonparametric additive regression models; Variance decomposition of forecast errors; Bond market; Stock market; Simulation

Large two-dimensional electronic systems: Self-consistent energies and densities at low cost

2013

We derive a self-consistent local variant of the Thomas-Fermi approximation for (quasi-) two-dimensional (2D) systems by localizing the Hartree term. The scheme results in an explicit orbital-free representation of the electron density and energy in terms of the external potential, the number of electrons, and the chemical potential determined upon normalization. We test the method over a variety of 2D nanostructures by comparing to the Kohn-Sham 2D local-density approximation (LDA) calculations up to 600 electrons. Accurate results are obtained in view of the negligible computational cost. We also assess a local upper bound for the Hartree energy.
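The "chemical potential determined upon normalization" step can be made concrete: in 2D the Thomas-Fermi density is proportional to (μ − V)₊ with a constant density of states, and μ is the root of the particle-number constraint. A sketch under illustrative units (the `dos` prefactor absorbs m/(πħ²); grid and potential are toy choices, not the paper's test systems):

```python
import numpy as np

def tf_density_2d(V, dA, N, dos=1.0, tol=1e-10):
    """Find mu such that the 2D Thomas-Fermi density integrates to N.

    n(r) = dos * max(mu - V(r), 0); bisection on mu enforces
    sum(n) * dA = N. Units are illustrative.
    """
    V = np.asarray(V, float)

    def number(mu):
        return dos * np.clip(mu - V, 0.0, None).sum() * dA

    lo, hi = V.min(), V.max() + N / (dos * dA)  # number(lo)=0, number(hi)>=N
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if number(mid) < N:
            lo = mid
        else:
            hi = mid
    mu = 0.5 * (lo + hi)
    return dos * np.clip(mu - V, 0.0, None), mu

x = np.linspace(-3, 3, 401)
V = 0.5 * x**2                      # toy confining potential on a 1D grid
n, mu = tf_density_2d(V, dA=x[1] - x[0], N=10.0)
```

Because `number(mu)` is monotone in μ, bisection always converges; the same normalization logic carries over when V is the self-consistent effective potential of the paper's scheme.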

Normalization (statistics); Electron density; Thomas-Fermi approximation; quantum dots; Electron; Upper and lower bounds; Condensed Matter - Strongly Correlated Electrons; Quantum mechanics; Condensed Matter - Mesoscale and Nanoscale Physics (cond-mat.mes-hall); Electronic systems; density functional theory; Physics; Hartree; Condensed Matter Physics; Electronic, Optical and Magnetic Materials; Computational physics; orbital-free functional; Quantum dot; Density functional theory