Search results for "normalization"
Showing 10 of 632 documents
Statistical modelling of non-stationary processes of atmospheric pollution from natural sources: example of birch pollen
2016
A statistical model for predicting daily mean pollen concentrations during the flowering season is constructed and its parameterization and application to birch pollen in Riga (Latvia) are discussed. The model involves several steps of transformations of both meteorological data and pollen observations, aiming at a normally distributed homogeneous stationary dataset with linearized dependencies between the transformed meteorological predictors and pollen concentrations. The data transformation includes normalization of daily mean birch pollen concentrations, a switch of the independent axis from time to heat sum, a projection of governing parameters to pollen concentrations, and a …
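The transformation chain described above (normalizing skewed daily concentrations, then switching the independent axis from time to accumulated heat sum) can be sketched generically. The function names, the log transform, and the 5 °C base temperature below are illustrative assumptions, not the paper's actual parameterization:

```python
import numpy as np

def log_normalize(concentrations):
    """Log-transform daily mean pollen concentrations toward normality.

    A common first step for right-skewed, count-like data; the paper's
    actual transformation chain may differ (hypothetical sketch)."""
    c = np.asarray(concentrations, dtype=float)
    return np.log1p(c)  # log(1 + c) handles zero-pollen days

def heat_sum_axis(daily_mean_temp, base_temp=5.0):
    """Replace the time axis by accumulated heat sum (degree-days above a
    base temperature); base_temp = 5.0 is an assumed parameter."""
    t = np.asarray(daily_mean_temp, dtype=float)
    return np.cumsum(np.maximum(t - base_temp, 0.0))

# toy season: daily pollen counts and daily mean temperatures
conc = [0, 2, 15, 120, 340, 80, 10]
temps = [3.0, 6.5, 9.0, 12.0, 14.5, 11.0, 8.0]
z = log_normalize(conc)      # transformed concentrations
h = heat_sum_axis(temps)     # new independent axis (degree-days)
```

Plotting `z` against `h` instead of against calendar time is what makes seasons with different weather comparable.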
Propagation pattern analysis during atrial fibrillation based on sparse modeling.
2012
In this study, sparse modeling is introduced for the estimation of propagation patterns in intracardiac atrial fibrillation (AF) signals. The estimation is based on the partial directed coherence function, derived from fitting a multivariate autoregressive model to the observed signal using least-squares (LS) estimation. The propagation pattern analysis incorporates prior information on sparse coupling as well as the distance between the recording sites. Two optimization methods are employed for estimation of the model parameters, namely, the adaptive group least absolute selection and shrinkage operator (aLASSO), and a novel method named the distance-adaptive group LASSO (dLASSO). Using si…
Propagation pattern analysis during atrial fibrillation based on the adaptive group LASSO.
2012
The present study introduces sparse modeling for the estimation of propagation patterns in intracardiac atrial fibrillation (AF) signals. The estimation is based on the partial directed coherence (PDC) function, derived from fitting a multivariate autoregressive model to the observed signals. A sparse optimization method is proposed for estimation of the model parameters, namely, the adaptive group least absolute selection and shrinkage operator (aLASSO). In simulations aLASSO was found superior to the commonly used least-squares (LS) estimation with respect to estimation performance. The normalized error between the true and estimated model parameters dropped from 0.20 ± 0.04 for LS estimatio…
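The "normalized error between the true and estimated model parameters" quoted above is presumably a relative matrix-norm error; a minimal sketch under that assumption (the exact definition used in the paper may differ):

```python
import numpy as np

def normalized_error(theta_true, theta_est):
    """Relative Frobenius-norm error between true and estimated MVAR
    coefficient matrices -- one common reading of 'normalized error';
    the paper may define it differently (assumption)."""
    theta_true = np.asarray(theta_true, dtype=float)
    theta_est = np.asarray(theta_est, dtype=float)
    return np.linalg.norm(theta_est - theta_true) / np.linalg.norm(theta_true)

# toy 2-channel AR coefficient matrix and a perturbed estimate
A_true = np.array([[0.50, 0.00],
                   [0.20, 0.40]])
A_est = np.array([[0.45, 0.05],
                  [0.25, 0.35]])
err = normalized_error(A_true, A_est)
```

An error of 0.20 on this scale means the estimate deviates from the truth by 20% of the true coefficients' overall magnitude.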
Normalization of T2W-MRI Prostate Images using Rician a priori
2016
Prostate cancer is reported to be the second most frequently diagnosed cancer in men worldwide. In practice, diagnosis can be affected by multiple factors that reduce the chance of detecting potential lesions. In recent decades, new imaging techniques, mainly based on MRI, have been developed in conjunction with Computer-Aided Diagnosis (CAD) systems to help radiologists with such diagnoses. CAD systems are usually designed as a sequential process consisting of four stages: pre-processing, segmentation, registration and classification. As a pre-processing step, image normalization is a critical stage of the chain in order to design a robust classifier and over…
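As a pre-processing stage of such a CAD chain, a simple per-image z-score intensity normalization is a common baseline. The sketch below shows only that generic baseline and where it sits in the pipeline, not the Rician-prior method the paper proposes:

```python
import numpy as np

def zscore_normalize(image, eps=1e-8):
    """Per-image z-score intensity normalization: zero mean, unit variance.

    A generic baseline pre-processing step for MRI slices; the paper's
    method instead exploits a Rician prior on intensities (this sketch
    is an illustrative stand-in, not that method)."""
    img = np.asarray(image, dtype=float)
    return (img - img.mean()) / (img.std() + eps)

# toy 2x2 "T2W-MRI slice" with arbitrary intensity units
mri_slice = np.array([[10.0, 20.0],
                      [30.0, 40.0]])
norm_slice = zscore_normalize(mri_slice)
```

After this step, intensities from different scanners or acquisitions live on a comparable scale, which is what makes the downstream classifier robust.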
Data-independent acquisition strategies for quantitative proteomics
2013
In shotgun proteomics, data-dependent precursor acquisition (DDA) is widely used to profile protein components in complex samples. Although very popular, there are some inherent limitations to the DDA approach, such as irreproducible precursor ion selection, under-sampling and long instrument cycle times. Unbiased ‘data-independent acquisition’ (DIA) strategies try to overcome those limitations. In MSE, which is supported by Waters Q-TOF instrument platforms, such as the Synapt G2-S, a wide band pass filter is used for precursor selection. During acquisition, alternating MS scans are collected at low and high collision energy (CE), providing precursor and fragment ion information, respectiv…
Affine compensation of illumination in hyperspectral remote sensing images
2009
A problem when working with optical satellite or airborne images is the need to compensate for changes in the illumination conditions at the time of acquisition. This is particularly critical when working with time series of data. Atmospheric correction strategies based on radiative transfer codes may provide a rigorous solution, but not necessarily the best one when a huge amount of hyperspectral images must be processed and computational time is a critical factor. The GMES ("Global Monitoring for Environment and Security") initiative has promoted the creation of a new generation of satellites (the SENTINEL series) with "ultra-high resolution" and "superspectral im…
Combining Inter-Subject Modeling with a Subject-Based Data Transformation to Improve Affect Recognition from EEG Signals
2019
Existing correlations between features extracted from Electroencephalography (EEG) signals and emotional aspects have motivated the development of a diversity of EEG-based affect detection methods. Both intra-subject and inter-subject approaches have been used in this context. Intra-subject approaches generally suffer from the small-sample problem and require the collection of exhaustive data for each new user before the detection system is usable. Conversely, inter-subject models do not account for individual personality and physiological influences on how emotions are felt and expressed. In this paper, we analyze both modeling approaches, using three public repositories. T…
A revised model for lipid-normalizing δ13C values from aquatic organisms, with implications for isotope mixing models
2006
1. Stable isotope analyses coupled with mixing models are being used increasingly to evaluate ecological management issues and questions. Such applications of stable isotope analyses often require simultaneous carbon and nitrogen analyses from the same sample. Correction of the carbon isotope values to take account of the varying content of 13C-depleted lipids is then frequently achieved by a lipid-normalization procedure using a model describing the relationship between change in δ13C following lipid removal and the original C:N ratio of a sample. 2. We evaluated the applicability of two widely used normalization models using empirical data for muscle tissue from a wide range of fish an…
How do normalization schemes affect net spillovers? A replication of the Diebold and Yilmaz (2012) study
2019
This paper replicates the Diebold and Yilmaz (2012) study on the connectedness of the commodity market and three other financial markets: the stock market, the bond market, and the FX market, based on the Generalized Forecast Error Variance Decomposition (GFEVD). We show that the net spillover indices (of directional connectedness), used to assess the net contribution of one market to overall risk in the system, are sensitive to the normalization scheme applied to the GFEVD. We show that, considering data generating processes characterized by different degrees of persistence and covariance, a scalar-based normalization of the Generalized Forecast Error Variance Decomposition is pref…
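In the Diebold–Yilmaz framework, the generalized FEVD is made comparable across rows before spillovers are computed; the abstract's point is that how this is done changes the net indices. A hedged sketch contrasting the usual row-sum normalization with a scalar one (the grand-total divisor here is an illustrative choice, not necessarily the scalar the paper prefers):

```python
import numpy as np

def row_normalize(fevd):
    """Row-wise normalization: divide each row of the generalized FEVD by
    its row sum, as in Diebold and Yilmaz (2012)."""
    fevd = np.asarray(fevd, dtype=float)
    return fevd / fevd.sum(axis=1, keepdims=True)

def scalar_normalize(fevd):
    """Scalar normalization: divide every entry by a single constant
    (here the grand total -- an assumed, illustrative choice)."""
    fevd = np.asarray(fevd, dtype=float)
    return fevd / fevd.sum()

def net_spillover(theta):
    """Net directional spillover per market: spillovers received minus
    spillovers transmitted (column sum minus row sum of off-diagonals)."""
    theta = np.asarray(theta, dtype=float)
    off = theta - np.diag(np.diag(theta))
    return off.sum(axis=0) - off.sum(axis=1)

# toy 2-market generalized FEVD (rows need not sum to 1 before normalization)
fevd = np.array([[0.8, 0.3],
                 [0.2, 0.9]])
net_row = net_spillover(row_normalize(fevd))
net_scalar = net_spillover(scalar_normalize(fevd))
```

Even in this toy case the two schemes give net indices of different magnitude for the same market, which is the sensitivity the replication documents.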
Large two-dimensional electronic systems: Self-consistent energies and densities at low cost
2013
We derive a self-consistent local variant of the Thomas-Fermi approximation for (quasi-) two-dimensional (2D) systems by localizing the Hartree term. The scheme results in an explicit orbital-free representation of the electron density and energy in terms of the external potential, the number of electrons, and the chemical potential determined upon normalization. We test the method over a variety of 2D nanostructures by comparing to Kohn-Sham 2D local-density approximation (LDA) calculations with up to 600 electrons. Accurate results are obtained in view of the negligible computational cost. We also assess a local upper bound for the Hartree energy.
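The phrase "chemical potential determined upon normalization" refers to fixing μ so that the density integrates to the electron number. A toy sketch of that normalization step for a simplified 2D Thomas-Fermi density (the prefactor, the harmonic potential, and the bisection bounds are illustrative, not the paper's self-consistent scheme):

```python
import numpy as np

def tf_density_2d(mu, V):
    """Simplified 2D Thomas-Fermi density on a grid: n = (mu - V)/pi where
    mu > V, else 0 (effective atomic units; illustrative prefactor)."""
    return np.maximum(mu - V, 0.0) / np.pi

def find_mu(V, n_electrons, dA, tol=1e-10):
    """Bisection on the chemical potential so that the density integrates
    to the requested electron number -- the 'normalization' condition."""
    lo, hi = V.min(), V.max() + 10.0 * n_electrons
    for _ in range(200):
        mu = 0.5 * (lo + hi)
        n_tot = tf_density_2d(mu, V).sum() * dA
        if abs(n_tot - n_electrons) < tol:
            break
        if n_tot < n_electrons:
            lo = mu
        else:
            hi = mu
    return mu

# toy harmonic confinement on a square grid
x = np.linspace(-6.0, 6.0, 201)
X, Y = np.meshgrid(x, x)
V = 0.5 * (X**2 + Y**2)
dA = (x[1] - x[0])**2

mu = find_mu(V, n_electrons=10.0, dA=dA)
N_check = tf_density_2d(mu, V).sum() * dA  # recovers the electron number
```

For this density and potential the exact result is μ = √N, so the bisection should land near √10 ≈ 3.16 up to grid discretization.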