Search results for "normalization"
Showing 10 of 632 documents
BANΔIT: B’‐factor Analysis for Drug Design and Structural Biology
2020
The analysis of B‐factor profiles from X‐ray protein structures can be utilized for structure‐based drug design since protein mobility changes have been associated with the quality of protein‐ligand interactions. With the BANΔIT (B’‐factor analysis and ΔB’ interpretation toolkit), we have developed a JavaScript‐based browser application that provides a graphical user interface for the normalization and analysis of B’‐factor profiles. To emphasize the usability for rational drug design applications, we have analyzed a selection of crystallographic protein‐ligand complexes and have given exemplary conclusions for further drug optimization including the development of a B’‐factor‐supported pha…
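The snippet does not give BANΔIT's exact normalization formula, but a common B'-style scheme is a per-structure z-score of the raw B-factors, with ligand-induced mobility changes read off as differences of normalized profiles. A minimal sketch under that assumption (function names and the apo/holo comparison are illustrative, not the toolkit's API):

```python
def normalize_bfactors(b_values):
    """Z-score normalize one structure's B-factors (a common B'-style
    normalization; BANDIT's exact scheme may differ)."""
    n = len(b_values)
    mu = sum(b_values) / n
    sigma = (sum((b - mu) ** 2 for b in b_values) / n) ** 0.5
    return [(b - mu) / sigma for b in b_values]

def delta_b(apo_norm, holo_norm):
    """Per-residue difference of normalized B-factors between two
    structures, e.g. unbound vs. ligand-bound (hypothetical usage)."""
    return [h - a for a, h in zip(apo_norm, holo_norm)]
```

Normalizing per structure makes profiles from different crystals comparable despite differing overall resolution and refinement.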
Amyloid-PET analysis based on tissue probability maps
2018
Purpose The regional quantification of amyloid burden is crucial for the clinical diagnosis of Alzheimer’s disease [1]. The best method to evaluate regional amyloid deposition in PET is through the use of MR imaging for brain space normalization. However, since MR imaging is not always available in clinical practice, an MR-less methodology is needed to compute and analyze semi-quantitative regional amyloid burden. Methods Forty-four patients with clinical evidence of dementia underwent 18F-Florbetaben PET (FBB-PET), FDG-PET, neuropsychological assessment and cerebrospinal fluid analysis. We implemented a methodology that uses SPM12 to import and normalize the FBB-PET images in Mon…
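The standard semi-quantitative measure for amyloid PET after spatial normalization is the standardized uptake value ratio (SUVR): mean uptake in a target region divided by mean uptake in a reference region. The snippet does not state which regions this study used, so the sketch below is generic:

```python
def suvr(target_values, reference_values):
    """Standardized uptake value ratio: mean uptake over target-region
    voxels divided by mean uptake over reference-region voxels (the
    reference region, e.g. cerebellum, is an assumption here)."""
    mean_target = sum(target_values) / len(target_values)
    mean_reference = sum(reference_values) / len(reference_values)
    return mean_target / mean_reference
```

SUVR > 1 indicates higher tracer retention in the target region than in the reference tissue.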
Identification of Spatial-Temporal Muscle Synergies from EMG Epochs of Various Durations: A Time-Warped Tensor Decomposition
2018
Extraction of muscle synergies from electromyography (EMG) recordings relies on the analysis of multi-trial muscle activation data. To identify the underlying modular structure, dimensionality reduction algorithms are usually applied to the EMG signals. This process requires a rigid alignment of muscle activity across trials that is typically achieved by the normalization of the length of each trial. However, this time-normalization ignores important temporal variability that is present on single trials as result of neuromechanical processes or task demands. To overcome this limitation, we propose a novel method that simultaneously aligns muscle activity data and extracts spatial and tempor…
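The conventional rigid alignment that this paper improves on, resampling every trial to a fixed length before dimensionality reduction, can be sketched with plain linear interpolation (the sample count is illustrative):

```python
def time_normalize(trial, n_samples=101):
    """Linearly resample one trial's EMG envelope to a fixed number of
    samples. This is the rigid time-normalization the paper criticizes:
    it discards trial-to-trial temporal variability."""
    m = len(trial)
    out = []
    for i in range(n_samples):
        pos = i * (m - 1) / (n_samples - 1)  # fractional index in trial
        lo = int(pos)
        hi = min(lo + 1, m - 1)
        frac = pos - lo
        out.append(trial[lo] * (1 - frac) + trial[hi] * frac)
    return out
```

After this step, trials of different durations form a rectangular array suitable for standard matrix or tensor factorizations; the time-warped decomposition proposed here instead aligns and factorizes jointly.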
Testing the effects of pre-processing on voxel based morphometry analysis
2015
Voxel based morphometry (VBM) is an automated analysis technique which allows voxel-wise comparison, mainly of grey-matter volumes, between two magnetic resonance images (MRI). Two main analysis processes in VBM are possible. One is cross-sectional data analysis, where one group is compared with another to see the regions in the brain that show changes in their grey-matter volume. The second is longitudinal data analysis, where MRIs taken at different time points are compared to see the regions in the brain whose grey-matter volume changes from one time point to another. Both types of analyses require pre-processing steps before performing the statis…
Determining the optimal maximal and submaximal voluntary contraction tests for normalizing the erector spinae muscles
2019
Background This study aimed to identify which maximum voluntary isometric contraction (MVIC) and sub-MVIC tests produce the highest activation of the erector spinae muscles and the greatest reduction in inter-individual variability, to put them forward as reference normalization maneuvers for future studies. Methods Erector spinae EMG activity was recorded in 38 healthy women during five submaximal and three maximal exercises. Results None of the three MVIC tests generated the maximal activation level in all the participants. The maximal activation level was achieved in 68.4% of cases with the test performed on the roman chair in the horizontal position (96.3 ± 7.3; p < 0.01). Of the fi…
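The normalization these reference maneuvers feed into is standard: express every sample of a task trial as a percentage of the peak activation recorded during the MVIC test. A minimal sketch (the envelope window length is an assumption, not the study's protocol):

```python
def mvic_peak(mvic_trial, window=50):
    """Peak of a moving-average envelope over the rectified MVIC trial
    (window length in samples is illustrative)."""
    rectified = [abs(s) for s in mvic_trial]
    best = 0.0
    for i in range(len(rectified) - window + 1):
        best = max(best, sum(rectified[i:i + window]) / window)
    return best

def normalize_emg(trial, peak):
    """Express each rectified sample as a percentage of the MVIC peak."""
    return [100.0 * abs(s) / peak for s in trial]
```

Normalizing to a common reference contraction is what allows activation levels to be compared across individuals despite differences in electrode placement and tissue impedance.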
On a new robust workflow for the statistical and spatial analysis of fracture data collected with scanlines (or the importance of stationarity)
2020
We present an innovative workflow for the statistical analysis of fracture data collected along scanlines, composed of two major stages, each with alternative options. A prerequisite in our analysis is assessing the stationarity of the dataset, which is motivated by statistical and geological considerations. Calculating statistics on non-stationary data can be statistically meaningless; moreover, the normalization and/or sub-setting approach that we discuss here can greatly improve our understanding of geological deformation processes. Our methodology is based on performing non-parametric statistical tests, which allow detecting important features of the spatial distrib…
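The snippet does not name the specific non-parametric tests used, but a widely used one for detecting a trend (i.e. non-stationarity) in an ordered sequence such as fracture spacings along a scanline is the Mann-Kendall test, sketched here as an illustration:

```python
import math

def mann_kendall(x):
    """Mann-Kendall non-parametric trend test (one common stationarity
    check; the paper's actual test suite is not given in this snippet).
    Returns (S statistic, z score); |z| > 1.96 suggests a monotonic
    trend at roughly the 5% level. No tie correction applied."""
    n = len(x)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            s += (x[j] > x[i]) - (x[j] < x[i])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

A significant z on spacing data would motivate the sub-setting or normalization step the workflow describes before computing summary statistics.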
A code to calculate (high order) Bessel functions based on the continued fractions method
1993
We have developed a fast code to calculate Bessel functions of integer and fractional order based on the continued fractions method. This algorithm is especially useful for Bessel functions of high order because it does not require any recalculation using normalization relations.
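The idea behind the continued-fraction approach can be sketched for integer order: the recurrence J_{n-1}(x) + J_{n+1}(x) = (2n/x) J_n(x) gives a continued fraction for the ratio r_n = J_n(x)/J_{n-1}(x), and chaining ratios up from a directly computed J_0 yields J_n without the Miller-type normalization sum. This is a generic illustration of the technique, not the paper's code:

```python
def bessel_j0_series(x, terms=40):
    # Power series for J0(x); adequate for moderate |x|
    s, term = 1.0, 1.0
    for k in range(1, terms):
        term *= -(x * x) / (4.0 * k * k)
        s += term
    return s

def bessel_ratio_cf(n, x, tol=1e-15, max_iter=200):
    # Continued fraction 1/r_n = 2n/x - 1/(2(n+1)/x - 1/(...)),
    # evaluated with the modified Lentz algorithm
    tiny = 1e-30
    f = 2.0 * n / x
    if f == 0.0:
        f = tiny
    C, D = f, 0.0
    for k in range(1, max_iter):
        bk = 2.0 * (n + k) / x
        D = bk - D
        if D == 0.0:
            D = tiny
        C = bk - 1.0 / C
        if C == 0.0:
            C = tiny
        D = 1.0 / D
        delta = C * D
        f *= delta
        if abs(delta - 1.0) < tol:
            break
    return 1.0 / f

def bessel_jn(n, x):
    # Chain ratios down from J0: J_n = J0 * r_1 * r_2 * ... * r_n
    j = bessel_j0_series(x)
    for k in range(1, n + 1):
        j *= bessel_ratio_cf(k, x)
    return j
```

Unlike downward recurrence with Miller's algorithm, each ratio here is obtained directly, so no trial run followed by renormalization is needed.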
Visual aftereffects and sensory nonlinearities from a single statistical framework
2015
When adapted to a particular scene, our senses may fool us: colors are misinterpreted, certain spatial patterns seem to fade out, and static objects appear to move in reverse. A mere empirical description of the mechanisms tuned to color, texture, and motion may tell us where these visual illusions come from. However, such empirical models of gain control do not explain why these mechanisms work in this apparently dysfunctional manner. Current normative explanations of aftereffects based on scene statistics derive gain changes by (1) invoking decorrelation and linear manifold matching/equalization, or (2) using nonlinear divisive normalization obtained from parametric scene models. These p…
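The divisive normalization mentioned in option (2) has a canonical form: each unit's (exponentiated) response is divided by a pooled signal over the population, so strong joint stimulation suppresses individual gains. A minimal sketch with illustrative parameters (not fitted values from the paper):

```python
def divisive_normalization(responses, beta=1.0, gamma=2.0):
    """Canonical divisive normalization: r_i^gamma / (beta + sum_j r_j^gamma).
    beta (semi-saturation) and gamma (exponent) are illustrative."""
    pooled = beta + sum(r ** gamma for r in responses)
    return [r ** gamma / pooled for r in responses]
```

Because the denominator is shared, raising any one input reduces the normalized output of all the others, which is the gain-control behavior adaptation models exploit.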
How to standardize (if you must)
2017
In many situations we are interested in appraising the value of a certain characteristic for a given individual relative to the context in which this value is observed. In recent years this problem has become prominent in the evaluation of scientific productivity and impact. A popular approach to such relative valuations consists in using percentile ranks. This is a purely ordinal method that may sometimes lead to counterintuitive appraisals, in that it discards all information about the distance between the raw values within a given context. By contrast, this information is partly preserved by using standardization, i.e., by transforming the absolute values in such a way that, within the s…
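The contrast drawn here between purely ordinal percentile ranks and distance-preserving standardization is easy to make concrete (the context values below are made up for illustration):

```python
from statistics import mean, pstdev

def percentile_rank(value, context):
    """Ordinal method: percentage of the context strictly below value.
    Discards all information about distances between raw values."""
    return 100.0 * sum(v < value for v in context) / len(context)

def z_score(value, context):
    """Standardization: (value - mean) / population std deviation.
    Partly preserves distances between raw values within the context."""
    return (value - mean(context)) / pstdev(context)
```

For the context [1, 2, 3, 4, 100], the values 4 and 100 have percentile ranks of 60 and 80, one rank step apart, while their z-scores (about -0.46 and 2.0) reflect how far the outlier actually sits from the rest.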
Large area strip edgeless detectors fabricated by plasma etching process
2007
This work presents the latest results from large-area edgeless detectors fabricated by a plasma etching process to reduce the conventional width of the terminating structure of position-sensitive detectors at the detector rim. A current-terminating ring is used to decouple the electrical behavior of the surface from the sensitive volume within a few tens of micrometers. The detectors have been illuminated with an infrared laser and their surface scanned in order to understand their collection behavior at the cut edge. The detectors have very high efficiency up to the insensitive area, which is located about 60 μm from the detector edge.