Search results for "normalization"

Showing 10 of 632 documents

BANΔIT: B’‐factor Analysis for Drug Design and Structural Biology

2020

The analysis of B‐factor profiles from X‐ray protein structures can be utilized for structure‐based drug design since protein mobility changes have been associated with the quality of protein‐ligand interactions. With the BANΔIT (B’‐factor analysis and ΔB’ interpretation toolkit), we have developed a JavaScript‐based browser application that provides a graphical user interface for the normalization and analysis of B’‐factor profiles. To emphasize the usability for rational drug design applications, we have analyzed a selection of crystallographic protein‐ligand complexes and have given exemplary conclusions for further drug optimization including the development of a B’‐factor‐supported pha…
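The excerpt does not give BANΔIT's exact normalization, but B-factor profiles are routinely made comparable across structures by z-scoring each profile; a minimal sketch of that idea (function names are hypothetical, not from the toolkit):

```python
from statistics import mean, stdev

def normalize_bfactors(bfactors):
    """Z-score a per-residue B-factor profile so structures refined at
    different resolutions become comparable (a common B'-style scaling)."""
    mu, sigma = mean(bfactors), stdev(bfactors)
    return [(b - mu) / sigma for b in bfactors]

def delta_b_prime(complexed, apo):
    """Per-residue difference of two normalized profiles, in the spirit of
    the toolkit's ΔB' comparison between liganded and apo structures."""
    return [c - a for c, a in zip(normalize_bfactors(complexed),
                                  normalize_bfactors(apo))]
```

After z-scoring, a residue's value expresses how mobile it is relative to the rest of its own structure, which is what makes profiles from different crystals comparable.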

Keywords: Normalization (statistics); Source code; Bioinformatics; Drug design; B-factor; Molecular modeling; Web browser; JavaScript; Structural biology; Drug discovery; Protein flexibility; Protease inhibitors; Graphical user interface; SARS-CoV-2; Computational biology; Usability; COVID-19 drug treatment; Pharmacophore. Published in: Molecular Informatics.

Amyloid-PET analysis based on tissue probability maps

2018

Purpose The regional quantification of amyloid burden is crucial for the clinical diagnosis of Alzheimer’s disease [1]. The best method to evaluate regional amyloid deposition in PET is through the use of MR imaging for brain space normalization. However, since MR imaging is not always available in clinical practice, an MR-less methodology is needed to compute semi-quantitative measures and analyze regional amyloid burden. Methods Forty-four patients with clinical evidence of dementia underwent 18F-Florbetaben PET (FBB-PET), FDG-PET, neuropsychological assessment and cerebrospinal fluid analysis. We implemented a methodology that uses SPM12 to import and normalize the FBB-PET images in Mon…

Keywords: Normalization (statistics); Receiver operating characteristic; Biophysics; Precuneus; Amyloid PET; Dementia; Cutoff; Neuropsychological assessment; Paracentral lobule; Nuclear medicine. Published in: Physica Medica.

Identification of Spatial-Temporal Muscle Synergies from EMG Epochs of Various Durations: A Time-Warped Tensor Decomposition

2018

Extraction of muscle synergies from electromyography (EMG) recordings relies on the analysis of multi-trial muscle activation data. To identify the underlying modular structure, dimensionality reduction algorithms are usually applied to the EMG signals. This process requires a rigid alignment of muscle activity across trials that is typically achieved by the normalization of the length of each trial. However, this time-normalization ignores important temporal variability that is present on single trials as result of neuromechanical processes or task demands. To overcome this limitation, we propose a novel method that simultaneously aligns muscle activity data and extracts spatial and tempor…
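The length-normalization the abstract criticizes is usually a resampling of each trial onto a fixed number of time points; a bare-bones linear-interpolation version (illustrative, not the authors' implementation):

```python
def time_normalize(trial, n_points=101):
    """Resample one trial's EMG envelope onto a fixed 0-100% time base
    by linear interpolation, rigidly aligning trials of unequal duration."""
    m = len(trial)
    resampled = []
    for i in range(n_points):
        t = i * (m - 1) / (n_points - 1)  # fractional index into the trial
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        resampled.append(trial[lo] * (1.0 - frac) + trial[hi] * frac)
    return resampled
```

This is exactly the step that discards trial-to-trial temporal variability, which motivates the time-warped decomposition proposed in the paper.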

Keywords: Normalization (statistics); Computer science; Dimensionality reduction; Pattern recognition; Electromyography; Artificial intelligence; Time complexity.

Testing the effects of pre-processing on voxel based morphometry analysis

2015

Voxel-based morphometry (VBM) is an automated analysis technique that allows voxel-wise comparison of mainly grey-matter volumes between two magnetic resonance images (MRI). Two main analysis processes are possible in VBM. The first is cross-sectional data analysis, where one group is compared with another to identify the brain regions that show differences in grey-matter volume. The second is longitudinal data analysis, where MRIs taken at different time points are compared to identify the brain regions whose grey-matter volume changes from one time point to another. Both types of analysis require pre-processing steps before performing the statis…

Keywords: Normalization (statistics); Pattern recognition; Magnetic resonance imaging; Voxel-based morphometry; Grey matter; Cross-sectional studies; Image processing (computer-assisted); Computer vision; Artificial intelligence; Smoothing. Published in: 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC).

Determining the optimal maximal and submaximal voluntary contraction tests for normalizing the erector spinae muscles

2019

Background This study aimed to identify which maximum voluntary isometric contraction (MVIC) and sub-MVIC tests produce the highest activation of the erector spinae muscles and the greatest reduction in inter-individual variability, to put them forward as reference normalization maneuvers for future studies. Methods Erector spinae EMG activity was recorded in 38 healthy women during five submaximal and three maximal exercises. Results None of the three MVIC tests generated the maximal activation level in all the participants. The maximal activation level was achieved in 68.4% of cases with the test performed on the Roman chair in the horizontal position (96.3 ± 7.3; p < 0.01). Of the fi…
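Operationally, normalization to a reference contraction means expressing task EMG amplitude as a percentage of the best value obtained across the reference tests; a schematic helper (illustrative, not the study's code):

```python
def normalize_to_reference(task_rms, reference_rms_values):
    """Express a task EMG amplitude (e.g. RMS) as %MVIC, using the highest
    amplitude recorded across the available reference contractions."""
    reference = max(reference_rms_values)
    return 100.0 * task_rms / reference
```

The choice of reference maneuver matters precisely because, as the Results note, no single MVIC test elicits maximal activation in every participant.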

Keywords: Normalization (statistics); Electromyography; Isometric exercise; Erector spinae muscles; Maximum voluntary isometric contraction; Sub-maximum voluntary isometric contraction; Roman chair; Kinesiology; Trunk; Orthopedics.

On a new robust workflow for the statistical and spatial analysis of fracture data collected with scanlines (or the importance of stationarity)

2020

We present an innovative workflow for the statistical analysis of fracture data collected along scanlines, composed of two major stages, each with alternative options. A prerequisite in our analysis is the assessment of the stationarity of the dataset, which is motivated by statistical and geological considerations. Calculating statistics on non-stationary data can be statistically meaningless, and the normalization and/or sub-setting approach that we discuss here can greatly improve our understanding of geological deformation processes. Our methodology is based on performing non-parametric statistical tests, which allow the detection of important features of the spatial distrib…
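The abstract does not name the specific non-parametric tests, but a Mann-Kendall trend statistic on the ordered measurements (for example, fracture spacings in scanline order) is one standard way to screen for the non-stationarity discussed here; a sketch under that assumption:

```python
import math

def mann_kendall_z(x):
    """Mann-Kendall trend test statistic (no-ties variance): |z| above ~1.96
    suggests a significant monotonic trend, i.e. a non-stationary sequence."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1)
            for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0
```

A significant trend would argue for the sub-setting or normalization step before any summary statistics are computed.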

Keywords: Normalization (statistics); Statistical and spatial analysis of fracture data; Stratigraphy; Geology; Geophysics; Workflow; Geochemistry and petrology; Fracture (geology); Statistical analysis; Spatial analysis. Published in: Solid Earth.

A code to calculate (high order) Bessel functions based on the continued fractions method

1993

We have developed a fast code to calculate Bessel functions of integer and fractional order based on the continued fractions method. This algorithm is especially useful for Bessel functions of high order because it does not require any recalculation using normalization relations.
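For context, the "normalization relations" are the backbone of the classical alternative, Miller's downward recurrence, in which the whole profile of computed values must be rescaled by a sum identity; a sketch of that classical method (not the paper's continued-fractions code) for integer order and x > 0:

```python
def bessel_j(n, x, extra=30):
    """Integer-order J_n(x) by Miller's downward recurrence, rescaled with
    the normalization relation J_0(x) + 2*sum_k J_{2k}(x) = 1 -- the
    recalculation step the continued-fractions method avoids."""
    m = n + extra + int(x)
    if m % 2:            # start from an even order so the even-order sum is complete
        m += 1
    b = [0.0] * (m + 2)
    b[m] = 1e-30         # arbitrary tiny seed; the true scale is fixed by the sum
    for k in range(m, 0, -1):
        b[k - 1] = (2.0 * k / x) * b[k] - b[k + 1]
    norm = b[0] + 2.0 * sum(b[2 * i] for i in range(1, m // 2 + 1))
    return b[n] / norm
```

The continued-fractions approach instead computes the ratio J_{n+1}(x)/J_n(x) directly, so no such global rescaling pass is needed — the advantage the abstract highlights for high orders.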

Keywords: Normalization (statistics); Computer program; Mathematical analysis; Bessel function; High order; Mathematics. Published in: Computer Physics Communications.

Visual aftereffects and sensory nonlinearities from a single statistical framework

2015

When adapted to a particular scenery our senses may fool us: colors are misinterpreted, certain spatial patterns seem to fade out, and static objects appear to move in reverse. A mere empirical description of the mechanisms tuned to color, texture, and motion may tell us where these visual illusions come from. However, such empirical models of gain control do not explain why these mechanisms work in this apparently dysfunctional manner. Current normative explanations of aftereffects based on scene statistics derive gain changes by (1) invoking decorrelation and linear manifold matching/equalization, or (2) using nonlinear divisive normalization obtained from parametric scene models. These p…
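The divisive normalization mentioned in option (2) has a standard canonical form: each channel's driven response is divided by a pooled measure of activity across channels. A minimal sketch with illustrative parameter values:

```python
def divisive_normalization(responses, sigma=0.1, gamma=2.0):
    """Canonical divisive normalization: channel i's output is its driven
    response r_i**gamma divided by a saturation constant plus the pooled
    activity of all channels (parameter values are illustrative)."""
    pooled = sigma ** gamma + sum(r ** gamma for r in responses)
    return [r ** gamma / pooled for r in responses]
```

Because the pool grows with overall stimulation, strong activity in neighboring channels suppresses each channel's output — the gain-control behavior that the adaptation models discussed here build on.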

Keywords: Normalization (statistics); Adaptation; Unsupervised learning; Scene statistics; Decorrelation; Color aftereffect; Texture aftereffect; Motion aftereffect; Optical illusion; Sequential principal curves analysis; Neuroscience. Published in: Frontiers in Human Neuroscience.

How to standardize (if you must)

2017

In many situations we are interested in appraising the value of a certain characteristic for a given individual relative to the context in which this value is observed. In recent years this problem has become prominent in the evaluation of scientific productivity and impact. A popular approach to such relative valuations consists in using percentile ranks. This is a purely ordinal method that may sometimes lead to counterintuitive appraisals, in that it discards all information about the distance between the raw values within a given context. By contrast, this information is partly preserved by using standardization, i.e., by transforming the absolute values in such a way that, within the s…
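The contrast the abstract draws — ordinal percentile ranks discarding the distance information that standardization keeps — is easy to make concrete with the two textbook definitions (a definitional sketch, not the paper's exact proposal):

```python
def percentile_rank(value, context):
    """Purely ordinal: percentage of context values at or below `value`."""
    return 100.0 * sum(v <= value for v in context) / len(context)

def z_score(value, context):
    """Standardization: preserves relative distances between raw values."""
    mu = sum(context) / len(context)
    var = sum((v - mu) ** 2 for v in context) / len(context)
    return (value - mu) / var ** 0.5
```

The top value of [1, 2, 3] and of [1, 2, 100] both receive percentile rank 100, while their z-scores differ — exactly the distance information the ordinal method throws away.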

Keywords: Normalization; Standardization; z-score; m-score; Percentile rank; Location statistics; Dispersion statistics; Monotonic function; Citation analysis; Econometrics.

Large area strip edgeless detectors fabricated by plasma etching process

2007

This work presents the latest results from a large-area edgeless detector, fabricated by a plasma etching process that reduces the conventional width of the terminating structure of position-sensitive detectors to the detector rim. A current terminating ring is used to decouple the electrical behavior of the surface from the sensitive volume within a few tens of micrometers. The detectors have been illuminated using an infrared laser and their surface scanned in order to understand their collection behavior at the cut edge. The detectors have very high efficiency up to the insensitive area, which is located about 60 μm from the detector edge.

Keywords: Normalization property; Optics; Plasma etching; Materials science; Detector; Infrared laser; Readout electronics; Edge (geometry). Published in: 2007 IEEE Nuclear Science Symposium Conference Record.