Search results for "Transfer entropy"
Showing 10 of 44 documents
Multiscale partial information decomposition of dynamic processes with short and long-range correlations: theory and application to cardiovascular co…
2022
Objective. In this work, an analytical framework for the multiscale analysis of multivariate Gaussian processes is presented, whereby the computation of Partial Information Decomposition measures is achieved while accounting for the simultaneous presence of short-term dynamics and long-range correlations. Approach. We consider physiological time series mapping the activity of the cardiac, vascular and respiratory systems in the field of Network Physiology. In this context, the multiscale representation of transfer entropy within the network of interactions among systolic arterial pressure (S), respiration (R) and heart period (H), as well as the decomposition into unique, redundant and s…
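For reference, in the standard two-source partial information decomposition (generic notation, not necessarily the paper's), the joint transfer entropy from S and R to H splits into unique, redundant and synergistic atoms:

    T_{S,R \to H} = U_{S \to H} + U_{R \to H} + R_{S,R \to H} + S_{S,R \to H}, \qquad T_{X \to Y} = I(Y_n ; X_n^- \mid Y_n^-)

where X_n^- and Y_n^- denote the past states of the source and target processes.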
Bidirected Information Flow in the High-Level Visual Cortex
2021
Understanding brain function requires investigating information transfer across brain regions. Information theory, founded by Shannon in 1948, offers two broad families of tools for this purpose: directed and undirected information-theoretic approaches. Neural signals are typically nonlinear, and information flow between brain regions is directed. Directed information can be used to quantify feed-forward information flow, feedback information flow, and instantaneous influence in the high-level visual cortex. Moreover, neural signals have bidirectional information flow properties that are not captured by the transfer entropy approach. Therefore, we used directed information …
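As background (standard definitions rather than anything specific to this abstract): directed information sums conditional mutual information terms that include the present source sample, which is how it captures instantaneous influence, whereas transfer entropy conditions only on the source's past:

    I(X^N \to Y^N) = \sum_{n=1}^{N} I(X^n ; Y_n \mid Y^{n-1}), \qquad T_{X \to Y} = I(Y_n ; X^{n-1} \mid Y^{n-1})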
Synergistic Information Transfer in the Global System of Financial Markets.
2020
Uncovering dynamic information flow between stock market indices has been the topic of several studies which exploited the notion of transfer entropy or Granger causality, its linear version. The output of the transfer entropy approach is a directed weighted graph measuring the information about the future state of each target provided by the knowledge of the state of each driving stock market index. In order to go beyond the pairwise description of the information flow, thus looking at higher order informational circuits, here we apply the partial information decomposition to triplets consisting of a pair of driving markets (belonging to America or Europe) and a target market in Asia. Our …
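As a concrete sketch of the pairwise step described here (a linear-Gaussian TE estimate is assumed purely for illustration; it may differ from the estimator the authors actually used, and all names below are made up):

    import numpy as np

    def linear_te(x, y, p=1):
        """Linear (Gaussian) transfer entropy x -> y of order p, in nats:
        half the log ratio of residual variances of the restricted vs full OLS models."""
        n = len(y)
        target = y[p:]
        past_y = np.column_stack([y[p - k - 1:n - k - 1] for k in range(p)])
        past_x = np.column_stack([x[p - k - 1:n - k - 1] for k in range(p)])

        def resid_var(design):
            A = np.column_stack([np.ones(len(target)), design])
            beta, *_ = np.linalg.lstsq(A, target, rcond=None)
            return np.var(target - A @ beta)

        restricted = resid_var(past_y)                   # target past only
        full = resid_var(np.hstack([past_y, past_x]))    # target past + driver past
        return 0.5 * np.log(restricted / full)

    # Toy directed weighted graph over a few index return series:
    rng = np.random.default_rng(0)
    returns = {"SP500": rng.standard_normal(500), "DAX": rng.standard_normal(500)}
    graph = {(a, b): linear_te(returns[a], returns[b])
             for a in returns for b in returns if a != b}

Each edge weight is the information (in nats) that the past of the driving index provides about the future of the target index, beyond the target's own past.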
Compensating for instantaneous signal mixing in transfer entropy analysis of neurobiological time series
2013
The transfer entropy (TE) has recently emerged as a nonlinear model-free tool, framed in information theory, to detect directed interactions in coupled processes. Unfortunately, when applied to neurobiological time series TE is biased by signal cross-talk due to volume conduction. To compensate for this bias, in this study we introduce a modified TE measure which accounts for possible instantaneous effects between the analyzed time series. The new measure, denoted as compensated TE (cTE), is tested on simulated time series reproducing conditions typical of neuroscience applications, and on real magnetoencephalographic (MEG) multi-trial data measured during a visuo-tactile cognitive experime…
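The precise form of the compensation is given in the full text; one common way to account for zero-lag cross-talk, sketched here only as an assumption about what "compensated" means, is to add the present source sample to the conditioning set of the TE:

    T_{X \to Y} = I(Y_n ; X_n^- \mid Y_n^-) \quad \longrightarrow \quad cTE_{X \to Y} = I(Y_n ; X_n^- \mid Y_n^-, X_n)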
Information decomposition of multichannel EMG to map functional interactions in the distributed motor system
2019
The central nervous system needs to coordinate multiple muscles during postural control. Functional coordination is established through the neural circuitry that interconnects different muscles. Here we used multivariate information decomposition of multichannel EMG acquired from 14 healthy participants during postural tasks to investigate the neural interactions between muscles. A set of information measures was estimated from an instantaneous linear regression model and a time-lagged VAR model fitted to the EMG envelopes of 36 muscles. We used network analysis to quantify the structure of functional interactions between muscles and compared them across experimental conditions. Co…
Linear and non-linear brain-heart and brain-brain interactions during sleep.
2015
In this study, the physiological networks underlying the joint modulation of the parasympathetic component of heart rate variability (HRV) and of the different electroencephalographic (EEG) rhythms during sleep were assessed using two popular measures of directed interaction in multivariate time series, namely Granger causality (GC) and transfer entropy (TE). Time series representative of cardiac and brain activities were obtained in 10 young healthy subjects as the normalized high frequency (HF) component of HRV and EEG power in the δ, θ, α, σ, and β bands, measured during the whole duration of sleep. The magnitude and statistical significance of GC and TE were evaluated between each …
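A useful reference point when both measures are reported: for jointly Gaussian (linear) processes the two are equivalent up to a factor of two (Barnett, Barrett and Seth, 2009), so discrepancies between GC and TE results flag nonlinear contributions:

    F_{X \to Y} = 2\, T_{X \to Y} \quad \text{(jointly Gaussian processes)}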
Information dynamics in cardiorespiratory analyses: application to controlled breathing
2014
Voluntary adjustment of the breathing pattern is widely used to deal with stress-related conditions. In this study, the effects on heart rate variability (HRV) of slow and fast breathing with low and high inspiratory-to-expiratory time are evaluated by means of information dynamics. Information transfer is quantified both as the traditional transfer entropy and as the cross entropy, where the latter does not condition on the past of HRV, thereby taking the highly unidirectional relation between respiration and heart rate into account. The results show that the cross entropy is more suited to quantify cardiorespiratory information transfer, as this measure increases during slow breathing, i…
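In the usual notation (generic symbols, R for respiration and H for heart period), the two measures compared here differ only in whether the past of the target enters the conditioning:

    T_{R \to H} = I(H_n ; R_n^- \mid H_n^-), \qquad C_{R \to H} = I(H_n ; R_n^-)

so the cross entropy drops the conditioning on the past of HRV, consistent with the strongly unidirectional respiration-to-heart-rate coupling described above.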
MuTE: a new matlab toolbox for estimating the multivariate transfer entropy in physiological variability series
2014
We present a new time series analysis toolbox, developed in Matlab, for the estimation of the transfer entropy (TE) between time series taken from a multivariate dataset. The main feature of the toolbox is its fully multivariate implementation, which is made possible by the design of an approach for the non-uniform embedding (NUE) of the observed time series. The toolbox is equipped with parametric (linear) and non-parametric (based on binning or nearest neighbors) entropy estimators. All these estimators, implemented using the NUE approach and compared with the classical approach based on uniform embedding, are tested on RR interval, systolic pressure and respiration variability series mea…
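The toolbox itself is Matlab; purely to illustrate the NUE idea (greedy selection of candidate past terms by conditional mutual information), here is a minimal sketch using a Gaussian CMI estimator, with all function and variable names made up rather than taken from MuTE:

    import numpy as np

    def gaussian_cmi(x, y, cond=None):
        """I(x; y | cond) for jointly Gaussian variables, from covariance log-determinants."""
        def logdet(cols):
            if not cols:
                return 0.0
            c = np.atleast_2d(np.cov(np.column_stack(cols), rowvar=False))
            return np.linalg.slogdet(c)[1]
        cond = cond or []
        return 0.5 * (logdet([x] + cond) + logdet([y] + cond)
                      - logdet(cond) - logdet([x, y] + cond))

    def nue_select(target, candidates, max_terms=5, tol=1e-3):
        """Greedy non-uniform embedding: repeatedly add the candidate past term
        (candidates maps a label to a lagged series) that gives the largest CMI
        with the target, conditioned on the terms already selected."""
        selected, picks = [], []
        for _ in range(max_terms):
            if not candidates:
                break
            best = max(candidates, key=lambda k: gaussian_cmi(target, candidates[k], selected))
            gain = gaussian_cmi(target, candidates[best], selected)
            if gain < tol:
                break
            selected.append(candidates.pop(best))
            picks.append((best, gain))
        return picks

In MuTE the same kind of selection is run with linear, binning-based or nearest-neighbor estimators; only the Gaussian case is sketched here.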
Information Dynamics Analysis: A new approach based on Sparse Identification of Linear Parametric Models*
2020
The framework of information dynamics makes it possible to quantify different aspects of the statistical structure of multivariate processes reflecting the temporal dynamics of a complex network. The information transfer from one process to another can be quantified through Transfer Entropy and, under the assumption of jointly Gaussian variables, it is strictly related to the concept of Granger Causality (GC). According to the most recent developments in the field, the computation of GC entails representing the processes through a Vector Autoregressive (VAR) model and a state space (SS) model, typically identified by means of Ordinary Least Squares (OLS). In this work, we propose a new identification …
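For orientation, the standard quantities referenced here (not the paper's new contribution): a VAR model of order p for the vector process Z_n, and the GC from a component X to a component Y obtained from the prediction-error variances of the restricted and full models, with coefficients typically estimated by OLS:

    Z_n = \sum_{k=1}^{p} A_k\, Z_{n-k} + U_n, \qquad F_{X \to Y} = \ln \frac{\operatorname{var}(Y_n \mid Y_n^-)}{\operatorname{var}(Y_n \mid Y_n^-, X_n^-)}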
MuTE: a MATLAB toolbox to compare established and novel estimators of the multivariate transfer entropy.
2014
A challenge for physiologists and neuroscientists is to map information transfer between components of the systems that they study at different scales, in order to derive important knowledge on structure and function from the analysis of the recorded dynamics. The components of physiological networks often interact in a nonlinear way and through mechanisms which are in general not completely known. It is therefore preferable that the method chosen to analyze these interactions does not rely on any model or assumption about the nature of the data and their interactions. Transfer entropy has emerged as a powerful tool to quantify directed dynamical interactions. In this paper we compare different ap…