Search results for "information dynamic"
Showing 10 of 30 documents
Disentangling cardiovascular control mechanisms during head-down tilt via joint transfer entropy and self-entropy decompositions
2015
A full decomposition of the predictive entropy (PE) of the spontaneous variations of the heart period (HP) given systolic arterial pressure (SAP) and respiration (R) is proposed. The PE of HP is decomposed into the joint transfer entropy (JTE) from SAP and R to HP and the self-entropy (SE) of HP. The SE is the sum of three terms quantifying the synergistic/redundant contributions of SAP and R, taken individually and jointly, to the SE, plus one term conditioned on SAP and R, denoted the conditional SE (CSE) of HP given SAP and R. The JTE from SAP and R to HP is the sum of two terms attributable to SAP or R plus an extra term describing the redundant/synergistic contribution to the JTE. All q…
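As a hedged sketch (notation assumed here, not quoted from the paper), the top-level decomposition described in the abstract reads
\[ \mathrm{PE}_{HP} = \mathrm{JTE}_{SAP,R \to HP} + \mathrm{SE}_{HP}, \qquad \mathrm{JTE}_{SAP,R \to HP} = \mathrm{TE}_{SAP \to HP} + \mathrm{TE}_{R \to HP} + \mathrm{I}_{SAP;R \to HP}, \]
where the interaction term \(\mathrm{I}_{SAP;R \to HP}\) captures the redundant/synergistic joint contribution of SAP and R to the JTE.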
Information-theoretic assessment of cardiovascular variability during postural and mental stress
2016
This study was aimed at investigating the individual and combined effects of postural and mental stress on short-term cardiovascular regulation. To this end, we applied measures taken from the emerging framework of information dynamics to the beat-to-beat spontaneous variability of RR interval and systolic arterial pressure (SAP) measured from healthy subjects in the resting supine position and during the separate and simultaneous execution of experimental protocols performing head-up tilt (HUT) and mental arithmetic (MA). The information stored in RR interval variability, a measure inversely related to the complexity of the time series, increased significantly during HUT and HUT+MA compar…
A new framework for the time- and frequency-domain assessment of high-order interactions in networks of random processes
2022
While the standard network description of complex systems is based on quantifying the link between pairs of system units, higher-order interactions (HOIs) involving three or more units often play a major role in governing the collective network behavior. This work introduces a new approach to quantify pairwise and HOIs for multivariate rhythmic processes interacting across multiple time scales. We define the so-called O-information rate (OIR) as a new metric to assess HOIs for multivariate time series, and present a framework to decompose the OIR into measures quantifying Granger-causal and instantaneous influences, as well as to expand all measures in the frequency domain. The framework ex…
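For context, the static O-information on which the proposed rate presumably builds is defined, for N jointly distributed processes \(X = (X_1, \dots, X_N)\), as
\[ \Omega(X) = (N-2)\,H(X) + \sum_{j=1}^{N} \big[ H(X_j) - H(X_{-j}) \big], \]
where \(H\) is the Shannon entropy and \(X_{-j}\) the ensemble with \(X_j\) removed; positive values indicate redundancy-dominated and negative values synergy-dominated higher-order interactions.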
Information dynamics: Temporal behavior of uncertainty measures
2008
We carry out a systematic study of uncertainty measures that are generic to dynamical processes of varied origins, provided they induce suitable continuous probability distributions. Our major technical tools are information-theoretic methods and the inequalities satisfied by the Fisher and Shannon information measures. We focus on the compatibility of these inequalities with the prescribed (deterministic, random, or quantum) temporal behavior of the pertinent probability densities.
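One representative inequality of this kind, given purely as an illustration (the paper may rely on others), is Stam's bound linking the Fisher information \(J(X)\) and the entropy power \(N(X) = e^{2h(X)}/(2\pi e)\) of a one-dimensional random variable with differential entropy \(h(X)\):
\[ N(X)\, J(X) \ge 1, \]
with equality if and only if \(X\) is Gaussian.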
Information Decomposition in Bivariate Systems: Theory and Application to Cardiorespiratory Dynamics
2015
In the framework of information dynamics, the temporal evolution of coupled systems can be studied by decomposing the predictive information about an assigned target system into amounts quantifying the information stored inside the system and the information transferred to it. While information storage and transfer are computed through the known self-entropy (SE) and transfer entropy (TE), an alternative decomposition evidences the so-called cross entropy (CE) and conditional SE (cSE), quantifying the cross information and internal information of the target system, respectively. This study presents a thorough evaluation of SE, TE, CE and cSE as quantities related to the causal statistical s…
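In the notation commonly used in this framework (assumed here, not quoted from the paper), the two decompositions of the predictive information \(P_Y\) of a target process \(Y\) coupled to a source \(X\) read
\[ P_Y = \mathrm{SE}_Y + \mathrm{TE}_{X \to Y} = \mathrm{CE}_{X \to Y} + \mathrm{cSE}_Y, \]
with \(\mathrm{SE}_Y = I(Y_n; Y_n^-)\), \(\mathrm{TE}_{X \to Y} = I(Y_n; X_n^- \mid Y_n^-)\), \(\mathrm{CE}_{X \to Y} = I(Y_n; X_n^-)\) and \(\mathrm{cSE}_Y = I(Y_n; Y_n^- \mid X_n^-)\), where \(Y_n^-\) and \(X_n^-\) denote the past histories of the two processes.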
Information Transfer in Linear Multivariate Processes Assessed through Penalized Regression Techniques: Validation and Application to Physiological N…
2020
The framework of information dynamics allows the dissection of the information processed in a network of multiple interacting dynamical systems into meaningful elements of computation that quantify the information generated in a target system, stored in it, transferred to it from one or more source systems, and modified in a synergistic or redundant way. The concepts of information transfer and modification have been recently formulated in the context of linear parametric modeling of vector stochastic processes, linking them to the notion of Granger causality and providing efficient tools for their computation based on the state–…
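As a minimal sketch (not the paper's implementation): under a linear Gaussian vector autoregressive model, the transfer entropy from a process x to a process y reduces to half the log-ratio of the residual variances of the restricted and full regressions, and the regressions can be fitted with a penalized least-squares estimator, here a ridge penalty; the model order p, the penalty lam, and the toy data are illustrative assumptions.

import numpy as np

def lagged_design(data, p):
    # Stack p past lags of each column of `data` as regressors; targets start at time p.
    n = data.shape[0]
    X = np.hstack([data[p - k - 1:n - k - 1] for k in range(p)])
    return X, data[p:]

def ridge_residual_var(X, y, lam):
    # Residual variance of y regressed on X with an L2 (ridge) penalty on the coefficients.
    beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    return np.var(y - X @ beta)

def transfer_entropy_xy(x, y, p=5, lam=1.0):
    # Gaussian transfer entropy x -> y: 0.5 * ln(restricted / full residual variance).
    full_X, targets = lagged_design(np.column_stack([y, x]), p)  # past of y and of x
    restr_X, _ = lagged_design(y[:, None], p)                    # past of y only
    s_full = ridge_residual_var(full_X, targets[:, 0], lam)
    s_restr = ridge_residual_var(restr_X, targets[:, 0], lam)
    return 0.5 * np.log(s_restr / s_full)

# Toy usage: unidirectionally coupled AR(1) processes, coupling x -> y only.
rng = np.random.default_rng(0)
n = 2000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.standard_normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()
print(transfer_entropy_xy(x, y), transfer_entropy_xy(y, x))  # expect the first value to dominate

With lam = 0 the estimate reduces to the ordinary least-squares (Granger-causality) form; the ridge penalty merely stands in for the penalized-regression estimators investigated in the paper.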
Measuring the Rate of Information Exchange in Point-Process Data With Application to Cardiovascular Variability
2022
The amount of information exchanged per unit of time between two dynamic processes is an important concept for the analysis of complex systems. Theoretical formulations and data-efficient estimators have been recently introduced for this quantity, known as the mutual information rate (MIR), allowing its continuous-time computation for event-based data sets measured as realizations of coupled point processes. This work presents the implementation of MIR for point process applications in Network Physiology and cardiovascular variability, which typically feature short and noisy experimental time series. We assess the bias of MIR estimated for uncoupled point processes in the frame of surrogate…
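For reference (notation assumed), the mutual information rate between two processes X and Y is
\[ \mathrm{MIR}(X;Y) = \lim_{T \to \infty} \frac{1}{T}\, I\!\left(X_{[0,T]}; Y_{[0,T]}\right), \]
i.e., the mutual information between the two trajectories observed over a window of duration T, normalized by the window length and measured in nats (or bits) per unit time.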
Assessing Complexity in Physiological Systems through Biomedical Signals Analysis
2020
The idea that most physiological systems are complex has become increasingly popular in recent decades [...]
Pairwise and higher-order measures of brain-heart interactions in children with temporal lobe epilepsy
2022
Objective. While it is well known that epilepsy has a clear impact on the activity of both the central nervous system (CNS) and the autonomic nervous system (ANS), its role in the complex interplay between CNS and ANS has not yet been fully elucidated. In this work, pairwise and higher-order predictability measures based on the concepts of Granger Causality (GC) and partial information decomposition (PID) were applied to time series of electroencephalographic (EEG) brain wave amplitude and heart rate variability (HRV) in order to investigate directed brain-heart interactions associated with the occurrence of focal epilepsy. Approach. HRV and the envelopes of δ and α EEG activity re…
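For two source signals \(S_1, S_2\) and a target \(T\), the PID referred to here splits the joint information in its standard (Williams–Beer) form, quoted for context:
\[ I(T; S_1, S_2) = U(T; S_1) + U(T; S_2) + R(T; S_1, S_2) + S(T; S_1, S_2), \]
separating the unique contributions of each source from their redundant and synergistic components.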
Information-Theoretic Analysis of Cardiorespiratory Interactions during Apneic Events in Sleep
2020
In this work, measures of information dynamics are used to describe the dynamics of heart rate and cardiorespiratory interaction associated with sleep breathing disorders. In a large group of patients reporting repeated episodes of hypopneas, apneas (central, obstructive, mixed) and respiratory effort-related arousals (RERA), we computed the information storage of heart period variability and the information transfer from heart period to airflow amplitude before, during and after each event. We find a general tendency of the information storage to decrease, suggesting a higher complexity of the cardiac dynamics. The information transfer decreased during apneic events, and increased during milder disord…
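In terms of the usual information-dynamic measures (notation assumed here), the two quantities computed around each event are the information storage of the heart period \(H\) and the transfer entropy from \(H\) to the airflow amplitude \(A\):
\[ S_H = I(H_n; H_n^-), \qquad T_{H \to A} = I(A_n; H_n^- \mid A_n^-), \]
where \(H_n^-\) and \(A_n^-\) denote the past histories of the two series.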