
Partial autoinformation to characterize symbolic sequences

An information-theoretic approach to numerically determine the Markov order of discrete stochastic processes defined over a finite state space is introduced. To measure statistical dependencies between different time points of symbolic time series, two information-theoretic measures are proposed. The first measure is time-lagged mutual information between the random variables X_n and X_{n+k}, representing the values of the process at time points n and n + k, respectively. The measure will be termed autoinformation, in analogy to the autocorrelation function for metric time series, but using Shannon entropy rather than linear correlation. This measure is complemented by the conditional mutual information between X_n and X_{n+k}, removing the influence of the intermediate values X_{n+k−1}, …, X_{n+1}. The second measure is termed partial autoinformation, in analogy to the partial autocorrelation function (PACF) in metric time series analysis. Mathematical relations with known quantities such as the entropy rate and active information storage are established. Both measures are applied to a number of examples, ranging from theoretical Markov and non-Markov processes with known stochastic properties, to models from statistical physics, and finally, to a discrete transform of an EEG data set. The combination of autoinformation and partial autoinformation yields important insights into the temporal structure of the data in all test cases. For first- and higher-order Markov processes, partial autoinformation correctly identifies the order parameter, but also suggests extended, non-Markovian effects in the examples that lack the Markov property. For three hidden Markov models (HMMs), the underlying Markov order is found. The combination of both quantities may be used as an early step in the analysis of experimental, non-metric time series and can be employed to discover higher-order Markov dependencies, non-Markovianity and periodicities in symbolic time series.
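The two measures described in the abstract can be estimated directly from empirical block entropies. The sketch below is illustrative, not the author's reference implementation: autoinformation is computed as the standard mutual-information decomposition I(X_n; X_{n+k}) = H(X_n) + H(X_{n+k}) − H(X_n, X_{n+k}), and partial autoinformation uses the identity that, for a stationary sequence, the conditional mutual information I(X_{n+k}; X_n | X_{n+1}, …, X_{n+k−1}) reduces to 2·H_k − H_{k−1} − H_{k+1}, where H_m is the Shannon entropy of overlapping length-m blocks. Function names and the plug-in (maximum-likelihood) entropy estimator are choices made for this sketch.

```python
import numpy as np
from collections import Counter


def shannon_entropy(items):
    """Shannon entropy in bits of the empirical distribution of `items`."""
    counts = Counter(items)
    n = sum(counts.values())
    p = np.array([c / n for c in counts.values()])
    return float(-np.sum(p * np.log2(p)))


def block_entropy(x, m):
    """Entropy H_m of overlapping length-m blocks of the symbolic sequence x."""
    return shannon_entropy(tuple(x[i:i + m]) for i in range(len(x) - m + 1))


def autoinformation(x, k):
    """Time-lagged mutual information I(X_n; X_{n+k}), plug-in estimate."""
    pairs = list(zip(x[:-k], x[k:]))
    return shannon_entropy(x[:-k]) + shannon_entropy(x[k:]) - shannon_entropy(pairs)


def partial_autoinformation(x, k):
    """Conditional MI I(X_{n+k}; X_n | X_{n+1}, ..., X_{n+k-1}).

    Under stationarity this reduces to block entropies:
        pi(k) = 2*H_k - H_(k-1) - H_(k+1)   for k >= 2,
    and pi(1) equals the autoinformation at lag 1.
    """
    if k == 1:
        return autoinformation(x, 1)
    return 2 * block_entropy(x, k) - block_entropy(x, k - 1) - block_entropy(x, k + 1)
```

For a deterministic period-2 sequence 0, 1, 0, 1, …, autoinformation stays near 1 bit at every lag (the future is fully determined by the past), while partial autoinformation is close to zero for all lags k ≥ 2, consistent with first-order Markov structure; this mirrors the role of the two measures described above.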


Metadata
Author:Frederic von Wegner
URN:urn:nbn:de:hebis:30:3-474569
DOI:https://doi.org/10.3389/fphys.2018.01382
ISSN:1664-042X
Pubmed Id:https://pubmed.ncbi.nlm.nih.gov/30369884
Parent Title (English):Frontiers in physiology
Publisher:Frontiers Research Foundation
Place of publication:Lausanne
Contributor(s):Srdjan Kesić
Document Type:Article
Language:English
Year of Completion:2018
Date of first Publication:2018/10/11
Publishing Institution:Universitätsbibliothek Johann Christian Senckenberg
Release Date:2018/11/06
Tag:EEG microstates; Markovianity; entropy; information theory; mutual information; stationarity
Volume:9
Issue:Art. 1382
Page Number:14
First Page:1
Last Page:14
Note:
Copyright © 2018 von Wegner. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
HeBIS-PPN:44066554X
Institutes:Medicine / Medicine
Dewey Decimal Classification:6 Technology, medicine, applied sciences / 61 Medicine and health / 610 Medicine and health
Collections:University publications
Open Access Publication Fund:Medicine
Licence:Creative Commons Attribution 4.0 (CC BY)