Information-based methods for neuroimaging: analyzing structure, function and dynamics

The aim of this Research Topic is to discuss the state of the art in the use of information-based methods for the analysis of neuroimaging data. Information-based methods, typically built as extensions of the Shannon entropy, underlie model-free approaches which, being based on probability distributions rather than on specific expectations, can account for all possible non-linearities present in the data in a model-independent fashion. Mutual information-like methods can also be applied to interacting dynamical variables described by time series, quantifying the uncertainty reduction (or information gain) in one variable obtained by conditioning on another set of variables.

In recent years, information-based methods have proven to be flexible and powerful tools for analyzing neuroimaging data, spanning a wide range of methodologies, including formulations based on bivariate versus multivariate representations, frequency versus time domains, and so on. Beyond methodological issues, the information bit as a common unit offers a convenient way to compare and integrate different measurements of neuroimaging data in three complementary contexts: structural connectivity, dynamical (functional and effective) connectivity, and modelling of brain activity. Applications are ubiquitous, ranging from the resting state in healthy subjects to modulations of consciousness and other aspects of pathophysiology.

Mutual information-based methods have provided new insights into common principles of brain organization, showing the existence of an active default network when the brain is at rest. It remains unclear, however, how this default network is generated, how its different modules interact, and why it disappears in the presence of stimulation. Some of these open questions at the functional level may find their mechanisms in their structural correlates.
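To make the core idea concrete, here is a minimal, illustrative sketch (not taken from the book) of how mutual information between two time series can be estimated from histograms, using the identity I(X;Y) = H(X) + H(Y) − H(X,Y). The bin count and the synthetic data are assumptions for the example only.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a discrete distribution given by raw counts."""
    p = counts / counts.sum()
    p = p[p > 0]  # drop empty bins: 0 * log(0) contributes nothing
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=8):
    """Histogram-based estimate of I(X;Y) = H(X) + H(Y) - H(X,Y), in bits.

    Model-free in the sense that it uses only the empirical joint
    distribution, so non-linear dependencies are captured as well.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    h_x = shannon_entropy(joint.sum(axis=1))   # marginal of x
    h_y = shannon_entropy(joint.sum(axis=0))   # marginal of y
    h_xy = shannon_entropy(joint.ravel())      # joint entropy
    return h_x + h_y - h_xy

# Synthetic "signals": y depends on x, z is independent of x.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.5 * rng.normal(size=10_000)
z = rng.normal(size=10_000)

print(mutual_information(x, y))  # substantially above zero
print(mutual_information(x, z))  # close to zero
```

Note that naive histogram estimators are biased upward for finite samples; the bias-correction and binning strategies discussed in this literature address exactly that issue.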
A key question is the link between structure and function, and the use of structural priors for understanding functional connectivity measures. As far as effective connectivity is concerned, a common framework has recently been proposed for transfer entropy and Granger causality, a well-established methodology originally based on autoregressive models. This framework can open the way to new theories and applications. This Research Topic brings together contributions from researchers with different backgrounds who are either developing new approaches or applying existing methodologies to new data, and we hope it will set the basis for discussing the development and validation of new information-based methodologies for the understanding of brain structure, function, and dynamics.
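The connection between the two measures can be sketched in a few lines. The example below (an illustration, not code from the book) computes bivariate Granger causality with a single lag by comparing residual variances of restricted and full autoregressive models; for jointly Gaussian variables, transfer entropy equals half this quantity (in nats). The lag order and the toy coupled process are assumptions of the sketch.

```python
import numpy as np

def granger_causality(x, y, lag=1):
    """Granger causality x -> y for a single lag (illustrative sketch).

    GC = ln( var(restricted residuals) / var(full residuals) );
    for jointly Gaussian variables, TE(x -> y) = GC / 2 (in nats).
    """
    y_t = y[lag:]
    y_past = y[:-lag]
    x_past = x[:-lag]
    # Restricted model: y_t predicted from its own past only.
    A = np.column_stack([np.ones_like(y_past), y_past])
    res_r = y_t - A @ np.linalg.lstsq(A, y_t, rcond=None)[0]
    # Full model: y_t predicted from the past of both y and x.
    B = np.column_stack([np.ones_like(y_past), y_past, x_past])
    res_f = y_t - B @ np.linalg.lstsq(B, y_t, rcond=None)[0]
    return np.log(res_r.var() / res_f.var())

# Toy coupled system: x drives y, with no feedback from y to x.
rng = np.random.default_rng(1)
n = 5_000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

print(granger_causality(x, y))  # clearly positive: x drives y
print(granger_causality(y, x))  # near zero: no feedback
```

The asymmetry of the estimates is the point: unlike mutual information, these directed measures distinguish driver from driven, which is what "effective connectivity" is meant to capture.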

This book is included in DOAB.



Downloads

This work has been downloaded 149 times via unglue.it ebook links.
  1. epub (CC BY) at Unglue.it — 16 downloads
  2. epub (CC BY) at Unglue.it — 15 downloads
  3. mobi (CC BY) at Unglue.it — 41 downloads
  4. epub (CC BY) at Unglue.it — 26 downloads
  5. pdf (CC BY) at Unglue.it — 31 downloads

Keywords

  • Biology, Life Sciences
  • Brain connectivity
  • Computational neuroscience
  • Functional connectome
  • Granger causality
  • Information theory
  • Life sciences: general issues
  • Mathematics & science
  • Mutual information
  • Network theory
  • Neuroinformatics
  • Neurosciences
  • Structural connectome
  • Transfer entropy
  • thema EDItEUR::P Mathematics and Science::PS Biology, life sciences::PSA Life sciences: general issues::PSAN Neurosciences

Links

DOI: 10.3389/978-2-88919-502-2
