EDITORIAL article

Front. Physiol., 11 May 2022
Sec. Fractal Physiology
This article is part of the Research Topic Inference, Causality and Control in Networks of Dynamical Systems: Data Science and Modeling Perspectives to Network Physiology with Implications for Artificial Intelligence

Editorial: Inference, Causality and Control in Networks of Dynamical Systems: Data Science and Modeling Perspectives to Network Physiology With Implications for Artificial Intelligence

Paul Bogdan1, Plamen Ch. Ivanov2,3,4 and Sergio Pequito5*

  • 1Ming-Hsieh Department of Electrical and Computer Engineering, University of Southern California, Los Angeles, CA, United States
  • 2Keck Laboratory for Network Physiology, Department of Physics, Boston University, Boston, MA, United States
  • 3Harvard Medical School and Division of Sleep Medicine, Brigham and Women’s Hospital, Boston, MA, United States
  • 4Institute of Solid State Physics, Bulgarian Academy of Sciences, Sofia, Bulgaria
  • 5Delft Center for Systems and Control, Delft University of Technology, Delft, Netherlands

A fundamental problem crisscrossing the fields of physiology and artificial intelligence is understanding how complex activity and behavior emerge from the intrinsic underlying structure and dynamics. To address this problem, we need new methodologies and tools to perform a comprehensive analysis of complex systems dynamics. Multifractal formalism and methodology enable us to investigate local interactions underlying physiological systems and to quantify the organization of physiological temporal fluctuations and their cascades across scales (Ivanov et al., 1999, 2001, 2002; Mukli et al., 2015). In addition, we need a general network framework to examine networks of interactions among diverse subsystems across space and time scales that lead to emergent complex behaviors at the systems level (Bashan et al., 2012; Ivanov and Bartsch, 2014; Ivanov et al., 2016). Despite recent progress in the theory of dynamic networks, there are fundamental methodological and conceptual challenges in understanding how global states and functions emerge in networks of diverse dynamical systems with time-varying interactions, and in uncovering the basic principles of their hierarchical integration. In particular, when mining the structure and dynamics of time-varying complex networks, one has to overcome various internal or external perturbations that can transiently or permanently mask the activity of particular nodes and their causal interactions (Gupta et al., 2018, 2019; Xue and Bogdan, 2017a, 2017b, 2019).
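To make the multifractal formalism above concrete, the sketch below estimates the generalized Hurst exponent h(q) with a minimal multifractal detrended fluctuation analysis (MFDFA). It is a generic illustration rather than the pipeline of the cited studies; the synthetic input signal, scale range, and q grid are assumptions chosen only for demonstration. A q-dependent h(q) signals the kind of multifractal cascade organization discussed above.

```python
import numpy as np

def mfdfa(signal, scales, q_values, poly_order=1):
    """Minimal MFDFA: return the generalized Hurst exponent h(q).

    A constant h(q) indicates a monofractal signal; a q-dependent h(q)
    indicates multifractality (cascades of fluctuations across scales).
    """
    profile = np.cumsum(signal - np.mean(signal))    # integrated, mean-removed profile
    h_q = []
    for q in q_values:
        fluct = []
        for s in scales:
            n_seg = len(profile) // s
            rms = []
            for i in range(n_seg):                   # detrend non-overlapping windows
                seg = profile[i * s:(i + 1) * s]
                x = np.arange(s)
                trend = np.polyval(np.polyfit(x, seg, poly_order), x)
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            rms = np.asarray(rms)
            if np.isclose(q, 0.0):                   # q = 0 requires a logarithmic average
                fluct.append(np.exp(0.5 * np.mean(np.log(rms ** 2))))
            else:
                fluct.append(np.mean(rms ** q) ** (1.0 / q))
        # h(q) is the log-log slope of the fluctuation function versus scale
        h_q.append(np.polyfit(np.log(scales), np.log(fluct), 1)[0])
    return np.array(h_q)

# Illustrative usage on white noise (a stand-in for, e.g., an interbeat-interval series)
rng = np.random.default_rng(0)
signal = rng.standard_normal(10_000)
scales = np.unique(np.logspace(np.log10(16), np.log10(1024), 20).astype(int))
q_values = np.arange(-5, 6)
print(mfdfa(signal, scales, q_values))               # roughly 0.5 for all q here
```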

Novel artificial intelligence techniques and machine learning algorithms may equip us with the tools to classify and predict emergent behavior in dynamical networks based simultaneously on network topology and on temporal patterns in network dynamics. Key insights that emerge from multifractal and differential-geometry concepts can help analyze and quantify the complexity of such networks. Furthermore, they allow us to determine the most efficient network architecture for generating a given function, to quantify key universalities, and to identify new theoretical directions for artificial intelligence and machine learning grounded in physiological principles (Richards et al., 2019). Ultimately, such advances may lead to sustainable artificial systems whose features are seamlessly indistinguishable from those of physiological systems (Wu et al., 2021).

From genomic, proteomic, and metabolic networks to microbial communities, neural systems, and the human network physiology of organ systems, complex systems display multi-scale spatiotemporal patterns that are frequently classified as non-linear, non-Gaussian, scale-invariant, and multifractal (Bassingthwaighte et al., 2013; Ivanov et al., 2009; Stanley et al., 1999; West and Zweifel, 1992). While several efforts have demonstrated that electromyographic signals possess fractal properties (Sanders et al., 1996; Xue et al., 2016; Garcia-Retortillo et al., 2020; Rizzo et al., 2020), Martin del Campo Vera and Jonckheere report a complex bursting rate variability phenomenon in which surface electromyographic (sEMG) bursts are synchronous with wavelet packets in the D8 sub-band of the Daubechies 3 (db3) wavelet decomposition of the raw signal. Their db3 wavelet decomposition analysis reconstructs the sEMG bursts with two high coefficients at level 8, indicating a high incidence of two consecutive neuronal discharges. In contrast to heart rate variability (Ivanov et al., 1998), the newly reported bursting rate variability phenomenon involves a time-localization of the burst, with a statistical waveform match between the “D8 doublet” and the burst in the raw sEMG signal. While this analysis focused on an available small cohort of patients, further comprehensive studies could elucidate the interdependencies between electromyographic signals and other brain and physiological processes, determine their mechanistic role, and establish implications for medical applications.
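As a hedged illustration of the type of analysis described above (not the authors' exact pipeline), the sketch below uses the PyWavelets library to perform a level-8 Daubechies-3 decomposition of a synthetic sEMG-like trace and to isolate the D8 detail sub-band in which the bursts are reported to appear; the sampling rate, the synthetic bursts, and the simple amplitude threshold are assumptions made only for demonstration.

```python
import numpy as np
import pywt  # PyWavelets

fs = 2000                                   # assumed sEMG sampling rate (Hz), illustrative
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)

# Stand-in for a raw sEMG trace: background noise with a few injected bursts
semg = rng.standard_normal(t.size)
for burst_start in (1.0, 4.5, 7.2):
    idx = slice(int(burst_start * fs), int((burst_start + 0.2) * fs))
    semg[idx] += 3.0 * rng.standard_normal(idx.stop - idx.start)

# Daubechies-3 (db3) decomposition down to level 8: coeffs = [A8, D8, D7, ..., D1]
coeffs = pywt.wavedec(semg, 'db3', level=8)

# Keep only the D8 detail sub-band and reconstruct it back to the time domain
d8_only = [np.zeros_like(c) for c in coeffs]
d8_only[1] = coeffs[1]
d8_signal = pywt.waverec(d8_only, 'db3')[:semg.size]

# Crude burst localization in the D8 band: flag samples exceeding 3x its standard deviation
threshold = 3 * np.std(d8_signal)
burst_samples = np.flatnonzero(np.abs(d8_signal) > threshold)
print(f"{burst_samples.size} supra-threshold samples in the D8 sub-band")
```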

Quests for understanding the inner workings of complex biological dynamics have provided not only more appropriate and efficacious medical therapies but have also led to new artificial intelligence algorithms and architectures. For instance, inspired by early modeling of how biological neurons process information, the reservoir computer model, a type of recurrent neural network in which only the output weights are fit to a training signal, has provided promising high-performance, low-power computational strategies for classification tasks. Along these lines, Carroll considers the computational difficulty of parameter optimization in a reservoir computer and demonstrates that the optimum classification performance occurs for the hyperparameters that maximize the entropy of the reservoir states, for both spiking and non-spiking reservoir computers. Intriguingly, Carroll shows that optimizing for entropy requires only a single realization of each signal to be classified, which provides a fast and low-power computational strategy.
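A minimal sketch of the reservoir computer idea is given below, with a histogram-based entropy score over reservoir states used as a hyperparameter-selection heuristic in the spirit of Carroll's result; the network size, spectral-radius grid, entropy estimator, and input signal are illustrative assumptions, and the trained readout used for actual classification is omitted for brevity.

```python
import numpy as np

def run_reservoir(u, n_res=200, spectral_radius=0.9, input_scale=0.5, seed=0):
    """Drive a random tanh reservoir with a scalar input sequence; return the state matrix."""
    rng = np.random.default_rng(seed)
    w_in = input_scale * rng.uniform(-1, 1, n_res)
    w = rng.uniform(-1, 1, (n_res, n_res))
    w *= spectral_radius / np.max(np.abs(np.linalg.eigvals(w)))   # rescale to target radius
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(w @ x + w_in * u_t)
        states[t] = x
    return states

def state_entropy(states, bins=30):
    """Histogram-based Shannon entropy (bits) of the pooled reservoir activations."""
    counts, _ = np.histogram(states.ravel(), bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

# A single realization of the input signal is enough to score each hyperparameter setting
u = np.sin(0.1 * np.arange(2000)) + 0.1 * np.random.default_rng(2).standard_normal(2000)
candidates = [0.3, 0.6, 0.9, 1.2]
scores = {rho: state_entropy(run_reservoir(u, spectral_radius=rho)) for rho in candidates}
best = max(scores, key=scores.get)
print("entropy per spectral radius:", scores, "-> pick", best)
```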

Intelligence is an essential trait of healthy biological systems, allowing them not only to locally optimize in search of better fitness states but also to cope with unknown, rare environmental perturbations. While much of complex dynamical systems theory has focused on defining, quantifying, and analyzing the degree of emergence (Balaban et al., 2018; Koorehdavoudi and Bogdan, 2016), self-organization (Balaban et al., 2018; Koorehdavoudi and Bogdan, 2016; Polani, 2008), self-optimization (Gershenson et al., 2021; Koorehdavoudi and Bogdan, 2016; Prokopenko et al., 2009, 2014), and complexity (Adami, 2002; Jost, 2004; Koorehdavoudi and Bogdan, 2016) of various complex biological systems, with the aim of providing a definition of “intelligence” (Hernández-Orallo et al., 2021), generalization, the ability of a system to handle unexpected (future) situations for which it was not trained with a degree of success similar to that achieved on the limited data on which it was trained, remains an essential feature distinguishing human from artificial intelligence. Current efforts in artificial intelligence and machine learning investigate the degree to which a variety of neural network architectures are capable of “generalizing” and exhibiting intelligent behavior. Along these lines, Stoop provides a series of fundamental examples demonstrating that, for situations not included in the training efforts, AI systems tend to run into substantial problems. These fundamental examples not only highlight the difference between human and artificial intelligence but also call for renewed interest in establishing the theoretical foundations of intelligence.

By taking inspiration from biological neural systems capable of solving complex multi-objective problems characterized by ill-conditioned Hessians, Chatterjee et al. pioneer a fractional time series analysis framework that not only models neuro-physiological processes but also circumvents the challenges faced by current optimization tools. More precisely, they show that the long-range memory observed in many biological systems, and in neurophysiological signals in particular, exhibits a non-exponential, power-law decay of trajectories that can be used to model the behavior associated with the objective function's local curvature at a given time point. This allows them to propose the NEuro-inspired Optimization (NEO) method to deal with ill-conditioned Hessian problems. Beyond its immediate promise, this effort shows that mathematical approaches aimed at understanding the multifractality of biological systems can provide new theoretical directions for artificial intelligence.
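The contrast between long-range (power-law) and short-range (exponential) memory that underlies this framework can be illustrated with a short simulation. The sketch below is a generic demonstration, not the authors' NEO method or data; the memory order d, truncation length, and AR(1) coefficient are assumptions chosen for illustration. The fractionally integrated series retains a sizeable autocorrelation at large lags, whereas the AR(1) autocorrelation decays exponentially toward zero.

```python
import numpy as np

def frac_diff_weights(d, n):
    """MA(inf) weights of a fractionally integrated (1 - B)^(-d) process.

    psi_0 = 1 and psi_k = psi_{k-1} * (k - 1 + d) / k; the weights decay as a
    power law ~ k^(d-1), the signature of long-range, non-exponential memory.
    """
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    return psi

def autocorr(x, max_lag):
    """Sample autocorrelation at lags 1..max_lag."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(3)
n, d, trunc = 20_000, 0.4, 500
eps = rng.standard_normal(n + trunc)
psi = frac_diff_weights(d, trunc)

# Long-memory series: white noise filtered with the power-law weights
long_mem = np.array([psi @ eps[t:t + trunc][::-1] for t in range(n)])

# Short-memory AR(1) comparison driven by the same innovations
ar1 = np.zeros(n)
for t in range(1, n):
    ar1[t] = 0.7 * ar1[t - 1] + eps[t]

lags = np.array([1, 10, 50, 200])
print("long-memory ACF:", np.round(autocorr(long_mem, 200)[lags - 1], 3))
print("AR(1) ACF      :", np.round(autocorr(ar1, 200)[lags - 1], 3))
```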

The works presented in this Research Topic collection, together with current advances in fractal and multifractal investigations of the structure and dynamics of physiological systems and their applications to artificial intelligence, outline new challenges and opportunities for multidisciplinary research and applications. Dealing with the heterogeneity, multi-modality, and complexity of physiological and artificial systems requires rigorous mathematical and algorithmic techniques to extract causal interdependencies between systems across different scales while overcoming various sources of noise. Progress in this direction will therefore require new algorithmic strategies to quantify the time-varying information flow among diverse physiological and artificial processes across scales and to determine how it influences the system dynamics.

Furthermore, there is an urgent need to adopt a cross-scale perspective, and a corresponding theoretical framework, to investigate the multi-scale regulatory mechanisms underlying the overall network and their relation to emergent states and functions in physiological and artificial systems. This calls for interaction among statistical physics, non-linear dynamics, information theory, probability and stochastic processes, artificial intelligence, machine learning, control theory and optimization, basic physiology, and medicine, so that new theoretical and algorithmic foundations can emerge for analyzing and designing physiological and artificial systems. Only then will the biomedical and engineering communities be able to develop new control methodologies that do not merely enforce a specific reference value but rather ensure that complexity and multifractality are restored to a desirable profile.

Author Contributions

PB and SP wrote the editorial with mentorship and feedback from PI.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Publisher’s Note

All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.

References

Adami C. (2002). What Is Complexity? BioEssays 24 (12), 1085–1094. doi:10.1002/bies.10192

Balaban V., Lim S., Gupta G., Boedicker J., Bogdan P. (2018). Quantifying Emergence and Self-Organisation of Enterobacter cloacae Microbial Communities. Sci. Rep. 8 (1). doi:10.1038/s41598-018-30654-9

Bashan A., Bartsch R. P., Kantelhardt J. W., Havlin S., Ivanov P. C. (2012). Network Physiology Reveals Relations between Network Topology and Physiological Function. Nat. Commun. 3. doi:10.1038/ncomms1705

Bassingthwaighte J., Liebovitch L., West B. (2013). Fractal Physiology (Springer).

Garcia-Retortillo S., Rizzo R., Wang J. W. J. L., Sitges C., Ivanov P. C. (2020). Universal Spectral Profile and Dynamic Evolution of Muscle Activation: A Hallmark of Muscle Type and Physiological State. J. Appl. Physiology 129 (3), 419–441. doi:10.1152/japplphysiol.00385.2020

Gershenson C., Polani D., Martius G. (2021). Editorial: Complexity and Self-Organization. Front. Robot. AI 8, 668305. doi:10.3389/frobt.2021.668305

Gupta G., Pequito S., Bogdan P. (2018). Dealing with Unknown Unknowns: Identification and Selection of Minimal Sensing for Fractional Dynamics with Unknown Inputs. Proc. Am. Control Conf., 2814–2820. doi:10.23919/ACC.2018.8430866

Gupta G., Pequito S., Bogdan P. (2019). Learning Latent Fractional Dynamics with Unknown Unknowns. Proc. Am. Control Conf., 217–222. doi:10.23919/acc.2019.8815074

Hernández-Orallo J., Loe B. S., Cheke L., Martínez-Plumed F., Ó hÉigeartaigh S. (2021). General Intelligence Disentangled via a Generality Metric for Natural and Artificial Intelligence. Sci. Rep. 11 (1). doi:10.1038/s41598-021-01997-7

Ivanov P. C., Amaral L. A. N., Goldberger A. L., Havlin S., Rosenblum M. G., Struzik Z. R., et al. (1999). Multifractality in Human Heartbeat Dynamics. Nature 399 (6735), 461–465. doi:10.1038/20924

Ivanov P. C., Liu K. K. L., Bartsch R. P. (2016). Focus on the Emerging New Fields of Network Physiology and Network Medicine. New J. Phys. 18 (10), 100201. doi:10.1088/1367-2630/18/10/100201

Ivanov P. C., Bartsch R. P. (2014). Network Physiology: Mapping Interactions between Networks of Physiologic Networks. Underst. Complex Syst., 203–222. doi:10.1007/978-3-319-03518-5_10

Ivanov P. C., Goldberger A. L., Stanley H. E. (2002). “Fractal and Multifractal Approaches in Physiology,” in The Science of Disasters (Springer Berlin Heidelberg), 218–257. doi:10.1007/978-3-642-56257-0_7

Ivanov P. C., Rosenblum M. G., Peng C. K., Mietus J. E., Havlin S., Stanley H. E., et al. (1998). Scaling and Universality in Heart Rate Variability Distributions. Phys. A 249 (1–4), 587–593. doi:10.1016/S0378-4371(97)00522-0

Ivanov P. C., Ma Q. D. Y., Bartsch R. P., Hausdorff J. M., Nunes Amaral L. A., Schulte-Frohlinde V., et al. (2009). Levels of Complexity in Scale-Invariant Neural Signals. Phys. Rev. E 79 (4). doi:10.1103/PhysRevE.79.041920

Ivanov P. C., Nunes Amaral L. A., Goldberger A. L., Havlin S., Rosenblum M. G., Stanley H. E., et al. (2001). From 1/f Noise to Multifractal Cascades in Heartbeat Dynamics. Chaos 11 (3), 641–652. doi:10.1063/1.1395631

Jost J. (2004). External and Internal Complexity of Complex Adaptive Systems. Theory Biosci. 123 (1), 69–88. doi:10.1016/j.thbio.2003.10.001

Koorehdavoudi H., Bogdan P. (2016). A Statistical Physics Characterization of the Complex Systems Dynamics: Quantifying Complexity from Spatio-Temporal Interactions. Sci. Rep. 6. doi:10.1038/srep27602

Mukli P., Nagy Z., Eke A. (2015). Multifractal Formalism by Enforcing the Universal Behavior of Scaling Functions. Phys. A Stat. Mech. Its Appl. 417, 150–167. doi:10.1016/j.physa.2014.09.002

Polani D. (2008). “Foundations and Formalizations of Self-Organization,” in Advanced Information and Knowledge Processing (Springer), 19–37. doi:10.1007/978-1-84628-982-8_2

Prokopenko M., Boschetti F., Ryan A. J. (2009). An Information-Theoretic Primer on Complexity, Self-Organization, and Emergence. Complexity 15 (1), 11–28. doi:10.1002/cplx.20249

Prokopenko M., Polani D., Ay N. (2014). On the Cross-Disciplinary Nature of Guided Self-Organisation. 3–15. doi:10.1007/978-3-642-53734-9_1

Richards B. A., Lillicrap T. P., Beaudoin P., Bengio Y., Bogacz R., Christensen A., et al. (2019). A Deep Learning Framework for Neuroscience. Nat. Neurosci. 22 (11), 1761–1770. doi:10.1038/s41593-019-0520-2

Rizzo R., Zhang X., Wang J. W. J. L., Lombardi F., Ivanov P. C. (2020). Network Physiology of Cortico-Muscular Interactions. Front. Physiol. 11. doi:10.3389/fphys.2020.558070

Sanders D. B., Stålberg E. V., Nandedkar S. D. (1996). Analysis of the Electromyographic Interference Pattern. J. Clin. Neurophysiology 13 (5), 385–400. doi:10.1097/00004691-199609000-00003

Stanley H. E., Amaral L. A., Goldberger A. L., Havlin S., Ivanov P. C., Peng C. K. (1999). Statistical Physics and Physiology: Monofractal and Multifractal Approaches. Phys. A 270 (1), 309–324. doi:10.1016/S0378-4371(99)00230-7

West B. J., Zweifel P. F. (1992). Fractal Physiology and Chaos in Medicine. Phys. Today 45 (3), 68–70. doi:10.1063/1.2809583

Wu C.-J., Raghavendra R., Gupta U., Acun B., Ardalani N., Maeng K., et al. (2021). Sustainable AI: Environmental Implications, Challenges and Opportunities. Available at: https://arxiv.org/abs/2111.00364.

Xue Y., Bogdan P. (2017b). “Constructing Compact Causal Mathematical Models for Complex Dynamics,” in Proceedings - 2017 ACM/IEEE 8th International Conference on Cyber-Physical Systems, 97–107. ICCPS 2017 (Part of CPS Week). doi:10.1145/3055004.3055017

Xue Y., Bogdan P. (2019). Reconstructing Missing Complex Networks against Adversarial Interventions. Nat. Commun. 10 (1). doi:10.1038/s41467-019-09774-x

Xue Y., Bogdan P. (2017a). Reliable Multi-Fractal Characterization of Weighted Complex Networks: Algorithms and Implications. Sci. Rep. 7 (1). doi:10.1038/s41598-017-07209-5

Xue Y., Rodriguez S., Bogdan P. (2016). “A Spatio-Temporal Fractal Model for a CPS Approach to Brain-Machine-Body Interfaces,” in Proceedings of the 2016 Design, Automation and Test in Europe Conference and Exhibition, 642–647. DATE. doi:10.3850/9783981537079_0502

Keywords: fractional-order dynamic models, artificial intelligence (AI), physiology, multifractal networks, neuro-inspired artificial intelligence

Citation: Bogdan P, Ivanov PC and Pequito S (2022) Editorial: Inference, Causality and Control in Networks of Dynamical Systems: Data Science and Modeling Perspectives to Network Physiology With Implications for Artificial Intelligence. Front. Physiol. 13:917001. doi: 10.3389/fphys.2022.917001

Received: 10 April 2022; Accepted: 21 April 2022;
Published: 11 May 2022.

Edited and reviewed by:

Francoise Argoul, Centre National de la Recherche Scientifique (CNRS), France

Copyright © 2022 Bogdan, Ivanov and Pequito. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Sergio Pequito, sergio.pequito@tudelft.nl
