The human brain’s remarkable ability to perform complex computations despite relatively constant energy consumption presents a long-standing puzzle, and researchers are now proposing a solution rooted in subtle asymmetries in the interactions between brain regions. Yoshiaki Horiike from Nagoya University and the University of Copenhagen, together with Shin Fujishiro from Kyoto University, leads a team demonstrating that brain function arises not from large energy shifts, but from delicate modifications to the way brain regions communicate. Analysing patterns of brain activity during both rest and task performance, the team reveals that the brain operates through distinct sequences of state transitions, measurable as probability fluxes. Their work establishes that while strong, symmetric interactions between brain regions remain consistent across tasks, it is the subtle, task-dependent antisymmetric interactions that shape specific functions, potentially allowing the brain to achieve complex computation with minimal changes in energy expenditure.
Emerging technologies enable the recording of population activity across many neurons, and neural-network theory is expected to explain and extract functional computations from such data. Thermodynamically, the brain consumes a large proportion of whole-body energy, and functional computation would seem to require commensurately high energy expenditure; yet measured consumption stays nearly constant across tasks, raising a fundamental question: how can the human brain perform its wide repertoire of functional computations without drastically changing its energy consumption? Here, researchers present a mechanism…
Brain Entropy Production From Network Transitions
This document details the theoretical framework and empirical methods used to estimate entropy production rates in the human brain, combining principles of stochastic thermodynamics with analysis of brain network data. The study focuses on quantifying the entropy production rate, a measure of the system’s energy dissipation, as a key indicator of brain function, and demonstrates how entropy production relates to energy consumption. The authors highlight the importance of broken detailed balance, showing that brain network transitions are statistically irreversible, a prerequisite for nonzero entropy production. The team analyzed brain network data, obtained through neuroimaging techniques, to estimate entropy production rates during various cognitive conditions, including rest, emotional processing, and working memory.
They employed a method of coarse-graining, simplifying the complex brain network data by grouping states into clusters, while acknowledging that the number of clusters influences the estimated entropy production rate. Results show that the proportion of observed state transitions decreases as the network is simplified, and that estimated entropy production rates vary with the cognitive task. Statistical tests confirm that entropy production rates for different tasks differ significantly, and that only a weak correlation exists between response rate and entropy production. This work bridges the gap between neuroscience and physics by applying thermodynamic principles to understand brain function, suggesting that entropy production may be a fundamental constraint on brain activity and could provide insights into the efficiency and adaptability of the brain. The findings suggest that different cognitive tasks are associated with different levels of entropy production, reflecting their complexity and energy demands.
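As a minimal sketch of the kind of estimator involved (not the authors’ exact pipeline — the state labels, ring-walk data, and cluster count below are hypothetical illustrations), an entropy production rate can be read off from transition counts between coarse-grained states, and it vanishes exactly when detailed balance holds:

```python
import numpy as np

def entropy_production_rate(states, n_states):
    """Plug-in entropy production estimate for a sequence of
    coarse-grained state labels. Counts transitions i -> j and applies
        sigma = (1/2) * sum_ij (P_ij - P_ji) * ln(P_ij / P_ji),
    which is zero under detailed balance (P_ij == P_ji) and positive
    when transitions are statistically irreversible."""
    C = np.zeros((n_states, n_states))
    for i, j in zip(states[:-1], states[1:]):
        C[i, j] += 1.0
    P = C / C.sum()  # empirical joint probability of (i, j) pairs
    sigma = 0.0
    for i in range(n_states):
        for j in range(n_states):
            if P[i, j] > 0 and P[j, i] > 0:  # skip one-way pairs
                sigma += 0.5 * (P[i, j] - P[j, i]) * np.log(P[i, j] / P[j, i])
    return sigma

# Hypothetical data: a biased walk on a 3-state ring breaks detailed
# balance (it circulates), while an unbiased walk does not.
rng = np.random.default_rng(0)
def ring_walk(p_forward, n=20000):
    s, out = 0, [0]
    for _ in range(n):
        s = (s + (1 if rng.random() < p_forward else -1)) % 3
        out.append(s)
    return out

sigma_biased = entropy_production_rate(ring_walk(0.8), 3)
sigma_balanced = entropy_production_rate(ring_walk(0.5), 3)
```

The biased walk yields a clearly positive rate while the balanced walk’s estimate sits near zero, illustrating why irreversible (cyclic) state sequences, not activity per se, carry the dissipation signal; the estimate also depends on how finely states are clustered, consistent with the cluster-number dependence noted above.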
Brain Efficiency Arises From Sequential State Transitions
Researchers have uncovered a mechanism explaining how the human brain performs complex computations without large changes in energy consumption. Analyses reveal that brain function arises from unique sequential patterns in state transitions, indicating that brain activity progresses through distinct phases depending on the task being performed. Analysis of whole-cerebral-cortex activity, recorded using functional magnetic resonance imaging, demonstrates that these state transitions exhibit task-specific patterns, suggesting a dynamic reorganization of brain activity. The team developed a computational model, based on Ising spin systems with asymmetric interactions, to investigate the underlying network structure responsible for these patterns.
Results show that strong, task-independent interactions form the foundation of the brain’s network, while subtle, task-dependent antisymmetric interactions modulate activity and drive the observed functional computation. This model accurately reproduces the observed patterns of probability flux, demonstrating that the brain achieves its computational flexibility not through large energy expenditures, but through minute adjustments to these antisymmetric interactions. The findings indicate that approximately 60 to 80% of the brain’s energy consumption is not directly involved in specific functions, but maintains the underlying network structure, and that the brain’s efficiency stems from its ability to subtly modify interactions within this established network. This research anticipates potential applications in brain-inspired computing technologies, such as neuromorphic computing, and provides a novel method for analyzing complex, high-dimensional systems to infer underlying interactions.
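To make the symmetric/antisymmetric split concrete, here is an illustrative kinetic Ising sketch, a toy version of the model class named above rather than the study’s fitted model (the network size, random couplings, and the `epsilon` scaling are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 8  # hypothetical number of brain regions
J = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
np.fill_diagonal(J, 0.0)

# Any coupling matrix splits uniquely into a symmetric part
# (energy-like, compatible with detailed balance) and an
# antisymmetric part (which drives probability flux):
J_sym = 0.5 * (J + J.T)
J_antisym = 0.5 * (J - J.T)

def glauber_step(s, J, rng):
    """One parallel Glauber update of a kinetic Ising model.
    With an asymmetric J, the stationary dynamics break detailed
    balance, so state transitions carry net probability flux."""
    h = J @ s                               # local fields
    p_up = 1.0 / (1.0 + np.exp(-2.0 * h))  # P(s_i = +1 | field)
    return np.where(rng.random(len(s)) < p_up, 1, -1)

# Keep the strong symmetric backbone fixed and dial only the weak
# antisymmetric part (epsilon is an illustrative knob):
epsilon = 0.1
s = rng.choice([-1, 1], size=N)
for _ in range(1000):
    s = glauber_step(s, J_sym + epsilon * J_antisym, rng)
```

In this picture, switching “tasks” means changing only `epsilon * J_antisym`: the symmetric backbone, and with it the bulk of the energetics, stays fixed while the weak antisymmetric component reshapes the flux of state transitions.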
Dynamic Brain States Enable Efficient Computation
This research investigates how the human brain performs complex computations without a corresponding increase in energy consumption. The team demonstrates that brain function is characterised by distinct sequential patterns of state transitions, emerging from subtle asymmetries within the network of interactions between brain regions. Analysis of fMRI data reveals that these asymmetric interactions are task-dependent, while the stronger, symmetric interactions remain relatively constant, suggesting the brain modifies its internal connectivity rather than overall energy expenditure to perform different functions. The findings support a view of cognition as a dynamic, sequential process, aligning with recent theories proposing wave-like motifs in brain activity.
Importantly, the research highlights the significance of probability flux, a measure of state transitions, as a more informative indicator of brain dynamics than simply quantifying overall entropy production. While the study successfully captures key features of brain activity, the authors acknowledge that the model could be improved by addressing the limited correlation observed in the working memory task. Future work may also explore the broader implications of this approach for understanding nonequilibrium statistical physics and stochastic thermodynamics, potentially offering new avenues for investigating the fundamental principles governing brain function.
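The distinction between flux and total entropy production can be sketched with a toy Markov chain (the three states and transition probabilities are hypothetical): the scalar entropy production compresses everything into one number, while the signed flux matrix shows *which* cycle the dynamics traverse.

```python
import numpy as np

# Hypothetical 3-state transition matrix with a preferred cycle
# 0 -> 1 -> 2 -> 0 (rows sum to 1).
T = np.array([[0.2, 0.6, 0.2],
              [0.2, 0.2, 0.6],
              [0.6, 0.2, 0.2]])

# Stationary distribution: left eigenvector of T for eigenvalue 1.
w, v = np.linalg.eig(T.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()

# Net probability flux between states: F_ij = pi_i T_ij - pi_j T_ji.
# F is antisymmetric, and its positive entries trace the cycle's
# direction, information the scalar entropy production discards.
A = pi[:, None] * T
F = A - A.T
print(np.round(F, 3))
```

Here `F[0, 1]`, `F[1, 2]`, and `F[2, 0]` are all positive, exposing the 0 → 1 → 2 → 0 circulation, and the flux out of each state sums to zero at stationarity, a structure a single entropy production number cannot reveal.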
👉 More information
🗞 Distinct weak asymmetric interactions shape human brain functions as probability fluxes
🧠 arXiv: https://arxiv.org/abs/2508.20961
