Simple rhythmic sounds can reshape the brain’s entire network landscape, study finds

Even a basic auditory rhythm can reconfigure how the brain organizes itself, according to a new study published in Advanced Science. The researchers introduced a new analytical tool called FREQ-NESS, which maps how brain networks operate simultaneously across different frequencies. The results suggest that hearing rhythmic tones engages the auditory cortex while also reorganizing the brain’s broader network configuration, shifting dominant oscillations and enhancing communication between slower and faster brain rhythms.

Cognitive neuroscience aims to understand how the brain functions as a dynamic system that processes ongoing information from the environment. A key challenge has been accurately capturing how different brain networks—each with their own frequency and spatial characteristics—operate at the same time, particularly when the brain is responding to external stimuli like music or speech.

Previous research has tended to focus either on specific anatomical regions or on broad, canonical frequency bands such as alpha or gamma. These approaches have produced useful but fragmented insights. Another complication is that multiple brain processes overlap in time and space, which makes it hard to isolate individual networks using traditional methods.

To address this, the authors developed a method that could simultaneously capture frequency-specific brain networks with fine spatial and temporal resolution. Their approach also aimed to avoid the limitations of prior methods that depend on pre-defined anatomical regions or that group oscillations into broad frequency categories. The result was FREQ-NESS, short for FREQuency-resolved Network Estimation via Source Separation.

“We wanted a simple, transparent way to see how multiple brain networks operate at specific frequencies at the same time, and how this organization changes when the brain engages with sound and rhythm,” said study author Mattia Rosso, a postdoctoral researcher at the Center for Music in the Brain at Aarhus University.

“Existing tools either predefine regions/bands or blur overlapping processes, so we built FREQ-NESS as an analytical pipeline to estimate frequency-resolved networks directly at the level of the brain’s anatomical sources, facilitating the analysis of the temporal and spatial dynamics within the networks as well as the interactions across networks.”

The researchers recruited 29 participants, mostly non-musicians, and recorded their brain activity using magnetoencephalography (MEG), which offers high temporal precision. Participants underwent two five-minute sessions: one where they sat in a resting state while watching a silent movie, and another where they passively listened to rhythmic tones while watching the same movie. The tones were presented at a steady rate of 2.4 Hz, or 2.4 tones per second (144 per minute).

The MEG data were source-reconstructed to estimate the activity of over 3,500 brain voxels, representing fine-grained locations across the brain. These voxel time series were analyzed using generalized eigendecomposition, or GED, to separate narrowband (frequency-specific) activity from broader background signals. This allowed the researchers to isolate networks that were most active at specific frequencies.
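The paper's full pipeline is more involved, but the core GED step can be sketched in a few lines. In this toy illustration (simulated channels stand in for the thousands of source-reconstructed voxels; the bandwidth and regularization values are arbitrary choices, not the authors' settings), a spatial filter is found that maximizes power in a narrow band relative to the broadband signal:

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt

def ged_narrowband_filter(data, sfreq, f0, bw=0.5):
    """Estimate a spatial filter maximizing power at f0 via GED.

    data: (n_channels, n_samples) array -- a toy stand-in for
    source-reconstructed voxel time series.
    """
    # Narrowband-filter the data around the target frequency
    b, a = butter(2, [f0 - bw, f0 + bw], btype="bandpass", fs=sfreq)
    narrow = filtfilt(b, a, data, axis=1)

    # Covariance of the narrowband signal (S) vs. broadband reference (R)
    S = np.cov(narrow)
    R = np.cov(data)
    # Small regularization of R for numerical stability
    R += 1e-6 * np.trace(R) / len(R) * np.eye(len(R))

    # Generalized eigendecomposition: S w = lambda R w
    evals, evecs = eigh(S, R)
    w = evecs[:, -1]  # filter with the largest eigenvalue
    return w, evals[-1]

# Toy example: 5 channels, channel 2 carries a 10 Hz oscillation
rng = np.random.default_rng(0)
sfreq = 200
t = np.arange(0, 20, 1 / sfreq)
data = rng.standard_normal((5, t.size))
data[2] += 3 * np.sin(2 * np.pi * 10 * t)

w, lam = ged_narrowband_filter(data, sfreq, f0=10)
print(int(np.argmax(np.abs(w))))  # channel 2 dominates the filter
```

The eigenvector with the largest eigenvalue defines the network's spatial pattern, and projecting the data through it yields the network's time course; repeating this across many target frequencies gives a frequency-resolved set of networks.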

They applied this analysis to 86 frequencies ranging from 0.2 Hz to nearly 100 Hz. Each frequency’s network structure was characterized by both how much variance it explained (a proxy for prominence) and by its spatial activation pattern. The researchers also examined how low-frequency activity modulated the amplitude of higher-frequency networks, a phenomenon known as cross-frequency coupling.
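Cross-frequency coupling of this phase-to-amplitude kind can be quantified in several ways; a minimal sketch using a generic mean-vector-length measure (a Canolty-style metric chosen here for illustration, not necessarily the paper's exact estimator) looks like this:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def pac_strength(x, sfreq, f_phase, f_amp, bw_phase=0.4, bw_amp=10.0):
    """Mean-vector-length estimate of phase-amplitude coupling:
    how strongly the amplitude at f_amp follows the phase at f_phase."""
    def bandpass(sig, lo, hi):
        b, a = butter(2, [lo, hi], btype="bandpass", fs=sfreq)
        return filtfilt(b, a, sig)

    # Phase of the slow rhythm, amplitude envelope of the fast rhythm
    phase = np.angle(hilbert(bandpass(x, f_phase - bw_phase, f_phase + bw_phase)))
    amp = np.abs(hilbert(bandpass(x, f_amp - bw_amp, f_amp + bw_amp)))
    return np.abs(np.mean(amp * np.exp(1j * phase)))

# Toy signal: 60 Hz amplitude modulated by the phase of a 2.4 Hz rhythm
rng = np.random.default_rng(1)
sfreq = 200
t = np.arange(0, 30, 1 / sfreq)
slow = np.sin(2 * np.pi * 2.4 * t)
coupled = (1 + 0.8 * slow) * np.sin(2 * np.pi * 60 * t) + slow
uncoupled = np.sin(2 * np.pi * 60 * t) + slow + 0.1 * rng.standard_normal(t.size)

print(pac_strength(coupled, sfreq, 2.4, 60) >
      pac_strength(uncoupled, sfreq, 2.4, 60))
```

When the fast oscillation's envelope tracks the slow phase, the amplitude-weighted phase vectors add up coherently and the measure is large; without coupling they cancel toward zero.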

During the resting condition, the brain showed a typical 1/f pattern, with low frequencies dominating. Familiar networks emerged, including the default mode network (DMN) at low frequencies, a parieto-occipital alpha network peaking around 10.9 Hz, and a motor-related beta network around 22.9 Hz. These findings are consistent with past work showing that different brain areas tend to operate preferentially in certain frequency bands.

When participants listened to the rhythmic tones, the brain’s network landscape changed in three key ways.

First, the auditory stimulation caused the emergence of new networks that were sharply attuned to the stimulation frequency of 2.4 Hz and its harmonic at 4.8 Hz. These networks were concentrated in the auditory cortex, including Heschl’s gyrus, and extended to medial temporal areas such as the hippocampus and insula. These regions are known to play roles in both early and higher-level auditory processing.

Second, existing brain networks shifted their frequency preferences and spatial configurations. For example, the peak alpha activity moved from 10.9 Hz to 12.1 Hz, and the spatial focus of alpha activity shifted from parieto-occipital regions to sensorimotor areas. This suggests that the alpha network, commonly involved in attention and inhibition, reorganized itself in response to rhythmic input—perhaps to support movement preparation or sensory prediction.

Third, some networks remained largely unchanged. The motor-related beta network, centered around 22.9 Hz and focused in the precentral gyrus, showed similar prominence and topography in both resting and listening conditions. This stability suggests that not all networks are influenced equally by external stimuli.

Beyond these shifts, the researchers also found that listening to rhythmic tones strengthened the coordination between slower and faster rhythms. Specifically, the phase of the auditory-attuned network at 2.4 Hz modulated the amplitude of gamma-band activity—fast oscillations above 60 Hz—in brain areas such as the insula and frontal operculum. This cross-frequency coupling was stronger during listening than at rest, suggesting that the brain ramps up coordination across timescales to process rhythmic stimuli.

Importantly, these gamma-band networks were not located in the auditory cortex itself, but appeared in broader associative regions. This indicates that the brain’s response to rhythm is not limited to processing sounds per se—it may also involve integrating sensory input with memory, attention, or predictive processes.

“While based on our simulations we expected the primary auditory network to attune to the auditory stimulus, we were surprised to find that the rest of the network landscape underwent a global reorganization,” Rosso told PsyPost. “In particular, we found that the alpha network not only sped up (from about 10.9 Hz to 12.1 Hz) but also shifted from occipital to sensorimotor regions, suggesting a preparatory state for action in response to rhythmic input.”

“We were also struck that the gamma effects weren’t centered in primary auditory cortex but in a broader network (insula, inferior temporal, hippocampal, frontal operculum/IFG), suggesting that fast activity is mediated by interactions with low-frequency auditory networks rather than originating locally. Finally, the robustness of the method with very short recordings (down to ~30 seconds) was a pleasant surprise, confirming that the method is both sensitive and feasible.”

The researchers confirmed the robustness of their method through several replication tests. They showed that similar network landscapes could be detected in independent datasets, that shorter recording durations still produced reliable patterns, and that using different sensor types (gradiometers vs. magnetometers) had little effect on the results.

To test the importance of the method’s spatial and temporal structure, they also introduced randomization procedures. When the spatial arrangement of voxels was scrambled, the frequency content of the data was preserved but the resulting networks were meaningless. When both space and time structure were disrupted, the method failed to isolate any coherent networks. This provided further evidence that the detected patterns were not artifacts but reflected genuine brain dynamics.
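The logic of such a control can be sketched with simulated data: once the temporal structure is scrambled, a narrowband-versus-broadband decomposition has nothing coherent left to find, and its top eigenvalue collapses toward chance. (This is a toy demonstration of the principle, not a reproduction of the authors' randomization procedure.)

```python
import numpy as np
from scipy.linalg import eigh
from scipy.signal import butter, filtfilt

def top_ged_eigenvalue(data, sfreq, f0, bw=0.5):
    """Largest generalized eigenvalue of narrowband vs. broadband covariance:
    a measure of how prominent a frequency-specific network is."""
    b, a = butter(2, [f0 - bw, f0 + bw], btype="bandpass", fs=sfreq)
    narrow = filtfilt(b, a, data, axis=1)
    S, R = np.cov(narrow), np.cov(data)
    R += 1e-6 * np.trace(R) / len(R) * np.eye(len(R))
    return eigh(S, R, eigvals_only=True)[-1]

rng = np.random.default_rng(2)
sfreq = 200
t = np.arange(0, 20, 1 / sfreq)
data = rng.standard_normal((5, t.size))
data[2] += 3 * np.sin(2 * np.pi * 10 * t)  # a genuine 10 Hz network

# Control: shuffle each channel's samples independently,
# destroying the temporal structure the method relies on
scrambled = np.apply_along_axis(rng.permutation, 1, data)

print(top_ged_eigenvalue(data, sfreq, 10) >
      3 * top_ged_eigenvalue(scrambled, sfreq, 10))
```

The intact data yield a clearly dominant eigenvalue at the oscillation's frequency, while the scrambled data do not, mirroring the article's point that coherent networks vanish once the underlying structure is disrupted.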

“Even during very simple listening, the brain reconfigures, attuning its internal dynamics to the external world in order to process information,” Rosso summarized. “It does so in three main ways: 1) it attunes its primary sensory networks to the stimulation frequency, 2) adapts the frequency and spatial arrangement of intrinsic oscillations, and 3) boosts communication between slow and faster internal rhythms. In short, rhythm doesn’t just ‘light up the auditory area’—it retunes the brain’s network landscape as a whole.”

While FREQ-NESS provides a powerful tool for mapping frequency-specific brain networks, the study had some limitations, including a modest sample size, the absence of baseline noise data from “empty-room” MEG recordings, and a minimalistic task design. The researchers suggest that future work could involve more ecologically valid stimuli and expand the method to support broadband and multimodal applications.

“The brain is an extremely complex system: frequency is only one criterion for its organization,” Rosso noted. “FREQ-NESS is explicitly designed with the limited scope of separating brain networks and analyzing their interactions based on their frequency-specificity. We are very much aware of this, and so should be whoever is going to use the method for their investigation of the brain. Frequency alone does not ‘solve the brain.’”

“Our study presents the frequency-resolved variant of the broader NESS framework, which stands for ‘Network Estimation via Source Separation’. In a research landscape moving toward overly complex or AI-based ‘black box’ approaches, our aim is to extract insights from simple and interpretable linear decomposition methods, with very few degrees of freedom.”

“The two major directions now are: making the method multimodal (e.g., applicable to data of different nature), and developing a broadband variant, which is already in press in a high-impact journal. In the meantime, we and collaborators across European centers are applying the pipeline to study how the brain’s network landscape changes across states of consciousness, during psychedelic intake, with aging, in musicians, and other experimental conditions and populations.”

“The FREQ-NESS toolbox and full pipeline are openly available on our GitHub repository (https://github.com/mattiaRosso92/Frequency-resolved_brain_network_estimation_via_source_separation_FREQ-NESS/tree/main/FREQNESS_Toolbox), and the paper is open access,” he added. “We hope others will use it to probe how ongoing brain rhythms shape perception and action. We are also very open to providing guidance and setting up collaborations, so please do not hesitate to reach out.”

The study, “FREQ-NESS Reveals the Dynamic Reconfiguration of Frequency-Resolved Brain Networks During Auditory Stimulation,” was authored by Mattia Rosso, Gemma Fernández-Rubio, Peter Erik Keller, Elvira Brattico, Peter Vuust, Morten L. Kringelbach, and Leonardo Bonetti.
