
It has long been understood that experiencing two senses simultaneously, like seeing and hearing, can lead to improved responses relative to those seen when only one sensory input is experienced by itself. For example, a prey animal that gets both visual and auditory cues that it is about to be attacked by a snake in the grass has a better chance of survival.
Precisely how multiple senses are integrated, or work together, in the brain has fascinated neuroscientists for decades. New research by an international collaboration of scientists at the University of Rochester and a research team in Dublin, Ireland, has yielded key new insights.
“Just like sensory integration, sometimes you need human integration,” said John Foxe, Ph.D., director of the Del Monte Institute for Neuroscience at the University of Rochester and co-author of the study that shows how multisensory integration happens in the brain.
These findings were published in Nature Human Behaviour.
“This research was built on decades of study and friendship,” Foxe continued. “Sometimes ideas need time to percolate. There is a pace to science, and this research is the perfect example of that.”
Simon Kelly, Ph.D., professor at University College Dublin, led the study. In 2012, his lab discovered a way to track, via an electroencephalographic (EEG) signal, how the brain gathers information for a decision over time. That advance followed years of research that set the stage for this work.
“We were uniquely positioned to tackle this,” Kelly said. “The more we know about the fundamental brain architecture underlying such elementary behaviors, the better we can interpret differences in the behaviors and signals associated with such tasks in clinical groups and design mechanistically informed diagnostics and treatments.”
Research participants were asked to watch a simple dot animation while listening to a series of tones and press a button when they noticed a change in the dots, the tones, or both.
Using EEG, the scientists were able to infer that when changes happened in both the dots and the tones, auditory and visual decision processes unfolded in parallel but converged in the motor system. This convergence sped up participants' reaction times.
“We found that the EEG accumulation signal reached very different amplitudes when auditory versus visual targets were detected, indicating that there are distinct auditory and visual accumulators,” Kelly said.
Using computational models, the researchers then tried to explain both the decision-signal patterns and the reaction times. In one model, the auditory and visual accumulators race against each other, and whichever finishes first triggers the motor response; in the other, the outputs of the two accumulators are integrated, and the combined signal is sent to the motor system. Both models fit the data until the researchers added a slight delay to either the auditory or the visual signal.
With the delay in place, the integration model explained the data far better, suggesting that during a multisensory (audiovisual) experience, the decision signals may start on their own sensory-specific tracks but are then combined before being sent to the areas of the brain that generate movement.
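The two competing architectures can be illustrated with a minimal simulation. This is not the authors' actual model; it is a toy sketch of the general idea, and all parameter values (drift rate, noise, threshold) are illustrative assumptions. Two noisy sensory accumulators gather evidence; a "race" model responds when either one reaches the threshold on its own, while an "integration" model responds when their combined output reaches a shared criterion.

```python
import random

def simulate_trial(model, audio_delay=0.0, drift=1.0, noise=0.5,
                   threshold=1.0, dt=0.01, max_t=3.0, rng=random):
    """One simulated audiovisual trial. Returns the response time (seconds).
    'race': respond when either accumulator alone crosses the threshold.
    'integration': respond when the summed accumulators cross it."""
    a = v = 0.0   # auditory and visual evidence totals
    t = 0.0
    while t < max_t:
        t += dt
        if t >= audio_delay:  # auditory evidence may start late
            a += drift * dt + rng.gauss(0, noise) * dt ** 0.5
        v += drift * dt + rng.gauss(0, noise) * dt ** 0.5
        if model == "race" and (a >= threshold or v >= threshold):
            return t
        if model == "integration" and a + v >= threshold:
            return t
    return max_t  # no response within the trial window

def mean_rt(model, n=500, **kw):
    """Average response time over n trials (fixed seed for reproducibility)."""
    rng = random.Random(0)
    return sum(simulate_trial(model, rng=rng, **kw) for _ in range(n)) / n
```

In this toy setup, the integration model responds faster on dual-target trials because the combined evidence reaches the criterion sooner, and delaying one modality slows it gracefully; which architecture matches real behavior and EEG data is exactly what the study tested.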
“The research provides a concrete model of the neural architecture through which multisensory decisions are made,” Kelly said. “It clarifies that distinct decision processes gather information from different modalities, but their outputs converge onto a single motor process where they combine to meet a single criterion for action.”
Team science takes a village
In the 2000s, Foxe’s Cognitive Neurophysiology Lab, which was then located at City College in New York City, brought in a multitude of young researchers, including Kelly and Manuel Gomez-Ramirez, Ph.D., assistant professor of Brain and Cognitive Sciences at the University of Rochester and a co-author of the research.
It was there that Kelly did his postdoctoral work and was first introduced to multisensory integration and the tools and metrics used to assess audiovisual detection. Gomez-Ramirez, who was a Ph.D. student in the lab at the time, designed an experiment to understand the integration of auditory, visual, and tactile inputs.
“The three of us have been friends across the years with very different backgrounds,” Foxe said.
“But we are bound together by a common interest in answering fundamental questions about the brain. When we get together, we talk about these things, we run ideas by each other, and then six months later, something will come to you. This is a really good example that sometimes science operates on a longer temporal horizon.”
More information:
Distinct audio and visual accumulators co-activate motor preparation for multisensory detection, Nature Human Behaviour (2025). DOI: 10.1038/s41562-025-02280-9
Citation:
Scientists reveal how senses work together in the brain (2025, August 15)
retrieved 15 August 2025
from https://medicalxpress.com/news/2025-08-scientists-reveal-brain.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.
