How the sounds we hear help us predict how objects will feel to touch


Researchers from the University of East Anglia have made an important discovery about how our brain processes sound and tactile sensations.

A new study published today shows how the brain’s different sensory systems are all tightly interconnected, with regions that respond to touch also being involved when we listen to specific sounds associated with touching objects.

They discovered that these areas of the brain can tell the difference between sounds such as a ball bouncing and a keyboard being tapped.

It is hoped that understanding this key area of brain function may in future help people who are neurodiverse or who live with conditions such as schizophrenia or anxiety, and could also lead to developments in brain-inspired computing and AI.

Lead researcher Dr Fraser Smith, from the UEA School of Psychology, said: “We know that when we hear a familiar sound such as a bouncing ball, it leads us to expect to see a particular object. But what we discovered is that it also causes the brain to represent what it might feel like to touch and interact with that object.

“These expectations may help the brain process sensory information more efficiently.”

The research team used an MRI scanner to collect brain imaging data while 10 participants listened to sounds generated by interacting with objects – such as bouncing a ball, knocking on a door, crushing paper or tapping on a keyboard.

Using a special imaging technique called functional MRI (fMRI), they measured brain activity throughout the brain.

They used sophisticated machine learning analysis techniques to test whether activity in the brain's primary touch areas (primary somatosensory cortex) could distinguish between sounds generated by different types of object interaction (for example, bouncing a ball versus typing on a keyboard).

They also performed a similar analysis on control sounds, similar to those used in hearing tests, to rule out the possibility that just any sound could be discriminated in this brain region.
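The decoding logic described above can be illustrated with a toy sketch. The study's actual analysis pipeline is not described here; the code below simply simulates voxel activity patterns for two hypothetical sound categories ("ball" and "keys" are made-up labels) and tests, with a nearest-centroid classifier and leave-one-trial-out cross-validation, whether the category can be decoded above chance.

```python
import random

random.seed(0)
N_VOXELS = 50          # simulated voxels in a region of interest
TRIALS_PER_CLASS = 20  # simulated fMRI trials per sound category

# Hypothetical mean activity pattern evoked by each sound category.
sig_ball = [random.gauss(0, 0.8) for _ in range(N_VOXELS)]
sig_keys = [random.gauss(0, 0.8) for _ in range(N_VOXELS)]

def make_trial(signature):
    # Each trial is the category's mean pattern plus per-voxel noise.
    return [m + random.gauss(0, 1.0) for m in signature]

data = [(make_trial(sig_ball), "ball") for _ in range(TRIALS_PER_CLASS)] + \
       [(make_trial(sig_keys), "keys") for _ in range(TRIALS_PER_CLASS)]

def centroid(patterns):
    # Voxel-wise mean across a set of trial patterns.
    return [sum(col) / len(col) for col in zip(*patterns)]

def sq_dist(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Leave-one-trial-out cross-validation with a nearest-centroid decoder:
# classify each held-out trial by whichever training centroid is closer.
correct = 0
for i, (trial, label) in enumerate(data):
    train = [d for j, d in enumerate(data) if j != i]
    c_ball = centroid([p for p, lab in train if lab == "ball"])
    c_keys = centroid([p for p, lab in train if lab == "keys"])
    guess = "ball" if sq_dist(trial, c_ball) < sq_dist(trial, c_keys) else "keys"
    correct += (guess == label)

accuracy = correct / len(data)
print(f"decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```

If the region carries no information about the sound category (as with the control sounds), accuracy should hover around chance (0.50); decoding above chance is what indicates the region distinguishes the sounds.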

Researcher Dr Kerri Bailey said: “Our research shows that parts of our brains that were thought to only respond when we touch objects are also involved when we listen to specific sounds associated with touching objects.

“This supports the idea that a key role of these brain areas is to predict what we might feel next, regardless of the sensory input currently available.”

Dr Smith added: “Our findings challenge the way neuroscientists traditionally understand how sensory areas of the brain work and demonstrate that the different sensory systems in the brain are in fact all highly interconnected.

“Our hypothesis is that sounds provide predictions to help our future interaction with objects, in line with a key theory of brain function – called predictive processing.

“Understanding this key mechanism of brain function can provide compelling insights into mental health issues such as schizophrenia, autism or anxiety and further lead to developments in computing and AI inspired by the brain.”

This study was conducted by UEA, in collaboration with researchers from Aix-Marseille University (France) and Maastricht University (Netherlands).

‘Decoding sounds describing hand-object interactions in the primary somatosensory cortex’ is published in the journal Cerebral Cortex on August 24, 2022.



