Researchers, one of them an Indian-origin scientist, from Ohio State University have discovered the area of the brain responsible for recognising human facial expressions.
Using functional magnetic resonance imaging (fMRI) to scan participants as they looked at images of people making different facial expressions, the team found that the area sits on the right side of the brain, behind the ear, in a region called the posterior superior temporal sulcus (pSTS).
The researchers also found that neural patterns within the pSTS are specialised for recognising movement in specific parts of the face.
One pattern is tuned to detect a furrowed brow, another is tuned to detect the upturn of lips into a smile, and so on.
“This suggests that our brains decode facial expressions by adding up sets of key muscle movements in the face of the person we are looking at,” said Aleix Martinez, cognitive scientist and professor of electrical and computer engineering at Ohio State.
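The "adding up key muscle movements" idea can be illustrated with a toy sketch. In facial-expression research, such movements are commonly coded as action units (AUs) from the Facial Action Coding System; the specific AU-to-expression assignments and the scoring rule below are illustrative assumptions, not the study's actual coding.

```python
# Toy sketch: represent a face as a set of active action units (AUs) and
# recognise the expression by "adding up" the muscle movements that match
# a per-expression template. The templates below are simplified examples.
TEMPLATES = {
    "happy":             {"AU6", "AU12"},          # cheek raiser + lip-corner puller (smile)
    "sad":               {"AU1", "AU4", "AU15"},   # brow movements + lip-corner depressor
    "surprised":         {"AU1", "AU2", "AU25"},   # raised brows + parted lips
    "happily surprised": {"AU1", "AU2", "AU12", "AU25"},
}

def recognise(active_units: set) -> str:
    # Score each template by counting matching AUs (the "sum" of detected
    # movements), penalising AUs that appear in only one of the two sets.
    def score(template: set) -> float:
        return len(active_units & template) - 0.5 * len(active_units ^ template)
    return max(TEMPLATES, key=lambda name: score(TEMPLATES[name]))

print(recognise({"AU6", "AU12"}))                  # → happy
print(recognise({"AU1", "AU2", "AU12", "AU25"}))   # → happily surprised
```

A compound expression such as "happily surprised" wins here simply because its template accumulates more matching movements than either "happy" or "surprised" alone, mirroring the additive coding the quote describes.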
Doctoral student Ramprakash Srinivasan, Martinez and the team placed 10 college students into an fMRI machine and showed them more than 1,000 photographs of people making facial expressions.
The expressions corresponded to seven different emotional categories: disgusted, happily surprised, happily disgusted, angrily surprised, fearfully surprised, sadly fearful and fearfully disgusted.
From this brain activity, the team created a machine learning algorithm that identifies which facial expression a person is looking at based solely on the fMRI signal.
Humans use a vast number of facial expressions to convey emotion, other non-verbal signals and even language.
Yet when we see someone make a face, we recognise it instantly, seemingly without conscious effort.
“Now we know that there is a small part of the brain devoted to this task,” Martinez added.
Trained on the fMRI data, the algorithm decodes human facial expressions with about a 60 percent success rate, regardless of which expression is shown and regardless of who is viewing it.
“That’s a very powerful development, because it suggests that the coding of facial expressions is very similar in your brain and my brain and most everyone else’s brain,” Martinez pointed out in a paper published in the Journal of Neuroscience.
The work could have a variety of applications, helping us understand not only how the brain processes facial expressions, but ultimately how this process may differ in people with autism, for example.