Neuroscience study reveals shared processing of human and dog facial expressions

Researchers recently examined how human brains respond to emotional facial expressions from both humans and dogs, uncovering similarities in how these expressions are processed. Published in Social Cognitive and Affective Neuroscience, the study found that the brain's response patterns to emotional human and canine faces follow comparable temporal dynamics in specific brain regions. Interestingly, the brain responses of participants with higher levels of empathy could be decoded more accurately, for instance when distinguishing aggressive from happy dog faces or happy from neutral human faces, suggesting that empathy influences how these expressions are perceived and processed.

Humans and dogs have shared a close social bond for thousands of years, with dogs often integrated into human households and social activities. Given this relationship, researchers have long been curious about whether the mechanisms humans use to interpret emotional expressions extend to dogs.

Previous research has shown that human and dog faces activate overlapping brain regions, but much of this work has focused on slow, hemodynamic processes rather than the fast, millisecond-scale dynamics of neural activity. The current study aimed to bridge this gap by using advanced brain-imaging techniques to examine the rapid neural responses to emotional faces of both humans and dogs.

“We had comparative, social cognitive, and methodological motivations for this study,” said study author Miiamaaria Kujala, an adjunct professor in comparative cognitive neuroscience at the University of Jyväskylä. “Humans have sophisticated neural processing machinery for perceiving faces, and we wanted to explore how the human brain’s processing of human and dog facial expressions proceeds in time, during the first half a second of perception. Furthermore, as empathy affects the subjective experience of others’ emotional expressions, we asked if this may influence how well we are able to predict the individual brain responses with machine-learning procedures.”

The study included 15 healthy adult participants, with an average age of 28 and normal or corrected-to-normal vision. All were right-handed and had varying levels of familiarity with dogs, although most had limited experience interpreting canine behavior.

The participants were exposed to a set of images depicting human and dog faces with aggressive, happy, and neutral expressions. In addition, images of objects and scrambled visuals were included as controls. The visual stimuli were carefully prepared to ensure that differences in low-level visual properties, such as brightness and contrast, did not influence the results. Both color and grayscale versions of the images were used to further validate the findings.
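
The paper's stimulus-preparation code is not included in the article, but matching low-level image statistics is a standard step in studies like this one. Below is a minimal sketch of one common approach, equalizing mean luminance and RMS contrast across grayscale images with NumPy and Pillow; the file names and target values are hypothetical, not taken from the study:

```python
import numpy as np
from PIL import Image

def match_luminance_contrast(img_path, target_mean=0.5, target_rms=0.2):
    """Rescale a grayscale image so its mean luminance and RMS contrast
    match fixed targets (one common way to equalize low-level properties)."""
    img = np.asarray(Image.open(img_path).convert("L"), dtype=np.float64) / 255.0
    centered = img - img.mean()
    rms = centered.std()                      # RMS contrast of the original
    if rms > 0:
        centered = centered * (target_rms / rms)
    # Clipping keeps values in range but can perturb the targets slightly.
    out = np.clip(centered + target_mean, 0.0, 1.0)
    return (out * 255).astype(np.uint8)

# Hypothetical file name; the same targets would be applied to every stimulus.
equalized = match_luminance_contrast("dog_happy_01.png")
Image.fromarray(equalized).save("dog_happy_01_eq.png")
```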

Participants’ brain activity was recorded using electroencephalography (EEG) and magnetoencephalography (MEG). These techniques capture rapid, millisecond-scale changes in neural activity, allowing the researchers to map brain responses over time. Each image was displayed for 500 milliseconds, with short breaks between blocks of images to avoid fatigue. The EEG and MEG recordings were supplemented with structural brain imaging to enhance the precision of the neural data.
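
The article does not publish the analysis pipeline, but extracting stimulus-locked responses from continuous MEG/EEG recordings is conventionally done by epoching around each image onset. A minimal sketch using MNE-Python, with hypothetical file names and trigger codes, matching the 500-millisecond presentation window described above:

```python
import mne

# Hypothetical raw recording and event codes; the study's actual file
# formats and trigger values are not given in the article.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
events = mne.find_events(raw, stim_channel="STI 014")
event_id = {"human/happy": 1, "human/aggressive": 2, "human/neutral": 3,
            "dog/happy": 4, "dog/aggressive": 5, "dog/neutral": 6}

# Epoch from 100 ms before to 500 ms after image onset, with baseline
# correction over the pre-stimulus interval.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.1, tmax=0.5, baseline=(None, 0),
                    preload=True)

# Average over trials to obtain the evoked response for one category.
evoked_dog_aggressive = epochs["dog/aggressive"].average()
```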

To assess how individual traits influenced neural responses, participants completed questionnaires measuring empathy levels and rated the emotional valence (positive or negative) and arousal of the facial expressions they viewed. The researchers also used machine-learning algorithms to analyze the recorded brain activity, focusing on the accuracy of classifying different facial expressions and species based on neural patterns.
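
The article does not specify which classifier the authors used, so the following is only a sketch of the general time-resolved decoding approach, continuing from the epoching sketch above: MNE-Python's SlidingEstimator fits an independent logistic-regression classifier at every time point, scoring how well the sensor patterns separate two stimulus categories as the response unfolds.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from mne.decoding import SlidingEstimator, cross_val_multiscore

# Restrict to one binary contrast, e.g. happy vs. aggressive dog faces.
sub = epochs[["dog/happy", "dog/aggressive"]]
X = sub.get_data()                        # (n_trials, n_channels, n_times)
y = (sub.events[:, 2] == event_id["dog/aggressive"]).astype(int)

# One classifier per time point across the epoch.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
time_decoder = SlidingEstimator(clf, scoring="roc_auc", n_jobs=1)

# 5-fold cross-validated decoding score as a function of time.
scores = cross_val_multiscore(time_decoder, X, y, cv=5).mean(axis=0)
print("peak decoding AUC:", scores.max())
```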

The results indicated that human brains process emotional expressions from humans and dogs in surprisingly similar ways. Neural activity recorded during the first 500 milliseconds after a face was presented showed consistent patterns across both species. The earliest responses were observed in the occipital cortex, which processes visual information, followed by activity in the temporal and parietal cortices. These brain regions are associated with interpreting emotional and social cues. Notably, responses to dog faces were particularly pronounced in the temporal cortex, a region linked to attentional engagement with emotionally salient stimuli.

The machine-learning analysis demonstrated that neural responses could reliably differentiate between human and dog faces and between emotional expressions within each species. Classification accuracy was highest for aggressive expressions, which elicited stronger and more distinct neural responses compared to happy or neutral faces. This suggests that threatening or negative expressions capture more attention and provoke more robust brain activity, regardless of the species.

Empathy emerged as a key factor influencing these neural patterns. The brain responses of participants with higher emotional empathy could be classified more accurately for certain contrasts, such as aggressive versus happy dog faces and happy versus neutral human faces. This finding suggests that empathic individuals are more attuned to emotional cues and allocate greater attention to emotionally significant stimuli, yielding neural responses that are more distinct to a classifier. These results also indicate that empathy may play a role in how humans interpret and respond to emotions across species, highlighting its broader relevance in social cognition.
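
The article does not detail the statistics behind this link, but relating a per-participant trait score to per-participant decoding accuracy is typically done with a simple rank correlation. A hypothetical sketch with SciPy; the arrays here are illustrative stand-ins, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

# Illustrative stand-in values only: one empathy questionnaire score and
# one decoding accuracy (e.g. aggressive vs. happy dog faces) per participant.
empathy_scores = np.array([12, 18, 9, 22, 15, 20, 11, 17, 14, 19, 8, 21, 13, 16, 10])
decoding_acc = np.array([0.61, 0.70, 0.55, 0.74, 0.63, 0.71, 0.58,
                         0.66, 0.62, 0.69, 0.54, 0.72, 0.60, 0.65, 0.57])

# Spearman's rank correlation is robust to non-normal trait distributions.
rho, p = spearmanr(empathy_scores, decoding_acc)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```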

“We do differ in the way we experience our surroundings,” Kujala told PsyPost. “Empathic people have heightened attention for emotional information, which likely showed in our results—their brain responses were likewise clearer for the algorithm to detect differences between categories. This may have pros and cons—if you are constantly alert, you may react quicker to changing situations, but you may also tire and burn out quicker.”

While the study provides new insights, it is not without limitations. The sample size was relatively small, consisting of only 15 participants, which may limit the generalizability of the findings. Future research could include larger, more diverse samples and explore whether individuals with extensive experience with dogs, such as professional trainers or veterinarians, show different patterns of brain activity.

“We had quite a small sample size by today’s standards, but our subjects were experienced and knew how to ‘subject’ in neurophysiological studies—meaning, how to relax while having a head full of electrodes, understanding what happens, and how to concentrate on the task,” Kujala said. “But the results may be difficult to repeat within a sample that is novice, doesn’t understand how their motion affects the data, and may be intimidated by the research procedures, as the unrelated noise may also affect the success of decoding brain responses, masking this kind of effect. Adding more data to predictions does not always help if the data is noisy.”

Looking ahead, Kujala said she would “like to understand the interplay between empathy, anthropomorphism, theory of mind, and accurate interpretation of non-human minds. Machine learning is also an interesting tool, and I find it educational to explore the kinds of questions we can try to answer with it.”

The study, “Empathy enhances decoding accuracy of human neurophysiological responses to emotional facial expressions of humans and dogs,” was authored by Miiamaaria V. Kujala, Lauri Parkkonen, and Jan Kujala.
