A Parisian research group is using PowerLab systems to help understand why we mimic the emotional facial expressions we perceive in others during social interactions. As part of its studies, the group has configured its PowerLab to automatically annotate LabChart software's physiological recordings with events generated by E-Prime®, its chosen experiment design and presentation software.
The Social Research Group within the Department of Cognitive Studies at the Ecole Normale Supérieure in Paris is interested in how we use other people's non-verbal behaviour to understand their emotions and intentions. The group, led by Dr Julie Grèzes in collaboration with Dr Laurence Conty (LPN, Paris 8), is specifically concerned with identifying the underlying mechanisms and neurobiological bases of intentional and emotional body movements.
Smile and the whole world smiles with you. But why?
Past emotional perception studies have observed that we automatically imitate the emotions we see in other people’s faces. For example, when we perceive a happy face, our zygomaticus muscles tense – or to put it another way, we smile. Similarly, when presented with an angry face, our eyebrows will set into a frown.
Two different theories have been put forward to explain facial mimicry. The first argues that emotional contagion relies on rapid, automatic motor matching of observed emotional expressions, which then causes emotion through facial feedback. The second proposes that these rapid facial reactions are the result (and not the cause) of emotional processes.
Dr Grèzes' team aims to distinguish between these two theories, and is investigating whether the phenomenon is emotion-dependent - that is, whether the reason we mimic other people's facial expressions depends on the emotion being expressed.
Using PowerLab with E-Prime
As part of their research, the team is studying how different people react physiologically to different facial expressions.
The team uses E-Prime software to present subjects with a variety of "emotional" visual stimuli, including images of real faces and computer-generated avatars, as well as videos of actors using movement and body language to depict particular emotions.
They record subjects’ physiological reactions to the experiment presentation using a PowerLab system and LabChart software running on a second computer. Currently, the researchers are using PowerLab to simultaneously record GSR, ECG and EMG signals from their subjects.
The team has configured their experiment setup so the LabChart recordings are automatically annotated with comments marking when E-Prime presented the subject with specific image and video stimuli. This allows the team to identify and compare the physiological effects of different stimuli.
When designing their experiment in E-Prime, the researchers assign a specific digital byte value to each stimulus (for example, a photograph of a woman smiling). When the subject is presented with this stimulus, the computer running E-Prime uses a parallel port card to send the specific digital signal to the PowerLab’s digital inputs.
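The stimulus-to-byte trigger scheme described above can be sketched as follows. This is a minimal illustration only: the stimulus names, byte codes, and the stand-in port class are all hypothetical, and the team's actual E-Prime scripts and parallel-port configuration are not shown in the article.

```python
# Sketch of the trigger scheme: each stimulus is assigned a unique byte
# value, which is written to the parallel port's data lines when that
# stimulus is presented. Stimulus names and codes here are made up.
STIMULUS_CODES = {
    "happy_face_photo": 0x01,
    "angry_face_photo": 0x02,
    "neutral_avatar":   0x03,
    "fear_body_video":  0x04,
}

class FakeParallelPort:
    """Stand-in for a real parallel-port driver; just logs written bytes."""
    def __init__(self):
        self.log = []

    def write(self, value):
        # A parallel port's data lines carry exactly one byte at a time.
        if not 0 <= value <= 255:
            raise ValueError("trigger value must fit in one byte (0-255)")
        self.log.append(value)

def send_trigger(port, stimulus_name):
    """Look up the stimulus's byte code and write it to the port."""
    code = STIMULUS_CODES[stimulus_name]
    port.write(code)
    return code

port = FakeParallelPort()
send_trigger(port, "happy_face_photo")
send_trigger(port, "angry_face_photo")
print(port.log)  # -> [1, 2]
```

On the recording side, each byte value arriving at the PowerLab's digital inputs can then be matched to a pre-programmed comment, which is how the LabChart trace ends up annotated with the stimulus that was on screen.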
In response to this signal, LabChart's Preset Comments feature automatically adds a pre-programmed comment to the physiological recording at the correct point on the time axis.
What's Next – Integrating the Video Capture Module
The researchers filmed the experiments to observe and account for changes in participants' physiological parameters that don't relate to the experimental stimuli (such as a loss of focus or a yawn).
The team is currently looking to integrate their video and physiological recordings using LabChart's Video Capture Module. Video Capture allows researchers to record and play back video and LabChart recordings in sync. According to Dr Conty, this will allow the team to "easily recognize and exclude the physiological response linked to an event like a yawn."
® E-Prime is a registered trademark of Psychology Software Tools, Inc. Other products and company names mentioned herein may be the trademarks or trade names of their respective owners.