
 

Kathleen Bogart, pictured above, has a rare congenital condition called Moebius syndrome, a neurological disorder that primarily affects the 6th and 7th cranial nerves, leaving those with the condition unable to move their faces.  The facial paralysis means that those with the disorder cannot smile, frown, suck, grimace, or even blink their eyes.  In addition, their eyes move only laterally, putting sideways glances and eye rolling out of the picture as well.  (To learn more about Moebius syndrome, go to the Moebius Syndrome Foundation.)

Reeve (2009) spends quite a bit of time throughout the book discussing how important facial expressions are to emotion.  The facial feedback hypothesis, introduced in Chapter 12, states that emotion is the awareness of feedback from our own facial expressions.  Facial expressions are also important in social interactions: they allow us to ascertain the emotions and moods of the people around us, and allow the people around us to ascertain our emotional state and mood.  Reeve states that emotions are intrinsic to interpersonal relationships, and they play a role in creating, maintaining, and dissolving those relationships.  We often automatically mimic other people's expressions during interactions.  The facial feedback hypothesis would say that by mimicking facial expressions, we become able to understand the other person's emotional state.

Obviously emotions play a large role in our social interactions, whether through how we are feeling or through understanding and mimicking the emotions of someone else.  It follows that the inability to express emotions via facial expression, as with those who have Moebius syndrome, could cause a variety of problems with social interaction.  Some researchers assumed that because those with Moebius syndrome cannot mimic facial expressions, they would not be able to read other people's emotions as well as those of us who do not have the condition.  However, recent research has shown that people with Moebius syndrome are able to read facial expressions just as well as the rest of us can.  This suggests that the brain uses more than just facial mimicry to evaluate emotions.

While those with Moebius syndrome can read others' facial expressions just as well as the rest of us, they have to use other methods to display their own emotions.  Most individuals with Moebius syndrome develop other nonverbal cues to express emotion.  In the same way that those who are blind often develop keener senses of smell, hearing, and touch, those with Moebius syndrome develop more expressive vocal cues, gestures, and body positions.  This development may also aid those with Moebius syndrome in reading others' emotions through channels other than facial expression.  In fact, one study found that mimicking a conversation partner actually makes it more difficult to tell whether that person is lying or uncomfortable.

While many of these individuals are able to develop such skills, that still does not make social interaction easy.  Many people are uncomfortable when interacting with someone who does not mimic their facial expressions.  I had never thought about how lucky I am to be able to express my emotions on my face - mostly because people can often read me very easily since I don't control my facial expressions - but interacting with others is made so much easier by having that ability.

To learn more about some of the research being conducted concerning facial expressions, Moebius syndrome, and Kathleen Bogart, read this recent NY Times article: http://www.nytimes.com/2010/04/06/health/06mind.html?pagewanted=1&sq=emotions&st=cse&scp=4

Affective Computing

So as I was reading chapter twelve, one section that caught my interest was affective computing.  What exactly is affective computing?  The field builds on persuasive evidence that distinctive autonomic nervous system (ANS) activity is associated with fear, anger, disgust, and sadness. For example, with anger, there is increased heart rate and increased skin temperature, which facilitate strong, assertive (adaptive) behavior.  With fear, heart rate increases while skin temperature decreases.  With disgust, both heart rate and skin temperature decrease.  With sadness, heart rate increases while skin temperature stays stable. These four emotions, however, are among the few emotions with distinct autonomic nervous system reactions.  As Reeve states, "If no specific pattern of behavior has survival value for an emotion (like jealousy), there is little reason for the development of a specific pattern of autonomic nervous system activity." In other words, for jealousy and other emotions that don't fit distinct patterns, there is no universally appropriate bodily response, because the response depends on the situation more than on the emotion itself.  Thus, it is very unlikely that humans will ever evolve a single pattern of ANS activity for those emotions.
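The four distinct patterns described above can be pictured as a simple lookup table. This is only a toy sketch under big assumptions - the function name and the sign-of-change inputs are hypothetical, and real affective computing systems use far richer, continuous physiological signals - but it shows the logic of mapping ANS measurements back to emotions:

```python
# Toy sketch of the four ANS patterns discussed above (per Reeve, 2009).
# Inputs are hypothetical sign-of-change labels: "up", "down", or "stable".

def classify_emotion(heart_rate_change, skin_temp_change):
    """Map signs of heart-rate and skin-temperature change to one of the
    four emotions said to have distinct ANS patterns."""
    patterns = {
        ("up", "up"): "anger",        # heart rate up, skin temperature up
        ("up", "down"): "fear",       # heart rate up, skin temperature down
        ("down", "down"): "disgust",  # both decrease
        ("up", "stable"): "sadness",  # heart rate up, skin temperature stable
    }
    # Emotions like jealousy fall through: no universal ANS signature.
    return patterns.get((heart_rate_change, skin_temp_change),
                        "no distinct ANS pattern")

print(classify_emotion("up", "down"))      # fear
print(classify_emotion("stable", "up"))    # no distinct ANS pattern
```

The fall-through case is the point Reeve makes about jealousy: when no single bodily pattern has survival value for an emotion, no single pattern exists to detect.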

Still, for those emotions that do show a specific ANS pattern in various studies - anger, fear, sadness, disgust, and additionally joy - there are future technological implications.  Essentially, it is possible that in the future we will be able to build machines that read our emotions.  As Reeve states, "Imagine electronic sensors built into steering wheels, mobile telephones, handles of bicycles, pilot simulators, computer joysticks, and golf clubs" that constantly monitor the user's ANS arousal.  This would be the field of affective computing!  While these sensors would be limited to measuring only those basic emotions, additional technology such as a digital camera or video camera could capture and analyze facial expressions, monitoring movements of facial muscles like the frontalis, corrugator, orbicularis oculi, zygomaticus, nasalis, and depressors.  There is a great picture in our text, on page 341, demonstrating the face of interest after Tiger Woods hits a tee shot.  Computer systems that analyze users' facial muscles already exist, and they can score facial movements just as accurately as human coders, and faster.

Here is a link which discusses affective computing and past, present, and future research projects regarding the technology:

http://affect.media.mit.edu/

Here is a quick excerpt from the link:

Affective Computing is computing that relates to, arises from, or deliberately influences emotion or other affective phenomena.

Emotion is fundamental to human experience, influencing cognition, perception, and everyday tasks such as learning, communication, and even rational decision-making. However, technologists have largely ignored emotion and created an often frustrating experience for people, in part because affect has been misunderstood and hard to measure. Our research develops new technologies and theories that advance basic understanding of affect and its role in human experience. We aim to restore a proper balance between emotion and cognition in the design of technologies for addressing human needs.

Our research has contributed to: (1) Designing new ways for people to communicate affective-cognitive states, especially through creation of novel wearable sensors and new machine learning algorithms that jointly analyze multimodal channels of information; (2) Creating new techniques to assess frustration, stress, and mood indirectly, through natural interaction and conversation; (3) Showing how computers can be more emotionally intelligent, especially responding to a person's frustration in a way that reduces negative feelings; (4) Inventing personal technologies for improving self-awareness of affective state and its selective communication to others; (5) Increasing understanding of how affect influences personal health; and (6) Pioneering studies examining ethical issues in affective computing.

 

Will these technologies actually come into play in the near future?  Who knows, but if they do, they will revolutionize seemingly every field of business, sport, and life.  Would this necessarily be a progressive adaptation within our culture? Definitely an interesting topic to consider...  

Does smiling make you happier?


In my intro to psych class I remember my professor made us hold a pencil between our upper lip and nose for one minute. Then we had to hold a pencil between our lower lip and chin for one minute. We were then asked how we felt after doing both of these exercises. The point my professor was trying to make was not to make the class look like idiots, but to explain the importance of smiling. More importantly, she was trying to get us to understand the facial feedback hypothesis. Chapter 12 of our textbook explains this hypothesis by saying that we don't smile because we are happy; rather, we are happy because we smile. While this might not seem right to some people, I found a pretty hilarious video of a girl whose situation might have been improved if she had tried smiling. This clip is of a Boise State band member who may be stuck on a less than desirable instrument; her lack of emotion and motivation is clearly displayed on her face. The facial feedback hypothesis would say that if she smiled and displayed some positive emotion on her face, it might trigger further cognitive and bodily participation to prolong the perceived emotion of happiness.

I started to think about this hypothesis and whether it really works, and I for one think smiling helps. I wouldn't go as far as to say that smiling helps in all situations, but at the least it gives you the appearance of being happy. When I smile, it not only makes me feel better, but I think it puts the people around me more at ease as well. Are there any situations where smiling even though you were in a bad mood has helped you? Does smiling motivate us to be in a better mood?

http://www.youtube.com/watch?v=o-r02-oZAW4&feature=player_embedded