STILL, for the emotions that show an ANS-specific pattern across various studies, which I mentioned above (anger, fear, sadness, disgust, and additionally joy), there are future technological implications. Essentially, it is possible that in the future we will be able to build machines that read our emotions. As Reeve states, "Imagine electronic sensors built into steering wheels, mobile telephones, handles of bicycles, pilot simulators, computer joysticks, and golf clubs which constantly monitor its user's ANS (autonomic nervous system) arousal." This would be the field of affective computing!

While such sensors would be limited to measuring only those basic emotions, additional technology like a digital or video camera could capture and analyze facial expressions, monitoring movements of the user's facial muscles such as the frontalis, corrugator, orbicularis oculi, zygomaticus, nasalis, and depressors. There is a great picture demonstrating faces of interest in our text on page 341, taken after Tiger Woods hits a tee shot. Computers that analyze users' facial muscles already exist, and they can score facial movements just as accurately as people, and actually faster.
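To give a rough sense of how a computer might score facial-muscle movements against emotion patterns, here is a minimal sketch. Everything in it is invented for illustration: the muscle names come from the list above, but the activation values, thresholds, and emotion templates are hypothetical, not a validated coding scheme like the systems real software uses.

```python
# Hypothetical sketch: rank basic emotions by how closely measured
# facial-muscle activations (scaled 0..1) match simple templates.
# The template values below are made up for illustration only.
EMOTION_TEMPLATES = {
    "joy":     {"zygomaticus": 0.7, "orbicularis_oculi": 0.5},
    "anger":   {"corrugator": 0.8, "depressor": 0.4},
    "sadness": {"corrugator": 0.5, "depressor": 0.7},
    "disgust": {"nasalis": 0.7, "depressor": 0.5},
    "fear":    {"frontalis": 0.8, "orbicularis_oculi": 0.6},
}

def score_emotions(activations):
    """Return (emotion, score) pairs, best match first.

    Score = 1 minus the mean absolute difference between the measured
    activations and each template's expected activations.
    """
    scores = {}
    for emotion, template in EMOTION_TEMPLATES.items():
        diffs = [abs(activations.get(muscle, 0.0) - level)
                 for muscle, level in template.items()]
        scores[emotion] = 1.0 - sum(diffs) / len(diffs)
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Example reading: strong zygomaticus plus orbicularis oculi activity,
# the pattern associated with a genuine (Duchenne) smile.
reading = {"zygomaticus": 0.75, "orbicularis_oculi": 0.55, "corrugator": 0.1}
ranked = score_emotions(reading)
print(ranked[0][0])  # joy ranks highest for this reading
```

Real systems would, of course, learn these patterns from data rather than hard-code them, and would combine the facial channel with the ANS signals mentioned above.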
Here is a link which discusses affective computing and past, present, and future research projects regarding the technology:
http://affect.media.mit.edu/
Here is a quick excerpt from the link:
Affective Computing is computing that relates to, arises from, or deliberately influences emotion or other affective phenomena.
Emotion is fundamental to human experience, influencing cognition, perception, and everyday tasks such as learning, communication, and even rational decision-making. However, technologists have largely ignored emotion and created an often frustrating experience for people, in part because affect has been misunderstood and hard to measure. Our research develops new technologies and theories that advance basic understanding of affect and its role in human experience. We aim to restore a proper balance between emotion and cognition in the design of technologies for addressing human needs.
Our research has contributed to: (1) Designing new ways for people to communicate affective-cognitive states, especially through creation of novel wearable sensors and new machine learning algorithms that jointly analyze multimodal channels of information; (2) Creating new techniques to assess frustration, stress, and mood indirectly, through natural interaction and conversation; (3) Showing how computers can be more emotionally intelligent, especially responding to a person's frustration in a way that reduces negative feelings; (4) Inventing personal technologies for improving self-awareness of affective state and its selective communication to others; (5) Increasing understanding of how affect influences personal health; and (6) Pioneering studies examining ethical issues in affective computing.
Will these technologies actually come into play in the near future? Who knows, but if they do, they will revolutionize seemingly every field of business, sport, and life. Would this necessarily be a progressive adaptation within our culture? Definitely an interesting topic to consider...