Affective Computing: Emotion-Sensitive Technology

Until quite recently, technology has been unable to translate emotions into the machine-readable language of 1s and 0s. However, with a new approach known as affective computing, researchers are developing computers that can read users' affect, or emotional state. Imagine a computer that can determine whether students are losing interest or becoming confused during an online course, or a GPS that can sense when you are lost and getting frustrated about it.

Introduction to Affective Computing

An affect sensor is a device that takes an input signal and processes it for evidence of emotion. A number of different techniques and modalities are used to detect affect, including physiological signals, facial expression recognition, speech prosody recognition, and pressure sensors.

What is affective computing? There are two main elements involved in affective computing:

  • Detect the emotion via facial expressions, gestures, tone of voice, pressure on a keyboard, or other physiological measurements.
  • Put this information into context.

This may seem straightforward, since it is a process humans perform all the time: we can often tell at a glance whether someone is upset, interested, or bored. Computers, however, struggle with this, partly because early studies of emotional response assumed that emotions are expressed in the same way across a population. That assumption turns out not to hold.
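The two elements above can be sketched as a simple detect-then-contextualize loop. The signal names, thresholds, and labels below are illustrative assumptions, not part of any real affective computing system:

```python
# A minimal sketch of the two-step loop described above. All signal
# names, thresholds, and labels are invented for illustration.

def detect_emotion(signals):
    """Step 1: map raw measurements to a candidate emotion label."""
    # Toy rule: high skin conductance plus a furrowed brow suggests frustration.
    if signals["skin_conductance"] > 0.7 and signals["brow_furrowed"]:
        return "frustrated"
    if signals["gaze_on_screen"] < 0.3:
        return "bored"
    return "neutral"

def put_in_context(emotion, activity):
    """Step 2: interpret the label in light of what the user is doing."""
    if emotion == "frustrated" and activity == "navigation":
        return "offer a simpler route"
    if emotion == "bored" and activity == "online_course":
        return "suggest a break or change of pace"
    return "no action"

reading = {"skin_conductance": 0.9, "brow_furrowed": True, "gaze_on_screen": 0.8}
print(put_in_context(detect_emotion(reading), "navigation"))
# prints "offer a simpler route"
```

The second step is what makes the problem hard: the same physiological reading can mean very different things depending on what the user is doing.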

Applications for Affective Computing

Technologies that can read human emotions can help people who have trouble determining the emotions of others, like those with autism, or provide companionship and support for nursing home residents. For instance, Dr. Rosalind Picard, director of the affective computing research group at the Massachusetts Institute of Technology Media Lab, has been working for over twenty years to translate emotions into a machine-readable language.

On one project with collaborator Dr. Rana el Kaliouby, Picard helped to design glasses for people with Asperger's syndrome, a mild variant of autism. The glasses warn the wearer when they are boring someone. Those with Asperger's often focus intently on particular topics and find it hard to read the social cues that signal someone else is bored, such as fidgeting, yawning, and avoiding eye contact.

Another project that Dr. Picard and Dr. el Kaliouby are developing is known as Q Sensors. These are bands worn on the wrist to measure emotional arousal through the skin’s electrical conductance and temperature, as well as activity level. For those with autism, who may have trouble speaking or articulating their feelings, these Q Sensors can help to provide insights into emotional states that the users may not be able to articulate themselves.
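The Q Sensor's actual processing is not described in the article, but the basic idea of detecting emotional arousal from skin conductance can be sketched as flagging moments where the signal rises well above its recent baseline. The window size, threshold, and sample values below are assumptions for illustration only:

```python
# An illustrative arousal-detection sketch: flag samples where skin
# conductance exceeds its trailing average by a fixed margin. The
# window, threshold, and data are invented; this is not the Q Sensor's
# actual algorithm.

def arousal_events(conductance, window=5, threshold=0.3):
    """Return indices where conductance jumps above the trailing mean."""
    events = []
    for i in range(window, len(conductance)):
        baseline = sum(conductance[i - window:i]) / window
        if conductance[i] - baseline > threshold:
            events.append(i)
    return events

# A flat signal with a sudden spike at samples 6 and 7.
samples = [0.2, 0.21, 0.2, 0.22, 0.2, 0.21, 0.8, 0.85, 0.3, 0.22]
print(arousal_events(samples))
# prints [6, 7]
```

A real device would also have to account for temperature and motion artifacts, which is presumably why the wristband measures those alongside conductance.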

According to Dr. Picard, “With this technology in the future, we’ll be able to understand things about our loved ones that we weren’t able to see before, things that calm them, things that stress them. I’ve always thought of the skin as a covering, hiding what goes on inside our body. Who would have ever thought of our skin as a peephole?”

Affective Interfaces is a development group that helps other companies understand how their customers react to products. It uses emotion-sensing technology, via a webcam, to capture customers' facial expressions in response to a product. This can help uncover non-rational influences that affect decisions from purchase to engagement.

The webcam captures facial expressions and synchronizes the footage with a recorded screen-capture video. The customer uploads the videos to Affective Interfaces' servers, which analyze the video, perform analytics, and produce reports on market-relevant emotions correlated with elements of the product or branding.
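The core of such a report is time alignment: matching per-second emotion scores from the facial analysis against timestamped events in the screen capture. A hedged sketch of that correlation step, with invented scores and event names, might look like this:

```python
# A sketch of the alignment step behind the reports described above:
# average an emotion score in a short window around each screen event.
# The scores, event names, and window size are invented for illustration.

def emotion_around_events(scores, events, window=2):
    """scores: per-second emotion scores; events: {event_name: second}."""
    report = {}
    for name, t in events.items():
        lo, hi = max(0, t - window), min(len(scores), t + window + 1)
        segment = scores[lo:hi]
        report[name] = round(sum(segment) / len(segment), 2)
    return report

smile_scores = [0.1, 0.1, 0.2, 0.7, 0.8, 0.6, 0.2, 0.1]
events = {"saw_price": 2, "clicked_buy": 4}
print(emotion_around_events(smile_scores, events))
# prints {'saw_price': 0.38, 'clicked_buy': 0.5}
```

A report built this way can tell a product team which on-screen moments coincided with positive or negative expressions, which is exactly the correlation the article describes.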

The Ethical Side of Affective Computing

While these developments seem like a way to have computers better support people, there are other issues to consider. For instance, would it be ethical for a computer to sense users’ emotions? If a perceptual user interface has the capability to detect emotions, would this not be considered an invasion of privacy? Would the general public be comfortable with having their emotions sensed?

Much of the emphasis in affective computing has been on building systems that are able to recognize, express, help communicate, and respond to human emotion. A mounting concern is how users feel about such technology: is it respectful of privacy and other needs, and on what basis is its use acceptable?

Summary

This article takes a look at affective computing, technology that can identify and analyze human emotions. Affective computing involves two main elements: 1) detecting emotion via physiological measurements; and 2) putting this information into context. The article introduces a few emerging practical applications of affective computing, such as assisting those with Asperger's syndrome to read emotions and helping companies market more effectively to their customer base. However, this developing technology brings ethical considerations, some of which are mentioned in this article.

CIPP Exam Preparation

In preparation for the Certified Information Privacy Professional/Information Technology (CIPP/IT) exam, a privacy professional should be comfortable with topics related to this post, including:

  • Privacy intersections in the development process (I.B.a.)
  • Implementing technologies with privacy impacts (VI.)

1 comment to Affective Computing: Emotion-Sensitive Technology

  • Joseph Biggs

    This is fantastic! As someone addicted to the show “Lie to Me”, this is particularly interesting. I would think that clinical use is more likely than widespread personal use. The privacy issue is an important one that has many obstacles. With the right web application security in place, this could be an incredible tool.
