The Long View | When Computers Know How You Feel

The rise of emotionally-aware or ‘affective’ computing is set to change the way machines understand and interact with people. BoF speaks with Rana el Kaliouby, co-founder and chief technology officer of MIT-spinoff Affectiva, to understand what’s possible and the implications for the business of fashion.


CAMBRIDGE, United States — Recent advances in neuroscience reveal that emotions are at the very core of human decision-making. Rather than cognitive thought, emotion is what fundamentally drives the way people engage with brands and products. But traditional methods of measuring emotional response, like surveys and focus groups, generally fail to accurately capture honest, unfiltered and immediate feelings.

Now, by measuring and analysing things like facial expressions, gestures, voice, sweat and heart rate, new ‘affective’ computing technologies are enabling laptops, smartphones and other personal devices to track and respond to human emotions, with powerful implications for product development, marketing, sales and service.

Indeed, in a not-too-distant future, fashion e-tailers may have the ability to automatically adapt their merchandising strategies, in realtime, in response to the emotional reactions of individual customers, while emotional data gathered from viewers watching online fashion shows could be used to inform collection development and buying.

To find out more, BoF spoke with Rana el Kaliouby, co-founder and chief technology officer of MIT-spinoff Affectiva, about the power of affective computing and the implications for the business of fashion.

BoF: What is affective computing?

At the highest level, our emotions influence every aspect of our lives. So, if you look at health and well-being, emotions are a very important factor. Emotions also play an important role in how we connect with the world and the people around us — how we socialise with others, what we like to wear, how we want to be perceived. And, of course, emotions drive the decisions we make: what products we buy, what services we use, what content we consume.

Measuring emotions is the next wave in intelligent personal data. If you look at the digital world today, there’s a lot of information about who you are: your Facebook profile, your Twitter profile, location services that know where you are. But there isn’t anything that captures how you really feel. The way you do this today online is really very crude: all you get is a ‘Like’ button. But our emotions are way richer and much more nuanced than a simple ‘Like’ button or emoticon. The idea behind affective computing is to bring to the world emotionally aware technologies that are able to sense and adapt to a full range of emotional experiences.

BoF: How does it work? What are the specific technologies?

There are a lot of channels from which we can distil a range of emotional states. The face is one of the most powerful social and emotional communication channels. It can communicate everything from joy to interest to disgust to confusion to worry. Our gestures are also important — both head gestures and body movements. Similarly, our voice carries information on our emotional states. There are also physiological measures. You can look at skin conductance, which is the level of sweat on your skin — and that gives you a measure of arousal, how activated or calm you are. Of course, you can look at heart rate, and there’s also technology that lets you gauge heart rate from video, so you don’t have to wear anything. But, of course, you can also do this through wearable devices. So, there are a whole host of technologies that can sense your emotional state.

BoF: How are these new technologies better than traditional ways of measuring emotional response, like surveys and focus groups? What makes emotional data so valuable?

Self-reported measures are interesting. They definitely capture a cognitive aspect of our experience. But they are very filtered. You think very carefully about how you want to respond to a question. And if you are in a focus group, you think even more carefully, because there are other people in the room and you want to come across a certain way. What you get is not your unfiltered, immediate emotional response — but it’s the immediate emotional response that actually drives behaviour, and that’s what [affective computing] is trying to capture: this visceral reaction to things.

BoF: Where will we see this technology deployed? And what types of experiences will we start to see?

I think we’re going to see an explosion of devices that sense emotion. It’s going to be everywhere, from our existing gadgets to intelligent earrings and bracelets that monitor heart rate — maybe even embedded in your clothes. In terms of expression recognition via video, the technology works with any type of camera, so it could be the camera on your laptop, the camera on your phone, the camera on your tablet. So cameras are becoming ubiquitous — and soon they will be emotionally aware.

One application is tracking emotional health over time. Another is personalised services. For example, you’re watching TV and the system knows what kind of content you like and can better recommend shows or games, based on what we call your ‘emotigraphic’ profile; it’s taking your demographic profile one step further to capture your emotional preferences.

BoF: How might this apply to fashion? 

When you walk into a Zara store, a salesperson walks up to you and knows very quickly, based on how you look and how you are responding, what type of stuff you like and how to make this a more engaging shopping experience. But on a website, there’s none of that.

BoF: What might an emotionally intelligent fashion e-tailer look like?

If I’m browsing and I clearly display a signal of interest about a pair of jeans, the next thing I see could be something similar.

BoF: Emotionally responsive merchandising?

Yes. Or perhaps if you’re browsing and the system senses confusion or frustration, then maybe a customer service person jumps in and offers you help.

BoF: One really important interface between fashion companies and consumers these days is the online fashion show.

Imagine if online fashion shows were emotionally sensitive. I could simply turn on my webcam to share my response to various looks. And surely, that emotional data is very useful for a brand, which can use it to optimise its next set of products based on unfiltered, immediate consumer response.

A lot of money goes into creating new products, some of which flop. We have done some testing for a big makeup company — gauging emotional response to everything from opening the packaging to applying the mascara to looking at yourself in the mirror. All of this can help people who are designing new products to optimise their output.

We talked about the potential for online stores. But I think there’s also an opportunity to use this technology to enhance the physical store. A while ago, when we started doing this research [into video-based facial expression recognition], Armani came to us. Their problem was that a lot of consumers who might otherwise consider buying their products wouldn’t even come close to their stores because they dismissed them as very upscale and ‘not my kind of shop.’ They wanted to use our technology to understand how people engaged with and responded to the physical store, gauging emotional response and ultimately trying to affect that experience. As consumers stopped at certain shelves or sections of the shop, Armani wanted to measure their interest level. And one thing in particular that they wanted to measure was whether price was really the deterrent.

BoF: What about dynamic pricing based on emotional response?

I think realtime, emotionally responsive pricing is a real possibility. I’m Egyptian and a lot of our markets are dynamically priced [laughs]. Go to the bazaar and the seller sizes you up and gives you a price. And he’ll gauge how much you really want to buy this or not — and the price reflects that.

This interview has been edited and condensed.