Leveraging Big Data to Understand Emotion

Edited by Maedbh King
Alan Cowen grew up in Chicago and attended Yale University, where he double majored in Cognitive Science and Applied Math. He completed the psychology graduate program at UC Berkeley, studying cognitive neuroscience and social-personality psychology under the supervision of Prof. Dacher Keltner.

Alan Cowen is a former graduate student who worked under the supervision of Prof. Dacher Keltner.

What drew you to study emotion?

I’m interested in emotions because I believe they underlie everything we do. If we didn’t have feelings like restlessness, anxiety, or excitement, we wouldn’t get up in the morning. We’d have no basis on which to evaluate our lives; we could only be apathetic. Emotions motivate every thought we have. Even when solving math problems, it is curiosity, the drive to succeed, or the threat of failure that motivates us to move from one cognitive step to the next.

What is the goal of your research?

My goal has been to come up with a taxonomy of emotion, which means moving beyond reductive models like the six basic emotions and the affective circumplex. I’ve developed a new framework for taxonomies of emotion, with accompanying statistical tools, and used it to derive taxonomies of vocal expression, facial expression, the emotions evoked by music and video, and more.

[Links: Vocal Maps · Facial Maps · Music Maps · Video Maps]
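To make the idea of deriving a taxonomy from data more concrete, here is a minimal, generic Python sketch of one common approach: reduce averaged emotion ratings to a low-dimensional space, then cluster stimuli into candidate categories. The dataset, dimensionality, and cluster count are placeholders for illustration, not the specific framework or statistical tools described above.

```python
# Generic sketch: derive candidate emotion categories from rating data.
# Placeholder data and parameters only; not the published method.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Placeholder: 500 stimuli (e.g., vocal bursts) rated on 30 emotion
# terms ("amusement", "awe", "fear", ...), averaged across judges.
ratings = rng.random((500, 30))

# Keep the dimensions that explain most of the rating variance.
pca = PCA(n_components=10)
embedding = pca.fit_transform(ratings)
print("variance explained:", pca.explained_variance_ratio_.sum())

# Group stimuli into candidate emotion categories.
labels = KMeans(n_clusters=12, n_init=10, random_state=0).fit_predict(embedding)
print("stimuli per category:", np.bincount(labels))
```

On real rating data, the number of reliable dimensions and clusters would be chosen by cross-validation or similar criteria rather than fixed in advance.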

How will your findings have real-world implications?

I’ve helped big tech companies give people tools to recognize and respond appropriately to their own and others’ emotions. Smartphone cameras now use information about people’s emotional expressions to optimize pictures, choosing the best frame from a window of roughly 300 milliseconds after you release the shutter, based in part on whether it captures the right emotions for a given context. Mental illness researchers are using taxonomies of emotion to develop diagnostic tests of emotion recognition. Autism researchers are beginning to use augmented reality devices successfully to help patients attend to others’ emotions, which requires understanding what those emotions actually are. In theory, I think we could help people save their marriages, fix bad habits, and improve their people skills by addressing these problems at their emotional roots.
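As a rough illustration of the best-frame idea, here is a hedged sketch: score each frame captured in a short post-shutter burst and keep the highest-scoring one. The scoring function, the ~300 ms window, and the burst contents below are illustrative assumptions, not any particular camera’s implementation.

```python
# Hedged sketch of best-frame selection from a short post-shutter burst.
# `expression_score` is a stand-in for a trained expression model.
from typing import Callable, Sequence

def pick_best_frame(frames: Sequence, score: Callable[[object], float]):
    """Return the frame with the highest expression score."""
    return max(frames, key=score)

def expression_score(frame) -> float:
    # Placeholder: a real system would run an expression classifier here
    # and weight it by context (eyes open, smiling, in focus, ...).
    return float(frame["smile"]) - float(frame["blink"])

# Example burst: a ~300 ms window at 30 fps yields roughly 9-10 frames;
# three placeholder frames are shown here.
burst = [{"smile": 0.2, "blink": 1.0},
         {"smile": 0.9, "blink": 0.0},
         {"smile": 0.7, "blink": 0.0}]
print(pick_best_frame(burst, expression_score))
```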

Finally, what advances would you like to see in your field in the next two decades?

I’m hoping that we can develop tools to accurately quantify people’s emotional behaviors in real time from video, audio, and language, in a context-sensitive way. Once we can do that, we can enter a world where computer models help us solve a vast array of human-centric problems. We could use algorithms that learn from millions of responses to buildings and cityscapes to evaluate or optimize architectural plans, for example, designing libraries that evoke awe, hospitals that make people feel safe, and so on. We could develop individualized assessments of mental illnesses. We could develop self-guided camera drones that take perfect wedding photos. There are a lot of possibilities. The one that inspires me most is the possibility of designing machines that help vulnerable people who have difficulty recognizing or addressing their own emotional needs: infants, the elderly, people who struggle with severe mental illness, and so on. Such machines could do things as simple as changing the song or the channel when someone isn’t enjoying themselves. That’s the kind of technology that could have helped my grandmother, who had severe Alzheimer’s, find enjoyable ways to spend her time even when people couldn’t be there to care for her.