Artificial intelligence continues to evolve, changing the way people work and, increasingly, the way they learn. Key to expanding AI's reach is teaching these systems to recognize and respond to the emotional nuances that are fundamental to human communication.
Emotion AI, also known as affective computing, enables systems to detect, analyze, process and respond to emotional cues and moods — including love, fear, anger and shame.
“By 2022, your personal device will know more about your emotional state than your own family,” says Annette Zimmermann, research vice president at Gartner, in a company blog post.
AI Offers Clues to the Emotions Behind Learning
That shift will have big implications, says Hayley Sutherland, a senior research analyst for AI software platforms at IDC.
Consider that by 2024, AI-enabled human-computer interfaces will replace an estimated one-third of screen-based business-to-business and business-to-consumer applications.
By 2022, IDC predicts, 30 percent of enterprises will use interactive conversational speech technologies to power customer engagement, and affective computing will see a 25 percent jump in real-world applications.
“It’s not necessarily going to be everywhere,” Sutherland says. “But we do expect to see a pickup in terms of moving from experimentation to actual production.”
As researchers and private companies teach machines to recognize differences in vocal inflection, facial expressions and other cues, experts say the field is ripe for applications in higher education.
While AI already helps colleges automate core functions to improve efficiency, advances in emotion research could expand the role of AI considerably. Adaptive learning and online courses are two potential beneficiaries.
For example, Sutherland says, AI could learn a student’s patterns and adapt course material to improve outcomes, or sense if the student was becoming distracted or bored and switch to a more engaging mode of communication.
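The adaptation loop Sutherland describes can be sketched in a few lines. This is a hypothetical illustration only: the signal fields, mode names and thresholds below are all invented for the example, not drawn from any real adaptive-learning product, and a production system would replace the fixed thresholds with a trained model.

```python
from dataclasses import dataclass

@dataclass
class EngagementSignal:
    """A simplified emotion-AI reading for one check-in (values are assumptions)."""
    attention: float    # 0.0 (distracted) .. 1.0 (focused)
    frustration: float  # 0.0 (calm) .. 1.0 (frustrated)

def choose_mode(signal: EngagementSignal, current_mode: str) -> str:
    """Pick a presentation mode from a detected emotional state.

    The thresholds are placeholders purely to show the shape of the
    adaptation loop, not tuned values.
    """
    if signal.frustration > 0.7:
        return "guided-walkthrough"   # slow down and add scaffolding
    if signal.attention < 0.4:
        return "interactive-quiz"     # try to re-engage a distracted student
    return current_mode               # no change needed

# A student drifting off during a lecture video triggers a mode switch:
mode = choose_mode(EngagementSignal(attention=0.3, frustration=0.2), "lecture-video")
print(mode)  # interactive-quiz
```

The point of the sketch is the feedback structure: sensed emotional state in, change of teaching strategy out, checked continuously as the student works.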
Aleix Martinez, a cognitive scientist and professor of electrical and computer engineering at The Ohio State University who has studied affective computing extensively, says researchers seek to make AI more “humanlike.”
“You want to make sure that technology can communicate the way humans communicate,” he says. “The technology has improved, but we are still lacking that human touch.”
Learn from the Movies
Kevin S. LaBar, associate director of the Center for Cognitive Neuroscience at Duke University, was part of a team that developed a neural network capable of classifying images into 11 emotion categories.
Researchers trained the system using photos and screen grabs from movie trailers.
The key, he says, is to train deeply, exposing the system to thousands of images over several years. “Now, these new tools are really permitting insights we didn't have the framework to explore,” he says. “It’s a really exciting time to be looking at emotion more broadly.”
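The final stage of a classifier like the Duke team's can be sketched as a softmax over 11 class scores. Everything here is an assumption made for illustration: the category names and the example scores are invented, and the real system's deep network (which produces the scores after training on thousands of movie frames) is omitted entirely.

```python
import math

# Hypothetical label set: the article says 11 emotion categories but does
# not name them, so these labels are placeholders.
EMOTION_CATEGORIES = [
    "amusement", "anger", "awe", "contentment", "disgust", "excitement",
    "fear", "horror", "interest", "sadness", "surprise",
]

def softmax(scores):
    """Convert raw class scores into probabilities over the categories."""
    m = max(scores)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(scores):
    """Return the most probable emotion label for one image's scores.

    In a real system, `scores` would be the final-layer output of a
    trained neural network; here they are supplied by hand.
    """
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EMOTION_CATEGORIES[best], probs[best]

# Made-up final-layer scores for a single image:
label, prob = classify([0.1, 2.3, -0.5, 0.0, -1.2, 0.4, 3.1, 0.2, -0.3, 0.0, 0.5])
print(label)  # fear (index 6 has the highest score)
```

The hard part, as LaBar notes, is not this last step but producing good scores in the first place, which is why the training set of thousands of labeled images matters so much.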
30 percent: The estimated percentage of enterprises that will use interactive conversational speech technologies to power customer engagement by 2022. Source: IDC, "IDC Innovators: Affective Computing, 2019," June 2019
As adoption spreads, ethical and privacy concerns won’t be far behind, says Sutherland.
“Right now, we are in this privacy backlash,” she says, driven by high-profile data breaches and revelations about the gathering of personal data by Facebook and others.
When it comes to emotion AI, she says, "by looking at microexpressions, we're almost able to understand what you're feeling even if you're not aware. It's really important for these companies to develop a solid foundation of ethics."