Emotional Artificial Intelligence - Vani Sachdev

Updated: Sep 9


Artificial emotional intelligence is a new technology with many implications for the future. It works by connecting affective states to a user’s physical cues and facial movements in order to offer a personal, enriched automated experience. The term itself is often a source of confusion: the goal is not to make a machine “emotional”, but to have it respond to a user’s emotional output. The implications of artificially cultivated emotional intelligence are endless; it adds a new layer of intrinsic understanding between a user and technology that can make every experience unique.

Necessity breeds ingenuity, and this is apparent when we consider inventions such as the car and indoor plumbing. In our age, however, it is not necessity but comfort that breeds ingenuity. This is an important element to consider when implementing emotional intelligence: machines don’t need to be in tune with a user’s emotions to do their job well, they need to be empathetic to give the user the most convenient experience. It is not about making machines “emotional” but about using this intelligence to better enable future technologies. Clippit, the talking cartoon paper clip that came standard with Microsoft Office until it was retired in Office 2007, was a piece of intelligence designed to help a user navigate Microsoft Word. However, Clippit was “an idiot about people” (Picard) and “too in your face” (Guardian). He regularly failed to notice when a user was annoyed, which often caused an already frustrated user to get even angrier. The fact that Clippit would wink and dance sporadically after giving instructions didn’t help.


As of December 2018, computers can adapt their behavior to only a small set of stimuli. Machines rely on sensors that pick up on physical cues as much as they do facial features. A research project by the Computing Research Association (CRA) demonstrates how emotional and artificial intelligence together can produce a product that provides a genuinely helpful service. The application the CRA set out to design was a personalized tutor that continuously adapted to the needs of individual children. Their program could distinguish a child who was making mistakes but still interested in the subject from a child who was making mistakes and getting increasingly angry, which allowed the machine to respond to the child appropriately. The robot accomplished this by recognizing how the student was sitting, using many highly sensitive sensors that measured the distribution of weight across the seat. Research showed that how an elementary schooler sat predicted, with 83% accuracy (Bickmore), how interested they were in the information being taught to them. Using this information, the researchers developed a program that tracked how students’ postures changed as a function of time. Using machine learning, the program built an inductive model of how each student worked and, with accuracy levels ranging from 72% to 80%, was able to gauge high interest, low interest, and boredom among users studying specific subjects. Without emotional intelligence, the machine could have interrupted a curious individual’s learning journey too early or waited too long before helping a frustrated student; it would be responding only to the student’s actions, not to the emotional state behind them.
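
To make the idea concrete, here is a minimal sketch of how a posture-based interest classifier along these lines could be wired up. The sensor features, the three-class label set, the synthetic data, and the choice of a random forest are all assumptions made for illustration; the CRA project’s actual pipeline is not described here.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical posture features summarizing the seat's weight distribution:
# mean pressure in four quadrants plus how often the child shifts position.
N_SAMPLES = 600
X = rng.normal(size=(N_SAMPLES, 5))  # [front_L, front_R, back_L, back_R, shift_rate]

# Synthetic labels standing in for {boredom, low interest, high interest},
# generated so that more forward lean maps to more engagement. In a real
# study these labels would come from human observers, not from the features.
lean_forward = X[:, 0] + X[:, 1] - X[:, 2] - X[:, 3]
y = np.digitize(lean_forward, bins=[-1.0, 1.0])  # classes 0, 1, 2

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# A random forest is one reasonable stand-in for the "inductive model"
# mentioned above; the original work does not specify the algorithm here.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

In a running tutor, predictions like these would be polled continuously so the system could decide when to step in with help and when to stay quiet and let the student keep working.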


Emotional artificial intelligence has the potential to make human-machine interactions more akin to human-to-human conversation. The need for this kind of emotional intelligence is dire in today’s world of automated services: maybe next time you’ll actually enjoy being on hold!

