The Emotional Future of Artificial Intelligence

Artificial emotions – we’re all guilty of them. ‘Faking it’ is ubiquitous among humans. But when it comes to artificial intelligence and machine learning, artificial emotions may mean something more. It’s hard to imagine that AI has much room left to improve. Sure, there’s the occasional software bug to fix and the latest version to download. But if the technology is already so impressive, can it really get that much better? Developers say there are miles to go in the area of emotional artificial intelligence.

Artificial intelligence has grown from auto-recognizing people in Facebook photos to machines that can see for themselves. Machine vision gained its widest exposure through the automotive industry, in cars that can brake on their own. Now it’s going one step further with emotional awareness in machines.

Rana el Kaliouby is CEO and co-founder of Affectiva, an MIT Media Lab spin-off that specializes in emotional artificial intelligence. She claims that emotional awareness in machines can help sift through how people actually respond to the marketing campaigns they’ve been exposed to.

Millions of dollars are spent on marketing campaigns, and those campaigns are trying to connect with people on an emotional level. Yet their success is measured in clicks and internet activity, not in whether the emotional connection landed. This is where machine vision can shine.

Machine vision can watch customers’ actual reactions to a marketing campaign in real time, through the camera on a mobile device or computer. A customer grants access to their camera so a company can study their reaction. AI then analyzes that reaction by reading the micro-expressions on the customer’s face and categorizes it as either positive or negative.
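To make that pipeline concrete, here is a minimal sketch in Python. It is not Affectiva’s actual system: it uses OpenCV’s stock Haar-cascade face detector, and score_valence is a hypothetical stand-in for a trained micro-expression model.

```python
# A minimal sketch, not Affectiva's pipeline. Assumes the opencv-python
# package; score_valence is a hypothetical hook for a trained model.
import cv2

# Stock Haar-cascade face detector that ships with OpenCV
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def score_valence(face_pixels):
    """Hypothetical hook for a trained micro-expression model that
    returns a score in [-1, 1], negative to positive."""
    return 0.0  # plug a real model in here; neutral by default

def classify_reactions(frame):
    """Detect faces in one camera frame and label each reaction."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1,
                                           minNeighbors=5)
    for (x, y, w, h) in faces:
        score = score_valence(gray[y:y + h, x:x + w])
        yield "positive" if score > 0 else "negative"

# With the viewer's consent, sample a frame from their camera
camera = cv2.VideoCapture(0)
ok, frame = camera.read()
if ok:
    print(list(classify_reactions(frame)))
camera.release()
```

In a real deployment the valence model is the hard part; the surrounding flow of consent, capture, detect, score, and categorize is what the article describes.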

Skeptical? el Kaliouby offers a case study from the automotive industry. Under current regulations, self-driving vehicles require a human co-pilot at all times, because at some point control will be handed back to a human operator. In that scenario, machine vision could determine whether the operator is mentally fit to take the wheel. The car could gauge general alertness and even sobriety.

With emotional artificial intelligence, a car could track the driver’s blink rate or yawns and estimate their cognitive state. It could even go as far as detecting whether the driver is frustrated or confused.
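A common proxy for this in research code is the eye aspect ratio (EAR), which drops toward zero as the eye closes. The sketch below is illustrative only: it assumes an upstream facial-landmark detector (such as dlib’s 68-point model) already supplies six (x, y) points per eye each frame, and the thresholds are assumptions, not production-tuned values.

```python
# Illustrative blink-rate monitor. Assumes an external landmark detector
# feeds six (x, y) eye landmarks per eye per frame; thresholds are assumed.
import numpy as np

def eye_aspect_ratio(eye):
    """EAR from six eye landmarks (Soukupova & Cech, 2016);
    falls toward 0 as the eye closes."""
    p = np.asarray(eye, dtype=float)
    vertical = np.linalg.norm(p[1] - p[5]) + np.linalg.norm(p[2] - p[4])
    horizontal = np.linalg.norm(p[0] - p[3])
    return vertical / (2.0 * horizontal)

class AlertnessMonitor:
    EAR_CLOSED = 0.21   # below this, treat the eye as closed (assumed)
    DROWSY_RATE = 25    # blinks/minute above this reads as drowsy (assumed)

    def __init__(self, fps=30):
        self.fps = fps
        self.closed = False
        self.blinks = []  # frame indices where a blink ended
        self.frame = 0

    def update(self, left_eye, right_eye):
        """Feed one frame's eye landmarks; returns 'alert' or 'drowsy'."""
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2
        if ear < self.EAR_CLOSED:
            self.closed = True
        elif self.closed:          # the eye just reopened: one blink
            self.closed = False
            self.blinks.append(self.frame)
        self.frame += 1
        window_start = self.frame - 60 * self.fps  # last minute of frames
        recent = [f for f in self.blinks if f >= window_start]
        return "drowsy" if len(recent) > self.DROWSY_RATE else "alert"
```

A production system would fuse blink rate with yawns, head pose, and gaze, but the sliding-window idea stays the same.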

What do we learn from all this? AI is evolving. And with cameras everywhere, expect this technology to be monitoring and gauging customers’ responses. That data can be analyzed to learn how we react in various situations.

Within five to ten years, expect our machines to connect with us beyond handing over data. They’ll be able to connect emotionally as well. Artificial emotions may not be so bad after all.
