
We Can Teach AI Virtually Anything — But Can We Teach It to Feel?

In the popular science-fiction series Black Mirror, a lonely teenager is given a talking, AI-enabled doll named Ashley Too. The doll immediately recognizes that its new owner is unhappy, and its ability to chat empathically and meaningfully soon leads the teenager to regard the doll as her best (and only) friend.

As with many of the dystopian technologies explored by the creators of Black Mirror, the emergence of robots that read and react to human emotions may not be too far off.

What Can AI Do?

Artificial intelligence (AI) is capable of absorbing and processing vast amounts of data; it can beat world-class chess champions with ease, and its cognitive abilities are proving invaluable to organizations around the world.

Where AI struggles, however, is in interpreting human emotions and using common sense to draw rational conclusions. As time goes on, these shortfalls are becoming increasingly problematic to researchers who believe emotional intelligence and common sense will become a crucial part of AI’s development and future applications.

Companies including Affectiva, Cyc, SenSay Analytics and Beyond Verbal are employing different techniques to boost these capabilities in AI. Affectiva, an emotion measurement technology company, has compiled a database of 7.9 million diverse faces and is now training its algorithms to detect patterns in these facial expressions, coupled with vocal cues, to interpret and respond to emotions or complex cognitive states. The Cyc project has spent 34 years coding over 25 million common-sense rules. Research of this kind, termed “Emotion AI” by Affectiva, will have applications in industries including:

  • Advertising and market research: Brands can use Emotion AI to analyze consumer responses to advertisements and gauge the effectiveness of their campaigns.
  • Medical research: Pharmaceutical companies can monitor a patient’s mood changes in response to medication or detect early signs of Parkinson’s disease. AI may one day prove to be an invaluable early-warning system for changes in mental health.
  • Virtual assistants and customer service: Amazon is developing a new capability for its virtual assistant, Alexa, that will enable it to detect emotion in people’s voices and respond appropriately and empathically. As this technology matures, we can expect the market to shift from virtual assistants to virtual companions.
  • Automotive industry: Ford is working on software that can detect a driver’s emotions, such as anger or distraction, and take action to keep the driver and others on the road safe.

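The pattern-detection idea behind Emotion AI can be illustrated with a toy sketch: represent each face as a small vector of expression features and classify it by the nearest labeled average. The feature names, values, and emotion labels below are invented for illustration; a real system like Affectiva's learns from millions of annotated faces, not a handful of hand-picked numbers.

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical features per face: (brow_raise, mouth_curvature, eye_openness).
# All values are invented for illustration, not taken from any real dataset.
TRAINING_DATA = {
    "happy":     [(0.2, 0.9, 0.6), (0.3, 0.8, 0.7)],
    "sad":       [(0.1, 0.1, 0.4), (0.2, 0.2, 0.3)],
    "surprised": [(0.9, 0.5, 0.9), (0.8, 0.6, 1.0)],
}

def centroid(vectors):
    """Average each feature across the example vectors for one emotion."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

CENTROIDS = {label: centroid(vs) for label, vs in TRAINING_DATA.items()}

def classify(face):
    """Return the emotion whose centroid is closest to the face vector."""
    return min(CENTROIDS, key=lambda label: dist(face, CENTROIDS[label]))
```

For example, `classify((0.25, 0.85, 0.65))` lands nearest the "happy" centroid. The same nearest-match principle, scaled up to learned features and far larger datasets, is what lets an algorithm map a raised brow or downturned mouth to a likely emotional state.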
Manufacturers could benefit significantly from more intuitive AI and an increase in AI-enabled “cobots,” or robots that work directly alongside humans. MIT, for example, developed a robot that uses electromyography sensors and machine learning to track the signals in workers’ muscles as they lift and move items; the robot can then mirror those movements.

Researchers at MIT have also developed AI that can gauge an object’s texture from sight and imagine an object’s appearance from touch, much as a human can. The process involved recording 200 objects being handled or touched 12,000 times; the recordings were then converted into a dataset of over three million paired visual/tactile images. AI like this could significantly reduce the amount of data robots require to learn tasks such as grasping and moving objects.

How Can Researchers Use Autism to Train AI?

When two humans interact face to face, they constantly adjust their approach in response to facial cues from the other party. This happens at a subconscious level, and facial expression recognition is one of the key elements of human interaction that people on the autism spectrum have difficulty with. A parent having a conversation with an autistic child, for example, will often need to state explicitly that they are upset or frustrated, because they cannot assume the child has read and understood this emotion from their face.

Similarly, if AI is unable to detect facial cues, it will fail to adjust its approach until it is explicitly told there is a problem (such as performing a task incorrectly). Facial expression recognition would therefore remove an extra step in human-AI interactions, enabling AI to correct course without being told to do so.
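The contrast above can be sketched as a simple decision loop: an agent that can read facial cues course-corrects immediately, while one that cannot must wait for explicit feedback. The cue names and feedback strings here are invented placeholders, not from any real system.

```python
# Hypothetical interaction step: adapt as soon as a negative facial cue
# is detected, rather than waiting for the user to complain out loud.
NEGATIVE_CUES = {"frown", "furrowed_brow"}

def next_action(facial_cue, explicit_feedback=None):
    """Choose the agent's next action from a facial cue and/or spoken feedback."""
    if facial_cue in NEGATIVE_CUES:
        return "adjust approach"   # correct course without being told
    if explicit_feedback == "that's wrong":
        return "adjust approach"   # fallback: only react once told explicitly
    return "continue"
```

Without cue detection, the first branch never fires, and the agent keeps going until the user objects; with it, the extra step disappears.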

Ray Diamond
Ray is an expert in grinding polycrystalline diamond (PCD) and cubic boron nitride (CBN) tools. He works with technologies like laser machining, EDM, and CBN wheels to deliver ultra-precise results for hard and brittle tool materials.