Devices are getting smarter these days; they know almost everything we do! From perceiving our touch gestures to predicting what we will type next, they can also recognise our facial expressions, track our whereabouts, and monitor our health. Yet, though machines have reached an advanced, cognitive stage where they can read our facial expressions or infer some of our intentions from our body movements and gestures, they are yet to comprehend most of the finer human emotions, like anger, anticipation, joy, trust, fear, surprise, sadness, and disgust!
Emotions play a crucial role in our lives; deciphering them gives an insight into how we feel, what we mean, and what our preferred actions would be. And when it comes to expressing ourselves through emotions, it is undoubtedly our ‘voice’ that transmits most of the information. Beyond that, it is the way we empathise, and the reflexes that back up our vocals, that convey the final message about how we feel, what we mean, or what our probable actions would be.
The difference between reading skills and oratory skills helps illustrate this. Storytelling is an art, and it takes good oratory skills to create an impact in the audience’s mind. It is the tone we use and the variation in our voice that help create this impact. It’s not only what we say, but the way we say it!
In technical terms, this can be referred to as ‘Emotional Analytics’: a program that collects data on how a person communicates verbally and non-verbally in order to understand his/her mood and attitude. It provides insight into how a customer perceives a product, the presentation of a product, or their interaction with a customer service representative.
Now, what if we could integrate Emotional Analytics into machines, or programme them with applications that empower them to comprehend human emotions? Would we be able to build a human–machine connection, or could we enhance a machine’s abilities to the extent that it can analyse our emotional side and put us in touch with our own emotional facets?
Well, here are two communication models that could serve as the starting point to empower machines to comprehend human emotions.
The ‘7%-38%-55%’ Communication Model:
Developed by Professor Albert Mehrabian, this model describes how non-verbal communication shapes the way we communicate with others. According to this model, only 7% of our communication is verbal and carries the literal meaning, or content, of the message. As individuals with emotions, we tend to use this 7% as the ice-breaker when igniting a conversation or creating a first impression. Looking deeper into this segment, it is also evident that the vocabulary we choose can reveal a plenitude of information about the emotions we carry or, in simple words, our state of mind.
The 38% and 55% in this model correspond to our tone and body language. The tone we use during a conversation, and the body language we exhibit, also reveal a lot about how we feel and perceive things at that point in time.
If machines can be trained to read and comprehend our vocabulary, the tone of our voice, and our body movements, they can estimate our state of mind, or how we feel, at any given point in time!
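One way to picture this is as a weighted fusion of the three channels, using the model’s 7%, 38%, and 55% figures as weights. The sketch below is a minimal, hypothetical illustration: the function name, the channel labels, and the input scores are our own, and a real system would obtain per-channel emotion confidences from separate language, speech, and vision models.

```python
# A minimal sketch of the 7%-38%-55% model as weighted score fusion.
# The weights follow Mehrabian's split; everything else is illustrative.
WEIGHTS = {"verbal": 0.07, "tone": 0.38, "body": 0.55}

def fuse_emotions(channel_scores):
    """Combine per-channel emotion confidences into one weighted estimate.

    channel_scores maps each channel ('verbal', 'tone', 'body') to a
    dict of emotion -> confidence in [0, 1].
    """
    fused = {}
    for channel, weight in WEIGHTS.items():
        for emotion, score in channel_scores.get(channel, {}).items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * score
    return fused

# Hypothetical readings: the words sound cheerful, but tone and
# posture say otherwise.
readings = {
    "verbal": {"joy": 0.6, "anger": 0.1},
    "tone":   {"anger": 0.7, "joy": 0.2},
    "body":   {"anger": 0.8, "joy": 0.1},
}

fused = fuse_emotions(readings)
print(max(fused, key=fused.get))  # anger dominates once tone and body are weighted in
```

Note how the literal words contribute so little that the tone and body-language channels can flip the overall estimate, which is exactly the point the model makes.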
The Emotion Wheel – Plutchik’s Wheel of Emotions:
Dr Robert Plutchik, an American psychologist, proposed that out of the 34,000 known human emotions, eight can be called foundational, as shown in the second circle of the wheel. These are arranged as pairs of opposites: joy and sadness, acceptance and disgust, fear and anger, surprise and anticipation. The core circle denotes the emotions at their most intense, and the outer circle denotes them at their mildest. Also, combining adjacent emotions on the wheel yields a whole new set of emotions.
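The wheel’s structure lends itself to a simple lookup table. The sketch below models the four pairs of opposites from the paragraph above, plus the blends of adjacent emotions (the ‘primary dyads’), using the names given on commonly published versions of the wheel; the function names are our own, and this is an illustration of the structure, not a definitive encoding of Plutchik’s theory.

```python
# A minimal sketch of Plutchik's wheel as a lookup structure.

# The four pairs of polar opposites on the wheel.
OPPOSITES = {
    "joy": "sadness",
    "acceptance": "disgust",
    "fear": "anger",
    "surprise": "anticipation",
}
# Make the mapping symmetric so lookups work in both directions.
OPPOSITES.update({v: k for k, v in list(OPPOSITES.items())})

# Primary dyads: blends of two adjacent basic emotions, as named on
# commonly published versions of the wheel.
DYADS = {
    frozenset({"joy", "acceptance"}): "love",
    frozenset({"acceptance", "fear"}): "submission",
    frozenset({"fear", "surprise"}): "awe",
    frozenset({"surprise", "sadness"}): "disappointment",
    frozenset({"sadness", "disgust"}): "remorse",
    frozenset({"disgust", "anger"}): "contempt",
    frozenset({"anger", "anticipation"}): "aggressiveness",
    frozenset({"anticipation", "joy"}): "optimism",
}

def opposite(emotion):
    """Return the polar opposite of a basic emotion."""
    return OPPOSITES[emotion]

def blend(a, b):
    """Return the dyad for two adjacent basic emotions, or None."""
    return DYADS.get(frozenset({a, b}))

print(opposite("joy"))             # sadness
print(blend("joy", "acceptance"))  # love
```

A program that detects, say, joy and acceptance together could use a table like this to label the blend as love, which is the kind of inference the next paragraph describes.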
By identifying these emotions, we can understand the prevailing emotions and how the other person is likely to react. Thus, we can comprehend emotions and shift our focus to responding in a way that makes the other person feel good.
Similarly, if we can train machines to understand, analyse, and detect these types of emotion, machines would soon be able to act as an emotional guide to mankind!
What is the need for integrating machines with emotions?
Today, we are living in the age of Customer Experience (CX). Almost every marketer is focussed on customising the Customer Experience and delivering unparalleled services to customers. While this CX cycle can be divided into Customer Acquisition, Customer Retention, and Customer Loyalty, marketers are now more inclined towards creating a streamlined channel through which they can apprehend what and how their customers think.
Though the same can best be achieved with human effort, businesses need to cater to millions of customers at a time, so using machines proves to be an efficient as well as effective way to comprehend what and how customers think. And if these machines can be trained to the level where they can decode customers’ emotion quotient, it will empower businesses to deliver unmatched services and personalise the Customer Experience to a whole new level.
At Racetrack.ai, we are on a quest to empower machines to comprehend human emotions. We are training our Virtual Customer Assistant (VCA) – MARVIN, to understand, analyse and comprehend human emotions.
Advanced algorithms and a natural ability to learn language skills from ongoing conversations make MARVIN an efficient and proactive customer assistant for businesses across almost all verticals. The more it interacts with customers, the more its cognitive abilities, analytical skills, and capacity to comprehend emotions grow.
*Image Credits: Positive Psychology Program & Tools Hero