Jun 25

What is emotion analytics?

Emotion analytics, also referred to as “affective computing”, is provoking questions as we experience what is arguably the most significant technological evolution in human history. In this blog post we explain what emotion analytics is and what its implications are.

History of emotion analytics 

Philosophy has questioned machine sentience for hundreds of years, but the history of emotion artificial intelligence (AI) as a concept traces back to the 1990s, when researchers started to explore the idea of using computers to recognize and interpret human emotions. The term “Affective Computing” was coined by Professor Rosalind Picard, founder and director of the Affective Computing Research Group at the Massachusetts Institute of Technology, in her 1995 paper of the same name.

Since then, we’ve seen significant progress in the development of multiple emotion analytics technologies. Initially, researchers focused primarily on analyzing facial expressions and vocal tone to decipher emotions. However, as the discipline grew, new techniques and approaches evolved. Nowadays these include the use of machine learning algorithms to analyze more complex patterns of behavior.

The field of emotion analytics is making rapid progress but as it stands today it is already applied to a multitude of domains, from healthcare to entertainment. Despite the progress made in relatively little time, there remain a few challenges to overcome such as privacy concerns, accuracy, and bias. As the technology evolves, it will be increasingly critical for enterprises to address those concerns and ensure the tools are used in an ethical and responsible way. 

So, what is emotion analytics? 

Emotion analytics is a field of AI focused on developing systems that can recognize, interpret, and interact with human emotions in a way that feels authentic and appropriate to the human. The goal of emotion AI systems is to enable more fluent interactions between humans and machines, in which the human feels that the system’s emotional response is appropriate and helpful.

a woman speaking to a robot

In her aforementioned paper, Professor Rosalind Picard defines emotion analytics as “computing that relates to, arises from, or deliberately influences emotions”. Gartner explains that “Emotion AI, also known as affective computing, enables everyday objects to detect, analyze, process and respond to people's emotional states and moods - from happiness and love to fear and shame”. The Massachusetts Institute of Technology defines it as “a subset of artificial intelligence (the broad term for machines replicating the way humans think) that measures, understands, simulates, and reacts to human emotions”.

The technology combines a variety of techniques such as machine learning, natural language processing, and sometimes computer vision to examine and decipher signals from humans that pertain to their emotional state. These signals can be facial expressions, vocal tone, and other types of body language. The interpretations are then used in applications like healthcare and marketing to enable a more empathetic customer experience.

Read more: What is speech analytics?

Where do we encounter emotion analytics? 

There is already a plethora of use cases on the market and in full use, where we encounter emotion analytics in different ways. Here we look at a few of the more common cases where enterprises are already reporting beneficial business outcomes.

When emotion analytics is used in healthcare, it can help professionals diagnose mental health conditions such as anxiety and depression more quickly and more accurately. Professionals in the field can also use it to monitor their patients’ emotional states during interactions, helping them improve their bedside manner and patient comfort. Patient comfort could be further improved by using emotion analytics to monitor patients’ emotions during different procedures and treatments.

In customer service, emotion analytics analyzes customer emotions and uses that information to respond in the most appropriate way to improve business outcomes. The detected emotional states can be used to provide more personalized customer service, for instance by spotting when customers become frustrated or angry so that agents can react more appropriately or provide more targeted responses.

sentiment analysis data speech recognition

The field of education is seeing emotion analytics used to personalize the learning experience of students. The algorithm analyzes the emotional responses of students to different stimuli, and the results help educators curate more engaging teaching methods. This should result in more motivated students and better learning outcomes.

The entertainment industry is already putting emotion analytics to full use to create more immersive and interactive experiences across video games and virtual reality applications. Game content and storylines can be explored in different directions depending on the player’s emotions, and this is one field where we expect to see more exciting developments soon.

When it comes to enterprise adoption of emotion analytics, we see it applied in various functions such as marketing and human resources. Emotion analytics can be used to track consumer emotions, preferences, and reactions – providing the enterprise with insights to improve their marketing campaigns and targeting.  In human resources it can be applied to the interview process to analyze the candidates’ emotional intelligence and suitability for specific roles. It can also be used to provide feedback to employees and coach them on such topics. 

Finally, we see significant use of emotion analytics in the public safety domain, where it is used to monitor drivers in logistics to detect fatigue or distraction in order to protect their safety and the safety of other road users. Again, this is a field where we can expect to see significant growth when it comes to implementation of emotion analytics. 

These examples constitute only a few of many potential use cases for the adoption of emotion analytics. As the tools improve over time, we can expect to see new and more innovative applications emerging on the market.

Read more: Integrating AI into your business process

What emotion analytics applications are out there? 

Several types of application exist on the market today and are already widely implemented. These can be grouped into three principal categories: textual emotion recognition (TER), facial emotion recognition (FER), and speech emotion recognition (SER).

The most common applications are text-based and usually involve measuring sentiment and emotions related to enterprise branding. Enterprises use the technology to survey social media and check for negative sentiment so that such incidents can be resolved more quickly. An example of such technology is Brandwatch, which provides its customers with a suite of technologies to recognize and react to market trends in real time. TER can also be used for making decisions based on the prevailing market sentiment. This is particularly useful in use cases such as stock market trading. Solutions like Stockpulse AI scan texts across multiple text-based platforms such as Telegram and Reddit to track human reactions and emotions, and use this information to assist users in making better investment decisions.
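At its simplest, text-based emotion recognition can be sketched as a lexicon lookup: each word carries a polarity score, and the sum determines the overall sentiment. The tiny lexicon and scores below are invented for illustration; production tools like Brandwatch use far richer, learned models.

```python
# Toy lexicon-based sentiment scorer -- the simplest form of TER.
# The word list and polarity scores are purely illustrative.
LEXICON = {
    "love": 2, "great": 2, "good": 1, "happy": 2,
    "bad": -1, "angry": -2, "terrible": -2, "frustrated": -2,
}

def sentiment_score(text: str) -> int:
    """Sum the polarity of known words; unknown words count as 0."""
    words = text.lower().replace("!", "").replace(".", "").split()
    return sum(LEXICON.get(w, 0) for w in words)

def label(text: str) -> str:
    """Collapse the numeric score into a coarse sentiment label."""
    score = sentiment_score(text)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

For example, `label("I love this product")` returns `"positive"`, while a tweet containing "terrible" and "frustrated" would be flagged as negative for follow-up.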

FER studies facial structures in relation to human emotion. The primary emotions involved are usually happy, sad, angry, surprised, scared, disgusted, and neutral. Cameras detect the placement of various points such as the cheeks, upper lip height, eye direction, eyebrow heights, and so on. Most mainstream algorithms combine valence, arousal, gaze direction, head orientation, and personal characteristics such as gender and age with the spatial information of the face to identify the correct emotion.
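Once those facial points are reduced to numeric features, classification can be as simple as finding the nearest learned "prototype" per emotion. The sketch below uses nearest-centroid matching over three hypothetical features (mouth-corner lift, eyebrow raise, eye openness); the prototype values are invented, whereas real FER systems learn them from large labeled image sets.

```python
import math

# Hypothetical per-emotion prototype feature vectors:
# (mouth_corner_lift, eyebrow_raise, eye_openness), each normalized 0..1.
# Real systems learn these from data; the numbers here are illustrative.
PROTOTYPES = {
    "happy":     (0.9, 0.5, 0.6),
    "sad":       (0.1, 0.3, 0.4),
    "surprised": (0.5, 0.9, 0.9),
    "neutral":   (0.5, 0.5, 0.5),
}

def classify(features):
    """Return the emotion whose prototype is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROTOTYPES, key=lambda emotion: dist(features, PROTOTYPES[emotion]))
```

A frame with raised mouth corners, e.g. `classify((0.85, 0.55, 0.6))`, lands nearest the "happy" prototype. Nearest-centroid is only a teaching device here; mainstream FER uses deep networks over the full face image.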

facial emotion recognition FER

Noldus is one example of a company bringing futuristic ideas to the now, using FER to allow researchers to better understand bipolar disorder, parent-child interaction across socio-cultural differences, and even interactions between animals. Their software enables their customers to better understand human behavior for example on ships, to allow them to enhance their safety features relating to human behavior that might present a risk. Other customers use it to research neuro-degenerative diseases such as Parkinson’s Disease to enhance our understanding of such conditions, to understand functional rehabilitation processes in rodents, and more. 

SER studies vocal cues in relation to human emotion. The primary emotions are typically the same as in FER: happy, sad, angry, surprised, scared, disgusted, and neutral. Microphones detect the voice, and algorithms track points of interest such as tone, speed, pitch, and the speaker’s choice of words to identify the correct emotion at the right time.
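To make the idea concrete, here is a minimal sketch of the feature-extraction side of SER: two classic prosodic features (loudness via RMS energy, and a crude rate/pitch proxy via zero-crossing rate) mapped to a coarse label. The thresholds and the feature-to-emotion mapping are invented for illustration; real SER models learn these relationships from labeled speech.

```python
import math

def zero_crossing_rate(samples):
    """Fraction of adjacent sample pairs where the signal changes sign --
    a crude proxy for pitch / speaking rate."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
    return crossings / (len(samples) - 1)

def rms_energy(samples):
    """Root-mean-square energy -- a crude proxy for loudness."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def guess_emotion(samples):
    """Map loudness and rate to a coarse label.
    Thresholds are invented for illustration, not empirically derived."""
    loud = rms_energy(samples) > 0.5
    fast = zero_crossing_rate(samples) > 0.05
    if loud and fast:
        return "angry"
    if loud:
        return "happy"
    if fast:
        return "scared"
    return "sad"
```

Feeding in a loud, high-frequency tone (standing in for agitated speech) yields "angry", while a quiet, low-frequency one yields "sad". Production systems replace these hand-picked rules with models trained on spectral features such as MFCCs.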

Benefits of emotion analytics 

There are several potential benefits to emotion analytics. Overall, it has strong potential to give algorithms a more nuanced understanding of human emotion, making our interactions with such algorithms far more comfortable.

As in the use cases above, we can see benefits across the board. In healthcare we see improved understanding of the emotional states of patients, in both mental and physical healthcare. Emotion analytics can analyze facial expressions and vocal tone to provide more appropriate assistance more accurately.

In education, a student’s emotional reaction to different types of learning materials and learning environments can help educators tailor their teaching methods more appropriately. The outcome would be more engaging and motivating education for the students and less guesswork for the educators.

Customer service is greatly enhanced by emotion analytics, and outcomes are already improving thanks to the adoption of different tools. By analyzing customer emotions during interactions with customer service agents and tools, systems can recognize and alert when customers are frustrated or angry, triggering faster and more appropriate responses that improve the customer experience. Emotion analytics also saves time and resources in other enterprise functions such as human resources, where it can be used to automate tasks like parts of the interview process.

Emotion analytics is also increasingly used to enhance safety in different areas, where a person’s emotional state can be a trigger for dangerous circumstances. One example of this could be monitoring drivers for signs of fatigue or distraction as mentioned, but other examples could include crowd control to avoid injury during riots or crushing during mass events. 

All in all, emotion analytics has the potential to enhance our interactions with machines, but we should bear in mind the potential drawbacks too. 

Drawbacks to emotion analytics 

Despite the benefits there are a few drawbacks associated with emotion analytics which should be addressed. 

Emotion analytics can be inaccurate. Since it is still in the relatively early stages of its development, there are improvements to be made, and accuracy is one of them. Detection is sometimes inaccurate, and the interpretation can also be wrong. This leads to incorrect responses from the algorithms, which can lead to negative outcomes.

The algorithms can display bias or discriminatory prejudice. These tools are trained on human behaviors and can inadvertently amplify human biases such as racism or sexism. This can lead to discriminatory outcomes, which is of particular concern in emotion analytics given the potential reinforcement of harmful stereotypes about emotions wrongfully linked to specific demographics.

Privacy invasion is another concern. Since emotion AI requires the collection and analysis of emotional data, there is clear potential for abuse of the analytics systems involved. Greater regulatory control would be needed to mitigate the potential for misuse, and data security protocols should be observed strictly to protect the information of the individuals involved.

There are also other ethical concerns, especially regarding consent, transparency, and accountability: people should have to opt in to the use of emotion analytics, and enterprises should be transparent about their use of such tools. Another ethical concern is the overreliance of humans on machines. If humans become too reliant on machines to interpret and respond to emotions, they might lose empathy and emotional intelligence in human-to-human interactions.

So, while we see that there are clear benefits to emotion analytics, we should also keep in mind the potential drawbacks and ensure that structured guidance is in place for mitigating those. 

Read more: StageZero's guide and checklist to privacy and AI, and Data diversity and why it is important for your AI models

What could the future hold for emotion analytics? 

This field is still in its early stages of development, and we can expect to see significant progress quickly.

New types of data, such as changes in muscle movements, skin temperature and tone, and more, could be used to infer a wider variety of emotional states. Heart rate and heart rate variability are new areas of research where we could expect to see significant developments. Scientists are also studying micro-expressions: very fast facial expressions that can go undetected by the human eye but can reveal a person’s true emotional state. We predict further improvements there, which could make algorithms more accurate than humans at detecting some emotions.

emotion AI facial emotion detection FER

In the future, we expect to see larger and more diverse datasets used for training emotion analytics algorithms, helping them recognize more varied emotional states. As natural language processing evolves as a whole, we expect its advancements to lead to better emotion recognition in text applications too. The algorithms could analyze a person’s choice of words, sentence structure, and other linguistic choices to determine their emotion and intent.

Overall, we can expect to see multiple improvements in the field of emotion analytics which would have potentially fantastic implications in basically all areas of human-computer interactions. 

Are you interested in keeping up to date with the developments in emotion analytics? Subscribe to our newsletter and follow us on LinkedIn. 


Palkkatilanportti 1, 4th floor, 00240 Helsinki, Finland
info@stagezero.ai
2733057-9
©2022 StageZero Technologies