Researchers at the University of California San Diego have devised an artificial intelligence (AI) tool to predict the level of loneliness in adults, with 94% accuracy.
The tool used Natural Language Processing (NLP) technology developed by IBM to process large amounts of unstructured natural speech and text data. It analysed factors such as cognition, mobility, sleep and physical activity to understand the process of aging.
This tool is an example of how AI can be used in devices to detect mental health conditions. Market research firm Gartner predicts that by 2022, your personal device will know more about your emotional state than your own family members do.
What is Emotion AI?
Emotion AI is a subset of artificial intelligence that measures, simulates and reacts to human emotions. Part of affective computing, the modern version of the field roughly dates back to MIT Media Lab professor Rosalind Picard’s book Affective Computing, published in 1997. Picard put forth theories on how research into emotions could be improved through new technologies.
Since machines are good at analysing large amounts of data, they can listen to voice inflections and recognise when those inflections correlate with stress or anger, according to research from the MIT Sloan School of Management. Machines can also analyse images and pick up subtleties in micro-expressions on human faces that may happen too fast for a person to recognise, it said.
Where is Emotion AI used?
The most common use lies in virtual personal assistants like Amazon’s Alexa and Apple’s Siri, which perform tasks on command.
Amazon incorporated AI in its fitness tracker Amazon Halo, which analyses the wearer’s voice to determine their mood and emotional state of mind.
Earlier in October, laptop maker HP introduced a virtual reality (VR) headset that uses sensors to measure physiological responses like facial expressions, eye movement and heartbeat to provide user-centric gaming experiences.
The benefits of emotion-sensing tech include personalisation, especially in corporate and medical fields where user experiences are subjective.
In medicine, nurse-bots keep track of a patient’s wellbeing in addition to reminding them to take medication. AI-controlled software can help practitioners diagnose depression and dementia via voice analysis.
In the workplace, AI in the form of chatbots and robots is said to provide judgment-free, unbiased and quick assistance to users. A survey conducted by technology company Oracle in October found that more than 90% of respondents in India and China were more open to speaking to robots about their mental health than to their managers.
The use of AI in tracking human emotions has been criticised, with bias being the top concern.
For instance, emotional analysis technology assigns more negative emotions to black men’s faces than to white men’s faces, according to a study titled ‘Racial Influence on Automated Perceptions of Emotions’ by a University of Maryland professor.
AI is not sophisticated enough to understand cultural differences in expressing and reading emotions, making it harder to draw accurate conclusions. For instance, a smile might mean one thing in Germany and another in Japan. Confusing these meanings can lead to wrong decisions, especially in business, according to Harvard Business Review.
Institutions need to be aware of potential AI biases that may affect the accuracy of findings. Emotion detection and recognition not only improves human-computer interfaces, but also enhances the feedback that computers can gather from users and act upon.
Valued at $17.19 billion in 2019, the emotion detection and recognition market is expected to reach $45.48 billion by 2025, according to research firm Mordor Intelligence.