Emotion-sensing technology in the Internet of Things
Personalization of online experiences has enabled close tracking of Internet activities (comments, likes, tags, recommendations, photos, and videos), shaping the search results a user sees, the content of their social network feeds, and their online interactions with other users. After Facebook replaced the single ‘like’ with a list of reactions to choose from, personalization evolved further to measure a user’s emotional reaction, or feelings, at a given moment.
Building on this emotional turn, the Internet of Things (IoT) is further changing the way technology perceives human emotions. IoT vendors are integrating innovative technologies into apps and devices that track and reflect users’ daily emotional states and influence their behavior patterns accordingly.
Wearable emotion sensors
Skin conductance sensors are widely used by psychologists and therapists in clinics and hospitals. IoT makers are now building the same emotional ingredients into wristbands. Their sensors also gather data on heart rate, blood pressure, and temperature to infer an individual’s emotional state. Such emotion sensors are relatively low-cost, easy to use, and suitable for a wide variety of applications.
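To make the sensor-fusion idea concrete, here is a minimal sketch of how a wristband’s readings might be mapped to a rough arousal score. The field names, baselines, weights, and thresholds are all invented for illustration; no real device exposes exactly this API.

```python
# Hypothetical sketch: flagging heightened arousal from wearable readings.
# All baselines, weights, and thresholds are illustrative assumptions.

def arousal_score(eda_microsiemens, heart_rate_bpm,
                  baseline_eda=2.0, baseline_hr=70):
    """Combine deviations from a personal baseline into a rough 0..1 score."""
    eda_delta = max(0.0, eda_microsiemens - baseline_eda) / baseline_eda
    hr_delta = max(0.0, heart_rate_bpm - baseline_hr) / baseline_hr
    # Weight skin conductance (EDA) higher: it responds more specifically
    # to emotional arousal than heart rate does.
    score = 0.7 * eda_delta + 0.3 * hr_delta
    return min(score, 1.0)

reading = {"eda": 4.5, "hr": 95}  # made-up sample values
score = arousal_score(reading["eda"], reading["hr"])
print("elevated" if score > 0.5 else "normal")
```

A real system would calibrate the baselines per user and smooth readings over time rather than reacting to single samples.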
Smart watches and health wearables lay the foundation for technology that helps coordinate daily habits while heading off potential health issues. With a modern smart device, the user can track and evaluate their physical condition and how they react to stress-inducing situations, as well as learn to manage stress and anxiety better. The device might instruct them to practice a mindfulness technique or breathing exercises to calm down, or simply turn on relaxing music.
Other technology focusing on emotions
New emotion-sensing technologies and software powered by artificial emotional intelligence can read and analyze not only skin conductance, breathing, and heart rate, but also eye movements, facial expressions, changes in voice, and more. They don’t necessarily require expensive hardware; often, recognition software or a little additional code on a computer or smartphone is enough. For example, even slow or uneven cursor movements may reflect a user’s distraction or negative emotions.
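As a sketch of the cursor-movement idea, the snippet below computes how fast and how evenly the pointer moves from timestamped position samples. The metrics and any threshold you would apply to them are assumptions for illustration, not a published method.

```python
# Illustrative sketch: quantifying slow or uneven cursor movement from
# timestamped (t, x, y) samples. Metrics and thresholds are invented.
import math

def movement_stats(samples):
    """samples: list of (t_seconds, x, y).
    Returns (mean_speed, unevenness), where unevenness is the
    coefficient of variation of segment speeds (0 = perfectly steady)."""
    speeds = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    if not speeds:
        return 0.0, 0.0
    mean = sum(speeds) / len(speeds)
    variance = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return mean, math.sqrt(variance) / (mean or 1.0)
```

A steady drag yields near-zero unevenness, while stop-and-go motion pushes it toward 1, which a hypothetical emotion model could treat as one input signal among many.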
The growth of emotional intelligence technology is going to produce a number of focus areas that will sense emotions:
- Reflecting on a personal emotional experience
New emotion detection technologies could help employees make better decisions, improve their focus and performance in the workplace, manage stress, and adopt healthier and more productive work styles.
Traders are a good example: in a ‘bidding frenzy,’ they tend to overpay for assets and downplay risk. To address this problem, Philips and ABN AMRO developed the Rationalizer bracelet back in 2009. The bracelet measured emotions via electrodermal activity, while a companion display reflected the user’s heightened emotional states, signaling the need to pause and rethink a financial decision.
Some of the world’s elite coaches, teams, and individual athletes have used headsets produced by San Francisco-based SenseLabs Inc. Their Versus gear connects to an iPhone or iPad via Bluetooth and has dry sensors for assessing brain performance. This makes it possible to identify strengths and weaknesses in problem-solving, multitasking, resource management, decision-making, and sleep tendencies. Versus then provides customized exercise protocols to improve mental acuity, concentration, and sleep management.
Aggregated data from such devices can help companies understand how internal and external environmental factors impact employees and groups. As a result, they might redesign processes accordingly to help keep personnel better engaged and productive.
Speech-based emotion analysis in real time opens up further business opportunities. This and other emotion-sensing technologies can enable companies to establish deeper emotional connections with their consumers through virtual assistants. Popular virtual personal assistants (VPAs) such as Siri, Cortana, and Google Assistant use natural-language processing and natural-language understanding to handle verbal commands and questions. Adding emotion-sensing capabilities will let them create more comfortable and natural user interactions.
Call centers are another potential customer group. Voice-based emotion sensing can enable automated customer service agents to recognize callers’ emotional states and adapt to them. It will also help management analyze stress levels of human workers.
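A minimal sketch of how a call-center system might act on such a signal: given an emotion label and confidence from a voice-analysis service (both hypothetical here, as are the labels and policies), route the call and pick a response style.

```python
# Hypothetical routing logic for an automated call-center agent.
# Emotion labels, confidence scale, and policies are invented for the sketch.

ESCALATION_POLICY = {
    "anger":   ("route_to_human", "apologetic"),
    "sadness": ("stay_automated", "empathetic"),
    "neutral": ("stay_automated", "standard"),
    "joy":     ("stay_automated", "upbeat"),
}

def handle_caller(emotion_label, confidence):
    """Fall back to a human agent when the classifier is unsure,
    or when the policy for the detected emotion says so."""
    if confidence < 0.6:
        return ("route_to_human", "standard")
    return ESCALATION_POLICY.get(emotion_label, ("stay_automated", "standard"))
```

The same labels, logged over a shift, are what would let management analyze stress levels across human agents.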
- Enabling actions and reactions to emotions
In the future, more and more smart devices will be able to capture emotional reactions to certain data and facts, analyze situations accordingly, and come up with appropriate recommendations. Currently, the healthcare and automotive industries are among the most eager to adopt emotion-sensing features.
Car manufacturers are exploring the implementation of in-car emotion detection systems to improve road safety by managing the driver’s drowsiness, irritation, and anxiety. For instance, Panasonic Corporation’s new sensing technology can detect a person’s emotions and sense of being hot or cold in a contactless manner. This information can be used for predicting a driver’s drowsiness to help keep them awake.
The technology analyzes the driver’s blinking features and facial expressions, captured by an in-vehicle camera, using AI. Combined with data on heat loss from the driver’s body and in-vehicle illuminance, it predicts transitions in the driver’s drowsiness level. Together with the thermal-sensation monitoring function, the system helps the driver stay comfortably awake while driving; when the drowsiness level is high, it issues a sound alarm or a prompt to rest.
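The decision step could look something like the sketch below. Panasonic has not published an API for this system, so the input signals, units, thresholds, and action names are all assumptions chosen to mirror the description above.

```python
# Illustrative decision logic only: the signals, units, and thresholds
# below are assumptions, not Panasonic's actual algorithm.

def drowsiness_action(blink_duration_ms, blinks_per_minute, heat_loss_watts):
    """Map camera and thermal signals to a graded drowsiness response."""
    level = 0
    if blink_duration_ms > 400:   # long blinks suggest microsleep onset
        level += 2
    if blinks_per_minute < 10:    # reduced blink rate
        level += 1
    if heat_loss_watts > 80:      # body cooling often precedes sleepiness
        level += 1
    if level >= 3:
        return "sound_alarm_and_suggest_rest"
    if level >= 1:
        return "adjust_cabin_temperature"
    return "no_action"
```

Grading the response matters: mild drowsiness gets a comfort adjustment that keeps the driver awake unobtrusively, while only a high level triggers the alarm.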
- Facilitating communication
With the help of mood sensor technology, children or elderly family members in need of care will be able to receive timely assistance and support from their families or caregivers. Emotion-sensing wearables will help monitor the state of mind of persons with mental and other health conditions 24/7. When necessary, they will alert doctors and caregivers and warn them of upcoming changes in the person’s mood and behavior.
Remote emotion detection is possible as well. One device created at MIT’s Computer Science and Artificial Intelligence Laboratory emits radio signals that reflect off the front and back of a person’s body. By measuring heartbeat and breathing in those reflections, the device can accurately detect emotional reactions. Such remote sensing technologies could be used to diagnose or track conditions such as depression and anxiety, as well as for non-invasive health monitoring and diagnosis of heart conditions.
- Turning emotions into an experience
Technology that deduces human emotion from audio-visual cues may enable businesses to detect consumers’ positive and negative moods to better understand their preferences, analyze customers’ choices for use in marketing, and spot users’ annoyances to improve product usability.
Emotion-sensing technology also gives rise to a new design approach that involves far more than creating a visual design: it combines monitoring and analyzing behavior patterns, measuring actions, and noting facial expressions, voice intonation, and body language. Smart devices will learn to assess the meaning of the emotions they ‘perceive’ and to respond sensitively.
The ability of everyday objects to respond to users’ emotional states can be used to create more personalized user experiences. It can be applied in areas like educational and diagnostic software, driverless cars, personal robotics, pervasive computing, sentient virtual reality, video games, affective toys, and other major consumer electronic devices.
For instance, a fridge with a built-in emotion sensor may interpret a person’s mood and suggest a more suitable food. Emotion-sensing smart home devices could provide entertainment (music, videos, TV shows or imagery) which matches the user’s current state of mind. Video games might use emotion-based biofeedback technology to adjust game levels and difficulty according to the player’s emotional states.
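The video-game case above can be sketched as a simple biofeedback loop: scale the difficulty to keep the player in a target arousal band. The arousal input, bounds, and step size are invented for the example; a real game would tune these per player.

```python
# Hypothetical biofeedback loop for emotion-based difficulty adjustment.
# The arousal estimate (0..1, e.g. from heart rate / skin conductance)
# and all tuning constants are assumptions for this sketch.

class AdaptiveDifficulty:
    def __init__(self, target_arousal=0.5, step=0.1):
        self.difficulty = 1.0       # 1.0 = baseline difficulty
        self.target = target_arousal
        self.step = step

    def update(self, arousal):
        """Nudge difficulty toward the target arousal band each tick."""
        if arousal > self.target + 0.15:    # player stressed: ease off
            self.difficulty = max(0.5, self.difficulty - self.step)
        elif arousal < self.target - 0.15:  # player bored: ramp up
            self.difficulty = min(2.0, self.difficulty + self.step)
        return self.difficulty
```

The dead band around the target keeps the game from oscillating when the player’s state hovers near the setpoint.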
MIT Media Lab spinoff Affectiva has been analyzing people’s facial expressions and non-verbal cues for applications in advertising, marketing, and video games for years. But their vision is to build a multimodal emotion AI platform that senses and measures emotions the way humans do. In September 2017, Affectiva announced the release of cloud-based software that identifies the speaker’s gender and observes changes in speech paralinguistics, tone, volume, speed, and voice quality to distinguish anger, laughter, or arousal.
IBM, along with numerous startups, is developing techniques to add human-like qualities to robotic systems. The development of emotional AI will lead to more effective personal assistant robots that can distinguish between, and react to, different people and their emotional states. For example, when a robot detects disappointment on the human’s part, it will respond apologetically in a modulated voice, and by interacting with a specific person it will gradually develop emotional awareness. Since emotional connection remains a fundamental human need, teaching intelligent objects how to interact with humans on this level should begin as soon as possible.
Conclusion
Fields as diverse as medicine, advertising, robotics, virtual reality, gaming, education, workplace safety, the automotive industry, and home appliances will benefit significantly from emotion-sensing technology. Technology strategists should take advantage of it to build and market the products of the future. Intelligent machines with empathy for humans are sure to make the world a better place.
The field is making real progress in understanding human emotion thanks to achievements in computer vision, speech recognition, deep learning, and related technologies. Every year we will see more mood-sensing technology realized. And while most existing technologies require on-body devices or voice/facial recognition software, research and development will increasingly target technology that measures emotions contactlessly.