Sweat Emotion
By measuring changes in the skin's electrical properties, this system can recognize emotional states for more natural interactions with AI.
We generally have a pretty good knack for reading other people's emotional states. Because this skill is so essential to effective communication, we pick it up early in life. Without it, things can go south fast. You would not want, for example, to talk to someone who is angry or sad in the same way that you would interact with someone who is feeling happy. Yet that is exactly what robots, chatbots, and other artificial systems do. They have no understanding of human emotional states, so the door is wide open for things to get awkward.
This problem will only grow as artificial intelligence algorithms like large language models find their way into more facets of our lives. For this reason, engineers have been experimenting with technologies that can read human emotions. The thought is that this additional information could be incorporated into the next generation of algorithms, enabling them to respond more appropriately in each unique situation.
At present, however, these systems are not entirely practical. The majority of them use cameras to capture images of one's face, an approach that certainly has the potential to recognize certain emotional states. But cameras can be seen as a violation of one's privacy, and they also demand significant energy and processing resources, making them impractical for many use cases. Other solutions that rely on electrocardiography or facial temperature measurements suffer from similar issues.
Just recently, a more practical solution was proposed by researchers at Tokyo Metropolitan University and Nagoya University. Their system relies on simple sensors that can be attached to the skin via a wearable device. These sensors measure skin conductance, which, it turns out, is altered in distinct ways in response to changes in one's emotional state.
The team's prototype consists of probes that are affixed to the skin and connected to a commercial electrodermal activity sensor capable of measuring conductance. An amplifier feeds this signal into a data acquisition device, and the collected data is then analyzed in MATLAB. A custom algorithm examines factors such as the steepness of the conductance response and the speed at which it decays. The algorithm was designed to recognize three emotional states: fear, family bond emotions, and funniness.
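The team performed their analysis in MATLAB, and the exact features and decision rules have not been spelled out publicly. Still, the general idea can be sketched in a few lines of Python. This is a minimal sketch of the approach as described above; the thresholds and the feature-to-emotion mapping are invented placeholders, not the researchers' actual parameters:

```python
import numpy as np

def extract_scr_features(conductance, fs):
    """Extract two features from a skin conductance trace: how steeply
    the response rises, and how quickly it decays back toward baseline.
    Assumes `conductance` is in microsiemens and `fs` is the sampling
    rate in Hz."""
    conductance = np.asarray(conductance, dtype=float)
    peak_idx = int(np.argmax(conductance))
    peak_val = conductance[peak_idx]
    baseline = conductance[0]

    # Rise steepness: average slope from the start of the trace to the peak
    rise_time = max(peak_idx, 1) / fs
    rise_slope = (peak_val - baseline) / rise_time

    # Decay speed: time for the signal to fall halfway back to baseline
    half_recovery = baseline + 0.5 * (peak_val - baseline)
    post_peak = conductance[peak_idx:]
    below = np.nonzero(post_peak <= half_recovery)[0]
    decay_time = below[0] / fs if below.size else len(post_peak) / fs

    return rise_slope, decay_time

def classify_emotion(rise_slope, decay_time):
    """Hypothetical rule-based classifier. The thresholds and which
    response shape maps to which emotion are illustrative only."""
    if rise_slope > 0.5 and decay_time < 2.0:
        return "fear"
    elif rise_slope > 0.5:
        return "funniness"
    else:
        return "family bond"
```

In practice, a classifier would more likely be trained on labeled responses than hand-tuned like this, but the two features the article names, rise steepness and decay speed, are enough to illustrate the pipeline.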
This approach works because changes in emotional state are accompanied by changes in perspiration, and that perspiration alters the skin's electrical properties. These changes begin within one to three seconds of a stimulus, so the system can respond to changing situations surprisingly fast.
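To make that timing concrete, here is a short continuation of the earlier sketch. It generates a synthetic conductance trace with a two-second onset latency (inside the one-to-three-second window described above) and an exponential decay, then runs it through the illustrative feature extractor. The signal model and every parameter here are invented for demonstration, not taken from the study:

```python
import numpy as np

# Synthetic skin conductance trace: a flat tonic baseline, then a phasic
# response beginning ~2 s after the stimulus that decays exponentially.
# Reuses extract_scr_features() and classify_emotion() from the sketch above.
fs = 32                               # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)          # 10 seconds of signal
baseline = 2.0                        # tonic level in microsiemens
latency, rise_tau, decay_tau = 2.0, 1.0, 3.0  # onset delay, time constants (s)

response = np.where(
    t < latency,
    0.0,
    0.8 * (1 - np.exp(-(t - latency) / rise_tau))
        * np.exp(-(t - latency) / decay_tau),
)
trace = baseline + response

rise_slope, decay_time = extract_scr_features(trace, fs)
print(f"slope={rise_slope:.3f} uS/s, decay={decay_time:.2f} s "
      f"-> {classify_emotion(rise_slope, decay_time)}")
```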
The team's methods were evaluated in a series of human trials. Participants were instrumented with the skin conductance probes, then instructed to watch videos intended to elicit one of the three emotions the algorithm can recognize. The results were quite promising, showing that these emotions can in fact be reliably detected via skin conductance.
While this approach is more practical than existing systems in some ways, it is not yet clear just how useful it will be in the real world. It remains to be determined how many emotions can be detected via changes in perspiration, and how finely the system can distinguish between a larger set of options. Other factors, like changes in ambient temperature or humidity, will also have to be assessed to determine how they impact the system's performance.