Researchers at MIT have developed a system that uses artificial intelligence to analyze conversations and recognize the speaker's emotions. The system also takes into account the user's movements, heart rate and body temperature.
It is no secret that one and the same conversation can be interpreted in very different ways. Between men and women, things can go wrong simply because each party understands something different. Such misinterpretations, however, usually have little to do with a person's sex and much more to do with their character.
A conversation can also be analyzed purely in terms of the mood that resonates in what is being said. Researchers at MIT have built an intelligent wearable that attempts to recognize the emotional state within a conversation.
When someone tells a story, the system analyzes tone, pitch, energy and vocabulary. Because the device is worn on the wrist, it also captures physiological signals: it measures movement, heart rate, blood pressure, blood flow and skin temperature.
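The combination of audio and physiological signals described above can be pictured as one record per analysis window. The field names below are illustrative only; the researchers' actual feature representation is not described in the article.

```python
from dataclasses import dataclass

@dataclass
class FeatureWindow:
    """One snapshot of the signals the article lists (names are illustrative)."""
    # audio features extracted from the speech signal
    tone: float
    pitch_hz: float
    energy: float
    vocabulary: list[str]      # words spoken during this window
    # physiological features from the wrist-worn device
    movement: float
    heart_rate_bpm: float
    blood_pressure_mmhg: float
    blood_flow: float
    skin_temp_c: float
```

Bundling both modalities into one structure per window reflects the article's point that speech and body signals are analyzed together.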
From this data, the researchers developed two algorithms: one determines the overall emotional tone of a conversation (happy, sad or neutral), while the second classifies the conversation in five-second intervals. This provides an overview while also detecting rapid shifts in mood. According to the researchers, this works with 83 percent accuracy.
At present, however, the system only works while a single person is talking. In the future, the team wants to test it with several people in conversation. The system would also keep learning: the more data it can collect, the more precisely it can classify the emotional tone in real time.
Imagine if, at the end of a conversation, you could rewind it and see the moments when the people around you felt the most anxious. Our work is a step in that direction.
Tuka Alhanai, co-developer
Data protection is a sensitive issue. Not everyone will be enthusiastic about having their conversations recorded and analyzed by a wearable. The team emphasizes, however, that privacy is a priority: the algorithm runs locally on the device, and the data is neither uploaded to a cloud nor passed on to the developers.
Going forward, the team wants to optimize the system so that it can also recognize emotional states such as "boring", "tense" or "exciting". Once that is achieved, the researchers would like to use the algorithm for social coaching.
It could, for example, help people with anxiety or Asperger's syndrome to understand other people better. The system should then also be available on popular smartwatches.