Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and Institute of Medical Engineering and Science (IMES) are working on an AI-based wearable that can analyze the emotions in a human conversation.

Artificial intelligence has long been used to decipher words and context, but it could not detect a speaker’s tone or mood; in short, AI could not understand human emotions. Now MIT researchers are making it possible to detect moods such as excitement, sadness, neutrality, and positivity in a conversation by analyzing speech patterns.


According to the researchers, the system can analyze the emotions in a story with 83% accuracy. To reach this level of accuracy, the system examines audio, text transcriptions, and physiological signals, and it can also produce a sentiment score. The wearable itself can thus report the emotional tenor of a particular story.

The MIT researchers used the Samsung Simband as a research device to capture high-resolution physiological waveforms, measuring features such as movement, heart rate, blood pressure, blood flow, and even skin temperature. The system can also detect a speaker’s tone, pitch, energy, and vocabulary.

To test the system’s ability and accuracy, the researchers captured 31 conversations of several minutes each. They built two algorithms: one analyzed the pattern of the whole conversation, while the other classified each five-second block of conversation as a mood such as positive, negative, or neutral.
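To make the five-second classification concrete, here is a minimal sketch in Python. The feature names (mean pitch, energy, pause ratio) and the threshold rules are illustrative assumptions for demonstration only, not the MIT researchers’ actual model, which was trained on multimodal data.

```python
# Hypothetical sketch: labeling five-second windows of a conversation
# as positive / negative / neutral from simple acoustic features.
# Thresholds are made up for illustration.

def classify_window(mean_pitch_hz, energy, pause_ratio):
    """Label one five-second window from toy acoustic features."""
    if pause_ratio > 0.5 and energy < 0.3:
        return "negative"   # long pauses, flat delivery
    if mean_pitch_hz > 180 and energy > 0.6:
        return "positive"   # lively, energetic speech
    return "neutral"

def classify_conversation(windows):
    """windows: one (mean_pitch_hz, energy, pause_ratio) tuple
    per five-second block of the conversation."""
    return [classify_window(*w) for w in windows]

labels = classify_conversation([
    (200.0, 0.8, 0.1),   # animated speech
    (120.0, 0.2, 0.6),   # slow, quiet speech
    (150.0, 0.5, 0.3),
])
print(labels)  # ['positive', 'negative', 'neutral']
```

A real system would learn these decision boundaries from labeled data rather than hand-code them, but the per-window structure is the same.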

Human body language and speech differ with emotion. When a person is sad, their speech tends to contain long pauses and a monotonous vocal tone, while happy conversations show more varied pitch and rhythm. The proposed AI-based system analyzes all of these features to produce its output.
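The two cues mentioned above, long pauses and monotone delivery, can be estimated from simple per-frame measurements. The sketch below is an illustrative assumption (the silence threshold and frame features are invented for the example), not the actual feature pipeline used in the study.

```python
# Illustrative only: estimating pause time and pitch monotony
# from per-frame audio measurements.

def pause_ratio(frame_energies, silence_threshold=0.05):
    """Fraction of frames whose energy falls below an (assumed)
    silence threshold; higher values mean more/longer pauses."""
    silent = sum(1 for e in frame_energies if e < silence_threshold)
    return silent / len(frame_energies)

def pitch_variance(frame_pitches_hz):
    """Variance of frame-level pitch; low variance suggests
    a monotonous vocal tone."""
    mean = sum(frame_pitches_hz) / len(frame_pitches_hz)
    return sum((p - mean) ** 2 for p in frame_pitches_hz) / len(frame_pitches_hz)

print(pause_ratio([0.01, 0.02, 0.4, 0.5]))   # 0.5 (half the frames silent)
print(pitch_variance([150, 150, 150, 150]))  # 0.0 (perfectly monotone)
```

In practice these features would be extracted with an audio library and fed into a classifier alongside the physiological signals.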

Without technology, it is hard to read a person’s emotions and feelings accurately. A mood-predicting wearable could make it easier to recognize the real feelings behind a pretended gesture.

MIT Researchers’ statement

“Our next step is to improve the algorithm’s emotional granularity so that it is more accurate at calling out boring, tense, and excited moments, rather than just labeling interactions as ‘positive’ or ‘negative,’” says Alhanai. “Developing technology that can take the pulse of human emotions has the potential to dramatically improve how we communicate with each other.”


The MIT researchers are still working to improve the algorithm, as it is not yet reliable enough to be deployed for social coaching. The team plans to collect data on a larger scale to test the algorithm, so that the system can analyze human emotions appropriately. They also plan to test it with the Apple Watch, and hopefully many other wearables will be used to test the algorithm as well.
