
AI reads emotions through speech, facial expressions

It is nothing new that neural networks produce results by combining traditional computer programming techniques with large amounts of data. However, the fact that these techniques can now interpret faces, behavior and human speech accurately shows how far neural networks have come.

Artificial intelligence (AI) experts can train algorithms to understand people's emotions and use these emotion analyses to determine shopping tendencies, detect whether someone is depressed or even help prevent murders.

Scientists at the University of Science and Technology of China studied how people use speech and facial muscles to show their emotions. Detailing their findings in an article, they said, "Automatic emotion recognition is a difficult task because of the ambiguous content of emotions and the multiple ways of expressing them. However, inspired by this cognitive process in humans, the visual and auditory information used in automatic emotion recognition can be combined naturally and instantly and processed end to end in a neural network."
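
The idea described in the quote is to encode the two modalities separately and let a single network combine them. Purely as an illustration, and not the USTC team's actual architecture, the sketch below shows a minimal feature-level fusion model in PyTorch; the feature dimensions, layer sizes and seven-class output are assumptions.

```python
import torch
import torch.nn as nn

class AudioVisualEmotionNet(nn.Module):
    """Toy audio-visual fusion network: each modality is encoded separately
    and the features are concatenated before emotion classification."""

    def __init__(self, audio_dim=40, visual_dim=512, num_emotions=7):
        super().__init__()
        # Encoder for audio features (e.g., per-clip acoustic statistics).
        self.audio_encoder = nn.Sequential(
            nn.Linear(audio_dim, 128), nn.ReLU(), nn.Linear(128, 64)
        )
        # Encoder for visual features (e.g., pooled face-image embeddings).
        self.visual_encoder = nn.Sequential(
            nn.Linear(visual_dim, 128), nn.ReLU(), nn.Linear(128, 64)
        )
        # Classifier over the fused (concatenated) representation.
        self.classifier = nn.Linear(64 + 64, num_emotions)

    def forward(self, audio_feats, visual_feats):
        fused = torch.cat(
            [self.audio_encoder(audio_feats), self.visual_encoder(visual_feats)],
            dim=-1,
        )
        return self.classifier(fused)  # logits over emotion classes

# Example: one batch of 4 clips with random placeholder features.
model = AudioVisualEmotionNet()
logits = model(torch.randn(4, 40), torch.randn(4, 512))
predicted = logits.argmax(dim=-1)  # predicted emotion index per clip
```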

Depression spotted

In the emotion analysis, the scientists examined 653 video and audio recordings from the AFEW 8.0 database. The algorithms identified happiness, boredom, irritability, sadness and astonishment with an overall accuracy of 62.48 percent. They were more successful at identifying anger, happiness and neutral expressions, which have more characteristic signatures.

However, they had difficulty with expressions that involve subtler differences, such as boredom and astonishment.
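
The 62.48 percent figure is an overall accuracy, while the per-emotion differences the researchers describe come from measuring accuracy separately for each class. A minimal sketch of that bookkeeping, using entirely hypothetical labels rather than real AFEW results, might look like this:

```python
from collections import defaultdict

def per_class_accuracy(true_labels, predicted_labels):
    """Overall and per-class accuracy for an emotion classifier."""
    totals, correct = defaultdict(int), defaultdict(int)
    for truth, pred in zip(true_labels, predicted_labels):
        totals[truth] += 1
        if truth == pred:
            correct[truth] += 1
    overall = sum(correct.values()) / len(true_labels)
    per_class = {label: correct[label] / totals[label] for label in totals}
    return overall, per_class

# Hypothetical toy labels; a real evaluation would use the AFEW test split.
truth = ["anger", "happiness", "boredom", "anger", "astonishment", "happiness"]
preds = ["anger", "happiness", "astonishment", "anger", "boredom", "happiness"]
overall, per_class = per_class_accuracy(truth, preds)
print(overall)    # 0.666... overall, even though anger and happiness are perfect
print(per_class)  # {"anger": 1.0, "happiness": 1.0, "boredom": 0.0, "astonishment": 0.0}
```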

Last year, Tuka Alhanai, Mohammad Ghassemi and James Glass, three researchers from the Massachusetts Institute of Technology (MIT), developed a neural network method capable of detecting depression in clinical interviews, tracing its signs in people's voices and written responses.

In the study, 142 people answered questions posed by a virtual assistant that was controlled by a human operator. Some of the answers were spoken while others were written. The virtual assistant was not given the questions in advance, and the participants were free to answer them as they wished.

The AI then tried to predict whether people were depressed by reading the hidden clues in the language they used. According to the study's results, the model could determine whether a person was depressed after about seven written answers or 30 spoken ones, and the accuracy of these predictions was reported as 77 percent.
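
A model like this has to read a sequence of answers rather than any single response. Purely as an illustration, and not the MIT researchers' published architecture, a sequence classifier over per-answer embeddings could look like the sketch below; the embedding size, the LSTM and the single-probability output are assumptions.

```python
import torch
import torch.nn as nn

class DepressionSequenceModel(nn.Module):
    """Toy sequence model: reads a sequence of response embeddings and
    outputs the probability that the speaker shows signs of depression."""

    def __init__(self, response_dim=300, hidden_dim=64):
        super().__init__()
        self.lstm = nn.LSTM(response_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, responses):
        # responses: (batch, num_answers, response_dim), one embedding per
        # written or spoken answer in the interview.
        _, (last_hidden, _) = self.lstm(responses)
        return torch.sigmoid(self.head(last_hidden[-1]))  # probability in [0, 1]

# Example: 2 interviews, 7 answers each, 300-dim placeholder embeddings.
model = DepressionSequenceModel()
prob = model(torch.randn(2, 7, 300))
```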


