It’s Too Soon To Trust AI To Analyze Emotion

A prominent group of researchers alarmed by the harmful social effects of artificial intelligence called for a ban on automated analysis of facial expressions in hiring and other major decisions. The AI Now Institute at New York University said action against such software-driven “affect recognition” was its top priority because science doesn’t justify the technology’s use and there is still time to stop widespread adoption.

From the article “Researchers criticise AI software that predicts emotions” on SABC News.

AI technology will enable amazing innovations that enhance how we work and live. But like any technology, in the hands of the irresponsible it also has the potential to do great harm.

The root of the issue has nothing to do with AI or algorithms: it’s the people applying AI in shortsighted ways that we have reason to fear.

At MarketChorus we focus on building AI tools that augment and empower their human operators rather than creating artificial limitations. We’ve very deliberately avoided investing in sentiment analysis for this reason. We believe the technology, and even the science behind sentiment analysis itself, is too nascent to be applied responsibly.

Just because something can theoretically be done doesn’t mean it can be done well enough in practice to start selling it as superior to good old-fashioned human analysis.

We’ll be staying well clear of AI use cases like this…

“How people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation,” wrote a team at Northeastern University and Massachusetts General Hospital.

Human emotion is a fascinatingly complex and nuanced subject that even the most well-trained AI isn’t remotely ready to tangle with in a real-world application.

For more information on this controversial topic, read the original article on SABC News.

