Microsoft Emotion Recognition shuts down

Have you ever heard of Microsoft's emotion recognition? Azure Face, software powered by artificial intelligence (AI), was created to recognize a person's emotions from videos and photos. However, the company has announced that it will retire this feature. Find out why!

Microsoft Emotion Recognition

Activists and academics have raised concerns for years, arguing that facial recognition software claiming to identify a person's gender, age and emotional state can be biased, unreliable and invasive, and for that reason should not be sold.

Microsoft's tool was criticized for trivializing so-called "emotion recognition". According to experts, the facial expressions the application treats as universal actually vary across populations, so it is not possible to equate outward displays of emotion with deeper feelings.

Natasha Crampton, the director in charge of the AI tool, wrote in a post announcing the news that experts both inside and outside the company had pointed to the lack of scientific consensus on the definition of emotions, as well as widespread problems with the results.

Microsoft has already stopped offering the emotion recognition features to new customers. For those who already use the service, access will be revoked by June 2023.

Access to the tool will still be allowed in some cases

The tool can still be used in some situations. From now on, users will need to apply for access to Azure Face, telling Microsoft how, where and for what purpose the systems will be used.

Use cases with less potential for harm, such as automatically blurring faces in videos and images, will remain available.

The decision to discontinue indiscriminate use of the tool is part of a lengthy review of Microsoft's ethics policies on artificial intelligence. The company also intends to restrict access to some other features and to remove others entirely from its list of applications.
