Microsoft Emotion Recognition shuts down

Have you ever heard of Microsoft's emotion recognition? Azure Face, software powered by artificial intelligence (AI), was created to recognize a person's emotions from videos and photos. However, the company has announced that it will retire this feature. Understand the reasons!

Microsoft Emotion Recognition

Activists and academics have raised concerns for years, arguing that facial recognition software claiming to identify a person's gender, age, and emotional state can be biased, unreliable, and invasive, and for that reason should not be sold.

Microsoft's tool was criticized for trivializing so-called "emotion recognition". According to experts, the facial expressions the application treats as universal actually vary across populations, so it is not possible to equate external displays of emotion with deeper feelings.

Natasha Crampton, the executive responsible for the AI tool, wrote in a post announcing the news that experts both inside and outside the company pointed to the lack of scientific consensus on the definition of emotions, as well as widespread problems with the results.

Microsoft has already stopped offering emotion recognition features to new customers. For those who already use the service, access will be revoked by June 2023.

Access to the tool will still be allowed in some cases

The tool can still be used in some situations. From now on, users will need to apply to use Azure Face, telling Microsoft how, where, and for what purpose the systems will be used.

In this way, use cases with less harmful potential (such as automatically blurring faces in videos and images) will retain access.

The decision to discontinue indiscriminate use of the tool is part of a lengthy review of Microsoft's ethics policies regarding artificial intelligence. In addition, the company intends to limit access to some other features and to remove others entirely from its list of applications.
