WhatsApp takes a stand against Apple's child abuse detection tool

A rift between Apple and WhatsApp was among the biggest tech stories of the past week. Apple's new tool for identifying child abuse in photos sparked controversy with the messaging service.

The iPhone feature was announced last Thursday (August 5). Its objective is to identify images that constitute a crime of child abuse or exploitation. However, WhatsApp head Will Cathcart said the feature raises legal concerns and would also violate user privacy.

“I read the information that Apple released yesterday [August 5] and I am concerned. I think this approach is wrong and a setback for the privacy of people around the world. People ask if we are going to adopt this system for WhatsApp. The answer is no,” Cathcart wrote on his Twitter account.

Cathcart posted several messages justifying his concerns about Apple's announcement. According to him, governments and spyware companies could co-opt the software on the iPhone, which would violate user privacy.

The executive also questioned Apple's transparency on the matter. “Can this scanning software on your phone be foolproof? Researchers weren't allowed to find out. Why not? How will we know how often mistakes are violating people's privacy?” he asked in one of the posts.

“I've had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning. This is a really bad idea. These tools will allow Apple to scan your iPhone photos for images that match a specific perceptual hash and report them to the company's servers if too many of them show up,” he added.

Apple said the reports about message scanning are not entirely accurate. Data produced by the security analysis would be stored in an encrypted database, and the company would not be accumulating data from users' photo libraries. That would only happen if criminal material were detected.

Apple aims for the safety of minors

Apple, in turn, said the new feature is part of an update package scheduled for 2021. The package belongs to a new user protection policy launched by the company, whose objective is to identify sexual predators and protect children and adolescents from them.

The new system uses “NeuralHash”, a kind of neural matching: it computes a fingerprint (hash) for each photo, allowing it to detect images whose fingerprints correspond to known child abuse material.
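To illustrate the general idea, here is a minimal sketch of perceptual-hash matching in Python (using the Pillow imaging library). It is not Apple's NeuralHash: it computes a simple “average hash” over an 8x8 grayscale thumbnail instead of using a neural network, and the known_hashes set and helper names are hypothetical.

    # Sketch of perceptual-hash matching; NOT Apple's NeuralHash.
    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to a size x size grayscale thumbnail, then set one bit
        # per pixel depending on whether it is brighter than the mean.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a, b):
        # Number of differing bits between two 64-bit fingerprints.
        return bin(a ^ b).count("1")

    # Hypothetical fingerprints of known abusive images (placeholder values).
    known_hashes = {0x8F3C00000000A5E1, 0x17D2FFFF0000B0C4}

    def matches_known(path, max_distance=4):
        # A photo "matches" if its fingerprint sits within a few bits of any
        # entry in the database, so resized or re-compressed copies of a
        # known image are still caught.
        h = average_hash(path)
        return any(hamming(h, k) <= max_distance for k in known_hashes)

The point of hashing perceptually rather than byte-for-byte is that near-duplicates of a known image land within a few bits of the original fingerprint, while unrelated photos almost never do.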

The problem with the new feature is the risk it may pose to the security of user data. Experts in the field say it could open the door to surveillance of encrypted messages. That is the opinion, for example, of Matthew Green of the Johns Hopkins Information Security Institute.

According to Apple, while the potential for a misread exists, the rate of incorrectly flagged accounts would be less than one in 1 trillion per year.
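For a sense of how a reporting threshold can push the account-level error rate that low, here is a back-of-the-envelope calculation in Python. Every number in it (the per-image false match rate, the photos uploaded per year, the match threshold) is an illustrative assumption, not a parameter Apple has published.

    # Probability that an innocent account crosses a match threshold,
    # modeled as the upper tail of a binomial distribution. All numbers
    # below are assumptions for illustration only.
    from math import comb

    per_image_fp = 1e-6       # assumed chance one innocent photo false-matches
    photos_per_year = 10_000  # assumed photos one account uploads in a year
    threshold = 30            # assumed matches required before a report

    # Sum the Binomial(photos_per_year, per_image_fp) tail from the
    # threshold upward; terms shrink so fast the sum can be truncated.
    p_flagged = sum(
        comb(photos_per_year, k)
        * per_image_fp ** k
        * (1 - per_image_fp) ** (photos_per_year - k)
        for k in range(threshold, threshold + 30)
    )
    print(f"P(innocent account flagged) = {p_flagged:.1e}")  # ~3.6e-93 here

Even though any single photo could be misread, requiring dozens of independent matches before a report makes a false account-level flag astronomically unlikely under these assumptions.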
