September 18, 2021

Facebook Spying on Two Billion WhatsApp Users – Did You Deceive Us About Encryption?

In 2018, when the United States opened its first investigation into Facebook, its founder Mark Zuckerberg told the Senate that all messages and content on WhatsApp were encrypted.

Shockingly, the company appears to have been lying! A detailed report published by ProPublica laid out how content is actually managed on WhatsApp, indicating that the company employs content moderators, that WhatsApp has provided metadata to law enforcement agencies, and that Facebook has shared user data across its family of companies.

Essentially, if you report someone’s message, Facebook is able to read that message, which contradicts its claim that everything is end-to-end encrypted.

WhatsApp, the world’s most popular messaging app with more than two billion monthly active users, says its parent company, Facebook, cannot access conversations between users. However, it has also been reported that Facebook pays more than 1,000 employees worldwide to read and monitor supposedly “private” WhatsApp messages, casting doubt on the social media giant’s privacy practices.

The messaging app has had end-to-end encryption since 2016. However, there are some instances where messages can be read by these moderators.

Apparently, Accenture, under contract with Facebook, employs 1,000 moderators who review user-reported content after it has been triaged by a machine learning algorithm. ProPublica writes that Facebook monitors for spam, disinformation, hate speech, potential terrorist threats, child sexual abuse, extortion, and “sexually oriented acts,” among other things.

How does the process work?

When someone reports a message, even one in a private chat, the reported message, along with the four preceding messages and any photos or videos, is forwarded to WhatsApp, where a machine learning algorithm scans it for suspicious behavior and routes it to a real human for evaluation. WhatsApp moderators have told ProPublica that the app’s AI sends them a huge volume of content, and each reviewer handles up to 600 complaints per day, averaging less than a minute per case.
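To make that data flow concrete, here is a minimal sketch of the report payload described above: the flagged message bundled with up to four preceding messages and their media, assembled on the reporter's own device. All class and function names are hypothetical illustrations, not WhatsApp's actual client code.

```python
# Illustrative sketch only: models the report payload ProPublica describes.
# Every name here is hypothetical, not WhatsApp's real client code.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Message:
    sender: str
    text: str
    media: Optional[bytes] = None  # attached photo or video, if any


@dataclass
class AbuseReport:
    reported: Message        # the message the user flagged
    context: List[Message]   # up to four messages preceding it
    reporter_id: str


def build_report(chat: List[Message], index: int, reporter_id: str) -> AbuseReport:
    """Bundle the flagged message with the four messages that preceded it.

    The reporter's device already holds these messages in plaintext, so no
    encryption has to be broken to assemble and send this payload for review.
    """
    return AbuseReport(
        reported=chat[index],
        context=chat[max(0, index - 4):index],
        reporter_id=reporter_id,
    )
```

The point of the sketch is that decryption happens where it always does, on the user's device; reporting simply forwards already-readable content to a reviewer.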

Depending on the assessment, the report can be dismissed, the user can be banned, or the account can be added to a watch list. Unencrypted data on users in this “proactive” list can then be viewed, including their groups, phone number, unique phone ID, status message, battery level, and signal strength.
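The metadata fields listed above amount to a small account profile that reviewers can reportedly consult without touching message content. Below is a rough sketch of that record and the possible review outcomes; the field and enum names are hypothetical and not drawn from any real WhatsApp schema.

```python
# Illustrative sketch only: the kinds of unencrypted account data ProPublica
# says reviewers can see for accounts under "proactive" scrutiny.
# Field and enum names are hypothetical.
from dataclasses import dataclass
from enum import Enum
from typing import List


class ReviewOutcome(Enum):
    DISMISS = "dismiss"  # report rejected, no action taken
    BAN = "ban"          # account blocked
    WATCH = "watch"      # account added to the proactive watch list


@dataclass
class AccountMetadata:
    groups: List[str]      # group memberships
    phone_number: str
    device_id: str         # unique phone ID
    status_message: str
    battery_level: float   # fraction of charge reported by the device
    signal_strength: int   # e.g. a dBm reading
```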

The company is also known to share some private data with law enforcement agencies. Furthermore, ProPublica claimed that WhatsApp user data helped prosecutors build a high-profile case against a Treasury Department employee who leaked confidential records to BuzzFeed News exposing how suspect money flows through US banks.

For example, WhatsApp chief Will Cathcart stated in an op-ed in Wired earlier this year that the company submitted “400,000 reports to child safety authorities last year and people have been prosecuted as a result.”

All of these practices are mentioned in WhatsApp’s privacy policy, according to ProPublica, but you have to look hard to find them!

In response to the report, a WhatsApp spokeswoman told The Post, “WhatsApp provides a way for people to report spam or abuse, which includes sharing the latest messages in chat. This feature is important to prevent the worst abuse online. We strongly oppose the notion that accepting reports that a user chooses to send to us is incompatible with end-to-end encryption.”