Relax! Your Messages Are Encrypted, Only Abuse Reports Are Inspected: WhatsApp
A WhatsApp spokesperson has clarified that WhatsApp inspects users' messages only when another user reports a particular message as abusive. Such a report forwards the message to WhatsApp, allowing the company to check for abusers and spammers. The spokesperson says this does not break the platform's end-to-end message encryption, as voluntary abuse reports are sent to the company only in exceptional circumstances. ProPublica's source report also reflects WhatsApp's stance on the matter. Find the original report below.
Social media giant Facebook touts WhatsApp as a secure messaging platform where user chats are end-to-end encrypted. However, a recent report has found that WhatsApp allows content moderators to access user messages in certain cases. According to the report by ProPublica, more than 1,000 hourly workers are employed in office buildings in Austin, Texas; Dublin; and Singapore. These workers, according to the report, can view only messages that users have reported. In other words, the moderators can see a user's messages, images and videos only when the recipient presses the report button to flag the message to WhatsApp.
The ProPublica report says that this message review is one element in a broader monitoring operation in which the company also examines material that is not encrypted, including data about the sender and their account. A 49-slide internal marketing presentation from December 2020 obtained by ProPublica emphasizes the "fierce" promotion of WhatsApp's "privacy narrative" and compares the brand's character to "the immigrant mother." The marketing material makes no mention of the company's content moderation efforts.
WhatsApp communications director Carl Woog acknowledged that contractor teams in Austin and elsewhere review WhatsApp messages to identify and remove abusers. However, he told the publication that Facebook does not consider this work to be content moderation. "The decisions we make around how we build our app are focused around the privacy of our users, maintaining a high degree of reliability and preventing abuse," Woog said in the report.