LightBlog

Tuesday, September 7, 2021

[Update: Facebook issues clarification] Facebook can reportedly read some of your WhatsApp messages

Update (09/07/2021 @ 09:08 ET): Facebook issues clarification. Scroll to the bottom for more information. The article as published on September 7, 2021, is preserved below.

After WhatsApp shared its updated terms and privacy policy earlier this year, several rumors started circulating online, claiming that the company could read your private messages and share their contents with Facebook. WhatsApp vehemently denied these rumors and insisted that, thanks to end-to-end encryption, neither it nor Facebook could read your messages or listen to your calls on the platform. It even took the opportunity to take a dig at Telegram, which doesn’t enable end-to-end encryption by default. However, a new report claims that both Facebook and WhatsApp can, in some circumstances, view the contents of your private messages.

The damning report comes from ProPublica, a non-profit investigative journalism organization with a solid track record. It claims (via 9to5Mac) that both Facebook and WhatsApp can view the contents of your private WhatsApp messages. The report notes:

[An] assurance automatically appears on-screen before users send messages: “No one outside of this chat, not even WhatsApp, can read or listen to them.”

Those assurances are not true. WhatsApp has more than 1,000 contract workers filling floors of office buildings in Austin, Texas, Dublin and Singapore, where they examine millions of pieces of users’ content. Seated at computers in pods organized by work assignments, these hourly workers use special Facebook software to sift through streams of private messages, images and videos that have been reported by WhatsApp users as improper and then screened by the company’s artificial intelligence systems. These contractors pass judgment on whatever flashes on their screen — claims of everything from fraud or spam to child porn and potential terrorist plotting — typically in less than a minute.

Since WhatsApp maintains that it uses end-to-end encryption, the aforementioned moderators shouldn’t be able to see the contents of your messages. That’s because end-to-end encryption should mean that only the sender and the recipient have the ability to decrypt messages. But that doesn’t seem to be the case.
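To see why that claim raised eyebrows, it helps to picture what end-to-end encryption implies. The sketch below is a deliberately simplified illustration, not WhatsApp’s actual protocol (which is built on the Signal protocol), and the function names are invented for this example: the key exists only on the two devices, so any server relaying the message handles nothing but ciphertext.

# A simplified illustration of end-to-end encryption, not WhatsApp's real
# protocol: only the two endpoints ever hold the key, so the server that
# relays the message sees nothing but ciphertext.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Hypothetical shared key known only to the sender and the recipient
# (real messengers negotiate per-message keys with a key-agreement protocol).
shared_key = Fernet.generate_key()

def sender_device(plaintext: str) -> bytes:
    # Encrypt on the sender's device before anything leaves it.
    return Fernet(shared_key).encrypt(plaintext.encode())

def relay_server(ciphertext: bytes) -> bytes:
    # The relay only ever handles an opaque token it cannot decode.
    print("server sees:", ciphertext[:32], b"...")
    return ciphertext

def recipient_device(ciphertext: bytes) -> str:
    # Only a device holding the key can turn the token back into text.
    return Fernet(shared_key).decrypt(ciphertext).decode()

print("recipient reads:", recipient_device(relay_server(sender_device("hello"))))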

The report further notes:

Because WhatsApp’s content is encrypted, artificial intelligence systems can’t automatically scan all chats, images and videos, as they do on Facebook and Instagram. Instead, WhatsApp reviewers gain access to private content when users hit the “report” button on the app, identifying a message as allegedly violating the platform’s terms of service. This forwards five messages — the allegedly offending one along with the four previous ones in the exchange, including any images or videos — to WhatsApp in unscrambled form, according to former WhatsApp engineers and moderators. Automated systems then feed these tickets into “reactive” queues for contract workers to assess.

In response to the report, a WhatsApp spokesperson said: “We build WhatsApp in a manner that limits the data we collect while providing us tools to prevent spam, investigate threats, and ban those engaged in abuse, including based on user reports we receive. This work takes extraordinary effort from security experts and a valued trust and safety team that works tirelessly to help provide the world with private communication.” While the spokesperson didn’t directly address the alleged lack of end-to-end encryption, they added, “Based on the feedback we’ve received from users, we’re confident people understand when they make reports to WhatsApp we receive the content they send us.”

If the details mentioned in the ProPublica report are accurate, both Facebook and WhatsApp could get into some serious trouble. 9to5Mac speculates that there might have been some misunderstanding during the investigation and that the moderators could be reviewing Facebook messages rather than WhatsApp messages. But ProPublica claims that WhatsApp’s director of communications, Carl Woog, “acknowledged that teams of contractors in Austin and elsewhere review WhatsApp messages to identify and remove ‘the worst’ abusers.” Woog also told ProPublica that the company doesn’t consider this work to be content moderation, adding, “We actually don’t typically use the term for WhatsApp.”

Furthermore, the report cites a confidential whistleblower complaint filed last year with the U.S. Securities and Exchange Commission to solidify its claims. The complaint details WhatsApp’s use of external contractors, AI systems, and account information to “examine user messages, images and videos,” and it alleges that the company’s claims of protecting users’ privacy are false. The SEC hasn’t taken any public action on this complaint.

It’s worth noting that the ProPublica report clarifies that WhatsApp moderators only get access to reported messages. Be that as it may, neither WhatsApp nor Facebook should be able to see the contents of your messages if they’re truly end-to-end encrypted.


Update: Facebook issues clarification

Facebook has issued a clarification, stating that WhatsApp moderators are only able to read messages that are reported by users. This behavior is clearly specified in its privacy policy. We apologize for the confusion.

In a statement to 9to5Mac, the company further explained that when you use WhatsApp’s Report feature, the reported message is automatically forwarded to Facebook, along with the four preceding messages from the same chat. Moderators can then review them, which gives them enough context to evaluate the offending message. The company maintains that it can’t see any messages that haven’t been reported.
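Based on these descriptions, the reporting flow appears to work roughly like the hypothetical sketch below. This is not WhatsApp’s actual code, and every name in it is invented for illustration: because the reporting user’s own device has already decrypted the chat, it can bundle the offending message plus up to four preceding plaintext messages and upload only that bundle for review, leaving the encryption of everything else untouched.

# Hypothetical sketch of the report flow described above -- not WhatsApp's
# actual implementation; all names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    plaintext: str  # already decrypted on the reporting user's device

def build_report(chat_history: list[Message], reported_index: int) -> dict:
    # Bundle the reported message plus up to four preceding messages.
    # The plaintexts are available because this device decrypted them on
    # receipt; nothing here weakens encryption of messages in transit.
    start = max(0, reported_index - 4)
    window = chat_history[start:reported_index + 1]
    return {
        "reported_user": window[-1].sender,
        "reported_message": window[-1].plaintext,
        "context": [m.plaintext for m in window[:-1]],
    }

# Example: reporting the most recent message in a short chat.
history = [Message("spammer", f"message {i}") for i in range(6)]
print(build_report(history, reported_index=5))  # only this bundle is uploaded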

The post [Update: Facebook issues clarification] Facebook can reportedly read some of your WhatsApp messages appeared first on xda-developers.



