A new campaign wants to see child abuse protection in messaging apps before end-to-end encryption
A new UK government-backed campaign called No Place to Hide is appealing to tech giants such as Google and Meta to hold off on implementing end-to-end encryption (E2EE) in their messaging services until they have found ways to protect children from abuse (via BBC). E2EE encrypts messages so that only the sender and the receiver can read them; not even the app makers or law enforcement can access their contents.
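For readers unfamiliar with the mechanics, the short sketch below illustrates the property described above: with E2EE, only the two endpoints hold the keys needed to read a message, so a service that merely relays the ciphertext cannot decrypt it. The sketch is written in Python with the PyNaCl library; the library choice and names are illustrative assumptions, not a description of how WhatsApp, Signal, or Meta's apps are actually implemented (those use more elaborate protocols such as the Signal protocol).

```python
# Minimal end-to-end encryption sketch using PyNaCl (illustrative only).
from nacl.public import PrivateKey, Box

# Each party generates a key pair on their own device; private keys never
# leave those devices.
sender_key = PrivateKey.generate()
recipient_key = PrivateKey.generate()

# The sender encrypts with their private key and the recipient's public key.
sender_box = Box(sender_key, recipient_key.public_key)
ciphertext = sender_box.encrypt(b"hello, this stays between us")

# The messaging service only ever handles 'ciphertext'; without one of the
# private keys it cannot recover the plaintext.
recipient_box = Box(recipient_key, sender_key.public_key)
print(recipient_box.decrypt(ciphertext))  # b'hello, this stays between us'
```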
What is the concern behind the campaign?
The No Place to Hide campaign is concerned that if tech companies implement E2EE in their messaging apps, detecting child abusers will become harder, because the companies will no longer be able to scan messages for child abuse material. A spokesperson for the campaign said that implementing E2EE would be "like turning the lights off on the ability to identify child sex abusers online."

Social media platforms that don't use E2EE currently detect and report child sexual abuse content to law enforcement. Because they can see the messages sent through their apps, they can notice when a potential abuser is using their services to target children.
Is the concern of the No Place to Hide campaign justified?
The concern may be justified. Data shared by the US National Center for Missing and Exploited Children (NCMEC) shows that in 2020, 21.7 million reports of child sexual abuse were made across social media. According to the NCMEC, 14 million reports of possible child sex abuse could be missed if tech companies roll out E2EE to their messaging platforms.
A report from March 2021 shared with the National Society for the Prevention of Cruelty to Children (NSPCC), a British child protection charity, shows that from October 2019 to September 2020 there were over 9,470 child abuse cases in England, Wales, Scotland, and the Channel Islands, and that 52% of these took place on Facebook-owned apps.

The NSPCC has also expressed concern that if Meta rolls out E2EE to all of its messaging apps without safeguards to protect children, many child abuse cases may go undetected.
What is Meta's opinion regarding the E2EE implementation and child safety?
Meta's position is that it can implement E2EE in its messaging apps while still using methods to keep children safe. In November 2021, Meta said these methods might include:
- The "proactive detection technology" that Facebook currently uses. This technology will search for suspicious activity and take action against suspected accounts. For example, if someone creates several accounts or sends messages to a large number of unknown users, Facebook will detect and ban these accounts or persons.
- Making the accounts of those under the age of 18 private or "friends only" and limiting the messages that can be received from unknown adults.
- Teaching young people through in-app tips how to avoid unwanted interactions.
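As a hypothetical illustration of the first point, the Python sketch below flags an account that has recently messaged many users it has no existing connection with. The data model, field names, and threshold are assumptions made for the example and do not describe Meta's actual systems; the point is that a check like this relies only on metadata (who messaged whom), so it could in principle still operate once message contents are encrypted.

```python
# Hypothetical behaviour-based signal: flag accounts that message unusually
# many users they have no prior connection with (illustrative threshold).
from dataclasses import dataclass, field

@dataclass
class Account:
    user_id: str
    friends: set = field(default_factory=set)               # existing connections
    recent_recipients: list = field(default_factory=list)   # users messaged recently

def flag_suspicious(account: Account, max_unknown_recipients: int = 20) -> bool:
    """Return True if the account messaged more unknown users than the threshold."""
    unknown = {r for r in account.recent_recipients if r not in account.friends}
    return len(unknown) > max_unknown_recipients

# Example: an account that messaged 35 strangers in a short window gets flagged.
acct = Account(user_id="u123",
               friends={"alice", "bob"},
               recent_recipients=[f"stranger{i}" for i in range(35)])
print(flag_suspicious(acct))  # True
```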
Is it really possible to use E2EE and, at the same time, keep children safe?
Experts say it is probably impossible to offer truly secure encryption and, at the same time, scan messages for child abuse content.

Ciaran Martin, the former head of the National Cyber Security Centre, has described the idea of having E2EE while also giving law enforcement targeted access to messages as "technological 'cakeism'": the two are simply incompatible.
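To make the tension concrete, here is a toy Python example, not any platform's real scanner: a naive keyword filter that works on plaintext has nothing to match once it only sees ciphertext. The blocklisted keyword is a placeholder, and the symmetric Fernet scheme from the cryptography package simply stands in for whatever encryption the two endpoints would use.

```python
# Toy demonstration: content scanning works on plaintext but not on ciphertext.
from cryptography.fernet import Fernet

BLOCKLIST = [b"forbidden-keyword"]  # placeholder for real detection signals

def naive_scan(data: bytes) -> bool:
    """Return True if any blocklisted keyword appears in the data."""
    return any(keyword in data for keyword in BLOCKLIST)

message = b"this message contains a forbidden-keyword"
print(naive_scan(message))       # True  -- the platform can flag readable content

key = Fernet.generate_key()      # under E2EE, only the endpoints would hold keys
ciphertext = Fernet(key).encrypt(message)
print(naive_scan(ciphertext))    # False -- the ciphertext reveals nothing to scan
```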
The UK's Home Office, however, said in a statement: "The UK government supports encryption, and believes that E2EE can be implemented responsibly in a way which is consistent with public safety. Our view is that online privacy and cyber-security must be protected, but that these are compatible with safety measures that can ensure the detection of child sexual exploitation and abuse."
Are there apps that currently use E2EE?
Apps like WhatsApp and Signal are already using E2EE, and Meta is expected to introduce E2EE into its Facebook Messenger and Instagram apps in 2023.