By Foo Yun Chee and Charlotte Van Campenhout
BRUSSELS (Reuters) - Google, Meta and other online service providers will be required to find and remove online child pornography under proposed European Commission rules, a move some privacy groups say could put people’s communications at risk.
The EU executive said its proposal, announced on Wednesday, aimed to replace the current system of voluntary detection and reporting by companies, which has proved insufficient to protect children.
It cited the more than one million reports of child sexual abuse in the 27-country bloc in 2020, with the COVID-19 pandemic a factor in a 64% rise in such reports in 2021 compared with the previous year. On top of that, 60% of child sexual abuse material worldwide is hosted on EU servers.
“The proposed rules introduce an obligation for relevant online service providers to assess the risk of their services’ misuse for the dissemination of child sexual abuse materials or for the solicitation of children (grooming),” the Commission said in a statement.
The companies will then have to report and remove known and new images and videos, as well as cases of grooming. An EU Centre on Child Sexual Abuse will be set up to act as a hub of expertise and to forward reports to the police.
The rules will apply to hosting services and interpersonal communication services such as messaging services, app stores and internet access providers.
WhatsApp, the Meta-owned messaging service, echoed those privacy concerns.
“Incredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption,” Will Cathcart, head of WhatsApp, said in a tweet.
The draft EU rules need to be thrashed out with EU countries and EU lawmakers before they can become law.
(Reporting by Foo Yun Chee and Charlotte Van Campenhout; Editing by John Stonestreet and Mark Potter)