* Any views expressed in this opinion piece are those of the author and not of Thomson Reuters Foundation.
As the debate between privacy and child-protection advocates heats up among lobbyists and legislators in the UK and Europe, what does the public think?
By John Carr, ECPAT senior advisor
From artificial intelligence to the metaverse, from age verification for access to social media to privacy angst, where is tech taking us? Can social media ever really be considered ‘safe’ for children?
2021 has been momentous in many ways. In the European Union, the legislative framework governing the rights, responsibilities, and obligations of online platforms has been brought sharply into focus. At times, there has been dramatic rhetoric revealing seemingly intransigent, opposing views on ‘our’ collective appetite for tech-related societal risks.
Following intense lobbying and campaigning by child rights organisations, the European Union finally agreed a temporary derogation from the ePrivacy Directive, allowing companies that had stopped scanning their platforms for child sexual abuse and exploitation in December 2020 to resume doing so. But the fact that the derogation is "temporary" means the European Union must now turn its mind to a permanent or long-term solution. As we speak, the Commission is drafting that proposal.
At the moment, online platforms are able to identify criminal and potentially harmful behaviour on their messaging services, not by reading the content of the messages, but by detecting known, underlying patterns of criminal content or criminal behaviour. End-to-end encryption could bring that to an end: while the messages themselves might become more secure, it would also grant bad actors anonymity and a potential gateway to target and harm children.
In the name of improving privacy, Facebook, now part of Meta, had planned to introduce end-to-end encryption on Messenger and Instagram Direct. It has now announced a delay until 2023, saying it wants to ensure it can implement the technology safely. This is very welcome, but it is difficult to see how that can be done if the company proceeds in the way originally described.
Apple recently came up with a great idea. The company recognised that its already encrypted services could be providing a safe haven for child offenders by granting them anonymity. It therefore decided to go a step further in protecting children in the digital space, introducing privacy-respecting technology that would enable it to detect known child sexual abuse material on devices before images are uploaded to iCloud. This was not a wholly original idea, but given the context it was definitely radical. Children's groups from around the world applauded the decision.
However, Apple mishandled the communications around this new child protection approach, which has forced the company to pause and regroup. It is vital that Apple follow through on its original plan. If Apple folds, Meta will feel under no pressure to abandon its disastrous idea. If Apple doesn't fold, Meta will have to follow its lead, as will the tech sector as a whole. That would be tremendous news for children everywhere.
ECPAT International is the world’s largest network of civil society organisations fighting to end the sexual exploitation of children. A core part of its work involves advocating to ensure children are protected from harm on the internet. While focusing intensely on the European Union legislation, we realised that somehow the public’s voice, including those of parents and caregivers, was being lost or ignored.
We wanted to hear their thoughts, so we asked: ‘What does the public really think about the issues of privacy and child protection online?’ We commissioned YouGov to carry out representative polling in eight European Union Member States.
We discovered that adults were very clear. Parents and non-parents alike want children protected, even if it means losing some of their own privacy. The research showed widespread public support: 68% favoured the European Union strengthening legislation to require tech companies to turn on automated tools that can detect and identify images of child sexual exploitation, and 76% of respondents were willing to give up some of their personal privacy online to allow automated tools to scan for images of child sexual abuse and detect other forms of sexual exploitation of children. So why aren't we making sure it happens?
We know that technology isn’t perfect and that it can be misused, but ultimately, we must not forget that every child has the right to be protected online. This is written in EU and international law, and it is reflected in the public voice.
It’s time for us to keep children safe online, everywhere and all the time. EU institutions and tech platforms must listen to what the public wants, and the message is clear: constituents and users want the tools that protect children from harm turned on. Let’s make 2022 the year we listen and act to keep children safe.