Last Monday I had the privilege of being invited to 10 Downing Street to meet with the Home Secretary and Technology Minister for a round table bringing together law enforcement, NGOs, and the tech sector to discuss the problem of Online Child Sexual Exploitation and Abuse. I was there in my capacity as Chair of OSTIA (the Online Safety Technology Industry Association), and I was pleased to be asked by the Technology Minister to talk about the capabilities our members are creating that can help with implementation of measures required by the bill, particularly in the context of End to End Encrypted (E2EE) messaging.

While the overall challenge of keeping children safe online is an enormous one, I was pleased to be able to describe specific solutions from OSTIA members to key parts of the challenge. One of these is Cyacomb Safety’s capability to detect known Child Sexual Abuse Material (CSAM) in E2EE messaging while completely preserving user privacy, enabling messaging companies to take appropriate action. The importance of this type of solution was underlined by remarks from the Home Secretary, NCA, NSPCC and IWF about the impact of known CSAM on normalising offender behaviour and revictimising survivors of child abuse.

The position of the government on the Online Safety Bill appears to be clear at this point, as reaffirmed by the bill passing its third reading in the House of Lords last week. The bill will apply to E2EE just as it does to all other spaces, while recognising the greater technology and privacy challenges of operating in such an environment. That is precisely what our technology does, and it calls to mind the remarks of the ICO after a project we worked on together, saying our “efforts to establish ‘data protection by design and default’ are what responsible innovation looks like.”

Last week I was also asked to speak to a large group of MEPs and EU policymakers considering new EU regulations to protect children online. I was asked to explain the capabilities of Cyacomb Safety, to help them understand the feasibility of applying regulation to E2EE environments. The EU is still considering a fairly broad range of options, from those with far-reaching implications for the future of privacy and cybersecurity through to almost abandoning child safety regulation in E2EE altogether. I was delighted to be able to talk about a specific and demonstrable solution that could improve child safety while preserving privacy, and from the number of questions I received I think the audience found it a thought-provoking topic. More on Cyacomb Safety and combatting online child sexual abuse while protecting privacy can be found in our whitepaper.

Both in Downing Street and with the EU, the need to protect children was at the heart of the conversation, as was the desire of citizens to see appropriate protections in place. This often seems to be overlooked in the media, where the counter-narrative to safety frequently comes from privacy advocates claiming to represent the interests of the public. When you ask the UK public what they want, the answer is clear, as NSPCC data shows: 60% think it should be a legal requirement to scan for CSAM in private messaging, and 79% think that technology to do this in E2EE should be developed where it doesn’t already exist. Data for the EU is very similar.

People want to be protected from harm while carrying out their lives online, and there are solutions to deliver just that. I’m delighted to see Ministers, MPs, MEPs and policymakers across the UK and EU making good progress towards regulating for a safer online world, especially for children. This is a pivotal moment in the tech landscape, and there is hope.


If you'd like to know more about Cyacomb's work in this space then please contact Ian Stevenson (ian.stevenson@cyacomb.com).