The UK government has repeatedly stated that it wishes to make the UK the safest place in the world to go online by means of the Online Safety Bill. 
Cyacomb CEO Ian Stevenson has written to the Secretary of State for Digital, Culture, Media and Sport because, in the key area of Encrypted Messaging, the UK is currently behind most other countries in the world, and it’s our understanding that the Online Safety Bill will not change this.  There appears to be a disconnect between the stated intent of government and the effect of the proposed Bill. 
Read Ian’s full letter here

The intentions we heard from Ministers with the Tech and Digital Economy brief have been clear.

“We won't allow capability to detect CSAM to be degraded or removed by introduction of encryption”
“There is an insistence from government (and it will be an insistence) that comparable detection is in place in E2EE messaging”

It is not clear that these intentions are reflected in the Online Safety Bill as it now stands. 

Through the Safety Tech Challenge Fund (STCF), DCMS and the Home Office have invested in the creation and validation of new technologies that can prevent the spread of illegal content on End-to-End Encrypted messaging platforms, with a focus on Child Sexual Abuse Material (CSAM).   

STCF suppliers received written advice from the Information Commissioner’s Office stating that under the Privacy and Electronic Communications Regulations (PECR) any deployment of such CSAM detection technology for UK users on a messaging platform would require explicit opt-in consent from the user.  It seems obvious that criminals seeking to exchange CSAM will not choose to consent, and therefore that any deployment of the technology for UK users would have no positive impact (and therefore would not take place). 

PECR derives from EU regulation, and the requirement for opt-in consent for CSAM detection appears to have been an unintended consequence of the regulations.  The EU has subsequently issued a temporary derogation from these rules for the purposes of detecting CSAM.  EU Commissioner Ylva Johansson is leading on new EU legislation to succeed the temporary derogation with new EU regulations to ensure the fight against CSAM can continue. 

Since the UK has left the EU, the EU derogation does not apply here, and the UK government has not implemented any equivalent. 

If a company offering Encrypted Messaging services in the UK wished to implement CSAM detection technologies such as those that came out of the UK Government STCF, it would be able to do so effectively in the EU and most of the rest of the world.  It is only in the UK that regulations would require user opt-in, rendering deployment pointless. 

If the Online Safety Bill were passed in its current form, a platform offering Encrypted Messaging services to users and wishing to offer leading edge safety features would not be able to implement CSAM detection without requiring explicit opt-in consent from UK users, even if the proposed technology could comply with all aspects of established Data Protection (and other) regulations. 

The amended Bill fails to effectively mandate CSAM detection, and is not even sufficient to make it practical for companies to offer it voluntarily.  This appears to be inconsistent with the stated intentions of the UK Government, both in the statements of Ministers and in the UK Government Factsheet on the Online Safety Bill:  

“All platforms in scope will need to tackle and remove illegal material online, particularly material relating to terrorism and child sexual exploitation and abuse.” (1)  

A year or so ago this might have seemed like a fairly abstract issue as there were no technically feasible solutions to the problem. However, innovation in this area has been continuing at pace and as a result technically feasible solutions are now available. Cyacomb was one of the participants in the STCF programme, and received mentorship and support from DCMS, Home Office, GCHQ, and the ICO in developing a technically feasible solution. As a result of this programme and subsequent work we now have a technology that can achieve this aim while completely preserving user privacy (it is not simply a back door to encryption).   

The credentials of this technology are excellent.  It is based on the class of technology described and advocated in the recent paper (2) from Dr Ian Levy and Crispin Robinson (both of GCHQ).  We also have detailed written advice from the Information Commissioner’s Office on the interactions of this technology with data protection law, as a result of which we believe there are no fundamental legal barriers to deployment of the technology.  In the words of Dan Sexton, CTO of the Internet Watch Foundation: 

“Cyacomb has demonstrated that the detection and blocking of known child sexual abuse material in E2EE environments is no longer an engineering problem.” 

We believe that, thanks in no small part to the STCF, the question of “what do we do about CSAM in Encrypted Messaging” has moved from being a technical problem to a policy choice.  The choice the UK government appears to be making in the Online Safety Bill, in direct contradiction with its publicly stated goals, is to leave CSAM in Encrypted Messaging essentially unchallenged.  I hope this is an oversight that can be rectified. 

We are calling on the UK Government to remove the PECR barrier to offering effective CSAM detection and blocking in messaging, where compliance with all other regulations (including Data Protection) can be achieved.  This could be done through a mechanism similar to the EU’s PECR derogation, which has already corrected this unintended consequence there, or through a provision of the Online Safety Bill. 

Furthermore, we are calling on the UK Government to implement its own commitment by requiring detection of CSAM in Encrypted Messaging in the Online Safety Bill.  As more platforms implement encryption, the provisions of the Bill relating to CSAM (and indeed other encrypted content) will become increasingly meaningless if encrypted platforms are left out of scope.

To pass the Online Safety Bill in a form that does not allow for the possibility of secure, privacy-protecting CSAM detection solutions, now or in the future, would be to miss a crucial opportunity to do enormous good.
