I recently read this BBC article. It explains why it is impossible to prevent illegal content from spreading through encrypted messaging, and why Signal would leave the UK market if forced to weaken its encryption. It ends with the following quote:
"the idea that complex social problems are amenable to cheap technical solutions is the siren song of the software salesman".

It’s a powerful statement, and of course it’s absolutely true.

That doesn’t mean that “cheap technical solutions” can’t make important contributions, however. In fact, I’d suggest they can improve matters considerably, even for complex social problems. For example, seatbelts have not “solved” deaths on our roads, but they have certainly saved a lot of lives.

The idea that complex social problems are amenable to simplistic soundbites is dangerously naive. It has led the debate between governments and encryption advocates to lapse into a simplistic “false binary” – one where we can apparently have either safety or encryption, but not both. This certainly simplifies political soundbites, but it also ignores the reality of technological progress.

This “false binary” arises precisely because encryption, safety and privacy are complex social problems as well as complex technological ones. Any technology capable of addressing them will necessarily lie beyond intuitive or simplistic understanding.


There are parallels with the evolution of bank or credit cards.

Initially, credit cards relied on a number embossed on the card. Every transaction slip contained enough data to duplicate the card, so it was relatively easy for criminals to create clones and use them to steal goods.

The second generation of credit cards used magnetic stripes to hold the necessary data. This was essentially “security through obscurity”: at first, most criminals didn’t have access to machinery that could read and write magnetic stripes, and so couldn’t duplicate cards and commit fraud. Of course, criminals caught up when that technology came down in cost.

The premise behind both of these approaches was that retailers needed to be able to read a card quickly and easily for convenience. So security was not possible: if retailers can quickly and easily read the data, so can criminals. The cost of this fraud was simply accepted and built into the system. The idea that you can’t have both convenience and security became well established, forming a “false binary” similar to the one we see today with Encryption and Online Safety.

The current generation of credit cards uses a chip built into the card. Cryptographic techniques (protected by a PIN) make it possible for a card to confirm the validity of a transaction to a retailer without “leaking” enough information to allow criminals to duplicate the card. Most people have little or no understanding of how this technology works, because it relies on complex and counterintuitive principles of encryption. Yet we are willing to put our faith in it, even for something as important as financial security.

Chip and PIN technology solved the problem, but not by resolving the “false binary”. Put another way, no one found a “magic” way of making the card easy to read for retailers and difficult to read for criminals.

Instead, innovators challenged the premise that “reading the card” was the right requirement to build to.  Chip and PIN goes beyond “reading” a card, and instead enables “conversation” between the card and the bank. This conversation uses cryptographic techniques that allow the bank to confirm the card is genuine without revealing enough information for duplication. 
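The real EMV protocol is far more involved, but the core idea of that “conversation” – proving the card is genuine without ever revealing the secret that makes it genuine – can be sketched as a simple challenge–response exchange. The code below is an illustrative toy, not the actual EMV scheme; the function names and the use of HMAC are my own simplifications.

```python
import hashlib
import hmac
import secrets

# Shared secret: stored in the chip's secure element and at the bank.
# It is never transmitted, so a transaction transcript cannot be used
# to clone the card -- unlike an embossed number or a magnetic stripe.
CARD_KEY = secrets.token_bytes(32)

def bank_issue_challenge() -> bytes:
    """The bank sends a fresh random nonce for each transaction."""
    return secrets.token_bytes(16)

def card_respond(key: bytes, challenge: bytes, amount_pence: int) -> bytes:
    """The chip computes a MAC over the challenge and the amount."""
    msg = challenge + amount_pence.to_bytes(8, "big")
    return hmac.new(key, msg, hashlib.sha256).digest()

def bank_verify(key: bytes, challenge: bytes, amount_pence: int,
                response: bytes) -> bool:
    """The bank recomputes the MAC and compares in constant time."""
    expected = card_respond(key, challenge, amount_pence)
    return hmac.compare_digest(expected, response)

# One transaction: the retailer only relays messages. It never learns
# the key, and a recorded response fails against a fresh challenge.
challenge = bank_issue_challenge()
response = card_respond(CARD_KEY, challenge, 4999)
assert bank_verify(CARD_KEY, challenge, 4999, response)
assert not bank_verify(CARD_KEY, bank_issue_challenge(), 4999, response)
```

Notice that an eavesdropper who records the whole exchange learns only the challenge and the response, never the key – which is exactly why this generation of cards cannot be cloned from a transaction transcript.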

When the idea was first mooted, detractors claimed it was magical or wishful thinking – because their mental model was rooted in a false set of assumptions about the nature of the problem.

This analogy is, of course, an oversimplification. However, I believe it illustrates how these “false binary” arguments can arise.


Much of the debate around Online Safety and Encryption is missing key information. This is leading to the “false binary” that products and platforms can EITHER have safety capabilities OR have strong user privacy protections.

We have developed technology, working with the UK Government (through the Safety Tech Challenge Fund) and the Internet Watch Foundation, that can detect and block CSAM (Child Sexual Abuse Material) in E2EE (End-to-End Encrypted) messaging environments without compromising user privacy.
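To be clear about what “detecting known content without compromising privacy” can mean in general terms – and this is a deliberately simplified sketch of the broad approach, not a description of our actual implementation – the general shape is an on-device check against a list of digests of known material before a message is encrypted and sent. Real systems use perceptual hashes that survive re-encoding and resizing; the plain SHA-256 and the function names below are placeholders for that idea.

```python
import hashlib

# Illustrative only: a digest list of known abuse material, of the kind
# maintained by a body such as the IWF. The stand-in entry below exists
# purely so both code paths can be exercised.
KNOWN_CONTENT_HASHES = {
    hashlib.sha256(b"stand-in for known material").hexdigest(),
}

def is_known_content(image_bytes: bytes) -> bool:
    """Check an outgoing image, on the device, before encryption.

    Because the check runs locally against a digest list, the message
    content itself is never revealed to the platform or anyone else."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_CONTENT_HASHES

def send_message(image_bytes: bytes) -> str:
    if is_known_content(image_bytes):
        return "blocked"            # known material: stopped before sending
    return "encrypted-and-sent"     # everything else: E2EE as normal

assert send_message(b"holiday photo") == "encrypted-and-sent"
assert send_message(b"stand-in for known material") == "blocked"
```

The point of the sketch is simply that detection and end-to-end encryption are not logically exclusive: the check happens before encryption, on the device, and reveals nothing about content that does not match.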

This solution has been closely examined by security experts at GCHQ, who subsequently published the paper “Thoughts on Child Safety on Commodity Platforms”. The Home Secretary made an amendment to the UK Online Safety Bill to bring End-to-End Encrypted Messaging into scope and credited the Safety Tech Challenge Fund work as enabling this. 

Our solution has also been examined in detail by the UK Information Commissioner’s Office, which has both written publicly on the subject and provided us directly with written advice confirming that, in the UK, there are no fundamental barriers to deployment from a data protection perspective. As the UK currently retains privacy legislation similar to the EU’s, this advice effectively covers the EU as well.

This is a complex area and what we offer is not a complete solution to the problem of child abuse online. It is however a significant solution to the sharing of known child abuse content.  While I am, of course, a software salesman at some level, and my “siren song” should be treated with caution, I believe that the new class of technology we have developed can, and should, be considered in the debate. 

Debating this issue without an understanding of new technologies like ours is the equivalent of settling for magnetic stripes out of a refusal to believe in the potential of “Chip and PIN”.


While we’re taking an analytical look at the subject, it’s worth reviewing that earlier quote through a different lens:

"the idea that complex social problems are amenable to cheap technical solutions is the siren song of the software salesman".

The platforms offering End-to-End Encryption are, of course, also “software salesmen”. While they position themselves as solving problems for society, they also cause them. Protecting journalists, whistle-blowers and brave campaigners in repressive regimes is a worthy mission – one I wholeheartedly support, and one our technology does nothing to compromise. Nonetheless, this seductive privacy-focused “siren song” comes at the cost of exacerbating other “complex social problems” – enabling criminals and abusers to operate with impunity at huge scale.

Perhaps now is the time to get beyond punchy quotes, aggressive headlines and the “false binary”, and put in the hard work to look at new technology in detail. We should scientifically evaluate whether it actually works, rather than rejecting the premise that it is possible. We should seek out high quality solutions and take forward the ones that stand up to proper scrutiny.


Download our recent 'Combatting online child sexual abuse while protecting privacy' White Paper to learn more about our solution.

You can also subscribe to our newsletter for more insights or get in touch with us to learn more. 
