The UK Parliament has passed the Online Safety Bill, aimed at combatting child abuse content on online platforms. Following the bill’s approval, cabinet officials have called on Meta, the parent company of Facebook, to halt its rollout of end-to-end encryption for messaging platforms like Facebook Messenger and Instagram.
Home Secretary Suella Braverman has expressed concern that the encryption would jeopardize children’s safety, drawing criticism from privacy advocates and from messaging providers such as WhatsApp and Signal, which oppose weakening end-to-end encryption for content-detection purposes. Apple has also called on the government to explicitly protect end-to-end encryption.
The Online Safety Bill also grants the regulator Ofcom the authority to fine violators up to £18 million or 10% of their global annual revenue. The bill’s provision allowing Ofcom to mandate “accredited technology” for identifying and preventing child abuse or terrorism content has drawn particular ire from privacy advocates, who argue that such technology would require either weakening encryption or deploying client-side scanning software on users’ devices. The provision has prompted messaging providers including WhatsApp and Signal to threaten to withdraw from the UK market rather than comply. Despite these concerns, Meta has indicated that it will proceed with its rollout of end-to-end encryption.
Cabinet officials’ attack on encryption has divided opinion, with some viewing it as out of step with the rest of the UK government’s position and with established expert consensus. The criticism also raises questions about commitments the government made during the bill’s debate in the House of Lords, where the use of accredited technology was conditioned on meeting certain accuracy standards for detecting specific content.
The encryption debate nonetheless continues to stir controversy in the UK, with privacy advocates and tech companies emphasizing the importance of secure communication channels even as they seek ways to address harmful content online.