The UK’s communications regulator, Ofcom, faces significant hurdles in implementing the Online Safety Act, according to a report by the House of Commons Committee of Public Accounts. The legislation, aimed at safeguarding children from harmful online content, imposes a duty of care on online platforms and introduces measures for identifying and removing prohibited material. However, the report identifies enforcement challenges that could delay the act’s full rollout by at least a year beyond the proposed 2025 deadline.
Ofcom is tasked with overseeing compliance with the act, including monitoring nearly 100,000 service providers, but it has yet to set out how it will process data at that scale. The absence of a finalized automated compliance-monitoring system further limits the agency’s ability to handle individual complaints effectively. Moreover, concerns persist about the act’s impact on privacy: critics argue that provisions requiring content scanning could weaken encryption and expose users to increased surveillance and hacking risks.
Despite amendments limiting the use of scanning tools to those meeting minimum accuracy standards, privacy groups such as the Open Rights Group remain critical of the legislation. They argue that rushed implementation without adequate scrutiny has produced an “overblown legislative mess” that burdens online intermediaries with impractical obligations. With debate continuing over the balance between online safety and privacy rights, the effectiveness and feasibility of the Online Safety Act remain under scrutiny, underscoring the difficulty of regulating digital platforms in an increasingly interconnected world.