CyAN Position on Proposed Amendments to the UK Children’s Wellbeing and Schools Bill  (HL Bill 135)

The Cybersecurity Advisors Network (CyAN) supports the protection of children from online harm and recognises the seriousness of crimes involving child sexual abuse material (CSAM).  These are shared priorities across democratic societies and across political, social, and economic divides.

However, good intentions alone do not guarantee good policy outcomes.  CyAN is concerned that recent proposed amendments to the Children’s Wellbeing and Schools Bill, specifically those seeking to restrict virtual private network (VPN) usage by minors through mandatory identification (pages 19-20), and those requiring “tamper-proof” mechanisms to prevent the creation, transmission, and viewing of CSAM on consumer devices (pages 20-21), risk undermining online safety rather than strengthening it.

CyAN has consistently opposed legislative proposals that weaken privacy and security in pursuit of poorly defined safety goals, including the European Union’s Chat Control proposals and various other national measures that mandate encryption backdoors or otherwise normalise identification or surveillance as a condition of digital participation.[1][2][3]  In CyAN’s view, the amendments now under consideration follow a similar pattern. 

Privacy Tools That Require Identification Are Self-Defeating

VPNs and other privacy-enhancing technologies are widely recommended by cybersecurity authorities as basic protective measures, particularly when users access the internet over public or untrusted networks.[4]  Their effectiveness depends on minimising the collection and retention of identifying information.

Requiring identification or age verification to access such tools risks defeating their purpose.  It introduces new repositories of sensitive personal data, creates incentives for logging and retention, and increases the consequences of data breaches.[5]  There is also concern that parts of the age-verification ecosystem prioritise commercial interests over demonstrable public interest outcomes.[6][7] 

Empirical research demonstrates that awareness of surveillance or traceability can suppress lawful behaviour.  Peer-reviewed studies have shown measurable chilling effects on information-seeking and expression following revelations of government monitoring.[8][9]  These effects have been shown to fall disproportionately on marginalised groups, including LGBTQ+ users, abuse survivors, journalists, and individuals seeking information on sensitive or stigmatised topics.

There is limited evidence that such requirements meaningfully deter criminal behaviour.  Determined actors routinely bypass regulated services, while compliant providers and ordinary users bear the cost and risk.  Real-world experience with age-verification schemes in the UK has already shown significant technical difficulties and limited effectiveness, ultimately leading to abandonment of earlier proposals.[10]

Tamper-Proof Devices Are Neither Feasible Nor Safe

The proposal to mandate tamper-proof mechanisms on consumer devices to prevent access to or recording of CSAM rests on an incomplete understanding of how modern computing systems operate.  There is no such thing as a tamper-proof general-purpose computing device.  Systems that must remain functional, updatable, repairable, and accessible cannot be rendered immune to modification. 

Decades of experience with digital rights management, trusted hardware modules, and secure enclaves demonstrate that such controls are routinely bypassed—often quickly and at scale.[11]  Furthermore, as leading security engineers have noted, systems designed to enforce exceptional access or content control tend to introduce new vulnerabilities and weaken overall security.[12]

Embedding mandatory scanning or enforcement mechanisms at the device level would also impose significant economic costs.  Manufacturers and developers would be required to redesign hardware and operating systems, maintain detection databases, and assume ongoing compliance burdens.  These costs would risk disproportionately harming smaller vendors and open-source projects, as well as lower-income users unable to afford compliant devices, while potentially favouring large, vertically integrated incumbents.

Worse, the amendment’s open-ended wording (“must enable the Secretary of State, by further regulations, to expand the definition of ‘relevant devices’ to include other categories of device which may be used to record, transmit or view CSAM”) raises particular concern, as it potentially encompasses almost every type of electronic device in existence.

Civil Liberties and the Risk of Expansion

Both amendments would risk normalising continuous monitoring and control as default features of consumer technology.  Device-level scanning and identity-linked access would, in practice, function as generalised surveillance applied without individual suspicion.[13] 

Experience shows that such infrastructure is rarely confined to its original purpose.  Democratic societies are not static, and technical capabilities often outlast the political contexts in which they are introduced.  Documented cases in Europe illustrate how surveillance tools have been repurposed against journalists, political opponents, and civil society as democratic safeguards erode.

Comparable concerns have arisen in other democratic jurisdictions.  In the United States, for example, law-enforcement access to large-scale personal data systems such as automated license-plate readers has at times been misused, underscoring how lawful access mechanisms can be abused when governance is insufficient.[14]

Ineffective Against Harm, Harmful to Security

Taken together, these amendments would be unlikely to meaningfully impede serious offenders, who already operate outside mainstream platforms, devices, and jurisdictions.  Instead, they would discourage the use of legitimate security tools, weaken device integrity, and displace harmful activity into harder-to-monitor spaces.

Security and privacy are fundamental values in every liberal democratic society.  They are not incompatible with the protection of children, and must not be made to appear as such.  Weakening privacy protections and embedding surveillance into consumer devices risks reducing safety by increasing attack surfaces and exposing users to new forms of exploitation.  Protecting children requires investment in targeted investigations, victim support, education, and platform accountability.  It does not require, and must not be made conditional on, the abandonment of citizens’ security.

CyAN therefore urges legislators to reconsider these amendments and to pursue child-protection measures that are evidence-based, proportionate, and compatible with fundamental principles of privacy, security, and trust.

References

  1. CyAN, Position on the Proposed EU Chat Control Regulation (2024)
    https://cybersecurityadvisors.network/2024/06/24/cyans-position-on-the-proposed-eu-chat-control-regulation/
  2. CyAN, Position on Encryption Back-Door Legislation (2025)
    https://cybersecurityadvisors.network/2025/03/04/cyans-position-on-encryption-back-door-legislation/
  3. CyAN, CyAN Supports the Fight Against the UK’s Anti-Privacy Overreach (2025)
    https://cybersecurityadvisors.network/2025/02/14/cyan-supports-the-fight-against-the-uks-anti-privacy-overreach/
  4. UK National Cyber Security Centre, Virtual Private Networks (VPNs)
    https://www.ncsc.gov.uk/collection/device-security-guidance/infrastructure/virtual-private-networks
  5. Electronic Frontier Foundation (EFF), Privacy is For the Children (Too)
    https://www.eff.org/deeplinks/2025/11/privacy-children-too
  6. Electronic Frontier Foundation (EFF), Age Verification Is a Danger to Privacy
    https://www.eff.org/deeplinks/2023/03/age-verification-harmful
  7. Open Rights Group, Online Safety Act: Age assurance industry must be regulated
    https://www.openrightsgroup.org/press-releases/online-safety-act-org-calls-for-regulation-of-age-assurance-industry/
  8. Penney, J., Chilling Effects: Online Surveillance and Wikipedia Use
    https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2769645
  9. Marthews, A. & Tucker, C., Government Surveillance and Internet Search Behavior
    https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2412564
  10. The Guardian, UK drops plans for online pornography age verification system
    https://www.theguardian.com/culture/2019/oct/16/uk-drops-plans-for-online-pornography-age-verification-system
  11. Electronic Frontier Foundation (EFF), Digital Rights Management: A failure in the developed world, a danger to the developing world
    https://www.eff.org/wp/digital-rights-management-failure-developed-world-danger-developing-world
  12. Abelson et al., Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications, MIT CSAIL
    https://www.csail.mit.edu/research/keys-under-doormats
  13. Amnesty International, Pegasus Project: Apple iPhones compromised by NSO spyware
    https://www.amnesty.org/en/latest/news/2021/07/pegasus-project-apple-iphones-compromised-by-nso-spyware/
  14. Brennan Center for Justice, Automated License Plate Readers and Privacy
    https://www.brennancenter.org/our-work/research-reports/automatic-license-plate-readers-legal-status-and-policy-recommendations