Online pornography: Age verification is en route – but the devil’s in the detail

The fact is that nobody wants to show their passport to an adult site, however encrypted the process


There’s an old joke about the internet: nobody knows you’re a dog, nobody knows you’re a teenager, and nobody knows whether the boxes you’ve ticked saying “I am a human” and “I am over 18” mean anything at all. But legislators across the democratic world have finally decided the punchline isn’t funny any more.

Governments are moving from finger-wagging to mandates: if you host pornography or other adult content, you must put something sturdier than a checkbox between minors and the material.

The push is not new, but the momentum feels different. Political patience for vague commitments from platforms has worn thin. The conversation now is about enforcement, not aspiration.

In Ireland, Coimisiún na Meán’s online safety code now requires video-sharing platforms hosting pornography and similar material to deploy “genuinely effective” mechanisms to keep minors out.

The words “genuinely effective” are doing some heavy lifting here, covering a menu of options including third-party verification tokens, privacy-preserving age estimation, device-level parental settings and more. The unresolved question is whether enough services will choose solutions that satisfy regulators without frightening off users – and how fast sanctions will follow if they don’t.

Across the Irish Sea, the UK is deep in implementation. The Online Safety Act, passed two years ago under the Conservative government, requires services that make pornography available to implement “highly effective” checks. In practice, that means adult content sites will have to block UK visitors unless they pass an age test.

Critics argue these mandated checks threaten anonymity, create honeypots of sensitive data and inevitably overshoot, sweeping in sexual health resources, LGBTQ+ information or art that crude filters misclassify. They also fear a creeping extension of age limits to other types of content deemed inappropriate for minors.

Proponents say these risks can be managed. The argument has acquired a partisan edge, with the Labour government and Reform UK hurling insults at each other. The political mood music will colour both enforcement and whether the British model is exported or quietly abandoned.

Elsewhere in the EU, the same debate is playing out. France has moved fastest, introducing its own age-verification law for pornography with fines for noncompliance, while Germany has long had age-gating rules that are now being updated in light of new EU-level discussions under the Digital Services Act. Several member states, watching the UK’s early moves and legal wrangles, are waiting to see whether the approach survives contact with the courts before committing themselves.

In the US, the absence of a federal law has encouraged states to experiment. Louisiana’s law kicked things off, followed by others. Some require site-by-site checks; others, notably Utah and Texas, have gone for the more ambitious route of imposing verification duties on app stores themselves, so that Apple’s App Store and Google Play would check ages and obtain parental consent at the point of download, passing only an age “signal” to the app. That model is gathering attention abroad, because it could, in theory, standardise age checks across thousands of services.
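
To see what that signal might look like from an app’s point of view, here is a hypothetical sketch in Python. Neither Apple nor Google has published such an interface; the `AgeSignal` brackets and `StoreContext` fields are invented purely for illustration.

```python
from dataclasses import dataclass
from enum import Enum


class AgeSignal(Enum):
    """Coarse age bracket the store might pass to an app (hypothetical)."""
    UNKNOWN = "unknown"
    UNDER_13 = "under_13"
    TEEN = "13_17"
    ADULT = "18_plus"


@dataclass(frozen=True)
class StoreContext:
    age_signal: AgeSignal     # set by the store at install time
    parental_consent: bool    # obtained by the store, never by the app


def may_show_adult_content(ctx: StoreContext) -> bool:
    # The app sees only the bracket - no birth date, document or identity.
    return ctx.age_signal is AgeSignal.ADULT


def may_enable_teen_features(ctx: StoreContext) -> bool:
    return ctx.age_signal is AgeSignal.TEEN and ctx.parental_consent
```

The appeal of this design, and the reason it is gathering attention abroad, is that the sensitive check happens once, at the store, and every app downstream consumes the same one-bit answer.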

This brings us to one of the underlying tensions in the entire debate: who should be responsible, and therefore liable, for making the checks?

Meta argues that app stores are the logical choke point – the bouncer at the club door – since they already control installation, payment and in many cases device-level settings. Apple and Google prefer to push the responsibility down to individual apps, warning that centralised checks could mean overcollection of data and disadvantage smaller developers.

Behind the polite disagreement, the row is about who takes the blame when something goes wrong. Meanwhile, Apple has been quietly developing its own privacy-friendly verification system, using on-device processing and cryptographic proof to confirm age without revealing identity. Whether it chooses to roll that out broadly may depend as much on politics as on engineering.


Underpinning all of this is a fast-growing commercial ecosystem. Third-party age-verification providers – some specialising in facial analysis, others in document scans, others in reusable “digital ID” credentials – have attracted billions of dollars in investment over the past five years. Investors are betting that regulatory momentum is irresistible: if one country mandates verification, others will follow, and platforms will prefer to buy in expertise rather than build their own.

The sums involved suggest that industry insiders expect this to become a routine part of online life.

But the fact remains that nobody wants to show their passport to a porn site, however encrypted the process. The most thoughtful proposals try to separate the question “are you over 18?” from “who are you?”, using privacy-preserving techniques such as zero-knowledge proofs, third-party attestations and cryptographic tokens.

In principle, these approaches can give a binary answer without leaking personal details. In practice, systems are built under time and budget pressure by companies with varying incentives, and data that “shouldn’t” be retained sometimes is, simply because it is convenient.
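
To make that separation concrete, here is a minimal sketch in Python of the signed-token idea, using the open-source `cryptography` package. The issuer, token format and field names are invented for illustration; real schemes add revocation, rate-limiting and unlinkability that this toy version omits.

```python
import base64
import json
import time

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Issuer side (the third-party verifier). It checks the user's age by
# whatever means it likes, then signs a short-lived claim with no identity.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()  # published for relying sites


def issue_token(over_18: bool, ttl_seconds: int = 300) -> bytes:
    claim = {"over_18": over_18, "exp": int(time.time()) + ttl_seconds}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = issuer_key.sign(payload)
    return base64.urlsafe_b64encode(payload) + b"." + base64.urlsafe_b64encode(signature)


# Relying side (the adult site). It learns exactly one bit - "over 18,
# yes or no" - plus an expiry. Never a name, document or account.
def check_token(token: bytes) -> bool:
    try:
        payload_b64, signature_b64 = token.split(b".")
    except ValueError:
        return False
    payload = base64.urlsafe_b64decode(payload_b64)
    try:
        issuer_public_key.verify(base64.urlsafe_b64decode(signature_b64), payload)
    except InvalidSignature:
        return False
    claim = json.loads(payload)
    return claim["over_18"] and claim["exp"] > time.time()


token = issue_token(over_18=True)
assert check_token(token)
```

The point of the design is that the site verifies a signature from an issuer it trusts: the site never learns who the visitor is, and in this sketch the issuer never learns which site the token was shown to.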

If all this is going to work, a few principles will need to hold. Data should be minimised and retained for as short a time as possible, ideally not at all. Verification services should be independent of the content platforms they serve. Adults should have a choice among privacy-preserving methods, and teenagers should not be pushed into darker corners of the internet by clumsy design.

And there should be real accountability when things go wrong – audits, penalties and transparency.

Age checks will not, by themselves, fix the internet for children. They can make some harms less accessible, but they can also create new risks if treated as a magic key. The question now is not whether the age checkers are coming – they are – but whether they will be established on terms set in the interests of users or of companies which, despite their protestations, don’t give a damn about children’s welfare.