The UK telecoms and media regulator, Ofcom, has today moved forward with implementation of the government’s tedious new Online Safety Act (OSA) by publishing industry guidance on how websites and social media services should introduce “effective age checks”. The goal is to prevent children from encountering online porn and protect them from other harmful content.
The new age verification requirement is frequently presented as something targeted at pornography services, which must introduce age checks by July 2025 at the latest. But some of the new requirements also extend to “all user-to-user and search services” in scope of the act (e.g. social media, online forums, tube sites, cam sites and fan platforms) – big and small sites alike.
The regulator’s new guidance sets out how the new legal duty will work and makes clear that any age-checking methods deployed by services must be “technically accurate, robust, reliable and fair” in order to be considered “highly effective”.
Previous methods, including self-declaration of age and online payments that don’t require a person to be 18, are deemed NOT highly effective. By comparison, Ofcom says that open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation are highly effective.
What are online services required to do, and by when?
The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action Ofcom expects all of them to take starts from today:
- Requirement to carry out a children’s access assessment. All user-to-user and search services – defined as ‘Part 3’ services – in scope of the Act, must carry out a children’s access assessment to establish if their service – or part of their service – is likely to be accessed by children. From today, these services have three months to complete their children’s access assessments, in line with our guidance, with a final deadline of 16 April. Unless they are already using highly effective age assurance and can evidence this, we anticipate that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children’s risk assessment duties and the children’s safety duties. [i.e. they must record the outcome of their assessment and must repeat the children’s access assessment at least annually].
- Measures to protect children on social media and other user-to-user services. We will publish our Protection of Children Codes and children’s risk assessment guidance in April 2025. This means that services that are likely to be accessed by children will need to conduct a children’s risk assessment by July 2025 – that is, within three months. Following this, they will need to implement measures to protect children on their services, in line with our Protection of Children Codes to address the risks of harm identified. These measures may include introducing age checks to determine which of their users are under-18 and protect them from harmful content.
- Services that allow pornography must introduce processes to check the age of users: all services which allow pornography must have highly effective age assurance processes in place by July 2025 at the latest to protect children from encountering it. The Act imposes different deadlines on different types of providers. Services that publish their own pornographic content (defined as ‘Part 5’ services), including certain Generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with our published guidance. Services that allow user-generated pornographic content – which fall under ‘Part 3’ services – must have fully implemented age checks by July.
As part of this, Ofcom have also opened an age assurance enforcement programme, albeit focusing their attention “first” on Part 5 services that display or publish their own pornographic content. If sites fail to act, the OSA allows Ofcom to impose financial penalties of up to 10% of a company’s qualifying worldwide revenue or £18m (whichever is greater), and the regulator could also seek “business disruption measures” involving third-parties, such as by imposing restrictions through internet search engines and payment providers, or by requiring broadband ISPs to block the offending website.
In addition, porn providers are effectively forbidden from directing or encouraging people to use circumvention measures (VPNs, proxy servers, DNS changes etc.). However, anybody under 18 who does go actively seeking such content (let’s face it, there will be a lot of active seeking) will have no difficulty finding and using those same circumvention measures, as has always been the case. The horse on this one bolted a long, long time ago.
Dame Melanie Dawes, Ofcom’s CEO, said:
“For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.
“As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services. Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services – including social media – which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest.
“We’ll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom.”
As usual, Ofcom, just like the government, are still giving the impression above that this only impacts “companies”, which remains very misleading, as many of the rules also catch small blogs and forums that may have nothing to do with porn content – often imposing an intolerable level of legal liability and complexity on those least able to understand, afford or handle it. This is to say nothing of the wider problems.
One of the risks above stems from the fact that users of some services may end up being forced to share their private personal details with companies connected to unreliable porn peddlers. The infamous “Ashley Madison” data breach in 2015 highlighted just how dangerous such information can be in the wrong hands (it was linked to multiple cases of blackmail and even suicide).
Ofcom does state that all age assurance methods must be subject to the UK’s privacy laws, including those concerning the processing of personal data – as enforced by the Information Commissioner’s Office (ICO). Porn services must also keep written records explaining how they protect users from a breach of these laws. But we suspect that won’t provide end-users with much reassurance, given the frequency of modern data breaches – even at state level.
Back in 2023 the European Policy Information Center (EPICENTER) published a report that summed these challenges up quite nicely, not least by highlighting the tendency of politicians to “promise the impossible without fully understanding the dynamics of what they are trying to regulate and without giving sufficient consideration to the side-effects of the proposed solutions.” That’s really the OSA, in a nutshell.
At the same time, there’s a concern about treating all under-18s so generically as merely “children” in the realm of any internet content. This is something that many in their late teens (particularly the 15-18 bracket) will no doubt find quite insulting. It could also make it harder for them to engage online in even safe communities that have nothing to do with porn or harmful adult content, as site owners will be thinking first of their own liability.
Finally, many have questioned whether such a system is even necessary, since all of the major broadband and mobile providers already offer optional network-level filtering systems that cover porn and adult content (e.g. gambling) – these are usually enabled by default.
Nor should we forget that there could be unintended impacts in other areas, such as on sex workers (i.e. pushing them offline and back onto the streets). Likewise, there’s the question of freedom of expression, not least with respect to the debate over what is and is not porn (e.g. general nudity, medical content and erotic stories). The puritanical approach being taken by the Government does seem to create a few grey areas.