On Monday, the U.K.’s internet regulator, Ofcom, published the first set of final guidelines for online service providers subject to the Online Safety Act. This starts the clock on the sprawling online harms law’s first compliance deadline, which the regulator expects to take effect in three months’ time.
Ofcom has been under pressure to move faster in implementing the online safety regime following riots over the summer that were widely perceived to have been fuelled by social media activity, although the regulator is simply following the process lawmakers set out, which has required it to consult on, and have parliament approve, its final compliance measures.
“We are ready to take enforcement action if providers do not act promptly to address the risks on their services,” the regulator warned.
According to Ofcom, more than 100,000 tech firms could be in scope of the law’s duties to protect users from a range of illegal content, in relation to the more than 130 “priority offences” the Act sets out, covering areas including terrorism, hate speech, child sexual abuse and exploitation, and fraud and financial offences.
Failure to comply risks fines of up to 10% of global annual turnover (or up to £18 million, whichever is greater).
In-scope firms range from tech giants to “very small” service providers, with various sectors impacted including social media, dating, gaming, search, and pornography.
“The duties in the Act apply to providers of services with links to the UK regardless of where in the world they are based. The number of online services subject to regulation could total more than 100,000 and range from some of the largest tech companies in the world to very small services,” wrote Ofcom.
The codes and guidance follow a consultation in which Ofcom reviewed research and took stakeholder responses to help shape the rules, a process underway since the legislation passed parliament and became law in October 2023.
The regulator has outlined measures for user-to-user and search services to reduce risks associated with illegal content. Guidance on risk assessments, record-keeping, and reviews is summarized in an official document.
Ofcom has also published a summary covering each chapter in today’s policy statement.
The approach the U.K. law takes is the opposite of one-size-fits-all: generally, more obligations are placed on larger services and platforms where multiple risks may arise than on smaller services with fewer risks.
However, smaller, lower-risk services do not get a carve-out from obligations, either. Indeed, many requirements apply to all services, such as having a content moderation system that allows for swift takedown of illegal content; having a mechanism for users to submit content complaints; having clear and accessible terms of service; removing accounts of proscribed organizations; and many others. That said, many of these blanket measures are features that mainstream services, at least, are likely to already offer.
But it’s fair to say that every tech firm offering user-to-user or search services in the U.K. will need to assess how the law applies to its business, at a minimum, if not make operational revisions to address specific areas of regulatory risk.
For larger platforms with engagement-centric business models — where their ability to monetize user-generated content is linked to keeping a tight leash on people’s attention — greater operational changes may be required to avoid falling foul of the law’s duties to protect users from myriad harms.
A key lever to drive change is the law introducing criminal liability for senior executives in certain circumstances, meaning tech CEOs could be held personally accountable for some types of non-compliance.
Speaking to BBC Radio 4’s Today program on Monday morning, Ofcom CEO Melanie Dawes suggested that 2025 will finally see significant changes in how major tech platforms operate.
“What we’re announcing today is a big moment, actually, for online safety, because in three months’ time, the tech companies are going to need to start taking proper action,” she said. “What are they going to need to change? They’ve got to change the way the algorithms work. They’ve got to test them so that illegal content like terror and hate, intimate image abuse, lots more, actually, so that doesn’t appear on our feeds.”
“And then if things slip through the net, they’re going to have to take it down. And for children, we want their accounts to be set to be private, so they can’t be contacted by strangers,” she added.
That said, Ofcom’s policy statement is just the start of its work to implement the legal requirements, with the regulator still developing further measures and duties relating to other aspects of the law, including what Dawes couched as “wider protections for children” that she said would be introduced in the new year.
So the more substantive child safety-related changes to platforms that parents have been clamouring for may not filter through until later in the year.
“In January, we’re going to come forward with our requirements on age checks so that we know where children are,” said Dawes. “And then in April, we’ll finalize the rules on our wider protections for children — and that’s going to be about pornography, suicide and self harm material, violent content and so, just not being fed to kids in the way that has become so normal but is really harmful today.”
Ofcom’s summary document also notes that further measures may be required to keep pace with tech developments such as the rise of generative AI, indicating that it will continue to review risks and may further evolve requirements on service providers.
The regulator is also planning “crisis response protocols for emergency events” such as last summer’s riots; proposals for blocking the accounts of those who have shared CSAM (child sexual abuse material); and guidance for using AI to tackle illegal harms.