
UK Online Safety Act – 2025 and beyond

  • Published Feb 04, 2025

Since the UK’s Online Safety Act (OSA) came into law in October 2023, digital services have been preparing for the new rules, including enhanced user protections that clamp down on illegal content, child safety by design and new transparency requirements. Last year brought several challenges for UK platforms and their users, challenges that often blurred the lines between the online and offline worlds, or at least attested to how interconnected the two have become. In this piece, we reflect on what seems to be going well with the Act and, looking forward, discuss what more needs to be done in 2025 for the OSA to deliver its overall objective of making individuals safer online.

What is the UK’s Online Safety Act (OSA)?

The OSA establishes new regulations to protect children and adults online, imposing user-safety responsibilities on platforms (websites, apps and other services, including social media services, video-sharing platforms, online forums, dating services and online instant messaging services) and on search services. These companies are required to put in place measures to reduce the risk of their services being used for illegal activity, and they need systems that clamp down on illegal content so it can be swiftly removed once identified. Services must also carry out risk assessments addressing the harms caused by illegal content on their platforms. The Act places great emphasis on children’s safety, requiring platforms to block access to harmful content and to provide clear reporting mechanisms for parents and children; services likely to be accessed by children must also carry out a dedicated risk assessment. Certain categorised services, considered particularly high risk, have additional obligations, most notably regarding user empowerment.

The Leading Role of Ofcom

Ofcom is the independent regulator that sets guidelines for compliance and enforces safety standards among providers. Its role is intrinsically linked to the proper functioning and effectiveness of the OSA. This is because Ofcom's guidance and codes help flesh out the legal duties of companies and enhance our understanding of what companies need to do to be compliant. Ofcom will also enforce the rules in the event of non-compliance via financial penalties, service restrictions and, in limited cases, even criminal liability of senior management.

Ofcom has made great progress with its public consultations, which invite key stakeholders to share their knowledge; at The Internet Commission, we participated in this process and provided our insights. In mid-December, Ofcom released a range of key documents and updated us on its plans and progress. Most notable are the illegal harms updates: alongside the Illegal Content Codes of Practice, which are now awaiting parliamentary approval, Ofcom also published the Guidance on Risk Assessment and Risk Profiles for illegal harms online. These publications have since been complemented by the regulator’s recently launched digital safety toolkit, designed to help businesses comply with the new rules.

Companies now have plenty of work to do on the following fronts:

  • By March 16, 2025, they must identify and understand the risks of illegal harms on their services and complete their first risk assessment
  • They must start implementing the measures set out in the Illegal Content Codes of Practice, or select and justify alternative safety measures
  • Following the publication of the children’s assessment guidance, services have three months to complete that assessment
  • From April 2025, Ofcom expects to publish the Child Safety Codes and the associated risk assessment guidance. From that date, companies will have three months to complete the assessment, after which the child safety duties will become enforceable by the regulator

That said, Ofcom has already pointed to many changes companies adopted before these Codes were published, for instance changes relating to safety protections on video-sharing platforms. These include tighter age verification for adult websites, moderation improvements at the social media platform BitChute, new measures at Twitch to prevent children from seeing harmful video content, and new measures at Meta to better protect younger users from grooming.

However, Ofcom admits that its work is only just getting started, and it seems clear that effective implementation will not come without further challenges.

Calls for Stricter Rules

Alongside the issues arising from how the constant flow of content can be weaponised, the Molly Rose Foundation (MRF), which focuses on suicide prevention, has drawn attention to the dangers social media presents to young people. In its latest report, the MRF argued that the implementation of the Act to date has been ‘risk averse and unambitious’. The report follows the Foundation’s earlier research criticising TikTok, among other platforms, for detecting almost three million items of suicide and self-harm content but suspending only two accounts.

Similar calls for strengthened rules arose following the spread of disinformation relating to the violent stabbing of young children in Southport. The pervasive, toxic spread of hatred and disinformation demonstrated both the progress made under the OSA and its remaining limitations: progress, in that some prosecutions were enabled by the Act; limitations, in that Ofcom has so far had limited ability to scrutinise the efficacy of the actions taken by online platforms to prevent the sharing and amplification of racism, hatred and falsehoods connected with the murders.

Despite this progress, there is widespread agreement that the key issues of algorithmic promotion and content virality, which led to the rapid proliferation of vicious rumours and hatred online and offline, remain unsolved. It is for these reasons that the new Labour Government has not closed the door to a review of how the OSA can better address disinformation and its real-life consequences. At The Internet Commission, we will be watching this space closely and hope to take part in the relevant conversations.

The Importance of Enforcement

Last year was of seminal importance for the UK Government’s mission of making the UK ‘the safest place to be online’. The OSA puts Ofcom in the lead on this, and despite several challenges, the regulator seems to have fared well so far. The real test, however, may lie in its forthcoming enforcement activity under the OSA. As highlighted above, Ofcom has several enforcement tools at its disposal, varying with the severity of a service’s breach and ranging from financial penalties to service restrictions and senior management liability.

Dame Melanie Dawes, Ofcom’s Chief Executive, recently stated that ‘the time for talk is over’ and that regulated services must now act upon the OSA’s requirements. It remains to be seen how quickly Ofcom will crack down on non-compliant services, given that 2025 also contains several key milestones relating to illegal content and the protection of children online. Yet the overarching objective of the OSA is clear: protect users and, within that, prioritise the protection of vulnerable users such as children. To achieve this, Ofcom will not be afraid to ‘come down hard’ on those who fail to put sufficient measures in place.

Beyond Enforcement

One key element of protecting users online is enhancing user empowerment. The EU Digital Services Act (DSA) enables users to challenge a platform’s content-related decisions via independent, expert, non-judicial bodies. Although the impact of this new DSA external redress mechanism has only begun to emerge in recent months, it is a hot topic and one garnering increased attention from companies across Europe, including those operating in both the EU and the UK. As discussed in our recent LSE blog, “Resolving content disputes outside the courtroom using the Digital Services Act”, it is likely to play an important role in enhancing user agency and restoring trust in digital services.

Given the inherent borderlessness of digital services, extending the EU approach to the UK is likely to benefit users and companies alike: it would not only enhance user agency by making it easier to challenge platforms’ decisions irrespective of jurisdiction, but also help simplify the rules that businesses must navigate. This possibility is provided for in section 217 of the OSA, which leaves it open to the Secretary of State to amend the Act so that digital services would be obliged to engage with alternative dispute resolution bodies.

Against this backdrop, organisations like Trust Alliance Group, which has many years of experience providing ombudsman services, intend to step up activity in this space by facilitating conversations among different stakeholder groups, including regulators and businesses, with the common goal of greater rule harmonisation and transparency. By bringing committed organisations together in a collaborative manner, the aim is to help them protect their users from online harm through effective safety measures, both within and beyond compliance with existing rules, and thereby reach higher standards of online trust and safety.


Want to learn more? We’re happy to help.