Understanding the Online Safety Act
A strategic overview of the Online Safety Act 2023, explaining what it is, who it affects, how the regulatory framework operates, and where it sits within the broader UK digital regulation landscape. Essential reading for any business operating an online platform or service with user interaction.
What is the Online Safety Act?
The Online Safety Act 2023 (OSA) is the UK's landmark legislation for regulating online platforms and services. It creates a duty of care framework that requires platforms to protect users, particularly children, from illegal and harmful content. The Act received Royal Assent on 26 October 2023 and is being implemented in phases through Ofcom's codes of practice and guidance.
The OSA represents a fundamental shift in how the UK regulates the internet. Rather than treating platforms as passive hosts, it holds them responsible for the systems and processes they use to manage content and user interactions. If your business operates any service where users can post content, communicate with each other, or search the internet, the OSA likely applies to you.
Why the OSA matters to your business
The OSA is not limited to social media giants. It applies to a wide range of online services, from community forums and review sites to messaging platforms and online marketplaces with user interaction features. Many small and medium-sized businesses operate services that fall within scope without realising it.
Non-compliance carries severe consequences. Ofcom has the power to impose fines of up to 10% of qualifying worldwide revenue (or GBP 18 million, whichever is greater), and in extreme cases can seek court orders to block services in the UK. Senior managers can face personal criminal liability for certain failures.
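To make the scale of that exposure concrete, the cap is simply the greater of the two figures. A minimal sketch of the arithmetic (the revenue figures are purely illustrative, not drawn from any real case):

```python
def max_osa_penalty(qualifying_worldwide_revenue_gbp: float) -> float:
    """Upper bound on an OSA fine: the greater of 10% of qualifying
    worldwide revenue or GBP 18 million."""
    return max(0.10 * qualifying_worldwide_revenue_gbp, 18_000_000)

# Illustrative figures only
print(f"GBP {max_osa_penalty(50_000_000):,.0f}")   # GBP 18,000,000 (the floor applies)
print(f"GBP {max_osa_penalty(500_000_000):,.0f}")  # GBP 50,000,000 (10% of revenue applies)
```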
Who is affected
The OSA applies to two main types of service:
- User-to-user services – any internet service that allows users to encounter content generated, uploaded, or shared by other users. This includes social media, forums, messaging apps, online marketplaces with reviews, gaming platforms with chat, and comment sections on websites.
- Search services – internet services that include a search engine enabling users to search multiple websites or databases. This goes beyond Google and Bing to include specialist search engines and aggregation services.
The Act applies to services that have links with the UK – meaning they have a significant number of UK users, target the UK market, or are capable of being used in the UK where there is a material risk of significant harm to UK users. Even services based entirely overseas are in scope if they have UK users.
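The "links with the UK" test reads as a simple disjunction: any one limb brings a service into scope. The sketch below paraphrases it as boolean logic; the field names are illustrative, and the statutory wording in the Act is what actually governs.

```python
def has_uk_links(significant_uk_users: bool,
                 uk_is_target_market: bool,
                 usable_in_uk: bool,
                 material_risk_to_uk_users: bool) -> bool:
    """Rough paraphrase of the OSA 'links with the UK' test:
    satisfying any single limb is enough to be in scope."""
    return (significant_uk_users
            or uk_is_target_market
            or (usable_in_uk and material_risk_to_uk_users))
```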
How platform categories work
Ofcom categorises regulated services into tiers based on their user numbers and functionalities, with the thresholds set out in secondary legislation. The category your service falls into determines which duties apply.
All regulated services, regardless of category, must comply with the illegal content duties. Smaller categorised platforms (Categories 2A and 2B) face fewer additional obligations, mainly around transparency reporting, while the largest platforms (Category 1) must also give adult users tools to control their exposure to certain legal but potentially harmful content (user empowerment) and enforce their terms of service consistently.
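As a rough summary of how the duty sets stack, the mapping below is a simplified illustration only; the definitive thresholds and duty lists sit in the Act and Ofcom's secondary legislation, not in this sketch.

```python
# Simplified, illustrative mapping of OSA duty sets by category.
# Every regulated service carries the baseline duties; categorised
# services take on extra obligations on top of them.
BASELINE_DUTIES = [
    "illegal content risk assessment and safety duties",
    "children's safety duties (if the service is likely accessed by children)",
]

ADDITIONAL_DUTIES = {
    "Category 1":  ["user empowerment tools",
                    "consistent terms-of-service enforcement",
                    "enhanced transparency reporting"],
    "Category 2A": ["additional transparency reporting (large search services)"],
    "Category 2B": ["additional transparency reporting"],
}

def duties_for(category: str | None) -> list[str]:
    """Baseline duties plus any category-specific additions."""
    return BASELINE_DUTIES + ADDITIONAL_DUTIES.get(category, [])
```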
The four key duties
The OSA creates a layered duty framework. Understanding which duties apply to your service is the first step towards compliance.
Implementation timeline
The OSA is being implemented in phases, with Ofcom publishing codes of practice and guidance on a rolling schedule. Understanding the timeline is critical for planning your compliance programme.
Ofcom's role as regulator
Ofcom is the designated regulator for online safety. Its role includes:
- Setting standards – publishing codes of practice and guidance that set out how platforms should meet their duties
- Enforcement – investigating non-compliance, issuing notices, and imposing penalties
- Research – conducting ongoing research into online harms to inform regulatory development
- Transparency – requiring platforms to publish transparency reports on how they manage content
Platforms can either follow Ofcom's codes of practice (the "safe harbour" route) or demonstrate through alternative means that they are meeting their duties. In practice, most businesses will want to follow the codes of practice closely.
Relationship to other digital regulation
The OSA does not operate in isolation. It intersects with several other regulatory frameworks:
- UK GDPR and Data Protection Act 2018 – age assurance and content moderation must comply with data protection law. The ICO's Children's Code applies alongside the OSA's children's safety duties.
- Digital Markets, Competition and Consumers Act 2024 – the CMA's digital markets regime addresses competition issues on platforms, complementing the OSA's safety focus.
- Audiovisual Media Services Regulations – video-sharing platforms face duties under both the AVMS regime and the OSA, with Ofcom regulating under both.
- Electronic Commerce Regulations – some hosting intermediary protections are modified by the OSA.
- Equality Act 2010 – content moderation decisions must not discriminate against users on the basis of protected characteristics.
The Digital Regulation Cooperation Forum (DRCF), comprising Ofcom, the ICO, the CMA, and the FCA, coordinates across these overlapping regimes to reduce duplication for businesses.
What to do next
If you operate an online service, your immediate priorities are:
- Determine whether your service is in scope – review whether it qualifies as a user-to-user service or search service under the Act
- Identify your category – check Ofcom's categorisation thresholds to understand which duties apply
- Conduct your illegal content risk assessment – this is the foundational compliance step for all regulated services
- Assess children's access – determine whether children are likely to access your service, which triggers additional duties
- Review Ofcom's codes of practice – ensure your systems and processes align with the published standards