Guide
Online Safety Act: duties for online services
How to comply with the Online Safety Act 2023 if you operate a user-to-user service or search service. Covers service categories, illegal content duties, children's safety duties, implementation dates, and Ofcom enforcement.
The Online Safety Act 2023 creates a new regulatory framework for online services operating in the UK. If your service allows users to post content, share messages, or search the internet, you likely have duties under this Act.
Ofcom is the regulator. Non-compliance can result in fines of up to 10% of qualifying worldwide revenue or £18 million (whichever is greater), and senior managers can face personal criminal liability for failing to cooperate with Ofcom investigations.
Is your service in scope?
The Act regulates two types of service:
User-to-user services - platforms where users can share content with other users:
- Social media platforms
- Video sharing services
- Forums and message boards
- Dating apps
- Gaming platforms with chat or community features
- Online marketplaces with user reviews or comments
- Messaging services (where content can reach multiple recipients)
Search services - services enabling users to search multiple websites or databases:
- General search engines
- Specialist search engines (not limited to one website)
Out of scope:
- Email, SMS and MMS services
- Internal business communication tools
- Services with no UK users or UK links
- News publisher websites (regulated separately)
Overview of your duties
The Act imposes duties based on service type and size. All regulated services have baseline duties, with additional obligations for larger platforms and those accessible by children.
Service categories
Ofcom categorises regulated services based on UK user numbers and platform functionality. Your category determines which additional duties apply beyond baseline requirements.
Category 1 (largest user-to-user services)
These platforms have the highest duties. Ofcom assesses Category 1 status against two alternative tests; meeting either is enough:
- Option A: more than 34 million average monthly UK users AND a content recommender system
- Option B: more than 7 million average monthly UK users AND a content recommender system AND functionality for users to forward or re-share user-generated content
Category 1 platforms must give adult users tools to filter out certain types of legal content (user empowerment duties), protect journalistic content and content of democratic importance, enforce their terms of service consistently, and publish transparency reports.
Category 2A (search services)
Search engines with more than 7 million average monthly UK users (excluding vertical search services limited to specific websites). These must take steps to prevent illegal content appearing in search results and implement child safety measures.
Category 2B (user-to-user with messaging)
User-to-user services with more than 3 million average monthly UK users that include direct messaging functionality. These must take measures against illegal content shared via private messages.
All other regulated services
Services below category thresholds still have baseline duties covering illegal content risk assessment, content removal, terms of service, and complaints procedures.
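Because the category tests are numeric thresholds plus functionality flags, they can be expressed as a simple check. The sketch below encodes the conditions described above in Python; the `Service` structure and its field names are illustrative assumptions, not an official tool, and Ofcom makes the actual determination.

```python
from dataclasses import dataclass

@dataclass
class Service:
    """Illustrative description of a regulated service (field names are assumptions)."""
    kind: str                      # "user-to-user" or "search"
    monthly_uk_users: int          # average monthly UK users
    has_recommender: bool          # uses a content recommender system
    allows_resharing: bool         # users can forward/re-share user-generated content
    has_direct_messaging: bool     # private/direct messaging functionality
    vertical_search_only: bool     # search limited to specific websites

def categorise(s: Service) -> str:
    """Apply the category threshold conditions described in this guide."""
    if s.kind == "user-to-user":
        option_a = s.monthly_uk_users > 34_000_000 and s.has_recommender
        option_b = (s.monthly_uk_users > 7_000_000 and s.has_recommender
                    and s.allows_resharing)
        if option_a or option_b:
            return "Category 1"
        if s.monthly_uk_users > 3_000_000 and s.has_direct_messaging:
            return "Category 2B"
    elif s.kind == "search":
        if s.monthly_uk_users > 7_000_000 and not s.vertical_search_only:
            return "Category 2A"
    return "Baseline duties only"
```

Whatever the result, baseline duties still apply; categorisation only adds obligations on top.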
Key duties in detail
Illegal content duty
All regulated services must take steps to prevent users encountering priority illegal content. This covers more than 130 priority offences listed in Schedules 5, 6 and 7 of the Act, including:
- Child sexual exploitation and abuse material (CSEA)
- Terrorism content
- Hate crimes
- Fraud and financial crime
- Drugs and weapons offences
- Intimate image abuse (including so-called revenge pornography)
- Harassment and stalking
- Immigration crime facilitation
You must conduct a risk assessment, implement safety measures proportionate to your risks, and remove illegal content swiftly when you become aware of it.
Children's safety duty
If children can access your service, you must protect them from content that is harmful to children, even if that content is legal for adults. This includes:
- Pornographic content
- Content promoting self-harm, suicide, or eating disorders
- Content promoting violence
- Bullying content
- Dangerous stunts and challenges, and content encouraging use of harmful substances
You must conduct a children's access assessment to determine if children are likely to access your service. If they are, you must complete a children's risk assessment, implement age assurance measures, and apply additional protections.
Implementation deadlines
The Online Safety Act introduces duties in stages. Missing these deadlines puts you at risk of enforcement action.
- Illegal content risk assessment deadline: 16 March 2025
- Illegal content duties enforceable: from 17 March 2025
- Children's access assessment deadline: 16 April 2025
- Children's safety duties enforceable: from 25 July 2025
- Ofcom fee notification window: December 2025 to March 2026
What the deadlines mean for you
By 16 March 2025: Complete your illegal content risk assessment. This is a written document identifying how your service could be used to commit or facilitate illegal activity. You must be able to show this to Ofcom if asked.
From 17 March 2025: Ofcom can enforce illegal content duties. Your safety measures must be operational - content moderation systems, reporting mechanisms, and removal processes.
By 16 April 2025: Complete your children's access assessment to determine if children are likely to access your service.
From 25 July 2025: Children's safety duties become enforceable. Age assurance and child-specific protections must be in place.
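If you track compliance milestones programmatically, for example in an internal dashboard, a minimal sketch follows; the dates are the ones listed above, while the structure and function names are assumptions rather than anything Ofcom prescribes.

```python
from datetime import date

# Key dates from this guide (the structure is illustrative)
MILESTONES = [
    (date(2025, 3, 16), "Illegal content risk assessment due"),
    (date(2025, 3, 17), "Illegal content duties enforceable"),
    (date(2025, 4, 16), "Children's access assessment due"),
    (date(2025, 7, 25), "Children's safety duties enforceable"),
]

def duties_in_force(today: date) -> list[str]:
    """Return the milestones that have already passed as of `today`."""
    return [label for when, label in MILESTONES if today >= when]

print(duties_in_force(date.today()))
```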
Steps to compliance
1. Determine if your service is in scope
Review your service against Ofcom's guidance. If users can post content visible to others, or if you operate a search service, you are likely regulated.
2. Complete your illegal content risk assessment
Identify how your service could be used for priority illegal content. Document the risks, your existing safety measures, and any gaps. This must be completed by 16 March 2025.
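There is no statutory data format for a risk assessment, but it is easier to keep current if it is structured. Below is a minimal sketch of one way to record entries; every field name here is an assumption, not an Ofcom template.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskEntry:
    """One row of an illegal content risk assessment (illustrative only)."""
    offence_category: str          # e.g. "fraud", "CSEA", "terrorism"
    how_service_exposed: str       # how the service could be used for this harm
    likelihood: str                # e.g. "low" / "medium" / "high"
    existing_measures: list[str]   # controls already in place
    gaps: list[str]                # actions still needed

@dataclass
class RiskAssessment:
    service_name: str
    completed_on: date
    entries: list[RiskEntry] = field(default_factory=list)

    def outstanding_gaps(self) -> list[str]:
        """Collect every unresolved gap across all entries."""
        return [gap for entry in self.entries for gap in entry.gaps]
```

Keeping gaps explicit makes the annual review in step 8 straightforward: re-run the assessment and check what remains outstanding.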
3. Assess whether children can access your service
Consider user demographics, content type, and access controls. If children might reasonably access your service, complete a children's risk assessment by 16 April 2025.
4. Implement content moderation systems
Put in place proportionate measures to detect, prevent, and remove illegal content. This may include automated tools, human moderators, and user reporting systems.
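"Proportionate measures" typically combine automated detection with human review. The sketch below shows one common shape for that routing logic; the classifier score, thresholds, and action names are placeholder assumptions you would replace with your own tooling.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    action: str    # "remove", "human_review", or "allow"
    reason: str

def moderate(content_id: str, classifier_score: float) -> Decision:
    """Route content based on an (assumed) classifier confidence score.

    High-confidence matches are removed automatically; uncertain cases
    are queued for human moderators; everything else is allowed but
    remains reportable by users.
    """
    if classifier_score >= 0.95:
        return Decision("remove", "automated detection, high confidence")
    if classifier_score >= 0.60:
        return Decision("human_review", "uncertain, queued for a moderator")
    return Decision("allow", "no signal; users can still report it")
```

The thresholds here are arbitrary illustrations; in practice you would tune them against your own risk assessment and error rates.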
5. Create or update terms of service
Your terms must clearly explain what content is prohibited, how you enforce your policies, and how users can complain about content or moderation decisions.
6. Establish user reporting and complaints procedures
Users must be able to report illegal content easily. You must respond to reports promptly and have a clear complaints process for moderation decisions.
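The Act requires that reporting be easy and responses prompt, but leaves the mechanics to you. One illustrative shape for report intake and triage follows; the fields and response targets are assumptions, not statutory requirements.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class UserReport:
    content_id: str
    reporter_id: str
    reason: str                 # e.g. "illegal", "harmful to children", "other"
    received_at: datetime

def response_target(report: UserReport) -> timedelta:
    """Assign an internal response target by report type (targets are assumptions)."""
    if report.reason in ("illegal", "harmful to children"):
        return timedelta(hours=24)   # prioritise the most serious reports
    return timedelta(days=7)
```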
7. Implement age assurance (if children can access your service)
Use age verification or age estimation to identify child users. Apply default privacy settings and content restrictions for under-18s.
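Neither the Act nor this guide mandates a single age assurance method; the point is that child users get child-appropriate defaults. The sketch below shows that gating step, assuming an age signal from a third-party provider; treating an unknown age as under 18 is a cautious design choice, not a legal requirement.

```python
from dataclasses import dataclass

@dataclass
class UserSettings:
    see_adult_content: bool
    profile_private: bool
    contactable_by_strangers: bool

def apply_defaults(estimated_age: int | None) -> UserSettings:
    """Apply child-safe defaults when the user is, or may be, under 18.

    `estimated_age` would come from your age assurance provider (age
    verification or age estimation); None means no reliable signal.
    """
    is_child = estimated_age is None or estimated_age < 18
    return UserSettings(
        see_adult_content=not is_child,
        profile_private=is_child,
        contactable_by_strangers=not is_child,
    )
```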
8. Review and update annually
Risk assessments must be reviewed when your service changes significantly, or at least annually. Keep documentation for Ofcom inspection.
Enforcement and penalties
Ofcom has substantial enforcement powers under the Online Safety Act:
- Information notices: Ofcom can require you to provide information about your service, users, and safety measures
- Provisional notices of contravention and confirmation decisions: formal findings requiring you to take specific action to comply
- Fines: Up to 10% of qualifying worldwide revenue or £18 million, whichever is greater
- Business disruption measures: Court orders to restrict payment services or advertising, or require ISPs to block access to non-compliant services
- Senior manager liability: Criminal offences for failing to comply with information notices or obstructing Ofcom investigations
Ofcom prioritises the most serious harms - particularly child sexual exploitation, terrorism, and fraud. However, all in-scope services must meet baseline compliance requirements.