Online Safety Act compliance checklist
Quick-check verification of your Online Safety Act compliance status. Covers scope assessment, risk assessments, content moderation, terms of service, complaints, age assurance, Ofcom registration, and record-keeping.
Use this checklist to verify that your service meets its obligations under the Online Safety Act 2023. Work through each section and resolve gaps before moving on.
Scope assessment
- Determined whether your service is a user-to-user service or search service within the meaning of the Act
- Identified whether your service has links with the UK (UK users, UK-targeted, or accessible from the UK)
- Checked whether any exemptions apply (e.g. internal business services, limited functionality services)
- Assessed your platform category (Category 1, 2A, 2B, or uncategorised) and the duties that apply to each
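Taken together, the first three checks reduce to a simple conjunction: the Act applies when a service is a user-to-user or search service, has UK links, and is not exempt. A minimal sketch follows, with the caveat that each input is a legal judgement about your service, not something code can determine:

```python
# Simplified encoding of the scope test described in the list above.
def in_scope(
    is_user_to_user_or_search: bool,  # service type within the meaning of the Act
    has_uk_links: bool,               # UK users, UK-targeted, or UK-accessible
    exemption_applies: bool,          # e.g. internal business, limited functionality
) -> bool:
    return is_user_to_user_or_search and has_uk_links and not exemption_applies
```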
Illegal content risk assessment
- Completed a written illegal content risk assessment covering all 130+ Schedule 7 priority offences
- Identified how your service could be used to commit or facilitate each relevant offence category
- Assessed the level of risk for each offence type based on your service's features, user base, and content types
- Documented proportionate measures to mitigate each identified risk
- Scheduled annual review of the risk assessment, or earlier if significant changes are made to the service
- Retained the written record and made it available for Ofcom inspection on request
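One way to keep that written record structured and inspectable is a simple risk register with one entry per offence category. A minimal sketch, using hypothetical field names that are not prescribed by the Act or by Ofcom guidance:

```python
from dataclasses import dataclass, field

# Hypothetical register entry; field names and example values are illustrative.
@dataclass
class OffenceRiskEntry:
    offence_category: str       # a Schedule 7 priority offence category
    risk_level: str             # e.g. "low", "medium", "high"
    how_facilitated: str        # how the service could be used for this offence
    mitigations: list[str] = field(default_factory=list)
    last_reviewed: str = ""     # ISO date of the last review
    next_review: str = ""       # ISO date; at most a year out

register = [
    OffenceRiskEntry(
        offence_category="fraud and financial services offences",
        risk_level="medium",
        how_facilitated="direct messages used to circulate scam links",
        mitigations=["URL reputation checks", "report-and-review workflow"],
        last_reviewed="2025-01-10",
        next_review="2026-01-10",
    ),
]
```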
Children's access assessment
- Completed a children's access assessment under section 37
- Considered user demographics, content type, design features, marketing, and terms of service age restrictions
- Recorded the assessment conclusion and supporting rationale
- If children are likely to access the service: completed a children's risk assessment and implemented children's safety duties
- If children are not likely to access the service: documented the basis for that conclusion and scheduled review before any significant change
Content moderation systems
- Implemented proportionate systems to prevent users encountering priority illegal content
- Deployed automated detection tools appropriate to your service size and risk profile (e.g. hash-matching, keyword filtering, AI classifiers); a simplified hash-matching sketch follows this list
- Established human moderation capacity with trained moderators and escalation procedures
- Set up swift removal processes for illegal content once identified
- Put moderator wellbeing support in place (psychological support, regular breaks, content exposure limits)
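As flagged above, here is a deliberately simplified hash-matching sketch. Production deployments typically use perceptual hashes (such as PhotoDNA or PDQ) sourced from vetted databases, since a plain cryptographic hash only catches byte-identical files; the names here are illustrative:

```python
import hashlib

# Populated from a trusted hash list; empty here for illustration.
KNOWN_ILLEGAL_HASHES: set[str] = set()

def matches_known_hash(file_bytes: bytes) -> bool:
    # Exact-match lookup against the known-hash set.
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_ILLEGAL_HASHES

def on_upload(file_bytes: bytes) -> str:
    # Block and escalate to human moderation on a match, consistent with
    # the swift-removal and escalation items above.
    return "block_and_escalate" if matches_known_hash(file_bytes) else "allow"
```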
Terms of service
- Published clear, accessible terms of service that state how the service deals with illegal content
- Specified the types of content and behaviour prohibited on the service
- Explained what happens when content or an account is removed, including the right to appeal
- Enforced the terms consistently across all users
- Reviewed and updated the terms whenever the service changes or new duties take effect
User reporting and complaints
- Provided an easy-to-use mechanism for users to report illegal content
- Ensured reports are acknowledged and resolved within a reasonable timeframe
- Established a complaints procedure for users who disagree with content moderation decisions
- Ensured the appeals process lets users challenge content removal or account suspension
- Maintained records of reports received, actions taken, and outcomes
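To support the record-keeping item above, each report can be modelled as a structured record that tracks status, actions, and outcome from receipt to resolution. A minimal sketch with hypothetical field names:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical record structure; field names are illustrative.
@dataclass
class UserReport:
    report_id: str
    content_id: str
    reason: str                         # e.g. "illegal content"
    received_at: datetime
    status: str = "open"                # open -> acknowledged -> resolved
    actions: list[str] = field(default_factory=list)
    outcome: str = ""                   # recorded at resolution

def acknowledge(report: UserReport) -> None:
    report.status = "acknowledged"
    report.actions.append(f"acknowledged {datetime.now(timezone.utc).isoformat()}")

def resolve(report: UserReport, outcome: str) -> None:
    report.status = "resolved"
    report.outcome = outcome
    report.actions.append(f"resolved {datetime.now(timezone.utc).isoformat()}")
```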
Age assurance
- If children are likely to access your service: implemented age assurance measures proportionate to the risk of harm
- Selected an age assurance method that is effective at estimating or verifying age (e.g. facial age estimation, photo-ID matching, credit card checks)
- Ensured age assurance does not process more personal data than necessary (a data-minimisation sketch follows this list)
- Reviewed age assurance measures against Ofcom guidance and updated as technology evolves
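The data-minimisation point can be illustrated by retaining only the outcome of an age check and discarding the raw estimate and any underlying documents. The names and the 18+ threshold in this sketch are illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import date

# Retain only the outcome and method of the check, never the raw
# estimate, birth date, or ID document. Names are illustrative.
@dataclass(frozen=True)
class AgeAssuranceResult:
    user_id: str
    over_18: bool
    method: str          # e.g. "facial_age_estimation"
    checked_on: date

def record_result(user_id: str, estimated_age: int, method: str) -> AgeAssuranceResult:
    # Discard the raw estimate; keep only the threshold outcome.
    return AgeAssuranceResult(user_id, estimated_age >= 18, method, date.today())
```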
Ofcom registration and fees
- Determined whether your qualifying worldwide revenue exceeds the fee threshold (see the sketch after this list)
- If above the threshold: notified Ofcom within the notification window and paid applicable fees
- If below the threshold: documented revenue position and scheduled annual review
- Prepared for transparency reporting obligations if your service is categorised (Category 1, 2A, or 2B)
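The threshold determination itself is a straightforward comparison once qualifying worldwide revenue has been computed. The constant in this sketch is a placeholder; substitute the figure Ofcom has published for the relevant charging year:

```python
# Placeholder value — confirm the current threshold against Ofcom
# guidance before relying on this check.
FEE_THRESHOLD_GBP = 250_000_000

def must_notify_ofcom(qualifying_worldwide_revenue_gbp: int) -> bool:
    return qualifying_worldwide_revenue_gbp >= FEE_THRESHOLD_GBP
```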
Record-keeping and annual review
- Maintained written records of all risk assessments, moderation actions, user reports, complaints, and outcomes
- Ensured records are available for Ofcom inspection on request
- Named a senior manager responsible for Online Safety Act compliance
- Scheduled annual review of all risk assessments, safety measures, and terms of service
- Established a process to review obligations when significant changes are made to the service
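Both review triggers above, the annual cadence and the significant-change rule, can be encoded in a single check. A minimal sketch, assuming a 365-day review interval:

```python
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)  # annual cadence from the checklist

def review_due(last_reviewed: date, significant_change: bool,
               today: date | None = None) -> bool:
    # Due annually, or immediately on a significant change to the service.
    today = today or date.today()
    return significant_change or (today - last_reviewed) >= REVIEW_INTERVAL
```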
Key deadlines
Verify your compliance against the implementation timeline below.