Essential regulations for tech businesses
Technology businesses in the UK face an expanding set of overlapping regulatory regimes. Your compliance requirements depend on what you do: whether you process personal data, host user-generated content, provide communications services, develop hardware, or offer financial services.
This guide covers the core regulatory framework that applies to most tech businesses operating in the UK.
Data protection compliance
All tech businesses processing personal data must comply with UK GDPR and the Data Protection Act 2018. This is the foundation of tech sector regulation.
UK GDPR / Data Protection Act 2018
All organisations processing personal data in the UK must comply with UK GDPR and the Data Protection Act 2018. This includes tech businesses handling user data, customer information, employee records, and any identifiable information about individuals.
Core data protection principles
Processing must comply with seven principles:
Lawfulness, fairness and transparency: Have a lawful basis and inform people how you use their data
Purpose limitation: Only collect data for specified, explicit purposes
Data minimisation: Only collect what you need
Accuracy: Keep data accurate and up to date
Storage limitation: Don't keep data longer than necessary
Integrity and confidentiality: Keep data secure
Accountability: Demonstrate compliance
Lawful bases for processing
You must have at least one of these lawful bases:
Consent: Freely given, specific, informed agreement
Contract: Processing necessary to fulfil a contract
Legal obligation: Required by law
Vital interests: Protection of someone's life
Public task: Official functions or public interest
Legitimate interests: Your (or a third party's) interests, unless outweighed by the individual's interests and rights
Individual rights
You must facilitate these rights:
Right to be informed (privacy notices)
Right of access (respond to subject access requests within one month)
Right to rectification
Right to erasure (right to be forgotten)
Right to restrict processing
Right to data portability
Right to object
Rights related to automated decision making and profiling
When to notify the ICO
You must notify the ICO within 72 hours of becoming aware of a
personal data breach that poses a risk to people's rights and freedoms.
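The 72-hour clock starts when you become aware of the breach, not when it occurred. As a minimal sketch (TypeScript, with illustrative field names), the deadline and the risk decision can be recorded like this:

```typescript
// Sketch: track the ICO notification deadline for a personal data breach.
// The 72-hour rule and awareness trigger reflect UK GDPR; the structure
// itself is illustrative, not a prescribed format.

interface BreachRecord {
  description: string;
  becameAwareAt: Date;         // clock starts at awareness, not occurrence
  risksToIndividuals: boolean; // notification only required where a risk exists
  icoNotifiedAt?: Date;
}

const SEVENTY_TWO_HOURS_MS = 72 * 60 * 60 * 1000;

function icoDeadline(breach: BreachRecord): Date | null {
  if (!breach.risksToIndividuals) return null; // still record the breach and your reasoning
  return new Date(breach.becameAwareAt.getTime() + SEVENTY_TWO_HOURS_MS);
}

const breach: BreachRecord = {
  description: "Misdirected email containing customer records",
  becameAwareAt: new Date("2025-03-03T09:00:00Z"),
  risksToIndividuals: true,
};

console.log("Notify ICO by:", icoDeadline(breach)?.toISOString());
```

Even where no notification is required, the ICO expects you to keep an internal record of the breach and why you decided not to report it.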
Step 1
Conduct a data audit to identify all personal data you process
Step 2
Document your lawful bases for each processing activity
Step 3
Create or update privacy notices for customers, users, and employees
Step 4
Implement data security measures appropriate to the risks (encryption, access controls)
Step 5
Establish procedures for handling subject access requests within 1 month
Step 6
Create a data breach response plan with ICO notification procedures
Step 7
Appoint a Data Protection Officer if required (public authorities, large-scale systematic monitoring, or large-scale processing of special category data)
Step 8
Maintain Records of Processing Activities (ROPA): required if you have 250+ employees, and for smaller organisations whose processing is not occasional, is high-risk, or involves special category data
Step 9
Review and update data retention schedules
Step 10
Train staff on data protection obligations
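Steps 8 and 9 are easier to keep current when processing activities and retention periods live in one structured record. A minimal sketch, using illustrative field names rather than any prescribed schema:

```typescript
// Sketch: a Record of Processing Activities (ROPA) entry with a review check.
// Field names are illustrative; they loosely follow the headings in the
// ICO's ROPA templates but are not a mandated format.

interface RopaEntry {
  activity: string;       // e.g. "Customer support ticketing"
  lawfulBasis: "consent" | "contract" | "legal obligation" | "vital interests" | "public task" | "legitimate interests";
  dataCategories: string[];
  retentionMonths: number; // from your retention schedule
  lastReviewed: Date;
}

// Flag entries whose documented review is older than a chosen cadence (12 months here).
function needsReview(entry: RopaEntry, now = new Date()): boolean {
  const msPerMonth = 30 * 24 * 60 * 60 * 1000;
  return now.getTime() - entry.lastReviewed.getTime() > 12 * msPerMonth;
}

const ropa: RopaEntry[] = [
  {
    activity: "Marketing email list",
    lawfulBasis: "consent",
    dataCategories: ["name", "email address"],
    retentionMonths: 24,
    lastReviewed: new Date("2024-01-15"),
  },
];

console.log(ropa.filter((e) => needsReview(e)).map((e) => e.activity));
```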
ICO Guide to UK GDPR
Electronic marketing and cookies
If your website uses cookies, tracks users, or sends marketing communications, you must comply with the Privacy and Electronic Communications Regulations (PECR).
Privacy and Electronic Communications Regulations (PECR)
PECR governs electronic marketing, cookies, and similar technologies. Tech businesses must comply with rules on cookies, marketing emails, texts, and automated calls. PECR works alongside UK GDPR.
Cookie and tracking technology rules
You must:
Tell users about cookies and similar technologies (JavaScript, pixels, fingerprinting)
Get consent before setting non-essential cookies
Provide clear information about what each cookie does
Allow users to withdraw consent easily
Exceptions: Strictly necessary cookies (authentication, security,
load balancing) do not require consent but must be disclosed.
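In practice, the consent rule means non-essential scripts must not run until the user opts in. A minimal browser-side sketch, assuming a hypothetical analytics script URL and a simple stored preference (real sites typically use a consent-management platform with per-purpose choices):

```typescript
// Sketch: only load a non-essential (analytics) script after explicit consent.
// The storage key and script URL are placeholders.

const CONSENT_KEY = "analytics-consent"; // hypothetical key

function hasAnalyticsConsent(): boolean {
  return localStorage.getItem(CONSENT_KEY) === "granted";
}

function recordConsent(granted: boolean): void {
  localStorage.setItem(CONSENT_KEY, granted ? "granted" : "refused");
  if (granted) loadAnalytics();
}

function loadAnalytics(): void {
  const script = document.createElement("script");
  script.src = "https://example.com/analytics.js"; // placeholder URL
  script.async = true;
  document.head.appendChild(script);
}

// Strictly necessary functionality runs regardless; analytics waits for a
// recorded "granted" choice.
if (hasAnalyticsConsent()) loadAnalytics();
```

A consent banner's accept and reject buttons would call recordConsent(true) or recordConsent(false); withdrawing consent is just overwriting or clearing the stored value.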
Email and text marketing rules
To individuals:
Get consent before sending marketing (opt-in)
Soft opt-in allowed if you collected contact details during a sale (or negotiations for a sale), are marketing your own similar products, and offered an opt-out when the details were collected
Provide clear opt-out in every message
Include your identity and valid contact details
To businesses (B2B):
Emails: Can market to corporate addresses without consent but must allow opt-out (sole traders and some partnerships count as individuals)
Texts and automated calls: Need consent
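As a rough sketch of applying these rules before a send (illustrative field names; a real system would also keep evidence of when and how consent was collected):

```typescript
// Sketch: decide whether an email marketing send to a contact is permitted
// under PECR. Field names are illustrative; this is not legal-advice logic.

interface Contact {
  email: string;
  isIndividual: boolean; // individuals (and sole traders/partnerships) get the opt-in rules
  hasConsented: boolean; // recorded opt-in
  softOptIn: boolean;    // details collected during a sale, similar products, opt-out offered
  hasOptedOut: boolean;
}

function canEmailMarketing(c: Contact): boolean {
  if (c.hasOptedOut) return false;      // always honour opt-outs
  if (!c.isIndividual) return true;     // corporate B2B email: opt-out model
  return c.hasConsented || c.softOptIn; // individuals: consent or soft opt-in
}

console.log(canEmailMarketing({
  email: "jo@example.com",
  isIndividual: true,
  hasConsented: false,
  softOptIn: true,
  hasOptedOut: false,
})); // true: soft opt-in applies
```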
Location data and traffic data
Under PECR, location data may only be processed where it is anonymised or with the user's consent for a value-added service.
Traffic data (connection metadata) has specific retention and security requirements.
Step 1
Audit your website/app for all cookies and tracking technologies
Step 2
Implement a compliant cookie consent banner with granular controls
Step 3
Create a detailed cookie policy explaining each technology
Step 4
For marketing emails, maintain clear opt-in records and honour opt-outs immediately
Step 5
Include your business name and physical address in all marketing emails
Step 6
For SMS marketing, only contact individuals who have opted in
Step 7
Review third-party tools (analytics, advertising, CRM) for compliance
Step 8
Ensure consent mechanisms work on all devices and browsers
Step 9
Document your PECR compliance measures
ICO guidance on PECR
Online safety duties
If your platform allows user-generated content, you must comply with the Online Safety Act 2023. This applies to social media, forums, marketplaces with reviews, gaming platforms, and any service where users can interact.
Online Safety Act 2023
The Online Safety Act requires platforms hosting user-generated content to protect users from illegal content and, for large platforms, harmful content. This applies to social media, search engines, forums, marketplaces, gaming platforms, and any service allowing user interaction.
Who the Act applies to
If your service allows user-generated content (posts, comments, reviews,
profiles, direct messages, live-streaming), you are likely in scope. This includes:
Social media platforms
Video sharing platforms
Forums and message boards
Online marketplaces with user reviews
Gaming platforms with chat functions
Dating apps
Search engines (separate duties)
Illegal content duties (all services)
All in-scope services must:
Take steps to prevent users encountering priority illegal content (child sexual exploitation and abuse, terrorism, hate offences, fraud)
Swiftly remove illegal content when reported
Operate effective complaints procedures
Be transparent about content moderation
Conduct illegal content risk assessments
Maintain illegal content policies
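A minimal sketch of a user-report record that supports swift removal and an auditable complaints trail; the categories loosely track the Act's priority offences and the field names are illustrative:

```typescript
// Sketch: triage queue entry for user reports of potentially illegal content.

type ReportCategory = "csea" | "terrorism" | "hate" | "fraud" | "other-illegal" | "policy-only";

interface ContentReport {
  contentId: string;
  category: ReportCategory;
  reportedAt: Date;
  resolvedAt?: Date;
  outcome?: "removed" | "restricted" | "no-action";
}

// Priority illegal content goes to the front of the moderation queue.
function sortQueue(reports: ContentReport[]): ContentReport[] {
  const priority: ReportCategory[] = ["csea", "terrorism", "hate", "fraud"];
  return [...reports].sort((a, b) =>
    Number(priority.includes(b.category)) - Number(priority.includes(a.category))
  );
}
```

Keeping resolvedAt and outcome on each record also feeds the transparency and complaints duties listed above.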
Child safety duties (services accessible by children)
If children can access your service, you must:
Conduct child safety risk assessments
Implement age assurance (verification or estimation)
Protect children from harmful content (self-harm, eating disorders, bullying)
Apply default protections for child users
Provide child user reporting mechanisms
Category 1 duties (largest platforms)
Platforms designated Category 1 under Ofcom's threshold conditions (based on UK user numbers and functionality such as content recommender systems) must also:
Consistently enforce their own terms of service on content they say they restrict (the Bill's original "legal but harmful" duties for adults were replaced by these terms-of-service and user empowerment duties)
Protect freedom of expression and journalistic content
Provide adult users with tools to filter certain content
Ensure algorithmic transparency
Publish transparency reports
Enforcement and penalties
Ofcom can issue fines up to £18 million or 10% of global revenue (whichever
is higher) for non-compliance. Senior managers can face criminal liability for
failure to cooperate with Ofcom.
Step 1
Determine if your service hosts user-generated content and is in scope
Step 2
Conduct an illegal content risk assessment for your service
Step 3
If accessible by children, conduct a child safety risk assessment
Step 4
Implement content moderation systems to detect and remove priority illegal content
Step 5
Create clear terms of service and community standards
Step 6
Establish user reporting mechanisms with timely response procedures
Step 7
For services accessed by children, implement age assurance measures
Step 8
Prepare for Ofcom registration and reporting requirements (details to be confirmed)
Step 9
Document all safety measures and risk assessments
Step 10
Train moderation teams on UK illegal content definitions
Step 11
Consider designating a UK-based point of contact for Ofcom
Ofcom Online Safety Act guidance
Network and information security
Tech businesses providing cloud computing, online marketplaces, or search engines may be subject to the Network and Information Systems Regulations 2018 (NIS).
Network and Information Systems Regulations 2018 (NIS)
The NIS Regulations require operators of essential services and relevant digital service providers to implement security measures and report significant cyber incidents. Tech businesses providing cloud computing, online marketplaces, or search engines may be in scope.
Who must comply
Operators of Essential Services (OES): Critical
infrastructure in energy, transport, water, health, digital infrastructure.
Relevant Digital Service Providers (RDSP):
Online marketplaces (bringing together buyers and sellers)
Online search engines
Cloud computing services (IaaS, PaaS, SaaS)
Exemptions: Micro and small enterprises (fewer than 50 employees and annual turnover/balance sheet total under €10m) are exempt from RDSP requirements.
Security requirements
You must:
Implement appropriate technical and organisational measures to manage security risks
Take measures to prevent and minimise impact of incidents
Follow NCSC guidance and industry best practices
Maintain business continuity plans
Incident reporting
Notify the relevant authority within 72 hours of becoming aware of an
incident that has a significant impact on service continuity. Significant
impacts include:
Service unavailable to substantial number of users
Data breach affecting user data
Financial loss or reputational damage
For RDSPs, report to ICO. For OES, report to sector-specific regulator.
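A minimal sketch of recording an incident and checking the reporting clock, using the significance examples above as illustrative flags (real thresholds depend on your service and the relevant authority's guidance):

```typescript
// Sketch: NIS incident record with a 72-hour reporting deadline check.

interface NisIncident {
  summary: string;
  becameAwareAt: Date;
  substantialUserImpact: boolean;               // service unavailable to many users
  userDataBreached: boolean;
  materialFinancialOrReputationalLoss: boolean;
}

function isSignificant(i: NisIncident): boolean {
  return i.substantialUserImpact || i.userDataBreached || i.materialFinancialOrReputationalLoss;
}

function reportingDeadline(i: NisIncident): Date | null {
  if (!isSignificant(i)) return null; // still log and review internally
  return new Date(i.becameAwareAt.getTime() + 72 * 60 * 60 * 1000);
}
```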
NIS2 Directive (upcoming)
The EU's NIS2 Directive expands the scope and requirements of the original NIS regime. The UK plans comparable reforms through the Cyber Security and Resilience Bill, which is expected to extend coverage to more digital services and managed service providers.
Step 1
Determine if your business qualifies as an RDSP (check service type and size exemptions)
Step 2
If in scope, designate a point of contact and notify the ICO of your RDSP status
Step 3
Implement NCSC Cyber Essentials controls as a baseline
Step 4
Conduct regular risk assessments of your systems and supply chain
Step 5
Establish incident detection and response procedures
Step 6
Create an incident reporting plan with ICO notification within 72 hours
Step 7
Document all security measures and maintain evidence of compliance
Step 8
Review third-party supplier security arrangements
Step 9
Monitor NCSC alerts and advisories
NIS Regulations guidance
E-commerce requirements
All online businesses must comply with the Electronic Commerce Regulations 2002, which set out information requirements and contractual obligations.
Electronic Commerce (EC Directive) Regulations 2002
The E-Commerce Regulations set out information requirements for online businesses, liability rules for service providers, and requirements for commercial communications. All UK businesses providing information society services must comply.
Information you must provide
Your website or app must display:
Business name and legal status (Ltd, LLP, etc.)
Geographic address (not just PO Box)
Contact details including email address
Company registration number (if applicable)
VAT number (if VAT-registered)
Details of any trade body membership
Professional rules that apply (if regulated profession)
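Treating these details as structured data makes it easier to render them consistently (footer, About page, order confirmations) and to spot gaps. A minimal sketch with illustrative field names:

```typescript
// Sketch: required e-commerce business information as a single typed object,
// with a simple completeness check.

interface BusinessInfo {
  legalName: string;         // including legal status, e.g. "Example Ltd"
  geographicAddress: string; // a real address, not just a PO Box
  email: string;
  companyNumber?: string;    // if incorporated
  vatNumber?: string;        // if VAT-registered
  tradeBodyMemberships?: string[];
}

function missingMandatoryFields(info: BusinessInfo): string[] {
  const missing: string[] = [];
  if (!info.legalName) missing.push("legalName");
  if (!info.geographicAddress) missing.push("geographicAddress");
  if (!info.email) missing.push("email");
  return missing;
}
```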
Commercial communications
Marketing emails and promotional content must:
Be clearly identifiable as commercial communications
Identify the business sending them
Clearly identify promotional offers and conditions
Clearly identify competitions and prizes
Service provider liability
The regulations provide limited liability protections for:
Mere conduit: Transmitting user content without modification
Caching: Temporary automatic storage
Hosting: Storing user content, provided you act expeditiously
to remove illegal content when you become aware of it
Note: the Online Safety Act 2023 now imposes proactive safety duties that go beyond this reactive notice-and-takedown model.
Order placement requirements
For e-commerce transactions:
Provide appropriate means for identifying and correcting input errors
Provide contract terms and conditions in a way that allows storage and reproduction
Acknowledge receipt of orders without undue delay
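A minimal sketch of the order flow these requirements imply: a validation step where input errors can be corrected before submission, then an acknowledgement issued without undue delay. All names are illustrative:

```typescript
// Sketch: order placement with an explicit review/correction step and a
// prompt acknowledgement.

interface OrderDraft {
  items: { sku: string; quantity: number }[];
  email: string;
}

// Surface validation problems so the customer can correct them before the order is placed.
function validate(draft: OrderDraft): string[] {
  const errors: string[] = [];
  if (draft.items.length === 0) errors.push("Basket is empty");
  if (draft.items.some((i) => i.quantity < 1)) errors.push("Quantities must be at least 1");
  if (!draft.email.includes("@")) errors.push("Email address looks invalid");
  return errors;
}

// Once confirmed, acknowledge receipt without undue delay
// (e.g. an immediate confirmation email and on-screen confirmation).
function placeOrder(draft: OrderDraft): { orderId: string; acknowledgedAt: Date } {
  const errors = validate(draft);
  if (errors.length > 0) throw new Error(errors.join("; "));
  return { orderId: "ord-" + Date.now().toString(36), acknowledgedAt: new Date() };
}
```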
Step 1
Ensure your website displays all required business information prominently
Step 2
Create or update your Terms of Service and make them easily accessible
Step 3
Include clear contact details including geographic address and email
Step 4
For e-commerce, implement order confirmation emails
Step 5
Provide customers with ability to review and correct orders before submitting
Step 6
Make contract terms available in downloadable/printable format
Step 7
Clearly identify all promotional content and marketing communications
Step 8
Review your website footer and "About" pages for compliance
E-Commerce Regulations guidance
Consumer rights for digital content
If you sell software, apps, downloads, or SaaS subscriptions to consumers, the Consumer Rights Act 2015 gives customers specific protections.
Consumer Rights Act 2015 - Digital Content
The Consumer Rights Act includes specific provisions for digital content, giving consumers rights when purchasing software, apps, digital downloads, and online services. Digital content must be of satisfactory quality, fit for purpose, and as described.
What is digital content?
Digital content means data produced and supplied in digital form, including:
Software and applications
Music, video, and e-books
Games and in-game content
SaaS and cloud services
Digital downloads
Rights for digital content
Digital content must be:
Satisfactory quality: Meets standard a reasonable person
would consider satisfactory (free from bugs, fit for common purposes)
Fit for particular purpose: Suitable for any specific
purpose the consumer made known
As described: Matches any description, demonstration, or sample
Consumer remedies
If digital content doesn't meet these standards:
Repair or replacement first: Unlike goods, digital content has no 30-day short-term rejection right - repair/replacement is the first remedy
If repair/replacement is impossible or fails: Right to a price reduction of up to 100% of the price paid
No 30-day rejection: Digital content remedies differ from physical goods - be aware of this distinction
Compensation for damage
If faulty digital content damages a consumer's device or other digital
content, you may be liable for repair or compensation.
Pre-contract information requirements
Before purchase, you must provide:
Main characteristics of the digital content
Functionality and compatibility requirements
Whether ongoing updates will be provided
Total price including taxes
Step 1
Test software thoroughly before release to ensure satisfactory quality
Step 2
Clearly describe functionality, features, and system requirements
Step 3
Provide prominent information about compatibility and device requirements
Step 4
Implement quality assurance processes to identify bugs before release
Step 5
Create clear terms of service explaining update policies and support duration
Step 6
Establish customer support channels for reporting issues
Step 7
Develop procedures for handling refund requests within legal timeframes
Step 8
Include warranty information and consumer rights in purchase confirmation
Step 9
For subscription services, clearly state trial periods and cancellation terms
Step 10
Document all testing and quality control measures
Consumer Rights Act guidance
Computer security and penetration testing
Security professionals and businesses developing security tools must understand the Computer Misuse Act 1990 to avoid criminal liability.
Computer Misuse Act 1990
The Computer Misuse Act creates four offences: unauthorised access to computer material, unauthorised access with intent to commit further offences, unauthorised acts that impair the operation of a computer, and unauthorised acts causing (or creating a significant risk of) serious damage. Penalties scale with the offence, up to life imprisonment for the most serious offence under Section 3ZA.
In December 2025 the government committed to introducing a statutory defence for legitimate security researchers. Until reform takes effect, security professionals should ensure all testing is covered by explicit written authorisation so that access is not "unauthorised".
See the dedicated Computer Misuse Act guidance for full coverage of the offences and penalties, the 2025 reform commitment, and how to conduct security testing legally.
Computer Misuse Act 1990 (legislation)
Export controls for technology
If you export controlled technology, software, or provide technical services to foreign entities, you may need export licences under the Export Control Order 2008.
Export Control Order 2008 - Technology and Software
Certain technology, software, and cryptographic items require export licences under strategic export controls. This includes dual-use technology with potential military applications, encryption software, and cyber-surveillance tools.
Controlled items
You need an export licence for:
Cryptography: Encryption software exceeding certain key
lengths (56-bit symmetric, 512-bit asymmetric thresholds, with exceptions
for mass-market products)
Intrusion software: Tools designed to avoid detection
by monitoring tools or defeat protective countermeasures
Surveillance technology: Interception, monitoring, or
forensic analysis tools
Dual-use technology: Software/tech with potential military
applications (network analysis, cybersecurity tools beyond certain thresholds)
Technology transfer: Technical data, blueprints, know-how
for developing controlled items
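As a first-pass screening aid for the cryptography thresholds listed above, a minimal sketch that flags products whose key lengths exceed the control limits (classification requests to the ECJU remain the authoritative route, and other list entries and exemptions may still apply):

```typescript
// Sketch: first-pass screen against the symmetric/asymmetric key-length
// thresholds mentioned above. Real classification depends on the full
// dual-use list entry, applicable exemptions, and ECJU confirmation.

interface CryptoProduct {
  name: string;
  symmetricKeyBits?: number;  // e.g. AES-256 -> 256
  asymmetricKeyBits?: number; // e.g. RSA-2048 -> 2048
  massMarket: boolean;        // generally available retail product
}

function mayNeedLicence(p: CryptoProduct): boolean {
  const overSymmetric = (p.symmetricKeyBits ?? 0) > 56;
  const overAsymmetric = (p.asymmetricKeyBits ?? 0) > 512;
  // Mass-market products may be exempt, but confirm via a classification request.
  return (overSymmetric || overAsymmetric) && !p.massMarket;
}

console.log(mayNeedLicence({ name: "VPN appliance", symmetricKeyBits: 256, massMarket: false })); // true
```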
What counts as export?
Export includes:
Physical export from the UK
Electronic transmission to persons outside the UK (including cloud storage)
Making technology available to non-UK nationals in the UK
Providing services enabling use of controlled items abroad
Mass-market cryptography exemption
Generally available encryption products sold through retail channels without restriction may be exempt, but you should submit a classification request to the Export Control Joint Unit (ECJU) for confirmation.
Open source exemption
Cryptographic software in the public domain (open source) may be exempt,
but notification to Export Control Joint Unit may be required.
Embargoed destinations
Additional restrictions apply for exports to sanctioned countries (Russia,
Belarus, Iran, North Korea, Syria, etc.). Check current sanctions before
any export.
Step 1
Review your software and technology against the dual-use list in Annex I of the Export Control Order
Step 2
For cryptography, determine key lengths and whether products qualify as mass-market
Step 3
Submit classification requests to Export Control Joint Unit if uncertain
Step 4
For controlled items, apply for export licence before shipping or transmitting
Step 5
Check destination country against current UK sanctions lists
Step 6
Implement internal compliance procedures to screen customers and destinations
Step 7
Train staff on export control obligations, especially sales and technical teams
Step 8
Maintain records of export licence applications and decisions
Step 9
For cloud services, consider data residency implications
Step 10
If open source, determine if notification to ECJU is required
Export controls for technology
Web accessibility standards
The accessibility regulations are legally binding only for public sector websites and apps, but they represent best practice for all digital services, which in any case must make reasonable adjustments for disabled users under the Equality Act 2010.
Public Sector Bodies Accessibility Regulations 2018
Public sector websites and apps must meet accessibility standards (WCAG 2.1 Level AA). While legally binding only for public sector, these standards represent best practice for all digital services and may be legally required under Equality Act for commercial services.
Who must comply
Legally required for:
Public sector organisations (government, councils, NHS, schools)
Private companies delivering public sector websites/apps
Strongly recommended for all digital services due to Equality Act
obligations to make reasonable adjustments for disabled users.
WCAG 2.1 Level AA requirements
Websites and apps must be:
Perceivable: Information presented in ways users can perceive
(text alternatives, captions, adaptable layouts, sufficient contrast)
Operable: Interface components and navigation are operable
(keyboard accessible, sufficient time, no seizure triggers, navigable)
Understandable: Information and interface are understandable
(readable, predictable, input assistance)
Robust: Content works with assistive technologies
(valid HTML, proper ARIA, compatible with tools)
Accessibility statement
Public sector bodies must publish an accessibility statement covering:
Conformance status (fully compliant, partially compliant, not compliant)
Known issues and non-accessible content
How to request accessible alternatives
Enforcement procedure if not satisfied with response
Technical specifications used
Date of testing
Exemptions
Some content is exempt:
Pre-23 September 2018 PDFs and documents (unless essential to services)
Live video and audio (but captions required for pre-recorded)
Heritage collections (maps, scanned manuscripts)
Third-party content not funded or developed by the public body
Disproportionate burden (must be justified)
Step 1
Conduct accessibility audit using WCAG 2.1 Level AA criteria
Step 2
Test with screen readers (NVDA, JAWS, VoiceOver) and keyboard-only navigation
Step 3
Ensure all images have descriptive alt text
Step 4
Provide captions and transcripts for video/audio content
Step 5
Maintain minimum 4.5:1 contrast ratio for normal text (3:1 for large text)
Step 6
Ensure all functionality is keyboard accessible
Step 7
Use semantic HTML and proper heading structure
Step 8
Label form fields clearly and provide error messages
Step 9
Test with automated tools (WAVE, axe DevTools) and real users with disabilities
Step 10
Create and publish accessibility statement (if public sector)
Step 11
Establish process for handling accessibility feedback and requests
Step 12
Train developers on accessible coding practices
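The 4.5:1 contrast requirement in step 5 can be checked programmatically. A minimal sketch using the WCAG 2.1 relative-luminance and contrast-ratio formulas:

```typescript
// Sketch: WCAG 2.1 contrast ratio between two sRGB colours, using the
// relative-luminance and contrast-ratio definitions from the standard.

type Rgb = [number, number, number]; // 0-255 per channel

function relativeLuminance([r, g, b]: Rgb): number {
  const lin = [r, g, b].map((v) => {
    const c = v / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * lin[0] + 0.7152 * lin[1] + 0.0722 * lin[2];
}

function contrastRatio(a: Rgb, b: Rgb): number {
  const [l1, l2] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (l1 + 0.05) / (l2 + 0.05);
}

// Normal text needs at least 4.5:1; large text at least 3:1.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2)); // ~4.5 for #767676 on white
```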
Accessibility requirements guidance
AI and algorithmic transparency
If you deploy artificial intelligence or automated decision-making systems, multiple existing regulations apply, even though the UK doesn't yet have specific AI legislation.
AI Regulation and Algorithmic Transparency
While the UK does not yet have specific AI legislation, existing regulations apply to AI systems (GDPR, Equality Act, Consumer Protection, Product Safety). The government's pro-innovation approach focuses on sector-specific guidance. Businesses deploying AI must ensure transparency, fairness, and accountability.
Current regulatory landscape
No standalone AI Act exists yet, but multiple regulations apply:
UK GDPR: Automated decision-making requires human intervention
for significant decisions; right to explanation for solely automated decisions
Equality Act 2010: AI systems must not discriminate based
on protected characteristics
Consumer Protection: Misleading claims about AI capabilities
violate consumer protection law
Product Safety: AI-enabled products must be safe when placed on the market
UK AI regulation principles (2023 White Paper)
The government's pro-innovation framework sets out 5 cross-sector principles:
Safety, security and robustness: AI functions securely, safely, and reliably
Transparency and explainability: Appropriate transparency about AI use
Fairness: No unlawful discrimination or bias
Accountability and governance: Clear responsibility and oversight
Contestability and redress: Ability to challenge AI decisions
Automated decision-making (GDPR)
Individuals have the right not to be subject to solely automated decisions
with legal or similarly significant effects, unless:
Necessary for contract performance
Authorised by law
Based on explicit consent
You must provide information about logic involved, significance, and
envisaged consequences. Implement meaningful human review for high-impact decisions.
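A minimal sketch of gating solely automated, significant decisions behind human review before they take effect; the significantEffect flag and field names are illustrative placeholders for your own assessment of legal or similarly significant effects:

```typescript
// Sketch: significant automated decisions are held for meaningful human
// review rather than taking effect automatically.

interface AutomatedDecision {
  subjectId: string;
  outcome: "approve" | "decline";
  modelScore: number;
  significantEffect: boolean; // e.g. credit, hiring, housing decisions
}

type FinalDecision = AutomatedDecision & { reviewedBy?: string };

function finalise(
  d: AutomatedDecision,
  humanReviewer?: string
): { status: "final"; decision: FinalDecision } | { status: "pending-human-review" } {
  if (d.significantEffect && !humanReviewer) return { status: "pending-human-review" };
  return { status: "final", decision: { ...d, reviewedBy: humanReviewer } };
}
```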
Algorithmic bias and discrimination
AI systems making decisions about employment, credit, housing, services
must not discriminate based on protected characteristics. Test for bias
across demographic groups and implement fairness metrics.
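One common check is the gap in positive-outcome rates between groups (the demographic parity difference). A minimal sketch, noting that real fairness testing uses several metrics and the protected characteristics relevant to your use case:

```typescript
// Sketch: demographic parity difference across groups. A large gap in
// positive-outcome rates is a prompt for investigation, not proof of
// unlawful discrimination on its own.

interface Outcome {
  group: string;     // a characteristic category used for testing
  positive: boolean; // e.g. loan approved, candidate shortlisted
}

function positiveRateByGroup(outcomes: Outcome[]): Map<string, number> {
  const totals = new Map<string, { pos: number; all: number }>();
  for (const o of outcomes) {
    const t = totals.get(o.group) ?? { pos: 0, all: 0 };
    t.all += 1;
    if (o.positive) t.pos += 1;
    totals.set(o.group, t);
  }
  return new Map([...totals].map(([g, t]) => [g, t.pos / t.all]));
}

function parityDifference(outcomes: Outcome[]): number {
  const rates = [...positiveRateByGroup(outcomes).values()];
  return Math.max(...rates) - Math.min(...rates);
}
```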
Future regulation
The EU AI Act may influence the UK's approach; under it, high-risk AI systems (for example in employment, credit scoring, and law enforcement) face strict requirements. Monitor government consultations on UK AI legislation.
Step 1
Conduct algorithmic impact assessments for high-risk AI systems
Step 2
Implement transparency measures explaining AI decision-making to users
Step 3
Test AI models for bias across protected characteristics and demographic groups
Step 4
For automated decisions with significant impact, implement human review processes
Step 5
Document training data sources, model development, and testing procedures
Step 6
Provide clear information to users when AI is making decisions about them
Step 7
Establish procedures for users to contest AI decisions
Step 8
Maintain audit trails of AI decision-making
Step 9
Implement ongoing monitoring for model drift and emerging biases
Step 10
Create governance structures with clear accountability for AI systems
Step 11
Train staff on AI ethics and responsible deployment
Step 12
Stay informed about emerging AI regulation and sector-specific guidance
AI regulation guidance