The UK's Online Safety Act 2023 is now law and affects anyone who runs online services that people in the UK can use. If you have a website, app, or platform where people can post content or talk to each other, you likely need to follow these new rules.
This guide explains what the Online Safety Act means for your business. We'll cover who needs to comply, what the deadlines are, and what you need to do to avoid fines.
The Online Safety Act affects more than 100,000 online services worldwide. Since 17 March 2025, Ofcom has been able to fine companies that don't follow the rules. The time to prepare is over - the law is in force, with real penalties for businesses that don't comply.
Many business owners want to know: "Do these new UK online safety rules apply to me?" Here's how to find out if your business needs to follow the Online Safety Act.
The Simple Test: If UK users can access your service AND they can post content or interact with each other, you probably need to comply. It doesn't matter if your business is based outside the UK.
Services That Must Follow These Rules: The Act covers user-to-user services - anywhere people can post, share, or message each other, such as social media platforms, forums, messaging apps, and video-sharing sites - as well as search services and sites that publish pornographic content.
Location Doesn't Matter: Even if your business operates completely outside the UK, you still need to follow these rules if your service has links with the UK - for example, a significant number of UK users, the UK as a target market, or the fact that UK users can access it and could be significantly harmed by content on it.
Small Businesses Aren't Exempt: Your business size doesn't matter. Whether you're a startup or a huge company, if UK users can access your service, the basic safety rules apply to you.
If this quick check suggests the Act applies to your business, you have deadlines to meet. Some rules are already in effect as of March 2025.
Understanding when things need to be done is crucial. Ofcom has split enforcement into three phases, each with specific deadlines.
Phase 1: Illegal Content Rules (Already Started - March 2025) The most urgent deadline has already passed. By 16 March 2025, all covered services should have completed their illegal content risk assessments.
From 17 March 2025, Ofcom can actively enforce these rules. This means your platform must now have working systems to find and remove illegal content. If you haven't done your risk assessment or put safety measures in place, you're already breaking the law.
Phase 2: Child Safety Rules (July 2025) The next big deadline is coming fast. By 16 April 2025, all platforms must complete a "children's access assessment." This determines if children are likely to use your service.
If children do use your service, you'll need to complete a children's risk assessment by July 2025. That's when child safety duties become fully enforceable.
Phase 3: Extra Rules for Big Platforms (2026) The biggest platforms will face additional rules starting in 2026. Ofcom will publish a list of these platforms in summer 2025.
What This Means Right Now: If you haven't started preparing for compliance, you're facing urgent deadlines. The illegal content deadline has already passed, and your duties under it are ongoing - get legal advice immediately to understand where you stand.
Ofcom's enforcement director recently said "2025 is the year of action." The preparation time is over.
The Online Safety Act became law on 26 October 2023. It places legal duties on online services to tackle content that is illegal or harmful to children, with Ofcom enforcing those duties.
But this law does much more than just remove bad content. It requires online services to prevent problems before they happen. Instead of waiting for harmful content to appear and then removing it, you now need to set up systems to stop it from appearing in the first place.
Think of it like this: instead of cleaning up messes after they happen, you need to child-proof your house to prevent accidents.
This law didn't appear overnight. It started with a government paper published in 2019 - the "Online Harms White Paper" - and after years of discussion with parliament and businesses, it became the law we have today.
The long development process explains why the law tries to balance safety with free speech. The government wanted to make the internet safer without stopping people from expressing their views legally.
The Online Safety Act applies much more broadly than similar laws in other countries. It covers more than 100,000 online services that UK users can access.
This includes social media platforms, video-sharing sites, messaging apps, online forums and community sites, dating apps, gaming services with chat features, and search engines.
The key point: if UK users can access your service and interact through it, you probably need to comply. Your company's location doesn't matter.
Since March 2025, the most important change has been enforcement of illegal content rules. Ofcom can now take action against platforms that don't follow these rules.
You need to identify and remove illegal content across more than 130 types of crimes. These include terrorism, child sexual exploitation and abuse, fraud and financial crime, drug and weapons offences, hate offences, harassment and stalking, encouraging or assisting suicide, and intimate image abuse.
Your systems need to be "proportionate" to your platform size. A small forum has different expectations than Facebook. But even small platforms need systematic approaches to finding and stopping illegal content.
You must complete and keep a written record of your illegal content risk assessment, put proportionate measures in place to reduce the risks you identify, take down illegal content swiftly once you become aware of it, and give users clear ways to report content and complain.
Child safety is the main focus of this law. If children are likely to use your service, you face much stricter rules.
First, every platform must complete a "children's access assessment" by 16 April 2025. This determines if children are likely to use your service. The test is broad - if children could reasonably access your service, even if they're not your target audience, the enhanced rules probably apply.
If children do use your service, you must complete a children's risk assessment, use highly effective age assurance where the risks demand it, prevent children from encountering the most harmful content (such as pornography and material promoting suicide, self-harm, or eating disorders), and protect them from other content that is harmful to their age group.
The law requires you to understand the specific risks your platform creates. This isn't a one-size-fits-all approach - you need to look at your unique features and user base.
Your risk assessment must cover who uses your platform, the types of illegal and harmful content that could appear on it, and how your features affect those risks. For example, you need to think about whether your recommendation system could amplify harmful material, whether anonymous accounts make abuse easier, and whether direct messages could be used for grooming or fraud.
This isn't a one-time task. You need to regularly update your assessment as your platform changes or you notice new risks.
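To make that concrete, the sketch below shows one way a small team might keep a living risk register in code so entries get revisited on a schedule. It is a minimal illustration with made-up field names and a made-up review interval, not a format required by the Act or by Ofcom.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskEntry:
    """One identified risk on the platform (illustrative structure only)."""
    feature: str              # e.g. "direct messaging", "recommendation feed"
    harm: str                 # e.g. "grooming", "fraud", "hate offences"
    likelihood: str           # "low" / "medium" / "high"
    impact: str               # "low" / "medium" / "high"
    mitigations: list[str] = field(default_factory=list)
    last_reviewed: date = field(default_factory=date.today)

def needs_review(entry: RiskEntry, max_age_days: int = 90) -> bool:
    """Flag entries that have not been revisited recently (threshold is arbitrary)."""
    return (date.today() - entry.last_reviewed).days > max_age_days

# Example: record one risk and check whether any entries are overdue for review.
register = [
    RiskEntry(
        feature="direct messaging",
        harm="grooming of child users",
        likelihood="medium",
        impact="high",
        mitigations=["DMs off by default for under-18 accounts",
                     "report button in every conversation"],
    )
]
overdue = [entry for entry in register if needs_review(entry)]
```

Keeping the register in a structured form like this makes it easier to show Ofcom a dated record of what you assessed, what you changed, and when you last looked at it.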
Ofcom is the regulator in charge of enforcing the Online Safety Act, and it now has far greater powers than it had when it oversaw only TV and radio. It can fine non-compliant companies up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and in the most serious cases it can ask the courts for business disruption measures that cut a service off from the UK.
For many international companies, losing access to the UK market is a bigger threat than the financial penalties.
Ofcom's enforcement approach combines helping companies comply with firm action against those who don't.
Ofcom's enforcement director recently said companies have a choice: "fix it now or we will take enforcement action." She called this the "last chance saloon."
But Ofcom's patience has limits. They've said they'll "drag [companies] kicking and screaming into compliance" if they have to. Companies that put profits above user safety will face the full force of enforcement.
One of the most serious parts of the law creates personal criminal liability for senior executives. If a company refuses to comply with Ofcom's orders about child safety, directors and senior managers could face criminal charges.
This ensures that online safety gets attention at the board level. Executives can't just delegate these responsibilities and assume they're not personally at risk.
These criminal powers won't be fully available until 2026, but Ofcom has made clear they "won't hesitate to use them if necessary."
The law is being enforced in three phases, each with specific deadlines.
Phase 1: Illegal Content (Currently Enforced) Illegal content rules are now in effect. As of 17 March 2025, Ofcom can actively enforce these rules. All covered services should have completed their illegal content risk assessments by 16 March 2025.
If you haven't done this, you're potentially breaking enforceable rules right now.
Phase 2: Child Safety (July 2025) Child safety requirements are coming fast. All platforms must complete children's access assessments by 16 April 2025. If children use your service, you must complete children's risk assessments by July 2025.
Phase 3: Big Platform Rules (2026) The biggest platforms will face extra rules starting in 2026. These include transparency reporting and giving users more control over what they see.
Ofcom has already begun investigating companies that aren't following the rules.
They've announced investigations into two pornographic websites - Itai Tech Ltd and Score Internet Group LLC - over concerns that they lack highly effective age checks to stop children accessing their content.
Ofcom also launched an investigation into an online suicide forum. This shows they're willing to take action across different types of platforms, not just mainstream social media.
These early actions show Ofcom is focusing on clear cases where platforms have failed to put basic safety measures in place, especially for child protection.
Following the Online Safety Act costs money, especially for smaller businesses. There are concerns that compliance costs could be too high for small companies, and that big tech companies might pass their costs down to smaller businesses that use their services.
For many platforms, full compliance means hiring new staff, buying new technology, and changing how their platform works. These changes often go far beyond just the safety team.
The law says safety measures should be "proportionate" to your platform's size and capabilities. But even small platforms need real risk management and safety measures.
Many platforms face big technical challenges in meeting the law's requirements while keeping their services user-friendly.
Age Verification: The law requires age checks to be "highly effective" at determining if someone is a child. Simple checkboxes asking "Are you over 18?" won't meet this standard for risky platforms. You might need identity document checks, credit card verification, or other robust methods.
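As a rough illustration, the sketch below routes a request to a stronger age-assurance method when the content is high risk, rather than relying on a self-declaration checkbox. The risk tiers and method names are assumptions made for the example, not techniques endorsed by Ofcom.

```python
from enum import Enum

class ContentRisk(Enum):
    LOW = 1    # general content
    HIGH = 2   # pornography and other content children must not encounter

def choose_age_check(content_risk: ContentRisk, self_declared_adult: bool) -> str:
    """Pick an age-assurance route for a request (illustrative logic only).

    A self-declaration checkbox is never treated as enough for high-risk
    content; those requests go to a stronger method such as a document
    check or facial age estimation (hypothetical method names).
    """
    if content_risk is ContentRisk.HIGH:
        return "document_check_or_facial_age_estimation"
    if not self_declared_adult:
        return "deny_access"
    return "self_declaration_accepted"

# Example: a visitor requesting high-risk content is always routed to the
# stronger check, regardless of what box they tick.
result = choose_age_check(ContentRisk.HIGH, self_declared_adult=True)
assert result == "document_check_or_facial_age_estimation"
```

The point of the example is the routing logic, not any particular verification vendor: the "highly effective" standard is about whether the method you choose actually keeps children out, assessed against the risk of your content.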
Content Moderation: Your systems need to find illegal content across many categories while not removing too much legal content by mistake. This requires both automated tools and human reviewers who can handle complex decisions.
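A common pattern for balancing those two goals is to auto-remove only high-confidence matches and queue borderline scores for a human reviewer. The sketch below shows that idea; the classifier is a stand-in for whichever model or vendor service you actually use, and the thresholds are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Decision:
    action: str   # "remove", "human_review", or "allow"
    reason: str

def moderate(text: str,
             classify: Callable[[str], dict[str, float]],
             remove_threshold: float = 0.95,
             review_threshold: float = 0.60) -> Decision:
    """Two-tier moderation: auto-remove only high-confidence hits,
    queue borderline cases for a human, allow the rest.

    `classify` is a placeholder for your own model or vendor API and is
    assumed to return per-category scores between 0 and 1.
    """
    scores = classify(text)
    category, score = max(scores.items(), key=lambda kv: kv[1])
    if score >= remove_threshold:
        return Decision("remove", f"auto: {category} ({score:.2f})")
    if score >= review_threshold:
        return Decision("human_review", f"queued: {category} ({score:.2f})")
    return Decision("allow", "below thresholds")

# Example with a dummy classifier standing in for a real model.
fake_classifier = lambda _text: {"fraud": 0.72, "hate": 0.10}
print(moderate("cheap investment, guaranteed returns", fake_classifier))
# -> Decision(action='human_review', reason='queued: fraud (0.72)')
```

Keeping a human in the loop for borderline cases is also how you limit over-removal of legal content, which matters for the free-speech concerns discussed later in this guide.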
Algorithm Auditing: You need to examine how your recommendation systems might promote harmful content and fix problems without breaking how your platform works. This requires deep technical analysis of complex systems.
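One simple place to start is measuring how often your recommender surfaces items your moderation process has already flagged. The sketch below estimates that exposure rate; `recommend` and `flagged_ids` are placeholders for your own systems, and this is only one of several metrics a real audit would need.

```python
def harmful_exposure_rate(recommend,
                          flagged_ids: set[str],
                          sample_users: list[str],
                          k: int = 20) -> float:
    """Estimate how often top-k recommendations contain flagged items.

    `recommend(user, k)` stands in for your own recommendation call and
    `flagged_ids` for your moderation labels (both are assumptions).
    """
    exposed = 0
    total = 0
    for user in sample_users:
        for item_id in recommend(user, k):
            total += 1
            if item_id in flagged_ids:
                exposed += 1
    return exposed / total if total else 0.0

# Example with stub data: audit a dummy recommender against known-bad items.
stub_recommend = lambda user, k: [f"item{i}" for i in range(k)]
rate = harmful_exposure_rate(stub_recommend, {"item3", "item7"}, ["u1", "u2"])
print(f"{rate:.1%} of recommended slots contained flagged content")
```

Tracking a metric like this over time gives you evidence that algorithm changes are reducing exposure to harmful content rather than quietly increasing it.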
The Online Safety Act is similar to the EU's Digital Services Act but goes further in some areas, especially child safety.
While the EU law focuses on general risk management, the UK's Online Safety Act gives specific detailed requirements for platforms that children might use. This includes specific age verification rules and algorithm safeguards that go beyond what other countries currently require.
These differences create challenges for international platforms. Some UK requirements might conflict with rules in other countries, especially around privacy and encryption.
Because the UK law applies to any platform with UK users, many international companies are implementing UK standards globally rather than creating UK-specific systems. This is simpler operationally and ensures consistent user experiences.
This means the UK law's influence extends far beyond UK users. However, some platforms might choose to block UK access rather than comply, especially privacy-focused services that view content scanning as incompatible with their values.
The law's content rules have sparked debate about impacts on free speech. Critics worry that broad content removal requirements could lead to over-censorship, especially when automated systems remove borderline content to be safe.
The law tries to address these concerns by protecting journalism and democratic content. Large social media platforms must preserve access to journalistic content and political discussions. But doing this in practice requires sophisticated systems that can tell the difference between legitimate political discussion and harmful content.
One of the most controversial parts of the law involves potentially requiring platforms to scan encrypted messages for illegal content. The power remains in the Act, although the government has said Ofcom will only use it where doing so is technically feasible.
That assurance followed threats from companies like Signal to leave the UK market rather than weaken their encryption, and the underlying tension between user privacy and content safety remains unresolved.
Privacy-focused services face difficult choices between following UK law and maintaining encryption standards that protect users worldwide.
Company responses vary based on size, resources, and risk tolerance.
Large platforms like Meta and TikTok have invested heavily in compliance systems, often doing more than the minimum to show good faith efforts and build positive relationships with the regulator.
Smaller platforms face bigger challenges, especially with age verification and content moderation technology. Many are exploring shared services and third-party tools to manage costs.
Some platforms have chosen to block UK access rather than comply fully, especially where requirements conflict with their core business models.
If you haven't started compliance preparations, you face urgent deadlines and potential enforcement action.
Complete Your Risk Assessment Immediately: Your illegal content risk assessment should have been done by 16 March 2025. If you haven't done this, you're potentially breaking the law right now.
Your assessment needs to look at who your users are, what illegal content could realistically appear on your platform, and how your design choices - recommendations, direct messaging, anonymity, search - could increase or reduce those risks.
Fix Your Content Moderation: Review and improve how you find and remove illegal content. Your system needs to work across all the priority categories in the law.
Your content moderation must be appropriate for your platform's size and risk level, but it needs to actually work at addressing the risks you've identified.
Sort Out Age Verification: If children might use your platform, you need appropriate age verification or age estimation technology. The standard requires these to be "highly effective," not just token gestures.
Think about whether children could reasonably access your service, even if they're not your target audience.
Success requires strategic long-term planning, not just quick fixes.
Invest in Technology: You'll need ongoing investment in safety technology, including content moderation tooling, age assurance, systems for detecting and classifying illegal content, and reporting and complaints handling.
Budget for regular updates as both the law and harmful content evolve.
Build Your Team: You'll need internal expertise in online safety, compliance, and risk management. Many platforms will need to hire specialists or train existing staff.
Online safety isn't a one-time project - it needs constant attention and improvement.
Work with Others: Engage with Ofcom and other companies. The regulator prefers working together over enforcement action.
Industry collaboration can help manage costs and share best practices. Many challenges are common across platforms.
The Online Safety Act will keep evolving as Ofcom publishes more guidance.
According to Ofcom's timeline, the register of the largest "categorised" services is expected in summer 2025, with their additional duties taking effect from 2026.
These changes will create extra requirements for the biggest and highest-risk platforms.
Other countries are watching the UK's implementation closely. Similar laws are being developed worldwide, which could lead to more consistent rules globally but also more complexity for platforms operating in multiple countries.
The UK's child safety rules are particularly influential. The detailed age verification and algorithm requirements are likely to become international best practices.
The Online Safety Act fundamentally changes how digital platforms must operate in the UK. With enforcement now active and penalties severe, compliance isn't optional for any platform with UK users.
Strategic Importance: Online safety is now a core business function that needs board-level attention and ongoing investment. Don't treat this as just a technical compliance exercise.
Get Professional Help: The law is complex and the penalties are severe. Professional legal advice is essential for developing effective compliance strategies.
The Bottom Line: The UK is now a leader in digital safety regulation. Understanding and following these requirements isn't just about avoiding fines - it's about contributing to a safer internet for everyone.
For any business operating online services accessible to UK users, the message is clear: comprehensive online safety compliance is now required by law. The regulator has shown they're willing to take enforcement action, and the preparation period is over.