The Online Safety Act 2023 Explained: What It Means for the UK, Users & Tech Platforms

The UK's Online Safety Act 2023 is now law and affects anyone who runs online services in the UK. If you have a website, app, or platform where people can post content or talk to each other, you likely need to follow these new rules.

This guide explains what the Online Safety Act means for your business. We'll cover who needs to comply, what the deadlines are, and what you need to do to avoid fines.

The Online Safety Act affects more than 100,000 online services worldwide. Since 17 March 2025, Ofcom has been able to fine companies that don't follow the rules. The time to prepare is over - this is active law with real penalties for businesses that don't comply.

Does the Online Safety Act Apply to Your Business? Quick Check

Many business owners want to know: "Do these new UK online safety rules apply to me?" Here's how to find out if your business needs to follow the Online Safety Act.

The Simple Test: If UK users can access your service AND they can post content or interact with each other, you probably need to comply. It doesn't matter if your business is based outside the UK.

Services That Must Follow These Rules:

  • Social media platforms
  • Video sharing sites like YouTube
  • Messaging apps
  • Online marketplaces
  • Gaming platforms where users talk to each other
  • Search engines
  • Dating apps
  • Any website where users can post comments or content

Location Doesn't Matter: Even if your business operates completely outside the UK, you still need to follow these rules if:

  • You have lots of UK users
  • You target UK customers
  • UK users are an important part of your business

Small Businesses Aren't Exempt: Your business size doesn't matter. Whether you're a startup or a huge company, if UK users can access your service, the basic safety rules apply to you.

If this quick check suggests the Act applies to your business, you have deadlines to meet. Some rules are already in effect as of March 2025.

Online Safety Act Deadlines 2025: When You Need to Act

Understanding when things need to be done is crucial. Ofcom has split enforcement into three phases, each with specific deadlines.

Phase 1: Illegal Content Rules (Already Started - March 2025)

The most urgent deadline has already passed. By 16 March 2025, all covered services should have completed their illegal content risk assessments.

From 17 March 2025, Ofcom can actively enforce these rules. This means your platform must now have working systems to find and remove illegal content. If you haven't done your risk assessment or put safety measures in place, you're already breaking the law.

Phase 2: Child Safety Rules (July 2025)

The next big deadline is coming fast. By 16 April 2025, all platforms must complete a "children's access assessment." This determines if children are likely to use your service.

If children do use your service, you'll need to complete a children's risk assessment by July 2025. That's when child safety duties become fully enforceable.

Phase 3: Extra Rules for Big Platforms (2026)

The biggest platforms will face additional rules starting in 2026. Ofcom will publish a list of these platforms in summer 2025.

What This Means Right Now: If you haven't started preparing for compliance, you face urgent deadlines. The illegal content deadline has passed, but you still have ongoing duties, so get legal advice immediately to understand where you stand.

Ofcom's enforcement director recently said "2025 is the year of action." The preparation time is over.

What Is the Online Safety Act? The Basics

Why This Law Exists

The Online Safety Act became law on 26 October 2023. It gives the regulator, Ofcom, the power to require online services to tackle content that's illegal or harmful to children.

But this law does much more than just remove bad content. It requires online services to prevent problems before they happen. Instead of waiting for harmful content to appear and then removing it, you now need to set up systems to stop it from appearing in the first place.

Think of it like this: instead of cleaning up messes after they happen, you need to child-proof your house to prevent accidents.

How the Law Developed

This law didn't appear overnight. It started with a government paper called the "Online Harms White Paper," published in 2019. After years of consultation with Parliament and industry, it became the law we have today.

The long development process explains why the law tries to balance safety with free speech. The government wanted to make the internet safer without stopping people from expressing their views legally.

Who Needs to Follow These Rules

The Online Safety Act applies much more broadly than similar laws in other countries. It covers more than 100,000 online services that UK users can access.

This includes:

  • Social media apps
  • Video sharing platforms
  • Private messaging apps
  • Online marketplaces
  • Gaming sites where users interact
  • Search engines
  • Dating services

The key point: if UK users can access your service and interact through it, you probably need to comply. Your company's location doesn't matter.

What You Need to Do: Key Requirements

Illegal Content Rules (Already in Effect)

Since March 2025, the most important change has been enforcement of illegal content rules. Ofcom can now take action against platforms that don't follow these rules.

You need to identify and remove illegal content across more than 130 types of crimes. These include:

  • Child sexual abuse material
  • Terrorist content
  • Fraud and scams
  • Hate speech

Your systems need to be "proportionate" to your platform size. A small forum has different expectations than Facebook. But even small platforms need systematic approaches to finding and stopping illegal content.

You must:

  • Complete comprehensive risk assessments
  • Put safety measures in place based on those assessments
  • Keep detailed records of what you're doing
  • Regularly review and update your approach

Child Protection Rules (Coming July 2025)

Child safety is the main focus of this law. If children are likely to use your service, you face much stricter rules.

First, every platform must complete a "children's access assessment" by 16 April 2025. This determines if children are likely to use your service. The test is broad - if children could reasonably access your service, even if they're not your target audience, the enhanced rules probably apply.
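
As a rough illustration of how the access test tends to be approached in practice, the sketch below treats a service as accessible to children unless robust age assurance keeps them out, and then asks whether children use it in significant numbers or are likely to be attracted to it. This is a simplification for illustration only, based on the description above, not a substitute for Ofcom's published guidance.

```python
def children_likely_to_access(
    has_highly_effective_age_assurance: bool,
    significant_child_user_base: bool,
    likely_to_attract_children: bool,
) -> bool:
    """Simplified, illustrative children's access assessment.

    Step 1: if robust age assurance already stops children getting in,
    the enhanced child-safety duties are unlikely to apply.
    Step 2: otherwise, the duties probably apply if children use the
    service in significant numbers or the service is likely to appeal
    to them, even if they are not the target audience.
    """
    if has_highly_effective_age_assurance:
        return False
    return significant_child_user_base or likely_to_attract_children

# Example: a gaming forum with no age checks and content that appeals
# to teenagers would come out as child-accessible.
print(children_likely_to_access(False, False, True))  # True
```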

If children do use your service, you must:

  • Complete a children's risk assessment by July 2025
  • Set up age verification where appropriate
  • Configure your algorithms to reduce children's exposure to harmful content
  • Design your interface with child safety in mind
  • Provide clear reporting tools for children and parents

Risk Assessment Requirements

The law requires you to understand the specific risks your platform creates. This isn't a one-size-fits-all approach - you need to look at your unique features and user base.

Your risk assessment must cover:

  • What types of harmful content might appear
  • How your platform's design might make problems worse
  • How likely different types of harm are to occur
  • How severe those harms could be

For example, you need to think about:

  • How your recommendation algorithm might promote harmful content
  • Whether your messaging features create opportunities for abuse
  • How live-streaming might be misused

This isn't a one-time task. You need to regularly update your assessment as your platform changes or you notice new risks.
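
To make this concrete, here is a minimal sketch of how a small platform might record its risk assessment as structured data, scoring each harm by likelihood and severity. The field names, categories, and 1-5 scoring scale are illustrative assumptions; the Act and Ofcom's guidance do not prescribe any particular format.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative scale: 1 (low) to 5 (high). Ofcom does not mandate any
# particular scoring scheme; this is just one way to structure the exercise.
@dataclass
class RiskEntry:
    harm_category: str        # e.g. a priority offence such as "fraud"
    platform_feature: str     # the feature that creates or amplifies the risk
    likelihood: int           # 1-5, how likely the harm is to occur
    severity: int             # 1-5, how severe the impact would be
    mitigations: list[str]    # safety measures you have put in place
    last_reviewed: date       # assessments must be kept up to date

    @property
    def risk_score(self) -> int:
        # Simple likelihood x severity score used to prioritise mitigations.
        return self.likelihood * self.severity

# Example entries for a hypothetical small forum with direct messaging.
register = [
    RiskEntry("fraud and scams", "user-to-user direct messages", 4, 3,
              ["keyword flagging", "report button", "human review queue"],
              date(2025, 3, 10)),
    RiskEntry("terrorism content", "public posts", 2, 5,
              ["hash matching against known material", "rapid takedown process"],
              date(2025, 3, 10)),
]

# Highest-risk items first, so mitigation effort follows the assessment.
for entry in sorted(register, key=lambda e: e.risk_score, reverse=True):
    print(f"{entry.harm_category}: score {entry.risk_score}, "
          f"last reviewed {entry.last_reviewed}")
```

Keeping the register in a structured, dated form like this also helps with the record-keeping and regular review duties described earlier.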

Ofcom's Role: The Regulator in Charge

What Ofcom Can Do

Ofcom is the regulator in charge of enforcing the Online Safety Act. Their powers under this law go far beyond their traditional remit over TV and radio broadcasting.

The Penalties Are Serious:

  • Fines up to £18 million OR 10% of your worldwide revenue (whichever is bigger)
  • Court orders that can block your service in the UK
  • In the most serious cases, criminal charges for company directors

For many international companies, losing access to the UK market is a bigger threat than the money penalties.

How Ofcom Enforces the Rules

Ofcom's enforcement approach combines helping companies comply with firm action against those who don't.

Ofcom's enforcement director recently said companies have a choice: "fix it now or we will take enforcement action." She called this the "last chance saloon."

But Ofcom's patience has limits. They've said they'll "drag [companies] kicking and screaming into compliance" if they have to. Companies that put profits above user safety will face the full force of enforcement.

Criminal Liability for Executives

One of the most serious parts of the law creates personal criminal liability for senior executives. If a company refuses to comply with Ofcom's orders about child safety, directors and senior managers could face criminal charges.

This ensures that online safety gets attention at the board level. Executives can't just delegate these responsibilities and assume they're not personally at risk.

These criminal powers won't be fully available until 2026, but Ofcom has made clear they "won't hesitate to use them if necessary."

Current Status: What's Happening Now

The Three-Phase Rollout

The law is being enforced in three phases, each with specific deadlines.

Phase 1: Illegal Content (Currently Enforced)

Illegal content rules are now in effect. As of 17 March 2025, Ofcom can actively enforce these rules. All covered services should have completed their illegal content risk assessments by 16 March 2025.

If you haven't done this, you're potentially breaking enforceable rules right now.

Phase 2: Child Safety (July 2025)

Child safety requirements are coming fast. All platforms must complete children's access assessments by 16 April 2025. If children use your service, you must complete children's risk assessments by July 2025.

Phase 3: Big Platform Rules (2026)

The biggest platforms will face extra rules starting in 2026. These include transparency reporting and giving users more control over what they see.

Enforcement Actions Already Started

Ofcom has already begun investigating companies that aren't following the rules.

They've announced investigations into two pornographic websites - Itai Tech Ltd and Score Internet Group LLC - because they don't have effective age checks to stop children accessing their content.

Ofcom also launched an investigation into an online suicide forum. This shows they're willing to take action across different types of platforms, not just mainstream social media.

These early actions show Ofcom is focusing on clear cases where platforms have failed to put basic safety measures in place, especially for child protection.

How Much Does Compliance Cost? Business Impact

The Financial Reality

Following the Online Safety Act costs money, especially for smaller businesses. There are concerns that compliance costs could be too high for small companies, and that big tech companies might pass their costs down to smaller businesses that use their services.

What You Need to Pay For:

  • Risk assessment (either internal staff or consultants)
  • Content moderation systems (technology and people)
  • Age verification technology (licensing and setup costs)
  • User reporting systems (support staff)
  • Ongoing monitoring and audits

For many platforms, full compliance means hiring new staff, buying new technology, and changing how their platform works. These changes often go far beyond just the safety team.

The law says safety measures should be "proportionate" to your platform's size and capabilities. But even small platforms need real risk management and safety measures.

Technical Challenges

Many platforms face big technical challenges in meeting the law's requirements while keeping their services user-friendly.

Age Verification: The law requires age checks to be "highly effective" at determining if someone is a child. Simple checkboxes asking "Are you over 18?" won't meet this standard for risky platforms. You might need identity document checks, credit card verification, or other robust methods.
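
As an illustration of what "highly effective" means in practice, the sketch below only grants access when age has been confirmed by a method generally considered robust (such as a document or card check carried out via a third-party provider) and rejects self-declaration on its own. The method names and the set treated as robust are assumptions for the example, not an Ofcom-approved list.

```python
from enum import Enum

class AgeCheckMethod(Enum):
    SELF_DECLARATION = "self_declaration"   # "Are you over 18?" checkbox
    ID_DOCUMENT = "id_document"             # passport/driving licence via a provider
    CREDIT_CARD = "credit_card"             # UK credit cards are only issued to adults
    FACIAL_AGE_ESTIMATION = "facial_age_estimation"

# Illustrative assumption: only these methods are treated as robust enough
# for a service carrying content that children must not access.
ROBUST_METHODS = {
    AgeCheckMethod.ID_DOCUMENT,
    AgeCheckMethod.CREDIT_CARD,
    AgeCheckMethod.FACIAL_AGE_ESTIMATION,
}

def may_access_adult_content(method: AgeCheckMethod, verified_over_18: bool) -> bool:
    """Return True only if age was confirmed by a robust method.

    A self-declared tick box is never sufficient on its own for
    higher-risk services under the "highly effective" standard.
    """
    return method in ROBUST_METHODS and verified_over_18

# Usage: a checkbox alone is rejected even if the user claims to be an adult.
assert not may_access_adult_content(AgeCheckMethod.SELF_DECLARATION, True)
assert may_access_adult_content(AgeCheckMethod.ID_DOCUMENT, True)
```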

Content Moderation: Your systems need to find illegal content across many categories while not removing too much legal content by mistake. This requires both automated tools and human reviewers who can handle complex decisions.
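
One common pattern for balancing automation against over-removal is to use classifier scores only to triage: remove automatically only at very high confidence, send borderline cases to human reviewers, and leave the rest up. The sketch below shows that routing logic; the thresholds and category names are illustrative assumptions, and the classifier itself is left out.

```python
from enum import Enum

class Action(Enum):
    REMOVE = "remove"           # high-confidence illegal content
    HUMAN_REVIEW = "review"     # borderline: a person decides
    LEAVE_UP = "leave_up"       # low risk: no action

# Illustrative thresholds; in practice these would be tuned per category
# and revisited as part of the regular risk-assessment review.
REMOVE_THRESHOLD = 0.95
REVIEW_THRESHOLD = 0.60

def triage(scores: dict[str, float]) -> Action:
    """Route a piece of content based on per-category classifier scores.

    `scores` maps a harm category (e.g. "fraud", "terrorism") to the
    classifier's confidence that the content falls into that category.
    """
    top_score = max(scores.values(), default=0.0)
    if top_score >= REMOVE_THRESHOLD:
        return Action.REMOVE
    if top_score >= REVIEW_THRESHOLD:
        return Action.HUMAN_REVIEW
    return Action.LEAVE_UP

# Usage: borderline content goes to a reviewer rather than being
# silently removed, which helps avoid over-blocking legal speech.
print(triage({"fraud": 0.72, "hate": 0.10}))   # Action.HUMAN_REVIEW
print(triage({"terrorism": 0.98}))             # Action.REMOVE
```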

Algorithm Auditing: You need to examine how your recommendation systems might promote harmful content and fix problems without breaking how your platform works. This requires deep technical analysis of complex systems.
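
A starting point for this kind of audit is simply to measure how often the recommender surfaces content your own moderation process later flags, split by audience. The sketch below computes that rate from a log of recommendation events; the event fields and groups are hypothetical, and a real audit would go much deeper than a single metric.

```python
from collections import defaultdict

# Hypothetical recommendation log entries: who saw what, and whether the
# item was later flagged by moderation as harmful.
events = [
    {"user_group": "child", "item_id": "a1", "flagged_harmful": False},
    {"user_group": "child", "item_id": "b2", "flagged_harmful": True},
    {"user_group": "adult", "item_id": "c3", "flagged_harmful": False},
    {"user_group": "adult", "item_id": "b2", "flagged_harmful": True},
]

def harmful_exposure_rate(events: list[dict]) -> dict[str, float]:
    """Share of recommended items per user group later flagged as harmful."""
    shown = defaultdict(int)
    flagged = defaultdict(int)
    for e in events:
        shown[e["user_group"]] += 1
        if e["flagged_harmful"]:
            flagged[e["user_group"]] += 1
    return {group: flagged[group] / shown[group] for group in shown}

# A high rate for the "child" group would be a signal to adjust ranking
# or filtering before the child safety duties are enforced.
print(harmful_exposure_rate(events))  # e.g. {'child': 0.5, 'adult': 0.5}
```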

How This Compares Globally

Relationship to EU Rules

The Online Safety Act is similar to the EU's Digital Services Act but goes further in some areas, especially child safety.

While the EU law focuses on general risk management, the UK's Online Safety Act sets out specific, detailed requirements for platforms that children might use. This includes age verification rules and algorithm safeguards that go beyond what other countries currently require.

These differences create challenges for international platforms. Some UK requirements might conflict with rules in other countries, especially around privacy and encryption.

Global Impact

Because the UK law applies to any platform with UK users, many international companies are implementing UK standards globally rather than creating UK-specific systems. This is simpler operationally and ensures consistent user experiences.

This means the UK law's influence extends far beyond UK users. However, some platforms might choose to block UK access rather than comply, especially privacy-focused services that view content scanning as incompatible with their values.

Major Concerns and Challenges

Free Speech Worries

The law's content rules have sparked debate about impacts on free speech. Critics worry that broad content removal requirements could lead to over-censorship, especially when automated systems remove borderline content to be safe.

The law tries to address these concerns by protecting journalism and democratic content. Large social media platforms must preserve access to journalistic content and political discussions. But doing this in practice requires sophisticated systems that can tell the difference between legitimate political discussion and harmful content.

Encryption and Privacy Issues

One of the most controversial parts of the law is the power to require platforms to scan encrypted messages for illegal content. These powers remain in the Act, and Ofcom could in future require companies to deploy scanning technology on private messaging services.

That prospect prompted companies like Signal to threaten to leave the UK market rather than weaken their encryption, creating tension between user privacy and content safety goals.

Privacy-focused services face difficult choices between following UK law and maintaining encryption standards that protect users worldwide.

How Companies Are Responding

Company responses vary based on size, resources, and risk tolerance.

Large platforms like Meta and TikTok have invested heavily in compliance systems, often doing more than the minimum to show good faith efforts and build positive relationships with the regulator.

Smaller platforms face bigger challenges, especially with age verification and content moderation technology. Many are exploring shared services and third-party tools to manage costs.

Some platforms have chosen to block UK access rather than comply fully, especially where requirements conflict with their core business models.

Your Compliance Checklist: What to Do Now

Immediate Steps

If you haven't started compliance preparations, you face urgent deadlines and potential enforcement action.

Complete Your Risk Assessment Immediately: Your illegal content risk assessment should have been done by 16 March 2025. If you haven't done this, you're potentially breaking the law right now.

Your assessment needs to look at:

  • What illegal content might appear on your platform
  • How your platform's features might make problems worse
  • All the priority crime categories in the law

Fix Your Content Moderation: Review and improve how you find and remove illegal content. Your system needs to work across all the priority categories in the law.

Your content moderation must be appropriate for your platform's size and risk level, but it needs to actually work at addressing the risks you've identified.

Sort Out Age Verification: If children might use your platform, you need appropriate age verification or age estimation technology. The standard requires these to be "highly effective," not just token gestures.

Think about whether children could reasonably access your service, even if they're not your target audience.

Long-Term Planning

Success requires strategic long-term planning, not just quick fixes.

Invest in Technology: You'll need ongoing investment in safety technology, including:

  • Content detection systems
  • User verification technology
  • Risk assessment tools

Budget for regular updates as both the law and harmful content evolve.

Build Your Team: You'll need internal expertise in online safety, compliance, and risk management. Many platforms will need to hire specialists or train existing staff.

Online safety isn't a one-time project - it needs constant attention and improvement.

Work with Others: Engage with Ofcom and other companies. The regulator prefers working together over enforcement action.

Industry collaboration can help manage costs and share best practices. Many challenges are common across platforms.

What's Coming Next

Future Rule Changes

The Online Safety Act will keep evolving as Ofcom publishes more guidance.

According to Ofcom's timeline:

  • Register of categorised services: Summer 2025
  • Draft transparency notices: A few weeks after the register
  • Final transparency notices: Soon after the drafts
  • Additional duties for big platforms: Early 2026

These changes will create extra requirements for the biggest and highest-risk platforms.

International Trends

Other countries are watching the UK's implementation closely. Similar laws are being developed worldwide, which could lead to more consistent rules globally but also more complexity for platforms operating in multiple countries.

The UK's child safety rules are particularly influential. The detailed age verification and algorithm requirements are likely to become international best practices.

Key Takeaways

The Online Safety Act fundamentally changes how digital platforms must operate in the UK. With enforcement now active and penalties severe, compliance isn't optional for any platform with UK users.

Do This Now:

  • Make sure you've completed required risk assessments
  • Put appropriate safety measures in place
  • Understand that illegal content duties are already enforceable
  • Prepare for child safety requirements in July 2025

Strategic Importance: Online safety is now a core business function that needs board-level attention and ongoing investment. Don't treat this as just a technical compliance exercise.

Get Professional Help: The law is complex and the penalties are severe. Professional legal advice is essential for developing effective compliance strategies.

The Bottom Line: The UK is now a leader in digital safety regulation. Understanding and following these requirements isn't just about avoiding fines - it's about contributing to a safer internet for everyone.

For any business operating online services accessible to UK users, the message is clear: comprehensive online safety compliance is now required by law. The regulator has shown they're willing to take enforcement action, and the preparation period is over.
