The UK Online Safety Act Is Here: What It Means & How to Prepare


The debates have closed and the ink is now dry: the UK’s Online Safety Bill has passed Parliament and, with Royal Assent, is now the Online Safety Act.


NOTE: On October 26, 2023, the Online Safety Bill became the Online Safety Act; this article has been edited to reflect those changes.

The UK Online Safety Act, a topic of heated debate for nearly five years, has finally come to the forefront. Both The Times and The Economist have characterized the legislation as a potential threat to the internet as we know it. Still, regardless of your stance, one thing is clear: the UK Online Safety Act will play a significant role in shaping the future of the internet.

Much like the Digital Services Act, the Online Safety Act was designed to make social media companies more responsible for their users’ safety. However, unlike the DSA, the UK Online Safety Act takes a more explicit approach, defining what is legal and illegal within the UK’s jurisdiction and providing greater detail on the actions platforms should take in response. These requirements are likely to differ from the way UGC platforms currently handle Terms of Service violations.

In a further departure from the DSA, the new Act enlists Ofcom, the UK’s independent communications regulator, to take charge of regulatory enforcement. Platforms will have to show they have processes in place to meet the requirements set out by the Act by mid-2024.

As the UK Online Safety Act moves into implementation, tech companies, particularly those not categorized as VLOPs, could face a concurrent implementation of both the DSA and the OSA. Considering the breakneck speed at which many companies are preparing for the DSA, this simultaneous implementation could intensify the pressure to meet both sets of legislative requirements.

With that context, let’s dive deep into the intricacies of this new legislation and its implications for businesses, social media and UGC platforms, as well as internet users at large.

What is the Online Safety Act?

For some time, social media companies and UGC platforms have faced criticism for their perceived sluggishness in addressing harmful or illegal content on their platforms. The UK Online Safety Act (OSA) is a legislative effort designed to remedy the situation. It squarely tackles some of the internet’s most pressing concerns, including misinformation and freedom of expression, child sexual exploitation and abuse, and content that encourages suicide. Through a set of regulatory requirements, it aims to curtail the dissemination of harm in these and other areas.

Overall, the main goal of the OSA is to hold platforms and other online services accountable for the safety and wellbeing of their users. Failure to do so will result in significant fines and penalties depending on the type of violation committed.

How the Online Safety Act Protects Children

Under the Act, social media companies are legally obligated to prioritize children's and young people's safety and wellbeing on their platforms, with the legislation explicitly calling for stronger protective measures for children than for adults. From age verification to content filtering, the Act outlines a comprehensive strategy to make the internet a more secure playground for its youngest users.

Platforms will have to:

Handle Illegal Content Removals Comprehensively: The OSA introduces fresh regulations regarding what is deemed illegal content and how it should be managed. Platforms will be mandated to adopt a proactive approach wherever technically feasible. Additionally, they must clearly outline in their Terms of Service the categories of illegal content that must be removed and the role of proactive technology in the process.

Restrict Access to Content Deemed Harmful: Measures must be in place to block children from accessing harmful or age-inappropriate content, building on previous legislation to block access to pornographic websites.

Age Verification and Estimation: Enforce age limits and age-checking measures, either through direct age verification or through frameworks that can reliably estimate age. While optional in the past, this is now a hard requirement for multi-age platforms.

Transparency: Multi-age platforms must publish risk assessments of potential harms to children, addressing risks across age groups, especially where adults can contact children. They should also share their plans for risk reduction, such as governance models and proactive technology.

Reporting Mechanisms: Parents and children should have easy ways to report issues online.

Triple Shield: Protecting Adults

The OSA mandates a multi-layered approach to moderating online platforms' content and users, focusing on key elements from illegal content to user controls. The good news is that many online platforms already have these measures in place, although sometimes in piecemeal fashion. The OSA standardizes the criteria to reduce the existing variation.

For our analysis, we'll call this multi-layered approach a "triple shield."

1. Preventing and Removing Illegal Content

Under the first layer of the “triple shield,” online platforms are tasked with the rapid detection and removal of content and activities deemed illegal under the OSA. This means everything from hate crimes and fraudulent behavior to severe cases of harassment. Building upon the proactive approach, platforms must prioritize safety by incorporating algorithms or related features to manage user interactions and content. They must also establish transparent reporting systems, akin to DSA requirements.

2. Adherence to Platform Terms and Conditions

The second layer requires Category 1 services (those with the largest user bases and highest risk) to enforce their own terms and conditions more precisely. Platforms are now legally obligated to adhere strictly to their terms of service, beyond the content the UK already defines as illegal. Since the OSA requires platforms to clearly define their content moderation methods, content in policy areas not explicitly defined and outlined in the terms of service is not to be taken down or restricted. Conversely, if a platform's terms prohibit certain types of speech or content, the company is legally obligated to monitor and remove such content.

This places an additional layer of accountability on platforms to adhere to their own community guidelines in a transparent and equitable way.

3. Empowering Users With Control Tools

Category 1 services must offer adults sophisticated tools to control the content they encounter and the people they interact with.

Features could range from advanced filtering options that let users block content categories or keywords, to muting or blocking interactions from unverified or anonymous accounts. The emphasis here is on providing users with the power to curate their own online environment.

Types of Content to Be Addressed

Illegal Content Categories

The Act targets various types of illegal content, such as:

  • Child sexual exploitation and abuse
  • Fraud
  • Hate crimes
  • Terrorism

While the majority of large platforms' policies already address content falling within these categories, it’s likely that certain adjustments will be necessary to align with the more specific definitions provided for each category of illegal content, as well as the compliance guidance promised by Ofcom.

Further Restrictions on Content Accessible by Children

Content that is not illegal can still be harmful to children. The Act includes measures to protect children from:

  • Pornographic content
  • Bullying
  • Content encouraging self-harm or eating disorders

You can see the most recent list of content to be tackled here.

Regulatory Oversight by Ofcom

Ofcom is the regulator for communications services in the UK and has the power to enforce these new laws on the regulated platforms.

The OSA stipulates that companies that fail to comply can face fines of up to £18 million or 10% of their global turnover, whichever is greater. Ofcom's oversight powers will be introduced in a phased approach, initially concentrating on addressing illegal content.

These laws will apply to all social media and UGC platforms accessible to UK users, irrespective of where the company is based. However, just like the Digital Services Act, there are different categories of services which are affected in different ways.

Categorization of Services: What Applies to You?

Ofcom will categorize regulated services, and this categorization is still in progress.

According to Ofcom:  

“Once the new laws are enacted, Ofcom will be required to carry out research to help advise the Government on the thresholds it sets in secondary legislation. Ofcom will then produce a list of categorized services based on these thresholds.”

How is the Online Safety Act different from the Digital Services Act?

Both the EU’s Digital Services Act (DSA) and the UK’s Online Safety Act (OSA) aim to regulate the digital landscape but differ significantly in scope, focus, and requirements.

Some of the key differences are:

Scope and Detail: While the DSA is broad in its coverage, focusing on multiple areas like intellectual property and illegal goods, the OSA is narrower but provides a more granular and detailed approach, mainly focusing on content the UK deems illegal or harmful.

Definition of Illegal Content: The DSA offers flexibility by not providing a specific definition of illegal content, leaving that to EU member states. The OSA is more explicit, categorizing illegal content into general offenses and higher-tier "priority offenses" such as child sexual exploitation and abuse and terrorist content.

Platform Classification: The DSA identifies Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), designated by the European Commission. The OSA classifies platforms into categories, including Category 1 for the largest and highest-risk services, with different levels of diligence required.

Protection of Minors: While the OSA aims for a comprehensive protective framework within one legislative instrument, the DSA addresses minor safety at a higher level and is supplemented by other EU regulations like the Audiovisual Media Services Directive (AVMSD).

The UK Online Safety Act isn't just a collection of new laws; it's a profound transformation in how we approach Trust and Safety on the internet.

For users, this translates into a more consistent and well-defined experience across different platforms and their policies, empowering users to shape their digital lives as they see fit.

For companies, it entails integrating safety-by-design principles into their operations and adjusting their content moderation systems to meet the new standards, under the threat of substantial penalties for non-compliance.

We will be keeping a close eye on the new wave of regulations underway so we can help the community comply in an easy, foolproof way.

We've also written about the differences between the Online Safety Act and The Digital Services Act.

Meet the Author

Jessica Dees

Jessica is the Director of Trust & Safety Policy and Operations at TrustLab.
