Online Safety

UK's Online Safety Act is a seismic regulatory shift for service providers

Published on 26th Oct 2023

Implementation raises complex challenges for businesses when navigating both domestic and international legislation

The UK government's long-awaited Online Safety Act (OSA), which has finally received Royal Assent, aims to create a new regulatory framework to protect users from objectionable content and make the online world a safer place.

The coming into force of the OSA represents a seismic shift in the regulatory environment for many online service providers, which were previously not subject to any safety regulation in the UK and will now fall under Ofcom's jurisdiction.

Broadly, the OSA applies to platforms with users in the UK that host user-generated content or facilitate interaction between users online. It imposes duties of care on these businesses to have systems and processes in place to improve user safety. It also appoints Ofcom as the online safety regulator and sets out its enforcement powers.

What will it attempt to tackle?

The OSA introduces numerous duties of care intended to make the online landscape a safer place. Which of these duties apply will depend on the nature and size of the online platform.

One of the main duties relates to illegal content. All in-scope services will be required to undertake detailed risk assessments to ascertain the risks of illegal content appearing on their service, have measures in place to prevent access to illegal content (for example, terrorism content, child sexual exploitation and abuse content, threats to kill, harassment and fraud) and mitigate the risks of harm identified in those assessments. This sits alongside a duty to have robust content-reporting systems in place, with which platforms caught by the EU Digital Services Act will be familiar.

All in-scope services will also need to carry out an assessment of whether children are likely to access the service. If this is the case, there are specific duties aimed at tackling content that may not be illegal but may still be harmful (for example, content that promotes suicide, self-harm or eating disorders).

The biggest user-to-user services – the threshold is yet to be established – will also need to implement "user empowerment" features to give adults more control over whether they see certain specific types of harmful content.

While the OSA is generally geared towards organic user content, it does introduce specific duties in relation to fraudulent advertising for the biggest user-to-user and search services. There are also age verification duties for pornography sites.

It's worth noting that the OSA is not a content regime. In particular, it does not hold platforms liable for specific types of objectionable content. Instead, it aims to take a systemic, proportionate and risk-based approach to regulation. In practice, this means the focus of the legislation is on regulating the systems and processes that in-scope service providers have in place to meet their legal duties. The Act is not prescriptive as to how platforms achieve this, but imposes obligations to adopt measures, mechanisms and processes that are proportionate to the size and capacity of the platform. The practical expectations around these obligations will become clearer as Ofcom publishes its codes of practice and guidance.

Who is in scope?

The OSA imposes obligations on the following online service providers:

  • User-to-user services. These are internet services that allow users to encounter user-generated content (for example, social media platforms, online marketplaces, dating apps, online forums and video games).
  • Search services. These are internet services that allow users to search multiple websites and databases (for example, search engines or smart speakers with internet-enabled search functionality).
  • Other online service providers. These are providers that publish or display pornographic content.

There are a number of types of services that are expressly out of scope of the OSA. These include:

  • Services where the only form of user-generated content that is supported by the service is email, SMS, MMS or one-to-one live voice calls (although video calling services and instant messaging apps are within scope).
  • Services where users can only leave reviews on a product page, share comments, express views using like/dislike buttons or emojis, or display usernames or avatars.
  • "Internal business services" that are only used by a closed group of individuals, like work intranet pages and CRM systems.

(Please see our more detailed article for further insight on platforms that will be in scope.)

Balancing measures

Some commentators have highlighted the potential risks of automatic content moderation and the impact this could have on free speech and political debate, particularly if content is algorithmically filtered. The OSA therefore introduces a number of balancing measures designed to guard against unduly zealous content moderation by service providers.

All regulated platforms must have particular regard to users' rights to freedom of expression before removing content or adopting other measures in order to comply with their duties. Furthermore, UK news publisher content is exempt from the online safety duties, meaning that news publishers do not have to comply with the same requirements as other online platforms under the OSA.

Additionally, the largest user-to-user services need to have systems and processes in place to protect "content of democratic importance", "news publisher content" and "journalistic content". This includes a "temporary must carry" provision, giving news publishers a right to appeal before their content is taken down. Below-the-line comments are also out of scope of the OSA.

Notwithstanding these balancing measures, there is likely to be some tension between the preservation of free speech and the other duties to protect users from illegal or harmful content under the Act. Now that the OSA is in force, Ofcom plans to review its impact on the availability of journalistic and news publisher content to ensure that the regime has not adversely affected the availability of news material.

Regulatory powers

The OSA gives powers to Ofcom to act as the online safety regulator. Ofcom will be able to fine companies up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, if they fail to meet their new duties of care. Ofcom can also apply to the courts for an order imposing business disruption measures. In addition to these possible sanctions, the OSA imposes criminal liability on senior managers who fail to comply with steps set out in certain confirmation decisions from Ofcom.

To assist with its regulatory functions, Ofcom can compel in-scope service providers to provide information and require an individual from that organisation to attend an interview. Ofcom also has powers of entry and inspection, as well as the power to require a service provider to commission, and pay for, a report from a skilled person.

Regulated services whose worldwide revenue exceeds a qualifying threshold will also need to pay Ofcom a supervisory fee based on that revenue.

Ofcom's enforcement decisions and categorisation decisions are subject to appeal to the Upper Tribunal. There is also potential scope for other actions taken by Ofcom in the course of exercising its powers under the OSA to be subject to challenge by way of judicial review.

Ofcom's phased approach

Now that the Act is in force, Ofcom is under a duty to draft, consult on and publish various codes of practice and guidance. The key substantive duties on in-scope platforms will become binding once the relevant code of practice has come into effect.

Ofcom has said that it will take a phased approach to getting the new regime up and running: the illegal content duties will be the initial focus in phase one, the child safety and pornography duties will follow in phase two, and the remaining duties in phase three.

Claims by users?

On its face, the OSA does not create any new rights for users to bring claims against platforms. The original Online Harms White Paper stated that the proposed framework would not create new avenues for civil litigation. However, the OSA uses the language of "duties of care" (more typically seen in the context of negligence claims), which may encourage the view that the OSA imposes direct responsibility on platforms for users' welfare.

There are likely to be attempts by individuals or groups of users to formulate claims against platforms for breaches of their duties of care under the OSA. Such claims would not be straightforward and their prospects of success remain to be seen. However, if courts are persuaded that the OSA does create new private rights, or strengthen existing ones, this could open up a potentially very significant new area of liability for platforms.

The OSA framework also places heavy emphasis on the role of platforms' terms of service. This may embolden users to pursue claims for breach of contract where platforms fail to comply with their own terms.

Osborne Clarke comment

The OSA introduces complex and challenging issues for businesses to navigate in the context of both domestic and international legislation. Platforms with users in the UK and Europe may now find themselves subject to parallel regulatory regimes: the OSA in the UK and the Digital Services Act (DSA) in the EU. In particular, the OSA is likely to require platforms to be more proactive in finding and removing illegal content, a significant divergence from the DSA.

Platforms will also have to navigate their new legal duties to keep users safe from harmful content, overlaid with existing legal obligations imposed by data protection legislation. They will also be eager to understand how their new obligations interact with the intermediary liability defences, particularly where those defences sit at odds with obligations to take proactive action.

Guidance from Ofcom will be key to fully understanding the practical expectations. But, in any event, online platforms should start thinking about their compliance with the OSA now, as it is likely to be no small task.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
