Online Safety

The UK Online Safety Act: Top 10 takeaways for online service providers

Published on 22nd Nov 2023

What are the key points to be aware of and how will it work in practice?


The Online Safety Act (OSA) aims to make the UK "the safest place in the world to be online". It seeks to achieve this by requiring platforms to tackle the illegal content which appears on their services, imposing duties on platforms to ensure they provide a higher standard of protection for children, and introducing potential criminal penalties for corporate officers in certain cases of non-compliance as well as significant financial penalties for the platforms in breach.

It also makes the regulator, Ofcom, very powerful. Without needing wider endorsement from a court or separate body, it can investigate, enforce, and effectively shut services down.

In this article, we look at 10 key takeaways for those looking to understand the new regime.

1. Who is in scope?

If a platform allows users to share content with other users, or to search more than one website or database, it is likely to be subject to at least some of the duties in the OSA.

The OSA is targeted primarily at "user-to-user services" and "search services" including:

  • social media platforms
  • online marketplaces
  • instant messaging apps
  • discussion or community forums
  • dating apps
  • games with chat functionalities
  • wikis
  • search engines (including voice assistants)

There are some important exceptions: services limited to email, SMS or one-to-one live aural communications are out of scope. So are "internal business services" such as business intranets and CRM systems, as well as services where the only user-generated content is comments or reviews on the provider's own content.

The OSA is intended to protect internet users in the UK, so services must have "links with the UK" in order to be caught. However, there is a relatively low threshold to establish this.

For more information on the services in scope of the OSA, please see this Insight.

2. What are the illegal content duties?

All in-scope providers must undertake risk assessments to ascertain the risk of illegal content appearing on their service. They must also have systems in place to:

  • prevent users from encountering the most serious types of illegal content;
  • manage the risk of the service being used to commit an offence;
  • mitigate the risks of harm identified in the risk assessment; and
  • apply the protections set out in their terms of service consistently.

Providers must also have systems to minimise the length of time illegal content is present on the service, and swiftly take down illegal content once alerted.

They are not required to prevent illegal content from ever appearing on their sites, but they will need to be proactive in identifying illegal content and minimising risks to users.

Ofcom's first consultation on illegal harms was published on 9 November 2023 and contains its detailed proposals for complying with the illegal content duties. The deadline for responses to the consultation is 5pm on Friday 23 February 2024 (see our Insight for more on this).

3. Judging illegal content

In-scope providers will need to make judgements about whether content is in fact illegal, for example when alerted to allegedly illegal content via a user notice.

When making these judgements, a provider must base its decision on "all relevant information that is reasonably available" to it. Using this information, it will need to determine whether it has "reasonable grounds" to infer that the content is illegal, which, in turn, will require consideration of whether:

  • all elements of the offence (including the mental elements) are met; and
  • any defence to the offence may be available.

Where the relevant content has been created by an automated tool, such as a bot, the provider must apply the test above to a person assumed to be controlling the automated tool.

Whether this "reasonable grounds" test sets the right threshold for assessing illegality remains to be seen. If the bar has been set too low, the result may be significant over-removal of legal content. If it has been set too high, illegal content may slip through without takedown. Making these judgements will not be easy, particularly for offences that are very context-specific. Ofcom has provided draft guidance on judgement for illegal content within its consultation documents.

4. Ofcom's consultations, codes of practice and guidance

The OSA obliges Ofcom to draft codes of practice and guidance, to help in-scope providers have a better idea of how to comply with their duties of care on a practical level. While codes of practice will not be binding, providers that adhere to them will be deemed compliant with their online safety duties. The guidance will sit alongside the codes, designed to help providers navigate issues such as conducting risk assessments and making judgements about illegal or harmful content.

Ofcom must consult on these documents. For the illegal harms codes and guidance, the consultation started this month (November 2023), and Ofcom expects to finalise and publish them in autumn 2024. Providers would be well advised to start seriously thinking about compliance with the OSA now, on the basis of the draft documentation.

The illegal harms consultation makes it clear that services will need to come to a view on the likelihood and impact of different categories of illegal harm being accessible via their services (including, but certainly not limited to, terrorism content, child sexual abuse material (CSAM), encouraging suicide, and hate offences).

Depending on the levels of risk established, providers will then be expected to undertake a range of suggested measures, bucketed into the following topics:

  • Governance and accountability measures
  • Content moderation systems and processes
  • Reporting and complaints processes
  • Terms of service transparency
  • Default settings and user support for child users
  • Recommender system testing
  • Enhanced user controls
  • User access restrictions

5. Core child protection duties

Platforms will have to ensure they provide a higher standard of protection for children. Broadly speaking, this will be a three-stage process: a duty to carry out a "Children's Access Assessment", a duty to carry out a "Children's Risk Assessment", and children's safety duties.

All providers within the scope of the OSA will need to carry out an assessment of whether their services are "likely to be accessed by children" (noting that there is a specific test for ascertaining this). Providers must carry out the first of these assessments within three months of Ofcom publishing its guidance on this subject, and then repeat the assessment at least once a year if they conclude that children cannot access the service.

For those providers whose services are deemed "likely to be accessed by children", the next step will be to undertake an assessment of the risks posed by harmful content on the service that may be encountered by children. This risk assessment will need to consider the prevalence and availability of different categories of harmful content and how the service's features or functionalities might facilitate child harm.

Providers will also be under a broad duty of care to do what they can to minimise the risks identified in the risk assessment. In general terms, this will require:

  • having proportionate measures in place regarding the design or operation of the service to mitigate the risk of harm to children in different age groups;
  • having proportionate systems and processes in place to prevent children (or children in certain age groups) from encountering certain types of content that may be harmful to them – in other words a more active obligation to limit children from seeing that content; and
  • including details in their terms of service of how they are protecting children from harm.

6. Balancing measures

In an attempt to alleviate concerns about the implications of the new law for freedom of expression online, the OSA contains various duties aimed at protecting free speech and democratic debate.

When implementing safety measures, all providers must have "particular regard" to the importance of protecting users' right to freedom of expression and privacy. In practice, these rights will need to be considered as part of any content moderation process.

Additionally, the biggest providers (to be designated as "Category 1" providers) have specific duties to protect:

  • "content of democratic importance", being content contributing to democratic political debate in the UK (it may prove difficult to work out whether a piece of content falls into this category in practice);
  • news publisher content published by recognised UK news publishers; and
  • journalism content.

Critics say that these balancing measures are not enough, and that they will simply encourage a tick-box approach where free speech remains stifled.

Much will depend on Ofcom's approach to regulation, and whether it enforces compliance with the balancing measures with the same gusto as the core safety duties.

7. Director and manager risks and responsibilities

Most notable is the introduction of potential criminal liability for corporate officers where an offence committed by the platform under the OSA can be said to have resulted from the "consent, connivance or neglect" of an individual officer of the company. Such offences are common in other UK regulatory areas.

Individuals can be criminally prosecuted if they fail to respond to, or deal appropriately with, requests made by Ofcom as part of its investigations.

Providers can be required to name a senior manager who will be responsible for complying with an Ofcom notice, and may be asked to provide that individual's details to Ofcom.

8. Ofcom's powers

Ofcom can demand information from providers about their services (including the role of their algorithms in displaying content). It can compel information from the company and its employees in interviews, enter and inspect premises and any documents or equipment at the premises, take copies of documents, and require explanations of any documents or an indication of where they can be found.

If Ofcom concludes that a provider is in breach, it can issue a decision notice requiring the provider to remedy failures and can impose fines of up to £18m or 10% of global annual turnover (whichever is higher); officers may also face personal criminal liability.

Ofcom can also seek court orders to force ancillary service providers to withdraw services from companies in breach, effectively giving Ofcom the power to shut providers down.

Challenging Ofcom's actions will be possible, but challenges will be assessed against judicial review standards, which set a notoriously high bar.

9. Risk of liability and negligence claims?

The OSA was not intended to create any new rights for users to bring claims against providers. However, there are at least two potential avenues of litigation risk for in-scope companies. 

The OSA's use of the language of "duties of care" (more typically seen in the context of negligence claims) may be perceived as an open invitation by would-be claimants to try to formulate claims for perceived failures to keep users safe. Such claims would not be straightforward, and their chances of success remain to be seen. However, even the partial success of such claims could create a potentially very significant new area of liability.

The OSA's heavy emphasis on the role of providers' terms of service may lead to an increase in claims for breach of contract where companies fail to comply with their own terms.

10. Differences in EU and UK approaches

The UK's OSA and the EU's Digital Services Act (DSA) have the common aim of making the internet a safer place. In-scope providers will need to consider similar issues when implementing both regimes, with a particular focus on the scope for bad content to be found on their platforms and the mechanisms in place to prevent this happening.

There are some notable variations in the approaches taken by each piece of legislation to accomplish this goal. These include:

Monitoring obligations

Under the OSA, platforms will likely need to take proactive steps to prevent users from encountering the worst illegal content. Arguably, the DSA's obligations are more reactive (at least for platforms that are not categorised as very large online platforms, or VLOPs), with more of a focus on prescribed notice and takedown procedures.

Definition of illegal content

The DSA's definition covers online actions that are illegal under any EU or Member State laws, while the OSA's definition excludes (among other things) intellectual property infringement offences and product safety/quality offences.

The OSA puts a focus on the very worst "priority" illegal content specified in the law, such as CSAM and terrorism content.

Risk assessments

The OSA requires all in-scope platforms to conduct thorough risk assessments. The DSA reserves these for VLOPs only.

Child harm

The OSA sets out detailed requirements for platforms accessible by children, including duties to conduct risk assessments and implement proportionate measures to tackle content that is harmful to children (including specific harms set out in the legislation).

Under the DSA, the obligations are somewhat vaguer: online platforms must ensure a high level of privacy, safety and security for minors, and VLOPs must implement targeted measures to protect the rights of children. However, there are express restrictions on targeting ads at users profiled as minors, which are not reflected in the OSA.

Notice and takedown mechanisms

The DSA sets out very specific requirements for notice and takedown systems, statements of reasons to affected parties, and complaints mechanisms.

While the OSA does mandate content reporting mechanisms and complaints procedures, it is not as prescriptive about what these need to look like.

Advertising

The DSA contains obligations relating to advertising (including ad transparency measures and ad targeting restrictions).

With the exception of requiring the biggest services to prevent fraudulent advertising, the OSA focuses on organic user content rather than paid-for ads.

Osborne Clarke comment

The new regime will require plenty of preparation. Osborne Clarke's specialist team is available to support organisations who wish to respond to Ofcom's proposals.

In the meantime, platforms would be well served to give advance thought to the scope for harm arising from their services, the measures in place (and needed) to prevent it, and their existing governance structures for compliance.

Managers and company directors may require training on their responsibilities under the OSA and on how a breach could result in personal liability, in order to avoid suggestions that their action or inaction has resulted in a corporate breach.

Furthermore, providers that are already undertaking a DSA compliance exercise will probably want to start familiarising themselves with the OSA requirements to work out how they can dovetail.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
