Online Safety

Ofcom publishes final illegal content risk assessment guidance and codes of practice under UK Online Safety Act

Published on 4th Feb 2025

Publication marks the start of the illegal harms duties under the OSA for user-to-user and search services


At the end of last year, amid media pressure for the Online Safety Act (OSA) to be brought into effect as quickly as possible, Ofcom was keen to point out that it was publishing the final versions of its codes of practice and guidance on illegal content four months ahead of the statutory deadline.

The publication of these documents (on 16 December 2024) means that in-scope services must now comply with the first duty on illegal content under the OSA: completing their compulsory illegal content risk assessments by 16 March 2025. In-scope services must also start putting in place measures to protect users from illegal harms and demonstrate effective governance and record keeping.

Ofcom was legally obliged to consult on these codes and guidance and to undertake research to gather evidence before finalising them. The consultation was launched over a year ago and received over 200 responses. Having considered these responses, alongside research, Ofcom says it has strengthened some areas of the codes and guidance. So, what has changed?

Documents published

Ofcom published final illegal content codes of practice for user-to-user services and for search services. Subject to completion of the Parliamentary process, in-scope services will need to comply with these codes from 17 March 2025.

It also published guidance on: risk assessment and risk profiles; record-keeping and review; illegal content judgements; enforcement; and content communicated "publicly" and "privately" under the OSA.

In addition, Ofcom published various overview and summary documents, as well as more detailed documents, explaining its decisions and approach.

Illegal content risk assessment

Ofcom's approach to regulating illegal content is risk-based.

In-scope services now have until 16 March 2025 to complete their compulsory risk assessments. The purpose of these risk assessments is to help services understand how risks of different kinds of illegal harm could arise and what safety measures they need to put in place to protect users and comply with their duties under the OSA.  

It is a complex and involved process, consisting of four steps that Ofcom recommends services follow:

  • understand the kinds of illegal content that need to be assessed;
  • assess the risk of harm;
  • decide measures, implement and record; and
  • report, review and update.

Step one – understand illegal content

Ofcom has categorised the priority illegal content against which providers must assess the risk of harm to users into 17 categories (up from 15 in the draft consultation documentation), adding animal cruelty and splitting unlawful immigration and human trafficking offences into two distinct types of illegal harm. In-scope services must also assess whether there is a risk of users encountering other illegal content.

Ofcom has provided tables of "risk profiles" that services must consult. These list the service characteristics (risk factors) that Ofcom has identified in its register of risks as being linked to a risk of illegal harm. They are largely the same as before.

Considering these risk profiles, providers must identify the relevant risk factors on their services for each of the 17 types of illegal content. The risk profiles are the starting point for conducting a risk assessment and are not exhaustive.

Ofcom says that it has strengthened its guidance to make it easier for services to identify illegal intimate image abuse and cyberflashing. Its guidance on making illegal content judgements now includes further advice as to what amounts to intimate image abuse, where it might be found and more usage examples.

Step two – risk levels

For this part of the assessment, providers must consider both the likelihood and impact of the illegal harm identified as occurring on the service.

Services must assess everything set out in the OSA and, based on evidence, assign a risk level to each of the 17 kinds of illegal content. To assist, Ofcom has produced a series of risk level tables to help services classify each kind of illegal content as high, medium, low or negligible/no risk; this last level is newly introduced.

Ofcom has now provided more detail in these risk level tables and on assessing child sexual exploitation and abuse risks, with additional specific risk tables on image-based child sexual abuse material (CSAM), CSAM URLs and grooming.
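By way of illustration only, the logic of a risk level judgement can be thought of as combining a likelihood score with an impact score. The Python sketch below is a hypothetical simplification, not Ofcom's methodology: the four levels come from the guidance, but the combination rule, names and scoring are assumptions.

# Hypothetical illustration of the structure of a risk level judgement.
# Ofcom's actual risk level tables are specific to each kind of illegal
# content and evidence-based; the combination rule below is an assumption.

RISK_LEVELS = ("negligible/no risk", "low", "medium", "high")
SCALE = {"negligible": 0, "low": 1, "medium": 2, "high": 3}

def assign_risk_level(likelihood: str, impact: str) -> str:
    """Map likelihood and impact judgements to one of the four risk levels."""
    if likelihood not in SCALE or impact not in SCALE:
        raise ValueError(f"expected one of {sorted(SCALE)}")
    # Assumed rule: average the two scores and round up, so that a
    # high-impact harm is never rated below medium even when unlikely.
    combined = -(-(SCALE[likelihood] + SCALE[impact]) // 2)  # ceiling division
    return RISK_LEVELS[combined]

# e.g. a harm judged unlikely but severe still lands at "medium":
assert assign_risk_level("negligible", "high") == "medium"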

Step three – decide measures to reduce risk

This part of the risk assessment process involves deciding what measures to use to reduce the risk of harm, implementing them and recording them. The codes of practice for user-to-user (U2U) services and for search services set out Ofcom's recommended measures and act as a "safe harbour", meaning that services that implement all applicable recommended measures will be considered to be compliant with the OSA's illegal content safety duties.

Which measures to implement will depend on the size of the service, its functionalities and its risk levels. For U2U services, the measures vary based on whether a service is "large" (with an average user base of more than seven million monthly active UK users, equivalent to 10% of the UK population) or "smaller" (that is, not large), and on whether it is low risk, single risk or multi-risk. Similar distinctions apply for search services.
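These applicability tests can be pictured as a simple two-part classification, sketched below in Python. The seven million threshold comes from Ofcom's definition of a "large" service, and treating "multi-risk" as medium or high risk for two or more kinds of illegal content reflects the codes; the function and variable names are assumptions for illustration.

# Sketch of how the codes' applicability tests combine. The seven million
# figure comes from Ofcom's definition of a "large" service; the function
# and variable names are assumptions for illustration only.

LARGE_SERVICE_THRESHOLD = 7_000_000  # average monthly active UK users

def service_tier(monthly_uk_users: int, risk_levels: dict[str, str]) -> tuple[str, str]:
    """Return the (size, risk profile) pair that drives measure selection.

    `risk_levels` maps each of the 17 kinds of priority illegal content
    to the level assigned in step two of the risk assessment.
    """
    size = "large" if monthly_uk_users > LARGE_SERVICE_THRESHOLD else "smaller"
    significant = [k for k, v in risk_levels.items() if v in ("medium", "high")]
    if len(significant) == 0:
        profile = "low risk"
    elif len(significant) == 1:
        profile = "single risk"
    else:
        profile = "multi-risk"
    return size, profile

# e.g. a 10m-user service at medium risk of grooming and fraud is a
# large, multi-risk service, attracting the widest set of measures.
print(service_tier(10_000_000, {"grooming": "medium", "fraud": "medium"}))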

Service providers may elect to implement their own, alternative measures, but this requires record keeping and an explanation of how the duties to protect users have been met. They will also have to demonstrate how their alternative measures take account of users' rights to freedom of expression and privacy. Providers will only be considered compliant if Ofcom is satisfied that the alternative measures are suitably robust to meet the underlying duties of the OSA.

Codes of practice

Ofcom's original draft code of practice for U2U services contained 34 recommended measures. These have been expanded and there are now 41 recommended measures. The search services code has been expanded from 28 to 33 recommended measures.

One of the key changes is that all file-storage and file-sharing U2U services at high risk of image-based CSAM will be expected to use automated tools, namely "hash-matching" and URL detection, to find CSAM, regardless of their size. Previously, these services were only caught if they had more than 70,000 monthly UK users.
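Hash-matching compares a fingerprint of uploaded content against a database of fingerprints of known CSAM. In practice, services use perceptual hashes (which still match after resizing or re-encoding) drawn from lists maintained by trusted bodies; the Python sketch below uses a plain cryptographic hash purely to show the shape of the technique, and the hash list and function names are hypothetical.

import hashlib

# Illustrative sketch of hash-matching. Real deployments use perceptual
# hashes (such as PhotoDNA) with hash lists supplied by trusted bodies;
# a plain cryptographic hash is used here only to show the mechanics, and
# KNOWN_CSAM_HASHES is a hypothetical placeholder for such a list.

KNOWN_CSAM_HASHES: set[str] = set()  # populated from a trusted hash list

def is_known_csam(file_bytes: bytes) -> bool:
    """Return True if the file's fingerprint appears in the hash list."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_CSAM_HASHES

def on_upload(file_bytes: bytes) -> str:
    # A match should be blocked and reported; everything else passes to
    # the service's ordinary moderation processes.
    return "block_and_report" if is_known_csam(file_bytes) else "accept"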

The code of practice now also includes a recommendation that all U2U services employ a content moderation function to review and assess suspected illegal content (as well as a function that allows for the swift takedown of illegal content). Where illegal content is suspected, the provider must make an illegal content judgement or consider whether the content breaches its terms of service.

The measures to protect children set out in the U2U code of practice are largely the same as before: all services at a high risk of grooming (including smaller services), and large services at a medium risk of grooming, are expected to use safety defaults for child users. These defaults stop child users receiving "friend" or "follow" requests, recommendations of user accounts to connect with, and direct messages from unknown accounts, and stop their location information being displayed to other users.
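As a rough illustration of what those safety defaults amount to in a service's settings model, the Python sketch below encodes the four protections listed above; the field and function names are assumptions, not anything prescribed by the code.

from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of the safety defaults for child users described
# above. The four protections come from the U2U code; the field and
# function names are assumptions for illustration.

@dataclass
class ChildAccountDefaults:
    receive_connection_requests: bool = False  # no "friend"/"follow" requests
    suggested_to_other_users: bool = False     # not recommended as a connection
    dms_from_unknown_accounts: bool = False
    location_visible_to_others: bool = False

def default_settings(user_is_child: bool) -> Optional[ChildAccountDefaults]:
    # Defaults apply to child users on in-scope services; adult accounts
    # keep the service's normal settings.
    return ChildAccountDefaults() if user_is_child else None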

Other additional measures that both the U2U and search service codes now provide include:

  • extra requirements for large services and other "risky" services that are likely to be accessed by children to provide additional information regarding their complaints processes;
  • a requirement that all large services and other risky services allow complainants to opt out of non-ephemeral communications in relation to a complaint; and
  • permission for all services to disregard complaints that are not appeals if they are manifestly unfounded, provided certain policies and processes are in place.

Providers are expected to keep contemporaneous records of each risk assessment made and how it was made. In its guidance on record keeping and review, Ofcom has added a further category of information that should be recorded: any existing controls already in place on the service, the risks they are intended to mitigate, how they do so, and how they have affected the risk level assigned.

Step four – review and update

All in-scope providers must keep the written records of their risk assessments up to date and report them appropriately.

Ofcom has not made any significant changes to this part of the guidance, other than to make clear that category 1 and 2A services must also supply Ofcom with a copy of their illegal content risk assessments as soon as reasonably practicable, that category 1 service providers must include a summary in their terms of service, and that category 2A service providers must make a summary publicly available.

Categorised threshold conditions

Implementation of the additional duties that some services will face as "categorised services" is the third phase of Ofcom's roadmap. However, draft legislation has now been laid before Parliament setting out the threshold conditions for the designation of services as category 1, 2A or 2B services.

Category 1 will capture U2U services that either: (i) use a content recommender system and have an average number of monthly active UK users of over 34 million; or (ii) allow users to forward or re-share user-generated content to other users of that same service, use a content recommender system and have an average number of monthly active UK users of over seven million.

Category 2A threshold conditions will capture search services and combined U2U and search services that have an average number of monthly active UK users of over seven million on the search engine part of the service.

Category 2B threshold conditions will capture services that allow users to send direct messages and have an average number of monthly active UK users of over three million on the user-to-user part of the service.
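Taken together, the three sets of threshold conditions reduce to a handful of numeric and functionality tests. The Python sketch below applies them as described above; the user number thresholds come from the draft legislation, while the field and function names are assumptions for illustration.

from dataclasses import dataclass

# Sketch of the draft threshold conditions as described above. The user
# number thresholds come from the draft legislation; the field and
# function names are assumptions for illustration.

@dataclass
class ServiceProfile:
    u2u_monthly_uk_users: int = 0     # average monthly active UK users (U2U part)
    search_monthly_uk_users: int = 0  # average monthly active UK users (search part)
    has_content_recommender: bool = False
    allows_resharing: bool = False    # users can forward/re-share user content
    allows_direct_messages: bool = False

def categories(s: ServiceProfile) -> set[str]:
    """Apply the three sets of draft threshold conditions to a service."""
    result: set[str] = set()
    if s.has_content_recommender and (
        s.u2u_monthly_uk_users > 34_000_000
        or (s.allows_resharing and s.u2u_monthly_uk_users > 7_000_000)
    ):
        result.add("category 1")
    if s.search_monthly_uk_users > 7_000_000:
        result.add("category 2A")
    if s.allows_direct_messages and s.u2u_monthly_uk_users > 3_000_000:
        result.add("category 2B")
    return result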

The government declined to create threshold conditions based solely on functionalities and other characteristics, without setting a user number threshold, which would have captured "small but risky" services. This has been criticised by some and the government has not ruled out the possibility of changes to the threshold conditions in the future.

Osborne Clarke comment

Overall, Ofcom has attempted to make both the codes of practice and guidance clearer by restructuring them and adding further detail to some of the requirements. However, the process for carrying out risk assessments and implementing appropriate risk mitigation measures remains a very complex one. Service providers are likely to have questions about what Ofcom's suggested evidence inputs mean in practice. For instance, to what extent do trust and safety teams need to rework and analyse significant amounts of content moderation data to match Ofcom's taxonomy of 17 categories of priority illegal harm, particularly where existing content reporting systems use completely different taxonomies? It seems inevitable that carrying out risk assessments to the level required by Ofcom will require significant time and effort, even before implementation of the recommended codes of practice measures begins.

Ofcom has also made it clear that it stands ready to use the full extent of its enforcement powers and is gearing up to take early action against any services that do not comply. It will also be producing additional codes of practice measures next year, looking at blocking accounts that have shared CSAM, using AI to tackle illegal harms, using hash-matching to prevent the sharing of non-consensual intimate imagery and terrorist content, and crisis response protocols for emergency situations. This is a fast-moving area of law and it is important for services to keep up to speed.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
