Digital regulation | UK Regulatory Outlook March 2025
Published on 26 March 2025
Online Safety Act updates | Ofcom consults on safety for women and girls online | Ofcom enforcement programmes including CSAM | Final information-gathering powers guidance from Ofcom | Digital regulation aspects of Crime and Policing Bill

Online Safety Act updates
Regulations on category threshold conditions in force
The Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 came into force on 27 February 2025.
The regulations define the thresholds above which in-scope services become "categorised services" and subject to certain additional obligations under the Online Safety Act 2023 (OSA). See our Insight for background. The regulations define the categories as follows:
Category 1: this covers regulated user-to-user services that use a content recommender system and have an average of more than 34 million monthly active UK users, or that use a content recommender system, allow users to forward or re-share user-generated content, and have more than 7 million monthly active UK users. Because the conditions turn on size, "small but risky" platforms fall outside Category 1.
Category 2A: this covers search engines that have a monthly average of more than 7 million active UK users. This does not apply to search engines that only allow users to search specific websites or databases on particular topics, themes, or genres, and which rely on a third party's application programming interface or other technical means to show search results to users.
Category 2B: this covers user-to-user services that allow users to send direct messages and have a monthly average of more than 3 million active UK users.
Ofcom will now assess the services against these thresholds and publish a register of categorised services, as well as a list of emerging category 1 services. Formalisation of the category thresholds gives regulated services the opportunity to assess provisionally whether their services meet any of the thresholds and prepare to comply with the relevant additional obligations.
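For illustration only, the threshold conditions can be read as a simple decision rule. The sketch below (in Python, with hypothetical field and function names) models a provisional first-pass check against the categories described above; it simplifies the regulations and is no substitute for assessing them directly.

```python
from dataclasses import dataclass

MILLION = 1_000_000

@dataclass
class ServiceProfile:
    """Simplified, illustrative profile of a regulated service."""
    monthly_active_uk_users: int          # average monthly active UK users
    is_user_to_user: bool = False
    uses_content_recommender: bool = False
    allows_resharing: bool = False        # users can forward/re-share content
    is_search_engine: bool = False
    is_vertical_search: bool = False      # searches only specific sites/databases
    allows_direct_messaging: bool = False

def provisional_categories(s: ServiceProfile) -> list[str]:
    """Rough first-pass check against the categorisation thresholds.

    A service can meet more than one set of threshold conditions,
    so all provisional matches are returned.
    """
    matches = []
    if s.is_user_to_user and s.uses_content_recommender:
        if (s.monthly_active_uk_users > 34 * MILLION
                or (s.allows_resharing and s.monthly_active_uk_users > 7 * MILLION)):
            matches.append("Category 1")
    if (s.is_search_engine and not s.is_vertical_search
            and s.monthly_active_uk_users > 7 * MILLION):
        matches.append("Category 2A")
    if (s.is_user_to_user and s.allows_direct_messaging
            and s.monthly_active_uk_users > 3 * MILLION):
        matches.append("Category 2B")
    return matches

# Example: a large user-to-user service with a recommender feed
print(provisional_categories(ServiceProfile(
    monthly_active_uk_users=40 * MILLION,
    is_user_to_user=True,
    uses_content_recommender=True,
    allows_direct_messaging=True,
)))  # ['Category 1', 'Category 2B']
```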
Ofcom consults on draft guidance on keeping women and girls safe online
Ofcom's draft guidance, which the regulator is obliged under the OSA to publish, sets out how regulated service providers can deal with content and activities that disproportionately impact women and girls. The regulator asks providers to take action in nine areas:
- Ensure that governance and accountability processes address online gender-based harms.
- Conduct risk assessments that focus on harms to women and girls.
- Be transparent about women's and girls' online safety.
- Conduct abusability evaluations and product testing.
- Set safer defaults.
- Reduce the circulation of content depicting, promoting or encouraging online gender-based harms.
- Give users better control over their experiences.
- Enable users who experience online gender-based harms to make reports.
- Take appropriate action when online gender-based harms occur.
In relation to each action, Ofcom sets out:
- "Foundational steps", that is, measures drawn from its codes and guidance on illegal content and the protection of children.
- "Good practice steps", that is, practical ways for providers to go further in demonstrating a commitment to the safety of women and girls.
Ofcom also expects services with the highest risk and largest reach to do more to ensure they provide safer experiences for women and girls.
The draft guidance is out for consultation until 23 May 2025. Ofcom expects to publish the final guidance by the end of 2025.
Ofcom launches enforcement programme
As the deadline (16 March 2025) for regulated services to complete their illegal content risk assessments under the OSA approached (see our Insight), Ofcom launched an enforcement programme to monitor compliance.
As foreshadowed at its recent OSA conference, Ofcom has formally asked certain regulated services, of all sizes, to submit their illegal content risk assessment records for evaluation. Ofcom will use the information in these records to identify possible compliance concerns and to monitor how its guidance is being applied.
Ofcom expects its enforcement programme to run for at least a year, during which period it may initiate formal investigations if it suspects that a service provider is failing to meet its obligations under the OSA.
Ofcom launches enforcement programme in relation to CSAM
Ofcom has also launched an enforcement programme targeting child sexual abuse material (CSAM) online. CSAM is so prevalent online that human content moderation alone cannot deal with it. Ofcom's illegal harms codes of practice therefore recommend that certain services use automated moderation technology to identify such content and swiftly remove it, including perceptual hash-matching where the service is a file-sharing or file-storage service at high risk of hosting CSAM.
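At a general level, perceptual hash-matching compares a compact "fingerprint" of an uploaded image against fingerprints of known illegal images, so that near-duplicates (for example, resized or re-encoded copies) can be flagged automatically even though their raw bytes differ. The sketch below illustrates the idea with a simple "average hash"; it is illustrative only, not the specific technology referred to in Ofcom's codes, and real deployments match against curated hash sets maintained by child-protection bodies.

```python
from PIL import Image  # pip install Pillow

def average_hash(path: str, hash_size: int = 8) -> int:
    """Compute a simple perceptual "average hash" of an image.

    The image is shrunk to hash_size x hash_size greyscale pixels; each
    bit of the hash records whether a pixel is brighter than the mean.
    Visually similar images produce similar bit patterns, so the hash
    survives resizing and re-encoding, unlike a cryptographic hash.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(h1: int, h2: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

# Hypothetical database of hashes of known images (placeholder value).
KNOWN_HASHES = {0x8F3C2A1B4D5E6F70}

def matches_known_content(path: str, max_distance: int = 5) -> bool:
    """Flag an upload whose hash is within max_distance bits of a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= max_distance for known in KNOWN_HASHES)
```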
Ofcom has written to various file-sharing and file-storage service providers that present a particular risk of harm to UK users from CSAM, putting them on notice that it will shortly send them formal requests for information. Ofcom wants to assess whether they are in scope of the OSA and, if so, what measures they have in place, or plan to put in place, to identify, assess and remove known image-based CSAM. Ofcom has also written to other file-sharing and file-storage service providers to advise them of their duties under the OSA, and plans to engage further with these services in due course.
Where Ofcom finds potential non-compliance, it will consider formal enforcement action. The regulator is therefore putting down a marker: where a breach relates to some of the worst types of illegal content, it will enforce the illegal content duties, which are now in full effect, without delay.
Ofcom publishes its final information-gathering powers guidance
Ofcom has finalised guidance on its information-gathering powers under the OSA. Under the OSA, Ofcom has powers to require and obtain information from regulated services that it needs in order to exercise, or decide whether to exercise, its online safety duties. It will do this by issuing "information notices". The guidance provides an overview of Ofcom’s powers, how it will exercise these powers and the processes it will typically follow. Failing to comply with an information notice may result in Ofcom taking enforcement action under the OSA.
Other updates
Crime and Policing Bill introduced to Parliament: digital regulation aspects
The Crime and Policing Bill was introduced to Parliament in February and had its second reading on 10 March 2025. Among other things, the bill aims to combat online child sexual exploitation and abuse.
The bill introduces a new offence of carrying out a "relevant internet activity" with the intention of facilitating child sexual exploitation and abuse. "Relevant internet activity" covers: providing an internet service within the meaning of the OSA; maintaining or helping to maintain an internet service (or part of such a service) provided by another person; administering, moderating or otherwise controlling access to content on an internet service; and facilitating the sharing of content on an internet service.
The bill also proposes replacing the OSA "communications offence" of encouraging or assisting serious self-harm through communication, whether electronic or verbal, with a broader offence covering all means of encouraging or assisting serious self-harm.
The bill has now entered the committee stage in the House of Commons, where it will be scrutinised line-by-line. The Public Bill Committee has launched a call for evidence, which is a chance for those with relevant expertise and experience, or a special interest, to submit their views. Written evidence should be submitted as soon as possible, and no later than 5pm on 13 May 2025.
See the AI section on the proposed new offence relating to AI tools made or adapted for creating CSAM.
Protection of Children (Digital Safety and Data Protection) Bill: second reading
The Protection of Children (Digital Safety and Data Protection) Bill had its second reading on 7 March 2025.
The bill, a private members' bill, had cross-party support in its original form, which included provisions raising the age limit for social media use from 13 to 16, committing the government to review the sale of phones to teenagers, and giving Ofcom further powers to protect children. The bill debated on 7 March was much scaled back and now only:
- Commits the Chief Medical Officer to publish advice for parents on the use of smartphones and social media by children.
- Requires the government to publish, within 12 months, a plan for research into the impact of social media use on children.
- Requires the government to "assess" the extent to which the online experiences of children are age-appropriate, and the appropriateness and effectiveness of the digital age of consent, and provide a statement on whether the digital age of consent under Article 8 of the GDPR, should be increased.
It was reported that Josh MacAlister, the MP who introduced the bill, wanted government support for it, and that this scaled-back version was as far as the government was willing to go.
At the debate on 7 March, the Minister for Data Protection and Telecoms, Chris Bryant, said that the watered-down bill's recommendations "chime very much" with what the government intends to do: it wants the OSA to "bed in" and the Data (Use and Access) Bill to be implemented before taking further action. However, Mr Bryant said he would "be amazed if there is not further legislation in this area in the coming years." The debate on the bill was then adjourned until 11 July 2025.