Digital Regulation | UK Regulatory Outlook November 2024
Published on 27 November 2024
Government consults on introducing sanctions for senior executives of online platforms and marketplaces for failure to remove content on weapons and knives | Ofcom explains how the OSA will apply to generative AI and chatbots | Ofcom calls for evidence to inform its report on researchers' access to information from regulated services under the OSA
UK updates
Government consults on introducing sanctions for senior executives of online platforms and marketplaces for failure to remove content on weapons and knives
The government is consulting on introducing personal liability measures for senior executives of online platforms and marketplaces who fail to remove illegal content relating to knives and offensive weapons.
Existing laws already make it a criminal offence to manufacture, sell and offer for sale prohibited offensive weapons, and to market knives. The Online Safety Act 2023 (OSA) also requires platforms to remove illegal content when they become aware of it and to protect children from harmful and age-inappropriate content.
However, the government is concerned that social media platforms are being used to sell prohibited weapons and knives, including to under 18s, in ways that encourage violence. Therefore, it believes that stronger action is needed in this area.
The government proposes giving the police the power to issue content removal notices to companies and designated senior executives, requiring the removal of illegal content within 48 hours. If the company fails to remove the content within the time limit, a second content removal notice would be sent to the senior executive. Continued non-compliance would result in a notice of intent being sent to the senior executive, stating that legal action will be taken against them if they fail to comply. The senior executive would have 28 days to object.
Non-compliant senior executives would then face civil action and the possibility of a fine of up to £10,000. The consultation closes on 11 December 2024.
Ofcom explains how the OSA will apply to generative AI and chatbots
Ofcom has published an open letter to UK online service providers explaining how the OSA will apply to generative AI and chatbots. Ofcom reminds providers that the following categories of AI tools and content fall within scope of the OSA: user-to-user services, search services and pornographic material.
User-to-user services:
- Sites or apps which include a chatbot enabling users to share text, images or videos generated by the chatbot with other users.
- Services allowing users to upload or create their own chatbots (user chatbots), which are then made available to other users. Any content created by these chatbots is "user-generated content" and is regulated by the OSA.
- Any AI-generated content shared by users on a user-to-user service is user-generated content and is regulated in the same way as human-generated content (for example, deepfake fraud material is treated in the same way as human-generated fraud material). This applies regardless of whether the content was created on the platform where it is shared or uploaded from another site.
Search services: generative AI tools that enable the search of multiple websites and/or databases, including tools that modify or facilitate the delivery of search results, or which provide "live" internet results.
Pornographic material: sites and apps that include generative AI tools that can generate pornographic material. These services are required to use highly effective age assurance measures to ensure children cannot normally access such material.
Ofcom has made clear that it is ready to enforce, and urges regulated services to start preparing for compliance now, as the first set of duties under the OSA will take effect in December 2024. See our recent Insight on Ofcom's OSA implementation roadmap.
Ofcom calls for evidence to inform its report on researchers' access to information from regulated services under the OSA
The OSA requires Ofcom to report on the ways and extent to which independent researchers access information on online safety matters from providers of regulated services. Ofcom wants to understand how researchers currently obtain information from providers, the challenges they encounter, and how greater access to the information might be achieved. The findings from this call for evidence will inform Ofcom's report. The deadline for responses is 17 January 2025.
EU updates
EU Commission adopts implementing regulation on transparency reporting under the DSA
The implementing regulation standardises templates and reporting periods for the transparency reports that providers of intermediary services have to publish under the Digital Services Act (DSA) in relation to their content moderation practices.
Very large online platforms (VLOPs) and very large online search engines (VLOSEs) must report twice a year, while other services report annually.
The DSA details the specific categories of information that the transparency reports must contain, such as the number of content items and user accounts removed, the accuracy of any automated systems used and information on content moderation teams. The standardisation provisions in the new implementing regulation should simplify compliance for providers and ensure consistency in reporting practices so that comparisons can be made.
Under the implementing regulation, providers must start collecting data in line with the templates from 1 July 2025, with the first harmonised reports due at the beginning of 2026.
The Commission also plans to update the requirements for submitting statements of reasons to the DSA Transparency Database so that they align with the implementing regulation.
EU Commission consultation on draft delegated regulation on rules for researchers to access online platform data under the DSA
Under article 40(4) of the DSA, VLOPs and VLOSEs have to allow "vetted researchers" (who meet the relevant requirements in the DSA) to access their data, subject to approval from their Digital Services Coordinator, for the purposes of evaluating systemic risks and mitigation measures. The proposed regulation will further specify the procedures, conditions and purposes for such data sharing and use.
The consultation opened on 29 October 2024 and the deadline for responses has been extended to 10 December 2024. The Commission plans to adopt the rules in the first quarter of 2025.