Online harm | New regulator to have power to block websites or impose personal liability under UK government proposals
Published on 11th Apr 2019
On 8 April 2019, the UK government published a white paper on tackling online harm. The government is consulting on a comprehensive new regime to tackle issues ranging from hate speech to cyberbullying and election interference. The proposals would impose a new statutory duty of care on businesses and would establish a new regulator with draconian enforcement powers, including the ability to issue fines, block services or impose personal (criminal) sanctions on senior individuals.
What are the proposed regulations looking to tackle?
Online harms can take a number of forms. Whilst the government is looking in parallel at specific issues such as combatting fake news, this paper sets out to tackle a number of different types of online harm, including: child sexual exploitation and abuse, terrorist content, the promotion of gang culture, cyberbullying, the accessing of content relating to self-harm or suicide, online misinformation (fake news and interference in elections), and the online abuse of public figures (including journalists and MPs). It follows a House of Lords paper published on 9 March 2019, which addressed the challenges of regulating in a digital world and covered many of the same issues as this white paper.
The regulatory framework proposed by the white paper is designed to apply to websites and other online services that allow users to share or discover user-generated content, or interact with each other online. This will catch social media networks, as well as messaging services and other content-sharing platforms.
What will the new regulations require businesses to do?
At the heart of the new regulations will be a new statutory duty of care on relevant businesses to take "reasonable steps to keep users safe, and prevent other persons coming to harm as a direct consequence of activity on their services." It is being proposed that the legislation will require businesses to do what is "reasonably practicable", which is an established test in regulatory fields such as health and safety.
The regulator will publish codes of practice relating to specific areas, which will outline the measures that the regulator considers businesses should take to comply with their duty of care. These codes of practice may include measures such as systems and controls, technologies, staff training or the use of human moderators. In a similar way to regulatory codes of practice such as the Information Commissioner's Office's code of practice on data sharing, or the UK Corporate Governance Code, businesses will not necessarily need to follow the codes of practice, but if they do not do so, they will need to be able to explain and justify to the regulator the alternative action they are taking, and how that will deliver the same or better results.
In keeping with the trend of harnessing business transparency as a tool of regulation, the regulator will have the power to require annual reports from businesses, covering areas such as: the processes that the business has for reporting illegal or harmful content or behaviour; the technological tools that the business employs for these purposes; evidence of the business's enforcement of its own terms and conditions; evidence of cooperation with UK law enforcement or other agencies; and details of investment to support user education and awareness of online harms. As with other business transparency requirements, such as gender pay reporting or transparency in supply chain (modern slavery) statements, these annual reports will be made public, and the regulator will publish an annual transparency report on businesses' compliance with their duty of care and the prevalence of online harms on different platforms.
What enforcement powers will the regulator have?
The white paper acknowledges the huge challenge of regulating such a diverse range of activities and business types: from start-ups to major multinationals and businesses without a presence in the UK. The solution being proposed is to give the regulator a full array of enforcement tools and flexibility as to which approach to deploy. As well as powers to gather information and issue improvement notices, the regulator will be able to levy "substantial" fines, taking into account factors such as annual turnover, volume of illegal material, numbers of views and the time taken to respond to the regulator. But the government considers that for the largest, global businesses, fines alone may not provide enough incentive to change behaviours. The government is therefore also proposing the following potential sanctions:
- Disruption of business activity, by forcing businesses to withdraw certain services.
- ISP blocking of non-compliant websites.
- Imposing personal liability on individual senior managers.
The white paper acknowledges that these measures, which would be reserved for the most serious breaches, would be controversial, and it sets out questions on which the government is seeking views from industry and other stakeholders.
The white paper also addresses redress for individuals. Companies will be expected to have effective and easy-to-access complaints functions, over which the regulator will ultimately have oversight. The government is also consulting on whether to allow designated bodies to bring so-called 'super complaints', as organisations such as Which? are able to do in relation to consumer or competition issues. Individuals will not, however, be able to bring private litigation for alleged breaches of the 'duty of care' imposed under the new regulations.
What about start-ups?
The ethos of these regulations is a risk-based approach, and the white paper readily acknowledges that regulation can impose a disproportionate burden on smaller companies. Part of the regulator's remit will be to make compliance as straightforward as possible, including through the use of RegTech solutions that can be made available to start-ups and SMEs. The regulator will also have a legal duty to pay due regard to innovation. This could include the regulator working with start-ups and those developing innovative products under a 'regulatory sandbox' (of the type that the Financial Conduct Authority runs for financial services start-ups).
What happens next?
This consultation runs until 1 July 2019, following which the government will review the submissions that it receives and publish its response. The government is also looking to run a series of engagement workshops and further research, all of which will feed into its intended legislation, which is likely to follow in late 2019 or 2020.
Osborne Clarke comment
As the government acknowledges, online harm is a complex, multifaceted issue. The drivers, actors and technological tools available to tackle illegal content are very different from those of cyberbullying or interference in elections. Likewise, the types of services covered, and the types of business providing those services, are varied in all respects.
The UK government's proposal to tackle these issues is ambitious. It draws on regulatory trends such as business transparency and individual liability; it is intended to be risk-based, and imposes a statutory duty on the regulator to pay due regard to innovation. Yet, these are global issues and ultimately require a joined-up approach to tackle them at an international level. Although the white paper states that the new regime would be compatible with the EU e-Commerce Directive, it does not go into detail on how current intermediary liability exemptions would be protected, or how the new regime would sit alongside EU copyright reforms or other measures to tackle specific issues such as fake news. While the white paper acknowledges that many of the harms in scope are already covered by existing or impending regulation (such as the revised Audiovisual Media Services Directive, which will require video-sharing platform services to take appropriate measures against harmful content), intertwining the government's proposed regime with these rules will be an extremely complex exercise and require careful consideration.
The tech industry is alive to calls for regulation to tackle online harm, and in some cases has accepted the need for new regulation. If the resulting legislation is to be as effective as possible, it is to be hoped that it takes into account the expertise and views of all stakeholders and any developments at an international level.
You can respond to the consultation using this form, which is open until 1 July 2019 – or please contact one of the experts listed below to discuss what this might mean for your business.