Online Harms | Moving towards a system based on responsibility, not liability

Published on 1st Feb 2021

The regulation of 'online harms' in Europe has just taken a big leap. On the same day, just before Christmas, the European Commission unveiled the first draft of its Digital Services Act (DSA), a proposal to update the responsibilities and liabilities of digital service providers, and the UK announced an update on its Online Harms Bill, which is now very close to publication. Against the backdrop of Brexit, the divergence between the two regimes is stark, paving the way for some fascinating debates in 2021 and potential compliance challenges beyond.

The question of how to further regulate the proliferation of harmful online content is complex and tends to divide opinion. While some want online content-sharing platforms to be regulated as if they were primary publishers of that content, others consider that regulating platforms in this way would seriously undermine individual liberty and the right to free expression. But the devil is in the detail, and the two proposed regimes need to be examined and compared closely.

How do the DSA and the Online Harms Bill compare?

The DSA replaces the much-discussed platform liability protections in Articles 12-15 of the E-Commerce Directive, carrying them over largely intact. It supplements them with a long-awaited 'Good Samaritan' provision, meaning that platform providers will not lose the liability protections simply by taking a more active role and implementing measures to detect, identify and remove illegal content.

The new legislation also introduces a framework of responsibilities and accountability for platform providers, including notice-and-takedown procedures, annual reporting on content moderation, and procedures for handling complaints and disputes. Failure to comply with these obligations may lead to fines of up to 6% of annual turnover. But, importantly, the scope of the DSA does not extend beyond the regulation of illegal content.

The UK Online Harms Bill shares some characteristics with the DSA in shifting the emphasis from liability to responsibility: the focus is on procedures and accountability rather than simply on whether platforms have a defence to civil or criminal liability. But, freed from the previously sacrosanct E-Commerce Directive, the UK is going out on a limb by seeking to impose a "duty of care" on certain large digital platforms. That duty applies not just to unlawful online content, but also to content that may be lawful but which "gives rise to a reasonably foreseeable risk of a significant adverse physical or psychological impact on individuals." Fines for breaching the duty will be up to 10% of annual global turnover.

Which model will set the precedent?

Let the debate begin. What the UK is proposing is very radical (albeit scaled back slightly from the original proposals) and will be extremely difficult for large internet platforms to implement in practice. The rest of Europe, and indeed the world, will be watching closely to see how this plays out. Can the UK legislature and regulator (Ofcom, the communications regulator that will police the Online Harms Bill) implement a regime that imposes obligations in relation to lawful content without dramatically chilling online speech, inadvertently sparking a mass of litigation, or driving digital platforms away? If it can, the rest of the world may follow suit. If not, the more platform-friendly European model may prove a more attractive template for other countries (potentially including the US).

In the meantime, the international position on online harms remains very fragmented. In Germany, the well-publicised "NetzDG" law sets out specific requirements that large social networks must meet for deleting or blocking access to certain types of illegal content and for reporting on their efforts. It has been seen as leading the charge on the regulation of harmful online content, but it has also met with considerable controversy and opposition.

Others have tried to follow suit, with varying success. In France, for instance, the Avia law, which sought to prevent online hate speech by adopting a similarly strict approach to that of its neighbour Germany, was largely struck down by the Constitutional Council for incompatibility with fundamental rights.

While these national attempts to get ahead of the curve have played out, the European Commission has been watching and learning through collaboration with the major internet platforms. That co-operation has included the Code of Conduct on countering illegal hate speech online, developed in May 2016 in conjunction with four major online operators (Facebook, Microsoft, Twitter and YouTube). The Commission has held off legislating in order to understand the nuances of this complex area as fully as possible, and the outcome of its extensive consultation is correspondingly balanced.

However, the European position is extremely complex. While the European Commission has delayed reforming the E-Commerce Directive, it has legislated in related areas through the EU Audiovisual Media Services Directive. That directive requires each Member State to impose obligations on video-sharing platform services to protect minors from content that may impair their physical, mental or moral development, and to protect the general public from content containing incitement to violence or hatred, provocation to commit a terrorist offence, or offences concerning child pornography, racism and xenophobia. One of the challenges for Europe will be how national initiatives can fit within this framework, the aim of which is precisely to harmonise, as far as possible, the regulation of digital platforms across the EU.

Irrespective of whether the UK has overstretched itself with its radical new plans, the shift towards responsibility and accountability is to be welcomed. For too long, fear of liability has deterred digital platforms from being fully transparent about what they are doing, and what needs to be done, to tackle the worst of online content. The challenge, in both the UK and the EU, will be keeping the measures simple and balanced enough to get platforms and regulators working effectively together. In this respect, the carrot may prove more important than the stick.

Ashley Hurst is the International Head of Tech, Media & Comms at Osborne Clarke.

* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
