
Over New Year weekend, Logan Paul, one of YouTube’s most successful content creators, posted a vlog of a trip to a Japanese forest in which he showed the body of a suicide victim. The ensuing outcry forced him to remove the video, but not before it had been viewed six million times. YouTube, as well as Paul himself, took a heavy share of the blame. It seems, at the least, that 2018 will be just as bad a year for tech companies as 2017 was.

Indeed, the heady utopianism of early tech has died, replaced by a more hard-headed approach to the major players. In particular, governments have begun to take an interest in how best to tame the online wilderness. In Germany, the Network Enforcement Law has recently come into force, threatening tech companies with fines of up to €50 million for failing to remove hate speech quickly. And in France this week, Emmanuel Macron announced a new law to combat “fake news.”

Last month’s report from the Committee on Standards in Public Life was in the same vein. Its most eye-catching recommendation was to reconsider the extent to which online platforms should be held liable for their content, via reform of the E-Commerce Directive, a European law which came into force over 15 years ago.

Is this the correct approach? Well, the current arrangements are clearly not working, particularly in relation to hate speech, as a cursory Twitter search will attest. And the problem, rightly identified by the Committee, is that online platforms don’t have enough skin in the game when it comes to being held accountable for what their users post.

Reform of the E-Commerce Directive post-Brexit could certainly iron out a number of issues with the current framework. It is easy for platforms to use the complexity of the Directive to avoid accountability for content that their users post, and it hinders the government in forcing platforms to monitor and take down illegal content. Put simply, the law as it stands ensures that the mechanisms for policing content are generally reactive, rather than proactive.

But reform of the E-Commerce Directive will not entirely solve the problem. Indeed, on its own, it may even be a bit of a blunt instrument. Regulation generally favours incumbents, so overly onerous rules could limit the creative destruction integral to advances in technology. Additionally, Graham Smith has powerfully warned of the risks of placing the burden of regulation on platforms alone.

Whilst these warnings should be heeded, the greater danger is a lack of ambition in addressing the problem. Making platforms responsible for the content their users post is a long-term, difficult project that requires a sea change in our approach to regulating the internet. A first step should be to ensure that our legal system can adequately deal with wrongs committed online. At the moment, it is difficult for people to vindicate their legal rights when those rights have been violated online. This must change.

Firstly, we should create online courts to provide a low-friction, low-cost means for individuals to defend their rights when they have been violated on the internet. Fraud and mis-selling in peer-to-peer online transactions are often resolved by online platforms within a matter of hours: there is no reason why the justice system should be so far behind.

In addition to this revolution in court capacity, a “from first principles” reorganisation of the law on online civil and criminal offences should take place. At the moment, civil remedies are piecemeal in nature, and are often of limited effect against determined online wrongdoers. Similarly, criminal offences relating to trolling, doxxing, and other forms of online misconduct are unclear, and often not fully utilised by the police. An “online offences” act, similar in ambition and scope to the Sexual Offences Act 2003, should be considered, particularly in relation to developing laws to protect individuals against identity theft and online hate-mobs.

Alongside this reform, the procedural tools for bringing civil litigation over online claims need to be modernised. In particular, the “Norwich Pharmacal” regime, which forces online platforms to disclose the identities of users potentially guilty of wrongdoing, is extremely costly and slow. As a result, many are able to act with impunity online, safe in the knowledge that the people whose lives they ruin cannot afford litigation. An online court system as described above should address this issue in particular.

These are only preliminary steps. We also need to deal with the effects of social media on democracy and figure out the best way to create financial penalties for online platforms that deliberately flout rules.

However, when it comes to regulating the internet, there can be no shortcuts. The analogy with the American Wild West is overused, but it is a good one. Currently, for many, the internet is an amoral playground in which action is divorced from consequence and might is right. With a fully specialised, world-leading online legal system, we can protect rights, tame the online Wild West, and fully harness the internet’s potential.

Aled Jones is a member of Bright Blue and media law barrister at 5RB. The views expressed in this article are those of the author, not necessarily those of Bright Blue.