Content Moderation – The Most Important Legal Issue of the New Decade

This essay was submitted to the PLN’s policy essay competition and placed in the Top 5. It responds to the question ‘What is the most important public interest law issue of this new decade?’

Written by Charles Johnson.


I THE UBIQUITY OF CONTENT MODERATION

The issue of content moderation is pervasive in our everyday lives. It affects the news we receive and our ability to share our opinions, and ultimately alters our views and beliefs. However, due to rapid technological developments it has become increasingly difficult to regulate online content. The myriad of online sharing platforms and social networks means that regulators are constantly challenged in their attempts to control content. Although current legislation provided legal certainty at the time of its inception, significant changes in the way content is shared have made it infeasible to apply its classifications to new online activities. Furthermore, regulators have given greater self-regulating power to online platforms, which in turn may jeopardise democratic principles and fundamental human rights. There has also been a critical shift in the way people receive their news: whilst newspapers, radio and television were traditionally the major actors, social media has become the most prevalent source of news. Importantly, it is not all doom and gloom in the domain of content moderation, as a diverse range of regulations is being introduced that will assist in preventing the dissemination of illegal content online. However, if we do not heed the current warnings, our notions of freedom of expression and information are in danger.

II CURRENT MEASURES

The E-Commerce Directive (ECD) is the leading form of regulation in the field of content moderation in Europe.[1] It was developed in 2000 to provide legal certainty around the liability of Internet intermediaries. Importantly, the ECD did not focus on particular platforms; rather, it focused on certain intermediary activities. An issue with this approach is that as intermediary activities evolve – as we have seen over the last two decades – it becomes increasingly difficult to apply existing legislation to new functions. The liability exemptions in the ECD were written for traditional data centres and upload websites, not for the prominent actors such as Facebook, Google and YouTube we see today.[2] As such, there has been much legal uncertainty as to whether social networks, search engines or video sharing platforms fall into the categories of a “mere conduit”[3] or a “hosting provider” under the ECD.[4] Conversely, article 13, which covers caching, has not given rise to any case law at the EU level,[5] which may suggest that the article is perfectly understood or, alternatively (and more likely), that it is of little practical relevance.[6] Whilst aiming legislation at particular activities does not necessarily render it an unhelpful form of regulation, it does create uncertainties as new activities evolve which are not specifically covered.

III INHERENT LIMITATIONS OF NOTICE BASED LIABILITY

In implementing the ECD, the EU has taken a notice-based liability approach to content moderation.[7] Importantly, this provides no absolute protection for Information Society Services (ISS), as the requirement to “expeditiously remove” illegal content arises only upon gaining awareness of it.[8] Furthermore, the ECD sets out that the removal of content must be “undertaken in observation of the freedom of expression”.[9] Interestingly, however, the ECD itself does not provide notice and takedown procedures. Instead, recital 40 and article 16 of the ECD encourage intermediaries to develop their own self-regulatory practices.[10]

Allowing online platforms to develop their own codes of conduct or community standards may jeopardise democratic principles. Letting platforms – on which many rely for news on public affairs – decide what content remains online can have dire effects on freedom of information.[11] Government regulators are elected by the people to regulate on behalf of the public; when private intermediaries instead produce their own guidelines on what content can stay online, the interests of the public may not always be the priority. As such, this over-privatisation of content moderation gives platforms too much power to control what their users see. As demand for and reliance on balanced content on these platforms grows, there is a greater public utility in their neutrality.

Furthermore, in light of these takedown requirements, many intermediaries operate in fear of hosting illegal content and have been overly hasty in taking down material at the slightest suspicion of illegality.[12] This raises human rights concerns, as users’ content is being removed in a kind of “cowboy-style” private system of justice.[13] That is, the private sector ultimately decides whether content is appropriate for its platform. If intermediaries continue to fear being held liable for illegal content on their platforms, it is not irrational for them to err on the side of caution and remove potentially illegal material.[14] These unbalanced liability standards, whereby ISS are held responsible for illegal content whilst users escape any liability, have led ISS to take down content at the slightest chance that it is illegal.[15]

IV DANGER TO FREEDOM OF EXPRESSION

At the time of the ECD’s inception, the EU was alert to the dangers to freedom of expression and innovation that came with holding intermediaries liable for their involvement in content sharing. We have since seen a shift away from traditional means of communication such as newspapers, radio and television and towards social media as the main source of news.[16] Accordingly, there is mounting pressure on how to regulate the dissemination of information in this context.

The Council of Europe (CoE) has commented on this issue, observing that any intervention by Member States to prohibit access to specific Internet content may encroach on freedom of expression and access to information.[17] The CoE noted that measures to block or filter online content can build confidence and safety for users; however, such filters can also infringe these fundamental human rights. Therefore, in its guidelines the CoE calls for online filters to be proportionate to the purpose they are set to achieve, which in turn should help avoid the unreasonable blocking of content.[18]

Ultimately, with regard to content moderation, state policy should correspond to a “pressing social need”,[19] be necessary in a democratic society,[20] and be compatible with Article 10 of the ECHR.[21]

V RESTRICTION OF INFORMATION IN PRACTICE

We need only look to Instagram’s recent behaviour in the United States to see the dangers of operating under a notice-based liability system.[22] Instagram has been found to be removing posts supporting the Iranian general Qasem Soleimani in order to comply with US sanctions law. This action has attracted much criticism, with free speech advocates warning that such censorship endangers the right to information. The International Federation of Journalists General Secretary Anthony Bellanger said it is “unacceptable” that Instagram should choose to remove Iranian media content “at a time where Iranian citizens need access to information”.[23] Here, Instagram’s conduct poses a serious threat to freedom of information in Iran and goes a long way to “undermining the credibility of social media platforms as an area for free expression”.[24] In such a privately regulated domain, it is likely we will see more of this controlling behaviour by social media platforms.

VI RESHAPING THE FRAMEWORK

To combat the aforementioned issues, regulations dealing with liability for illegal online content should be principles-based so as to be adaptable to changes in technology and business models.[25] Furthermore, the EU Commission should provide clarification on these principles-based rules through delegated acts or interpretative guidance as online markets continue to evolve. Moreover, given the range of actors involved in the spread of illegal content (platforms, users, regulators), liability rules should more efficiently share the responsibility for detecting and removing content.[26] Existing regulations have demonstrated a tendency to place responsibility on one central actor: the editor, data controller or moderator. However, unlike traditional media, online platforms’ business models revolve around user activity. Thus many of the issues, such as disinformation and hate speech, stem from user conduct. Future regulation should therefore also emphasise the responsibility of users for their behaviour online.

VII MOVING FORWARD

Ultimately, given the ubiquitous presence of the Internet in our everyday lives, an optimal regulatory framework needs to balance the right to conduct a business, user safety and freedom of expression. Currently, the privatisation of content moderation under the ECD produces outcomes that jeopardise judicial functions and freedom of expression. Furthermore, the ECD’s activity-specific structure has created much legal uncertainty in the domain of intermediary liability. Thus, as the online world continues to evolve, it may be important to consider the development of broader, more flexible principles-based regulations. The content we see online can have profound effects on our beliefs and values, and as such content moderation could be one of the most important issues we face as a society today.

Footnotes

[1] Directive 2000/31/EC Of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce) [2000] OJ L 178/1 (‘E-Commerce Directive’).

[2] David Cappuccio, ‘The Data Center is Dead’, Gartner Blog Network (Blog Post) <https://blogs.gartner.com/david_cappuccio/2018/07/26/the-data-center-is-dead/>.

[3] E-Commerce Directive, art 12.

[4] E-Commerce Directive, art 14.

[5] E-Commerce Directive, art 13.

[6] Stefan Kulk, Internet Intermediaries and Copyright Law: EU and US Perspectives (Kluwer Law International, 2019).

[7] Yaman Akdeniz, ‘To block or not to block: European approaches to content regulation, and implications for freedom of expression’ (2010) 26(3) Computer Law & Security Review 260.

[8] E-Commerce Directive, art 14(1)(b).

[9] Council of Europe, Freedom of Expression in Europe: Case-Law Concerning Article 10 of the European Convention on Human Rights (Council of Europe Publishing, Human Rights Files No 18, 2007).

[10] E-Commerce Directive, rec 40; E-Commerce Directive, art 16.

[11] Wharton School of the University of Pennsylvania, ‘Regulating Big Tech: Is a Day of Reckoning Coming?’, Knowledge at Wharton (Web Page, 11 June 2019) <https://knowledge.wharton.upenn.edu/article/regulating-big-tech-is-a-day-of-reckoning-coming/>.

[12] Sjoera Nas, ‘The Multatuli Project: ISP Notice & Take Down’ (Lecture, Bits of Freedom, SANE, 1 October 2004).

[13] Ibid.

[14] Joris van Hoboken et al, ‘Hosting Intermediary Services and Illegal Content Online’ (Report for the European Commission, 29 January 2019).

[15] Jennifer Urban et al, ‘Notice and Takedown in Everyday Practice’ (2016) SSRN Electronic Journal.

[16] Gordon Pennycook and David G Rand, ‘Fighting misinformation on social media using crowdsourced judgments of news source quality’ (2019) 116(7) Proceedings of the National Academy of Sciences of the United States of America 2521.

[17] Council of Europe, Measures to promote the respect for freedom of expression and information with regard to Internet filters (Recommendation CM/Rec(2008)6).

[18] Council of Europe, Measures to promote the respect for freedom of expression and information with regard to Internet filters (Recommendation CM/Rec(2008)6), Guidelines.

[19] Sürek v Turkey (No 1) (Judgment) (European Court of Human Rights), Application no 26682/95, 8 July 1999 [57].

[20] The Sunday Times v The United Kingdom (No 2) (Judgment) (European Court of Human Rights, Plenary Court), Application no 13166/87, 26 November 1991 [50].

[21] Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953), as amended by Protocol No 14 to the Convention for the Protection of Human Rights and Fundamental Freedoms, Amending the Control System of the Convention, opened for signature 13 May 2004, CETS No 194 (entered into force 1 June 2010).

[22] Cat Zakrzewski, ‘The Technology 202: Instagram faces backlash for removing posts praising Soleimani’, The Washington Post (Article, 13 January 2020) <202/2020/01/13/the-technology-202-instagram-faces-backlash-for-removing-posts-praising-soleimani/5e1b7f1788e0fa2262dcbc72/>.

[23] Zena Chamas, ‘Facebook admits censoring posts supporting slain Iranian General Qassem Soleimani’, Australian Broadcasting Corporation (Article, 15 January 2020) <https://www.abc.net.au/news/2020-01-15/instagram-bans-iranians-from-posting-about-soleimani/11864410>.

[24] Ibid.

[25] Miriam Buiten et al, ‘Rethinking Liability Rules for Online Hosting Platforms’ (Discussion Paper No 074, March 2019) 18.

[26] Natali Helberger et al, ‘Governing online platforms: From contested to cooperative responsibility’ (2018) 34(1) The Information Society 1.
