The Obligation of Social Platforms to Remove Hate Speech and Illegal Speech

Introduction

This paper analyzes the obligation of social platforms to remove hate speech and illegal speech, and examines the effectiveness of Germany’s Network Enforcement Act (NetzDG) in reducing hate speech. First, the paper explains the genesis of the NetzDG, which was formulated in 2016 but came into effect in January 2018. Second, it presents the actions taken by various social media platforms in response to the law, in particular Twitter, Facebook, Google (YouTube), and Change.org. Third, it examines the German government’s action on hate speech and illegal speech and how it has inspired other nations to follow suit. Lastly, the paper offers recommendations on ways to curb hate speech beyond enforcing the law.

Germany’s Network Enforcement Act (NetzDG) was formulated in 2016 and came into effect in January 2018. The law is a mitigative measure against hate speech and illegal speech: failure to delete such content attracts fines of up to 50 million euros (Echikson & Knodt, 2018). Proponents regard the legislation as the best response to extremism and online threats, while critics view it as a draconian censorship program. The approach taken by the NetzDG is to take down online content first and ask questions later (Balkin, 2018). Consequently, takedown rates on the two major platforms, Facebook and Twitter, stand at 21.1% and 10.8% respectively, and the NetzDG has significantly reduced hate speech (Echikson & Knodt, 2018). However, compliance remains a struggle: Facebook, for instance, makes it difficult for users to file complaints under the German law, whereas Google and Twitter are far ahead in reporting complaints as the law requires (Kuczerawy, 2016). Nonetheless, the NetzDG did not significantly change user behaviour, as most fourth- and fifth-party complaints were rejected. The German law obligates social platforms to remove hate speech and illegal speech, requiring platforms whose algorithms maximize user engagement to apply filters that reduce division, prejudice, and the hateful language that triggers real-world crimes.
Genesis
In the wake of the Middle East refugee crisis, thousands of people fled their countries, and in 2015 Germany accepted one million refugees. Consequently, there was an anti-immigrant backlash on social media, with an increase in hate speech targeting refugees as well as government officials who were blamed for altering the nation’s immigration policy. The German justice minister created a task force comprising representatives from Google, Facebook, Twitter, and civil society. It resulted in the social media companies committing to user-friendly reporting tools that would filter and remove illegal content within 24 hours of notification. Further, the German government committed to prosecuting online hate crimes.
Justice Minister Maas authorized testing of this commitment through jugendschutz.net against Article 130 (incitement to hatred and Holocaust denial) and Article 86a (use of symbols of unconstitutional organizations), as well as the Youth Protection Act. The results were disturbing: YouTube, Facebook, and Twitter removed 90%, 39%, and 1% of the 200 reported cases respectively, and within 24 hours of notification the rates fell to 82%, 31%, and 0% for the same sites (Echikson & Knodt, 2018). This formed the basis for the justification of the NetzDG, with the conclusion that social sites were allowing too much illegal online content and giving little consideration to the complaints filed. Consequently, the NetzDG bill was introduced in parliament on March 27th, 2017 and approved within five months. The law applies to all social networking sites with more than two million registered users, including WhatsApp, Telegram, and large civil society movements.
Actions Taken by Social Platforms
Even though the NetzDG is now enforced, with heavy fines imposed, social platforms have continued to tackle online hate speech and illegal speech on their own. Below is a summary of the number of reported cases, their removal rate, and the rate of action within 24 hours (Echikson & Knodt, 2018).

(Screenshot from Germany’s NetzDG: A key test for combatting online hate, Echikson & Knodt, 2018)

Facebook

Despite Facebook claiming that it has allocated substantial resources to meeting the legal requirements, only 1,704 pieces of content were flagged. Considering that only 65 employees work on processing the complaints, each employee processes only around five reported cases per month, which makes the response rate very low. Within the first six months, only 74 German-based posts related to incitement were blocked (Echikson & Knodt, 2018). Additionally, in the same year, over six million posts that violated its rules on hate speech were removed. Compared with the other social platforms, Twitter and Google, Facebook is still performing below expectations (Echikson & Knodt, 2018). Nonetheless, Facebook has come under pressure in Germany for deleting and deactivating accounts whose content is deemed legal. On the bright side, a German court ruling allowed Facebook to delete hate speech and temporarily block a user if the content posted violates German law. Sadly, the company has the lowest content removal rate of the most-used platforms in Germany.
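The per-employee workload claimed above can be checked with a back-of-the-envelope calculation, assuming (as the figures cited from Echikson & Knodt, 2018 suggest) a six-month reporting window:

```python
# Back-of-the-envelope check of Facebook's NetzDG complaint workload.
# Figures from the section above; the six-month window is an assumption.
flagged_complaints = 1704   # NetzDG complaints flagged in the period
staff = 65                  # employees processing complaints
months = 6                  # first reporting period

per_employee_per_month = flagged_complaints / (staff * months)
print(round(per_employee_per_month, 1))  # roughly 4.4 cases per employee per month
```

This is consistent with the claim of roughly five cases per employee per month, underscoring how low the processing volume is.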

Google

Google has the highest total number of reported cases, which is in tandem with the percentage of content removed. However, in Germany Google rejected 73% of NetzDG-reported complaints (Echikson & Knodt, 2018). Despite rejecting some complaints, Google has gone a step further to tackle defamation, sexual content, terrorist content, content triggering violence, political extremist speech, and content that violates individual privacy (see image below).

(Screenshot from Germany’s NetzDG: A key test for combatting online hate, Echikson & Knodt, 2018)

Twitter

Of all social platforms, Twitter has the most reliable reputation when it comes to controlling and protecting speech; it has even been used against authoritarian regimes. Twitter has a strong framework in place for reporting complaints, and 260,000 complaints were lodged. However, only 10.8% of these complaints attracted action by the company, although its removal rate is the highest at 93.8%. To maintain compliance with the NetzDG, the company has a workforce of 50 people specially entrusted with analyzing complaints emanating from Germany.
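To see the scale these percentages imply, a rough calculation can be sketched, assuming the 93.8% removal rate applies to the complaints that attracted action (an interpretation of the figures above, not a claim from the source):

```python
# Rough scale of Twitter's NetzDG complaint handling.
# Percentages from the section above; applying the removal rate to
# actioned complaints is an assumption for illustration.
complaints = 260_000
actioned = round(complaints * 0.108)   # share that attracted action
removed = round(actioned * 0.938)      # share of actioned content removed
print(actioned, removed)               # about 28,080 actioned, 26,339 removed
```

In other words, even a high removal rate applies only to the small fraction of complaints the company acts on at all.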

Change.org

With more than five million German users, Change.org is required to file NetzDG compliance reports. The company has a workforce of 12 people to respond to hate speech and illegal speech in Germany and has incorporated the NetzDG into its compliance framework. However, the company does not limit freedom of speech; it deletes only illegal content while maintaining a conducive online presence where information is exchanged freely.

Action on Hate Speech and Illegal Speech

Governments can impose fines on online platforms that fail to comply with the removal of harmful content. As mentioned earlier, in Germany failure to remove hate speech and illegal speech can attract fines of up to 50 million euros for platforms, while an individual violating this law is liable for a fine of up to five million euros (Kremlin Watch, 2017). Germany is also pressing the European Union to take punitive action against violators of the framework set up to control hate speech and illegal speech.

The steps taken by Germany have inspired global initiatives to replicate the fight against illegal speech and hate speech. To begin with, US Republicans accused Facebook of bias in its trending news, which tended to carry a political slant that could harm online followers. Following the Senate probe, Facebook reacted by laying off the editorial staff who twisted the trending news (Holznagel, 2017). Second, the public has the power to sue social platforms, and several lawsuits already exist. For instance, in 2017 the Viennese court of appeal ruled that posts against Green Party leader Eva Glawischnig be removed from Facebook (Kremlin Watch, 2017). Additionally, three French nationals have sued Facebook, Twitter, and YouTube, alleging that the number of flagged posts actually taken down does not reflect the number of complaints brought (Kremlin Watch, 2017). In 2017, the British parliamentary committee for culture, media, and sport launched an investigation into how fake news spreads and its impact on democracy (Kremlin Watch, 2017). Third, companies have taken a step forward by pulling ads that appear next to extremist content. Because social platforms depend on such ads for revenue, a company dropping its ads hurts the platform through lost revenue. For instance, companies such as Verizon, Wal-Mart, and Pepsi refused to place ads on Google and YouTube after these sites failed to assure clients that their ads would not appear amid extremist content (Holznagel, 2017). Such efforts show the commitment of companies to pressuring social platforms against hate speech and illegal speech.

Conclusion and Recommendations

Social platforms have an obligation under German law, the NetzDG, to remove hate speech and illegal speech. The law was formulated to control the anti-immigrant backlash on social media during the 2015 refugee crisis. Social sites are required to receive complaints from users and take down such content within 24 hours. Twitter and Google lead in actions taken against hate and illegal speech. Other governments have adopted similar mitigative measures, and some companies avoid advertising on social sites that host illegal content. However, to increase the effectiveness of the control of hate speech and illegal speech, the following recommendations should be considered. First, there should be clear reporting and enforcement standards, including a quality standard for complaints that would make social sites more user-friendly. Second, online content should be scrutinized more closely, particularly terrorist content, to reduce extremism. Finally, there should be more transparency in the manner in which social sites control the spread of hate speech and illegal speech.

References

Balkin, J. M. (2018). Free speech is a triangle. Columbia Law Review, 118(7), 2011-2056. https://www.jstor.org/stable/26524953

Echikson, W., & Knodt, O. (2018). Germany’s NetzDG: A key test for combatting online hate [Ebook] (pp. 3-17). Retrieved from https://www.ceps.eu/system/files/RR%20No2018-09_Germany%27s%20NetzDG.pdf

Kremlin Watch. (2017). Making online platforms responsible for news content [Ebook] (pp. 3-21). Retrieved from https://www.kremlinwatch.eu/userfiles/making-online-platforms-responsible-for-news-content_15220851525026.pdf

Holznagel, B. (2017). Organization for Security and Co-operation in Europe Office of the Representative on Freedom of the Media [Ebook] (pp. 2-33). Retrieved from https://www.osce.org/fom/333541?download=true

Kuczerawy, A. (2016). The Code of Conduct on Online Hate Speech: an example of state interference by proxy? Retrieved from https://www.law.kuleuven.be/citip/blog/the-code-of-conduct-on-online-hate-speech-an-example-of-state-interference-by-proxy/

 
