Digital Business Lawyer

The introduction of hate speech regulations in Germany

The German law against hate speech, the Netzwerkdurchsetzungsgesetz ('Network Enforcement Act'), was passed by the Parliament and the Federal Council in July 2017. Although it drew criticism on various grounds from jurists, the media, trade associations, internet platforms and social media companies, the law will come into force on 1 October 2017 without any significant changes, as Kathrin Schürmann, Partner at Schürmann Wolschendorf Dreyer, explains.

The introduction of the new hate speech law is aimed at decreasing the amount of hate speech and fake news on social media networks.

In order to achieve this objective, the new legislation obliges the social networks that fall within its scope to implement a complaint management system and a tool that users can easily use to flag potentially illegal content. The system must be designed so that the social media network can delete obviously unlawful content within 24 hours, a period that can be extended to seven days for content whose legality is less clear.

Furthermore, the system must allow social media operators to notify the user who submitted the complaint of the decision taken and to explain the reasoning behind it. Finally, the social media networks must issue a report on the execution of these requirements every six months. If they fail to comply with these legal requirements they can be fined up to €50 million.

These legal provisions pose huge logistical challenges for big social media companies that have to handle very large amounts of data every day, as well as for smaller ones that may lack the resources to comply with the regulation.

In a statement to the German Parliament, Facebook summarised the actions it will have to take as a result of the new legislation. However, because there are no reference points for how much of the content on Facebook will be reported by users under the new law, the workload cannot be calculated. It is therefore unclear how many additional staff members will be needed to examine reported content, making accurate personnel planning impossible. In addition, in-house guidelines will have to be amended to reflect the requirements of the new law, which in turn necessitates training for all staff members.

Moreover, even a social network as big as Facebook does not currently provide a notification system that meets the requirements of the new legislation, so it will have to invest heavily in developing a compliant system. According to a study by Bitkom, the Federal Association for Information Technology, the new law will introduce additional annual costs of €530 million. Even before the requirements of the new law were disclosed to the public in the wake of extensive media coverage, Facebook and other such companies had announced action against fake news and hate speech on their own initiative. By its own account, Facebook is a member of several self-regulatory bodies, maintains internal guidelines on how to handle hate speech and fake news, and currently employs over 4,500 complaint reviewers globally, a number that will be increased to 7,500 in the coming months. Facebook has also offered extended assistance to law enforcement agencies, an option that the company claims has only rarely been used by the competent authorities.

Facebook has also signed up to the European ‘Code of Conduct on countering illegal online hate speech,’ a cooperation between four major online platforms and the European Commission which includes a series of commitments by its signatories to combat hate speech and fake news on their platforms. A recent survey showed that the platforms applying the Code of Conduct have made significant progress in responding to notifications of illegal hate speech, most notably removing illegal content in 59% of cases compared with 28% six months earlier, and responding within 24 hours in 51% of cases.

However, even though Facebook and other social media platforms have made progress in their efforts to combat hate speech and fake news, they now fear that the new legislation in Germany will undermine these results. They are nevertheless well-advised to take the new standards seriously and, if necessary, take further ‘voluntary’ action. Not only are the potential fines severe, but it cannot be predicted whether strictly meeting the measures now required by law will have the desired impact. In fact, this has been one of the main arguments against the new law: it is feared that, in order to meet the short response periods it sets out, companies will have to resort to automated deletion systems, which may infringe on diversity and freedom of opinion and may lead users to abandon such platforms in favour of channels that do not have to adhere to the new law.

If this first attempt by the German authorities to control hate speech fails, the chances are that national lawmakers will embark on new legislative projects once the shortcomings become clear. In light of past experience, it can be assumed that the German legislator will be inclined to tighten the provisions if the results do not show the required impact; only seldom does the legislator recognise the ineffectiveness of regulations and lower or reverse legal requirements. It is therefore important that the affected companies do not take the path of least resistance, but rather strive to meet the legal standards in a way that does not impair the user experience.

Social media companies affected by the law should maintain their proactive approach to the issue, not only to minimise harmful content but also in their own best interests. Only by staying on top of hate speech and fake news can they ensure that they remain free to decide which measures are appropriate and to apply them in a manner that keeps users satisfied with their product.

Kathrin Schürmann, Partner

schuermann@swd-rechtsanwaelte.de

Schürmann Wolschendorf Dreyer, Berlin
