Removing defamatory content from social network sites - what can the court order?

As providers of ‘hosting’ services, social network platforms such as Facebook generally enjoy immunity from liability for illegal content on their platforms as long as they remove such content expeditiously once they become aware of it[i]. However, they cannot be obliged to monitor user content generally in order to find and remove such content. In an opinion issued in connection with hate speech posted about the chair of the Austrian Green party, Advocate General Szpunar (the AG) has considered how far a court may go in ordering defamatory content to be removed before the order becomes tantamount to a general monitoring obligation.

 

The case[ii]

Eva Glawischnig-Piesczek is a member of the Austrian Parliament (Nationalrat) and chair of the Greens’ parliamentary party. In April 2016 a Facebook user shared an article which included a photograph of Dr Glawischnig-Piesczek and added a comment calling her a ‘lousy traitor of the people’, a ‘corrupt oaf’ and a member of a ‘fascist party’. Facebook Ireland, which operates the Facebook service outside the US and Canada, did not remove the comment when asked to do so, and Dr Glawischnig-Piesczek applied to the Austrian courts for an injunction. She asked that Facebook be ordered to cease publication and/or dissemination of the photograph where it was accompanied by the same or an equivalent comment; she later also asked that removal should be worldwide. The case eventually made its way to the Supreme Court of Austria, which referred questions to the Court of Justice (CJEU), and the AG has now issued his opinion advising the Court. Such opinions are not binding on the Court, but in most cases the Court follows them.

Identical and equivalent content

The AG took three main principles into account in deciding the scope of the order that could be made: the protection of Dr Glawischnig-Piesczek’s private life and personality rights, the protection of Facebook’s freedom to conduct its business, and the protection of freedom of expression and information. He concluded, first, that the hosting provider (Facebook) could be ordered to remove comments which are identical to those originally posted and which the Austrian court had found to be defamatory. This applied both where those comments were disseminated by the original user and where they were disseminated by other users. Secondly, the hosting provider could be ordered to remove equivalent comments, but only where these were disseminated by the original user him- or herself.

Factors influencing this conclusion included that it would be relatively easy, and therefore proportionate, for Facebook to search for and identify identical comments and comments emanating from the original user: although doing so would require some monitoring, it would be specific and targeted. It was also important that the removal order should be limited in time; that criterion was met here because the order was a preliminary one pending trial. By contrast, it would be much more burdensome to have to search for and identify equivalent comments emanating from other users; this would require extensive, active, non-automatic filtering. An obligation to remove equivalent comments would also necessarily be imprecise and, if applied to other users, could lead to censorship. This would, therefore, be tantamount to general monitoring and was a step too far.

Should removal be ordered worldwide?

This posed a more difficult question, as defamation rules differ from country to country and the comments might not be illegal in all jurisdictions. The relevant EU law provisions on content removal and monitoring in the eCommerce Directive[iii] do not regulate the territorial scope of removal. The AG therefore advises that whether the injunction and the obligations it imposes can have effect outside Austria is a question to be decided under public and private international law – in practice, a question for the Austrian court to decide in accordance with its own national rules on conflict of laws.

The impact of the opinion

Provided that it is followed by the Court, the AG’s opinion is likely to be welcomed by hosting service providers as an indication of how far they should be prepared to go in seeking out and removing defamatory content which has been brought to their attention. It is helpful that a major factor in the AG’s analysis was how easy it would be for Facebook to search for and delete certain types of information. The key take-away is that, where content is defamatory, identical comments should normally be sought out and removed, as should equivalent comments distributed by the same user. The AG points out that in any specific case there may be scope for argument about whether a comment is equivalent, and it is to be hoped that the Court will give further guidance on this if it follows the opinion. The hosting provider need not, however, generally search for and remove equivalent comments distributed by other users. The opinion gives no clear guidance on whether removal should generally be worldwide, although it is clear that the AG would prefer a more measured approach, perhaps employing geo-blocking in appropriate cases.

[i] eCommerce Directive 2000/31/EC, Articles 14-15

[ii] Case C-18/18 Glawischnig-Piesczek v Facebook Ireland Limited, Opinion of Advocate General Szpunar, 4 June 2019

[iii] eCommerce Directive 2000/31/EC, as above
