24 June 2021

The Digital Services Act wants you to “sue” Facebook over content decisions in private de facto courts

According to Art. 18 of the Commission’s draft for a Digital Services Act (DSA) [Art. 21 of the final DSA text], Member States shall certify out-of-court dispute settlement bodies which may, at the request of online platform users, review platform decisions such as content take-downs or account suspensions. Online platforms, including Facebook, would be bound by such decisions.

While Art. 18 draft DSA [Art. 21 of the final DSA text] seeks to address a legitimate policy concern, namely the need for effective recourse mechanisms against platform decisions, its introduction of quasi-courts is incompatible with European Law. Moreover, Art. 18 draft DSA [Art. 21 of the final DSA text] rests on the unsound assumption that private dispute settlement bodies can be cost-effective, high-quality, and independent all at once.

But Art. 18 draft DSA [Art. 21 of the final DSA text] raises an underlying, more general question: for recourse against online platform decisions, do we want private entities like the NetzDG self-regulatory bodies or the Facebook Oversight Board to be our future decision-makers? How does this square with our understanding of the Rule of Law? I argue that, instead of relying ever more on self-regulation, strengthening the existing Member State courts is a simpler and more effective solution.

This discussion needs to take place now, as the EU is rushing forward with the DSA, a legislative mega-project that the Commission wants to be a success. While some Member States are raising concerns, the European Parliament seems unlikely to attempt substantial repairs to Art. 18 draft DSA [Art. 21 of the final DSA text].

Background: The Risk of Overblocking

Every day, online intermediaries must make decisions about the legality of their users’ activities. In other words, before implementing a take-down or an account suspension, platforms need to determine whether a given piece of content complies with national laws or the platform’s Terms of Service. Platform transparency reports show that such decisions are increasingly taken on the platforms’ own initiative, after content has been flagged by automated means. To a lesser but still substantial extent, platform action results from third-party take-down requests. Such decisions are not always justified, for example when there is no sound legal basis for a take-down. Given the vast number of such decisions (see, for example, Facebook’s figures), the limited resources companies spend on a single case, and the complexities that may arise, mistakes and overblocking are bound to occur.

Legislative responsibility to provide safeguards and recourse

To a certain degree, the risk of overblocking must be accepted as a side effect of every form of rights enforcement (we do not abolish criminal law even though we know that, in some cases, innocent individuals are sent to jail). However, given their size, the importance online platforms have for our citizens and economies, and the free speech issues at stake, legislators need to implement safeguards that minimize overblocking. Moreover, effective recourse mechanisms must be accessible, so that users can contest decisions and compel platforms to restore content or accounts after erroneous decisions (so-called ‘put-back’).

At the practical level, there are different pathways that might lead to recourse or reinstatement:

In-house appeals mechanisms

Notably, all major platforms have introduced voluntary in-house appeals mechanisms through which users can ask the platform to review a content decision. These proceedings are cost-free, and a considerable number of appeals are reported as successful, leading to the restoration of content. Facebook reports that in Q1 2021 alone, about 84,000 content decisions on the grounds of hate speech were successfully appealed. However, it is still the platforms themselves that decide the appeals; no external body is consulted, although legislators have started to introduce minimum quality standards for in-house appeals mechanisms (for example, in Germany through § 3b NetzDG, introduced in 2021).

External self-regulatory bodies (for example, the Facebook Oversight Board)

To date, two prominent self-regulatory mechanisms have been established to review platforms’ content decisions. Under the NetzDG, social networks can refer controversial decisions to a self-regulatory body; one such body, financed by Facebook and YouTube, has begun its work. In addition, Facebook established its Oversight Board without any legislative underpinning. These self-regulatory bodies do little to strengthen users’ rights, though, because it is generally only the platforms, not the users, who can refer cases to them. The Facebook Oversight Board claims to accept appeals by users, but it is highly unlikely that appeals by everyday users will ever be selected in relevant numbers (the Board selects cases that are difficult, significant, and globally relevant and that can inform future policy). Platforms themselves are likely to select cases to avoid or abate controversy (for example, the Trump ban), not to enable independent or just review for their users. Especially considering the costs of these bodies, it is unlikely that platforms will open self-regulation to general user appeals.

Member State Courts

And then there are the courts. Unlike in the U.S., under European Law users can successfully sue platforms for reinstatement of content or restoration of accounts after overblocking. In Germany, there have been a large number of such cases, with even the German Constitutional Court issuing a preliminary injunction requiring Facebook to restore an account. Courts often grant speedy preliminary measures, and the financial risks of such proceedings are comparatively modest (between 3,000 and 5,000 Euros), though this may still be too high for many everyday users. Moreover, some big platforms instruct their counsel to raise every argument and litigate aggressively, possibly in an attempt to scare users away from court.

Art. 18 introduces “quasi-courts” incompatible with European Law

Art. 18 draft DSA [Art. 21 of the final DSA text] would add another layer of dispute settlement: self-regulatory bodies to which users can appeal platform decisions and which deliver (partly) binding decisions. Art. 18 draft DSA [Art. 21 of the final DSA text] thus attempts to establish external private bodies that have all the essential characteristics of courts. According to the Explanatory Memorandum of the DSA, online platforms “shall engage […] with the [dispute settlement] body.” This means users can lawfully force platforms into the forum. Additionally, online platforms “shall be bound by the decision taken by the body”. Since Art. 18 draft DSA [Art. 21 of the final DSA text] and recital 44 mention only that recipients of the service shall have the right to seek redress against the decision before a court, it must be concluded that online platforms shall have no such right.

Stripped down, these characteristics (binding decisions from a decision-maker to which one party did not consent) describe the features of a court, or what traditionally only courts can do. At least from the platforms’ perspective, the dispute settlement bodies would be de facto courts.

The European Union has no competence to introduce a new layer of de facto courts at Member State level; it is well established that the Treaty on the Functioning of the European Union (TFEU) confers no such competence. Art. 4(2)(j) TFEU mentions “justice” as a shared competence, but this is understood to cover subject matters such as judicial cooperation, as addressed in Art. 81 TFEU.

Of course, one could argue that the Art. 18 draft DSA [Art. 21 of the final DSA text] dispute settlement bodies are not “real” courts within the meaning of European Law and therefore need not meet its standards. But if Art. 18 draft DSA were interpreted that way, it would be incompatible with other fundamental European laws guaranteeing access to justice: since platforms shall have no right to contest the bodies’ decisions before a (real) court, Art. 18 draft DSA would violate the platforms’ fundamental right of access to justice, guaranteed by Art. 47(2) of the Charter of Fundamental Rights of the European Union and Art. 6 of the European Convention on Human Rights.

Art. 18 is unrealistic and unnecessary

Legal concerns aside, Art. 18 draft DSA [Art. 21 of the final DSA text] is unrealistic in its ambition to establish out-of-court tribunals that are bestowed with all the qualities of a court, yet somehow more effective and less costly. Decisions based on impartial expertise and procedural safeguards are not only hard to design (the very reason public, independent courts exist) but also extremely costly.

This is illustrated by the NetzDG self-regulatory bodies, which are about as costly as small court cases, and by the Facebook Oversight Board, which is very likely far more costly than the corresponding court proceedings. Remember: platforms take part in these self-regulatory procedures voluntarily. If users start dragging platforms into adversarial Art. 18 draft DSA [Art. 21 of the final DSA text] out-of-court proceedings against their will, one should expect the platforms’ counsel to find ample means to burden complainants by making proceedings as complicated as possible. A good example is how some platforms reportedly behave in the (real) courts today, fighting hard even in the clearest of cases.

A Better Path Forward: strengthen litigation in the (real) courts

Member States’ courts already possess most of the necessary characteristics for dispute settlement in this context: expertise, impartiality, fair rules of procedure, and the capacity to deliver binding decisions. What may be lacking is speed and cost-effectiveness. Reform efforts should therefore focus on remedying these specific weaknesses, which is far less difficult than inventing and implementing a whole new layer of de facto courts from scratch.

There are at least three measures that can be taken to ensure speedier and more cost-effective court proceedings:

Expand Art. 11 DSA (Legal Representatives)

Legal proceedings become complicated and slow in cross-border matters. Thanks to existing European law (Art. 7 of the Brussels Ia Regulation, Regulation (EU) No 1215/2012), users can sue platforms over content decisions in the courts of their country of residence; a German user, for example, can sue Facebook in a German court over an account suspension. However, documents would still need to be served. Where platforms are established within the Union, European law already makes this comparably easy, as Art. 14 of Regulation (EC) No 1393/2007 allows cross-border service by post. Serving documents outside the Union, however, can slow down proceedings extraordinarily. Therefore, at least for online platforms, Art. 11 DSA should be expanded so that third-country online platforms must mandate their legal representatives within the Union to accept service of process in civil proceedings concerning decisions to take down content or suspend accounts (as mentioned in Art. 17(1) DSA). To strengthen effective rights enforcement, these legal representatives should also have to accept service where third parties seek to require platforms to take action against users (for example, to take down content).

Preliminary injunctions as a standard measure

Court proceedings are generally slow but can be expedited through interim proceedings. In some areas of law, especially intellectual property infringements, such speedy proceedings have become the rule rather than the exception. They have long been supported by legislative underpinning in European law (see, for example, Art. 3(1) and Art. 11 sentence 3 of the Enforcement Directive (2004/48/EC)) and in national law (see, for example, § 101(7) of the German Copyright Act). Given the importance of very large platforms and how much people depend on them, the same should hold for users’ disputes with “their” platforms.

Capping the costs

Financial risks are a legitimate concern when users sue their platforms over content decisions or account suspensions, and these costs should be reduced. There are precedents: our legal regimes already cap the costs and financial risks of court proceedings for public policy reasons, for example in rental law and labor law, or even in non-commercial copyright infringement cases (see, for example, § 97a(3) of the German Copyright Act). Given the role of online platforms as the places where we “live”, “work” and “speak”, it is reasonable to cap the costs of court proceedings between users and platforms over content or account decisions.

Taken together, these proposals would strengthen litigation in the courts. This would in turn pressure platforms to take their in-house appeals mechanisms more seriously, a win-win situation: cost-effective court proceedings would incentivize effective in-house appeals mechanisms, which are cost-free for users.

The basic rationale of Art. 18 draft DSA [Art. 21 of the final DSA text] could remain intact: Member States could still certify out-of-court dispute settlement bodies. But taking part in their proceedings should be voluntary.

This article was updated to reflect the revised numbering of articles in the Digital Services Act; Article 18 became Article 21.


SUGGESTED CITATION  Holznagel, Daniel: The Digital Services Act wants you to “sue” Facebook over content decisions in private de facto courts, VerfBlog, 2021/6/24, https://healthyhabit.life/dsa-art-21/, DOI: 10.17176/20230124-220043-0.
