16 March 2022

A Self-Regulatory Race to the Bottom through Out-of-Court Dispute Settlement in the Digital Services Act

How the DSA will introduce competition for the Meta Oversight Board (and the German FSM) and why we should be worried about this

The (Meta) Oversight Board is a unique and expensive organisation. Financed by Facebook through a $130 million trust, it aims to protect free expression by making principled, independent decisions about important pieces of content and by issuing advisory opinions on Facebook’s content policies. The Board has renowned experts in its ranks and is getting a lot of attention.

A similar, though less well-known, self-regulatory body has been set up by YouTube and Facebook in Germany. The NGO “Freiwillige Selbstkontrolle Multimedia-Diensteanbieter” (FSM) has been certified by the Federal Office of Justice as a self-regulation institution under the NetzDG. Platforms can now ask the FSM to decide tough content removal cases. Though working more quietly than the Oversight Board, the FSM has also set up high-quality processes, in which panels of specialised lawyers deliver well-reasoned and, as it seems to me, thoroughly balanced decisions.

However, one problem remains: both institutions are exclusive. Everyday users cannot demand a review of their case or, at best – as with the Oversight Board – have only a very slim theoretical chance of bringing their case before these high-profile entities.

These things are well known. But it has remained somewhat under the radar that both the Oversight Board and the FSM might face cheap and non-exclusive competition from the future out-of-court dispute settlement bodies that will be introduced by the Digital Services Act (DSA).

Art. 18 of the draft DSA [Art. 21 of the final text], which will introduce these new dispute settlement processes, addresses a legitimate policy concern, namely the need for effective recourse mechanisms against platform decisions. However, as I will explain in the following paragraphs, the concept fails in its attempt to combine the best of two worlds: resolving disputes through real courts as well as through self-regulation. Art. 18 draft DSA [Art. 21 of the final text] raises serious concerns and should be substantially modified.

Certification of out-of-court dispute settlement bodies

Art. 18 draft DSA [Art. 21 of the final text] will allow for the certification of out-of-court dispute settlement bodies. Anyone who can show independence (from platforms and from their users alike) and who can demonstrate expertise in content moderation matters (across applicable laws and Terms of Service) can apply to be certified by the authorities. One can imagine that lawyers or former platform trust and safety staff might be well placed to set up such bodies and to seek certification.

Once an entity is certified, users can request the body to review their dispute over a moderation decision. For example, when Facebook suspends the account of a user or removes a specific piece of content, that user can then ask the body to initiate a dispute settlement process concerning the platform’s decision. According to Art. 18 draft DSA [Art. 21 of the final text], the settlement body shall then resolve the conflict in a swift, efficient, and cost-effective manner.

There are still many open questions as to what future proceedings will look like. A central issue still being debated is whether decisions by the settlement bodies will be binding (the proposal of the European Commission and the European Parliament) or mere recommendations (the current position of the EU member states). One underlying concern here is that binding decisions by future Art. 18 [Art. 21 of the final text] bodies would turn these bodies into private de facto courts, as I have criticised before.

In-built biases and one-sided incentives to “sue”

Some other specific features of the potential future settlement processes seem worrisome, as they might disproportionately favour users, who might be over-incentivised to “sue”:

  1. Platforms must cooperate: As a general rule, platforms must take part in the proceedings. This means a user can drag Facebook, for example, before these de facto courts.
  2. Incentives to “sue”: The user bears minimal financial risk and instead faces incentives to “sue”. If users lose their case, they bear their own costs but generally do not have to reimburse the platform. It remains unclear whether a losing user might at least have to bear the fees of the proceedings. This is not mirrored the other way around: if the user wins, the platform must reimburse them for any fees and other reasonable expenses (which would probably include reasonable attorney fees). Overall, this scheme comes close to playing roulette with free chips, with any winnings paid out in real dollars.
  3. Built-in bias: The dispute settlement bodies will be financed through the fees charged for the dispute settlement. However, it is the user affected by a content decision who can initiate the proceedings. In this setting, funding is secured only if users are willing to bring cases. Since users may select any certified settlement body, this creates the risk of a built-in bias: to attract users, the settlement bodies might tend to interpret laws and Community Standards in an overly expression-friendly way, thereby raising users’ chances of winning a case. In my opinion, this problem would not be solved if Art. 18 draft DSA [Art. 21 of the final text] also allowed notice-senders to bring disputes (as suggested by the EU member states). This might simply lead to some settlement bodies “specialising” the other way around, building a reputation for being very enforcement-friendly. Finally, I do not think that disabling forum shopping in Art. 18 draft DSA [Art. 21 of the final text] would solve the problem either. Given the experience with current litigation over content removal decisions, it is highly likely that Art. 18 draft DSA [Art. 21 of the final text] proceedings will overwhelmingly be initiated by affected users (uploaders). So even if applicants could not forum-shop, the Art. 18 draft DSA [Art. 21 of the final text] entities would still face incentives to structurally favour users.

Competition for the Meta Oversight Board and the FSM

To summarise, the future Art. 18 draft DSA [Art. 21 of the final text] bodies are low-threshold, non-exclusive, and designed to be attractive to users, who can drag platforms before them. All this makes the proposed bodies nasty competition for the Oversight Board and the FSM, which, given the costs that come with every decision, probably cannot open up their high-quality proceedings to everyday user complaints. This alone bears the risk of diminishing, for example, the Oversight Board’s influence.

One could still try to brush off such concerns with the argument that the Board’s influence mainly lies beyond decisions on specific items of content. A good part of the Board’s influence stems from its role in delivering more general policy recommendations, as seen, for example, in a decision published on 1 February 2022, where the Board made recommendations to Facebook on how to modify its Child Sexual Exploitation, Nudity and Abuse Community Standard. However, the Art. 18 draft DSA [Art. 21 of the final text] bodies may become active in the field of more general recommendations too, since they will need to attract attention to their business and to demonstrate expertise in the areas at hand. Indeed, future Art. 18 draft DSA [Art. 21 of the final text] bodies will even have a legal basis for being outspoken beyond specific decisions: one central question when reviewing content moderation decisions is whether or not the respective Community Standards are valid in the first place. Art. 18 draft DSA [Art. 21 of the final text] bodies will need to decide this, for example, whether a given Community Standard is invalid because it disproportionately discriminates against users under applicable law, as the German Federal Court of Justice recently held with regard to parts of Facebook’s Community Standards.

High costs for platforms

For the platforms themselves, the most worrying aspect might be the costs that come with numerous settlement proceedings. As I have described above, the procedures will produce costs on the side of the platforms even if they win. If the platforms lose, costs rise: they will then also have to reimburse the users’ legal fees. This might easily result in hundreds or even thousands of Euros that a platform has to pay for a proceeding over a single moderation decision. For very large platforms, costs might add up to millions of Euros per year. Even if companies like Facebook and YouTube could bear the costs of such an additional layer of dispute settlement, Art. 18 draft DSA [Art. 21 of the final text] does not only apply to such gigantic platforms. All online platforms bigger than small enterprises (see Art. 19 DSA) can be dragged into proceedings, that is, by the applicable definition, every platform with more than 50 employees or with an annual turnover of more than EUR 10 million. For many such medium-sized platforms, costs in the tens of thousands of Euros, which might result from only a few dispute settlement proceedings, could have severe consequences.
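To illustrate the order of magnitude, a rough back-of-the-envelope calculation might look as follows (all figures are purely hypothetical assumptions on my part, taken neither from the DSA nor from any platform’s data):

\[
\text{annual cost} \approx N \times \bigl(c_{\text{own}} + p_{\text{loss}} \times c_{\text{reimb}}\bigr)
\]

Assuming, purely for illustration, N = 10,000 proceedings per year, own costs of EUR 300 per proceeding, a 50% chance of losing, and EUR 1,000 in reimbursed user expenses per lost case, this would yield roughly 10,000 × (300 + 0.5 × 1,000) = EUR 8 million per year.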

Not only platforms and the Oversight Board should be worried

In the end, you might ask: why not? Is this not just lamenting over industry-financed advisory bodies facing independent competition? Wouldn’t Meta and YouTube be very capable of bearing the costs? And couldn’t legislators simply exempt smaller platforms more broadly from Art. 18 draft DSA [Art. 21 of the final text]?

One might argue so if we could expect Art. 18 draft DSA [Art. 21 of the final text] to introduce meaningful dispute settlement in the first place, which seems highly questionable.

First of all, as I have argued elsewhere in more detail, Art. 18 draft DSA [Art. 21 of the final text] allows users to “drag” platforms into dispute settlement against their will (and, essentially, at their expense), which probably conflicts with fundamental rights, especially if lawmakers decide to make the decisions of Art. 18 draft DSA [Art. 21 of the final text] bodies binding. Besides such legal concerns, the envisaged dispute settlement bodies might be biased by design (as described above). Moreover, Art. 18 draft DSA [Art. 21 of the final text] bears the risk of nourishing a questionable “settlement industry” (given the costs and reimbursement scheme described above). Picture ambulance-chasing lawyers soliciting Facebook users, or think of kick-back agreements between users and some lawyers.

Finally, we should also be worried about the kind of users the future Art. 18 draft DSA [Art. 21 of the final text] bodies will attract. I expect that trolls, conspiracy theorists and right-wing users in particular will be excited to drag Facebook & Co. into proceedings, fighting for their right to speak “awfully but lawfully” and enjoying the spotlight. Consider the litigation over moderation decisions we already see today: right-wing and/or hateful content is often at the centre of court proceedings against platforms over the reinstatement of content or the re-activation of accounts (as indicated by observations for Germany and for the U.S.). I also have a certain gut feeling that this group of claimants might often enjoy dragging affected third parties into the proceedings, for example the person who requested the take-down in the first place – Art. 18 draft DSA [Art. 21 of the final text] does not yet solve the problem of how these persons, especially when they are victims of the disputed content, can safely take part in the proceedings.

Conclusion

Art. 18 draft DSA [Art. 21 of the final text] seeks to address a legitimate policy concern: effective recourse mechanisms against platform decisions. But we already have a legal framework for this: in Europe, users can successfully sue platforms in “real” courts when platform decisions are unjustified. Sure, court proceedings can be lengthy and costly, and we should make them easier and more effective. Complementary to state courts, we might welcome truly voluntary self-regulation like the Oversight Board or the FSM and, of course, the platforms’ own in-house appeal mechanisms. However, Art. 18 draft DSA [Art. 21 of the final text] fails in trying to combine both aspects – courts and self-regulation. It introduces a highly questionable regime of private de facto courts. As a side effect, it risks undermining the widely welcomed voluntary self-regulation that Meta and YouTube have set up (Oversight Board, FSM). And at least smaller platforms will be overburdened by Art. 18 draft DSA [Art. 21 of the final text]. If European lawmakers do not want to let go of Art. 21 DSA, they should radically modify it: taking part in these proceedings needs to be voluntary for the platforms, even if, at a practical level, this might come close to fully abandoning the proposal.

Edited to reflect the change in Article numbering in the official publication of the DSA; Article 18 DSA became Article 21.


SUGGESTED CITATION  Holznagel, Daniel: A Self-Regulatory Race to the Bottom through Out-of-Court Dispute Settlement in the Digital Services Act: How the DSA will introduce competition for the Meta Oversight Board (and the German FSM) and why we should be worried about this, VerfBlog, 2022/3/16, https://healthyhabit.life/a-self-regulatory-race-to-the-bottom-through-art-18-digital-services-act/, DOI: 10.17176/20220316-121129-0.
