Following the publication of The Internet Commission’s latest working paper on the opportunities and challenges presented by Article 21 of the Digital Services Act (DSA), some of our colleagues were recently approached by the London School of Economics to write a blog post for its readers on resolving content disputes outside the courtroom under the DSA.
Written by Jonny Shipp, Aebha Curtis, and Ruairi Harrison, the piece was published on the Media@LSE blog, which encourages informed and insightful discussion and brings research and expertise to a wider audience of fellow academics, civil society, policymakers, journalists and the broader media industry.
Resolving content disputes outside the courtroom using the Digital Services Act
Out-of-court Dispute Settlement in the DSA – An Overview
In 2022, the European Union adopted the Digital Services Act (DSA), a new digital rulebook affecting EU users, platforms, and businesses alike. The DSA’s core objectives are to improve the safety of EU internet users whilst protecting fundamental rights such as freedom of expression, and to empower users to assert their rights online.
Article 21 DSA’s out-of-court dispute settlement (ODS) mechanism is one distinctive tool for achieving these objectives, with potentially far-reaching implications for users: it allows users to challenge a platform’s content-related decisions before an independent, expert non-judicial body. Previously, users wishing to contest a platform’s moderation decision could do so only via the platform’s internal complaints process (which typically only the larger platforms offer) or in a court of law. Given the potential for bias in the former and the costs involved in the latter, ODS provides a novel redress route which EU regulators believe will be affordable, independent, and easily accessible.
What is ODS?
ODS (also known as Alternative Dispute Resolution [ADR]) is a non-judicial form of redress that allows consumers to resolve complaints with a business via a third-party (ODS) body. Article 21 DSA marks the first time ODS has been used for content-related disputes, but it has proved effective in sectors including aviation, public transport, telecommunications and energy, and has played a role online in domain-name disputes since 2000. Across numerous markets, ODS has played an indispensable role in restoring consumer trust and recalibrating the power imbalance between businesses and consumers. Effective redress gives users the tools to protect their interests and to participate in the administration of justice, enhancing their sense of agency and, in turn, building trust between the user and the service.
In the case of online platforms, Article 21 DSA introduces ODS bodies that will be certified by national regulators in the EU to settle content moderation-related disputes. This covers platforms’ decisions to remove content or disable access to it, but also decisions merely to demote content (as with shadow banning, a common platform practice of significantly reducing a piece of content’s visibility).
To be certified, ODS bodies must satisfy several conditions, most notably ‘independence’ and ‘expertise’. These conditions are the focus of the working paper we presented earlier this year at the DSA and Platform Regulation Conference at the Amsterdam Law School, entitled “Settling DSA-related Disputes Outside the Courtroom: The Opportunities and Challenges Presented by Article 21 of the Digital Services Act”. The paper examines these conditions and anticipates the kinds of complaints that ODS bodies may resolve.
The Requirements of ODS Bodies under the DSA
Independence
Under the new EU rules, ODS bodies must prove to the certifying Digital Services Coordinators (DSCs) that they will be ‘impartial’ and ‘(financially) independent’ of both platforms and users. Our paper explores how minimum thresholds should be coordinated in line with a recent Commission Recommendation on quality requirements for dispute resolution procedures, which explicitly addresses these conditions. Combining the most pragmatic understanding of impartiality and independence with the way they are presented in the Commission Recommendation, we found that ‘independence’ means not being subject to instructions from either party, while ‘impartiality’ requires bodies to guarantee that they are not remunerated in a way that is linked to the outcome of the procedure.
Our paper therefore examines where bias may impinge on the ODS ecosystem from the perspective of both the claimant and the ODS body. Taking the Meta Oversight Board as a case study, we consider borderline cases such as ex-employees with sufficient expertise but questionable independence, and we present scenarios that highlight the difficulties in establishing these minimum standards. We also discuss how threats to independence may emerge from parties not explicitly connected to the dispute, for instance interest groups or state interests, as these scenarios are not directly addressed in the regulation.
One overarching conclusion, which also applies to the ‘expertise’ condition, is that concerns about malicious claims and an inherent pro-claimant bias may be addressed by defining these conditions clearly through coordinated guidance at national (DSC) level. The independence of these bodies is essential to building effective engagement and establishing users’ trust in this redress tool; transparency regarding the funding and fee structures of ODS bodies will therefore do most to maximise trust in them and to curb biased decision-making.
Expertise
ODS bodies must also prove their ‘expertise’ in resolving content disputes. This means having expertise not only in the ‘subject matter’ of the complaint (for example, illegal and/or harmful types of content) but also in at least one EU language and in the resolution of complaints more generally. Previous EU ADR rules provide useful context for fleshing out the DSA’s ‘expertise’ condition, as they highlight the value of legal knowledge. Thus, although not explicitly mentioned in the DSA, we argue that the expertise required of ODS bodies should include a minimum level of legal competence. This is crucial for complex fundamental rights-related disputes, most notably those concerning privacy versus freedom of expression online.
We also discuss the importance of further delineating what type of expertise each ODS body has. The emerging landscape of ODS bodies may be segmented in several ways, with bodies having particular expertise regarding individual platforms (e.g. Instagram), types of platforms (e.g. gaming platforms), types of content (e.g. hate speech), and/or state-specific expertise (relating to local culture and/or language).
The natural response to the need for state-specific expertise is for states to establish their own state-backed ODS bodies. Austria and Hungary have announced that they will do so, and others may follow. However, questions arise as to how these state bodies can claim expertise across the vast array of content disputes that may land on their doorstep, and how regulators can guarantee such a broad ‘scope’ of expertise. Further uncertainties concern how state-backed ODS bodies’ expertise can be shared effectively across the market, particularly for recurring types of dispute which proliferate across the EU and may require a coordinated approach. Effective dispute resolution for numerous platform types, such as dating platforms, online marketplaces and video-sharing platforms, might be more easily achieved at a pan-European level than across all services at a national level.
We argue that one key means of addressing this potential knowledge gap in the ODS market would be the creation of a hub for all ODS bodies, whether state-backed or private. Expertise can be enhanced and shared at an industry-wide level by establishing mechanisms for exchanging knowledge about the market, innovation and best practice. Such a hub could bring together existing and new organisations that have developed, or are developing, expertise; ensure that cases are referred efficiently to bodies with the right expertise; and help consolidate insights about emerging risks and harms so that these can be communicated to the industry to drive improvements.
ODS in Practice: Opportunities and Challenges
To be effective, the recommendations of ODS bodies will need to strike a balance between punishment and rehabilitation that is appropriate to the environment and its norms and values. For example, an online dating platform may be directed towards different approaches for first-time and repeat offenders. In a child-focused environment, the emphasis may be more appropriately rehabilitative or educational where the offending user is a child, just as the punitive element should be more definitive where the offending user is an adult. ODS bodies should recommend that platforms adjust their approach as new issues emerge and ensure they are able to scale their operations. Their rules of procedure will need to address the admissibility and reliability of evidence while allowing for the fact that these are dynamic environments in which issues will extend across platforms and jurisdictions.
By engaging effectively with ODS bodies, platforms will have new opportunities to improve user experiences across national borders and to receive helpful, system-level insights. Our paper discusses these opportunities and challenges in detail, and we are keen to engage with all stakeholders as we develop our thinking in this field.
Want to learn more? We’re happy to help.