Research Opportunities and Challenges of the EU's Digital Services Act
The Digital Services Act (DSA), adopted by the European Union in 2022, offers a landmark framework for platform transparency, with Article 40 enabling vetted researchers to access data from major online platforms. Yet significant legal, technical, and organizational barriers still hinder effective research on systemic online risks. This piece outlines the key challenges emerging from the Article 40 process and proposes practical measures to ensure that the DSA fulfills its transparency and accountability goals.
💡 Research Summary
The paper provides a comprehensive examination of Article 40 of the European Union’s Digital Services Act (DSA), which creates a legal and technical pathway for vetted researchers to obtain both public and non‑public data from Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs). While the provision promises unprecedented transparency for studying systemic risks—such as electoral interference, algorithmic bias, privacy violations, and public‑health threats—the authors, who participated directly in EU policy discussions, identify three interlocking categories of obstacles that are impeding its effective implementation.
First, misaligned incentives create a chronic conflict of interest between platforms and independent scholars. Platforms prioritize protecting proprietary algorithms, commercial secrets, and shareholder value, often responding to data‑access requests with delays, bureaucratic hurdles, or “independence by permission” schemes that shift the burden of hypothesis testing onto the platform itself. Empirical evidence cited in the paper shows that industry‑sponsored collaborations tend to produce outcomes favorable to the platform, thereby undermining the objective, critical scrutiny that the DSA intends to enable.
Second, a stark power and resource asymmetry exists. The rapid expansion of AI‑driven industry research has siphoned public funding and high‑performance computing resources away from academia. Platforms possess extensive legal teams, massive data‑storage infrastructure, and the ability to launch large‑scale public‑relations campaigns that frame the DSA as censorship or as a lever in trade disputes. In contrast, university research groups often lack dedicated legal counsel, sufficient computational capacity, and the financial bandwidth to navigate the complex, iterative data‑request process. This asymmetry creates a vicious cycle: without data, researchers cannot secure grants; without grants, they cannot afford the legal and technical support needed to obtain data.
Third, procedural bottlenecks within the DSA’s implementation framework further slow progress. The Delegated Act (adopted July 2025) establishes a dedicated data‑access portal, but early‑stage pilots reveal mismatches in terminology, expectations, and risk assessments among researchers, national Digital Services Coordinators (DSCs), university Data Protection Officers (DPOs), and the platforms themselves. The lack of a standardized sensitivity‑classification scheme for datasets, coupled with overly cautious privacy and trade‑secret safeguards, leads to approval timelines measured in months rather than weeks.
To address these challenges, the authors propose three concrete recommendations. (1) Streamline access procedures by developing a systematic categorisation of datasets, adopting uniform data‑sharing agreements, and implementing API‑driven workflows that automate vetting, consent, and security checks. Training programmes for DSCs and DPOs are essential to harmonise interpretations of the DSA across Member States. (2) Foster and financially support independent research through an EU‑wide fund and a neutral mediation body that can cover legal fees, provide computational resources, and break the data‑grant feedback loop. Coordinated “bottom‑up” initiatives—such as shared best‑practice repositories, tutorials, and outreach activities—should be encouraged to build collective capacity. (3) Close regulatory blind spots by explicitly extending the DSA’s scope to large language models (LLMs) and other emerging intermediary services that now shape information access on search engines and social media. The authors argue that LLMs meet all DSA criteria (intermediary service, systemic risk potential, large‑scale user base) and therefore must be subject to the same transparency and risk‑assessment obligations.
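To make recommendation (1) more concrete, an automated vetting workflow built on a standardized sensitivity classification could be sketched as below. The tier names, request fields, and safeguard rules are illustrative assumptions for this summary; neither the DSA nor the Delegated Act defines them.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical sensitivity tiers for platform datasets; these names are
# illustrative only and are not defined in the DSA or the Delegated Act.
class Sensitivity(Enum):
    PUBLIC = 1          # e.g. public posts, ad libraries
    PSEUDONYMISED = 2   # e.g. engagement logs with hashed user IDs
    NON_PUBLIC = 3      # e.g. internal risk-assessment data

@dataclass
class AccessRequest:
    researcher_vetted: bool     # vetted by a Digital Services Coordinator
    dpo_approved: bool          # sign-off from the university's DPO
    sensitivity: Sensitivity

def required_safeguards(req: AccessRequest) -> list[str]:
    """Map a request to the safeguards a data-access portal could enforce.

    Higher tiers add safeguards on top of lower ones, so review effort
    scales with sensitivity instead of defaulting to the strictest case.
    """
    if not (req.researcher_vetted and req.dpo_approved):
        return ["reject: vetting incomplete"]
    safeguards = ["standard data-sharing agreement"]
    if req.sensitivity.value >= Sensitivity.PSEUDONYMISED.value:
        safeguards.append("secure processing environment")
    if req.sensitivity is Sensitivity.NON_PUBLIC:
        safeguards.append("trade-secret redaction review")
    return safeguards
```

The point of such a scheme is that, once every party agrees on the tiers, the vetting, consent, and security checks become deterministic and auditable, which is what would allow approval timelines to shrink from months to weeks.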
Finally, the paper calls for a robust oversight mechanism within the European Commission to monitor the DSA’s rollout, prevent platform resistance, and ensure that the spirit of independent scrutiny is preserved. If Article 40’s implementation is obstructed, the ability of scholars to audit digital media will be severely curtailed, eroding public trust and the EU’s leadership in digital governance. The authors conclude that the EU’s proactive stance can serve as a global model, paving the way for a more transparent, accountable, and safe digital ecosystem worldwide.