How to Stop Playing Whack-a-Mole: Mapping the Ecosystem of Technologies Facilitating AI-Generated Non-Consensual Intimate Images
The last decade has witnessed a rapid advancement of generative AI technology that significantly scaled the accessibility of AI-generated non-consensual intimate images (AIG-NCII), a form of image-based sexual abuse that disproportionately harms women and girls. There is a patchwork of commendable efforts across industry, policy, academia, and civil society to address AIG-NCII. However, these efforts lack a shared, consistent mental model that situates the technologies they target within the context of a large, interconnected, and ever-evolving technological ecosystem. As a result, interventions remain siloed and are difficult to evaluate and compare, leading to a reactive cycle of whack-a-mole. We contribute the first comprehensive AIG-NCII technological ecosystem that maps and taxonomizes 11 categories of technologies facilitating the creation, distribution, proliferation and discovery, infrastructural support, and monetization of AIG-NCII. First, we build and visualize the ecosystem through a synthesis of over a hundred primary sources from researchers, journalists, advocates, policymakers, and technologists. Next, we demonstrate how stakeholders can use the ecosystem as a tool to 1) understand new incidents of harm via a case study of Grok and 2) evaluate existing interventions via three more case studies. We conclude with three actionable recommendations, namely that stakeholders should 1) use the ecosystem to map out state, federal, and international laws to produce a clearer policy landscape, 2) collectively develop a database that dynamically tracks the 11 technologies in the ecosystem to better evaluate interventions, and 3) adopt a relational approach to researching AIG-NCII to better understand how the ecosystem technologies interact.
💡 Research Summary
The paper addresses the rapidly growing problem of AI-generated non-consensual intimate images (AIG-NCII), a form of image-based sexual abuse that disproportionately harms women and girls. While industry, policy, academia, and civil-society groups have launched numerous interventions, ranging from U.S. federal takedown mandates and state statutes to lawsuits against "AI nudifier" apps, advocacy campaigns against distribution sites, and AI-safety techniques, the authors argue that these efforts remain fragmented because they lack a shared mental model of the underlying technology ecosystem.
To fill this gap, the authors synthesize more than one hundred primary sources (academic papers, investigative journalism, policy documents, technical reports, GitHub repositories, terms of service, etc.) and construct the first comprehensive "AIG-NCII technological ecosystem" map. The map categorizes technologies into five functional roles, namely creation, distribution, proliferation and discovery, infrastructural support, and monetization, and identifies eleven concrete technology categories: (1) training datasets, (2) generative AI models, (3) generative AI interfaces, (4) distribution channels, (5) deepfake creation communities, (6) search engines, (7) advertising platforms, (8) app stores, (9) developer platforms, (10) critical service providers (e.g., cloud, CDN), and (11) payment processors. Each category is described with examples, its historical evolution, and its specific contribution to the creation of, spread of, or profit from AIG-NCII.
The authors then demonstrate the practical utility of the ecosystem through four case studies. The first examines the 2025 "Grok" incident, showing how the map can quickly identify which parts of the ecosystem (e.g., a new multimodal model, its public interface, payment integration) are responsible for a surge in harmful content. The remaining three evaluate existing interventions: the federal TAKE IT DOWN Act (focused on distribution channels), the San Francisco lawsuit against an "AI nudifier" app (targeting creation tools), and the shutdown of the Mr.DeepFakes community (addressing deepfake creation communities). Each analysis reveals mismatches between the intervention's focus and the broader ecosystem, highlighting why some measures have limited impact.
Based on these insights, the paper offers three actionable recommendations: (1) stakeholders should use the ecosystem map to overlay and harmonize state, federal, and international laws, thereby exposing gaps and redundancies; (2) a centrally maintained, dynamically updated database should be built to track the eleven technology categories, enabling real-time assessment of interventions and emerging threats; (3) research should adopt a relational approach that studies the "edges" where technologies intersect (e.g., developer platforms and payment processors) to anticipate new abuse pathways.
In sum, the work reframes AIG‑NCII not as isolated incidents caused by single models or apps, but as the product of a stable yet evolving technological network. By providing a shared taxonomy and visual map, the authors equip policymakers, technologists, and advocates with a tool to move beyond the reactive “whack‑a‑mole” game toward coordinated, systemic prevention strategies.