Platform Governance and Online Safety

Developing pluralistic governance frameworks and technical tools, centering hermeneutical and restorative justice to address the online safety concerns of minority and vulnerable populations.

Outcome

CSCW '24, ICTD '24, ICTD '20; additional work under submission

Overview

This project examines how the mechanisms used to regulate online spaces, such as content moderation and reputation systems, frequently prioritize majority viewpoints while marginalizing religious and ethnic minorities. We seek to understand the social and technical factors that make online spheres unsafe for vulnerable groups, particularly in the Global South.

Approach

We employ a mixed-methods approach, combining semi-structured interviews with the co-creation of culturally grounded datasets. We interview religious minorities and Indigenous ethnic groups in the Global South to understand their experiences of online interaction. Using the asynchronous remote community method, we engage with geographically distributed populations to surface "minority hermeneutics": how these communities interpret speech and experiences from their own perspectives. We also build and evaluate technical artifacts, such as Mod-Guide, which uses retrieval-augmented generation to ground LLM-based moderation feedback in community-sourced knowledge.
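The retrieval-augmented pattern behind a tool like Mod-Guide can be illustrated with a minimal sketch. Everything here is hypothetical: the class and function names (`CommunityNormStore`, `build_moderation_prompt`), the example norms, and the bag-of-words retrieval are illustrative assumptions, not Mod-Guide's actual implementation, which the text describes only at a high level.

```python
# Hypothetical sketch of retrieval-augmented moderation feedback.
# Community-sourced norm statements are retrieved by cosine similarity
# over term counts, then prepended to the prompt sent to an LLM
# (the LLM call itself is omitted).
import math
import re
from collections import Counter


def tokenize(text):
    """Lowercase a string and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())


class CommunityNormStore:
    """Stores community-sourced norms and retrieves those most
    similar to a flagged post (illustrative, not Mod-Guide's API)."""

    def __init__(self, norms):
        self.norms = norms
        self.vectors = [Counter(tokenize(n)) for n in norms]

    @staticmethod
    def _cosine(a, b):
        dot = sum(a[t] * b[t] for t in set(a) & set(b))
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    def retrieve(self, post, k=2):
        q = Counter(tokenize(post))
        ranked = sorted(zip(self.norms, self.vectors),
                        key=lambda nv: -self._cosine(q, nv[1]))
        return [norm for norm, _ in ranked[:k]]


def build_moderation_prompt(post, store):
    """Ground the moderation request in retrieved community norms."""
    context = "\n".join(f"- {n}" for n in store.retrieve(post))
    return (
        "Community norms (community-sourced):\n"
        f"{context}\n\n"
        f"Flagged post: {post}\n"
        "Explain whether the post conflicts with these norms."
    )
```

In a real system the keyword matcher would typically be replaced by dense embeddings, and the assembled prompt would be passed to an LLM; the point of the sketch is only the grounding step, in which community-authored norms, rather than a platform-wide policy, supply the interpretive context for the model's feedback.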

Key Contributions

We developed the "usership" framework, which draws analogies to citizenship to analyze the legal status, rights, and responsibilities of intersectionally marginalized users online. Our work identifies a "politics of fear," illustrating how surveillance and social isolation drive minority voices into a "spiral of silence" on platforms like Facebook. We have proposed a transition from punishment-based moderation to restorative justice, focusing on healing communal bonds and addressing hermeneutical injustice. Our artifact contributions demonstrate how LLMs can be made more sensitive to pluralistic norms through community participation.