TOWARDS FAIRNESS, ACCOUNTABILITY, AND TRANSPARENCY IN PLATFORM GOVERNANCE
Robert Gorwa
Drawing inspiration from recent work on Fairness, Accountability, and Transparency (FAT) in machine learning, this paper explores a similar research agenda for fairness, accountability, and transparency in platform governance. The paper seeks to make two contributions: (a) to provide the initial provocation for what could be termed FAT-platform studies, and (b) to build on the extant platform governance literature (e.g., Gillespie 2010, 2015, 2017; DeNardis & Hackl, 2015) with an empirical, qualitative case study of Facebook's policy practices.
A NEW INSTITUTION ON THE BLOCK: AIRBNB AS COMMUNITY PLATFORM AND POLICY ENTREPRENEUR
Niels van Doorn
This paper argues that “sharing economy” platforms should be understood as new institutional forms that are transforming relations between market, state, and civil society actors. Focusing on Airbnb, in particular its Airbnb Citizen initiative and its recently introduced Policy Tool Chest, it examines the strategic conflation of the platform company and its user base as Airbnb advances its public policy goals in cities around the globe. While Airbnb has been known to evade regulation, the company has increasingly sought to become a trusted partner in urban policy making. Assuming the more proactive and agenda-setting role of the urban policy entrepreneur (Pollman and Barry 2017), Airbnb aims to co-shape the terms of current and future policy debates pertaining not just to home sharing/short-term rentals but also to the very fabric of city life. It seeks to achieve this by mobilizing its user base, which it frames as a community of entrepreneurial middle-class citizens looking to supplement their income in a post-recession climate of economic insecurity and opportunity.
The paper demonstrates how Airbnb exploits the ambiguity of its exceptional status as a new hybrid actor in neoliberal urban governance networks: where it becomes a partner in policy making, market and civil society actors/interests converge. It is argued that we are witnessing an emerging mode of “platform urbanism” that introduces a new conundrum to public regulatory institutions: when “sharing economy” platforms collapse the public/private distinction, who benefits from (a lack of) regulation and whose wellbeing are we trying to protect?
HATRED OF/AND DEMOCRACY: THE POLITICAL CONTRADICTIONS OF REDDIT’S MODERATION STRUCTURE
Trevor Garrison Smith
This paper seeks to interpret Reddit moderation as a problem of political theory, rather than as a debate over the merits of human versus algorithmic moderation. Analyzing Reddit’s moderation structure shows that both human and algorithmic moderation reinforce a form of anti-politics that leaves users feeling they have no input, and thus no stake in the well-being of the subreddits in which they participate. Online governance structures are largely top-down and authoritarian in nature, despite often being couched in democratic rhetoric, reflecting what Jacques Rancière describes as a hatred of democracy. Looking at how r/Canada came to be widely disparaged on Reddit as a bastion of hate, I argue that the key to rooting out online hate is not more human moderation or giving algorithms more control, but creating a democratic culture of buy-in through which users are empowered with responsibility for the quality of content in a discussion space.
TOWARDS A COMMON LANGUAGE AND SHARED UNDERSTANDING: A VOCABULARY FOR ONLINE MODERATION
Sarah Myers West, Kat Lo, Claudia Lo, Rochelle LaPlante, Sarah T. Roberts
Online moderation comes in many different forms, but the discourse surrounding it currently lacks terminological precision. This paper argues that it is therefore crucial to develop a clear vocabulary for the different elements of moderation, one that reflects the variety of contexts and approaches employed by platforms of different types and scales.
Drawing on our experiences as researchers and moderators, we propose a precise vocabulary for moderation to engender dialogue about like concepts across domains and applications: how industry develops best practices; how regulators craft legal regimes to influence moderation; and how we as a public understand and debate this area. We distinguish commercial content moderation from community moderation and identify the elements unique to each. Commercial content moderation is best distinguished by the distance it creates between the parties involved, while community moderation features a much closer relationship between them. We also elaborate on the technical affordances, labor conditions, timescale and scope of work, and recruitment and training practices for these types of moderation.
This vocabulary is still emergent, but we believe that precision in terms is a valuable objective. Continued conflation of different types of moderation is hugely detrimental when the next step is to propose solutions. Our aim is to facilitate interdisciplinary communication so that journalists, regulators, researchers, and practitioners can come together and participate in critical analysis and intervention.