Thursday, October 11 • 11:00am - 12:30pm
Value(s) of Privacy


D.E. Wittkower
Studies of the privacy paradox seek explanations on the user side, aiming to reform users’ thoughts and behaviors to fit the available technical systems of privacy. A standard engineering-ethics perspective would instead ask why privacy systems do not fit users’ mental models of privacy, and how these systems could be reformed to allow users to take effective action.

I outline users’ mental models of privacy and use those models to explore interactions in the internet of things (IoT), identifying where users’ mental models surface ignored privacy issues or fit poorly with existing privacy-relevant systems.

I outline an interpersonal phenomenology of privacy oriented by ethics of care, considering privacy as it appears in parenting, friendship, romantic relationships, and care for elderly and disabled persons. Three distinctive dynamics of privacy are identified in interpersonal contexts: (1) how privacy is connected to self-determination, (2) how privacy is used in information economies to create intimacy, and (3) how constantly refreshed consent is integral to maintaining trust and intimacy in interpersonal privacy contexts.

These elements of the phenomenology of privacy in interpersonal contexts are then applied to a variety of kinds of IoT devices and systems: GPS navigators, the Amazon Alexa virtual assistant, Nest, and PARO and RIBA.

Through consideration of these examples, we see that privacy violations are experienced not through the access and use of data, but through failures to exhibit care for the user in ongoing relationships that are respectful and open to renegotiation.

Christoph Lutz, Christian Pieter Hoffmann, Giulia Ranzini
Internet users face many risks and threats online. Among others, they can become victims of online harassment, spam, hacking, or identity theft. Academic literature has long investigated such experiences and related attitudes within the field of online privacy. Recently, a strand of privacy research has argued that users develop privacy fatigue, privacy cynicism, or privacy apathy. Accordingly, Internet users develop coping mechanisms to subjectively resolve the paradoxical tension between wanting to use online applications and being concerned about their safety. Existing studies on privacy fatigue, privacy cynicism, and privacy apathy are, however, still in their infancy and mostly based on qualitative research. Therefore, we set out to provide more generalizable evidence on the phenomenon, summarizing initial findings from a large-scale survey on online privacy in Germany. In particular, we develop and present a first measurement of the concept of privacy cynicism, differentiating four dimensions. We further show that powerlessness and mistrust are the most prevalent dimensions of cynicism, followed by uncertainty and resignation.

Erika Pearson
This paper uses a case study to explore critical privacy issues inherent in a pilot IoT-based sensor system designed to measure urban flows. The experiences of this case study have been shared with cities around the world that intend to roll out similar IoT and “smart city” solutions, and it illustrates the complexity of privacy as one facet amid the technical, financial, and legal constraints acting on existing urban spaces as they attempt to use these new technologies to achieve laudable governmental objectives. In particular, this paper explores the tensions found during the project’s development between technological and policy pressures to deliver data and individual attempts to incorporate a “privacy by design” framework into the system. Over the three years of this pilot, privacy was raised by multiple actors involved in the trial as an important concern, yet ultimately privacy was de-prioritized in favor of the data-rich outputs of complex urban sensor networks. As this pilot will be one of a set of trials informing global experience, this paper explores and deconstructs where privacy efforts, particularly the privacy by design approach, succeeded and where they fell short, and offers suggestions for how future experiments can more fully develop their privacy frameworks in the face of strong technical pressures.

Michael Zimmer, Katie Chamberlain Kritikos, Jessica Vitak, Priya Kumar, Yuting Liao
Fitness trackers are an increasingly popular tool for tracking health and physical activity. Their benefits hinge on ubiquitous data collection and the algorithmic processing of personal fitness information (PFI). While PFI can reveal novel insights about users’ physical activity, health, and personal habits, it also contains potentially sensitive information that third parties may access in contexts unanticipated by fitness tracker users. This paper argues that while many users attempt to manage their PFI with privacy boundaries, they can also succumb to “information flow solipsism”: a broad unawareness of how fitness tracker companies might collect and aggregate their PFI. Our mixed-methods approach involved a survey and semi-structured interviews. Most survey respondents had limited knowledge of companies’ data tracking and retention policies. Additionally, most interviewees expressed only minimal privacy concerns regarding PFI. While some recognized that PFI may need boundaries to manage information flows, they did not find the information sensitive enough to take personal responsibility for defining such boundaries. Viewed through Communication Privacy Management theory, users’ conceptualizations of ownership, privacy rules, and turbulence regarding their PFI influence how they manage privacy boundaries. Inherent trust in fitness tracker companies also led users to assume that privacy rules properly limit the flow of PFI. This combination suggests fitness tracker users are potentially in a state of information flow solipsism, a position of ignorance about how PFI flows across devices and platforms that creates unanticipated privacy risks.

Thursday October 11, 2018 11:00am - 12:30pm EDT
Sheraton - Drummond Centre