Engineering privacy and protest: A case study of AdNauseam

Daniel C. Howe, Helen Nissenbaum

Research output: Contribution to journal (Article)

Abstract

The strategy of obfuscation has been broadly applied (in search, location tracking, private communication, and anonymity) and has thus been recognized as an important element of the privacy engineer's toolbox. However, there remains a need for clearly articulated case studies describing not only the engineering of obfuscation mechanisms but, further, providing a critical appraisal of obfuscation's fit for specific sociotechnical applications. This is the aim of our paper, which presents our experiences designing, implementing, and distributing AdNauseam, an open-source browser extension that leverages obfuscation to frustrate tracking by online advertisers. At its core, AdNauseam works like a list-based blocker, hiding or blocking ads and trackers. However, it provides two additional features. First, it collects the ads it finds in its 'Vault', allowing users to interactively explore the ads they have been served, and providing insight into the algorithmic profiles created by advertising networks. Second, AdNauseam simulates clicks on ads in order to confuse trackers and diminish the value of aggregated tracking data. A critic might ask: why click? Why not simply hide ads from users and hide users from trackers? The twofold answer reveals what may be distinctive elements of the AdNauseam approach. To begin, we conceptualize privacy as a societal value. Whereas many privacy tools offer solutions only for individual users, AdNauseam is built on the assumption that, often, effective privacy protection must be infused throughout a system. This assumption presents different and interesting engineering challenges. Second, AdNauseam seeks to concurrently achieve the goal of resistance through protest. And since protest frequently involves being vocal, AdNauseam's core design conflicts at times with conceptions of privacy based on secrecy or concealment. While such tensions, and the tradeoffs that result, are not uncommon in privacy engineering, the process of designing and building AdNauseam demanded their systematic consideration. In this paper we present challenges faced in attempting to apply obfuscation to a new domain, that of online tracking by advertisers. We begin with the goals of the project and the implemented features to which they map. We then present our engineering approach, the set of tensions that arose during implementation, and the ways in which these tensions were addressed. We discuss our initial evaluation efforts on both technical and ethical dimensions, and some of the challenges that remain. We conclude with thoughts on the broader issues facing privacy tools that must operate within complex sociotechnical contexts, especially those dominated by actors openly resistant to them, informed by our experience with AdNauseam's ban from Google's Chrome store.
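
The abstract describes three mechanisms: hiding ads like a list-based blocker, recording detected ads in a local 'Vault', and issuing simulated clicks to add noise to tracking data. The sketch below illustrates that general idea only; it is not AdNauseam's actual implementation, and all names in it (AdRecord, AD_SELECTORS, collectAds, simulateClicks) are hypothetical, assuming a TypeScript content-script context with standard DOM and fetch APIs standing in for real filter-list matching and click simulation.

```typescript
// Illustrative sketch only (not AdNauseam's code): hide detected ads, record
// them in a local "vault", and later issue simulated visits to their
// click-through URLs at randomized intervals to add noise to tracking data.

interface AdRecord {
  url: string;        // the ad's click-through URL
  foundAt: number;    // timestamp when the ad was detected
  visited: boolean;   // whether a simulated visit has been issued
}

// Hypothetical CSS selectors standing in for real filter-list rules.
const AD_SELECTORS = ['a[href*="doubleclick"]', 'a[href*="/adclick"]'];

const vault: AdRecord[] = [];

// Hide matching ad links from the user and record them in the vault.
function collectAds(root: Document | Element = document): void {
  for (const selector of AD_SELECTORS) {
    root.querySelectorAll<HTMLAnchorElement>(selector).forEach((a) => {
      a.style.display = 'none';
      vault.push({ url: a.href, foundAt: Date.now(), visited: false });
    });
  }
}

// Periodically send a background request to one unvisited ad URL, with a
// randomized delay, so that simulated "clicks" blend into the tracker's data.
function simulateClicks(meanDelayMs = 30_000): void {
  const tick = () => {
    const next = vault.find((ad) => !ad.visited);
    if (next) {
      next.visited = true;
      // no-cors: only the outgoing request matters, not the response.
      fetch(next.url, { mode: 'no-cors' }).catch(() => { /* ignore failures */ });
    }
    setTimeout(tick, meanDelayMs * (0.5 + Math.random()));
  };
  tick();
}

collectAds();
simulateClicks();
```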

Original language: English (US)
Pages (from-to): 57-64
Number of pages: 8
Journal: CEUR Workshop Proceedings
Volume: 1873
ISSN: 1613-0073
State: Published - 2017

ASJC Scopus subject areas

  • Computer Science (all)

Cite this

Howe, D. C., & Nissenbaum, H. (2017). Engineering privacy and protest: A case study of AdNauseam. CEUR Workshop Proceedings, 1873, 57-64.

Links

  • Scopus record: http://www.scopus.com/inward/record.url?scp=85027869910&partnerID=8YFLogxK
  • Citations in Scopus: http://www.scopus.com/inward/citedby.url?scp=85027869910&partnerID=8YFLogxK