theharmonyguy’s Privacy Library
A collection of articles, papers, and talks on various privacy topics that I have found useful. This is by no means exhaustive, but it highlights some perspectives on privacy that I feel can get lost in many discussions these days. Feel free to contact me with any feedback or suggested additions.
Last updated July 22, 2023. Links in reverse chronological order by category.
tl;dr
I have summarized much of my perspective on privacy in a series of posts: What is Privacy?, Expanding Our Privacy Horizons, and Privacy is More Than Security. My current concise definition of privacy: “respecting people’s expectations, agency, and interests when handling information about them.”
In 2010, danah boyd (founder of Data & Society, Principal Researcher at Microsoft Research) gave two talks on the public/private binary that remain highly relevant: “Making Sense of Privacy and Publicity” at SXSW and “Privacy and Publicity in the Context of Big Data” at WWW2010 (transcripts).
More recently, Helen Nissenbaum (Professor of Information Science at Cornell Tech, creator of the “contextual integrity” framework) gave an interview on why we should not simply rely on consent for privacy: “Stop Thinking About Consent: It Isn’t Possible and It Isn’t Right”
Also, Maritza Johnson (researcher at the International Computer Science Institute) gave a 10-minute talk at the 2018 OURSA conference on baking privacy into the user experience: “Wait, how was I supposed to know?” (video)
In 2021, developer and engineer Jenny Zhang published a wonderful essay on privacy as a collective interest entitled “left alone, together.”
Finally, web expert Robin Berjon described how recent developments around privacy reflect the need for a reality check more than a “privacy war.”
Privacy definitions and descriptions
“So it’s unfortunate but unsurprising that the use of one term to refer to the personal dimensions of both secrecy and autonomy has led to confusion over whether privacy really is a fundamental right. The problem arises when we take secrecy as an end in itself, and thus as the paradigm of privacy—an error that can be traced back to Warren and Brandeis’s parochial preoccupations. In truth, privacy with respect to the disclosure of information is an outgrowth of the deeper concern to preserve the conditions for individual autonomy, not the other way around. Rather than a prerogative of the privileged, intent on keeping the general public at bay, the right to privacy should have been understood from the start as a prerogative of the people, establishing a zone where the state cannot readily trespass.”
Why the “Privacy” Wars Rage On, Jeannie Suk Gersen, The New Yorker, June 2022
“This new survey asked Americans for their own definitions of the words ‘privacy’ and ‘digital privacy.’ Their written answers were coded into broad categories, and they reveal that across both questions, participants most often mention their concerns about the role other people and organizations can play in learning about them, their desire to shield their personal activities and possessions, and their interest in controlling who is given access to their personal information.”
How Americans think about privacy and the vulnerability of their personal data, Pew Research Center, November 2019
“What scholars call ‘informational privacy,’ the ability to influence how our personal information is collected, used, and disclosed, is a verb, not a noun. When you hear someone say ‘privacy’ in this context, you should think ‘negotiate.’ That is, you should think of an action word that emphasizes process. Informational privacy is a tenuous, revisable, ongoing discussion expressed in debates between individuals, groups, and institutions about how to set and enforce norms.”
Stop Saying Privacy is Dead, Evan Selinger, October 2018
“My definition of privacy is ‘an appropriate flow of information’ (or, ‘data,’ if you prefer). If you imagine a river, you can think about ways in which we can shape its flow. We can pause, dam, or divert it with different means and for different reasons… For different dataflows there are different constraints.”
Stop Thinking About Consent: It Isn’t Possible and It Isn’t Right, Helen Nissenbaum, Harvard Business Review, September 2018
“I thought it would be useful, by way of background, to walk through a classification of the major privacy concerns we as consumers seem to have, and how each of these is (or isn’t) relevant to the different companies that compete in this industry. I’ve done quite a bit of research into major privacy stories covered in the news over the last few years, and most of them fall into one of these categories.”
‘Fear of One Company Knowing Too Much About Us’ and Other Privacy Concerns, Jan Dawson, Vox, February 2015
“Privacy is not about control over data nor is it a property of data. It’s about a collective understanding of a social situation’s boundaries and knowing how to operate within them. In other words, it’s about having control over a situation. It’s about understanding the audience and knowing how far information will flow. It’s about trusting the people, the situation, and the context.”
Privacy and Publicity in the Context of Big Data, danah boyd, April 2010
“Fundamentally, privacy is about having control over how information flows. It’s about being able to understand the social setting in order to behave appropriately. To do so, people must trust their interpretation of the context, including the people in the room and the architecture that defines the setting. When they feel as though control has been taken away from them or when they lack the control they need to do the right thing, they scream privacy foul.”
Making Sense of Privacy and Publicity, danah boyd, March 2010
“A right to privacy can be understood as a right to maintain a certain level of control over the inner spheres of personal information and access to one’s body, capacities, and powers. It is a right to limit public access to oneself and to information about oneself. For example, suppose that I wear a glove because I am ashamed of a scar on my hand. If you were to snatch the glove away, you would not only be violating my right to property (the glove is mine to control), you would also violate my right to privacy—a right to restrict access to information about the scar on my hand. Similarly, if you were to focus your X-ray camera on my hand, take a picture of the scar through the glove, and then publish the photograph widely, you would violate a right to privacy. While your X-ray camera may diminish my ability to control the information in question, it does not undermine my right to control access. Privacy also includes a right over the use of bodies, locations, and personal information. If access is granted accidentally or otherwise, it does not follow that any subsequent use, manipulation, or sale of the good in question is justified. In this way, privacy is both a shield that affords control over access or inaccessibility and a kind of use and control right that yields justified authority over specific items—like a room or personal information.”
Defining Privacy, Adam D. Moore, September 2008
Privacy as a fundamental human right
“Privacy is essential to human agency and dignity. Denying someone privacy—even when it’s as seemingly small as a parent who won’t let their kid close the door—has a corrosive effect, eroding trust as well as our sense of interiority. When we scale up the individual to a body politic, it is the private sphere that’s crucial for our capacity for democracy and self-determination. As individuals, we need privacy to figure out who we are when we’re no longer performing the self. As a collective, we have to be able to distinguish who we are as individuals hidden from the norms and pressures of the group in order to reason clearly about how we want to shape the group.”
left alone, together, Jenny Zhang, May 2021
“Microsoft CEO Satya Nadella said on Thursday that data privacy at an individual level needs to be thought of as a human right, and pointed to the European Union’s GDPR regulation as a model for the rest of the world… ‘Data that you contribute to the world has utility for you, utility for the business that may be giving you a service in return—and the world at large. How do we account for that surplus being created around data? And who is in control around giving those rights?’”
Microsoft CEO: Data privacy must be thought of as a human right, Yahoo! News, January 2020
“[CMU Professor Alessandro] Acquisti proposes a different way of looking at online privacy altogether. As long as it’s viewed in economic terms, as a good to be bought, sold, and traded off between consumers and corporations, tech companies will have the upper hand, because individuals’ choices are so easily manipulated. An alternative, he suggests, is to view privacy more like a human right: something everyone deserves, whether they fully grasp its value or not. That would make it more like freedom of speech, which the U.S. Bill of Rights protects equally for all citizens, regardless of whether some of them would be willing to trade it for a free pizza.”
How Much Is Your Privacy Really Worth?, Will Oremus, September 2019
“Cisco today issued a call to governments and citizens around the world to establish privacy as a fundamental human right in the digital economy. Today, connectivity and technology have become the foundation for people’s economic, social, and cultural opportunities. With IoT, 5G, and AI promising to soon reshape how we interact with technology, Cisco is urging governments to adopt comprehensive and interoperable data protection laws to secure that right.”
Cisco Calls for Privacy to be Considered a Fundamental Human Right, Cisco Press Release, February 2019
“We [at Apple] reject the excuse that getting the most out of technology means trading away your right to privacy. So we choose a different path: Collecting as little of your data as possible. Being thoughtful and respectful when it’s in our care. Because we know it belongs to you. In every way, at every turn, the question we ask ourselves is not ‘what can we do’ but ‘what should we do’. Because Steve [Jobs] taught us that’s how change happens. And from him I learned to never be content with things as they are.”
Duke University Commencement Address, Tim Cook, May 2018
“No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.”
Universal Declaration of Human Rights, United Nations, December 1948
“We must therefore conclude that the rights, so protected, whatever their exact nature, are not rights arising from contract or from special trust, but are rights as against the world; and, as above stated, the principle which has been applied to protect these rights is in reality not the principle of private property, unless that word be used in an extended and unusual sense. The principle which protects personal writings and any other productions of the intellect or of the emotions, is the right to privacy, and the law has no new principle to formulate when it extends this protection to the personal appearance, sayings, acts, and to personal relation, domestic or otherwise.”
The Right to Privacy, Samuel Warren and Louis Brandeis, Harvard Law Review, December 1890
Privacy as it relates to trust
“When you’re talking about collecting data, talk about: do you really need to collect that data? Can you, once you’ve collected it, reduce it in fidelity and store something else? Do you really need to store the data for as long as you possibly can, or maybe can you have a reasonable retention policy? Can you add a new control, so that people have more control? Can you store it in aggregate form? All sorts of tough questions that maybe go against the culture internally, go against a certain product idea—but I think these are the tough questions that you have to be asking when you’re dealing with data about people or people’s data… I often hear in response to those sorts of questions, but wait, we want to preserve option value. So one thing that’s interesting about privacy is that users kind of wrap privacy up in trust, and the way they think about companies using their data is very related to trust. So if you’re building your products always preserving option value, think about how you build relationships in the real world. So if you’re out there dating, and you’re trying to preserve option value, are you building a trusting relationship?”
“Wait, how was I supposed to know?” Baking privacy into the user experience, Maritza Johnson, OURSA, April 2018
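Johnson’s questions map directly onto engineering practice. As a minimal sketch of my own (not from the talk: the location-data scenario, the rounding precision, and the 30-day retention window are all illustrative assumptions), here is what reducing fidelity, storing aggregates, and enforcing retention might look like:

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch of three data-minimization answers to Johnson's
# questions. Record shape and thresholds are assumptions for the example.

def reduce_fidelity(lat: float, lon: float, places: int = 2) -> tuple[float, float]:
    """Round GPS coordinates to roughly 1 km cells instead of storing exact fixes."""
    return (round(lat, places), round(lon, places))

def aggregate_visits(pings: list[tuple[float, float]]) -> dict[tuple[float, float], int]:
    """Keep a count per coarse cell rather than the user's raw trajectory."""
    counts: dict[tuple[float, float], int] = {}
    for lat, lon in pings:
        cell = reduce_fidelity(lat, lon)
        counts[cell] = counts.get(cell, 0) + 1
    return counts

def apply_retention(records: list[dict], max_age_days: int = 30) -> list[dict]:
    """Drop records older than the retention window instead of keeping them forever."""
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    return [r for r in records if r["ts"] >= cutoff]

# Two nearby pings collapse into a single coarse cell with a count of 2.
print(aggregate_visits([(40.7812, -73.9661), (40.7817, -73.9668)]))
# {(40.78, -73.97): 2}
```

The point is that each of Johnson’s “tough questions” can become a concrete, testable function rather than an open-ended promise about option value.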
“When people make information available, they make themselves vulnerable. We do this all the time in social settings. We make ourselves vulnerable because we believe that we might have something to gain from it. This is how we build friendships. We also make ourselves vulnerable to machines because we hope that we can gain something from it. Yet, just like we trust people to understand the context in which information is shared, so too do we trust machines. When either our friends or our technology fail to maintain the social context, it feels like a huge privacy FAIL.”
Making Sense of Privacy and Publicity, danah boyd, March 2010
Privacy as it relates to power
“When it comes to privacy—particularly when it comes to data-based surveillance and manipulation of consumers—creepiness is a trap. What’s really important is not whether the creepy reaction is being triggered. Instead, what really matters is power—the substance of what’s going on whether we as consumers understand it or not, whether we as consumers notice it or not, or whether we as consumers are freaked out by it or not. Thinking of privacy in terms of creepiness is not only a bad way of gauging whether there’s a real privacy issue; it also confuses us about what’s really at stake, and it further enables the exercise of power by those who control our data.”
“Creepiness” Is the Wrong Way to Think About Privacy, Neil Richards, Slate, December 2021
“The answer to problems caused by consent isn’t more consent, automated consent, or consent that covers more processing. It’s to use consent where and when it is the right tool to increase people’s autonomy, and to stop pretending that you can bang nails in with it in all the other cases. And no matter how appealing it might feel on a priori grounds, if it can’t be enforced at scale and it is susceptible to hard-to-litigate gaming then it’s just a bad approach to data protection. There’s a reason why data brokers love consent so much and won’t ever stop talking about ‘transparency and choice’: it works in favour of whichever party has the time to invest in creating a consent funnel, and that’s never individuals.”
Privacy: A Quick Reality Check, Robin Berjon, July 2021
“Inventions are far from unavoidable. Treating data as a commodity is a way for companies to earn money, and has nothing to do with building good products. Hoarding data is a way of accumulating power. Instead of focusing only on their bottom line, tech companies can and should do better to design the online world in a way that contributes to people’s wellbeing. And we have many reasons to object to institutions collecting and using our data in the way that they do.”
Privacy is power, Carissa Véliz, Aeon, September 2019
“Buying this device, even if the details are a bit creepy, shows that you care, that you’re a good parent. And because parents are shamed and nudged into buying these tracking devices, more and more of them pop up on the market. It’s these purchases that technologists equate to ‘natural selection,’ but it’s nearly impossible for most people to opt out of a lot of these arrangements.”
The biggest lie tech people tell themselves — and the rest of us, Rose Eveleth, Vox, October 2019
“The best measure of whether a company cares about privacy is what it does by default. Many apps and products are initially set up to be public: Instagram accounts are open to everyone until you lock them, and the whole world can see whom you split utilities with until you make your Venmo private. Even when companies announce convenient shortcuts for enhancing security, their products can never become truly private. Strangers may not be able to see your selfies, but you have no way to untether yourself from the larger ad-targeting ecosystem.”
Consumer Surveillance Enters Its Bargaining Phase, Sidney Fussell, June 2019
“Today, more and more, not only can corporations target you directly, they can model you directly and stealthily. They can figure out answers to questions they have never posed to you, and answers that you do not have any idea they have. Modeling means having answers without making it known you are asking, or having the target know that you know. This is a great information asymmetry, and combined with the behavioral applied science used increasingly by industry, political campaigns and corporations, and the ability to easily conduct random experiments (the A/B test of the said Facebook paper), it is clear that the powerful have increasingly more ways to engineer the public, and this is true for Facebook, this is true for presidential campaigns, this is true for other large actors: big corporations and governments.”
Facebook and Engineering the Public, Zeynep Tufekci, June 2014
Privacy as a collective interest
“The thing about common goods like public health, though, is that there’s only so much individual actions can achieve without a collective response that targets systemic problems. While we owe a duty of care to one another, it’s not enough for all of us to be willing to wear masks if there’s no contact tracing, no paid sick leave, no medical manufacturing and distribution capacity, no international sharing of vaccine research. And it’s not enough for each of us to be individually vigilant about our information if unscrupulous trackers are gathering up data we didn’t even know we were shedding, or if law enforcement is buying up that data on the private market to use for surveillance purposes none of us ever consented to.”
left alone, together, Jenny Zhang, May 2021
“One might view these situations as a conflict between Bob’s privacy and Alice’s freedom of speech. For our purposes, they highlight a more fundamental point: to the extent that people do not retreat completely from society, everyone’s privacy depends on what others do. There is no way to live in the world without putting yourself at risk that others might make use of information about you in ways to which you do not consent. This is as true of someone interacting with close family, friends, and colleagues as it is of someone walking down a busy city street among strangers. No one can claim exclusive privilege to the information communicated in these encounters. In this most basic sense, individuals’ privacy always depends on others’ discretion.”
Privacy Dependencies, Solon Barocas and Karen Levy, Washington Law Review, September 2019
“I don’t believe what’s worthy of protection is fundamentally based on only the individual’s preferences or interests. The meaning of privacy I want to defend isn’t just about what I want as a user, consumer, citizen, family member, etc… in my view, the basic assumption that privacy is always about the right of an individual to selectively reveal gets us off on the wrong foot. I can imagine cases where you think it’s OK for people to be profiled with or without consent and whether or not it is strictly in their interests, not because we are trading privacy for other values, but because a right to privacy is one that is already bounded, or balanced. The analogy with environmental conservation can help. Imagine that I own forested land and a paper company offers to purchase and harvest the trees. Treating this as a business proposition I may decide it’s a good deal. But if one takes into consideration the future costs, the external costs, all those things that affect not just the two parties in question, then chopping down that forest is a problem.”
Stop Thinking About Consent: It Isn’t Possible and It Isn’t Right, Helen Nissenbaum, Harvard Business Review, September 2018
“I’ve long felt that the issue we call privacy is very similar to the issue we call environmentalism… It’s pervasive. It’s invisible. Attribution is hard. Even if you get cancer, you don’t know if it’s from that chemical plant down the road. Living in a world where all of your data is collected and swept up in these dragnets all the time and will be used against you in a way that you will probably never be able to trace and you will never know about it feels like that same type of collective harm.”
Why Surveillance Is the Climate Change of the Internet, quote by Julia Angwin, The Atlantic, May 2019
“The takeaway here is that we have to start thinking about privacy as a collective, environmental problem, not something that hits individual people, and certainly not something where the onus is on the individual to protect themselves. Privacy is starting to look like a problem similar to climate change—and in past eras, something similar to food safety. Like, yes, check the sell-by date on your chicken breasts, but we need a system that makes sure those sell-by dates get printed in the first place. A system that sends inspectors around to the meatpacking factories, that penalizes sellers, distributors and farmers if a lot of people end up getting sick. Maybe you’re never going to get sick from eating bad chicken, but the risk still means we’ve adopted food inspection standards… Privacy is part and parcel with the health of our society, the health of our civil liberties, and the health of the digital infrastructure that we depend on daily.”
The Internet Security Apocalypse You Probably Missed, Sarah Jeong, The New York Times, May 2019
“Privacy is both personal and collective. When you expose your privacy, you put us all at risk. Privacy power is necessary for democracy—for people to vote according to their beliefs and without undue pressure, for citizens to protest anonymously without fear of repercussions, for individuals to have freedom to associate, speak their minds, read what they are curious about. If we are going to live in a democracy, the bulk of power needs to be with the people. If most of the power lies with companies, we will have a plutocracy. If most of the power lies with the state, we will have some kind of authoritarianism. Democracy is not a given. It is something we have to fight for every day. And if we stop building the conditions in which it thrives, democracy will be no more. Privacy is important because it gives power to the people.”
Privacy is power, Carissa Véliz, Aeon, September 2019
Privacy and folk models used to understand it
“Previous studies have suggested that technically savvy users have more accurate or sophisticated mental models than their less technically savvy counterparts… However, we did not observe a clear relationship between technical knowledge and folk models. Somewhat surprisingly, our arguably most technically savvy participants P16 and P18, two web developers, held the 1st-party-pull model and the connected first-party model, respectively. Both of them were not aware of third-party trackers. This is important because even technically savvy users can have inaccurate models and need user education to gain a more accurate picture of OBA.”
Folk Models of Online Behavioral Advertising, Yaxing Yao, Davide Lo Re, and Yang Wang, February 2017
“People with a more articulated model expressed higher awareness of specifically who might have access to their personal data and communications. Yet technical background was not directly associated with more secure behavior online. Almost universally, participants’ privacy protective actions or lack of action were informed by personal context and experiences, such as a feeling they had nothing to hide, and in some cases by immediate cues in the online environment such as a security emblem or famous company name.”
‘My Data Just Goes Everywhere:’ User Mental Models of the Internet and Implications for Privacy and Security, Ruogu Kang, Laura Dabbish, Nathaniel Fruchter, and Sara Kiesler, July 2015
“Folk models are mental models that are not necessarily accurate in the real world, thus leading to erroneous decision making, but are shared among similar members of a culture. It is well-known that in technological contexts users often operate with incorrect folk models. To understand the rationale for home users’ behavior, it is important to understand the decision model that people use. If technology is designed on the assumption that users have correct mental models of security threats and security systems, it will not induce the desired behavior when they are in fact making choices according to a different model.”
Folk Models of Home Computer Security (PDF), Rick Wash, July 2010
“When people assess a situation, they develop mental models based on probability calculations and the expectations they bring to the table. They make guesses about who is more or less likely to run across them. Their calculations are completely reasonable, as it’s an efficient way of getting a decent handle on the social context, even if they are sometimes wrong. This is true both offline and online. People need to know how to behave so they use whatever information is available to them to make their best guess.”
Making Sense of Privacy and Publicity, danah boyd, March 2010
Privacy and its business impact on consumer choices
“There is now overwhelming evidence proving privacy is not merely a compliance issue but a fundamental consumer expectation and critical element of brand loyalty. For example, a recent study from Transcend revealed that 93% of Americans would switch to a company that prioritizes data privacy if given the option, and 38% believe it is worth spending more money with companies that prioritize data privacy. These findings go beyond historical sentiment studies and indicate consumers don’t just care about privacy but are willing to reward companies that do, too.”
Beyond a compliance mindset: How we communicate about privacy impacts our influence, Melanie Ensign, September 2020
“An overwhelming majority (84%) of U.S. adults have decided against engaging with a company because it needed their personal info. And 71% did so more than once—indicating that it is a pattern; consumers will not engage if they feel that they need to give away too much of their personal information. Privacy concerns are also causing consumers to walk away from business relationships. Three-quarters (75%) of adults have stopped engaging with a company out of concern for the way they use personal data.”
Braze Data Privacy Report, Wakefield Research, February 2020
“A 2019 survey conducted by Cisco of 2,601 adults worldwide examined the actions, not just attitudes, of consumers with respect to their data privacy… The survey reveals an important new group of people—32% of respondents—who said they care about privacy, are willing to act, and have done so by switching companies or providers over data or data-sharing policies.”
Do You Care About Privacy as Much as Your Customers Do?, Harvard Business Review, January 2020
“Behind a privacy intrusion there is often an economic trade-off. The reduction of the cost of storing and manipulating information has led organizations to capture increasing amounts of data about individual behavior. The hunger for customization and usability has led individuals to reveal more about themselves to other parties. New trade-offs have emerged in which privacy, economics, and technology are inextricably linked: individuals want to avoid the misuse of the information they pass along to others, but they also want to share enough information to achieve satisfactory interactions; organizations want to know more about the parties with which they interact, but they do not want to alienate them with policies deemed as intrusive.”
Economics of Privacy, Alessandro Acquisti, updated February 2019
“But I think companies have an opportunity here to have allies in users. Of course users want an internet that empowers them. When you have your privacy protected, you’re empowered as a citizen, you’re empowered as an internet user. It gives you more opportunities. So if companies make sure that they have allies in the person of users, that they offer them internet services that are centered around their privacy, they will make sure that these citizens whenever the governments will want to block these services, disrupt internet, or any other censorship measure, these citizens will oppose their governments in the fiercest way to make sure that they still have access to tools that are empowering them.”
The State of Privacy in Sub-Saharan Africa, Julie Owono, October 2018
“The past 12 months have clearly demonstrated the value online companies place on consumer data, and the lengths they’re willing to go to acquire it. In light of this, consumers are now beginning to rethink the true worth of their personal information, and what this means for the idea of a value exchange between brands and themselves.”
Rethinking “Trust” in a New Era of Data Privacy, Chase Buckle, October 2018
“Contrary to the claim that a majority of Americans consent to discounts because the commercial benefits are worth the costs, our study suggests a new explanation for what has thus far been misconstrued as ‘tradeoff’ behavior in the digital world: a large pool of Americans feel resigned to the inevitability of surveillance and the power of marketers to harvest their data. People who are resigned do not predictably decide to give up their data. We actually found no statistical relationship between being resigned to marketers’ use of data and accepting or rejecting various kinds of supermarket discounts. Rather, the larger percentages of people in the population who are resigned compared to people who believe in principle that tradeoffs are a fair deal indicate that in the real world people who give up their data are more likely to do it while resigned rather than as the result of cost-benefit analysis… To further question marketers’ emphasis on Americans’ use of cost-benefit calculations, we found that large percentages of Americans often don’t have the basic knowledge to make informed cost-benefit choices about ways marketers use their information.”
The Tradeoff Fallacy: How Marketers are Misrepresenting American Consumers and Opening Them Up to Exploitation, Joseph Turow, Michael Hennessy, and Nora Draper, August 2016
“Nearly everyone who works in the consumer products industry knows that negative brand experiences can quickly negate years of brand-building, a hard-gained positive reputation, and—perhaps most importantly—the trust a consumer places in a brand. Consider the impact on consumer trust, then, when a company announces that it has experienced a data breach. In this age of big data and digital marketing, in which consumer product companies and retailers are building detailed profiles of individual consumers based on a plethora of data sources, even a single data breach can substantially damage consumer trust. Indeed, 59 percent of consumers state that the knowledge of a data breach at a company would negatively impact their likelihood of buying from that company. Only 51 percent of consumers, moreover, say they would be ‘forgiving’ of a consumer product company that experienced a breach as long as the company quickly addressed the issue… But there is also an upside. There is a clear connection between consumers’ perceptions of data privacy and security practices and commercial success. Half of the consumers we surveyed ‘definitely consider’ the privacy and security of their personal information when choosing an online retailer, and 80 percent say they are more likely to purchase from consumer product companies that they believe protect their personal information. Furthermore, 70 percent of consumers would be more likely to buy from a consumer product company that was verified by a third party as having the highest standards of data privacy and security. In short, strong data privacy and security practices are not just about risk mitigation, but also a potential source of competitive advantage.”
Building consumer trust, Deloitte Insights, November 2014
Privacy and its dependence on context
“Detractors often depict privacy work as being ‘ideological.’ If believing that people shouldn’t live in fear of their tech betraying them is ideological, I’ll take it. But there’s a purely profits-driven consideration if that’s what you want: people don’t want to be recognised across contexts, and trying to force that to happen is an arms race against users which you’ll eventually lose.”
Privacy: A Quick Reality Check, Robin Berjon, July 2021
“The behavior involved in privacy paradox studies involves people making decisions about risk in very specific contexts. In contrast, people’s attitudes about their privacy concerns or how much they value privacy are much more general in nature. It is a leap in logic to generalize from people’s risk decisions involving specific personal data in specific contexts to reach broader conclusions about how people value their own privacy.”
The Myth of the Privacy Paradox, Daniel J. Solove, February 2020
“When confronted with the reality of what happens on the internet, people are often creeped out by the constant collection and analysis of their data. According to many studies, people sometimes think tailored ads can be useful, but often feel uncomfortable with the lack of notice or choice when they are used. Highly targeted ads not only creep people out, but may even backfire, leading people to feel less inclined to purchase advertised products. Just because we have data does not necessarily mean we should use it. The perception of creepiness also depends on the context for data collection. For example, according to one study, if users trust the source of data collection, they are generally less likely to be creeped out, though they might feel more comfortable if the data are not shared outside of the organization in question.”
How to Collect Data Without Being Creepy, Martin Shelton, August 2017
“Understanding the context is not just about understanding the audience. It’s also about understanding the environment. Just as people trust each other, they also trust the physical setting. And they blame the architecture when they feel as though they were duped. Consider the phrase ‘these walls have ears’ which dates back to at least Chaucer. The phrase highlights how people blame the architecture when it obscures their ability to properly interpret a context. Consider this in light of grumblings about Facebook’s approach to privacy. The core privacy challenge is that people believe that they understand the context in which they are operating; they get upset when they feel as though the context has been destabilized. They get upset and blame the technology. What’s interesting with technology is that unlike physical walls, social media systems DO have ears. And they’re listening to, recording, and reproducing all that they hear. Often out of context. This is why we’re seeing a constant state of confusion about privacy.”
Privacy and Publicity in the Context of Big Data, danah boyd, April 2010
“We don’t just hold people accountable for helping us maintain privacy; we also hold the architecture around us accountable. We look around a specific place and decide whether or not we trust the space to allow us to speak freely to the people there… Think about a cafe that you like to visit. This is fundamentally a public space. There’s a possibility that you’ll intersect with all sorts of different people, but there are some people who you believe you are more likely to interact with than others. You have learned that you’re more likely to run into your neighbors and you’d be startled if your mother ‘popped in’ since she lives 3000 miles away.”
Making Sense of Privacy and Publicity, danah boyd, March 2010
“Contexts, or spheres, offer a platform for a normative account of privacy in terms of contextual integrity. As mentioned before, contexts are partly constituted by norms, which determine and govern key aspects such as roles, expectations, behaviors, and limits. There are numerous possible sources of contextual norms, including history, culture, law, convention, etc. Among the norms present in most contexts are ones that govern information, and, most relevant to our discussion, information about the people involved in the contexts. I posit two types of informational norms: norms of appropriateness, and norms of flow or distribution. Contextual integrity is maintained when both types of norms are upheld, and it is violated when either of the norms is violated. The central thesis of this Article is that the benchmark of privacy is contextual integrity; that in any given situation, a complaint that privacy has been violated is sound in the event that one or the other types of the informational norms has been transgressed.”
Privacy as Contextual Integrity, Helen Nissenbaum, Washington Law Review, February 2004
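Nissenbaum’s two norm types invite a mechanical reading. Purely as a toy sketch (the context, roles, and norm table below are invented for illustration and are not from the article), a flow maintains contextual integrity only if it matches one of its context’s entrenched informational norms:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    context: str     # social sphere, e.g. "healthcare"
    sender: str      # role of the party transmitting the information
    recipient: str   # role of the party receiving it
    info_type: str   # norms of appropriateness govern this field
    principle: str   # norms of flow/distribution govern this field

# Entrenched norms per context; in reality these come from history,
# culture, law, and convention rather than a lookup table.
NORMS = {
    "healthcare": {
        ("patient", "physician", "diagnosis", "confidentiality"),
        ("physician", "specialist", "diagnosis", "referral"),
    },
}

def contextual_integrity(flow: Flow) -> bool:
    """Integrity holds iff the flow matches an entrenched norm of its context."""
    allowed = NORMS.get(flow.context, set())
    return (flow.sender, flow.recipient, flow.info_type, flow.principle) in allowed

# A patient confiding in a physician is appropriate; a physician selling
# the same information to an advertiser transgresses the norms of flow,
# so a complaint that privacy has been violated would be sound.
assert contextual_integrity(
    Flow("healthcare", "patient", "physician", "diagnosis", "confidentiality"))
assert not contextual_integrity(
    Flow("healthcare", "physician", "advertiser", "diagnosis", "sale"))
```

The value of the framework is exactly this shift: instead of asking whether information was “public” or “private,” one asks whether a specific flow conforms to the norms of its context.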
Privacy in traditionally public spaces
“I would take these attributes of GPS monitoring into account when considering the existence of a reasonable societal expectation of privacy in the sum of one’s public movements. I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on…”
US v. Jones, concurring opinion by Justice Sonia Sotomayor, January 2012
“Just because something is publicly accessible does not mean that people want it to be publicized. Making something that is public more public is a violation of privacy.”
Making Sense of Privacy and Publicity, danah boyd, March 2010
“Privacy violations might occur when information about individuals might be readily available to persons not properly or specifically authorized to have access to the data. As described above, subjects within the T3 dataset might have used technological means to restrict access to their profile information to only members of the Harvard community, thus making their data inaccessible to the rest of the world. By using research assistants from within the Harvard community, the T3 researchers—whether intentional or not—would be able to circumvent those access controls, thereby including these subjects’ information among those with more liberal restrictions. Further, no specific consent was sought or received from the subjects in the study; their profile information was simply considered freely accessible for collection and research, regardless of what the subject might have intended or desired regarding its accessibility to be harvested for research purposes. Combined, these two factors reveal how a privacy violation based on improper access has occurred due to the T3 project.”
“But the data is already public”: On the ethics of research in Facebook, Michael Zimmer, Ethics and Information Technology, December 2010
“This leads to the second point: just because users post information on Facebook doesn’t mean they intend for it to be scraped, aggregated, coded, dissected, and distributed. Creating a Facebook account and posting information on the social networking site is a decision made with the intent to engage in a social community, to connect with people, share ideas and thoughts, communicate, be human. Just because some of the profile information is publicly available (either consciously by the user, or due to a failure to adjust the default privacy settings), doesn’t mean there are no expectations of privacy with the data.”
On the “Anonymity” of the Facebook Dataset, Michael Zimmer, September 2008
Privacy and the impact of surveillance
“Unlike human operatives ‘with limited reserves of time and attention,’ AI systems never tire or fatigue. As a result, this creates a substantial ‘chilling effect’ even without resorting to physical violence; citizens never know if an automated bot is monitoring their text messages, reading their social media posts, or geotracking their movements around town.”
The Global Expansion of AI Surveillance, Steven Feldstein, Carnegie Endowment for International Peace, September 2019
“There is a long history of data about sensitive attributes being misused, including the use of the 1940 USA Census to intern Japanese Americans, a system of identity cards introduced by the Belgian colonial government that were later used during the 1994 Rwandan genocide (in which nearly a million people were murdered), and the role of IBM in helping Nazi Germany use punchcard computers to identify and track the mass killing of millions of Jewish people. More recently, the mass internment of over one million people who are part of an ethnic minority in Western China was facilitated through the use of a surveillance network of cameras, biometric data (including images of people’s faces, audio of their voices, and blood samples), and phone monitoring… Plenty of data collection is not involved with such extreme abuse as genocide; however, in a time of global resurgence of white supremacist, ethno-nationalist, and authoritarian movements, it would be deeply irresponsible to not consider how data & surveillance can and will be weaponized against already vulnerable groups.”
8 Things You Need to Know about Surveillance, Rachel Thomas, August 2019
“Biometric data like identified fingerprints generally require cooperation to obtain. Soft biometric data do not reveal a unique identity, but instead estimate a demographic attribute like age, and can be obtained without cooperation using automated facial image analysis… How can the general public be notified of mass surveillance that uses soft demographics, which does not on the surface invade privacy directly? Can individuals opt out?”
Gender Shades: Intersectional Phenotypic and Demographic Evaluation of Face Datasets and Gender Classifiers, Joy Buolamwini, September 2017
“I have become convinced that my earlier, bleak predictions about the Database of Ruin were in fact understated, arriving before it was clear how Big Data would accelerate the problem. Consider the most famous recent example of big data’s utility in invading personal privacy: Target’s analytics team can determine which shoppers are pregnant, and even predict their delivery dates, by detecting subtle shifts in purchasing habits. This is only one of countless similarly invasive Big Data efforts being pursued. In the absence of intervention, soon companies will know things about us that we do not even know about ourselves. This is the exciting possibility of Big Data, but for privacy, it is a recipe for disaster.”
Don’t Build a Database of Ruin, Paul Ohm, Harvard Business Review, August 2012
“However, developments in information technology challenge the conceptual, normative and empirical explanations for the lack of attention given to the problem of privacy in public. These developments include the ability to transmit and share large amounts of information across global digital networks, the ability to aggregate disparate sets of information into large databases, reductions in the cost of data storage to facilitate such databases, and the increase in processing power to ease the processing and analysis of data. These developments in information technology mean that there is virtually no limit to the amount of information that can be recorded, virtually no limit to the level of data analysis that can be performed, that the information can be shared with ease, and virtually stored forever. The consequence of the emergence of such powerful information technology is a rise in the magnitude, detail, thoroughness and scope of the ability to surveil everyday people engaging in their everyday, public activities.”
Privacy as Contextual Integrity (Part 1): Problem of Privacy in Public, Michael Zimmer, April 2005