Hiding in Plain Sight

Review of

Obfuscation: A User’s Guide for Privacy and Protest

by Finn Brunton and Helen Nissenbaum

Cambridge, MA: MIT Press, 2015, 136 pp.

Hiding the truth isn’t often praised. In an age of “fake news,” spam Twitter accounts, and sham Facebook groups, there is already plenty of concealment and disinformation to go around. But in Obfuscation: A User’s Guide for Privacy and Protest, Finn Brunton and Helen Nissenbaum, professors at New York University, make the case for hiding the truth. They argue that the average user of technology should obfuscate—that is, deliberately add ambiguous, confusing, or misleading information to interfere with surveillance and data collection. They aim to start a limited revolution of the informationally unempowered by offering tools to bolster privacy, to make things marginally harder for an adversary, or even just to protest data collection.

Part I of this small-format book introduces obfuscation’s key characteristics and variations through examples. The book isn’t intended to be a taxonomy of obfuscation, but the examples offered provide insight into a range of strategies. First up: paper and foil chaff deployed by warplanes to create decoy signals and confuse radar systems. Chaff illustrates one core obfuscation strategy: producing fake signals to hide the real one. The authors discuss how, after irregularities in the 2011 Russian parliamentary elections, Russian government allies used Twitter bots to disrupt online protests by flooding protest hashtags with pro-government or nonsense messages. Other examples are more in keeping with the book’s theme of obfuscation as the tool of the oppressed, not the oppressor. TrackMeNot, for example, is a tool for online obfuscation that Nissenbaum and others developed to disguise a user’s online search queries by automatically generating a flood of decoy queries.
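
The mechanism behind tools like TrackMeNot is simple to sketch. The Python fragment below is a hypothetical illustration of the decoy-query idea, not TrackMeNot’s actual code: it buries one real search inside a shuffled, randomly timed batch of decoys, so that a log of the traffic shows many plausible queries with no reliable way to pick out the genuine one.

```python
import random
import time

# Decoy terms; a real tool would draw from a much larger, evolving pool.
DECOY_TERMS = [
    "weather tomorrow", "banana bread recipe", "used cars",
    "python tutorial", "flight deals", "local election results",
]

def send_query(query: str) -> None:
    """Stand-in for an actual HTTP request to a search engine."""
    print(f"searching: {query}")

def obfuscated_search(real_query: str, decoys: int = 5) -> None:
    """Hide one real query inside a shuffled batch of decoys."""
    batch = random.sample(DECOY_TERMS, k=decoys) + [real_query]
    random.shuffle(batch)  # an observer of the log can't tell which is real
    for q in batch:
        send_query(q)
        time.sleep(random.uniform(0.5, 3.0))  # rough human-ish pacing

obfuscated_search("divorce lawyer near me")
```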

Other obfuscation strategies involve genuine but misleading signals, such as the identical bowler hats of thieving confederates in The Thomas Crown Affair. A technology-based example given in the book is the swapping of cellphone SIM cards by terrorist operatives to thwart metadata surveillance and, ultimately, drone strikes. More mundanely, some groups of consumers swap store loyalty cards to be able to get discounts while disguising their personal purchasing patterns.
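
This genuine-but-misleading strategy is easy to model. The sketch below, a hypothetical illustration with made-up shoppers rather than code from the book, simulates a card-swapping pool: every recorded purchase is real, but each card’s profile ends up blending several people’s habits, so the discounts keep flowing while the individual pattern dissolves.

```python
import random
from collections import defaultdict

# Hypothetical loyalty-card swap pool: cards change hands each week,
# so every card's recorded history mixes several shoppers' patterns.
shoppers = {
    "alice": ["diapers", "wine"],
    "bob":   ["protein powder", "energy drinks"],
    "carol": ["cat food", "mystery novels"],
}

cards = list(shoppers)       # card IDs, one per original owner
history = defaultdict(list)  # card ID -> purchases recorded on it

for week in range(12):
    random.shuffle(cards)    # the swap: cards are redistributed
    for card, shopper in zip(cards, shoppers):
        history[card].append(random.choice(shoppers[shopper]))

for card, items in history.items():
    print(card, "->", items)  # each profile is genuine but unlinkable
```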

Having familiarized readers with obfuscation as a concept, Part II argues for the value of the technique. It attempts to address why obfuscation is needed, whether obfuscation is ethically justified, and whether it can work. That’s some heady work for a mere 50 pages. With such high ambitions, it’s no surprise that the section only partly achieves its goals.

There are good reasons for keeping obfuscation in the privacy toolbox, but Brunton and Nissenbaum overstate their case. They characterize obfuscation as a guerrilla tool especially suited for use by “the small players, the humble, the stuck, those not in a position to decline or opt out or exert control over our data emanations.” Thus, they allege that everyday people need obfuscation to deal with power and information disparities. I’m not convinced, for two reasons.

First, the powerful may in fact benefit more from using obfuscation tools to protect their personal privacy. To the extent we participate in society, we must exchange or transmit information we cannot control. But for the average person, it is his or her very obscurity that provides the bulk of privacy. The powerful lack obscurity. An elevator CCTV camera’s eye falls on the rich and the poor alike, but it matters more if you’re as famous as Jay-Z.

Indeed, because of the attention focused on them, the famous and powerful often find it particularly difficult to exert control over their data emanations—just think of tabloid photographs and investigative journalism. Recent headlines about sexual harassment also suggest that it is increasingly difficult for prominent people to conceal past misdeeds. If powerful people face more scrutiny, then obfuscation is more useful for them. And contrary to the authors’ argument, obfuscation employed as a tool of disruption may be most potent in the hands of those with significant resources. Consider the allegations of Russian interference in the 2016 US presidential election, with covert groups attempting to influence the outcome through disinformation campaigns. The book itself provides many other examples of the powerful using or benefiting from obfuscation: pro-government Twitter bots disrupting activist protests; car-sharing companies faking orders to rival services; a government agency massively over-producing litigation documents to slow a court case. Such uses of obfuscation by the powerful undermine the authors’ argument that it is a tool uniquely suited to balancing power disparities.

My second objection is that Brunton and Nissenbaum don’t effectively consider both the costs and benefits of obfuscation. They fail to differentiate between political and commercial uses of data, even though the pragmatic and ethical reasons for obfuscation are strongest in resisting political power and weaker in other cases. The authors describe the ordinary person in a large city as “living in a condition of permanent and total surveillance.” But if the goal is to reduce power imbalances, it matters who is doing the surveilling, and why. Netflix’s power over a consumer suspected of liking romantic comedies is quite different from the CIA’s power over a suspected terrorist overseas. Targeted advertising is not targeted killing.

Furthermore, the book only backhandedly recognizes the massive benefits of information sharing. The authors argue that practically speaking, we cannot opt out of the collection of our personal data, because “the costs of refusal are high and getting higher.” Another way of saying this is that the benefits of sharing are large and getting larger, and that they are now so enormous that anyone opting out would be making a lifestyle choice on par with the Amish or ascetic monks. The authors’ description of losing these benefits as the “costs of refusal” is their begrudging acknowledgement that consumers don’t live in a fantasyland—they must make trade-offs.

And the authors only hint at the potential additional cost of obfuscation techniques to those using them. Academics have begun to study data deserts: populations that are not substantially engaged in the data economy. By being underrepresented, these groups may be missing the social benefits from data use. Imagine, for example, the potential policy consequences of an entire ethnic group obfuscating their census forms. Obfuscation could create self-imposed data deserts.

Although the authors overstate the need for obfuscation and understate the costs, they more successfully make the case that it is a technique accessible to those with little visible power. They explain how obfuscation is an example of what the political scientist James C. Scott called “weapons of the weak,” in his 1985 book of that name. Scott studied how Malaysian peasants with little visible power still manage to engage in resistance through an accumulation of individual small acts of defiance. Obfuscation fits well into the category of tools that are incremental, subversive, and emergent.

So obfuscation is a potentially useful tool for which there is at least some need. But is it ethical? Brunton and Nissenbaum tackle four different objections to their obfuscation strategies: dishonesty, waste, free riding, and data pollution.

On dishonesty, the authors concede that obfuscation is lying, but argue that it may be justified if done for legitimate ends. I wish they had further explored whether all obfuscation is lying; intuitively, there appears to be a difference between generating false signals and creating genuine but misleading signals.

The charge that obfuscation wastes resources appears to have hit a nerve with the authors. Critics have argued that Nissenbaum’s TrackMeNot tool floods search engines with unnecessary searches, wasting shared broadband resources as well as privately owned search engine resources. The authors observe that waste is in the eye of the beholder. Certainly TrackMeNot users find the tool to be a worthwhile use of resources. But many of these resources are owned by others. The longstanding social and legal consensus is that a resource’s owner is the primary judge of whether a particular use of that resource is wasteful. Yet the authors treat this question as an as-yet-unsettled “political question about the exercise of power and privilege.” If justifying obfuscation requires rewriting US property rights, the authors have a long road ahead.

I am particularly puzzled by the discussion of free riding. In economics, the free-rider problem arises when people can use a service without paying for it: the service will be undersupplied, making everyone worse off. In the most extreme case, if everyone free rides, the service may cease being available entirely—no one rides at all. Thus the key ethical question about obfuscation is whether it is ethical to take an action that, if everyone else took it too, would leave all worse off.
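
A toy model makes the logic concrete. In the sketch below (my illustrative numbers, not the book’s), a data-funded service stays viable only while some minimum share of users contribute usable data; once obfuscators push free riding past that threshold, the service disappears for everyone, obfuscators included.

```python
# Toy free-rider model: a service funded by user data stays viable
# only while enough users contribute rather than obfuscate.

VIABILITY_THRESHOLD = 0.4  # assumed minimum share of contributing users

def service_survives(free_rider_share: float) -> bool:
    return (1.0 - free_rider_share) >= VIABILITY_THRESHOLD

for share in (0.0, 0.3, 0.6, 1.0):
    status = "runs" if service_survives(share) else "shuts down"
    print(f"free riders: {share:.0%} -> service {status}")
# At 100% free riding the service shuts down: no one rides at all.
```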

This is not the question the authors ask. They ignore the effect of obfuscation on the viability of the service and the potential indirect harm to consumers. Instead, they focus exclusively on whether obfuscation directly harms non-obfuscating users. By pejoratively framing service providers as predators and consumers as prey, they transform the free riding debate into a debate about whether obfuscators have a duty to rescue the non-obfuscators from their “ignorance and foolishness.” Their conclusion to this patronizing question? No, because it is the predatory service provider’s fault. Never mind that the service provider is offering a service from which the obfuscator continues to benefit.

The discussion on data pollution is more measured. Obfuscation could “pollute” collections of data that may have social benefit. For example, obfuscation could contaminate a public health database, diminishing the benefits of such data. The authors note that, unlike with environmental pollution, there are no clear social norms against polluting most data sets; yet, as with environmental pollution, it may sometimes be justifiable to sacrifice data integrity for other values.
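
The mechanics are easy to demonstrate. In the hypothetical simulation below (my illustration; the rates are invented), a fraction of survey respondents obfuscate by answering at random, and even modest obfuscation visibly biases the estimated prevalence of a condition in a public-health data set.

```python
import random

random.seed(1)

TRUE_RATE = 0.10   # assumed real prevalence of a condition
N = 100_000        # survey respondents

def estimated_prevalence(obfuscation_rate: float) -> float:
    """Estimate prevalence when some respondents answer at random."""
    positives = 0
    for _ in range(N):
        if random.random() < obfuscation_rate:
            positives += random.random() < 0.5        # coin-flip answer
        else:
            positives += random.random() < TRUE_RATE  # honest answer
    return positives / N

for rate in (0.0, 0.1, 0.3):
    print(f"{rate:.0%} obfuscating -> estimate {estimated_prevalence(rate):.3f}")
# With 30% of answers randomized, the estimate drifts from ~0.10
# toward ~0.22, degrading the public-health signal.
```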

For each of these ethical objections to obfuscation the authors resolve a few simple cases, but in what they call the “vast middle ground” of cases, they kick the can to politics: specifically, to the political philosophy of John Rawls, known for his egalitarian theory of “justice as fairness” and for the “veil of ignorance” thought experiment. Rawls sought to justify structuring society to ensure both equality and liberty, but his work is often seen as emphasizing equality over economic liberty. In this vein, Brunton and Nissenbaum argue that property rights and other societal and legal precepts ought to be “open to political negotiation and adjustment” with the goal of achieving the best results for justice, general prosperity, and social welfare. This analysis is skeletal at best—Rawlsian political theory can justify many different variations of property rights, for example—and the book provides no path to reopening negotiation on these foundational precepts. Given the authors’ own characterization of obfuscation as a tool of the politically weak, tying the ethics of obfuscation to the outcome of political and social reforms is a decidedly unsatisfactory solution.

The book steps out of the murk of ethical and political philosophy onto more solid ground when answering whether obfuscation works. It is easy to agree with Brunton and Nissenbaum’s cautious conclusion that obfuscation is a helpful strategy to meet certain limited goals. The concluding chapter comes closest to fulfilling the titular promise of a “user’s guide.” It repackages content from earlier chapters to identify six goals one might seek to achieve through obfuscation, such as buying time to evade a threat or expressing protest. The authors then suggest four preliminary questions that obfuscation project developers should ask. For example, is the project intended for individual use, or does it require collective use to be effective? Is it intended to be hidden or public? Is it intended to obfuscate for just a short time, or for a longer period? This section is useful and practical.

In sum, the book demonstrates that obfuscation can be a useful tool for self-defense against the many entities that collect data about us. The justification for obfuscation is strongest when addressing unwarranted government data collection, where the power disparities are greatest and users have few alternative tools. It is weaker regarding commercial collection, where there are enormous benefits to the consumer and where other mechanisms, including markets and regulation, already constrain harmful behavior. Ultimately, however, obfuscation is an imprecise and incomplete privacy tool because it focuses on the collection phase of the data life cycle. Consumer benefits and harms occur later in the cycle—in the use or misuse of data. Obfuscation can only indirectly hinder harmful misuses of data, and in doing so, may also hinder beneficial uses.

Cite this Article

Chilson, Neil. “Hiding in Plain Sight.” Issues in Science and Technology 34, no. 2 (Winter 2018): 88–90.
