Book Review: The end of the wild, wild Net

Regulating Code: Good Governance and Better Regulation in the Information Age by Ian Brown and Christopher T. Marsden. Cambridge, MA: MIT Press, 2013, 288 pp.

Rex Hughes

Who regulates the Internet? A decade ago, posing this question would have been considered heretical by the majority of those people and groups charged with maintaining the core principles and norms that govern the Internet. However, as Ian Brown and Christopher T. Marsden succinctly chronicle in Regulating Code, the Internet no longer operates in an unregulated space, thanks to an expanding web of technological innovation, market power, legislative agendas, pressure groups, and regulatory authorities. In writing what can be broadly categorized as an Internet regulatory scholar-practitioner book, Brown (a computer scientist) and Marsden (a lawyer) seek to improve the study and practice of Internet regulation by integrating more directly the technical principles and social norms associated with the Internet since its early ARPANET (Advanced Research Projects Agency Network) days. The authors also look to public-private mechanisms of “co-regulation” as necessary means of achieving better Internet regulation.

Given the increasing need for more technically informed regulatory analyses and practices that stay true to the Internet’s unique principles and norms, this book makes a significant contribution to the study of 21st century Internet regulation. In casting their interdisciplinary technical-legal gaze over the past 10 years of major Internet policy battles in the United States and the European Union (E.U.), Brown and Marsden show how even the most well-intentioned Internet regulation can experience major failure when its principal architects lose sight of what the Internet is really about.

In addressing the book’s central thesis of how to devise a more robust analytical framework for producing better Internet regulation, the authors make their case via five well-researched case studies. The first three case studies, presented in chapters titled “Privacy and Data Protection,” “Copyrights,” and “Censors,” focus on what the authors consider “fundamental rights with economic implications.”

Beginning with privacy and data protection, the authors chronicle several of the most salient regulatory clashes in the United States and the European Union since 2002. As the authors note, the European Union in particular has been debating data protection policies for 20-plus years, and they argue that current regulatory logjams can best be alleviated if Brussels lawmakers and regulators apply technical engineering principles such as “interoperability” when constructing a new regulatory regime.

Brown and Marsden argue for a similarly technically informed approach to Internet regulation in their case study of copyrights. They say that U.S. and E.U. lawmakers made a substantial error in trying to construct a digital copyright regime around the premise of “regulating the machine” rather than the market behavior made possible by the global network distribution of digital intellectual property. Further, lawmakers failed to adequately allow for future technological and business innovation when authoring the Digital Millennium Copyright Act (DMCA) and the European Copyright Directive (ECD). Thus, both laws fell well short of their framers’ goal of constructing a regulatory regime that balances the power of rights holders with a new generation of digital consumers, or “prosumers,” as Brown and Marsden like to call them. Failing to understand emergent prosumer behavior (in what Jonathan Zittrain, a professor at Harvard University and an expert on Internet law, calls the “generative Internet”) made it nearly impossible for either the DMCA or the ECD to strike a proper balance between rights holders and prosumers.

In their case study of censors, the authors chronicle the ongoing struggle that U.S. and E.U. authorities have had in trying to impose limits on sensitive Internet content. They cite the WikiLeaks saga as an instance where the U.S. government failed to block the Internet distribution of over 250,000 classified documents.

Brown and Marsden selected the final two case studies, presented in chapters called “Social Networking Services” and “Smart Pipes,” because they provided “the most innovative platforms to develop new markets and protect those fundamental rights.” Regarding social networking services such as Facebook, LinkedIn, and Google+, they make a convincing case that U.S. and E.U. competition authorities have failed in their mission to design a new regulatory regime that sufficiently balances the interests of the providers of these services and their customers. They see severe deficiencies in both the objectives and the approach of U.S. and E.U. regulators. Although the authors admit that the rapid innovation associated with social networking services makes traditional legal definitions of transparency and enforceability difficult to apply, they call attention to a growing number of areas that leave a new generation of prosumers without adequate protections. Once again, the authors call upon transatlantic regulators to apply the interoperability principle to provide for greater user transparency and ownership.

In their case study of smart pipes, the authors confront the thorny public policy question of net neutrality. Here again they show the moral and economic hazard that national regulators create when they allow Internet service providers to deploy so-called smart filters that risk breaking the Internet’s longstanding end-to-end principle. In the authors’ words, “The pace of change in relation between architecture and content on the Internet requires continuous improvement in the regulator’s research and technological training.”

Brown and Marsden are to be commended for tackling a complex, dynamic topic in a mere 267 pages. Their case studies offer clear empirical data demonstrating that the future of the Internet is indeed intertwined with the high-priced regulatory and lobbying battles in Washington and Brussels. In applying their interdisciplinary technical-legal analysis and in-depth knowledge of historic Internet principles and norms, they show the utility of bringing a technically informed interdisciplinary approach to the study and practice of Internet regulation. They also signal the need for a more balanced approach between European-style co-regulation and U.S.-style self-regulation. Had such a technically informed, balanced approach been brought to bear in the creation of the DMCA and the ECD (two of the most consequential regulatory acts for the Internet economy), how many millions of dollars in contentious rights litigation could have been avoided?

The same principle holds true for the transatlantic data protection debates of the present. How many policymakers truly understand the technologies they regulate? What are the universities and professional bodies responsible for regulatory education and training doing to alter this critical calculus? Regulating Code is a step forward in challenging the core assumptions that guide the principal stakeholders involved in crafting 21st century Internet regulation.

Nobody’s perfect

But even as the book accomplishes its main analytical goals and normative aims, a few areas remain open for improvement. Although the book is not positioned as a purely scholarly work, the methodologies and theoretical frameworks applied in the case study analyses could have benefited from further explanation and context for readers unfamiliar with orthodox theories of computer science and regulatory studies. In some sections of the book, case study issues too easily bleed into one another. For example, elements of the chapter on social networking services could have been integrated with the chapter on privacy and data protection (a classic problem when confronting such a converged set of technologies and issues).

Also, even though the book is written from a European-rooted transatlantic perspective, some additional discussion of how non-European authorities are confronting similar issues would have been helpful. The so-called BRIC countries (Brazil, Russia, India, and China) will be home to much of the next billion-plus Internet users. How are these countries helping or hindering the classic transatlantic club of Internet regulation? Perhaps these are issues for a future Brown-Marsden collaboration.

In summary, Regulating Code makes a significant contribution to Internet regulatory studies, and it would benefit anyone seeking a thorough account of the rise of the Internet regulatory state since 2002. For better or worse, the unregulated Internet is no more. As has historically been the case with other disruptive technological innovations that become mass communication media, the Internet is now enmeshed in some of today’s most pitched public policy battles. However, as Brown and Marsden skillfully remind us, keeping the public Internet true to the principles and norms that its ARPANET founders sought to embed in its core protocols and applications will require more technically informed regulatory stakeholders. Increasing technically informed analysis in the quest for better Internet regulation will require institutional change in both universities and regulatory bodies. And although the book does not offer an exact roadmap for making such institutional fixes, it does show the advantage of interdisciplinary collaboration when confronting the complex human-machine systems that span the multiple jurisdictions and cultures of Internet cyberspace.

Rex Hughes (rex.hughes@cl.cam.ac.uk) co-directs the Cyber Innovation Network at the University of Cambridge Computer Laboratory and is a visiting professor at the University of Toronto Munk School of Global Affairs.

Cite this Article

Hughes, Rex. “The end of the wild, wild Net.” Issues in Science and Technology 30, no. 2 (Winter 2014).
