Future Tense Fiction

The Long Arm of Law and Technology

"The Long Arm of Law and Technology" illustration by Rey Velasquez Sagcal

When I started my career as a police officer, cell phones were unknown in my department. It was 1979, which meant that if you needed to talk to the dispatchers—let’s say to report back information, or get an update on a crime in progress—you had to use a payphone or one of the many police “call boxes” sprinkled around town. In Redlands, California, where I worked, these were phones in boxes on poles, hardwired directly into the dispatch center. Teenagers loved to annoy the dispatchers by picking up the handsets, saying some typical teen thing, and then running off.

Our first non-hardwired phones were “radio phones” located in three of our unmarked cars. Imagine an old rotary-style phone installed on the transmission hump, below the dashboard. They rarely worked, but they were our first taste of any kind of mobile communication device, and we thought they were pretty cool. From there, we advanced to cell phones that weighed about five pounds and resembled infantry radios from a classic World War II movie; at that point, our department had only two.

Today, in contrast, the officers of my former department all carry at least one smartphone, which is constantly linked to data systems and can record photos, audio, and video, all while mapping and analyzing new information. In many ways, their offices are now held within the computing devices they carry on their duty belts or in their load-bearing vests.

Advanced technologies like artificial intelligence and augmented reality are only accelerating the technological transformation we’ve seen over the last several decades, and the result is nothing short of breathtaking. Almost every police technology vendor has integrated some form of AI into the products they’re pitching to police chiefs and sheriffs across America. And they’re doing so in a regulatory and legislative environment that lags far behind the technology it’s meant to govern.

All of this is part of a growing police-industrial complex. The term, first popularized in the 2010s, refers to the expanding network of private-sector individuals and organizations that produce weapons, vehicles, software, training, and high-tech hardware for the police—and the layered relationships among police, industry, legislators, and other political and economic actors that result. Initially, the police-industrial complex revolved around hardware—things like tactical vehicles, radios, and less-lethal munitions. Now, it is increasingly fueled by advances in AI, big data, and virtual and augmented reality, technologies that jumped from general use into policing.

There are important questions on the other side of this jump. When used for policing, what does technology illuminate—and what does it obscure? Kevin Galvin’s “A Time Between,” a new short story published by Future Tense Fiction, explores these questions through the eyes of Detective Carberry, an older, somewhat disillusioned officer who prefers tangible experiences over the augmented reality technologies that now dominate his fictional universe—and his police department. Carberry is wary of how reality and fiction are becoming increasingly hard to distinguish, and his skepticism comes to a head when he’s tasked with investigating the death of a college freshman who supposedly fell from a dorm-room window, but whose body cannot be found. When officers wearing AR glasses at the scene of the incident “see” her body lying on the street, Carberry suspects something more complex is at play.

Carberry and his partner eventually unravel the mystery, but Carberry’s internal disorientation in a world dominated by digital simulations remains unresolved. He discovers that even his familiar sanctuaries, like his favorite pub, are mere virtual reconstructions, estranging him from a society that favors artificial experiences over authentic ones.

In pulling on this tension, “A Time Between” makes the reader consider how immersive technology will alter, or already has altered, social dynamics, individual identity, and memory. It also asks us to consider what we hope for and demand of technology in law enforcement.

There are clear benefits to policing’s use of advanced technologies like AI or AR. When cops are able to conduct realistic AR training in places highly susceptible to active shooters—schools, theaters, churches—they may be more likely to intervene effectively and save lives in the real world. Exposing officers to a broad range of interpersonal scenarios in virtual reality could help prepare them to de-escalate situations involving people experiencing mental health crises. And using AI to analyze vast amounts of public safety data can translate to a more focused, less intrusive use of crime-control strategies and a more efficient use of limited police resources (which is vital given the current national police recruiting crisis).

While advancing technologies hold great promise for increasing the effectiveness of policing, they can also be perilous. Drones, facial recognition, surveillance cameras, thermal imaging, acoustic detection, conducted energy devices (Tasers), and other less-lethal tools are all promising technologies that could have dire consequences if the police and community don’t “co-produce” the decision to use a particular technology.

An example of co-production in practice is the Citizens Privacy Council in Redlands, California. The council is a group of citizens who meet regularly to discuss the privacy implications of current and emerging uses of technology by the Redlands Police Department. The comments and recommendations of its members help guide the police chief’s technology decisions and policies. For instance, when the department was implementing an extensive surveillance camera system in the mid-2000s, the Privacy Council took a strong stand against the integration of microphones, which the police department ultimately heeded.

These sorts of decisions are extremely complex. Police leaders don’t have the luxury of waiting until they have sufficient resources to address compelling crime problems. Their communities and politicians expect them to prevent crimes and solve those they couldn’t stop—quickly, effectively, and regardless of chronic staffing shortages. As a result, police leaders are frequently attracted to quick-fix advanced technologies shilled by vendors with big promises. It’s not hard to imagine the pitch for crime-solving AR glasses like the ones sketched out in “A Time Between.”

The overwhelming majority of American police organizations do not possess the technological knowledge to gauge the veracity of these vendor claims. As such, they fall victim to “vendor speak” and acquire tech solutions that are, at best, a waste of taxpayer money, and at worst, harmful to the very people the police are paid to protect. This is why police technologists and ethicists are so beneficial to policing. Regrettably, they are also rare.

Absent the thoughtful implementation of policies, procedures, and training, advanced technologies like AR can further separate cops from the public. Prior to the automobile’s integration into policing, for instance, cops walked a foot beat where they had frequent face-to-face interactions with the people of their community. They knew the people in their beats and were better positioned to respond to their needs. Implementing patrol cars—and later police radios—allowed police departments to do more with fewer cops. But this occurred at the expense of the interpersonal relationships that drove the public’s trust and confidence in police. In the same way, AR glasses could increase police effectiveness, but they could also degrade the empathetic, active listening approach that progressive police leaders hope their cops are practicing.

Unfortunately, history amply demonstrates that people can use technology just as easily to commit crime as they can to prevent it. When cars were developed, robbers abandoned their “getaway horses” and used vehicles to facilitate their escapes. When the internet arrived, it facilitated transnational crime at a previously unthinkable level. Similarly, AR technology can be misused for disinformation, misinformation, state-sponsored psyops, terrorism, and extremism. Imagine a hacker creating an AR-induced “swatting” incident—falsely portraying a crime, like an active shooter, to trigger a SWAT response. Or a cop responding with deadly force after “seeing” an unarmed person pointing a gun at him. Imagine the impact of state-sponsored AR hacking that foments extreme division among people of opposing political beliefs.

None of this is difficult to envision. Many American police agencies have been victims of successful ransomware attacks, forced to pay out millions to recover computer systems and databases held hostage by hackers. If AR technology becomes an integral part of our society, or a critical component of policing’s response to public safety, it will be vulnerable to similar attacks, and communities could be forced to pay millions to free their citizens or cops from mirages of reality.

If we wish to avoid such outcomes, it is incumbent on police, policymakers, and community leaders to work together to ensure policing technology is used in a safe and responsible manner. Only through ongoing collaboration among the triumvirate of what’s sometimes called rightful policing (cops, policymakers, and community) can we minimize the potential for misuse of police technology and increase its value in furthering public safety outcomes. Ultimately, this collaboration is key to creating a future in which we can all agree that American policing has become effective, empathetic, and just.

About the Author

Jim Bueermann is the founder and president of the Future Policing Institute and Center on Policing and Artificial Intelligence. He retired as chief of the Redlands Police Department in 2011 after 33 years of service. Subsequently, he served as an executive fellow with the USDOJ National Institute of Justice and led the National Police Foundation (now the National Policing Institute) in Washington, DC.

Future Tense Fiction is a partnership between Issues in Science and Technology and the Center for Science and the Imagination at Arizona State University.

Cite this article

Bueermann, Jim. “The Long Arm of Law and Technology.” Future Tense Fiction. Issues in Science and Technology (November 22, 2024).