The second law of thermodynamics gets right to the point: In any closed system, entropy always increases. Warm bodies turn cold, movement slows, ice melts, disorder tugs at the edges of order, and the dark end of the temperature gradient draws us all into the night.
As much as we may pretend otherwise—imagining our terabytes of stored photos, files, and text to be eternal—data is no exception to this rule. Every digital calculation grinds away at its host servers at a molecular scale, producing accumulated frictions that escape as relentless heat. To keep that heat at bay, data centers depend on constant air-conditioning and convective pipes coursing with cooled water. Without continual monitoring and backup cooling systems ready to kick on at a moment’s notice, the heat produced by the internet’s constant calculations could easily spark the kind of “thermal runaway event” detailed in E. G. Condé’s striking short story “Subsidence.” At scale, in less than half an hour, such an event would quite literally melt the cloud as we know it.
“Heat is the waste product of computation,” Condé writes in his other life as an anthropologist of computing, under the name Steven Gonzalez. “If left unchecked, it becomes a foil to the workings of digital civilization.” Modern data centers are climate bunkers; air-conditioning represents some 60% of their total energy usage. The cloud has a bigger carbon footprint than the airline industry, but practical ideas to reduce its reliance on air-conditioning, like relocating a majority of the world’s data centers to Nordic countries to take advantage of the “free cooling” offered by their frigid climates, are difficult to reconcile with the demands of the market. As Gonzalez observes, if our data emigrated to the remote north, its geographic distance would cause signal latency, an unforgivable inconvenience in our culture of instant gratification.
And so the data centers continue to be built close by—often alarmingly so, shoulder-to-shoulder with residential communities. Plagued by relentless noise and air pollution, competing for precious water, these communities have become the most visible human casualties in the cloud’s ongoing war against entropy. But it is a war we will all lose eventually. As the digital humanities scholar Jeffrey Moro wrote in a 2021 exploration of the thermodynamics of data centers, all data is ultimately destined for “total heat death,” an inevitable end that will only be accelerated by climate change. Data center owners are acutely aware of this, and have invested heavily in hyper-redundant air-conditioning systems to maintain the illusion of permanence; as detailed in Condé’s story, every square inch of their facilities is optimized to keep the cold air flowing and the bits whirling just a little bit longer. They’ve also invested in security—to keep out hacker-marauders as well as nonhuman hazards like fire and flood.
The second law of thermodynamics is the only law of physics that distinguishes between past and future. While all the other laws of nature are effectively reversible, the thermodynamic arrow of time moves in only one direction. In the eyes of a physicist, the “future” is simply the direction toward which entropy increases: a long slow march toward the inevitable heat death of the universe. And yet here we are, our warm hearts beating. Indeed, the only exception to the inevitability of decay is the brief and buoyant stand made by every living system, from the cell to the science fiction writer. As the cyberneticist Norbert Wiener put it in his 1954 book The Human Use of Human Beings, “Life is an island here and now in a dying world.”
This raises a tantalizing question: Could life, with its onboard resilience against entropic forces, provide a workable solution to the problem of the data center? Perhaps. Silicon is hardly the sole province of memory, after all; preserving information for future use is an old evolutionary trick, the very basis of adaptability and survival. According to the neuroscientists Peter Sterling and Simon Laughlin, learning and memory constitute “deep principles” of biological design: Life has survived from the first split cell to the twenty-first century by learning from experience. Ancient traumas are woven into the neural circuitry of our species. More recent ones, long preserved through oral and written traditions, are now stored in the hot machinations of the data center. Regardless of the medium, memory is survival.
As a consequence, biology, with its billions of years of beta-testing in the rearview, has already produced the most powerful storage medium for information in the universe: DNA. Every nucleus of every cell in the human body holds 800 MB of information. In theory, DNA can store up to a billion gigabytes of data per cubic millimeter; at that density, the 180-odd zettabytes of information our global civilization produces each year would fit in a tennis ball. More importantly, it would require virtually no energy to maintain—and it would be preserved for millennia.
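For readers who want to check that arithmetic, here is a minimal back-of-envelope sketch in Python, assuming decimal units, the theoretical density quoted above, and a roughly 6.7-centimeter tennis ball; the figures are illustrative, not a vendor specification:

    import math

    # Minimal sketch: sanity-check the quoted densities (decimal units assumed).
    GB = 10**9                      # bytes per gigabyte
    ZB = 10**21                     # bytes per zettabyte

    dna_density = 10**9 * GB        # theoretical ceiling: ~1 billion GB per cubic millimeter
    annual_data = 180 * ZB          # ~180 zettabytes produced per year

    volume_mm3 = annual_data / dna_density   # cubic millimeters of DNA required
    volume_cm3 = volume_mm3 / 1_000          # convert to cubic centimeters

    # A regulation tennis ball is roughly 6.7 cm in diameter.
    tennis_ball_cm3 = (4 / 3) * math.pi * (6.7 / 2) ** 3

    print(f"DNA volume needed:  {volume_cm3:.0f} cm^3")      # ~180 cm^3
    print(f"Tennis ball volume: {tennis_ball_cm3:.0f} cm^3") # ~157 cm^3

At the quoted density the math lands at about 180 cubic centimeters, a shade larger than a regulation tennis ball; the point is the order of magnitude, not the exact sphere.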
This may all sound science-fictional, but over the last decade, technology companies and research institutions have successfully encoded all manner of precious cultural information into the double helix: the works of Shakespeare, all 16 GB of Wikipedia, an anthology of biotechnology essays and science fiction stories, the UN Declaration on the Rights of the Child, the Svalbard Global Seed Vault database, the private key of a single bitcoin, and the 1998 album Mezzanine by Massive Attack. Of course, these are PR gimmicks—snazzy proofs of concept for a nascent industry.
But beyond the hype, DNA data storage technology is evolving quickly, and biotech companies have pushed their offerings to the brink of commercial viability. Their approaches are diverse. Catalog, a Boston-based startup, has created a “printer” that can write synthetic DNA directly onto sheets of clear plastic; the French startup Biomemory stores data in credit-card-sized “DNA Cards”; Atlas Data Storage, a spinoff of the biotechnology giant Twist Bioscience, encodes data onto synthetic DNA and then dehydrates it into a shelf-stable powder to be reconstituted at will. These propositions should be enticing to anyone tasked with maintaining the integrity of the cloud: plastic sheets, cards, and DNA powder, stashed in metal capsules the size of a AAA battery, don’t require air-conditioning.
This makes DNA the perfect medium for what experts call “cold” data: municipal and medical records, backups, research data, and archives that don’t need to be accessed on demand (“hot” data, in contrast, is the kind found on Instagram, YouTube, or your banking app). Some 60–80% of all stored data is accessed infrequently enough to be classified as cold, and much of it currently sits in magnetic tape libraries. Tape, by virtue of its physical nature, is secure and requires minimal power to maintain. But even under ideal environmental conditions, kept within a narrow 20–25°C band, tape lasts only a few decades, and the technology for playing it back is likely to go obsolete before the tape itself degrades.
The oldest DNA sample to be successfully read, on the other hand, was over two million years old. And given its importance in the life sciences, it’s not likely we’ll ever forget how to sequence DNA. So long as the relevant metadata—instructions for translating the four-letter code of DNA back into binary—is encoded alongside the data itself, information preserved in DNA will almost certainly outlast the technology companies encoding it. This is why Microsoft, Western Digital, and a small consortium of biotech companies cofounded, in 2020, the DNA Data Storage Alliance, an organization to define industry-wide standards for the technology. As with all software, the interoperability of genetic technology will be key to its longevity.
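To make that translation step concrete, here is a toy sketch of the simplest possible mapping, two bits per nucleotide, written in Python; commercial systems layer error correction, addressing, and chemical constraints on top of anything this naive, so treat it as an illustration of the metadata idea rather than an industry scheme:

    # Toy illustration of the four-letter translation described above.
    # Real pipelines add error correction, indexing, and constraints on
    # repeats and GC content; this shows only the bare 2-bits-per-base mapping.
    BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
    BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

    def encode(data: bytes) -> str:
        """Map every pair of bits in `data` to one nucleotide."""
        bits = "".join(f"{byte:08b}" for byte in data)
        return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

    def decode(strand: str) -> bytes:
        """Reverse the mapping: four bases back into one byte."""
        bits = "".join(BASE_TO_BITS[base] for base in strand)
        return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

    strand = encode(b"memory is survival")
    assert decode(strand) == b"memory is survival"
    print(strand[:24])  # first few bases of the synthetic strand

So long as a mapping like this travels alongside the molecules, any future sequencer, whatever its vintage, can recover the original bytes.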
At the time of the Alliance’s founding, Steffen Hellmold, a vice president at Western Digital, observed that DNA would be essential to the storage industry’s future because “the overall temperature of data is cooling down.” That is, the more data human culture produces, the bigger our archive—and the more essential a long-term, shelf-stable storage medium becomes in offsetting its enormity.
But there is a spanner in these works: the power-hungry artificial intelligence systems currently driving a data center construction boom across the United States. AI-optimized servers consume far more energy than traditional ones, and, according to a 2024 Berkeley Lab report, cooling these servers is expected to consume up to 33 billion gallons of water by 2028. In E. G. Condé’s imagination, however, AI isn’t only a source of heat, but its suicidal conductor: In his story, an unreliable AI system that helps to manage the data center, heat-mad and pushed to the brink by unrelenting consumer demand, makes the fatal decision that sparks a runaway thermal event.
That is speculation, of course, albeit speculation informed by Condé’s own scholarship on the vulnerabilities of modern data centers. Here is my own: Heat is inevitable. Computing as we understand it—electrical operations in silicon and tungsten—operates at staggering financial and environmental cost in defiance of both physics and reason. The risk of overheating is forever imminent. And although our hot data still calls for hot servers (for now, anyway), introducing biology into our storage infrastructure, and thereby minimizing the cooling required for the “cold” data that makes up most of what we store, could help us build greater resilience against the kind of thermodynamic failure Condé so evocatively describes. That is, may cooler heads prevail.



