How Generative AI Endangers Cultural Narratives
Sometime last summer, I needed to install a new dryer in my home in Bergen, Norway. I opened a localized version of Google and typed a request for instructions in Norwegian. Everything the search engine returned was irrelevant: most results assumed my dryer ran on gas, though gas dryers are virtually unknown in Norway. Even when I refined the search to electric dryers, the results assumed configurations that do not exist in my country. I realized that these useless results must have been machine-translated from elsewhere. They appeared Norwegian, but they couldn't help me get a dryer running in Norway. In this case, the solution was trivial: a trip to a neighborhood hardware store got me wired in.
But my experience underscores an underappreciated risk that comes with the spread of generative artificial intelligence: the loss of diverse cultural narratives, content, and heritage. Failing to take the cultural aspects of generative AI seriously is likely to result in the streamlining of human expression into the patterns of the largely American content that these systems are trained on.
As generative AI is integrated into everyday tools such as word processors and search engines, it's time to think about what kinds of stories it can generate, and what stories it will not generate. It's no secret that AI is biased. Researchers recently asked the image generator Midjourney to create images of Black physicians treating impoverished white children, but the system would only return images depicting the children as Black. Even after several iterations, Midjourney failed to produce the specified results. The closest it got to the prompt was a shirtless medicine man with feathers, leather bands, and beads, gazing at a similarly garbed blond child.
Here’s something that hits close to home: the potential loss of Cardamom Town. Thorbjørn Egner’s Folk og røvere i Kardemomme by (When the Robbers Came to Cardamom Town) is a children’s book and musical well known to anyone who grew up in Norway or Denmark after 1955. The songs and stories have been played, read, and sung in homes and preschools for decades; there’s even a theme park inspired by the book in the city of Kristiansand. The story features three comical thieves who steal food because they are hungry and don’t understand that work is necessary. After being caught stealing sausages and chocolate, they are rehabilitated by the kind police officer and townsfolk, then end up saving the town from a fire.
This story is more than a shared cultural reference—it supports the Norwegian criminal justice system’s priority of rehabilitation over punishment. It is distinct from Disney movies, with their unambiguous villains who are punished at the end, and from Hollywood bank heists and gangster movies that glorify criminals. Generative AI might well bury stories like Cardamom Town by stuffing chatbot responses and search results worldwide with homogenized American narratives.
Narrative archetypes give us templates to live by. Depending on the stories we hear, share, and create, we shape possibilities for action and for understanding. We learn that criminals can be rehabilitated, or that they deserve to come to a bad end. The humanities and social sciences have studied and critiqued AI for a long time, but almost all development of AI has happened within quantitative disciplines: computer science, data science, statistics, and mathematics. The current wave of AI is based on language, narratives, and culture; unchecked, this wave threatens to impoverish the world’s cultural narratives. We have reached a point where AI development needs the humanities. Not just so I can figure out how to install my appliances, but so we don’t lose the stories that shape our communities.