Dystopia and the End of Politics
by Benjamin Kunkel
Dissent Magazine
In retrospect, the nineties can seem an anomalous decade, the only one since the Second World War when technological civilization did not appear particularly bent on self-destruction. Of course, not everyone greeted the end of the cold war as the dawning of a millennium of capitalist democracy, but even dismayed leftists tended to forecast the coming century by extrapolating from current trends. These included increased liberalization of trade, increased commodification of natural resources (such as water) and human roles (such as fertilization, courtship, and the care of the elderly), the internationalization of culture, continual advances in digital technology and genetic science, the rolling back of governmental authority to its police powers, and regular elections to ratify it all. This vision, whether taken for a nightmare or a dream, was of a world integrated under a total market and consecrated to private as opposed to public life: the “private sector” of corporations, and the “private life” of households. You called this tendency globalization if you liked it, neoliberalism if you didn’t. Either way, the sense was that capitalism would, for the foreseeable future, consolidate its achievements rather than undermine them.
This notion of the future neglected certain facts. For one thing, it’s not as if no one knew about global warming during the nineties. Indeed, the end of the cold war and the first public awareness of climate change arrived almost simultaneously. In 1988, the Soviet Union declared it would no longer intervene in the affairs of allied countries, and in the same year the scientist James Hansen testified before the U.S. Congress that he possessed a “99 percent” certainty that “global warming is affecting our planet now.” In December of 1991, the Soviet Union was dissolved; the following summer, the so-called Earth Summit in Rio de Janeiro produced the UN’s first climate change treaty, with its aim of “preventing dangerous anthropogenic interference with Earth’s climate system.” And, though the connection was rarely noted, these developments were not quite unrelated: petroleum exports made up some 60 percent of the USSR’s foreign currency earnings, and the same high oil prices that buoyed the Soviet rivalry with the United States encouraged conservation in the West. When, in the mid-eighties, oil prices collapsed, the collapse not only helped finish off the USSR but also increased fuel consumption outside the Soviet bloc, which in turn accelerated global warming, along with—something else to worry about—the depletion of the earth’s oil reserves. Many of our newer anxieties turn, in fact, on the idea that the oil-intensive planetary transportation system so vital to the functioning of contemporary capitalism ultimately abets climate change, the arrival of peak oil, and the circulation of viruses, while globalized financial markets are capable of spreading contagions of a different kind (as in the “Asian flu” of 1997–98).
None of this was impossible to imagine during the nineties. But it may have been simply too much to take that the cold war should immediately be succeeded by awareness of a dangerously overheating planet. Part of this is simply that it’s not the same thing to know something yourself (you and your favorite periodicals) as to know something you know your neighbor also knows. As Susan Sontag noticed in an essay called “The Imagination of Disaster,” about the typical science fiction movie of the early cold war, the arrival of the new menace (monsters, aliens) was “usually witnessed or suspected by just one person, a scientist on a field trip.” That was phase one of the plot. Phase two involved the “confirmation of the hero’s report by a host of witnesses to a great act of destruction.”
As viewers of the old and many of the new disaster movies know, it’s in phase two, with its crowd of witnesses, that the feeling This is really happening dawns, and true panic begins. In the real world of history, things happen more slowly, and even a televised real-life version of that fundamental disaster movie set piece, the destruction of a great city—New Orleans, by Hurricane Katrina—hardly signifies the imminent end of life as we know it. Still, it changes one’s private mood to know the public mood has changed.
A visit to a bookstore or multiplex confirms the new strain of morbidity in the air. Every other month seems to bring the publication of at least one new so-called literary novel on dystopian or apocalyptic themes and the release of at least one similarly themed movie displaying some artistic trappings. (Artsy, but not quite aspiring to be art, films like 28 Days Later and Children of Men might be called, without scorn, “B+ movies,” to distinguish them from ordinary apocalyptic crowd-pleasers.) What is striking is not so much the proliferation of these futuristic works—something that has been going on for generations—but the wholesale rehabilitation of such “genre” material for serious or serious-seeming novels and movies. If ordinary citizens are taking their direst imaginings more to heart than before, so, it would appear, are novelists and filmmakers. The new cultural prestige of disaster will be worth returning to later on.
First, however, a distinction needs to be made between the dystopian and the apocalyptic, because these categories refer to different and even opposed futuristic scenarios. The end of the world or apocalypse typically brings about the collapse of order; dystopia, on the other hand, envisions a sinister perfection of order. In the most basic political terms, dystopia is a nightmare of authoritarian or totalitarian rule, while the end of the world is a nightmare of anarchy. (There is also the currently less fashionable kind of political dream known as utopia.) What the dystopian and the apocalyptic modes have in common is simply that they imagine our world changed, for the worse, almost beyond recognition.
Both versions of the future are plentifully on offer in recent literary fiction and B+ movies. In 28 Days Later (released 2002), an accidentally released supervirus transforms virtually all of Britain into a population of cannibalistic zombies. Margaret Atwood’s novel Oryx and Crake (published 2003) is a post-apocalyptic bestiary of genetically engineered species; among them, in a world half-drowned by rising seas, lives apparently the last surviving human. Michel Houellebecq’s The Possibility of an Island (published 2005) is narrated by a misanthropic contemporary of ours named Daniel, as well as by numbers 24 and 25 of the successive clones made from this not-quite individual. Kazuo Ishiguro’s Never Let Me Go (published 2005) is another clone novel; it concerns genetic supernumeraries raised for purposes of organ harvesting. And cloning likewise furnishes subject matter for David Mitchell’s Cloud Atlas (published 2004), where one of six braided narrative strands takes the form of a Q & A between a normally human historian and an imprisoned rebel “fabricant,” who—unlike Ishiguro’s clones, with their lamblike passivity—has escaped an underground world of slavery into horrified awareness of the genocidal nature of a “corpocracy” raised on the blood of clones. Mitchell has imagined the smoothest-running and most cynically organized of possible dystopias, in which business and government have melded with one another—perhaps for this reason the narrative is set in South Korea, notorious in the late nineties for its state-supported chaebols, or conglomerates, and “crony capitalism”—and the sole revolutionary movement abroad in the land is in fact sponsored by the corporate state to supply it with the fictitious enemy it requires.