We live in head-spinning times. On the one hand there is the daily diet of news stories suggesting that human beings have become exceptionally powerful and destructive: we’re warming the oceans, melting the ice sheets, and driving countless other species into extinction. Given the extent of such anthropogenic changes, some scientists argue we’ve entered a new geological age that should be named after ourselves: the Anthropocene or an “Age of Humans.”
Yet, on the other hand, hardly a day goes by without some new affront to the belief that humans are either exceptional or exceptionally powerful, at least in and of themselves. Whether it’s tool-using birds or altruistic mice, the evidence increasingly suggests that evolution generates intelligence, creativity, socialization, and a lot of other erstwhile “human” traits fairly frequently. Likewise, if modern humans are especially powerful and even destructive, it’s only because the non-human world has made them so.
So, which is it to be: An Age of Humans or an Age of Human Humility? And which of these very different ways of understanding our place in the world might prove most effective in solving the climate crisis we face today?
Given that the climate crisis and other global problems were in significant part created by our own self-obsession and arrogance, it seems unlikely that we can solve these problems by doubling down on the same old modernist faith in human exceptionalism and power. Instead, this head-spinning moment of uncertainty and contradiction might offer us a rare opportunity to radically rethink dangerous old ideas and embrace deeper change.
To understand this more modest approach, it’s helpful to think more deeply about the concept of the Anthropocene. When the atmospheric scientist Paul Crutzen and colleagues brought the term into wide use in the early 2000s, their intent was obviously good. Alarmed that so many were still denying the human role in climate change, Crutzen suggested a name that emphasizes human causality and agency. The neologism “Anthropocene” seemed to achieve that goal, as it echoed the related word “anthropogenic”—a term which is typically taken to mean “human caused or produced”: a carcinogenic thing causes cancer, a mutagenic thing causes mutations. And yet, if you consider this example for a moment, you’ll quickly recognize that anthropogenic doesn’t quite fit. To really maintain the pattern, “anthropogenic” should mean a thing that causes the Anthropos—which is to say, a thing that causes or produces humans.
As it turns out, this actually was the original meaning of the word back in the late 19th century: “pertaining to the origin of humans.” Yet today when we hear the phrase “anthropogenic climate change” we understand it to mean “human caused climate change,” not “climate change caused humans.” True, “-genic” is also, much less frequently, used to mean “associated with” or “well-suited to”—as in “photogenic” or “telegenic.” But here I want to argue that the older and more semantically consistent sense of anthropogenic—as something that “pertains to the origin” of humans—might help us to move beyond the unapologetic anthropocentrism of the Anthropocene concept itself. Perhaps it’s still not too late to instead develop a more humble and insightful understanding of the term that might help us to move beyond the still pervasive modernist faith in the centrality of humans.
Indeed, one of the most striking aspects of this supposed new “Age of Humans” is that many of its defining features—climate change, radioactive strata, ubiquitous plastic pollutants, etc.—were largely unintended and unexpected consequences of humans pursuing entirely unrelated goals. The early creators and adopters of coal-powered engines could not have even imagined that the exhaust from their machines might cause dangerous planet-wide warming several centuries later. Likewise, humans were by and large surprised to discover the bio-concentration of radioactive particles from atomic bomb tests and the stubborn persistence of microplastic particles in the environment.
Yet if the material changes that define the Anthropocene were for the most part unintended, this raises important questions about the nature of human and non-human agency. In the 1970s when so-called post-modernist theories began to dominate, many academic humanists argued that ideas and discourse played an outsized role in shaping the material world and society. The term post-modern notwithstanding, these theoretical approaches often reflected one of the central beliefs of modernism: that humans were special creatures who were not only distinct from, but superior to, the environment and all the other living organisms around them. Rather than recognizing the many connections and continuities between humans and their material environment and fellow creatures, modernists and post-modernists alike have by and large supported the dominant Western faith in human exceptionalism.
Ironically, post-modernist theories—and subsequently the Anthropocene concept itself—emerged during a period when the evidence against human exceptionalism was rapidly accumulating. Hardly a week goes by without a new report of an animal who does something long assumed to be uniquely human. It’s been several decades now since the gorilla Koko showed us she could understand spoken English and communicate using sign language. We also now know that quite a few animals use simple tools, as with the New Caledonian crow Betty who was famously caught on tape bending a wire into a hook to snare some otherwise impossible-to-reach food. As the primatologist Frans de Waal argues in his latest book, Mama’s Last Hug, it has also become increasingly difficult to deny that other animals experience emotions that were long believed to be solely human. Even the lowly little white lab rat can be empathetic and altruistic, postponing the immediate gratification of a chocolate treat in order first to rescue another rat from an unpleasant swim.
In sum, name your favorite “human” trait and there is probably some other animal who also has it—at least at some diminished level. Yet even if they accept that the difference between us and the other animals is one of degree, not kind, the defenders of human exceptionalism will not unreasonably argue that our species has succeeded in using these animal abilities to achieve an unprecedented level of power. Obviously, crows might bend wires into hooks, but they don’t smelt iron and copper, much less build nuclear reactors. In recent years, however, anthropologists, environmental historians, and advocates of neo-materialist ideas have repeatedly pointed out that even these supposedly human powers might better be understood as a product of the non-human environment. The rise of early agrarian societies depended on creative plants and animals that were able and willing to cooperate with humans in new ways. The European industrial revolution was as much the product of the massive stores of solar energy contained in coal and oil as it was of ingenious humans. Perhaps most broadly, some cognitive philosophers now argue that even our clever minds should be understood as embedded in a vibrant (and sometimes dangerous) material environment that sparks and sustains the ideas that we wrongly assume are entirely ours.
In short, despite some rearguard resistance, the case against human exceptionalism grows stronger by the day. So, what might happen if we were to fully and enthusiastically embrace the idea that humans are not special? And with what consequences for our ability to effectively confront the climate crisis?
First, we would place ourselves on a continuum with other animals, putting the emphasis on the traits we share rather than those that separate us. The benefit of this is not so much that the human star dims in comparison, but rather that we would see our own accomplishments as a product of a material world that seems to generate intelligence, creativity, and other traits rather freely. In this framing, it’s not that we are an exceptional species on a modest planet, but rather that we are a rather ordinary species on an exceptional planet. To truly embrace such a modest view of ourselves would highlight the tremendous debt we owe to the non-human organisms and things that have made us who we are. Would such a humble humanity tolerate the current global mass extinction of countless plants and animals or the cruel mistreatment of domestic species in factory farms?
Second, such a modest post-anthropocentric view would compel us to set aside the dangerous modernist belief that humans can manipulate and control the non-human world without necessarily making any significant changes to themselves and their cultures. By decentering the human animal, we can better recognize the many ways in which we become entangled with powerful material things, which not only push us down certain historical paths, but may also change our nature as human beings. Post-modernist theory tended to criticize such ideas as deterministic, preferring to emphasize the ability of humans to determine their own destinies. There is some truth in this, of course, yet it badly underestimates the power of dynamic and potentially dangerous non-human things, such as the coal and oil whose immense energies have lured us into destructive and unsustainable ways of thinking and living. Ironically, such post-modernist and anthropocentric views provided an unintended measure of support for neo-liberal politics that allow supposedly free markets to determine the material nature of the world: If humans by and large create themselves through discourse and an immaterial concept of culture, why should we worry excessively about changing our environment?
Third, a more modest reframing of the Anthropocene would help to better critique the neo-liberal argument that humans can manage the effects of global warming through massive geo-engineering projects. For example, some suggest it might be feasible to spray huge volumes of sulphate aerosol particles into the upper atmosphere, thus reflecting more of the sun’s heat back into space. As the Australian philosopher Clive Hamilton rightly observes, such a techno-fix is appealing because it “absolves us all of the need to change our ways.”
When Crutzen and others coined the Anthropocene concept, they hoped that by highlighting human agency and responsibility for climate change the term might encourage more individuals and governments to implement effective solutions. Yet in naming and framing climate change as conventionally “anthropogenic,” the term had the unintended effect of sustaining a misleading and dangerous overestimation of human centrality, power, and agency. Rather than encouraging us to think seriously about the many ways we stumbled into a deep entanglement with coal, oil, and other hydrocarbons, and how those non-human things have subsequently shaped how we think and act, the Anthropocene concept suggests we can simply choose to give up hydrocarbons if only we can muster the necessary political and technological will—no greater rethinking of the human relationship to the earth is required or desired.
By contrast, an Anthropocene that embraces the older concept of anthropogenic—one that asks how humans are produced by the world itself—might foster a more humble and cautious approach to dealing with our present crisis. Such a post-anthropocentric Anthropocene would ask not only: How can humans solve the challenge of climate change so that they can survive? But also: In this moment when the failures of modernist anthropocentrism have become clear, what non-human things and organisms should we embrace in order to become the kinds of humans we wish to be?
Ironically, perhaps the most exceptional thing we humans might do at this moment is to fully embrace how unexceptional we are.
Image credit: Lisa Reindorf / Internet