As a geologist and professor I speak and write rather cavalierly about eras and eons. One of the courses I routinely teach is “History of Earth and Life,” a survey of the 4.5-billion-year saga of the entire planet—in a 10-week trimester. But as a human, and more specifically as a daughter, mother, and widow, I struggle like everyone else to look Time honestly in the face. That is, I admit to some time hypocrisy.
Antipathy toward time clouds personal and collective thinking. The now risible “Y2K” crisis that threatened to cripple global computer systems and the world economy at the turn of the millennium was caused by programmers in the 1960s and ’70s who apparently didn’t really think the year 2000 would ever arrive. Over the past decade, Botox treatments and plastic surgery have come to be viewed as healthy boosts to self-esteem rather than what they really are: evidence that we fear and loathe our timeliness. Our natural aversion to death is amplified in a culture that casts Time as an enemy and does everything it can to deny its passage. As Woody Allen said: “Americans believe death is optional.”
This type of time denial, rooted in a very human combination of vanity and existential dread, is perhaps the most common and forgivable form of what might be called chronophobia. But there are other, more toxic varieties that work together with the mostly benign kind to create a pervasive, stubborn, and dangerous temporal illiteracy in our society. We in the 21st century would be shocked if an educated adult were unable to identify the continents on a world map, yet we are quite comfortable with widespread obliviousness about anything but the most superficial highlights from the planet’s long history (um, Bering Strait ... dinosaurs ... Pangaea?). Most humans, including those in affluent and technically advanced countries, have no sense of temporal proportion—the durations of the great chapters in Earth’s history, the rates of change during previous intervals of environmental instability, the intrinsic timescales of “natural capital” like groundwater systems. As a species, we have a childlike disinterest and partial disbelief in the time before our appearance on Earth. With no appetite for stories lacking human protagonists, many people simply can’t be bothered with natural history. We are thus both intemperate and intemporate—time illiterate. Like inexperienced but overconfident drivers, we accelerate into landscapes and ecosystems with no sense of their long-established traffic patterns, and then react with surprise and indignation when we face the penalties for ignoring natural laws. This ignorance of planetary history undermines any claims we may make to modernity. We are navigating recklessly toward our future using conceptions of time as primitive as a world map from the 14th century, when dragons lurked around the edges of a flat earth. The dragons of time denial still persist in a surprising range of habitats.
Among the various foes of time, Young Earth creationism breathes the most fire but is at least predictable in its opposition. In years of teaching geology at the university level, I have had students from evangelical Christian backgrounds who earnestly struggle to reconcile their faith with the scientific understanding of the Earth. I truly empathize with their distress and try to point out paths toward resolution of this internal discord. First, I emphasize that my job is not to challenge their personal beliefs but to teach the logic of geology (geo-logic?)—the methods and tools of the discipline that enable us not only to comprehend how the Earth works at present but also to document in detail its elaborate and awe-inspiring history. Some students seem satisfied with keeping science and religious beliefs separate through this methodological remove. But more often, as they learn to read rocks and landscapes on their own, the two worldviews seem increasingly incompatible. In this case, I use a variation on the argument made by Descartes in his Meditations about whether his experience of Being was real or an elaborate illusion created by a malevolent demon or god.1
Early in an introductory geology course, one begins to understand that rocks are not nouns but verbs—visible evidence of processes: a volcanic eruption, the accretion of a coral reef, the growth of a mountain belt. Everywhere one looks, rocks bear witness to events that unfolded over long stretches of time. Little by little, over more than two centuries, the local stories told by rocks in all parts of the world have been stitched together into a great global tapestry—the geologic timescale. This “map” of Deep Time represents one of the great intellectual achievements of humanity, arduously constructed by stratigraphers, paleontologists, geochemists, and geochronologists from many cultures and faiths. It is still a work in progress to which details are constantly being added and finer and finer calibrations being made. So far, no one in more than 200 years has found an anachronistic rock or fossil—as biologist J.B.S. Haldane reputedly said, “a Precambrian rabbit”2—that would represent a fatal internal inconsistency in the logic of the timescale.
If one acknowledges the credibility of the methodical work by countless geologists from around the world (many in the service of petroleum companies), and one believes in a God as creator, the choice is then whether to accept the idea of (1) an ancient and complex Earth with epic tales to tell, set in motion eons ago by a benevolent creator, or (2) a young Earth fabricated only a few thousand years ago by a devious and deceitful creator who planted specious evidence of an old planet in every nook and cranny, from fossil beds to zircon crystals, in anticipation of our explorations and laboratory analyses. Which is more heretical? A corollary of this argument, to be deployed with tact and care, is that compared with the deep, rich, grand geologic story of Earth, the Genesis version is an offensive dumbing-down, an oversimplification so extreme as to be disrespectful to the Creation.
As exasperating as professional Young-Earthers and creationists can be, they are completely forthright about their chronophobia. More pervasive and corrosive are the nearly invisible forms of time denial that are built into the very infrastructure of our society. For example, in the logic of economics, in which labor productivity must always increase to justify higher wages, professions centered on tasks that simply take time—education, nursing, or art performance—constitute a problem because they cannot be made significantly more efficient. Playing a Haydn string quartet takes just as long in the 21st century as it did in the 18th; no progress has been made! This is sometimes called “Baumol’s disease” for one of the economists who first described the dilemma.3 That it is considered a pathology reveals much about our attitude toward time and the low value we in the West place on process, development, and maturation.
Fiscal years and congressional terms enforce a blinkered view of the future. Short-term thinkers are rewarded with bonuses and reelection, while those who dare to take seriously our responsibility to future generations commonly find themselves outnumbered, outshouted, and out of office. Few modern public entities are able to make plans beyond biennial budget cycles. Even two years of forethought seems beyond the capacity of Congress and state legislatures these days, when last-minute, stop-gap spending measures have become the norm. Institutions that do aspire to the long view—state and national parks, public libraries, and universities—are increasingly seen as taxpayer burdens (or untapped opportunities for corporate sponsorship).
Conserving natural resources—soil, forests, water—for the nation’s future was once considered a patriotic cause, evidence of love of country. But today, consumption and monetization have become strangely mixed up with the idea of good citizenship (a concept that now includes corporations). In fact, the word consumer has become more or less a synonym for citizen, and that doesn’t really seem to bother anyone. “Citizen” implies engagement, contribution, give-and-take. “Consumer” suggests only taking, as if our sole role is to devour everything in sight, in the manner of locusts descending on a field of grain. We might scoff at apocalyptic thinking, but the even more pervasive idea—indeed, economic credo—that levels of consumption can and should increase continuously is just as deluded. And while the need for long-range vision grows more acute, our attention spans are shrinking, as we text and tweet in a hermetic, narcissistic Now.
Academe, too, must take some responsibility for promulgating a subtle strain of time denial in the way that it privileges certain types of inquiry. Physics and chemistry occupy the top echelons in the hierarchy of intellectual pursuits owing to their quantitative exactitude. But such precision in characterizing how nature works is possible only under highly controlled, wholly unnatural conditions, divorced from any particular history or moment. Their designation as the “pure” sciences is revealing; they are pure in being essentially atemporal—unsullied by time, concerned only with universal truths and eternal laws.4 Like Plato’s “forms,” these immortal laws are often considered more real than any specific manifestation of them (e.g., the Earth). In contrast, the fields of biology and geology occupy lower rungs of the scholarly ladder because they are “impure”: steeped through and through with time, they lack the heady overtones of certainty. The laws of physics and chemistry obviously apply to life-forms and rocks, and it is also possible to abstract some general principles about how biological and geologic systems function, but the heart of these fields lies in the idiosyncratic profusion of organisms, minerals, and landscapes that have emerged over the long history of this particular corner of the cosmos.
Biology as a discipline is elevated by its molecular wing, with its white-coat laboratory focus and its venerable contributions to medicine. But lowly geology has never achieved the glossy prestige of the other sciences. It has no Nobel Prize, no high school Advanced Placement courses, and a public persona that is musty and dull. This of course rankles geologists, but it also has serious consequences for society at a time when politicians, CEOs, and ordinary citizens urgently need to have some grasp of the planet’s history, anatomy, and physiology.
For one thing, the perceived value of a science profoundly influences the funding it receives. Out of frustration with limited grant money for basic geologic investigations, some geochemists and paleontologists studying the early Earth and the most ancient traces of life in the rock record have cleverly recast themselves as “astrobiologists” to ride on the coattails of NASA initiatives that support research into the possibility of life elsewhere in the solar system or beyond. While I admire this shrewd maneuver, it is disheartening that we geologists must wrap ourselves in the hype of the space program to make legislators or the public interested in their own planet.
Second, the ignorance of and disregard for geology by scientists in other fields has serious environmental consequences. The great advances in physics, chemistry, and engineering made in the Cold War years—development of nuclear technologies; synthesis of new plastics, pesticides, fertilizers, and refrigerants; mechanization of agriculture; expansion of highways—ushered in an era of unprecedented prosperity but also left a dark legacy of groundwater contamination, ozone destruction, soil and biodiversity loss, and climate change for subsequent generations to pay for. To some extent, the scientists and engineers behind these achievements can’t be blamed; if one is trained to think of natural systems in highly simplified ways, stripping away the particulars so that idealized laws apply, and one has no experience with how perturbations to these systems may play out over time, then the undesirable consequences of these interventions will come as a surprise. And to be fair, until the 1970s, the geosciences themselves did not have the analytical tools with which to conceptualize the behavior of complex natural systems on decade to century timescales.
By now, however, we should have learned that treating the planet as if it were a simple, predictable, passive object in a controlled laboratory experiment is scientifically inexcusable. Yet the same old time-blind hubris is allowing the seductive idea of climate engineering, sometimes called geoengineering, to gain traction in certain academic and political circles. The most commonly discussed method for cooling the planet without having to do the hard work of cutting greenhouse gas emissions is the injection of reflective sulfate aerosol particles into the stratosphere—the upper atmosphere—to mimic the effect of large volcanic eruptions, which have cooled the planet temporarily in the past. The 1991 eruption of Mount Pinatubo in the Philippines, for example, caused a two-year pause in the steady climb of global temperatures. The chief advocates for this type of planetary tinkering are physicists and economists,5 who argue that it would be cheap, effective, and technologically feasible, and promote it under the benign, almost bureaucratic-sounding name “Solar Radiation Management.”
But most geoscientists, acutely aware of how even small changes to intricate natural systems can have large and unanticipated consequences, are profoundly skeptical. The volumes of sulfate required to reverse global warming would be equivalent to a Pinatubo-sized eruption every few years—for at least the next century—since halting the injections in the absence of significant reduction in greenhouse gas levels would result in an abrupt global temperature spike that might be beyond the adaptive capacity of much of the biosphere. Even worse, the effectiveness of the approach wanes with time, because as stratospheric sulfate concentrations increase, the tiny particles coalesce into larger ones, which are less reflective and have a shorter residence time in the atmosphere. Most important, even though there would probably be a net decrease in overall global temperature, we have no way of knowing exactly how regional or local weather systems would be affected. (And by the way, we have no international governance mechanism to oversee and regulate planetary-scale manipulation of the atmosphere.)
In other words, it is time for all the sciences to adopt a geologic respect for time and its capacity to transfigure, destroy, renew, amplify, erode, propagate, entwine, innovate, and exterminate. Fathoming deep time is arguably geology’s single greatest contribution to humanity. Just as the microscope and telescope extended our vision into spatial realms once too minuscule or too immense for us to see, geology provides a lens through which we can witness time in a way that transcends the limits of our human experiences.
But even geology cannot exempt itself from culpability for public misconceptions about time. Since the birth of the discipline in the early 1800s, geologists—congenitally wary of Young-Earthers—have droned on about the unimaginable slowness of geologic processes, and the idea that geologic changes accrue only over immense periods of time. Moreover, geologic textbooks invariably point out (almost gleefully) that if the 4.5-billion-year story of the Earth is scaled to a 24-hour day, all of human history would transpire in the last fraction of a second before midnight. But this is a wrongheaded, and even irresponsible, way to understand our place in Time. For one thing, it suggests a degree of insignificance and disempowerment that not only is psychologically alienating but also allows us to ignore the magnitude of our effects on the planet in that quarter second. And it denies our deep roots and permanent entanglement with Earth’s history; our specific clan may not have shown up until just before the clock struck 12:00, but our extended family of living organisms has been around since at least 6 a.m. Finally, the analogy implies, apocalyptically, that there is no future—what happens after midnight?
While we humans may never completely stop worrying about time and learn to love it (to borrow a turn of phrase from Dr. Strangelove), perhaps we can find some middle ground between chronophobia and chronophilia, and develop the habit of timefulness—a clear-eyed view of our place in Time, both the past that came long before us and the future that will elapse without us.
Timefulness includes a feeling for distances and proximities in the geography of deep time. Focusing simply on the age of the Earth is like describing a symphony in terms of its total measure count. Without time, a symphony is a heap of sounds; the durations of notes and recurrence of themes give it shape. Similarly, the grandeur of Earth’s story lies in the gradually unfolding, interwoven rhythms of its many movements, with short motifs scampering over tones that resonate across the entire span of the planet’s history. We are learning that the tempo of many geologic processes is not quite as larghissimo as once thought; mountains grow at rates that can now be measured in real time, and the quickening pace of the climate system is surprising even those who have studied it for decades.
Still, I am comforted by the knowledge that we live on a very old, durable planet, not an immature, untested, and possibly fragile one. And my daily experience as an earthling is enriched by an awareness of the lingering presence of so many previous versions and denizens of this place. Understanding the reasons for the morphology of a particular landscape is similar to the rush of insight one has upon learning the etymology of an ordinary word. A window is opened, illuminating a distant yet recognizable past—almost like remembering something long forgotten. This enchants the world with layers of meaning and changes the way we perceive our place in it. Although we may fervently wish to deny time for reasons of vanity, existential angst, or intellectual snobbery, we diminish ourselves by denouncing our temporality. Bewitching as the fantasy of timelessness may be, there is far deeper and more mysterious beauty in timefulness.
Excerpted from Timefulness: How Earth’s Deep Past Can Change the Way We See the Future by Marcia Bjornerud. Copyright © 2018 by Princeton University Press. Reprinted by permission.
1. Descartes, R. Meditations on First Philosophy, with Selections from the Objections and Replies (1641). Translated by Moriarty, M. Oxford: Oxford World’s Classics (2008).
2. Haldane is supposed to have said this when asked what could cause him to abandon his certitude about evolution. The memorable quote has been cited many times, but its origin is not clear.
3. Baumol, W. & Bowen, W. Performing Arts—The Economic Dilemma: A Study of Problems Common to Theater, Opera, Music, and Dance. New York: Twentieth Century Fund (1966).
4. Theoretical physicist Lee Smolin is a minority voice chiding his discipline for what he calls the systematic “expulsion of time.” Smolin, L. Time Reborn. Boston: Houghton Mifflin Harcourt (2013).
5. Including Steven Levitt and Stephen Dubner in Chapter 5 of Super Freakonomics: Global Cooling, Patriotic Prostitutes, and Why Suicide Bombers Should Buy Life Insurance. New York: William Morrow (2010).