Many of the readers of Energy Bulletin, and now Resilience.org, have come to share a certain view of the world. It’s probably fair to say that, as a group, we see resource depletion, financial chaos, and environmental disasters (principally associated with global climate change) as looming storms converging on industrial civilization. We also tend to see the unprecedented level of complexity of our society today as resulting from the historically recent energy subsidies of fossil fuels, and to a certain extent the enabling factor of financial innovation. Thus, as the quality and quantity of our energy sources inevitably decline, and as financial claims melt away with the ongoing burst of history’s greatest credit bubble, a simplification and decentralization of societal systems is inevitable.
In this essay, which will appear in five installments, I hope to explore some of the social implications of simplification and decentralization. Will wars and revolutions break out with ever-greater frequency? Will democracy thrive, or will traumatized masses find themselves at the mercy of tyrants? Will nation states survive, or will they break apart? Will regional warlords rule over impoverished and enslaved survivors? Or will local food networks and Occupy groups positively transform society from the ground up?
I don’t claim to have a functioning crystal ball. But tracing current trends, and looking to historic analogies, may help us understand our prospects better, and help us make the most of them.
The 21st century landscape of conflict
Looking forward, four principal drivers of conflict are readily apparent; more may be lurking along the way.
First is the increasing prospect of conflict between rich and poor—i.e., between those who benefitted during history’s biggest growth bash on one hand, and on the other hand those who provided the labor, sat on the sidelines, or were pushed aside in resource grabs.
Economic growth produces inequality as a byproduct. Not only do industrialists appropriate the surplus value of the labor of their workers, as Marx pointed out, but bankers accumulate wealth from the interest paid by borrowers. We see inequality being generated by economic growth in real time in China, where roughly six hundred million people have been lifted from poverty in the last thirty years as a result of nine percent annual economic growth—but where economic inequality now surpasses levels in the U.S. and even Eastern Europe.
Just as economic growth produces winners and losers domestically, the level of wealth inequality between nations grows as the global economy expands. Today the disparity between average incomes in the world’s richest and poorest nations is higher than ever.
The primary forces working against inequality as economies grow consist of government spending on social programs of all sorts, and on international aid projects.
As economic growth stops, those who have benefitted the most have both the incentive to maintain their relative advantage and, in many cases, the means to do so. Which means that in a contracting economy, those who have the least tend to lose the most. There are exceptions, of course. Billionaires can in theory go broke in a matter of hours or even seconds as a result of a market crash. But in the era of “too-big-to-fail” banks and corporations, government provides a safety net for the rich as well as the poor.
High and increasing inequality is usually bearable during boom times, as people at the bottom of the wealth pyramid are encouraged by the prospect of its overall expansion. Once growth ceases and slips into reverse, inequality becomes socially unsustainable. Declining expectations lead to unrest, while absolute misery (in the sense of not having enough to eat) often results in revolution.
We’ve seen plenty of examples of these trends in the past two years in Greece, Ireland, Spain, the U.S., and the Middle East.
In many countries, including the U.S., government efforts to forestall or head off uprisings appear to be taking the forms of criminalization of dissent, the militarization of police, and a massive expansion of surveillance using an array of new electronic spy technologies. At the same time, intelligence agencies are now able to employ up-to-date sociological and psychological research to infiltrate, co-opt, misdirect, and manipulate popular movements aimed at achieving economic redistribution.
However, these military, police, public relations, and intelligence efforts require massive funding as well as functioning grid, fuel, and transport infrastructures. Further, their effectiveness is limited if and when the nation’s level of economic pain becomes too intense, widespread, or prolonged.
A second source of conflict consists of increasing competition over access to depleting resources, including oil, water, and minerals. Among the wealthiest nations, oil is likely to be the object of the most intensive struggle, since oil is essential for nearly all transport and trade. The race for oil began in the early 20th century and has shaped the politics and geopolitics of the Middle East and Central Asia; now that race is expanding to include the Arctic and deep oceans, such as the South China Sea.
Resource conflicts occur not just between nations, but also within societies: witness the ongoing insurgencies in the Niger Delta, where oil revenue fuels rampant political corruption while drilling leads to environmental ravages felt primarily by the Ogoni ethnic group; see also the political infighting in fracking country here in the U.S., where ecological impacts put ever-greater strains on the social fabric. Neighbors who benefit from lease payments no longer speak to neighbors who have to put up with polluted water, a blighted landscape, and the noise of thousands of trucks carrying equipment, water, and chemicals. Eventually, however, boomtowns turn to ghost towns, and nearly everyone loses.
Third, climate change and other forms of ecological degradation are likely to lead to conflict over access to places of refuge from natural disasters. The responsible agencies—including the United Nations University Institute for Environment and Human Security—point out that there are already 12 million environmental refugees worldwide, and that this number is destined to soar as extreme weather events increase in frequency and severity. Typically, when bad weather strikes, people leave their homes only as a last resort; in the worst instances they have no other option. As America learned during the Dust Bowl of the 1930s, when hundreds of thousands were displaced from farms on the prairies, rapid shifts in population due to forced migration can create economic and social stresses, including competition for scarce jobs, land, and resources, leading to discrimination and sometimes violence.
Where do refugees go when the world is already full? Growing economies are usually able to absorb immigrants and governments may even encourage immigration in order to keep wages down. But when economic growth ceases, immigrants are often seen as taking jobs away from native-born workers.
In this instance as well, conflict will appear both within and between countries. Low-lying island nations may disappear completely, and cross-border weather-driven migrations will increase dramatically. Inhabitants of coastal communities will move further inland. Farmers in drought-plagued areas will pick up stakes. But can all of these people be absorbed into shantytowns in the world’s sprawling megacities? Or will at least some of these cities themselves see an exodus of population due to an inability to maintain basic life-support services?
Lastly, climate change, water scarcity, high oil prices, vanishing credit, and the leveling off of per-hectare productivity and the amount of arable land are all combining to create the conditions for a historic food crisis, which will impact the poor first and most forcibly. High food prices breed social instability—whether in 18th century France or 21st century Egypt. As today’s high prices rise further, social instability could spread, leading to demonstrations, riots, insurgencies, and revolutions.
In summary, conflict in the decades ahead will likely center on the four factors of money, energy, land, and food. These sources of conflict will overlap in various ways. While economic inequality will not itself be at the root of all this conflict (one could argue that population growth is a deeper if often unacknowledged cause of strife), inequality does seem destined to play a role in most conflict, whether the immediate trigger is extreme weather, high food prices, or energy shortages.
This is not to say that no other sources of conflict beyond money, energy, land, and food will exist. Undoubtedly religion will provide the ostensible banner for contention in many instances. However, as so often in history, this is likely to be a secondary rather than a primary driver of discord.
This essay was originally an address to the International Conference on Sustainability, Transition and Culture Change, November 16, 2012, by Richard Heinberg
Part 2
Will increasing conflict lead to expanding violence?
Not if psychologist Steven Pinker is right. In his expansive and widely praised book The Better Angels of Our Nature: The Decline of Violence in History and Its Causes, Pinker claims that, in general, violence has waned during the past few decades. He argues that this tendency has ancient roots in our shift from peripatetic hunting and gathering to settled farming; moreover, during the past couple of centuries the trend has greatly intensified. With the emergence of Enlightenment philosophy and its respect for the individual came what Pinker calls the Humanitarian Revolution. Much more recently, after World War II, violence was suppressed first by the “mutually assured destruction” policies of the two opposed nuclear-armed sides in the Cold War, and then by American global hegemony. Pinker calls this the Long Peace. Wars have become less frequent and less violent, and most societies have seen what might be called a decline of tolerance for intolerance—whether manifested in schoolyard fights, bullying, or picking on gays and minorities.
But there is a problem with Pinker’s implied conclusion that global violence will continue to decline. The Long Peace we have known since World War II may well turn out to be shorter than hoped as world economic growth stalls and as American hegemony falters—in John Michael Greer’s words, as “the costs of maintaining a global imperial presence soar and the profits of the imperial wealth pump slump.” Books and articles predicting the end of the American empire are legion; while some merely point to the rise of China as a global rival, others describe the looming failure of the essential basis of the U.S. imperial system—the global system of oil production and trade (with its petro-dollar recycling program) centered in the Middle East. There are any number of scenarios describing how the end of empire might come, but few credible narratives explaining why it won’t.
When empires crumble, as they always do, the result is often a free-for-all among previous subject nations and potential rivals as they sort out power relations. The British Empire was a seeming exception to this rule: in that instance, the locus of military, political, and economic power simply migrated to an ally across the Atlantic. A similar graceful transfer seems unlikely in the case of the U.S., as economic decline during the 21st century will be global in scope. A better analogy to the current case might be the fall of Rome, which led to centuries of incursions by barbarians as well as uprisings in client states.
Disaster per se need not lead to violence, as Rebecca Solnit argues in her book A Paradise Built in Hell: The Extraordinary Communities that Arise in Disaster. She documents five disasters—the aftermath of Hurricane Katrina; earthquakes in San Francisco and Mexico City; a giant ship explosion in Halifax, Canada; and 9/11—and shows that rioting, looting, rape, and murder were not automatic results. Instead, for the most part, people pulled together, shared what resources they had, cared for the victims, and in many instances found new sources of joy in everyday life.
However, the kinds of social stresses we are discussing now may differ from the disasters Solnit surveys, in that they comprise a “long emergency,” to borrow James Kunstler’s durable phrase. For every heartwarming anecdote about the convergence of rescuers and caregivers on a disaster site, there is a grim historic tale of resource competition turning normal people into monsters.
We are in a race—but it’s not just an arms race; indeed, it may end up being an arms race in reverse. In many nations around the globe the means to pay for armaments and war are starting to disappear; meanwhile, however, there is increasing incentive to engage in international conflict as a way of re-channeling the energies of jobless young males and of distracting the general populace, which might otherwise be in a revolutionary mood. We can only hope that historical momentum can maintain the Long Peace until industrial nations are sufficiently bankrupt that they cannot afford to mount foreign wars on any substantial scale.
Part 3
Setting aside the discussion of international conflict, what will be the options of nations for dealing internally with economic decline?
So far, the first resort of many countries has been fiscal austerity. A shrinking economy leads to declining tax revenues, while deficit spending leads to increasing levels of government debt. If a nation controls its own currency, deficits and debt can theoretically continue to increase for some time, as has occurred in Japan since the 1980s; however, 17 individual European nations have given up their domestically-controlled currencies in favor of a common currency, the Euro, over which they have very little control. They therefore cannot follow the usual strategy of reducing the value of a domestic currency so as to reduce the weight of foreign debt. Other countries, including the U.S., are hesitant to run up colossal amounts of debt for fear that interest payments will eventually overwhelm the budget, or that inflation will eventually ensue, reducing the value of wealth held by those at the top of the economic pyramid. So, for the U.S. and much of Europe, a stagnant or contracting economy is assumed to require cuts in government spending.
There are two big problems with this tactic. The first is that it tends to shrink the economy even further and faster: as government payments dry up, citizens have even less to spend. As the economy contracts, investors tend to become skeptical that government will be able to pay off its debt, so they demand higher interest rates on government bonds. Having to pay more interest on debt, government must then cut spending even further to remain credit-worthy, which causes the economy to contract even more, and so on. This in essence is the crisis faced by the European peripheral nations. For the U.S., austerity will be the inevitable result of efforts to resolve the so-called “fiscal cliff” crisis. The need for social spending explodes when unemployment, homelessness, and malnutrition increase, while the availability of social services declines under austerity. The desired way out of this death spiral is a revival of rapid economic growth. But, as the world collides with environmental limits to growth and fiscal limits to debt, that’s simply not in the cards.
Eventually, at least some governments are likely to hit upon a different strategy: the increased provision of basic services as a way to minimize social instability. How to pay for an expansion of services in a time of over-indebtedness? Nations that control their own currencies can simply create more money without necessarily having to borrow from private banks. In the early stages, this need not lead to inflation. With energy and resources in short supply, the economy will continue to shrink no matter how much money governments spend into existence; eventually, though, more money chasing scarce commodities will force up prices. Nevertheless, up to a point, increasing government payments—for example, by providing a universal guaranteed basic income—and more equitable distribution of income—most likely achieved through progressive taxation—could reduce human misery even as the economic pie continues to shrink. Under this regime, government would play an ever-larger role in every aspect of the economy.
Where the path of austerity is followed to its bitter end, social disintegration will eventually ensue. Centralized provision of basic services might postpone social unrest; but, as available energy declines, average standards of living will erode even if government takes on more responsibilities and wealth is more evenly distributed. Under those circumstances citizens would likely eventually rebel against what they perceived as being a monolithic, inefficient, and corrupt central government. Regardless of which strategy the system’s managers choose, the scale and interconnectivity of today’s national and global systems of political organization and trade are likely to devolve.
This suggests a third approach to dealing with economic decline—the building of localized, decentralized resilience. This strategy is unlikely to be supported by national policy, and may even be discouraged by regulations and laws that undergird the authority of large corporations and the central government. Indeed, a new source of conflict is likely to arise between local communities and failing national or global power hubs as communities seek to withdraw streams of support from federal authorities. Today localism is cute, trendy, and progressive; in a few years it may achieve the status of national security threat.
If and when there is a failure of transport networks, electricity grids, and other basic infrastructures that bind modern nations together, local communities will likely be able to maintain only a fraction of current energy and material flows. We will probably never see neighbors getting together in church basements to manufacture tablet computers from scratch, though they might congregate to try to repair whatever gadgetry can still be made to work. For the most part, during the next few decades a truly local economy will be a salvage economy (as described by John Michael Greer in The Ecotechnic Future, pp. 70 ff.).
John Robb at ResilientCommunities.com disagrees with this view. He believes that new technologies like 3-D printing, powered by decentralized renewable energy production, will enable communities to manufacture everything they need to maintain a high level of communication and amenity even as total energy consumption declines. It may be too soon to tell whether Greer or Robb is offering the more accurate scenario.
In either case, a more localized future seems inevitable; thus a managed pathway to that end state would seem preferable to an unplanned path forged by the cascading failures of centralized systems.
Part 4
Are we headed toward a more autocratic or democratic future? There’s no hard and fast answer; the outcome may vary by region. However, recent history does offer some useful clues.
In his recent and important book Carbon Democracy: Political Power in the Age of Oil, Timothy Mitchell argues that modern democracy owes a lot to coal. Not only did coal fuel the railroads, which knitted large regions together, but striking coal miners were able to bring nations to a standstill, so their demands for unions, pensions, and better working conditions played a significant role in the creation of the modern welfare state. It was no mere whim that led Margaret Thatcher to crush the coal industry in Britain; she saw its demise as the indispensable precondition to neoliberalism’s triumph.
Coal was replaced, as a primary energy source, by oil. Mitchell suggests that oil offered industrial countries a path to reducing internal political pressures. Its production relied less on working-class miners and more upon university-trained geologists and engineers. Also, oil is traded globally, so that its production is influenced more by geopolitics and less by local labor strikes. “[P]oliticians saw the control of oil overseas as a means of weakening democratic forces at home,” according to Mitchell, and so it is no accident that by the late 20th century the welfare state was in retreat and oil wars in the Middle East had become almost routine. The problem of “excess democracy,” which reliance upon coal inevitably brought with it, has been successfully resolved, not surprisingly by still more teams of university-trained experts—economists, public relations professionals, war planners, political consultants, marketers, and pollsters. We have organized our political life around a new organism—“the economy”—which is expected to grow in perpetuity, or, in more practical terms, as long as the supply of oil continues to increase.
The suppression of democratic urges under an energy regime dominated by oil is also explored in Andrew Nikiforuk’s brilliant new book The Energy of Slaves: Oil and the New Servitude. The energy in oil effectively replaces human labor; as a result, each North American enjoys the services of roughly 150 “energy slaves.” But, according to Nikiforuk, that means that burning oil makes us slave masters—and slave masters all tend to mimic the same attitudes and behaviors, including contempt, arrogance, and impunity. As power addicts, we become both less sociable and easier to manipulate.
In the early 21st century, carbon democracy is ebbing, and so is the global oil regime hatched in the late 20th century. Domestic U.S. oil production based on fracking reduces the relative dominance of the Middle East petro-states, but to the advantage of Wall Street—which supplies the creative financing for speculative and marginally profitable domestic drilling. America’s oil wars have largely failed to establish and maintain the kind of order in the Middle East and Central Asia that was sought. High oil prices send dollars cascading toward energy producers, but starve the economy as a whole, and this eventually reduces petroleum demand. Governance systems appear to be incapable of solving or even seriously addressing looming financial, environmental, and resource issues, and “democracy” persists primarily in a highly diluted solution whose primary constituents are money, hype, and expert-driven opinion management.
In short, the 20th century governance system is itself fracturing. So what comes next?
As the fracking boom fails due to financial and geological constraints, a new energy regime will inevitably arise. It will almost surely be one mainly characterized by scarcity, but it will also eventually be dominated by renewable energy sources—whether solar panels or firewood. That effectively throws the door open to a range of governance possibilities. As mobility declines, smaller and more local governance systems will be more durable than empires and continent-spanning nation states. But will surviving regional and local governments end up looking like anarchist collectives or warlord compounds? Recent democratic innovations pioneered or implemented in the Arab Spring and the Occupy movement hold out more than a glimmer of hope for the former.
Anthropologist David Graeber argues that the failure of centralized governmental institutions can open the way for democratic self-organization; as evidence, he cites his own experience doing doctoral research in Madagascar villages where the state had ceased collecting taxes and providing police protection. Collecting revenues and enforcing laws are the most basic functions of government; thus these communities were effectively left to govern and provide for themselves. According to Graeber, they did surprisingly well. “[T]he people had come up with ingenious expedients of how to deal with the fact that there was still technically a government, it was just really far away. Part of the idea was never to put the authorities in a situation where they lost face, or where they had to prove that they were in charge. They were incredibly nice to [government officials] if they didn’t try to exercise power, and made things as difficult as possible if they did. The course of least resistance was [for the authorities] to go along with the charade.”
Journalism professor Greg Downey, commenting on Graeber’s ideas, notes, “I saw something very similar in camps of the Movimento Sem Terra (the MST or ‘Landless Movement’) in Brazil. Roadside shanty camps attracted former sharecroppers, poor farmers whose small plots were drowned out by hydroelectric projects, and other refugees from severe restructuring in agriculture toward large-scale corporate farming.” These farmers were victims, but they were by no means helpless. “Activists and religious leaders were helping these communities to set up their own governments, make collective decisions, and eventually occupy sprawling ranches. . . . The MST leveraged the land occupations to demand that the Brazilian government adhere to the country’s constitution, which called for agrarian reform, especially of large holdings that were the fruits of fraud. . . . [C]ommunity-based groups, even cooperatives formed by people with very little education, developed greater and greater ability to run their own lives when the state was not around. They elected their own officials, held marathon community meetings in which every member voted (even children), and, when they eventually gained land, often became thriving, tight-knit communities.”
Conflict and change in the era of economic decline: Part 5 - A theory of change for a century of crisis
If groups seeking to make the post-carbon transition go more smoothly and equitably are to have much hope of success, they need a sound strategy grounded in a realistic theory of change. Here, briefly, is a theory of change that makes sense to me.
For the past four decades, since the release of Limits to Growth, there have been many scattered efforts to develop alternatives to our current fossil-fueled, growth-based industrial paradigm. These include renewable energy systems; local, organic, and Permaculture food systems; urban design movements seeking to reduce the dominance of the automobile in our built environment; architectural programs with the goal of designing buildings that require no external energy input and that are constructed using renewable and recycled materials; alternative currencies not attached to interest-bearing debt, as well as alternative banking models; and alternative economic indicators that take account of social and environmental factors. While such efforts have achieved some small degree of implementation, varying significantly from place to place around the globe, they have generally failed to substantially reduce reliance on fossil fuels, to blunt the overall momentum of society toward increased consumption of a wide range of renewable and non-renewable materials, to reduce financial instability, or to curtail profound environmental impacts including climate change, loss of biodiversity and topsoil, and more.
What will it take for the conservers, localizers, and de-growthers to win? They have a lot stacked against them. The interests promoting a continuation of growth-as-usual are powerful and have spent decades honing advertising and public relations messages whose proliferation is subsidized by hundreds of billions of dollars annually. These interests have captured the allegiance of nearly every elected official in the world. Most ordinary folks are easily swept along because they want more and better jobs, cheaper gasoline, more flat-screen TVs, and all the other perks that come with fossil-fueled economic expansion.
The main downside to growth-as-usual is that it is unsustainable: it is destined to end in depletion of resources, economic unraveling, and environmental catastrophe. The hope of the conservers, localizers, and de-growthers must therefore be that if the growth-as-usual bandwagon cannot be turned back with persuasion, its inevitable crash will occur in increments, so that each incremental step-down in industrial output can be seized upon as an opportunity to demonstrate the need for alternatives and to promote them.
Advocates of the post-carbon crisis theory of change can point to several useful historic examples. One is the transformation of Cuba’s food system during the Special Period in the 1990s. The collapse of the Soviet Union and the resulting disappearance of subsidized Soviet oil shipments set the stage with a crisis. Several Cuban agronomists had previously advocated for more localized and organic agriculture, to no avail; but when the country was suddenly threatened with starvation, they were called upon to redesign the entire food system. The moral of the story: advocates of a post-carbon economy are likely to make limited headway during times of cheap energy and rapid economic growth, yet when push comes to shove obstacles may disappear. The Cuban example is encouraging, but it is often called into question on the grounds that what worked on an island with an authoritarian government might not work so well in a large, pluralistic democracy such as the U.S.
Paul Gilding, in his book The Great Disruption, proposes World War II as an illustration of the crisis-led theory of change: “[O]n the objective facts, Hitler represented a clear and undeniable threat long before action was taken to defeat him,” writes Gilding. “Famously, Churchill and others had long warned of this threat and been largely ignored or even ridiculed. Society remained in denial, preferring not to recognize the threat. This was because denial avoided full acceptance and what that meant—war and a strong change to the status quo. Yet once . . . denial ended, the response was swift and dramatic. Things change almost overnight. Without the benefit of a retrospective view, it would be much harder to predict when exactly the denial of Hitler’s threat would end. So it’s also hard to predict when the moment will come [when the need for action on climate change is finally recognized], even though in hindsight it will be ‘obvious.’”
Post-Fukushima Japan offers yet another example. In the wake of catastrophic nuclear plant meltdowns, the Japanese people have insisted that other reactors be idled; today only two of the nation’s atomic power plants are operating. That has left Japan with substantially less electricity than normal—enough of a shortfall that economic collapse could have resulted. Instead, businesses and households have slashed energy use, driven by a collective ethical imperative. PV systems have appeared on rooftops across the nation.
The Kansas town of Greensburg was flattened by a tornado in May 2007, but the residents—rather than drifting away or merely trying to rebuild what they had—decided instead to use insurance and government disaster aid money to build what they are calling “America’s greenest community,” emphasizing energy efficiency and using 100 percent renewable energy.
Economist Milton Friedman may have laid down a manifesto for crisis-led theories of change when he wrote: “Only a crisis—actual or perceived—produces real change. When the crisis occurs, the actions that are taken depend upon the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes politically inevitable.” In this brief passage, Friedman not only sums up the theory nicely, but also forces us to contemplate its dark side. In her book The Shock Doctrine: The Rise of Disaster Capitalism, Naomi Klein describes how Friedman and other neoliberal economists used crisis after crisis, beginning in the 1970s, as opportunities to undermine democracy and privatize institutions and infrastructure across the world. Somehow, the opportunities presented by crisis need to be seized first by citizens and communities to build local, low-carbon production and support infrastructure.
The post-carbon theory of change doesn’t seek to expedite or exacerbate crisis; instead, it encourages building resilience into societal systems in order to minimize the trauma of rapid change. Resilience is often defined as “the ability to absorb shocks, reorganize, and continue functioning.” Shocks are clearly on the way, so we should be doing what we can now to build local inventories and to disperse the control points for critical systems. We should neither simply wait around for crisis to hit, nor hope for crisis as an opportunity to alter the status quo; rather, we should do as much as possible to conserve ecosystems and to re-localize production and trade now so as to minimize the crisis—which, after all, could potentially prove overwhelming for both humanity and non-human nature. If and when crisis arrives, such preparations will be crucial in guiding response efforts and providing a basis for resisting “disaster capitalism.”
What’s the likelihood of success? It depends partly on how we define the term in this context. Many people speak of “solving” problems like climate change, as though we could make a modest investment in new technology and then carry on living essentially as we are. Implicit in the post-carbon crisis theory of change is the understanding that the way we are living now is at the heart of our problem. Success could therefore be better defined in terms of minimizing human suffering and ecological disruption as we adapt toward a very different mode of existence characterized by greatly reduced energy and materials consumption.
Some self-proclaimed “doomers” have concluded that crisis will overwhelm society no matter what we do. Many have joined the “prepper” movement, stockpiling guns and canned goods in hopes of maintaining their own households as the rest of the world comes to resemble Cormac McCarthy’s The Road. Other doomers are convinced that human extinction is inevitable and that efforts to prevent that outcome are just so much wasted motion.
I do not share either outlook. Of course there is no guarantee that crisis will open opportunities for sensible adaptation and not simply wallop us, leaving humanity and nature wounded and reeling. But for those who understand what’s coming, giving up efforts to protect nature and humanity before the going gets tough seems premature at best. There could hardly be more at stake; therefore extraordinary levels of effort and extreme persistence would appear justified if not morally mandatory. The post-carbon crisis theory of change may appear to be a strategy born of desperation. But we should hold open the possibility that it will prove surprisingly apt and effective—to the extent that we have invested our best efforts.
As we build resilience and prepare to make the most of the opportunities that come our way, it’s important that we celebrate the improvements in quality of life that come with reducing our dependency on consumption, advertising, automobiles, and all the other life-smothering accoutrements of our crumbling industrial existence. Let’s also celebrate our adaptability in times of crisis, and continually remind one another that small committed groups sometimes do make history—just as history makes them.