
The Limits of Power: The End of American Exceptionalism

by Andrew J. Bacevich
  • ISBN13: 9780805090161
  • ISBN10: 0805090169

  • Edition: 1st
  • Format: Paperback
  • Copyright: 2009-04-27
  • Publisher: Holt Paperbacks

Note: Supplemental materials are not guaranteed with Rental or Used book purchases.

Purchase Benefits

  • Free Shipping On Orders Over $35!
    Your order must be $35 or more to qualify for free economy shipping. Bulk sales, POs, Marketplace items, eBooks and apparel do not qualify for this offer.
  • Get Rewarded for Ordering Your Textbooks! Enroll Now

List Price: $18.00 (save up to $6.88)

  • Rent Book: $11.12
    Free shipping; usually ships in 24-48 hours.
    *This item is part of an exclusive publisher rental program and requires an additional convenience fee. This fee will be reflected in the shopping cart.


Summary

An immediate New York Times bestseller, The Limits of Power offers an unparalleled examination of the profound triple crisis facing America. "Bacevich speaks truth to power . . . which may be why . . . both the left and right listen to him" (Bill Moyers).

Author Biography

Andrew J. Bacevich is the author of The New American Militarism, among other books. A professor of history and international relations at Boston University, he retired from the U.S. Army with the rank of colonel. His writing has appeared in Foreign Affairs, The Atlantic Monthly, The Nation, The New York Times, The Washington Post, and The Wall Street Journal. He is the recipient of a Lannan Award and a member of the Council on Foreign Relations.

Excerpts

Chapter One 

The Crisis of Profligacy

Today, no less than in 1776, a passion for life, liberty, and the pursuit of happiness remains at the center of America’s civic theology. The Jeffersonian trinity summarizes our common inheritance, defines our aspirations, and provides the touchstone for our influence abroad.

Yet if Americans still cherish the sentiments contained in Jefferson’s Declaration of Independence, they have, over time, radically revised their understanding of those "inalienable rights." Today, individual Americans use their freedom to do many worthy things. Some read, write, paint, sculpt, compose, and play music. Others build, restore, and preserve. Still others attend plays, concerts, and sporting events, visit their local multiplexes, IM each other incessantly, and join "communities" of the like-minded in an ever-growing array of virtual worlds. They also pursue innumerable hobbies, worship, tithe, and, in commendably large numbers, attend to the needs of the less fortunate. Yet none of these in themselves define what it means to be an American in the twenty-first century.

If one were to choose a single word to characterize that identity, it would have to be more. For the majority of contemporary Americans, the essence of life, liberty, and the pursuit of happiness centers on a relentless personal quest to acquire, to consume, to indulge, and to shed whatever constraints might interfere with those endeavors. A bumper sticker, a sardonic motto, and a charge dating from the Age of Woodstock have recast the Jeffersonian trinity in modern vernacular: "Whoever dies with the most toys wins"; "Shop till you drop"; "If it feels good, do it."

It would be misleading to suggest that every American has surrendered to this ethic of self-gratification. Resistance to its demands persists and takes many forms. Yet dissenters, intent on curbing the American penchant for consumption and self-indulgence, are fighting a rear-guard action, valiant perhaps but unlikely to reverse the tide. The ethic of self-gratification has firmly entrenched itself as the defining feature of the American way of life. The point is neither to deplore nor to celebrate this fact, but simply to acknowledge it.

Others have described, dissected, and typically bemoaned the cultural—and even moral—implications of this development.1 Few, however, have considered how an American preoccupation with "more" has affected U.S. relations with the rest of the world. Yet the foreign policy implications of our present-day penchant for consumption and self-indulgence are almost entirely negative. Over the past six decades, efforts to satisfy spiraling consumer demand have given birth to a condition of profound dependency. The United States may still remain the mightiest power the world has ever seen, but the fact is that Americans are no longer masters of their own fate.

The ethic of self-gratification threatens the well-being of the United States. It does so not because Americans have lost touch with some mythical Puritan habits of hard work and self-abnegation, but because it saddles us with costly commitments abroad that we are increasingly ill-equipped to sustain while confronting us with dangers to which we have no ready response. As the prerequisites of the American way of life have grown, they have outstripped the means available to satisfy them. Americans of an earlier generation worried about bomber and missile gaps, both of which turned out to be fictitious. The present-day gap between requirements and the means available to satisfy those requirements is neither contrived nor imaginary. It is real and growing. This gap defines the crisis of American profligacy.

Power and Abundance

Placed in historical perspective, the triumph of this ethic of self-gratification hardly qualifies as a surprise. The restless search for a buck and the ruthless elimination of anyone—or anything—standing in the way of doing so have long been central to the American character. Touring the United States in the 1830s, Alexis de Tocqueville, astute observer of the young Republic, noted the "feverish ardor" of its citizens to accumulate. Yet, even as the typical American "clutches at everything," the Frenchman wrote, "he holds nothing fast, but soon loosens his grasp to pursue fresh gratifications." However munificent his possessions, the American hungered for more, an obsession that filled him with "anxiety, fear, and regret, and keeps his mind in ceaseless trepidation."2

Even in de Tocqueville’s day, satisfying such yearnings as well as easing the anxieties and fears they evoked had important policy implications. To quench their ardor, Americans looked abroad, seeking to extend the reach of U.S. power. The pursuit of "fresh gratifications" expressed itself collectively in an urge to expand, territorially and commercially. This expansionist project was already well begun when de Tocqueville’s famed Democracy in America appeared, most notably through Jefferson’s acquisition of the Louisiana territory in 1803 and through ongoing efforts to remove (or simply eliminate) Native Americans, an undertaking that continued throughout the nineteenth century.

Preferring to remember their collective story somewhat differently, Americans look to politicians to sanitize their past. When, in his 2005 inaugural address, George W. Bush identified the promulgation of freedom as "the mission that created our nation," neoconservative hearts certainly beat a little faster, as they undoubtedly did when he went on to declare that America’s "great liberating tradition" now required the United States to devote itself to "ending tyranny in our world." Yet Bush was simply putting his own gloss on a time-honored conviction ascribing to the United States a uniqueness of character and purpose. From its founding, America has expressed through its behavior and its evolution a providential purpose. Paying homage to, and therefore renewing, this tradition of American exceptionalism has long been one of the presidency’s primary extraconstitutional obligations.

Many Americans find such sentiments compelling. Yet to credit the United States with possessing a "liberating tradition" is equivalent to saying that Hollywood has a "tradition of artistic excellence." The movie business is just that—a business. Its purpose is to make money. If once in a while a studio produces a film of aesthetic value, that may be cause for celebration, but profit, not revealing truth and beauty, defines the purpose of the enterprise.

Something of the same can be said of the enterprise launched on July 4, 1776. The hardheaded lawyers, merchants, farmers, and slaveholding plantation owners gathered in Philadelphia that summer did not set out to create a church. They founded a republic. Their purpose was not to save mankind. It was to ensure that people like themselves enjoyed unencumbered access to the Jeffersonian trinity.

In the years that followed, the United States achieved remarkable success in making good on those aims. Yet never during the course of America’s transformation from a small power to a great one did the United States exert itself to liberate others—absent an overriding perception that the nation had large security or economic interests at stake.

From time to time, although not nearly as frequently as we like to imagine, some of the world’s unfortunates managed as a consequence to escape from bondage. The Civil War did, for instance, produce emancipation. Yet to explain the conflagration of 1861–65 as a response to the plight of enslaved African Americans is to engage at best in an immense oversimplification. Near the end of World War II, GIs did liberate the surviving inmates of Nazi death camps. Yet for those who directed the American war effort of 1941–45, the fate of European Jews never figured as more than an afterthought.

Crediting the United States with a "great liberating tradition" distorts the past and obscures the actual motive force behind American politics and U.S. foreign policy. It transforms history into a morality tale, thereby providing a rationale for dodging serious moral analysis. To insist that the liberation of others has never been more than an ancillary motive of U.S. policy is not cynicism; it is a prerequisite to self-understanding.

If the young United States had a mission, it was not to liberate but to expand. "Of course," declared Theodore Roosevelt in 1899, as if explaining the self-evident to the obtuse, "our whole national history has been one of expansion." TR spoke truthfully. The founders viewed stasis as tantamount to suicide. From the outset, Americans evinced a compulsion to acquire territory and extend their commercial reach abroad.

How was expansion achieved? On this point, the historical record leaves no room for debate: by any means necessary. Depending on the circumstances, the United States relied on diplomacy, hard bargaining, bluster, chicanery, intimidation, or naked coercion. We infiltrated land belonging to our neighbors and then brazenly proclaimed it our own. We harassed, filibustered, and, when the situation called for it, launched full-scale invasions. We engaged in ethnic cleansing. At times, we insisted that treaties be considered sacrosanct. On other occasions, we blithely jettisoned solemn agreements that had outlived their usefulness.

As the methods employed varied, so too did the rationales offered to justify action. We touted our status as God’s new Chosen People, erecting a "city upon a hill" destined to illuminate the world. We acted at the behest of providential guidance or responded to the urgings of our "manifest destiny." We declared our obligation to spread the gospel of Jesus Christ or to "uplift little brown brother." With Woodrow Wilson as our tutor, we shouldered our responsibility to "show the way to the nations of the world how they shall walk in the paths of liberty."3 Critics who derided these claims as bunkum—the young Lincoln during the war with Mexico, Mark Twain after the imperial adventures of 1898, Senator Robert La Follette amid "the war to end all wars"—scored points but lost the argument. Periodically revised and refurbished, American exceptionalism (which implied exceptional American prerogatives) only gained greater currency.

When it came to action rather than talk, even the policy makers viewed as most idealistic remained fixated on one overriding aim: enhancing American influence, wealth, and power. The record of U.S. foreign relations from the earliest colonial encounters with Native Americans to the end of the Cold War is neither uniquely high-minded nor uniquely hypocritical and exploitive. In this sense, the interpretations of America’s past offered by both George W. Bush and Osama bin Laden fall equally wide of the mark. As a rising power, the United States adhered to the iron laws of international politics, which allow little space for altruism. If the tale of American expansion contains a moral theme at all, that theme is necessarily one of ambiguity.

To be sure, the ascent of the United States did not occur without missteps: opéra bouffe incursions into Canada; William McKinley’s ill-advised annexation of the Philippines; complicity in China’s "century of humiliation"; disastrous post–World War I economic policies that paved the way for the Great Depression; Harry Truman’s decision in 1950 to send U.S. forces north of Korea’s Thirty-eighth Parallel; among others. Most of these blunders and bonehead moves Americans have long since shrugged off. Some, like Vietnam, we find impossible to forget even as we persistently disregard their implications.

However embarrassing, these missteps pale in significance when compared to the masterstrokes of American presidential statecraft. In purchasing Louisiana from the French, Thomas Jefferson may have overstepped the bounds of his authority and in seizing California from Mexico, James Polk may have perpetrated a war of conquest, but their actions ensured that the United States would one day become a great power. To secure the isthmus of Panama, Theodore Roosevelt orchestrated an outrageous swindle. The canal he built there affirmed America’s hemispheric dominion. In collaborating with Joseph Stalin, FDR made common cause with an indisputably evil figure. Yet this pact with the devil destroyed the murderous Hitler while vaulting the United States to a position of unquestioned global economic supremacy. A similar collaboration—forged by Richard Nixon with the murderous Mao Zedong—helped bring down the Soviet empire, thereby elevating the United States to the self-proclaimed status of "sole superpower."

The achievements of these preeminent American statesmen derived not from their common devotion to a liberating tradition but from boldness unburdened by excessive scruples. Notwithstanding the high-sounding pronouncements that routinely emanate from the White House and the State Department, the defining characteristic of U.S. foreign policy at its most successful has not been idealism, but pragmatism, frequently laced with pragmatism’s first cousin, opportunism.

What self-congratulatory textbooks once referred to as America’s "rise to power" did not unfold according to some preconceived strategy for global preeminence. There was never a secret blueprint or master plan. A keen eye for the main chance, rather than fixed principles, guided policy. If the means employed were not always pretty, the results achieved were often stunning and paid enormous dividends for the American people.

Expansion made the United States the "land of opportunity." From expansion came abundance. Out of abundance came substantive freedom. Documents drafted in Philadelphia promised liberty. Making good on those promises required a political economy that facilitated the creation of wealth on an enormous scale.

Writing over a century ago, the historian Frederick Jackson Turner made the essential point. "Not the Constitution, but free land and an abundance of natural resources open to a fit people," he wrote, made American democracy possible.4 A half century later, the historian David Potter discovered a similar symbiosis between affluence and liberty. "A politics of abundance," he claimed, had created the American way of life, "a politics which smiled both on those who valued abundance as a means to safeguard freedom and those who valued freedom as an aid in securing abundance."5 William Appleman Williams, another historian, found an even tighter correlation. For Americans, he observed, "abundance was freedom and freedom was abundance."6

In short, expansion fostered prosperity, which in turn created the environment within which Americans pursued their dreams of freedom even as they argued with one another about just who deserved to share in that dream. The promise—and reality—of ever-increasing material abundance kept that argument within bounds. As the Industrial Revolution took hold, Americans came to count on an ever-larger economic pie to anesthetize the unruly and ameliorate tensions related to class, race, religion, and ethnicity. Money became the preferred lubricant for keeping social and political friction within tolerable limits. Americans, Reinhold Niebuhr once observed, "seek a solution for practically every problem of life in quantitative terms," certain that more is better.7

This reciprocal relationship between expansion, abundance, and freedom reached its apotheosis in the immediate aftermath of World War II. Assisted mightily by the fratricidal behavior of the traditional European powers through two world wars and helped by reckless Japanese policies that culminated in the attack on Pearl Harbor, the United States emerged as a global superpower, while the American people came to enjoy a standard of living that made them the envy of the world. By 1945, the "American Century" forecast by Time-Life publisher Henry Luce only four years earlier seemed miraculously at hand. The United States was the strongest, the richest, and—in the eyes of its white majority at least—the freest nation in all the world.

Political credit for this achievement lies squarely with the Left. Abundance, sustained in no small measure by a postwar presumption of American "global leadership," made possible the expansion of freedom at home. Rebutting Soviet charges of racism and hypocrisy lent the promotion of freedom domestically a strategic dimension. Yet possibility only became reality thanks to progressive political activism.

Pick the group: blacks, Jews, women, Asians, Hispanics, working stiffs, gays, the handicapped—in every case, the impetus for providing equal access to the rights guaranteed by the Constitution originated among pinks, lefties, liberals, and bleeding-heart fellow travelers. When it came to ensuring that every American should get a fair shake, the contribution of modern conservatism has been essentially nil. Had Martin Luther King counted on William F. Buckley and the National Review to take up the fight against racial segregation in the 1950s and 1960s, Jim Crow would still be alive and well.

The president had originally intended to speak on July 5, focusing his address exclusively on energy. At the last minute, he decided to postpone it. Instead, he spent ten days sequestered at Camp David, using the time, he explained, "to reach out and listen to the voices of America." At his invitation, a host of politicians, academics, business and labor leaders, clergy, and private citizens trooped through the presidential retreat to offer their views on what was wrong with America and what Carter needed to do to set things right. The result combined a seminar of sorts with an exercise in self-flagellation.

The speech that Carter delivered when he returned to the White House bore little resemblance to the one he had planned to give ten days earlier. He began by explaining that he had decided to look beyond energy because "the true problems of our Nation are much deeper." The energy crisis of 1979, he suggested, was merely a symptom of a far greater crisis. "So, I want to speak to you first tonight about a subject even more serious than energy or inflation. I want to talk to you right now about a fundamental threat to American democracy."

In short order, Carter then proceeded to kill any chance he had of securing reelection. In American political discourse, fundamental threats are by definition external. Nazi Germany, Imperial Japan, or international communism could threaten the United States. That very year, Iran’s Islamic revolutionaries had emerged to pose another such threat. That the actions of everyday Americans might pose a comparable threat amounted to rank heresy. Yet Carter now dared to suggest that the real danger to American democracy lay within.

The nation as a whole was experiencing "a crisis of confidence," he announced. "It is a crisis that strikes at the very heart and soul and spirit of our national will. We can see this crisis in the growing doubt about the meaning of our own lives and in the loss of a unity of purpose for our nation." This erosion of confidence threatened "to destroy the social and the political fabric of America."

Americans had strayed from the path of righteousness. "In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God," the president continued,

too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we’ve discovered that owning things and consuming things does not satisfy our longing for meaning. We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.

In other words, the spreading American crisis of confidence was an outward manifestation of an underlying crisis of values. With his references to what "we’ve discovered" and what "we’ve learned," Carter implied that he was merely voicing concerns that his listeners already shared: that average Americans viewed their lives as empty, unsatisfying rituals of buying, and longed for something more meaningful.

To expect Washington to address these concerns was, he made clear, fanciful. According to the president, the federal government had become "an island," isolated from the people. Its major institutions were paralyzed and corrupt. It was "a system of government that seems incapable of action." Carter spoke of "a Congress twisted and pulled in every direction by hundreds of well financed and powerful special interests." Partisanship routinely trumped any concern for the common good: "You see every extreme position defended to the last vote, almost to the last breath by one unyielding group or another."

"We are at a turning point in our history," Carter announced.

There are two paths to choose. One is a path I’ve warned about tonight, the path that leads to fragmentation and self-interest. Down that road lies a mistaken idea of freedom, the right to grasp for ourselves some advantage over others. That path would be one of constant conflict between narrow interests ending in chaos and immobility.

The continued pursuit of this mistaken idea of freedom was "a certain route to failure." The alternative—a course consistent with "all the traditions of our past [and] all the lessons of our heritage"—pointed down "another path, the path of common purpose and the restoration of American values." Down that path, the president claimed, lay "true freedom for our Nation and ourselves."

As portrayed by Carter, the mistaken idea of freedom was quantitative: It centered on the never-ending quest for more while exalting narrow self-interest. His conception of authentic freedom was qualitative: It meant living in accordance with permanent values. At least by implication, it meant settling for less.

How Americans dealt with the question of energy, the president believed, was likely to determine which idea of freedom would prevail. "Energy will be the immediate test of our ability to unite this Nation, and it can also be the standard around which we rally." By raising that standard, Carter insisted, "we can seize control again of our common destiny." With this in mind, Carter outlined a six-point program designed to end what he called "this intolerable dependence on foreign oil." He promised action to reduce oil imports by one-half within a decade. In the near term, he vowed to establish quotas capping the amount of oil coming into the country. He called for a national effort to develop alternative energy sources. He proposed legislation mandating reductions in the amount of oil used for power generation. He advocated establishment of a new federal agency "to cut through the red tape, the delays, and the endless roadblocks to completing key energy projects." And finally, he summoned the American people to conserve: "to take no unnecessary trips, to use carpools or public transportation whenever you can, to park your car one extra day per week, to obey the speed limit, and to set your thermostats to save fuel."

Although Carter expressed confidence that the United States could one day regain its energy independence, he acknowledged that in the near term "there [was] simply no way to avoid sacrifice." Indeed, implicit in Carter’s speech was the suggestion that sacrifice just might be a good thing. For the sinner, some sort of penance must necessarily precede redemption.

The response to his address—instantly labeled the "malaise" speech although Carter never used that word—was tepid at best. Carter’s remarks had blended religiosity and populism in ways that some found off-putting. Writing in the New York Times, Francis X. Clines called it the "cross-of-malaise" speech, comparing it unfavorably to the famous "cross-of-gold" oration that had vaulted William Jennings Bryan to political prominence many decades earlier.19 Others criticized what they saw as a penchant for anguished moralizing and a tendency to find fault everywhere except in his own White House. In the New York Times Magazine, Professor Eugene Kennedy mocked "Carter Agonistes," depicting the president as a "distressed angel, passing judgment on us all, and speaking solemnly not of blood and sweat but of oil and sin."20

The relationship between World Wars III and IV becomes apparent when recalling Reagan’s policy toward Afghanistan and Iraq—the former a seemingly brilliant success that within a decade gave birth to a quagmire, the latter a cynical gambit that backfired, touching off a sequence of events that would culminate in a stupendous disaster.

As noted in the final report of the National Commission on Terrorist Attacks Upon the United States, "A decade of conflict in Afghanistan, from 1979 to 1989, gave Islamist extremists a rallying point and a training field."27 The commissioners understate the case. In Afghanistan, jihadists took on a superpower, the Soviet Union, and won. They gained immeasurably in confidence and ambition, their efforts funded in large measure by the American taxpayer.

The billions that Reagan spent funneling weapons, ammunition, and other support to the Afghan mujahideen were as nothing compared to the $1.2 trillion his administration expended modernizing U.S. military forces. Yet American policy in Afghanistan during the 1980s illustrates the Reagan Doctrine in its purest form. In the eyes of Reagan’s admirers, it was his masterstroke, a bold and successful effort to roll back the Soviet empire. The exploits of the Afghan "resistance" fired the president’s imagination, and he offered the jihadists unstinting and enthusiastic support. In designating March 21, 1982, "Afghanistan Day," for example, Reagan proclaimed, "The freedom fighters of Afghanistan are defending principles of independence and freedom that form the basis of global security and stability."28

In January 1993, President Bill Clinton inherited this situation. To his credit, alone among recent presidents Clinton managed at least on occasion to balance the federal budget. With his enthusiasm for globalization, however, the forty-second president exacerbated the underlying contradictions of the American economy. Oil imports increased by more than 50 percent during the Clinton era.33 The trade imbalance nearly quadrupled.34 Gross federal debt climbed by nearly $1.5 trillion.35 During the go-go dot-com years, however, few Americans attended to such matters.

In the Persian Gulf, Clinton’s efforts to shore up U.S. hegemony took the form of a "dual containment" policy targeting both Iran and Iraq. With regard to Iran, containment meant further isolating the Islamic republic diplomatically and economically in order to prevent the rebuilding of its badly depleted military forces. With regard to Saddam Hussein’s Iraq, it meant much the same, including fierce UN sanctions and a program of armed harassment.

During the first year of his administration, Clinton developed a prodigious appetite for bombing and, thanks to a humiliating "Blackhawk down" failure in and retreat from Somalia, an equally sharp aversion to committing ground troops. Nowhere did Clinton’s infatuation with air power find greater application than in Iraq, which he periodically pummeled with precision-guided bombs and cruise missiles. In effect, the cease-fire that terminated Operation Desert Storm in February 1991 did not end the Persian Gulf War. After a brief pause, hostilities resumed. Over time, they intensified, with the United States conducting punitive air strikes at will.

Although when it came to expending the lives of American soldiers, Clinton proved to be circumspect, he expended ordnance with abandon. During the course of his presidency, the navy and air force conducted tens of thousands of sorties into Iraqi airspace, dropped thousands of bombs, and launched hundreds of cruise missiles. Apart from turning various Iraqi military and government facilities into rubble, this cascade of pricy munitions had negligible impact. With American forces suffering not a single casualty, few Americans paid attention to what the ordnance cost or where it landed. After all, whatever the number of bombs dropped, more were always available in a seemingly inexhaustible supply.

Despite these exertions, many in Washington— Republicans and Democrats, politicians and pundits—worked themselves into a frenzy over the fact that Saddam Hussein had managed to survive, when the World’s Only Superpower now wished him gone. To fevered minds, Saddam’s defiance made him an existential threat, his mere survival an unendurable insult.

In 1998, the anti-Saddam lobby engineered passage through Congress of the Iraq Liberation Act, declaring it "the policy of the United States to seek to remove the Saddam Hussein regime from power in Iraq and to replace it with a democratic government." The legislation, passed unanimously in the Senate and by a 360–38 majority in the House, authorized that the princely sum of $100 million be dedicated to that objective. On October 31, President Clinton duly signed the act into law and issued a statement embracing the cause of freedom for all Iraqis. "I categorically reject arguments that this is unattainable due to Iraq’s history or its ethnic or sectarian make-up," the president said. "Iraqis deserve and desire freedom like everyone else."

All of this—both the gratuitous air war and the preposterously frivolous legislation—amounted to theater. Reality on the ground was another matter. A crushing sanctions regime authorized by the UN, but imposed by the United States and its allies, complicated Saddam’s life and limited the funds available from Iraqi oil, but primarily had the effect of making the wretched existence of the average Iraqi more wretched still. A 1996 UNICEF report estimated that up to half a million Iraqi children had died as a result of the sanctions. Asked to comment, Secretary of State Madeleine Albright did not even question the figure. Instead, she replied, "I think this is a very hard choice, but the price—we think the price is worth it."

No doubt Albright regretted her obtuse remark. Yet it captured something essential about U.S. policy in the Persian Gulf at a time when confidence in American power had reached its acme. In effect, the United States had forged a partnership with Saddam in imposing massive suffering on the Iraqi people. Yet as long as Americans at home were experiencing a decade of plenty—during the Clinton era, consumers enjoyed low gas prices and gorged themselves on cheap Asian imports—the price that others might be paying didn’t much matter.

Bill Clinton’s Iraq policy was both strategically misguided and morally indefensible—as ill-advised as John Kennedy’s campaign of subversion and sabotage directed against Cuba in the 1960s, as reprehensible as Richard Nixon’s illegal bombing of Laos and Cambodia in the late 1960s and 1970s. Yet unlike those actions, which occurred in secret, U.S. policy toward Iraq in the 1990s unfolded in full view of the American people. To say that the policy commanded enthusiastic popular support would be to grossly overstate the case. Yet few Americans strenuously objected—to the bombing, to congressional posturing, or to the brutal sanctions. Paying next to no attention, the great majority quietly acquiesced and thus became complicit.

American Freedom, Iraqi Freedom

To the extent that Bill Clinton’s principal critics had a problem with his Iraq policy, their chief complaint was that the United States wasn’t dropping enough bombs. Committed to their own quantitative solutions, hawkish conservatives wanted to ratchet up the level of violence. If Saddam’s survival represented an affront to American hegemony in the Gulf, then Saddam’s elimination offered the necessary corrective. Among neo-Reaganite Republicans, well before 9/11, it became an article of faith that, with Saddam’s removal, everything was certain to fall into place. Writing in the Weekly Standard in February 1998, Robert Kagan, a leading neoconservative, urged a full-scale invasion. Eliminating the Baath Party regime, he promised, was sure to "open the way for a new post-Saddam Iraq whose intentions can safely be assumed to be benign."36

The possibility that military escalation might actually exacerbate America’s Persian Gulf dilemma received scant consideration. That the citizens of the United States might ease that dilemma by modifying their own behavior—that the antidote to our ailments might lie within rather than on the other side of the world—received no consideration at all.

The events of September 11, 2001, only hardened this disposition. Among hawks, 9/11 reinforced the conviction that dominance in the Gulf was a categorical imperative. Secretary of Defense Donald Rumsfeld aptly summarized the prevailing view in October 2001: "We have two choices. Either we change the way we live, or we must change the way they live. We choose the latter."37 If, today, this black-and-white perspective seems a trifle oversimplified, between 2002 and 2004, no politician of national stature had the wit or the gumption to voice a contrary view.38

As it trained its sights on modifying the way "they" lived, the Bush administration looked to America’s armed forces as its preferred agent of change. The United States would, as Bush and his chief advisers saw it, solidify its hold on the Persian Gulf by relying in the first instance on coercion. In 1991, the president’s father had shrunk from doing what they now believed needed to be done: marching on Baghdad and "decapitating" the regime of Saddam Hussein. Throughout the remainder of that decade Clinton had temporized. Now the gloves were coming off, with Saddam’s Iraq the primary, but by no means the final, target.

Here was an imperial vision on a truly colossal scale, a worthy successor to older claims of "manifest destiny" or of an American mission to "make the world safe for democracy." President Bush’s "freedom agenda" updated and expanded upon this tradition.

One might have thought that implementing such a vision would require sustained and large-scale national commitment. Yet soon after 9/11, the American people went back to business as usual—urged to do so by the president himself. "War costs money," Franklin D. Roosevelt had reminded his countrymen after Pearl Harbor. "That means taxes and bonds and bonds and taxes. It means cutting luxuries and other non-essentials."41 At the outset of its war on terrorism, the Bush administration saw things differently. Even as the United States embarked on a global conflict expected to last decades, the president made a point of reducing taxes. Rather than asking Americans to trim their appetite for luxuries, he called on them to carry on as if nothing had occurred. Barely two weeks after the World Trade Center had collapsed, the president was prodding his fellow citizens to "Get on board. Do your business around the country. Fly and enjoy America’s great destination spots. Get down to Disney World in Florida."

Predictably, as the scope of military operations grew, especially after the invasion of Iraq in March 2003, so too did the level of military spending. During the Bush years, the Pentagon’s annual budget more than doubled, reaching $700 billion by 2008. This time, unlike in Operation Desert Storm when Germany, Japan, and friendly Gulf states ponied up tens of billions of dollars to defray the cost of U.S. operations, the burden of paying for the war fell entirely on Washington.

Less predictably, although perhaps not surprisingly, spending on entitlements also rose in the years after 9/11. Abetted by Congress, the Bush administration conducted a war of guns and butter, including huge increases in outlays for Medicare and Social Security. The federal budget once more went into the red and stayed there.

Had the administration gotten a quick win in Iraq, it might have finessed the crisis of profligacy—for a while. To put it mildly, however, the war didn’t follow its assigned script.

Between April 28, 2003, and February 22, 2006, Iraq came apart at the seams. During this interval, the adverse foreign policy implications of American profligacy became indisputable. On the former date, skittish American soldiers in Fallujah fired into a crowd of demonstrators, killing a dozen or more Iraqis. If the insurgency had a trigger, this was it. On the latter date, terrorists blew up the Mosque of the Golden Dome in Samarra, igniting an already simmering Sunni-Shiite civil war. Prior to the incident in Fallujah, the administration could still convince itself that its grand strategy remained plausible. Even a month later, swaggering White House officials were still bragging: "Anyone can go to Baghdad. Real men go to Tehran." By the time the Samarra bombing occurred, events had not dealt kindly with such fantasies. Real men were holed up in Baghdad’s heavily fortified Green Zone.

As conditions in Iraq worsened, the disparity between pretensions and capacities became painfully evident. A generation of profligacy had produced strategic insolvency. The administration had counted on the qualitative superiority of U.S. forces compensating for their limited numbers. The enemy did not cooperate.

Although the United States is a wealthy nation with a population of over 300 million, closing the gap between means and ends posed a daunting task. By February 2005, this was so apparent that Los Angeles Times columnist Max Boot was suggesting that the armed forces "open up recruiting stations from Budapest to Bangkok, Cape Town to Cairo, Montreal to Mexico City." Boot’s suggestion that the Bush administration raise up a "Freedom Legion" of foreign mercenaries inadvertently illustrated the depth of the problem.47 If the Pentagon needed to comb the streets of Cape Town and Cairo to fill its ranks, the situation was indeed dire.

The United States had a shortage of soldiers; it also lacked funds. The longer the wars in Iraq and Afghanistan dragged on, the more costly they became. By 2007, to sustain its operations, the U.S. command in Baghdad was burning through $3 billion per week. That same year, the overall costs of the Iraq War topped the $500 billion mark, with some estimates already suggesting that the final bill could reach at least $2 trillion.48

Although these figures were widely reported, they had almost no political impact in Washington, indicating the extent to which habits of profligacy had become entrenched. Congress responded to budget imbalances not by trimming spending or increasing revenues but by quietly and repeatedly raising the debt ceiling—by $3.015 trillion between 2002 and 2006.49 Future generations could figure out how to pay the bills.

All this red ink generated nervous speculation about a coming economic collapse comparable in magnitude to the Great Depression.50 Whatever the merit of such concerns, the interest here is not in what may yet happen to the American economy but in what has already occurred to its foreign policy.

By 2007, the United States was running out of troops and was already out of money. According to conventional wisdom, when it came to Iraq, there were "no good options." Yet Americans had limited the range of possible options by their stubborn insistence that the remedy to the nation’s problems in the Persian Gulf necessarily lay in the Persian Gulf rather than at home. The slightest suggestion that the United States ought to worry less about matters abroad and more about setting its own house in order elicited from the political elite, Republicans and Democrats alike, shrieks of "isolationism," the great imaginary sin to which Americans are allegedly prone. Yet to begin to put our house in order would be to open up a whole new array of options, once again permitting the United States to "choose peace or war, as our interest, guided by justice, shall counsel."

Long accustomed to thinking of the United States as a superpower, Americans have yet to realize that they have forfeited command of their own destiny. The reciprocal relationship between expansionism, abundance, and freedom—each reinforcing the other—no longer exists. If anything, the reverse is true: Expansionism squanders American wealth and power, while putting freedom at risk. As a consequence, the strategic tradition to which Jefferson and Polk, Lincoln and McKinley, TR and FDR all subscribed has been rendered not only obsolete but pernicious.

Rather than confronting this reality head-on, American grand strategy since the era of Ronald Reagan, and especially throughout the era of George W. Bush, has been characterized by attempts to wish reality away. Policy makers have been engaged in a de facto Ponzi scheme intended to extend indefinitely the American line of credit. The fiasco of the Iraq War and the quasi-permanent U.S. occupation of Afghanistan illustrate the results and prefigure what is yet to come if the crisis of American profligacy continues unabated.

Excerpted from The Limits of Power by Andrew J. Bacevich

Copyright © 2009 by Andrew J. Bacevich

Published in 2009 by Henry Holt and Company, LLC

All rights reserved. This work is protected under copyright laws and reproduction is strictly prohibited. Permission to reproduce the material in any manner or medium must be secured from the Publisher.

Supplemental Materials

What is included with this book?

The New copy of this book will include any supplemental materials advertised. Please check the title of the book to determine if it should include any access cards, study guides, lab manuals, CDs, etc.

The Used, Rental and eBook copies of this book are not guaranteed to include any supplemental materials. Typically, only the book itself is included. This is true even if the title states it includes any access cards, study guides, lab manuals, CDs, etc.

Excerpts

Chapter One The Crisis of ProfligacyToday, no less than in 1776, a passion for life, liberty, and the pursuit of happiness remains at the center of America’s civic theology. The Jeffersonian trinity summarizes our common inheritance, defines our aspirations, and provides the touchstone for our influence abroad.Yet if Americans still cherish the sentiments contained in Jefferson’s Declaration of Independence, they have, over time, radically revised their understanding of those "inalienable rights." Today, individual Americans use their freedom to do many worthy things. Some read, write, paint, sculpt, compose, and play music. Others build, restore, and preserve. Still others attend plays, concerts, and sporting events, visit their local multiplexes, IM each other incessantly, and join "communities" of the like- minded in an ever- growing array of virtual worlds. They also pursue innumerable hobbies, worship, tithe, and, in commendably large numbers, attend to the needs of the less fortunate. Yet none of these in themselves define what it means to be an American in the twenty-first century.If one were to choose a single word to characterize that identity, it would have to be more. For the majority of contemporary Americans, the essence of life, liberty, and the pursuit of happiness centers on a relentless personal quest to acquire, to consume, to indulge, and to shed whatever constraints might interfere with those endeavors. A bumper sticker, a sardonic motto, and a charge dating from the Age of Woodstock have recast the Jeffersonian trinity in modern vernacular: "Whoever dies with the most toys wins"; "Shop till you drop"; "If it feels good, do it."It would be misleading to suggest that every American has surrendered to this ethic of self- gratification. Resistance to its demands persists and takes many forms. Yet dissenters, intent on curbing the American penchant for consumption and self- indulgence, are fighting a rear- guard action, valiant perhaps but unlikely to reverse the tide. The ethic of self- gratification has firmly entrenched itself as the defining feature of the American way of life. The point is neither to deplore nor to celebrate this fact, but simply to acknowledge it.Others have described, dissected, and typically bemoaned the cultural—and even moral—implications of this development.1 Few, however, have considered how an American preoccupation with "more" has affected U.S. relations with rest of the world. Yet the foreign policy implications of our present- day penchant for consumption and self- indulgence are almost entirely negative. Over the past six decades, efforts to satisfy spiraling consumer demand have given birth to a condition of profound de pen den cy. The United States may still remain the mightiest power the world has ever seen, but the fact is that Americans are no longer masters of their own fate.The ethic of self- gratification threatens the well- being of the United States. It does so not because Americans have lost touch with some mythical Puritan habits of hard work and self- abnegation, but because it saddles us with costly commitments abroad that we are increasingly ill- equipped to sustain while confronting us with dangers to which we have no ready response. As the prerequisites of the American way of life have grown, they have outstripped the means available to satisfy them. Americans of an earlier generation worried about bomber and missile gaps, both of which turned out to be fictitious. 
The present- day gap between requirements and the means available to satisfy those requirements is neither contrived nor imaginary. It is real and growing. This gap defines the crisis of American profligacy.Power and AbundancePlaced in historical perspective, the triumph of this ethic of self- gratification hardly qualifies as a surprise. The restless search for a buck and the ruthless elimination of anyone—or anything—standing in the way of doing so have long been central to the American character. Touring the United States in the 1830s, Alexis de Tocqueville, astute observer of the young Republic, noted the "feverish ardor" of its citizens to accumulate. Yet, even as the typical American "clutches at everything," the Frenchman wrote, "he holds nothing fast, but soon loosens his grasp to pursue fresh gratifications." However munificent his possessions, the American hungered for more, an obsession that filled him with "anxiety, fear, and regret, and keeps his mind in ceaseless trepidation."2Even in de Tocqueville’s day, satisfying such yearnings as well as easing the anxieties and fears they evoked had important policy implications. To quench their ardor, Americans looked abroad, seeking to extend the reach of U.S. power. The pursuit of "fresh gratifications" expressed itself collectively in an urge to expand, territorially and commercially. This expansionist project was already well begun when de Tocqueville’s famed Democracy in America appeared, most notably through Jefferson’s acquisition of the Louisiana territory in 1803 and through ongoing efforts to remove (or simply eliminate) Native Americans, an undertaking that continued throughout the nineteenth century.Preferring to remember their collective story somewhat differently, Americans look to politicians to sanitize their past. When, in his 2005 inaugural address, George W. Bush identified the promulgation of freedom as "the mission that created our nation," neoconservative hearts certainly beat a little faster, as they undoubtedly did when he went on to declare that America’s "great liberating tradition" now required the United States to devote itself to "ending tyranny in our world." Yet Bush was simply putting his own gloss on a time- honored conviction ascribing to the United States a uniqueness of character and purpose. From its founding, America has expressed through its behavior and its evolution a providential purpose. Paying homage to, and therefore renewing, this tradition of American exceptionalism has long been one of the presidency’s primary extra constitutional obligations.Many Americans find such sentiments compelling. Yet to credit the United States with possessing a "liberating tradition" is equivalent to saying that Hollywood has a "tradition of artistic excellence." The movie business is just that—a business. Its purpose is to make money. If once in a while a studio produces a .lm of aesthetic value, that may be cause for celebration, but profit, not revealing truth and beauty, defines the purpose of the enterprise.Something of the same can be said of the enterprise launched on July 4, 1776. The hardheaded lawyers, merchants, farmers, and slaveholding plantation owners gathered in Philadelphia that summer did not set out to create a church. They founded a republic. Their purpose was not to save mankind. It was to ensure that people like themselves enjoyed unencumbered access to the Jeffersonian trinity.In the years that followed, the United States achieved remarkable success in making good on those aims. 
Yet never during the course of America’s transformation from a small power to a great one did the United States exert itself to liberate others—absent an overriding perception that the nation had large security or economic interests at stake.From time to time, although not nearly as frequently as we like to imagine, some of the world’s unfortunates managed as a consequence to escape from bondage. The Civil War did, for instance, produce emancipation. Yet to explain the conflagration of 1861–65 as a response to the plight of enslaved African Americans is to engage at best in an immense oversimplification. Near the end of World War II, GIs did liberate the surviving inmates of Nazi death camps. Yet for those who directed the American war effort of 1941–45, the fate of European Jews never figured as more than an afterthought.Crediting the United States with a "great liberating tradition" distorts the past and obscures the actual motive force behind American politics and U.S. foreign policy. It transforms history into a morality tale, thereby providing a rationale for dodging serious moral analysis. To insist that the liberation of others has never been more than an ancillary motive of U.S. policy is not cynicism; it is a prerequisite to self-understanding.If the young United States had a mission, it was not to liberate but to expand. "Of course," declared Theodore Roosevelt in 1899, as if explaining the self- evident to the obtuse, "our whole national history has been one of expansion." TR spoke truthfully. The founders viewed stasis as tantamount to suicide. From the outset, Americans evinced a compulsion to acquire territory and extend their commercial reach abroad.How was expansion achieved? On this point, the historical record leaves no room for debate: by any means necessary. Depending on the circumstances, the United States relied on diplomacy, hard bargaining, bluster, chicanery, intimidation, or naked coercion. We infiltrated land belonging to our neighbors and then brazenly proclaimed it our own. We harassed, filibustered, and, when the situation called for it, launched full- scale invasions. We engaged in ethnic cleansing. At times, we insisted that treaties be considered sacrosanct. On other occasions, we blithely jettisoned solemn agreements that had outlived their usefulness.As the methods employed varied, so too did the rationales offered to justify action. We touted our status as God’s new Chosen People, erecting a "city upon a hill" destined to illuminate the world. We acted at the behest of providential guidance or responded to the urgings of our "manifest destiny." We declared our obligation to spread the gospel of Jesus Christ or to "uplift little brown brother." With Woodrow Wilson as our tutor, we shouldered our responsibility to "show the way to the nations of the world how they shall walk in the paths of liberty."3 Critics who derided these claims as bunkum—the young Lincoln during the war with Mexico, Mark Twain after the imperial adventures of 1898,Senator Robert La Follette amid "the war to end all wars"— scored points but lost the argument. Periodically revised and refurbished, American exceptionalism (which implied exceptional American prerogatives) only gained greater currency.When it came to action rather than talk, even the policy makers viewed as most idealistic remained fixated on one overriding aim: enhancing American influence, wealth, and power. The record of U.S. 
foreign relations from the earliest colonial encounters with Native Americans to the end of the Cold War is neither uniquely high- minded nor uniquely hypocritical and exploitive. In this sense, the interpretations of America’s past offered by both George W. Bush and Osama bin Laden fall equally wide of the mark. As a rising power, the United States adhered to the iron laws of international politics, which allow little space for altruism. If the tale of American expansion contains a moral theme at all, that theme is necessarily one of ambiguity.To be sure, the ascent of the United States did not occur without missteps: opéra bouffe incursions into Canada; William McKinley’s ill- advised annexation of the Philippines; complicity in China’s "century of humiliation"; disastrous post–World War I economic policies that paved the way for the Great Depression; Harry Truman’s decision in 1950 to send U.S. forces north of Korea’s Thirty- eighth Parallel; among others. Most of these blunders and bonehead moves Americans have long since shrugged off. Some, like Vietnam, we find impossible to forget even as we persistently disregard their implications.However embarrassing, these missteps pale in significance when compared to the masterstrokes of American presidential statecraft. In purchasing Louisiana from the French, Thomas Jefferson may have overstepped the bounds of his authority and in seizing California from Mexico, James Polk may have perpetrated a war of conquest, but their actions ensured that the United States would one day become a great power. To secure the isthmus of Panama, Theodore Roosevelt orchestrated an outrageous swindle. The canal he built there affirmed America’s hemispheric dominion. In collaborating with Joseph Stalin, FDR made common cause with an indisputably evil figure. Yet this pact with the devil destroyed the murderous Hitler while vaulting the United States to a position of unquestioned global economic supremacy. A similar collaboration—forged by Richard Nixon with the murderous Mao Zedong—helped bring down the Soviet empire, thereby elevating the United States to the self- proclaimed status of "sole superpower."The achievements of these preeminent American statesmen derived not from their common devotion to a liberating tradition but from boldness unburdened by excessive scruples. Notwithstanding the high- sounding pronouncements that routinely emanate from the White House and the State Department, the defining characteristic of U.S. foreign policy at its most successful has not been idealism, but pragmatism, frequently laced with pragmatism’s first cousin, opportunism.What self- congratulatory textbooks once referred to as America’s "rise to power" did not unfold according to some preconceived strategy for global preeminence. There was never a secret blueprint or master plan. A keen eye for the main chance, rather than fixed principles, guided policy. If the means employed were not always pretty, the results achieved were often stunning and paid enormous dividends for the American people.Expansion made the United States the "land of opportunity." From expansion came abundance. Out of abundance came substantive freedom. Documents drafted in Philadelphia promised liberty. Making good on those promises required a political economy that facilitated the creation of wealth on an enormous scale.Writing over a century ago, the historian Frederick Jackson Turner made the essential point. 
"Not the Constitution, but free land and an abundance of natural resources open to a .t people," he wrote, made American democracy possible.4 A half century later, the historian David Potter discovered a similar symbiosis between affluence and liberty. "A politicsof abundance," he claimed, had created the American way of life, "a politics which smiled both on those who valued abundance as a means to safeguard freedom and those who valued freedom as an aid in securing abundance."5 William Apple man Williams, another historian, found an even tighter correlation. For Americans, he observed, "abundance was freedom and freedom was abundance."6In short, expansion fostered prosperity, which in turn created the environment within which Americans pursued their dreams of freedom even as they argued with one another about just who deserved to share in that dream. The promise—and reality—of ever-increasing material abundance kept that argument within bounds. As the Industrial Revolution took hold, Americans came to count on an ever-larger economic pie to anesthetize the unruly and ameliorate tensions related to class, race, religion, and ethnicity. Money became the preferred lubricant for keeping social and political friction within tolerable limits. Americans, Reinhold Niebuhr once observed, "seek a solution for practically every problem of life in quantitative terms," certain that more is better.7This reciprocal relationship between expansion, abundance, and freedom reached its apotheosis in the immediate aftermath of World War II. Assisted mightily by the fratricidal behavior of the traditional Europe an powers through two world wars and helped by reckless Japanese policies that culminated in the attack on Pearl Harbor, the United States emerged as a global superpower, while the American people came to enjoy a standard of living that made them the envy of the world. By 1945, the "American Century" forecast by Time-Life publisher Henry Luce only four years earlier seemed miraculously at hand. The United States was the strongest, the richest, and—in the eyes of its white majority at least—the freest nation in all the world.Political credit for this achievement lies squarely with the Left. Abundance, sustained in no small mea sure by a postwar presumption of American "global leadership," made possible the expansion of freedom at home. Rebutting Soviet charges of racism and hypocrisy lent the promotion of freedom domestically a strategic dimension. Yet possibility only became reality thanks to progressive political activism.Pick the group: blacks, Jews, women, Asians, Hispanics, working stiffs, gays, the handicapped—in every case, the impetus for providing equal access to the rights guaranteed by the Constitution originated among pinks, lefties, liberals, and bleeding- heart fellow travelers. When it came to ensuring that every American should get a fair shake, the contribution of modern conservatism has been essentially nil. Had Martin Luther King counted on William F. Buckley and the National Review to take up the fight against racial segregation in the 1950s and 1960s, Jim Crow would still be alive and well.The president had originally intended to speak on July 5, focusing his address exclusively on energy. At the last minute, he decided to postpone it. Instead, he spent ten days sequestered at Camp David, using the time, he explained, "to reach out and listen to the voices of America." 
At his invitation, a host of politicians, academics, business and labor leaders, clergy, and private citizens trooped through the presidential retreat to offer their views on what was wrong with America and what Carter needed to do to set things right. The result combined a seminar of sorts with an exercise in self-flagellation.

The speech that Carter delivered when he returned to the White House bore little resemblance to the one he had planned to give ten days earlier. He began by explaining that he had decided to look beyond energy because “the true problems of our Nation are much deeper.” The energy crisis of 1979, he suggested, was merely a symptom of a far greater crisis. “So, I want to speak to you first tonight about a subject even more serious than energy or inflation. I want to talk to you right now about a fundamental threat to American democracy.”

In short order, Carter then proceeded to kill any chance he had of securing reelection. In American political discourse, fundamental threats are by definition external. Nazi Germany, Imperial Japan, or international communism could threaten the United States. That very year, Iran’s Islamic revolutionaries had emerged to pose another such threat. That the actions of everyday Americans might pose a comparable threat amounted to rank heresy. Yet Carter now dared to suggest that the real danger to American democracy lay within.

The nation as a whole was experiencing “a crisis of confidence,” he announced. “It is a crisis that strikes at the very heart and soul and spirit of our national will. We can see this crisis in the growing doubt about the meaning of our own lives and in the loss of a unity of purpose for our nation.” This erosion of confidence threatened “to destroy the social and the political fabric of America.”

Americans had strayed from the path of righteousness. “In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God,” the president continued, “too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we’ve discovered that owning things and consuming things does not satisfy our longing for meaning. We’ve learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.”

In other words, the spreading American crisis of confidence was an outward manifestation of an underlying crisis of values. With his references to what “we’ve discovered” and what “we’ve learned,” Carter implied that he was merely voicing concerns that his listeners already shared: that average Americans viewed their lives as empty, unsatisfying rituals of buying, and longed for something more meaningful.

To expect Washington to address these concerns was, he made clear, fanciful. According to the president, the federal government had become “an island,” isolated from the people. Its major institutions were paralyzed and corrupt. It was “a system of government that seems incapable of action.” Carter spoke of “a Congress twisted and pulled in every direction by hundreds of well-financed and powerful special interests.” Partisanship routinely trumped any concern for the common good: “You see every extreme position defended to the last vote, almost to the last breath by one unyielding group or another.”

“We are at a turning point in our history,” Carter announced. “There are two paths to choose. One is a path I’ve warned about tonight, the path that leads to fragmentation and self-interest. Down that road lies a mistaken idea of freedom, the right to grasp for ourselves some advantage over others. That path would be one of constant conflict between narrow interests ending in chaos and immobility.”

The continued pursuit of this mistaken idea of freedom was “a certain route to failure.” The alternative—a course consistent with “all the traditions of our past [and] all the lessons of our heritage”—pointed down “another path, the path of common purpose and the restoration of American values.” Down that path, the president claimed, lay “true freedom for our Nation and ourselves.”

As portrayed by Carter, the mistaken idea of freedom was quantitative: It centered on the never-ending quest for more while exalting narrow self-interest. His conception of authentic freedom was qualitative: It meant living in accordance with permanent values. At least by implication, it meant settling for less.

How Americans dealt with the question of energy, the president believed, was likely to determine which idea of freedom would prevail. “Energy will be the immediate test of our ability to unite this Nation, and it can also be the standard around which we rally.” By raising that standard, Carter insisted, “we can seize control again of our common destiny.” With this in mind, Carter outlined a six-point program designed to end what he called “this intolerable dependence on foreign oil.” He promised action to reduce oil imports by one-half within a decade. In the near term, he vowed to establish quotas capping the amount of oil coming into the country. He called for a national effort to develop alternative energy sources. He proposed legislation mandating reductions in the amount of oil used for power generation. He advocated establishment of a new federal agency “to cut through the red tape, the delays, and the endless roadblocks to completing key energy projects.” And finally, he summoned the American people to conserve: “to take no unnecessary trips, to use carpools or public transportation whenever you can, to park your car one extra day per week, to obey the speed limit, and to set your thermostats to save fuel.”

Although Carter expressed confidence that the United States could one day regain its energy independence, he acknowledged that in the near term “there [was] simply no way to avoid sacrifice.” Indeed, implicit in Carter’s speech was the suggestion that sacrifice just might be a good thing. For the sinner, some sort of penance must necessarily precede redemption.

The response to his address—instantly labeled the “malaise” speech although Carter never used that word—was tepid at best. Carter’s remarks had blended religiosity and populism in ways that some found off-putting. Writing in the New York Times, Francis X. Clines called it the “cross-of-malaise” speech, comparing it unfavorably to the famous “cross-of-gold” oration that had vaulted William Jennings Bryan to political prominence many decades earlier.19 Others criticized what they saw as a penchant for anguished moralizing and a tendency to find fault everywhere except in his own White House.
In the New York Times Magazine, Professor Eugene Kennedy mocked “Carter Agonistes,” depicting the president as a “distressed angel, passing judgment on us all, and speaking solemnly not of blood and sweat but of oil and sin.”20

The relationship between World Wars III and IV becomes apparent when recalling Reagan’s policy toward Afghanistan and Iraq—the former a seemingly brilliant success that within a decade gave birth to a quagmire, the latter a cynical gambit that backfired, touching off a sequence of events that would culminate in a stupendous disaster.

As noted in the final report of the National Commission on Terrorist Attacks Upon the United States, “A decade of conflict in Afghanistan, from 1979 to 1989, gave Islamist extremists a rallying point and a training field.”27 The commissioners understate the case. In Afghanistan, jihadists took on a superpower, the Soviet Union, and won. They gained immeasurably in confidence and ambition, their efforts funded in large measure by the American taxpayer.

The billions that Reagan spent funneling weapons, ammunition, and other support to the Afghan mujahideen were as nothing compared to the $1.2 trillion his administration expended modernizing U.S. military forces. Yet American policy in Afghanistan during the 1980s illustrates the Reagan Doctrine in its purest form. In the eyes of Reagan’s admirers, it was his masterstroke, a bold and successful effort to roll back the Soviet empire. The exploits of the Afghan “resistance” fired the president’s imagination, and he offered the jihadists unstinting and enthusiastic support. In designating March 21, 1982, “Afghanistan Day,” for example, Reagan proclaimed, “The freedom fighters of Afghanistan are defending principles of independence and freedom that form the basis of global security and stability.”28

In January 1993, President Bill Clinton inherited this situation. To his credit, alone among recent presidents Clinton managed at least on occasion to balance the federal budget. With his enthusiasm for globalization, however, the forty-second president exacerbated the underlying contradictions of the American economy. Oil imports increased by more than 50 percent during the Clinton era.33 The trade imbalance nearly quadrupled.34 Gross federal debt climbed by nearly $1.5 trillion.35 During the go-go dot-com years, however, few Americans attended to such matters.

In the Persian Gulf, Clinton’s efforts to shore up U.S. hegemony took the form of a “dual containment” policy targeting both Iran and Iraq. With regard to Iran, containment meant further isolating the Islamic republic diplomatically and economically in order to prevent the rebuilding of its badly depleted military forces. With regard to Saddam Hussein’s Iraq, it meant much the same, including fierce UN sanctions and a program of armed harassment.

During the first year of his administration, Clinton developed a prodigious appetite for bombing and, thanks to a humiliating “Black Hawk down” failure in and retreat from Somalia, an equally sharp aversion to committing ground troops. Nowhere did Clinton’s infatuation with air power find greater application than in Iraq, which he periodically pummeled with precision-guided bombs and cruise missiles. In effect, the cease-fire that terminated Operation Desert Storm in February 1991 did not end the Persian Gulf War. After a brief pause, hostilities resumed.
Over time, they intensified, with the United States conducting punitive air strikes at will.

Although when it came to expending the lives of American soldiers, Clinton proved to be circumspect, he expended ordnance with abandon. During the course of his presidency, the navy and air force conducted tens of thousands of sorties into Iraqi airspace, dropped thousands of bombs, and launched hundreds of cruise missiles. Apart from turning various Iraqi military and government facilities into rubble, this cascade of pricy munitions had negligible impact. With American forces suffering not a single casualty, few Americans paid attention to what the ordnance cost or where it landed. After all, whatever the number of bombs dropped, more were always available in a seemingly inexhaustible supply.

Despite these exertions, many in Washington—Republicans and Democrats, politicians and pundits—worked themselves into a frenzy over the fact that Saddam Hussein had managed to survive, when the World’s Only Superpower now wished him gone. To fevered minds, Saddam’s defiance made him an existential threat, his mere survival an unendurable insult.

In 1998, the anti-Saddam lobby engineered passage through Congress of the Iraq Liberation Act, declaring it “the policy of the United States to seek to remove the Saddam Hussein regime from power in Iraq and to replace it with a democratic government.” The legislation, passed unanimously in the Senate and by a 360–38 majority in the House, authorized that the princely sum of $100 million be dedicated to that objective. On October 31, President Clinton duly signed the act into law and issued a statement embracing the cause of freedom for all Iraqis. “I categorically reject arguments that this is unattainable due to Iraq’s history or its ethnic or sectarian make-up,” the president said. “Iraqis deserve and desire freedom like everyone else.”

All of this—both the gratuitous air war and the preposterously frivolous legislation—amounted to theater. Reality on the ground was another matter. A crushing sanctions regime authorized by the UN, but imposed by the United States and its allies, complicated Saddam’s life and limited the funds available from Iraqi oil, but primarily had the effect of making the wretched existence of the average Iraqi more wretched still. A 1996 UNICEF report estimated that up to half a million Iraqi children had died as a result of the sanctions. Asked to comment, Secretary of State Madeleine Albright did not even question the figure. Instead, she replied, “I think this is a very hard choice, but the price—we think the price is worth it.”

No doubt Albright regretted her obtuse remark. Yet it captured something essential about U.S. policy in the Persian Gulf at a time when confidence in American power had reached its acme. In effect, the United States had forged a partnership with Saddam in imposing massive suffering on the Iraqi people. Yet as long as Americans at home were experiencing a decade of plenty—during the Clinton era, consumers enjoyed low gas prices and gorged themselves on cheap Asian imports—the price that others might be paying didn’t much matter.

Bill Clinton’s Iraq policy was both strategically misguided and morally indefensible—as ill-advised as John Kennedy’s campaign of subversion and sabotage directed against Cuba in the 1960s, as reprehensible as Richard Nixon’s illegal bombing of Laos and Cambodia in the late 1960s and 1970s. Yet unlike those actions, which occurred in secret, U.S. policy toward Iraq in the 1990s unfolded in full view of the American people. To say that the policy commanded enthusiastic popular support would be to grossly overstate the case. Yet few Americans strenuously objected—to the bombing, to congressional posturing, or to the brutal sanctions. Paying next to no attention, the great majority quietly acquiesced and thus became complicit.

American Freedom, Iraqi Freedom

To the extent that Bill Clinton’s principal critics had a problem with his Iraq policy, their chief complaint was that the United States wasn’t dropping enough bombs. Committed to their own quantitative solutions, hawkish conservatives wanted to ratchet up the level of violence. If Saddam’s survival represented an affront to American hegemony in the Gulf, then Saddam’s elimination offered the necessary corrective. Among neo-Reaganite Republicans, well before 9/11, it became an article of faith that, with Saddam’s removal, everything was certain to fall into place. Writing in the Weekly Standard in February 1998, Robert Kagan, a leading neoconservative, urged a full-scale invasion. Eliminating the Baath Party regime, he promised, was sure to “open the way for a new post-Saddam Iraq whose intentions can safely be assumed to be benign.”36

The possibility that military escalation might actually exacerbate America’s Persian Gulf dilemma received scant consideration. That the citizens of the United States might ease that dilemma by modifying their own behavior—that the antidote to our ailments might lie within rather than on the other side of the world—received no consideration at all.

The events of September 11, 2001, only hardened this disposition. Among hawks, 9/11 reinforced the conviction that dominance in the Gulf was a categorical imperative. Secretary of Defense Donald Rumsfeld aptly summarized the prevailing view in October 2001: “We have two choices. Either we change the way we live, or we must change the way they live. We choose the latter.”37 If, today, this black-and-white perspective seems a trifle oversimplified, between 2002 and 2004, no politician of national stature had the wit or the gumption to voice a contrary view.38

As it trained its sights on modifying the way “they” lived, the Bush administration looked to America’s armed forces as its preferred agent of change. The United States would, as Bush and his chief advisers saw it, solidify its hold on the Persian Gulf by relying in the first instance on coercion. In 1991, the president’s father had shrunk from doing what they now believed needed to be done: marching on Baghdad and “decapitating” the regime of Saddam Hussein. Throughout the remainder of that decade Clinton had temporized. Now the gloves were coming off, with Saddam’s Iraq the primary, but by no means the final, target.

Here was an imperial vision on a truly colossal scale, a worthy successor to older claims of “manifest destiny” or of an American mission to “make the world safe for democracy.” President Bush’s “freedom agenda” updated and expanded upon this tradition.

One might have thought that implementing such a vision would require sustained and large-scale national commitment. Yet soon after 9/11, the American people went back to business as usual—urged to do so by the president himself. “War costs money,” Franklin D. Roosevelt had reminded his countrymen after Pearl Harbor. “That means taxes and bonds and bonds and taxes. It means cutting luxuries and other non-essentials.”41 At the outset of its war on terrorism, the Bush administration saw things differently. Even as the United States embarked on a global conflict expected to last decades, the president made a point of reducing taxes. Rather than asking Americans to trim their appetite for luxuries, he called on them to carry on as if nothing had occurred. Barely two weeks after the World Trade Center had collapsed, the president was prodding his fellow citizens to “Get on board. Do your business around the country. Fly and enjoy America’s great destination spots. Get down to Disney World in Florida.”

Predictably, as the scope of military operations grew, especially after the invasion of Iraq in March 2003, so too did the level of military spending. During the Bush years, the Pentagon’s annual budget more than doubled, reaching $700 billion by 2008. This time, unlike in Operation Desert Storm, when Germany, Japan, and friendly Gulf states ponied up tens of billions of dollars to defray the cost of U.S. operations, the burden of paying for the war fell entirely on Washington.

Less predictably, although perhaps not surprisingly, spending on entitlements also rose in the years after 9/11. Abetted by Congress, the Bush administration conducted a war of guns and butter, including huge increases in outlays for Medicare and Social Security. The federal budget once more went into the red and stayed there.

Had the administration gotten a quick win in Iraq, it might have finessed the crisis of profligacy—for a while. To put it mildly, however, the war didn’t follow its assigned script.

Between April 28, 2003, and February 22, 2006, Iraq came apart at the seams. During this interval, the adverse foreign policy implications of American profligacy became indisputable. On the former date, skittish American soldiers in Fallujah fired into a crowd of demonstrators, killing a dozen or more Iraqis. If the insurgency had a trigger, this was it. On the latter date, terrorists blew up the Mosque of the Golden Dome in Samarra, igniting an already simmering Sunni-Shiite civil war. Prior to the incident in Fallujah, the administration could still convince itself that its grand strategy remained plausible. Even a month later, swaggering White House officials were still bragging: “Anyone can go to Baghdad. Real men go to Tehran.” By the time the Samarra bombing occurred, events had not dealt kindly with such fantasies. Real men were holed up in Baghdad’s heavily fortified Green Zone.

As conditions in Iraq worsened, the disparity between pretensions and capacities became painfully evident. A generation of profligacy had produced strategic insolvency. The administration had counted on the qualitative superiority of U.S. forces compensating for their limited numbers. The enemy did not cooperate.

Although the United States is a wealthy nation with a population of over 300 million, closing the gap between means and ends posed a daunting task. By February 2005, this was so apparent that Los Angeles Times columnist Max Boot was suggesting that the armed forces “open up recruiting stations from Budapest to Bangkok, Cape Town to Cairo, Montreal to Mexico City.” Boot’s suggestion that the Bush administration raise up a “Freedom Legion” of foreign mercenaries inadvertently illustrated the depth of the problem.47 If the Pentagon needed to comb the streets of Cape Town and Cairo to fill its ranks, the situation was indeed dire.

The United States had a shortage of soldiers; it also lacked funds.
The longer the wars in Iraq and Afghanistan dragged on, the more costly they became. By 2007, to sustain its operations, the U.S. command in Baghdad was burning through $3 billion per week. That same year, the overall costs of the Iraq War topped the $500 billion mark, with some estimates already suggesting that the final bill could reach at least $2 trillion.48

Although these figures were widely reported, they had almost no political impact in Washington, indicating the extent to which habits of profligacy had become entrenched. Congress responded to budget imbalances not by trimming spending or increasing revenues but by quietly and repeatedly raising the debt ceiling—by $3.015 trillion between 2002 and 2006.49 Future generations could figure out how to pay the bills.

All this red ink generated nervous speculation about a coming economic collapse comparable in magnitude to the Great Depression.50 Whatever the merit of such concerns, the interest here is not in what may yet happen to the American economy but in what has already occurred to its foreign policy.

By 2007, the United States was running out of troops and was already out of money. According to conventional wisdom, when it came to Iraq, there were “no good options.” Yet Americans had limited the range of possible options by their stubborn insistence that the remedy to the nation’s problems in the Persian Gulf necessarily lay in the Persian Gulf rather than at home. The slightest suggestion that the United States ought to worry less about matters abroad and more about setting its own house in order elicited from the political elite, Republicans and Democrats alike, shrieks of “isolationism,” the great imaginary sin to which Americans are allegedly prone. Yet to begin to put our house in order would be to open up a whole new array of options, once again permitting the United States to “choose peace or war, as our interest, guided by justice, shall counsel.”

Long accustomed to thinking of the United States as a superpower, Americans have yet to realize that they have forfeited command of their own destiny. The reciprocal relationship between expansionism, abundance, and freedom—each reinforcing the other—no longer exists. If anything, the reverse is true: Expansionism squanders American wealth and power, while putting freedom at risk. As a consequence, the strategic tradition to which Jefferson and Polk, Lincoln and McKinley, TR and FDR all subscribed has been rendered not only obsolete but pernicious.

Rather than confronting this reality head-on, American grand strategy since the era of Ronald Reagan, and especially throughout the era of George W. Bush, has been characterized by attempts to wish reality away. Policy makers have been engaged in a de facto Ponzi scheme intended to extend indefinitely the American line of credit. The fiasco of the Iraq War and the quasi-permanent U.S. occupation of Afghanistan illustrate the results and prefigure what is yet to come if the crisis of American profligacy continues unabated.

Excerpted from The Limits of Power by Andrew J. Bacevich. Copyright © 2009 by Andrew J. Bacevich. Published in 2009 by Henry Holt and Company, LLC. All rights reserved. This work is protected under copyright laws and reproduction is strictly prohibited. Permission to reproduce the material in any manner or medium must be secured from the Publisher.
