There’s a reason it is so hard to be president—in normal circumstances—and why most incumbents look decades older when they leave the job than when they began. It is that the only choices normal presidents get to make are the impossible ones—decisions that are not simply very close calls on the merits, but that are guaranteed to lead to tragedy and bitterness whichever way they go.
Take Barack Obama’s famed choice not to back up his “red line” promise in Syria, which was a focus of Jeffrey Goldberg’s “The Obama Doctrine” Atlantic cover story two years ago. The option Obama chose—not intervening in Syria—meant death and suffering for countless thousands of people. The option he rejected—intervening—would have meant death and suffering for some thousands of the same people or others. Agree or disagree with the outcome, any such decision is intellectually demanding and morally draining. Normal presidents have to make such decisions, one after another, all day long. (Why don’t they get any easier choices? Because someone else has made those before they get to the president.) Obama’s decision to approve the raid on Osama bin Laden’s compound turned out to be a tactical and political success. When he made it, he had to weigh the possibility that it could end in world-publicized failure—like Jimmy Carter’s decision to attempt a rescue of American hostages in Iran, which ended in chaos, and which Carter later contended was what sealed his fate in his re-election run.
[Photo: President Barack Obama after delivering a televised statement on the death of Osama bin Laden, East Room of the White House, May 1, 2011]
A special category of impossible decision, which I was introduced to in the two years I worked for Jimmy Carter in the White House and have borne in mind ever since, turns on the inevitability of ignorance. To be clear, I don’t mean “stupidity.” People in the government and military are overall smarter than press portrayals might suggest. Instead I mean really registering the uncomfortable fact that you cannot know enough about the big choices you are going to make, before you have to make them. Sometimes that is because of deadline rush: The clock is ticking, and you have to act now. (To give a famous example: In 1980 U.S. radar erroneously indicated that the Soviets had launched a nuclear-missile attack, and Zbigniew Brzezinski, as Jimmy Carter’s national-security adviser, had to decide at 3 a.m. whether to wake the president to consider retaliation. It was revealed as a false alarm before he could place the call.) Most of the time it is because the important variables are simply unknowable, and a president or other decision-maker has to go on judgment, experience, hunch.
[Photo: Zbigniew Brzezinski at the Brown Palace Hotel, October 25, 1980]
This point sounds obvious, because we deal with its analogues in daily-life decisions big and small. No one who decides to get married can really know what his or her spouse will be like 20 years in the future, or whether the partners will grow closer together or further apart. Taking a job—or offering one—is based at least as much on hope as on firm knowledge. You make an investment, you buy a house, you plan a vacation knowing that you can’t possibly foresee all the pitfalls or opportunities.
But this truism of life becomes far more consequential in the literally life-or-death choices that presidents must make, as commander in chief and as head of U.S. diplomatic and strategic efforts. The question of deciding about the unknowable looms large in my mind as I think back 15 years to the run-up to the Iraq war, and ahead to the similar choices future presidents will weigh.
* * *
There’s a long list of books I wish presidents would have read before coming to office—before, because normal ones barely have time to think once they get there. For instance, the late David Fromkin’s A Peace to End All Peace is for me a useful starting point for thinking about strains within the modern Middle East. (The book argues, in essence, that the way the Ottoman Empire was carved up at the end of World War I set the stage for conflict in the region ever since. In that way it is a strategic counterpart to John Maynard Keynes’s famous The Economic Consequences of the Peace, written just after the conclusion of the Versailles agreements, which argues that the brutal economic terms dictated to the defeated Germans practically guaranteed future trouble there.)
High up among the books on my “wish they’d read” list is Thinking in Time: The Uses of History for Decision Makers, by two Harvard professors (and mentors of mine), Ernest May and Richard Neustadt. In this book May and Neustadt reverse the chestnut attributed to an earlier Harvard professor, George Santayana, that “those who do not remember the past are condemned to repeat it.” Instead they caution against over-remembering, or imagining that a choice faced now can ever be exactly like one faced before.
[Photo: Vice President Lyndon Johnson greeted on arrival in Cheyenne by Wyoming officials, including Cheyenne Mayor Bill Nations and Sen. Gale McGee, July 14, 1963]
The most famous and frightening example is Lyndon Johnson’s, involving Vietnam. He “learned” so thoroughly the error of Neville Chamberlain and others who tried to appease (rather than confront) the Nazis that he thought the only risk in Vietnam was in delaying before confronting communists there. Because of the disaster Johnson’s decisions caused—the disaster for Vietnam, for its neighbors, for tens of thousands of Americans, all as vividly depicted in last year’s Ken Burns / Lynn Novick documentary—most American politicians, regardless of party, “learned” to avoid entanglement in Asian-jungle guerrilla wars. Thus in the late 1970s, as the post-Vietnam-war Khmer Rouge genocide slaughtered millions of people in Cambodia, the U.S. kept its distance. It had lost the international moral standing, and the internal political stomach, to undertake another war in the place where it had so recently met defeat.
[Photo: President Richard Nixon at a press conference, Washington, D.C., September 5, 1973]
From its Vietnam trauma, the United States also codified a crass political lesson that Richard Nixon had learned late in the war. Just before Nixon took office, American troop levels in Vietnam were steadily on the way up, as were weekly death tolls and monthly draft calls. The death-and-draft combination was the trigger for domestic protests. Callously but accurately, Nixon believed that he could reduce the protests if he ended the draft calls. Thus began the shift to the volunteer army—and what I called, in an Atlantic cover story three years ago, the “Chickenhawk Nation” phenomenon, in which the country is always at war but the vast majority of Americans are spared direct cost or exposure. (From the invasion of Iraq 15 years ago until now, the number of Americans who served at any point in Iraq or Afghanistan comes to just 1 percent of the U.S. population.)
May and Neustadt had a modest, practical ambition for their advice: study history, but study it cautiously. “Marginal improvement in performance is worth seeking,” they wrote. “Indeed, we doubt that there is any other kind. Decisions come one at a time, and we would be satisfied to see a slight upturn in the average. This might produce much more improvement [than big dramatic changes] measured by results.”
My expectation is more modest still: I fear and expect that the U.S. is fated to lurch from one over-“learning” to its opposite, and continue making a steadily shifting range of errors. The decision to invade Iraq was itself clearly one of those. The elder George Bush fought a quick and victorious war to drive Saddam Hussein out of Kuwait in 1991. But he stopped short of continuing the war into Iraq to remove Saddam from power, and his son learned from that “failure” that he had to finish the job of eliminating Saddam. Two of the writers who were most eloquent in making the case for the war—Christopher Hitchens, who then wrote for the Atlantic among other places, and Michael Kelly, who was then our editor-in-chief—based much of their case on the evils Saddam Hussein had gotten away with after the original Gulf War. (Hitchens died of cancer in 2011; Kelly was killed in Iraq, as an embedded reporter in the war’s early stage.) Then Barack Obama, who had become president in large part because he opposed the Iraq war (which gave him his opening against the vastly better-known Hillary Clinton), learned from Iraq about the dangers of intervention in Syria. And on through whatever cycles the future holds.
* * *
Is there escape from the cycles? In a fundamental sense, no, of course not. But I’ll offer the “lesson” I learned—50 years ago, in a classroom with Professor May; 40 years ago, when I watched Jimmy Carter weigh his choices; 15 years ago, in warning about the risks of invading Iraq. It involves a cast of mind, and a type of imagination.
As the Bush administration moved onto a war footing soon after the 9/11 attacks, no one could know the future risks and opportunities. But, at the suggestion of my friend and then-editor Cullen Murphy, I began reporting on what the range of possibilities might be. By the spring of 2002, when the Bush team was supposedly still months away from a decision about the war, it was already clear to us that the choice had been made. I interviewed dozens of historians, military planners, specialists in post-war occupations, and people from the region to try to foresee the likely pitfalls.
The result, which appeared in our November 2002 issue (and which we put online three months earlier, in hopes of affecting the debate), was called “The Fifty-First State?” Its central argument was: The “war” part of the undertaking would be the easy part, and deceptively so. The hard part would begin when the statues of Saddam Hussein were pulled down—and would last for months, and years, and decades, all of which should be taken into consideration in weighing the choice for war.
It conceivably might have gone better in Iraq, and very well could have, if not for a series of disastrously arrogant and incompetent mistakes by members of the Bush team. (I won’t go into details here: I laid them out in several articles and, eventually, a book.) But the premise of most people I interviewed before the war, most of whom had either a military background or extensive experience in the Middle East, was that this would be very hard, would hold a myriad of bad surprises, and was almost certain to go worse than its proponents were saying. Therefore, they said, the United States should do everything possible to avoid invading unless it had absolutely no choice. Wars should be fought only out of necessity; this one, they said, would be a war of choice, and a folly.
|
Iraq War Protest with pictures of 1500+ US Soldiers who died in Iraq war |
The way I thought of the difference between those confidently urging on the war, and those carefully cautioning against it, was: cast of mind. The majority of people I spoke with expressed a bias against military actions that could never be undone, and whose consequences could last for generations. I also thought of it as a capacity for tragic imagination, of envisioning what could go wrong as vividly as one might dream of what could go right. (“Mission Accomplished!”)
Any cast of mind has its biases and blind spots. But I’m impressed, in thinking about the history I have lived through and the histories I have read, by how frequently people with personal experience of war have been cautious about launching future wars. This does not make them pacifists: Harry Truman, artillery veteran of World War I, decided to drop the atomic bomb. But Ulysses Grant, Dwight Eisenhower, Colin Powell (in most of his career other than the Iraq-war salesmanship at the United Nations)—these were former commanding generals, cautious about committing troops to war. They had a tragic imagination of where war could lead and what it might mean.
What lesson do we end with? Inevitably, any lesson will be an imperfect match for the next crisis. The reasons not to invade Iraq 15 years ago are different from the risks to consider in launching a strike on North Korea or Iran, or in provoking China in some dispute in the East China Sea. But the value of tragic imagination remains: for leaders considering war or peace, for the media in stoking or questioning pro-war fever, and for the 99 percent of the public in considering the causes for which the military 1 percent will be asked to kill, and die.