Publications by Type: Newspaper Article

2011

Killing terrorists with drones is great politics. To the question, “Is it legal?” a natural answer might well be, “Who cares?”

But the legal justifications in the war on terrorism do matter - and not just to people who care about civil liberties. They end up structuring policy. As it turns out, targeted killing, now the hallmark of the Barack Obama administration’s war on terrorism, has its roots in rejection of the legal justifications once offered for waterboarding prisoners.

The leaking of the basic content (but not the text) of an Obama administration memo authorizing the drone strike that killed US citizen Anwar Al-Awlaki therefore calls for serious reflection about where the war on terrorists has been - and where it is headed next.

The George W. Bush administration’s signature anti-terror policy after the September 11 attacks (apart from invading countries) was to capture suspected terrorists, detain them, and question them aggressively in the hopes of gaining actionable intelligence to prevent more attacks.

In the Bush years, after the CIA and other agencies balked at the interrogation techniques being urged by Vice President Dick Cheney, the White House asked the Department of Justice to explain why the most aggressive questioning tactics were legal. Lawyers at the Office of Legal Counsel—especially John Yoo, now a professor at the University of California at Berkeley—produced secret memos arguing that waterboarding wasn’t torture.

The Torture Memos
What was more, the memos maintained, it didn’t matter if it was torture or not, because the president had the inherent constitutional authority to do whatever was needed to protect the country.

Some of the documents were leaked and quickly dubbed “the torture memos.” A firestorm of legal criticism followed. One of the most astute and outraged critics was Marty Lederman, who had served in the Office of Legal Counsel under President Bill Clinton. With David Barron, a colleague of mine at Harvard, Lederman went on to write two academic articles attacking the Bush administration’s theories of expansive presidential power. Eventually, Jack Goldsmith, who led the Office of Legal Counsel in 2003–2004 (and is now also at Harvard), retracted the most extreme of Yoo’s arguments about the president’s inherent power.

In the years leading to the 2008 election, all this technical criticism of the Bush team’s legal strategy merged with domestic and global condemnation of the administration’s detention policies. The Supreme Court weighed in, finding that detainees were entitled to hearings and better tribunals than were being offered. As a candidate, Obama joined the bandwagon, promising to close the prison at Guantanamo Bay, Cuba, within a year of taking office.

Guantanamo is still open, in part because Congress put obstacles in the way. Instead of detaining new terror suspects there, however, Obama vastly expanded the tactic of targeting them, with eight times more drone strikes in his first year than in all of Bush’s time in office. Barron and Lederman, the erstwhile Bush critics, were appointed to senior positions in the Office of Legal Counsel—where they wrote the recent memo authorizing the Al-Awlaki killing.

What explains these startling developments? If it’s illegal and wrong to capture suspected terrorists and detain them indefinitely without a hearing, how exactly did the Obama administration decide it was desirable and lawful to target and kill them?

The politics were straightforward. Obama’s team observed that holding terror suspects exposed the Bush administration to harsh criticism (including their own). They wanted to avoid adding detainees at Guantanamo or elsewhere.

A Father’s Appeal
Dead terrorists tell no tales—and they also have no lawyers shouting about their human rights. Before Al-Awlaki was killed, his father sued the government for putting the son on its target list. The Obama Justice Department asked the court to dismiss the claim as being too closely related to government secrets. The court agreed—a result never reached in all the Guantanamo litigation. Anwar Al-Awlaki now has no posthumous recourse.

In the bigger picture, Obama also wanted to show measurable success in the war on terrorism while withdrawing troops from Iraq and Afghanistan. But even here the means were influenced by legal concerns.

Osama bin Laden is the best example. The US forces who led the fatal raid in Abbottabad almost certainly could have taken him alive. But detaining and trying him would probably have been a political disaster. So they shot him on sight, as the international law of war allows for enemies unless they surrender.

The authority for targeted killing—as expressed in the Lederman-Barron memo—offers the legal counterpart to the political advantages of the Obama targeting policy. According to the leaks, the memo holds that the U.S. can kill suspected terrorists from the air not because the president has inherent power, but because Congress declared war on Al-Qaeda the week after the September 11 attacks.

The logic is that once Congress declares war, the president can determine whom we are fighting. The president found that Yemen-based Al-Qaeda in the Arabian Peninsula, which didn’t exist on September 11, had joined the war in progress. He determined that Al-Awlaki was an active member of the Yemeni groups with some role in planning attacks. And, the memo says, it’s not unlawful assassination or murder if the targets are wartime enemies.

From a formal legal standpoint, Lederman and Barron can claim consistency with their attacks on the Bush administration. They relied on Congress and international law; Yoo’s “torture memos” didn’t. But this argument misses the more basic point: Most critics rejected Bush’s policies not on technical grounds based on the Constitution, but because they thought there was something wrong with the president acting as judge and jury in the war on terrorism.

No Defense Allowed
Anwar Al-Awlaki was killed because the president decided he was an enemy. Like the Bush-era Guantanamo detainees, he had no chance to deny this—even when his father tried to go to court while he was still alive. Naturally, a uniformed soldier in a regular war also wouldn’t get a hearing. But like the Guantanamo detainees, Al-Awlaki wore no uniform. Nor was he on a battlefield, except according to the view that anywhere in the world can be the battlefield in the war on terrorism.

Al-Awlaki might have maintained that he was merely a jihadi propagandist exercising his free speech rights as a U.S. citizen. Which might well have been a lie. Yet we have only the president’s word that he was an active terrorist—and that is all we will ever have. The future direction of the policy is therefore clear: Killing is safer, easier and legally superior to catching and detaining.

Sitting beside Al-Awlaki when he was killed was another US citizen, Samir Khan, who was apparently a full-time propagandist, not an operational terrorist. Khan was, we are told, not the target, but collateral damage—a good kill under the laws of war.

Legal memos are weapons of combat—no matter who is writing them.

Ferguson, Niall. 2011. “Romney to the Rescue”.

This column is for Ted Forstmann: financier, fun lover, and philanthropist, who died on Nov. 20. But it’s not just for him. It’s to him.

Ted, I’m worried. I wish you were still around to help me get this right. The US is going nuts with populism. That’s always to be expected after a big financial crisis, I know. But this is dysfunctional.

On one side, there are conservative fundamentalists—the Tea Party—who think we can turn the clock back to before the New Deal, if not further. Some of them want to get rid not just of the Federal Reserve but of most of the federal government itself. I have more sympathy with these Teapopulists than with the other lot, the motley crew who want to Occupy Wall Street (call them the Occupopulists). But when it comes to practical politics, this Tea Party has more in common with the Mad Hatter’s than Boston’s.

To begin with, they’ve created a mood in the Republican Party that makes any kind of compromise on our fiscal crisis impossible. We just saw the ignominious failure of the supercommittee, which was supposed to come up with a plan to reduce the deficit. Predictably, each party blames the other side for this flop. Either way, the consequences are dire. First, the markets are spooked, just the way they were by the partisan dogfight over the debt ceiling earlier this year. Second, the country is now on course for more drastic spending cuts in 2013, which could not only slash our defense budget in an irresponsible way but also plunge the economy back into recession.

There’s another problem. Just like the populists of a century ago, the Teapopulists are drawn compulsively to disastrous presidential wannabes. I never asked you what you thought of Mitt Romney, Ted. But I am sure you’d prefer him over the other contenders. Bachmann, Perry, Cain, Gingrich—the one thing these people have in common is that they would lose to Barack Obama next year even if the unemployment rate were twice what it is now. Their appeal to the crucial center—to the independents and the undecided—is just too low.

What’s the case against Romney? That he’s a Mormon? Ted, you were a devout Catholic, just as I am a doubting atheist. But this is America. Religion and government are separate. And we tolerate all faiths, no matter how idiosyncratic, provided they tolerate ours too. That he’s changed his mind on hot-button issues? Well, so does any intelligent person. You often did. What is this, a dogmatism contest?

It is obvious that the Turkish foreign minister Ahmet Davutoglu’s “zero problems with the neighbors” policy no longer works, in the face of Turkey’s support for the Syrian defectors who oppose the Assad regime. The foreign minister must now deal with potentially hostile reactions by Syria and its closest ally, Iran, that could have destabilizing regional implications. Iran, for one, cannot afford to allow the Assad regime to fail. It provides Iran with a foothold in the Levant from which to support Hezbollah and threaten Israel on its Lebanese border.

Syrian and Iranian retaliation against Turkey can readily take the form of support for the Kurdistan Workers’ Party, or P.K.K. This group once again has become increasingly violent in its promotion of Kurdish separatism in the Turkish southeast. Syria, Iran and Turkey share a common cause in resisting demands by Kurdish opposition movements in their countries. Only months ago, all three were cooperating in suppressing the P.K.K. For Turkey, this was a welcome change from the 1990s, when Syria and Iran supported the P.K.K. in order to pressure Ankara for foreign policy concessions. Now Damascus and Tehran could again play the P.K.K. card.

To counteract potential Syrian and Iranian subversion and the separatist appeals of the P.K.K., Turkey needs to adapt its zero problems policy to its own southeast. In 2009, the prime minister, Recep Tayyip Erdogan, announced a “Kurdish opening”—a bid for reconciliation with Turkey’s Kurds. However, he quickly closed it, leaving many Kurdish demands for economic development, political rights and cultural recognition unanswered. The Turkish foreign minister’s recent veiled threat to send troops across the Syrian border may be insufficient to deter Syria and Iran from subversively supporting the P.K.K. For a comprehensive resolution of the “Kurdish question,” Ankara also needs to implement effective policies that will over the long term improve the economic, political, and cultural life of Turkey’s Kurds.

After reading about the Greek debt crisis for over a year now, you might think you understand what it’s all about. You’re probably wrong. International media focus on how the Greek government and people spend their money. But an equally important problem is the inability of the Greek state to collect revenues.

The story constantly aired by various news outlets is simple enough. Greece, we are told, free-rode on the security offered by the rest of Europe to attract money from foreign investors, and then spent it lavishly on its bloated public sector. In case you don’t get it, BBC’s website has a recurring instructional slide show titled “What went wrong in Greece?” Apparently, Greece’s adoption of the euro “made it easier for the country to borrow money.... Greece went on a big, debt-funded spending spree, including paying for high-profile projects such as the 2004 Athens Olympics.”

This brief media lesson on Greek economics has proven very appealing to audiences abroad for two reasons. First, it rhymes with the stereotype of lazy Mediterranean people conning their hard-working North European partners and then shamelessly asking for a bailout. (Now that Italy may be heading the same way, there will be more of this coming.) It also resonates in the ears of the euro’s sworn opponents, above all in the UK.

Unfortunately, it is only half the story. Greek public debt as a percentage of GDP did not dramatically rise right after Greece joined the euro. Greek debt actually accumulated back in the 1980s and early 90s, years before Europe got its common currency. The size of the Greek public sector (as a percentage of GDP or share of the labor market) is around or even below average compared to the rest of Europe. Greece did try to spend its way out of the global recession in 2008-2009 and ran large deficits; but so did most other developed countries, including the UK and the US.

There are two sides of the public finance coin: expenditure and revenue. What is left out is that while Greek public spending and debt crept up, government revenue fell or remained constant in the years after Greece adopted the euro. Between 2001 and 2007 Greece’s average government revenues totaled 39.4% of GDP, whereas the EU average was 44.4%. Taxes are by far the largest component of government revenue. The issue is not unique to Greece. Declining tax revenues were observed in Ireland, Spain, and the US after the Bush tax cuts kicked in.

In Greece the culprit has been rampant tax evasion by corporations owing millions in taxes and self-employed professionals who can hide their earnings, unlike salaried employees and pensioners. Under international pressure to balance its budget, the outgoing Greek government axed salaries and pensions and slapped new taxes on the bulk of citizens who were not tax-delinquent. This only drove the country deeper into recession and insolvency, making it necessary for EU leaders to write off part of Greece’s debt in July and then again in October.

Whether the government is reluctant to tax the very wealthy (as in the US) or lax in its duty to punish tax evasion (as in Greece), the results are similar. Revenues can’t keep up with expenditures and lenders become uneasy. Meanwhile, those who are taxed too leniently have an interest in shifting public attention towards cutting government spending. The bitter partisan quarrels in Washington and Athens lately have this much in common. Yet, this obvious point is conspicuously absent from reports on Greece in the English-speaking world.

There is no denying that Greece overspent on security for its Olympics - they were the first games after 9/11. There is also no denying that the Greek public sector is very inefficient. But this has to do with how the money is used. Deep cuts will not make an inefficient public sector better. Other reforms, however, just might. Finally, there is no denying that the euro deprived Greece of the flexibility to devalue its currency. However, Greece’s revenue collection problem has been perennial and is unrelated to the euro. The first reforms Greece’s new government should focus on are the tax and judicial systems.

Casting the crisis ravaging Greece and closing in on Italy as a fundamental story of governments drunk on loans, doling out stacks of euros to their shortsighted citizens is a half-truth. It makes it easy to caricature on a national basis and to categorize Greeks, Italians, Germans or Americans as people who collectively live either within or beyond their means. It also masks the fact that there are differences within each country: Those who benefit the most from high-profile government contracts are the hardest ones to tax when the creditors come banging on the door.

Call it reckless, call it bold, but the Greek Prime Minister, George Papandreou, has attempted to transform a referendum on the European Union bailout plan for Greece into a referendum about whether the Greeks want to stay in the Eurozone or not. The last time Greece had a popular referendum was in 1974, to decide if the people wanted to keep King Constantine, a descendant of the Royal family that European Powers foisted on the Greek people in the 1860s.

This time around, the Greek Prime Minister has shocked the rest of Europe—and even his own Vice President—with his plans to call for a popular vote on whether to accept the 50% haircut deal that EU heads of state agreed on last week to manage the country’s spiraling debt crisis. It’s the latest in a series of Hail Mary passes by Papandreou to keep his hold on power, but the proposed referendum is really only a distraction from the no-confidence vote he faces, which is scheduled in the Greek Parliament for this Friday. As hard as the European leaders may have fought to prevent a Greek default, they failed to take into account the dire state of domestic Greek politics. But even at this moment the solution to the crisis must be a European one.

The gravest threat facing Papandreou right now is from the Greek people. His government party, PASOK, was elected two years ago on an anti-austerity platform, but has since been forced into the position of calling for more austerity than any Greek government in the postwar era. The demonstrations across the country last weekend that disrupted the parades commemorating the Greek resistance in World War II culminated with the forced departure of the President of the Republic, Karolos Papoulias, from the parade in Thessaloniki. The current political system has been facing a legitimacy crisis for a while now. The social contract, based on patronage, established between Greek politicians and the electorate following the fall of the Greek Junta in 1974 is under severe strain.

The second problem facing Papandreou is the dissent and distrust he is experiencing from his own party, which—for the moment—holds a bare majority of 152 seats out of 300 in the Greek Parliament. This past summer in a cabinet reshuffling, Papandreou tried to smooth out the problems in his party by appointing his main internal rival, Evangelos Venizelos, Vice President. But this accommodation reached its breaking point yesterday when Venizelos declared he had not been informed about the referendum by Papandreou, who nevertheless called on him to deliver the bad news to EU leaders. Meanwhile, the opposition parties claim that the government is blackmailing the Greek people and suggest that the only solution is to have early elections.

The crisis of legitimacy reached its peak yesterday, when rumors of tensions between the government and the country’s military leadership gained credibility after the minister of Defense called for the replacement of all the heads of divisions of the armed forces. It would be a controversial decision in the best of times, but one that’s nearly impossible to carry out for a government facing unprecedented unpopularity.

The European Union leaders are dead against three outcomes: the collapse of the Greek parliament, the ouster of Papandreou on Friday, and a negative result in any referendum on the bailout—all of which would ultimately spell the ejection of Greece from the Eurozone and spur financial chaos on the continent. The solution must come from Europe. The meeting at Cannes on Thursday—to which Papandreou has been invited by Merkel and Sarkozy—is his last chance to appease his European patrons.

The real question is not whether Greece will proceed with the referendum, but who controls Europe. Is it the Germans, who seem to be the only ones who can undo the European Central Bank’s policy on printing money? Is it the French and the Germans together, who want to keep the Euro strong? Is it the speculators, banks and their interests? Or is the EU open to more democratic control, whereby the voters can have a voice?

Whatever the outcome, Greece is now up against the wall thanks to Papandreou. The predicament has suddenly changed from a financial catastrophe and austerity measures to a question about political identity: Do Greeks belong in the European Union or not?

Co-author Thomas Meaney is a doctoral candidate in history at Columbia University and an editor of The Utopian.
Mylonas, Harris, and Thomas Meaney. 2011. “Greece's Legitimacy Crisis”.

In the past 48 hours, Greek Prime Minister George Papandreou has succeeded in one thing: Stirring up the anger of nearly everyone around him. The European Union, his own party PASOK, the opposition party New Democracy and the Greek electorate are all pitted against Papandreou. The Greeks have a word for this special brand of rage—they call it “thymos.” This refers to the simmering resentment that arises when one's views are not recognized.

It’s little wonder Papandreou has had to back down from his initial call for a national referendum on the 50% haircut deal decided by the European Union heads of state on October 27.

First off, he failed to get the opposition to agree to the referendum. They called it blackmail, denounced Papandreou as an opportunist and asked for a grand coalition government or immediate elections. The consensus main opposition leader Antonis Samaras offered on Thursday was short-lived and came with many conditions.

Meanwhile, the European leaders—French President Nicolas Sarkozy and German Chancellor Angela Merkel—called Papandreou’s bluff. “Go ahead and make our day,” they told him. “Imagine what would happen if we called a referendum on the bailout in our countries?” The International Monetary Fund, for its part, threatened to freeze all of its loans to Greece.

Finally, for Papandreou’s party PASOK the situation is even more dire. Instead of shoring up support from his own party members, the referendum only emboldened cries for his resignation—including from his own Ministers and PASOK Parliamentarians.

Papandreou has withdrawn his call for a referendum because he failed on all fronts, and it has become clear that he can no longer be part of the solution.

There are three possible ways the crisis will play out. First, Papandreou could refuse to resign and possibly win the no-confidence vote Friday. This is unlikely, since his overall support has reached an all-time low. The second, more likely scenario is that Papandreou loses the vote tomorrow and the President of the Hellenic Republic, Karolos Papoulias, turns to the other political party leaders to determine whether the existing Parliament can form a government. The final scenario, if these efforts fail to produce a government, would be new elections, as called for by the Greek Constitution. But it is most likely that a one-party government will not emerge from such elections.

The only way out of these three scenarios is to form a Grand Coalition government. What is a Grand Coalition government? In multi-party parliamentary systems, a single party sometimes cannot form a government on its own. In such instances, several parties join forces in a coalition in order to secure a parliamentary majority, form a government and pass legislation.

Greece’s history with such governments in the late 1980s does not exactly inspire faith, and the global stakes were smaller then. A grand coalition would entail the cooperation of all the political parties that are in favor of a European future for Greece. They would be ready to support the austerity measures needed to balance the Greek budget and overcome the solvency problem, but most importantly they would be the parties that can agree on the composition of such a government. This last feature of a Grand Coalition is particularly valuable at a time when consensus-building in the Greek parliament has become nearly impossible.

Left out of this discussion are the Greek people. They voted two years ago for a party running on an anti-austerity platform, and this is not what they received. Perhaps the current political system is afraid to hear their message. “Thymos” may not be the best state of mind in which to make choices.

Regardless, the Greek political leadership's ownership of the austerity program and responsible governance are necessary steps toward resolving Greece's legitimacy crisis, which would then allow the country's leaders to confront the Greek people with the responsibility they must take in order to end the financial crisis.

Co-author Thomas Meaney is a doctoral candidate in history at Columbia University and an editor of The Utopian.
Allison, Graham T., Jr. 2011. “What Egypt Means for the US”.

What do the recent events in Egypt mean for the US? The answer is a lot more complicated than it might seem. Egypt is important to the US for a number of reasons. Topping the list is oil, and the flow of oil, for which the Suez Canal is an important transit conduit. There is no reason to believe that a successor to the Mubarak government would interrupt the flow of oil, but you could imagine events in the area that could interrupt the flow, and we’re seeing this concern reflected in the markets.

There is also the concern that what is happening in Egypt is contagious, and that it could lead to instability in other, seemingly analogous states—the most important of which is Saudi Arabia. There are regions in which the governments seem very sclerotic, the people running them seem old, the youth vote seems large, and the number of educated citizens who don’t seem adequately challenged seems to be growing. Such elements characterize quite a number of states in the region, including those that are important to the US for various reasons.

Egypt has been a major ally of the US when it comes to relations with Israel, where the resulting peace, though cold, has created a stable border and is thus considered one of the great achievements of recent decades. On counterterrorism, Egypt has been a significant and cooperative ally on questions involving Hamas, al-Qaida, and Hezbollah.

Finally, with respect to governance, Egypt has an autocratic regime that significantly restricts the political rights of its population. This has been a problem for the US, as it directly conflicts with American objectives and rhetoric. Nevertheless, such issues are of lesser concern in the hierarchy of interests, as things like oil attract greater attention.

I suspect that peaceful relations between Egypt and Israel would be sustained. A new Egyptian government of any stripe will have so much to do that it will not want to take on any additional problems. On the other hand, Egypt’s current political mix includes organized groups like the Muslim Brotherhood. The Brotherhood’s recent statements have been more internationally acceptable, but traditionally it has held quite strong and different views with respect to Israel. As you can imagine, if a Muslim Brotherhood group emerges after whatever process of transition Egypt undergoes, such a group might maintain a contrary view.

The best way to think about the issue is to consider alternative futures. One possibility is that Mubarak and the current regime will survive. I’d say this is very unlikely, though, with only about a five to ten percent chance of happening.

A second possibility is that a transitional process will take place, resulting in an emerging democratic government. I’d say that this second alternative is the most hopeful, but not the most likely scenario.

Another scenario features a tumultuous process in which a more or less participatory and democratic system emerges. If this scenario were to play out, I would bet on the most organized groups emerging as leaders. In this case, the most organized group is the military, which means that we would see the emergence of a military-dominated regime with a civilian face. That would be a good outcome as far as the US is concerned. A variation of that scenario is the possibility that the Muslim Brotherhood could step up to take control of the government, an outcome that would present its own opportunities and risks.

The key idea that we should take away from this is that future developments are uncertain, and that it is entirely possible to describe an outcome that looks more like Iran —though I don’t think such an outcome is likely. Think about Ayatollah Khomeini in Paris until the Iranian revolution, Lenin going home to Russia in a single-carriage train. True, those situations weren’t exactly like the one happening now, but history reminds us that outcomes are often quite different from the ones people anticipate—and that looking at the aspirations that have spurred a revolution is hardly a good way to predict what the outcomes will actually be.

America's last 10 years might be called “The Decade the Locusts Ate.” A nation that started with a credible claim to lead a second American century lost its way after the terrorist attacks of September 11, 2001. Whether the nation will continue on a path of decline or, alternatively, find its way to recovery and renewal is uncertain.

The nation began the decade with a growing fiscal surplus and ended with a deficit so uncontrolled that its AAA credit rating was downgraded for the first time in its history. Ten years on, Americans’ confidence in our country and the promise of the American Dream is lower than at any point in memory. The indispensable superpower that entered the decade as the most respected nation in the world has seen its standing plummet. Seven out of every 10 Americans say that the United States is worse off today than it was a decade ago. While many of the factors that contributed to these developments were evident before 9/11, this unprecedented reversal pivots on that tragic day - and the choices made in response to it. Those choices had costs: the inescapable costs of the attack, the chosen costs, and the opportunity costs.

Inescapable costs of 9/11 must be counted first in the 3,000 innocent lives extinguished that morning. In addition, the collapse of the World Trade Center and part of the Pentagon destroyed $30 billion of property. The Dow plunged, erasing $1.2 trillion in value. Psychologically, the assault punctured the “security bubble” in which most Americans imagined they lived securely. Today, 80 percent of Americans expect another major terrorist attack on the homeland in the next decade.

Were this the sum of the matter, 9/11 would stand as a day of infamy, but not as an historic turning point. Huge as these direct costs are, they pale in comparison to the costs of the choices the United States made in response to 9/11: about how to defend America; where to fight Al Qaeda; whether to attack Iraq (or Iran or North Korea) on grounds that they had chemical or biological weapons that could be transferred to Al Qaeda; and whether to pay for these choices by taxing the current generation, or borrowing from China and other lenders, leaving the bills to the next generation.

Unquestionably, much of what was done to protect citizens at home and to fight Al Qaeda abroad has made America safer. It is no accident that the United States has not suffered further megaterrorist attacks. The remarkable intelligence and Special Forces capabilities demonstrated in the operation that killed Osama bin Laden suggest how far we have come.

But the central storyline of the decade focuses on two choices made by President George W. Bush - his decision to go to war with Iraq and his commitment to cut taxes, especially for wealthy Americans, and thus not to pay for the wars in Iraq and Afghanistan.

The cost of his decision to go to war with Iraq is measured in 4,478 American deaths, 40,000 Americans gravely wounded, and a monetary cost of $2 trillion.

Bush justified his decision to attack Iraq on the grounds that Saddam Hussein might arm terrorists with weapons of mass destruction, arguing that “19 hijackers armed by Saddam Hussein…could bring a day of horror like none we have ever known.” In retrospect, even Bush supporters agree that we went to war on false premises—since we now know that Saddam had no chemical or biological weapons.

Suppose, however, that chemical weapons had been found in Iraq. Would that have made Bush’s choice a wise decision? What about the many other states that had chemical or biological weapons that could have been transferred to Al Qaeda, for example Libya, or Syria, or Iran? What about the state that unquestionably had an advanced nuclear weapons program, North Korea, which took advantage of the US preoccupation with Iraq to develop an arsenal of nuclear weapons and conduct its first nuclear weapons test?

As for cutting taxes for the wealthy, Bush’s decision left the nation with a widening gap between government revenues and its expenditures. Brute facts are hard to ignore: having entered office with a budgetary surplus that the CBO projected would total $3.5 trillion through 2008, Bush left office with an annual deficit of over $1 trillion that the CBO projected would grow to $3 trillion over the next decade.

Finally, and most difficult to assess, are the opportunity costs: the “road not taken,” in Robert Frost’s phrase. In the immediate aftermath of 9/11, the United States was the object of overwhelming international sympathy and solidarity. The leading French newspaper declared: “We are all Americans.” Citizens united behind their commander in chief, giving him license to do virtually anything he could plausibly argue would defend us against future attacks.

This rare combination of readiness to sacrifice at home plus solidarity abroad sparked imagination. Would Americans have willingly paid a “terrorist tax” on gas that could kick what Bush rightly called America’s “oil addiction”? Could an international campaign against nuclear terrorism or megaterrorism have bent trend lines that leave Americans and the world increasingly vulnerable to future biological or nuclear terrorist attacks? What impact could $2 trillion invested in new technologies have had on American competitiveness?

That such a decade leaves Americans increasingly pessimistic about ourselves and our future is not surprising. American history, however, is a story of recurring, impending catastrophes from which there is no apparent escape—followed by miraculous recoveries. At one of our darkest hours in 1776 when defeat at the hands of the British occupying Boston seemed almost certain, the general commanding American forces, George Washington, observed: “Perseverance and spirit have done wonders in all ages.”

President Obama should take a page from Ronald Reagan’s playbook in winning the final inning of the Cold War. Obama can challenge President Mahmoud Ahmadinejad to put his enriched uranium where his mouth is—by stopping all Iranian enrichment of uranium beyond the 5 percent level.

A quarter-century ago, Soviet leader Mikhail Gorbachev was touting a new “glasnost”: openness. President Reagan went to Berlin and called on Gorbachev to “tear down this wall.” Two years later, the Berlin Wall came tumbling down and, shortly thereafter, the Soviet “evil empire” fell as well.

While in New York for the opening of the UN General Assembly in September, Ahmadinejad on three occasions made an unambiguous offer: He said Iran would stop all enrichment of uranium beyond the levels used in civilian power plants—if his country is able to buy specialized fuel enriched at 20 percent, for use in its research reactor that produces medical isotopes to treat cancer patients.

Obama should seize this proposal and send negotiators straightaway to hammer out specifics. Iran has been enriching uranium since 2006, and it has accumulated a stockpile of uranium enriched at up to 5 percent, sufficient after further enrichment for several nuclear bombs. Iran is also producing 20 percent material every day, and it announced in June that it planned to triple its output. Halting Iran’s current production of 20 percent material and its projected growth would be significant.

A stockpile of uranium enriched at 20 percent shrinks the potential timeline for breaking out to bomb material from months to weeks. In effect, having uranium enriched at 20 percent takes Iran 90 yards along the football field to bomb-grade material. Pushing it back below 5 percent would effectively move Tehran back to the 30-yard line - much farther from the goal of bomb-grade material. Even more important, extracting from Iran a commitment to a bright red line capping enrichment at 5 percent would stop the Islamic Republic from advancing on its current path to 60 percent enrichment and then 90 percent.

Stopping Iran from enriching beyond 5 percent is not, in itself, a “solution” to its nuclear threat. Nor was Reagan’s proposal to Gorbachev. The question for Reagan was whether we would be better off with the Berlin Wall or without it.

Iran today is the most sanctioned member of the United Nations; it has been the target of five Security Council resolutions since 2006 demanding that it suspend all uranium enrichment. The United States and Europe have organized their own, tougher economic sanctions forbidding businesses from trading with Iranian companies and limiting Iran’s access to financial markets.

But Iran does not require the permission of the United Nations or, for that matter, the United States to advance its nuclear program within its borders. Nor are current or future sanctions likely to dissuade Iran from progressing steadily toward a nuclear weapon.

So far, Obama has essentially continued the Bush administration’s policy toward Iran with one addition: an authentic offer from the start of his administration to begin negotiations. Negotiations, however, have not been feasible because of sharp divisions within Iran. Those rifts were exacerbated after the June 2009 elections, in which Iran’s ruling powers (Supreme Leader Ayatollah Ali Khamenei, Ahmadinejad and the Revolutionary Guard) rigged the presidential vote and then moved to suppress the opposition Green Movement protests. In the last two years, they have tightened control over their society.

Enter Ahmadinejad’s proposal to stop all enrichment at the 5 percent level—without preconditions. Although differences between Ahmadinejad and the supreme leader have become evident, the United States should pay attention to the president’s offer.

Arguments against testing the offer are easy to make. An embattled Ahmadinejad may not be able to deliver. Iran will use negotiations to seek to relax or escape current sanctions. If a deal were reached, it would be more difficult to win international support for the next round of sanctions. An agreement that stops only the 20 percent enrichment could imply a degree of acceptance of Iran’s ongoing enrichment up to 5 percent.

Recognizing all of these negatives, however, the policy question remains: Would the United States be better off with Iran enriching its uranium to 20 percent or without it?

President Obama should act now to test Ahmadinejad’s word.

The good news is that today’s teenagers are avid readers and prolific writers. The bad news is that what they are reading and writing are text messages.

According to a survey carried out last year by Nielsen, Americans between the ages of 13 and 17 send and receive an average of 3,339 texts per month. Teenage girls send and receive more than 4,000.

It’s an unmissable trend. Even if you don’t have teenage kids, you’ll see other people’s offspring slouching around, eyes averted, tapping away, oblivious to their surroundings. Take a group of teenagers to see the seven wonders of the world. They’ll be texting all the way. Show a teenager Botticelli’s Adoration of the Magi. You might get a cursory glance before a buzz signals the arrival of the latest SMS. Seconds before the earth is hit by a gigantic asteroid or engulfed by a super tsunami, millions of lithe young fingers will be typing the human race’s last inane words to itself:

C u later NOT :(

Now, before I am accused of throwing stones in a glass house, let me confess. I probably send about 50 emails a day, and I receive what seem like 200. But there’s a difference. I also read books. It’s a quaint old habit I picked up as a kid, in the days before cellphones began nesting, cuckoolike, in the palms of the young.

Half of today’s teenagers don’t read books—except when they’re made to. According to the most recent survey by the National Endowment for the Arts, the proportion of Americans between the ages of 18 and 24 who read a book not required at school or at work is now 50.7 percent, the lowest for any adult age group younger than 75, and down from 59 percent 20 years ago.

Back in 2004, when the NEA last looked at younger readers’ habits, it was already the case that fewer than one in three 13-year-olds read for pleasure every day. Especially terrifying to me as a professor is the fact that two thirds of college freshmen read for pleasure for less than an hour per week. A third of seniors don’t read for pleasure at all.

Why does this matter? For two reasons. First, we are falling behind more-literate societies. According to the results of the Organization for Economic Cooperation and Development’s most recent Program for International Student Assessment, the gap in reading ability between the 15-year-olds in the Shanghai district of China and those in the United States is now as big as the gap between the U.S. and Serbia or Chile.

But the more important reason is that children who don’t read are cut off from the civilization of their ancestors.

So take a look at your bookshelves. Do you have all - better make that any - of the books on the Columbia University undergraduate core curriculum? It’s not perfect, but it’s as good a list of the canon of Western civilization as I know of. Let’s take the 11 books on the syllabus for the spring 2012 semester: (1) Virgil’s Aeneid; (2) Ovid’s Metamorphoses; (3) Saint Augustine’s Confessions; (4) Dante’s The Divine Comedy; (5) Montaigne’s Essays; (6) Shakespeare’s King Lear; (7) Cervantes’s Don Quixote; (8) Goethe’s Faust; (9) Austen’s Pride and Prejudice; (10) Dostoevsky’s Crime and Punishment; (11) Woolf’s To the Lighthouse.

Step one: Order the ones you haven’t got today. (And get War and Peace, Great Expectations, and Moby-Dick while you’re at it.)

Step two: When vacation time comes around, tell the teenagers in your life you are taking them to a party. Or to camp. They won’t resist.

Step three: Drive to a remote rural location where there is no cell-phone reception whatsoever.

Step four: Reveal that this is in fact a reading party and that for the next two weeks reading is all you are proposing to do—apart from eating, sleeping, and talking about the books.

Welcome to Book Camp, kids.

How different would the world be today if there had been no 9/11? What if the attacks had been foiled or bungled? One obvious answer is that Americans would probably care a lot less than they do about the rest of the world.

Back on the eve of destruction, in early September 2001, only 13 percent of Americans believed that the U.S. should be “the single world leader.” And fewer than a third favored higher defense spending. Now those figures are naturally much higher. Right?

Wrong. According to the most recent surveys, just 12 percent of Americans today think the U.S. should be the sole superpower—almost exactly the same proportion as on the eve of the 9/11 attacks. The share of Americans who want to see higher spending on national security is actually down to 26 percent. Paradoxically, Americans today seem less interested in the wider world than they were before the Twin Towers were felled.

In the past 10 years, the U.S. has directly or indirectly overthrown at least three governments in the Muslim world. Yet Americans today feel less powerful than they did then. In 2001 just over a quarter felt that the U.S. had “a less important role as a world leader compared to 10 years ago.” The latest figure is 41 percent.

Three explanations suggest themselves. First, wielding power abroad proved harder in practice than in neoconservative theory. Second, the financial crisis has dampened American spirits. A third possibility is that 9/11 simply didn’t have that big an impact on American opinion.

Yet to conclude that 9/11 didn’t change much is to misunderstand the historical process. The world is a seriously complex place, and a small change to the web of events can have huge consequences. Our difficulty is imagining what those consequences might have been.

So let’s play a game like the one my friends at the Muzzy Lane software company are currently designing, which has the working title “New World Disorder.” The game simulates the complex interaction of economics, politics, and international relations, allowing us to replay the past.

Let’s start in January 2001 and thwart the 9/11 attacks by having Condi Rice and Paul Wolfowitz heed Richard Clarke’s warnings about Al-Qaeda. The game starts off well. Al-Qaeda is preemptively decapitated, its leaders rounded up in a series of covert operations and left to the tender mercies of their home governments. President Bush gets to focus on tax cuts, his first love.

But then, three years later, the murky details of this operation surface on the front page of The New York Times. John Kerry, the Democratic candidate for the presidency, denounces the “criminal conduct” of the Bush administration. Liberal pundits foam at the mouth. Ordinary Americans, unseared by 9/11, are shocked. Osama bin Laden issues a fierce denunciation of the U.S. from his Saudi prison cell. It triggers a wave of popular anger in the Middle East that topples any regime seen as too close to Washington.

The government of Qatar - gone. The government of Kuwait - gone. Above all, the government of Saudi Arabia - gone. True to form, the experts are soon all over network TV explaining how this fundamentalist backlash against the U.S.-backed oil monarchies had been years in the making (even if they hadn’t quite gotten around to predicting it beforehand). “Who lost the Middle East?” demands Kerry, pointing an accusing finger at George W. Bush. (Remember, prior to 9/11 Bush favored a reduction of U.S. overseas commitments.) The Democrats win the 2004 election, whereupon bin Laden’s new Islamic Republic of Arabia takes hostages at the U.S. Embassy in Riyadh…

In other words, if things had happened differently 10 years ago - if there had been no 9/11 and no retaliatory invasions of Afghanistan and Iraq - we might be living through an Islamist Winter rather than an Arab Spring.

Replaying the history game without 9/11 suggests that, ironically, the real impact of the attacks was not on Americans but on the homelands of the attackers themselves.

Ferguson, Niall. 2011. “World on Wi-Fire”.

The human race is interconnected as never before. Is that a good thing? Ask the Lords of the Internet—the men running the companies Eric Schmidt of Google recently called “the Four Horsemen”: Amazon, Apple, Facebook, and Google—and you’ll get an unequivocal “yes.” But is it true? In view of the extraordinary economic and political instability of recent months, it’s worth asking if the Netlords are the Four Horsemen of a new kind of information apocalypse.

Don’t get me wrong. I love all that these companies have achieved. I order practically everything except haircuts from Amazon. I write this column on a MacBook Pro. I communicate with my kids via Facebook. It’s 6:55 a.m., and I’ve already run six searches on Google. Did I forget to mention that I’ve already received 29 emails and sent 14?

I also really like the Netlords. They are among the smartest guys on the planet. Yet they are also self-deprecating and sometimes very funny. (OK, not Steve Jobs.) So my question for them is a real question, not some kind of Luddite rant: does the incredible network you have created, with its unprecedented scale and speed, not contain a vulnerability? I’m not talking here about the danger of its exploitation by Islamist extremists or its incapacitation by Chinese cyberwarriors, though I worry about those things too. No, I mean the possibility that the global computer network formed by technologically unified human minds is inherently unstable—and that it is ushering in an era of intolerable volatility.

The communications revolution we are living through has been driven by two great forces. One is Gordon E. Moore’s “law” (which he first proposed in 1965) that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every 18 months. In its simplified form, Moore’s Law says that computing power will double every two years, implying a roughly 30-fold increase in 10 years. This exponential trend has now continued for more than half a century and is expected by the techies to continue until at least 2015 or 2020.

The other force is the exponential growth of human networks. The first email was sent at the Massachusetts Institute of Technology in the same year Moore’s Law was born. In 2006 people sent 50 billion emails; last year it was 300 billion. The Internet was born in 1982. As recently as 1993 only 1 percent of two-way telecommunication went through it. By 2000 it was 51 percent. Now it’s 97 percent. Facebook was dreamed up by an über-nerd at my university in 2004. It has 800 million active users today—eight times the number of three years ago.

Russian venture capitalist Yuri Milner sees this trend as our friend (it has certainly been his). As the number of people online doubles from 2 billion to 4 billion over the next 10 years and the number of Internet-linked devices quadruples from 5 billion to 20 billion, mankind collectively gets more knowledge—and gets smarter. Speaking at a conference in Ukraine in mid-September, Milner asserted that data equivalent to the total volume of information created from the beginning of human civilization until 2003 can now be generated in the space of just two days. To cope with this information overload, he looks forward to “the emergence of the global brain, which consists of all the humans connected to each other and to the machine and interacting in a very unique and profound way, creating an intelligence that does not belong to any single human being or computer.”

In the future as imagined by Google, this global brain will do much of our thinking for us, telling us (through our handheld devices) which of our friends is just around the next corner and where we can buy that new suit we need for the best price. And if the best price is on Amazon, we’ll just click once and look forward to its next-day delivery. Maybe it’ll already be there when we get home.

That’s the kind of sci-fi scenario that gets a true nerd out of bed in the morning. But is it just a bit too utopian?

Exhibit one for a contrarian view is the recent behavior of global financial markets, the area of human activity furthest down the road of computerization and automation. According to math wonk Kevin Slavin, algorithms with names like the “Boston Shuffler” are the new masters of the financial universe. Whole tower blocks have been hollowed out to accommodate the computing power required by high-frequency (and very high-speed) trading. So how is this brave new world of robot traders doing?

Well, the VIX index of volatility—Wall Street’s so-called fear gauge, which infers the expected volatility of the U.S. stock market from options prices—reached an all-time high of 80 in the aftermath of Lehman Brothers’ failure and surged back up above 30 in early 2010 and again this summer. Part of this is just a good old-fashioned, man-made financial crisis, of course. But some of the volatility we’ve seen in the past four years is surely attributable to technology: think only of the “flash crash” of May 6 last year, when the Dow Jones industrial average plummeted 9 percent and then rallied in a matter of minutes.

Could the same kind of volatility spread into other markets as these become as wired and as integrated as Planet Finance? The answer must be yes. Consider how Greece’s fiscal woes have destabilized markets across Europe and around the world in recent months. Then there’s the market for consumer durables. We know that the speed with which new technologies have been adopted by American households has increased around eightfold over the past hundred years. But that speed of adoption has its obverse in the speed of obsolescence. Consumers are becoming ever more fickle. Millions bought RIM’s BlackBerry after its advent in 1999. But today the iPhone is the hotter handheld device, and I am far from alone in having a dead BlackBerry in my bottom desk drawer. In late September Amazon launched the Kindle Fire in a bid to challenge the iPad’s dominance of the tablet market. The name is appropriate. The market for such devices is on fire. The whole world is on wi-fire.

In politics, too, online electorates are becoming more volatile. The current race to find a Republican candidate for the presidency is a case in point. Only the other day Sarah Palin was a serious contender. Then Mitt Romney was a shoo-in. Until Rick Perry came along. Until Chris Christie came along. Meanwhile, the number of independent voters who have uncoupled themselves from the traditional parties has reached a historic high of 37 percent. Floating voters are the high-frequency traders of the political market.

Computing power has grown exponentially. So has the human network. But the brain of Homo sapiens remains pretty much the same organ that evolved in the heads of African hunter-gatherers 200,000 years ago. And that brain has a tendency to swing in its mood, from greed to fear and from love to hate.

The reality may be that by joining us all together and deluging us with data, the Netlords have ushered in a new Age of Volatility, in which our primeval emotions are combined and amplified as never before.

We are LinkedIn, but StressedOut. And that “cloud” of downloadable data may yet turn out to be a thundercloud.

The Palestinian leader Mahmoud Abbas’s bid for full U.N. membership was dead on arrival in New York. So why bother even raising the subject? The answer: to drum up international sympathy for the plight of the Palestinians. Yet other defeated peoples have suffered far more than they. Think only of how—and at whose expense—the U.N. itself began.

Born in the gently foggy city of San Francisco, the U.N. was conceived in the Ukrainian resort of Yalta. Though nestled amid the green Crimean hills and lapped by the Black Sea’s languid waves, the city was severely battle-scarred in February 1945; Winston Churchill dubbed it “the Riviera of Hades.” Its diabolical master was the Soviet despot Joseph Stalin, who acted as host to Churchill and the ailing American President Franklin Roosevelt.

Of the Big Three, as Sergei Plokhy shows in his riveting study Yalta: The Price of Peace, Roosevelt alone truly believed in the dream of a world parliament, and even he knew the U.N. would need to give greater weight to the great powers than its ill-starred predecessor, the League of Nations. Thus it was Roosevelt who proposed a Security Council on which the war’s victors—plus France and China—would be permanently represented and armed with veto powers.

Churchill and Stalin were realists. They saw the postwar world in terms of “spheres of influence.” Though perfectly capable of such realism in practice, Roosevelt still yearned for the idealist’s world of peace based on collective security. Yet Churchill was deeply reluctant to accept that Stalin’s postwar sphere of influence would include Poland. His predecessor had acquiesced in the destruction of Czechoslovakia at Munich but had gone to war when Hitler (and Stalin) carved up Poland between them. Was Yalta to be the Poles’ Munich?

“We can’t agree,” grumbled Churchill, “that Poland shall be a mere puppet state of Russia, where the people who don’t agree with Stalin are bumped off.” But that was exactly what postwar Poland became.

A staggering 19 percent of the prewar population of Poland had been killed as a result of World War II, including a huge proportion of the country’s large Jewish population. Yalta inflicted further punishment. The country not only shrank; it was also shifted westward so that Stalin could keep his gains from the 1939 Nazi-Soviet Pact. And it became a Soviet vassal state for the next half century. After Yalta, chess players devised a variant of their game for three players, using a six-sided board. As at the conference, in the game “Yalta” two players can join forces against the third, but all such alliances are temporary. Briefly, Churchill got Roosevelt on his side over Poland, but the American cared more about getting Stalin to agree to join the U.N.; Poland was a pawn to be sacrificed.

Having got what he wanted, Roosevelt left Yalta early. His destination? The Middle East, which he was intent on adding to ... the American sphere of influence. The conflicting commitments he made on that trip—to the Arabs and the Jews—have bedeviled U.S. foreign policy ever since. Asked by Roosevelt if he was a Zionist, Stalin replied elliptically that he “was one in principle, but he recognized the difficulty.”

That “difficulty” remains that a Jewish state could be created only at the expense of non-Jews living in Palestine. The Arabs resisted Israel’s creation, but they lost. So it goes. A trip to Yalta provides a salutary reminder that throughout history those who lose at war generally lose land, too, and sometimes sovereignty with it. By comparison with what the Poles endured last century, the Palestinians have got off lightly.

They will get their own state eventually. But not until all the permanent members of the Security Council are convinced the Palestinians will not abuse the privileges of statehood.

Like it or not, that was how the U.N. was meant to work when the Big Three conceived it on Hell’s Riviera.

After years when young Americans yearned only to be occupied on Wall Street, suddenly they have taken to occupying it. It’s easy to scoff at this phenomenon. I know, because I have.

This is certainly not America’s answer to the Arab Spring—the Bobo Fall perhaps, unmistakably both bohemian and bourgeois. But it’s still worth taking seriously. What is it that makes evidently educated young people yearn to adopt leftist positions that are eerily reminiscent of the ones their parents adopted in 1968?

Check out the protesters’ website, which on Monday featured a speech by Slovenian critical theorist Slavoj Žižek. At first I thought this must be some kind of parody, but no, he really exists—red T-shirt, Krugman beard, and all: “The only sense in which we are communists is that we care for the commons. The commons of nature. The commons of what is privatized by intellectual property. The commons of biogenetics. For this and only for this we should fight.”

Yeah, man. Property is theft. Ne travaillez jamais. And all that.

There are three possible explanations for this retrogression to the language of ’68. 

1. Increasing inequality exemplified by Wall Street is worth protesting against.

2. So is the fact that only a handful of bankers have been prosecuted for their part in the financial crisis.

3. Demonstrating is way cool.

Yet if I were a young American today, occupying Wall Street would not be my objective. Just reflect for a minute on the unbridled economic mayhem that would ensue if the protesters actually succeeded. The headline “Goldman Sachs Under Control of Hip Teenage Revolutionaries” would be the last straw for an already fragile economic recovery.

Now ask yourself what the financial crisis really means for today's 15- to 24-year-olds. Not only has it raised the probability that they will be unemployed after graduation. More seriously, it has massively increased the debt that they will have to service when they do get jobs.

Never in the history of intergenerational transfers has one generation left such a mountain of IOUs to another as the baby boomers are leaving to their grandchildren.

When you do the math, there is only one logical political home for today’s teens and 20-somethings ... and that is the Tea Party. For who else is promising to slash Medicare and Social Security and keep the tax burden at its historical average?

Let’s just remind ourselves of the report of the Trustees of the Social Security and Medicare trust funds back in 2007, which projected a rise in the cost of these two programs from 7.3 percent of gross domestic product to 17.5 percent by 2030. The trustees warned that to achieve actuarial balance—in other words, solvency—for these two programs would require (for Social Security) an increase of 16 percent in payroll tax revenues or an immediate reduction in benefits of 13 percent. For Medicare we are talking a 122 percent increase in payroll taxes or a 51 percent cut in spending.
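The trustees’ two remedies are really two ways of closing the same gap. A quick check of my own (a sketch only, assuming for simplicity that payroll taxes are each program’s sole revenue source):

```python
# Back-of-the-envelope check (an editorial illustration, not the trustees'
# method): if a program's outlays exceed its payroll-tax revenues, a required
# revenue increase r can instead be met by a benefit cut c = r / (1 + r),
# assuming the payroll tax is the program's only source of revenue.

def equivalent_benefit_cut(revenue_increase: float) -> float:
    """Benefit cut that closes the same gap as a given revenue increase."""
    return revenue_increase / (1.0 + revenue_increase)

# Social Security: a 16% rise in payroll-tax revenues ~ a 13-14% benefit cut.
print(f"Social Security: {equivalent_benefit_cut(0.16):.1%}")  # ~13.8%

# Medicare: a 122% payroll-tax rise maps to ~55% under this crude assumption;
# the cited 51% cut differs because Medicare also draws on premiums and
# general revenue.
print(f"Medicare: {equivalent_benefit_cut(1.22):.1%}")  # ~55.0%
```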

As Laurence Kotlikoff and Scott Burns pointed out in The Coming Generational Storm, by 2030 there will be twice as many retirees as there are today but only 18 percent more workers. Unless there is really radical reform of entitlement programs - especially Medicare - the next generation of American workers will be paying roughly double the taxes their parents and grandparents paid. This is what Kotlikoff and Burns mean by “fiscal child abuse.”
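The “roughly double” figure follows largely from the demographics Kotlikoff and Burns cite. Here is my own rough reconstruction, under a simple pay-as-you-go assumption in which each worker’s tax bill scales with the number of retirees per worker:

```python
# Rough reconstruction (mine, not Kotlikoff and Burns's) of the per-worker
# burden in a pay-as-you-go system, where the bill per worker scales with
# the number of retirees each worker supports.

retirees_growth = 2.00   # "twice as many retirees" by 2030
workers_growth  = 1.18   # "only 18 percent more workers"

burden_multiple = retirees_growth / workers_growth
print(f"Demographics alone: {burden_multiple:.2f}x the per-worker burden")  # ~1.69x

# Getting from ~1.7x to "roughly double" also requires per-retiree costs to
# keep rising -- which is the article's point about Medicare in particular.
```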

Of these harsh realities the occupiers of Wall Street seem blissfully unaware. Fixated on the idea that they somehow represent the 99 percent of people who scrape by on 80 percent of total income, they fail to see that the real distributional conflict of our time is not between percentiles, much less classes, but between generations. And no generation has a keener interest in slashing future spending on entitlements than today’s teens and 20-somethings.

So occupying Wall Street is not the answer to this generation’s problems. The answer is to occupy the Tea Party—and wrest it from the grumpy old men who currently run it.

Call it the Iced Tea Party.

Way cool.

The global crises of financial and housing markets are now being superseded by new crises of governments. The fiscal challenges for the weaker members of the eurozone are early warnings, as are analogous problems in American state governments weighed down by unfunded pension and healthcare liabilities. Without action, this new crisis of state competence could soon become just as damaging as its recent financial predecessor.

This week's US debt deal, along with the prospect of debate on fiscal solutions in the run-up to the 2012 elections, provides some room for optimism. But America's fiscal problems have deep roots. The recession of 2007-2009 stemmed from the unprecedented bust in the housing market, driven by reduced lending standards and propelled by congressional pressures on private lenders and the reckless expansions of Fannie Mae and Freddie Mac. It is, however, important to recognise that this mistake is now understood and will not be repeated.

In the aftermath of the debt ceiling agreement there will be calls for further stimulus for America's economy. This would be a grave mistake. In the financial turmoil of 2008, bail-outs by the US and other governments were unfortunate, but necessary. However, the subsequent $800bn American stimulus package was largely a waste of money that sharply enlarged the fiscal hole now facing our economy.

President Barack Obama's administration has consistently overestimated the benefits of stimulus, by using an unrealistically high spending multiplier. According to this Keynesian logic, government expenditure is more than a free lunch. This idea, if correct, would be more brilliant than the creation of triple A paper out of garbage. In any event, the elimination of the temporary spending is now contractionary and, more importantly, the resulting expansion of public debt eventually requires higher taxes, retarding growth.

I agree that budget deficits were appropriate during the great recession and, for that reason, the kind of balanced-budget rule currently proposed by some Republicans should be avoided. However, since government spending is warranted only if it passes the usual hurdles of social rates of return, the fiscal deficit should have concentrated on tax reductions, especially those that emphasised falls in marginal tax rates, which encourage investment and growth.

Despite relief at the debt-ceiling agreement, America's fiscal situation remains deeply problematic. Any attempt to head off a crisis of government competence must begin with serious long-term reform. Reductions in the long-term path of entitlement outlays have to be put on the table, with increases in ages of eligibility a part of any solution.

We also need sharp reductions in spending programmes initiated or expanded by Mr Obama and his extravagant predecessor, George W. Bush. Given the inevitable growth of the main entitlement programmes, especially healthcare, increases in long-term federal revenue must be part of an overall reform.

So what, specifically, can be done? An effective future tax package would begin by setting US corporate and estate tax rates permanently to zero, given these taxes are inefficient and generate little revenue. Next, it would gradually phase out major "tax-expenditure" items, such as tax preferences for home-mortgage interest, state and local income taxes, and employee fringe benefits.

The structure of marginal income-tax rates should then be lowered. In particular, marginal rates should not be increased where they are already high, as at upper incomes. The bulk of any extra revenue needed to make up the difference should then be raised via a broad-based, flat-rate expenditure tax, such as a value added tax. A rate of 10 per cent, with few exemptions, would raise about 5 per cent of gross domestic product.
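The 5 per cent figure rests on a simple base calculation. A sketch of my own, assuming the effective taxable base (consumer spending net of exemptions and evasion) comes to roughly half of GDP:

```python
# Rough arithmetic behind "a 10 per cent VAT would raise about 5 per cent of
# GDP" (my illustration). US personal consumption runs nearer 70% of GDP,
# but exemptions and imperfect compliance shrink the effective base.

vat_rate = 0.10
effective_base_share_of_gdp = 0.50  # assumed effective taxable base

revenue_share_of_gdp = vat_rate * effective_base_share_of_gdp
print(f"VAT revenue: {revenue_share_of_gdp:.1%} of GDP")  # 5.0%
```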

Of course, such a new tax would be a two-edged sword: a highly efficient tax, but politically dangerous. To paraphrase Larry Summers from long ago, we don't have VAT in the US because Democrats think it is regressive, and Republicans think it is a money machine. We will get VAT when Democrats realise it is a money machine, and Republicans realise it is regressive. Obviously, I worry about the money machine property, but I see no serious alternative for raising the revenue needed for an overall next-stage reform package.

The raucous debt-ceiling debate represents a good start in forging a serious long-term fiscal plan. Substantial additional progress will be needed, sadly much of which will probably have to await the outcome of the next US election. Yet progress must be made - or the impending crises of governments, signalled by possible downgrades of US debt, will make the 2008-2009 recession look mild.

This essay is not about Steve Jobs. It is about the countless individuals with roughly the same combination of talents of whom we’ve never heard and never will.

Most of the 106 billion people who’ve ever lived are dead—around 94 percent of them. And most of those dead people were Asian—probably more than 60 percent. And most of those dead Asians were dirt poor. Born into illiterate peasant families enslaved by subsistence agriculture under some or other form of hierarchical government, the Steves of the past never stood a chance.

Chances are, those other Steves didn’t make it into their 30s, never mind their mid-50s. An appalling number died in childhood, killed off by afflictions far easier to treat than pancreatic cancer. The ones who made it to adulthood didn’t have the option to drop out of college because they never went to college. Even the tiny number of Steves who had the good fortune to rise to the top of premodern societies wasted their entire lives doing calligraphy (which Jobs himself only briefly dabbled in at Reed College). Those who sought to innovate were more likely to be punished than rewarded.

Today, according to estimates by Credit Suisse, there is approximately $195 trillion of wealth in the world. Most of it was made quite recently, in the wake of those great political and economic revolutions of the late 18th century, which, for the first time in human history, put a real premium on innovation. And most of it is owned by Westerners—Europeans and their descendants in the New World and the Antipodes. We may account for less than a fifth of humanity, but we Westerners still own two thirds of global wealth.

A nontrivial portion of that wealth ($6.7 billion) belonged to Steve Jobs and now belongs to his heirs. In that respect, Jobs personified the rising inequality that is one of the striking characteristics of his lifetime. Back in 1955 the top 1 percent of Americans earned 9 percent of income. Today the figure is above 14 percent.

Yet there is no crowd of young people rampaging through Palo Alto threatening to “Occupy Silicon Valley.” The huge amounts of money made by Jobs and his fellow pioneers of personal computing are not resented the way the vampire squids of Wall Street are. On the contrary, Jobs is revered. One eminent hedge-fund manager (who probably holds a healthy slice of Apple stock as well as the full array of iGadgets) recently likened him to Leonardo da Vinci.

So the question is not, how do we produce more Steves? The normal process of human reproduction will ensure a steady supply of what Malcolm Gladwell has called “outliers.” The question should be, how do we ensure that the next Steve Jobs fulfills his potential?

An adopted child, the biological son of a Syrian Muslim immigrant, a college dropout, a hippie who briefly converted to Buddhism and experimented with LSD—Jobs was the type of guy no sane human resources department would have hired. I doubt that Apple itself would hire someone with his résumé at age 20. The only chance he ever had to become a chief executive officer was by founding his own company.

And that—China, please note—is why capitalism needs to be embedded in a truly free society in order to flourish. In a free society a weirdo can do his own thing. In a free society he can even fail at his own thing, as Jobs undoubtedly did in his first stint in charge of Apple. And in a free society he can bounce back and revolutionize all our lives.

Somewhere in his father’s native Syria another Steve Jobs has just died. But this other Steve was gunned down by a tyrannical government. And what wonders his genius might have produced we shall never know.

Ronald Reagan and Barack Obama have at least one similarity: both were confronted by great economic challenges when they became president.

Mr. Reagan's immediate challenge was that inflation and interest rates were out of control. He met this great test by allying with the Federal Reserve chairman, Paul Volcker, in accomplishing a return to price stability, even through the 1982 recession when the unemployment rate hit 10.8%.

Reagan's success is not in doubt. Inflation and interest rates were reduced dramatically, and the recovery from the end of 1982 to the end of 1988 was strong and long with an average growth rate of real GDP of 4.6% per year. Moreover, Reagan focused on implementing good economic policies, not on blaming his incompetent predecessor for the terrible economy he had inherited.

Mr. Obama was equally in position to get credit for turning around a perilous economic situation that had been left by a weak predecessor. But he has pursued an array of poor economic policies, featuring the grand Keynesian experiment of sharply raising federal spending and the public debt. The results have been terrible and now, two and a half years into his administration, Mr. Obama is still blaming George W. Bush for all the problems.

Friday's downgrade of the U.S. credit rating by Standard & Poor's should have been a wake-up call to the administration. S&P is saying, accurately, that there is no coherent long-term plan in place to deal with the U.S. government's fiscal deficits.

The U.S. Treasury could have responded in two ways. First, it could have taken the downgrade as useful information and then focused on how to perform better to earn back a AAA rating. Instead, it chose to attack the rating agency as incompetent and not credible. In this respect, U.S. officials were almost as bad as Italian Prime Minister Silvio Berlusconi, who responded to warnings from S&P and Moody's about Italian government debt by launching police raids on the offices of the rating agencies in Milan last week. The U.S. Treasury's response also reminds me of Lehman Brothers blaming its financial problems in the summer of 2008 on evil financial analysts and short-sellers.

The way for the U.S. government to earn back a AAA rating is to enact a meaningful medium- and long-term plan for addressing the nation's fiscal problems. I have sketched a five-point plan that builds on ideas from the excellent 2010 report of the president's deficit commission.

First, make structural reforms to the main entitlement programs, starting with increases in ages of eligibility and a shift to an economically appropriate indexing formula. Second, lower the structure of marginal tax rates in the individual income tax. Third, in the spirit of Reagan's 1986 tax reform, pay for the rate cuts by gradually phasing out the main tax-expenditure items, including preferences for home-mortgage interest, state and local income taxes, and employee fringe benefits—not to mention eliminating ethanol subsidies. Fourth, permanently eliminate corporate and estate taxes, levies that are inefficient and raise little money.

Fifth, introduce a broad-based expenditure tax, such as a value-added tax (VAT), with a rate around 10%. The VAT's appeal to liberals can be enhanced, with some loss of economic efficiency, by exempting items such as food and housing.

I recognize that a VAT is anathema to many conservatives because it gives the government an added claim on revenues. My defense is that a VAT makes sense as part of a larger package that includes the other four points.

The loss of the U.S. government's AAA rating is a great symbolic blow, one that would cause great anguish to our first Treasury secretary, Alexander Hamilton. Frankly, the only respectable reaction by our current Treasury secretary is to fall on his sword. Then again, "the buck stops here" suggests that an even more appropriate resignation would come from our chief executive, who, by the way, is no Ronald Reagan.

The United States is in the third year of a grand experiment by the Obama administration to revive the economy through enormous borrowing and spending by the government, with the Federal Reserve playing a supporting role by keeping interest rates at record lows.

How is the experiment going? By the looks of it, not well.

The economy is growing much more slowly than in a typical recovery, housing prices remain depressed and the stock market has been in a slump—all troubling indicators that another recession may be on the way. Most worrisome is the anemic state of the labor market, underscored by the zero growth in the latest jobs report.

The poor results should not surprise us given the macroeconomic policies the government has pursued. I agree that the recession warranted fiscal deficits in 2008–2010, but the vast increase of public debt since 2007 and the uncertainty about the country’s long-run fiscal path mean that we no longer have the luxury of combating the weak economy with more deficits.

Today’s priority has to be austerity, not stimulus, and it will not work to announce a new $450 billion jobs plan while promising vaguely to pay for it with fiscal restraint over the next 10 years, as Mr. Obama did in his address to Congress on Thursday. Given the low level of government credibility, fiscal discipline has to start now to be taken seriously. But we have to do even more: I propose a consumption tax, an idea that offends many conservatives, and elimination of the corporate income tax, a proposal that outrages many liberals.

These difficult steps would be far more effective than the president’s failed experiment. The administration’s $800 billion stimulus program raised government demand for goods and services and was also intended to stimulate consumer demand. These interventions are usually described as Keynesian, but as John Maynard Keynes understood in his 1936 masterwork, “The General Theory of Employment, Interest and Money” (the first economics book I read), the main driver of business cycles is investment. As is typical, the main decline in G.D.P. during the recession showed up in the form of reduced investment by businesses and households.

What drives investment? Stable expectations of a sound economic environment, including the long-run path of tax rates, regulations and so on. And employment is akin to investment in that hiring decisions take into account the long-run economic climate.

The lesson is that effective incentives for investment and employment require permanence and transparency. Measures that are transient or uncertain will be ineffective.

And yet these are precisely the kinds of policies the Obama administration has pursued: temporarily cutting the payroll tax rate, maintaining the marginal income-tax rates from the George W. Bush era while vowing to raise them in the future, holding off on clean-air regulations while promising to implement them later and enacting an ambitious overhaul of Wall Street regulations while leaving lots of rules undefined and ambiguous.

Is there a better way? I believe that a long-term fiscal plan for the country requires six big steps.

Three of them were identified by the Bowles-Simpson deficit reduction commission: reforming Social Security and Medicare by increasing ages of eligibility and shifting to an appropriate formula for indexing benefits to inflation; phasing out “tax expenditures” like the deductions for mortgage interest, state and local taxes and employer-provided health care; and lowering the marginal income-tax rates for individuals.

I would add three more: reversing the vast and unwise increase in spending that occurred under Presidents Bush and Obama; introducing a tax on consumer spending, like the value-added tax (or VAT) common in other rich countries; and abolishing federal corporate taxes and estate taxes. All three measures would be enormously difficult—many say impossible—but crises are opportune times for these important, basic reforms.

A broad-based expenditure tax, like a VAT, amounts to a tax on consumption. If the base rate were 10 percent, the revenue would be roughly 5 percent of G.D.P. One benefit from a VAT is that it is more efficient than an income tax—and in particular the current American income tax system.

I received vigorous criticism from conservatives after advocating a VAT in an essay in The Wall Street Journal last month. The main objection—reminiscent of the complaints about income-tax withholding, which was introduced in the United States in 1943—is that a VAT would be a money machine, allowing the government to readily grow larger. For example, the availability of easy VAT revenue in Western Europe, where rates reach as high as 25 percent, has supported the vast increase in the welfare state there since World War II. I share these concerns and, therefore, favor a VAT only if it is part of a package that includes other sensible reforms. But given the likely path of government spending on health care and Social Security, I see no reasonable alternative.

Abolishing the corporate income tax is similarly controversial. Any tax on capital income distorts decisions on saving and investment. Moreover, the inefficiency is magnified here because of double taxation: the income is taxed when corporations make profits and again when owners receive dividends or capital gains. If we want to tax capital income, a preferred method treats corporate profits as accruing to owners when profits arise and then taxes this income only once—whether it is paid out as dividends or retained by companies.

Liberals love the idea of a levy on evil corporations, but taxes on corporate profits in fact make up only a small part of federal revenue, compared to the two main sources: the individual income tax and payroll taxes for Social Security and Medicare.

In 2009-10, taxes on corporate profits averaged 1.4 percent of G.D.P. and 8.6 percent of total federal receipts. Even from 2000 to 2008, when corporations were more profitable, these taxes averaged only 1.9 percent of G.D.P. and 10.3 percent of federal receipts. If we could get past the political fallout, we could get more revenue and improve economic efficiency by abolishing the corporate income tax and relying instead on a VAT.
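Those percentages also imply how small the corporate levy is relative to total federal receipts. My own arithmetic from the figures quoted above:

```python
# Implied totals from the figures in the passage (my arithmetic, not the
# article's): if corporate taxes were x% of GDP and y% of federal receipts,
# total receipts must have been about x / y of GDP.

for label, share_of_gdp, share_of_receipts in [
    ("2009-10",   0.014, 0.086),
    ("2000-2008", 0.019, 0.103),
]:
    total_receipts = share_of_gdp / share_of_receipts
    print(f"{label}: total federal receipts ~ {total_receipts:.1%} of GDP")
# ~16.3% and ~18.4% of GDP -- dwarfing the corporate tax's 1.4-1.9% share.
```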

I had a dream that Mr. Obama and Congress enacted this fiscal reform package—triggering a surge in the stock market and a boom in investment and G.D.P.—and that he was re-elected.

This dream could become reality if our leader were Ronald Reagan or Bill Clinton—the two presidential heroes of the American economy since World War II—but Mr. Obama is another story. To become market-friendly, he would have to abandon most of his core economic and political principles.

More likely, his administration will continue with more of the same: an expansion of payroll-tax cuts, short-term tax credits, promises to raise future taxes on the rich, and added spending on infrastructure, job training and unemployment benefits. The economy will probably continue in its sluggish state, possibly slipping into another recession. In that case, our best hope is for a Republican president far more committed to the principles of free markets and limited government than Mr. Bush ever was.

Keynesian economics—the go-to theory for those who like government at the controls of the economy—is in the forefront of the ongoing debate on fiscal-stimulus packages. For example, in true Keynesian spirit, Agriculture Secretary Tom Vilsack said recently that food stamps were an "economic stimulus" and that "every dollar of benefits generates $1.84 in the economy in terms of economic activity." Many observers may see how this idea—that one can magically get back more than one puts in—conflicts with what I will call "regular economics." What few know is that there is no meaningful theoretical or empirical support for the Keynesian position.

The overall prediction from regular economics is that an expansion of transfers, such as food stamps, decreases employment and, hence, gross domestic product (GDP). In regular economics, the central ideas involve incentives as the drivers of economic activity. Additional transfers to people with earnings below designated levels motivate less work effort by reducing the reward from working.

In addition, the financing of a transfer program requires more taxes—today or in the future in the case of deficit financing. These added levies likely further reduce work effort—in this instance by taxpayers expected to finance the transfer—and also lower investment because the return after taxes is diminished.

This result does not mean that food stamps and other transfers are necessarily bad ideas in the world of regular economics. But there is an acknowledged trade-off: Greater provision of social insurance and redistribution of income reduces the overall GDP pie.

Yet Keynesian economics argues that incentives and other forces in regular economics are overwhelmed, at least in recessions, by effects involving "aggregate demand." Recipients of food stamps use their transfers to consume more. Compared to this urge, the negative effects on consumption and investment by taxpayers are viewed as weaker in magnitude, particularly when the transfers are deficit-financed.

Thus, the aggregate demand for goods rises, and businesses respond by selling more goods and then by raising production and employment. The additional wage and profit income leads to further expansions of demand and, hence, to more production and employment. As per Mr. Vilsack, the administration believes that the cumulative effect is a multiplier around two.
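The “multiplier around two” is the textbook geometric-series story: each dollar of transfers is partly re-spent, that spending is partly re-spent again, and so on. A minimal sketch of that logic (mine, not the administration’s model):

```python
# Textbook Keynesian multiplier sketch (an illustration of the logic described
# above, not the administration's actual model). Each round, a fraction MPC
# of new income is re-spent, giving a total of 1 / (1 - MPC) per dollar.

def simple_multiplier(mpc: float) -> float:
    """Sum of the spending rounds: 1 + mpc + mpc**2 + ... = 1 / (1 - mpc)."""
    return 1.0 / (1.0 - mpc)

def implied_mpc(multiplier: float) -> float:
    """Invert the formula: the re-spending rate consistent with a multiplier."""
    return 1.0 - 1.0 / multiplier

print(f"{simple_multiplier(0.46):.2f}")  # ~1.85, close to Vilsack's 1.84
print(f"{implied_mpc(1.84):.2f}")        # ~0.46 of each dollar re-spent per round
```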

If valid, this result would be truly miraculous. The recipients of food stamps get, say, $1 billion but they are not the only ones who benefit. Another $1 billion appears that can make the rest of society better off. Unlike the trade-off in regular economics, that extra $1 billion is the ultimate free lunch.

How can it be right? Where was the market failure that allowed the government to improve things just by borrowing money and giving it to people? Keynes, in his "General Theory" (1936), was not so good at explaining why this worked, and subsequent generations of Keynesian economists (including my own youthful efforts) have not been more successful.

Theorizing aside, Keynesian policy conclusions, such as the wisdom of additional stimulus geared to money transfers, should come down to empirical evidence. And there is zero evidence that deficit-financed transfers raise GDP and employment—not to mention evidence for a multiplier of two.

Gathering evidence is challenging. In the data, transfers are higher than normal during recessions but mainly because of the automatic increases in welfare programs, such as food stamps and unemployment benefits. To figure out the economic effects of transfers one needs "experiments" in which the government changes transfers in an unusual way—while other factors stay the same—but these events are rare.

Ironically, the administration created one informative data point by dramatically extending unemployment-insurance eligibility to 99 weeks in 2009—a much bigger expansion than in previous recessions. Interestingly, the fraction of the unemployed who are long term (more than 26 weeks) has jumped since 2009—to over 44% today, whereas the previous peak had been only 26% during the 1982-83 recession. This pattern suggests that the dramatically longer unemployment-insurance eligibility period adversely affected the labor market. All we need now to get reliable estimates is a hundred more of these experiments.

The administration found the evidence it wanted—multipliers around two—by consulting some large-scale macro-econometric models, which substitute assumptions for identification. These models were undoubtedly the source of Mr. Vilsack's claim that a dollar more of food stamps led to an extra $1.84 of GDP. This multiplier is nonsense, but one has to admire the precision in the number.

There are two ways to view Keynesian stimulus through transfer programs. It's either a divine miracle—where one gets back more than one puts in—or else it's the macroeconomic equivalent of bloodletting. Obviously, I lean toward the latter position, but I am still hoping for more empirical evidence.

The road to restoring America's AAA credit rating starts with President Obama moving beyond blaming the economy on the admittedly inept George W. Bush.

Standard & Poor's recent downgrade of the U.S. government shows how far the world has moved into a crisis of governments.

The official reactions to the S&P action have not been promising. The Obama administration attacked S&P's competence, and the U.S. Congress has threatened hearings, apparently aimed at bullying S&P and the other agencies out of issuing further downgrades.

The main substantive criticism was that S&P made a $2 trillion mistake in its baseline projection of 10-year deficits. Of course, these projections came from the Congressional Budget Office, which lost its credibility in these matters when it scored President Obama's health care reform plan as reducing 10-year deficits - mostly because of the inclusion of phantom reductions in Medicare payments to doctors.

In truth, S&P's downgrade stemmed mainly from its legitimate concern that the U.S. government has no coherent medium- or long-term plan to eliminate budget deficits and stabilize the path of public debt. This judgment is accurate and courageous and goes some distance in offsetting the hit to S&P's reputation that came from the AAA ratings that it gave not so long ago to mounds of mortgage-backed securities built on subprime garbage.

Unfortunately, Obama's main response to S&P's downgrade and the economic crisis more generally has been to continue blaming almost everything on his admittedly inept predecessor, George W. Bush, and on the Republican Congress.

Another familiar theme is the unwillingness of the evil rich to pay more taxes. (I have one modest proposal that could save the President valuable time in this regard. Rather than continuing to repeat the long phrase "millionaires and billionaires," I suggest a merger: "mibillionaires." I know it looks funny and is hard to say on a first try, but after three or four repetitions it becomes strikingly mellifluous.)

Enough.

The way forward to restoring our AAA rating begins with Obama taking seriously the surprisingly sound report by his recent bipartisan debt and deficit commission. Building on those recommendations, I have constructed a fiscal plan:

• Make structural reforms to the main entitlement programs starting with increases in ages of eligibility and a shift to an economically appropriate indexing formula.

• Eliminate the unwise increases of federal spending by Bush and Obama, including added outlays for education, farm and ethanol subsidies, and expansions of Medicare and Medicaid.

• Lower the structure of marginal tax rates in the individual income tax.

• Pay for the rate cuts by gradually phasing out the main tax-expenditure items, including preferences for home-mortgage interest, state and local income taxes, and employee fringe benefits.

• Permanently eliminate federal corporate and estate taxes, levies that are inefficient and raise comparatively little money.

• Introduce a broad-based expenditure tax, such as a value-added tax (VAT). Depending on the structure of exemptions, a rate of 10% should raise about 5% of GDP in revenue.

The VAT system is present in most developed countries and can be highly efficient because it has a flat rate, falls on consumption and has built-in mechanisms for ensuring compliance. However, a VAT is also a magnet for criticism from conservatives - who worry that it would promote a larger government.

I share this concern and would defend a VAT only if it can be firmly linked to the other parts of the reform package. But more fundamentally, given the projected path of entitlement spending, I see no reasonable alternative.

It is hard to imagine President Obama becoming the leader of this kind of broad fiscal initiative. Though he has endorsed some pieces of some of these components, the embrace has been halting. He is hedging, not leading.

Thus, as S&P observed, uncertainty about our fiscal path will likely not be resolved at least until the outcomes of next year's crucial elections.

The one person with the power to eliminate part of this uncertainty is the President, who could nobly decide not to stand for reelection, thereby following in the footsteps of Lyndon Johnson and Calvin Coolidge. Johnson was forced out by a different type of crisis, Vietnam, and he hung on too long, delaying his announcement until he saw his poor performance in the New Hampshire primary and in subsequent electoral polls. Coolidge is a more dignified model, as he opted out in 1927 while things were going fine. In fact, Obama could borrow Coolidge's memorable phrase, "I do not choose to run."
