Publications by Author: Ferguson, Niall

2015

It is very rare for an official biography to be also a revisionist biography, but this one is. Usually it’s the official life that the revisionists attempt to dissect and refute, but such is the historical reputation of Henry Kissinger, and the avalanche of books and treatises already written about him, that Niall Ferguson’s official biography is in part an effort to revise the revisionists. Though not without trenchant criticisms, “Kissinger. Volume I. 1923-1968: The Idealist” — which takes its subject up to the age of 45, about to begin his first stint of full-time government service — constitutes the most comprehensive defense of Kissinger’s outlooks and actions since his own three-volume, 3,900-page autobiography, published between 1979 and 1999.

Kissinger: 1923-1968: The Idealist
Ferguson, Niall. 2015. Kissinger: 1923-1968: The Idealist. Penguin Press.

The definitive biography of Henry Kissinger, based on unprecedented access to his private papers

No American statesman has been as revered or as reviled as Henry Kissinger. Once hailed as “Super K”—the “indispensable man” whose advice has been sought by every president from Kennedy to Obama—he has also been hounded by conspiracy theorists, scouring his every “telcon” for evidence of Machiavellian malfeasance. Yet as Niall Ferguson shows in this magisterial two-volume biography, drawing not only on Kissinger’s hitherto closed private papers but also on documents from more than a hundred archives around the world, the idea of Kissinger as the ruthless arch-realist is based on a profound misunderstanding.

The first half of Kissinger’s life is usually skimmed over as a quintessential tale of American ascent: the Jewish refugee from Hitler’s Germany who made it to the White House. But in this first of two volumes, Ferguson shows that what Kissinger achieved before his appointment as Richard Nixon’s national security adviser was astonishing in its own right. Toiling as a teenager in a New York factory, he studied indefatigably at night. He was drafted into the U.S. infantry and saw action at the Battle of the Bulge—as well as the liberation of a concentration camp—but ended his army career interrogating Nazis. It was at Harvard that Kissinger found his vocation. Having immersed himself in the philosophy of Kant and the diplomacy of Metternich, he shot to celebrity by arguing for “limited nuclear war.” Nelson Rockefeller hired him. Kennedy called him to Camelot. Yet Kissinger’s rise was anything but irresistible. Dogged by press gaffes and disappointed by “Rocky,” Kissinger seemed stuck—until a trip to Vietnam changed everything.
 
The Idealist is the story of one of the most important strategic thinkers America has ever produced. It is also a political Bildungsroman, explaining how “Dr. Strangelove” ended up as consigliere to a politician he had always abhorred. Like Ferguson’s classic two-volume history of the House of Rothschild, Kissinger sheds dazzling new light on an entire era. The essential account of an extraordinary life, it recasts the Cold War world.

2014
Ferguson, Niall. 2014. “Scots Must Vote Nae.” New York Times.

To most Americans, Scotland means golf, whisky and—if they go there—steady drizzle. Even to the millions of Americans whose surnames testify to their Scottish or Scotch-Irish ancestry, the idea that Scotland might be about to become an independent country is baffling.

2012
Ferguson, Niall. 2012. “Why Obama Is Winning.”
Ferguson, Niall. 2012. “The West, Plagued by Self-Doubt.” Harvard Gazette.
2011
Civilization: The West and the Rest

The rise to global predominance of Western civilization is the single most important historical phenomenon of the past five hundred years. All over the world, an astonishing proportion of people now work for Western-style companies, study at Western-style universities, vote for Western-style governments, take Western medicines, wear Western clothes, and even work Western hours. Yet six hundred years ago the petty kingdoms of Western Europe seemed unlikely to achieve much more than perpetual internecine warfare. It was Ming China or Ottoman Turkey that had the look of world civilizations. How did the West overtake its Eastern rivals? And has the zenith of Western power now passed?

In Civilization: The West and the Rest, bestselling author Niall Ferguson argues that, beginning in the fifteenth century, the West developed six powerful new concepts that the Rest lacked: competition, science, the rule of law, consumerism, modern medicine, and the work ethic. These were the "killer applications" that allowed the West to leap ahead of the Rest, opening global trade routes, exploiting newly discovered scientific laws, evolving a system of representative government, more than doubling life expectancy, unleashing the Industrial Revolution, and embracing a dynamic work ethic. Civilization shows just how fewer than a dozen Western empires came to control more than half of humanity and four fifths of the world economy.

Yet now, Ferguson argues, the days of Western predominance are numbered, not because of clashes with rival civilizations, but simply because the Rest have now downloaded the six killer apps we once monopolized, while the West has literally lost faith in itself.

Civilization does more than tell the gripping story of the West's slow rise and sudden demise; it also explains world history with verve, clarity, and wit. Controversial but cogent and compelling, Civilization is Ferguson at his very best.

Some of you knew Ted Forstmann much better than I did. Most of you knew him much longer. When Ted’s family and closest colleagues asked me to join Mayor Bloomberg and Charlie Rose in offering a eulogy to Ted, I must admit I was hesitant, not to mention humbled. What could be more presumptuous than for a British-born professor to try to do justice to one of the great American capitalists?

And then I remembered the side of Ted that I suspect relatively few of you saw. Teddy the philosopher. Teddy, my coauthor.

When I heard the news of Ted’s death—which we’d been dreading for weeks—my first thought was: he was the most American American I’ve ever known. Financier. Fun lover. Philanthropist. And a man who couldn’t abide cant—in both senses. Cant in the sense of insincere humbug. And can’t in the sense of “this can’t be done.”

And yet there was another side to Ted that was a little less classically all-American. He was, after all, a single parent. He was a man for whom the color line, for so long this country’s curse, was simply not visible.

He was also a matchmaker: a Cupid with a Gulfstream 5 instead of wings. He took a fatherly interest in my romance with Ayaan, whom he did so much to help after she was forced to leave the Netherlands, and who can’t be here for the very excellent reason that she’s about to give birth to our son. Ted was one of those people who didn’t advise her against me, and I’ll be grateful for that until the day I die.

What I really want to remember today, however, is Ted’s secret life as an intellectual. Ted was no ordinary master of the financial universe. He saw things differently. He was what the Germans call a Querdenker, which the English “lateral thinker” doesn’t quite translate.

From the moment we met, he and I talked about his fears for this country’s financial and political system. He had shared my foreboding about the excesses of the early 2000s. And he also shared my fear that when the crisis struck, people would leap to the wrong conclusions.

In a piece we wrote together for The Wall Street Journal back in April of last year, we made an argument that I believe still holds good: that in a mood of legitimate public anger at the consequences of the crisis, this country is drawing the wrong conclusions about its causes.

Unlike many people in the financial world, Ted Forstmann was not afraid to criticize Wall Street. (It was I who had to tone down his invective.) But what Ted dreaded was that the backlash that was bound to follow the crisis would lead to precisely the hypertrophic regulation we now see emerging over literally thousands of pages, as well as to demagogic calls for redistribution via higher tax rates and expanded federal programs.

Ted was convinced that any new regulation should focus strictly on excess leverage and the derivatives markets. Those, for him, were the root causes of the crisis.

With Ronald Reagan, he also passionately believed that enlarging the government was not the answer to the problem; often, it was the problem. That was why he wanted to see more disadvantaged kids going to private schools. His ideal was social mobility, not state-mandated equality. In this, as in so many ways, Ted was very wise.

A couple of years ago, two of my kids had the privilege of having lunch with Ted at one of his favorite restaurants, Harry Cipriani, just nine blocks from here. Last weekend I asked my younger son, who’s now 12, if he remembered the conversation. He did. Ted’s advice was this: “Don’t do the obvious thing. Don’t follow in anybody’s footsteps. Look around you and figure out what’s needed, what’s missing. Then do that.”

I hope my son heeds that advice. I hope his whole generation heeds it. I know, Everest and Siya, that you will.

I admit I was surprised by my own reaction to the news of his death. My first thought was: oh, no, now I won’t be able to ask Ted what he thinks anymore. What he thinks about the economy. What he thinks about politics. I won’t be able to get his take on the presidential candidates. And suddenly I felt really bereft.

That morning I had to write a column for Newsweek. I couldn’t help myself: I just sat down and addressed it directly to him. What’s your take, Ted? As I was writing it—and boy, did the words flow—I realized just how much I am going to miss his wisdom. Because I could never predict what Ted’s take would be. To a pedestrian, risk-averse academic like me, the way he thought about the world was full of surprises—and always illuminating ones.

Ted, you were in many ways the most American of Americans. You were the quintessential doer. But you were also a thinker. And we really do miss the unique way you thought.

Wisdom is in short supply these days. You took so much with you when you left us.

Ferguson, Niall. 2011. “Romney to the Rescue.”

This column is for Ted Forstmann: financier, fun lover, and philanthropist, who died on Nov. 20. But it’s not just for him. It’s to him.

Ted, I’m worried. I wish you were still around to help me get this right. The US is going nuts with populism. That’s always to be expected after a big financial crisis, I know. But this is dysfunctional.

On one side, there are conservative fundamentalists—the Tea Party—who think we can turn the clock back to before the New Deal, if not further. Some of them want to get rid not just of the Federal Reserve but of most of the federal government itself. I have more sympathy with these Teapopulists than with the other lot, the motley crew who want to Occupy Wall Street (call them the Occupopulists). But when it comes to practical politics, this Tea Party has more in common with the Mad Hatter’s than Boston’s.

To begin with, they’ve created a mood in the Republican Party that makes any kind of compromise on our fiscal crisis impossible. We just saw the ignominious failure of the supercommittee, which was supposed to come up with a plan to reduce the deficit. Predictably, each party blames the other side for this flop. Either way, the consequences are dire. First, the markets are spooked, just the way they were by the partisan dogfight over the debt ceiling earlier this year. Second, the country is now on course for more drastic spending cuts in 2013, which could not only slash our defense budget in an irresponsible way but also plunge the economy back into recession.

There’s another problem. Just like the populists of a century ago, the Teapopulists are drawn compulsively to disastrous presidential wannabes. I never asked you what you thought of Mitt Romney, Ted. But I am sure you’d prefer him over the other contenders. Bachmann, Perry, Cain, Gingrich—the one thing these people have in common is that they would lose to Barack Obama next year even if the unemployment rate were twice what it is now. Their appeal to the crucial center—to the independents and the undecided—is simply too weak.

What’s the case against Romney? That he’s a Mormon? Ted, you were a devout Catholic, just as I am a doubting atheist. But this is America. Religion and government are separate. And we tolerate all faiths, no matter how idiosyncratic, provided they tolerate ours too. That he’s changed his mind on hot-button issues? Well, so does any intelligent person. You often did. What is this, a dogmatism contest?

The good news is that today’s teenagers are avid readers and prolific writers. The bad news is that what they are reading and writing are text messages.

According to a survey carried out last year by Nielsen, Americans between the ages of 13 and 17 send and receive an average of 3,339 texts per month. Teenage girls send and receive more than 4,000.

It’s an unmissable trend. Even if you don’t have teenage kids, you’ll see other people’s offspring slouching around, eyes averted, tapping away, oblivious to their surroundings. Take a group of teenagers to see the seven wonders of the world. They’ll be texting all the way. Show a teenager Botticelli’s Adoration of the Magi. You might get a cursory glance before a buzz signals the arrival of the latest SMS. Seconds before the earth is hit by a gigantic asteroid or engulfed by a super tsunami, millions of lithe young fingers will be typing the human race’s last inane words to itself:

C u later NOT :(

Now, before I am accused of throwing stones in a glass house, let me confess. I probably send about 50 emails a day, and I receive what seem like 200. But there’s a difference. I also read books. It’s a quaint old habit I picked up as a kid, in the days before cellphones began nesting, cuckoolike, in the palms of the young.

Half of today’s teenagers don’t read books—except when they’re made to. According to the most recent survey by the National Endowment for the Arts, the proportion of Americans between the ages of 18 and 24 who read a book not required at school or at work is now 50.7 percent, the lowest for any adult age group younger than 75, and down from 59 percent 20 years ago.

Back in 2004, when the NEA last looked at younger readers’ habits, it was already the case that fewer than one in three 13-year-olds read for pleasure every day. Especially terrifying to me as a professor is the fact that two thirds of college freshmen read for pleasure for less than an hour per week. A third of seniors don’t read for pleasure at all.

Why does this matter? For two reasons. First, we are falling behind more-literate societies. According to the results of the Organization for Economic Cooperation and Development’s most recent Program for International Student Assessment, the gap in reading ability between the 15-year-olds in the Shanghai district of China and those in the United States is now as big as the gap between the U.S. and Serbia or Chile.

But the more important reason is that children who don’t read are cut off from the civilization of their ancestors.

So take a look at your bookshelves. Do you have all (better make that any) of the books on the Columbia University undergraduate core curriculum? It’s not perfect, but it’s as good a list of the canon of Western civilization as I know of. Let’s take the 11 books on the syllabus for the spring 2012 semester: (1) Virgil’s Aeneid; (2) Ovid’s Metamorphoses; (3) Saint Augustine’s Confessions; (4) Dante’s The Divine Comedy; (5) Montaigne’s Essays; (6) Shakespeare’s King Lear; (7) Cervantes’s Don Quixote; (8) Goethe’s Faust; (9) Austen’s Pride and Prejudice; (10) Dostoevsky’s Crime and Punishment; (11) Woolf’s To the Lighthouse.

Step one: Order the ones you haven’t got today. (And get War and Peace, Great Expectations, and Moby-Dick while you’re at it.)

Step two: When vacation time comes around, tell the teenagers in your life you are taking them to a party. Or to camp. They won’t resist.

Step three: Drive to a remote rural location where there is no cell-phone reception whatsoever.

Step four: Reveal that this is in fact a reading party and that for the next two weeks reading is all you are proposing to do—apart from eating, sleeping, and talking about the books.

Welcome to Book Camp, kids.

How different would the world be today if there had been no 9/11? What if the attacks had been foiled or bungled? One obvious answer is that Americans would probably care a lot less than they do about the rest of the world.

Back on the eve of destruction, in early September 2001, only 13 percent of Americans believed that the U.S. should be “the single world leader.” And fewer than a third favored higher defense spending. Now those figures are naturally much higher. Right?

Wrong. According to the most recent surveys, just 12 percent of Americans today think the U.S. should be the sole superpower—almost exactly the same proportion as on the eve of the 9/11 attacks. The share of Americans who want to see higher spending on national security is actually down to 26 percent. Paradoxically, Americans today seem less interested in the wider world than they were before the Twin Towers were felled.

In the past 10 years, the U.S. has directly or indirectly overthrown at least three governments in the Muslim world. Yet Americans today feel less powerful than they did then. In 2001 just over a quarter felt that the U.S. had “a less important role as a world leader compared to 10 years ago.” The latest figure is 41 percent.

Three explanations suggest themselves. First, wielding power abroad proved harder in practice than in neoconservative theory. Second, the financial crisis has dampened American spirits. A third possibility is that 9/11 simply didn’t have that big an impact on American opinion.

Yet to conclude that 9/11 didn’t change much is to misunderstand the historical process. The world is a seriously complex place, and a small change to the web of events can have huge consequences. Our difficulty is imagining what those consequences might have been.

So let’s play a game like the one my friends at the Muzzy Lane software company are currently designing, which has the working title “New World Disorder.” The game simulates the complex interaction of economics, politics, and international relations, allowing us to replay the past.

Let’s start in January 2001 and thwart the 9/11 attacks by having Condi Rice and Paul Wolfowitz heed Richard Clarke’s warnings about Al-Qaeda. The game starts off well. Al-Qaeda is preemptively decapitated, its leaders rounded up in a series of covert operations and left to the tender mercies of their home governments. President Bush gets to focus on tax cuts, his first love.

But then, three years later, the murky details of this operation surface on the front page of The New York Times. John Kerry, the Democratic candidate for the presidency, denounces the “criminal conduct” of the Bush administration. Liberal pundits foam at the mouth. Ordinary Americans, unseared by 9/11, are shocked. Osama bin Laden issues a fierce denunciation of the U.S. from his Saudi prison cell. It triggers a wave of popular anger in the Middle East that topples any regime seen as too close to Washington.

The government of Qatar: gone. The government of Kuwait: gone. Above all, the government of Saudi Arabia: gone. True to form, the experts are soon all over network TV explaining how this fundamentalist backlash against the U.S.-backed oil monarchies had been years in the making (even if they hadn’t quite gotten around to predicting it beforehand). “Who lost the Middle East?” demands Kerry, pointing an accusing finger at George W. Bush. (Remember, prior to 9/11 Bush favored a reduction of U.S. overseas commitments.) The Democrats win the 2004 election, whereupon bin Laden’s new Islamic Republic of Arabia takes hostages at the U.S. Embassy in Riyadh…

In other words, if things had happened differently 10 years ago, if there had been no 9/11 and no retaliatory invasions of Afghanistan and Iraq, we might be living through an Islamist Winter rather than an Arab Spring.

Replaying the history game without 9/11 suggests that, ironically, the real impact of the attacks was not on Americans but on the homelands of the attackers themselves.

Ferguson, Niall. 2011. “World on Wi-Fire.”

The human race is interconnected as never before. Is that a good thing? Ask the Lords of the Internet—the men running the companies Eric Schmidt of Google recently called “the Four Horsemen”: Amazon, Apple, Facebook, and Google—and you’ll get an unequivocal “yes.” But is it true? In view of the extraordinary economic and political instability of recent months, it’s worth asking if the Netlords are the Four Horsemen of a new kind of information apocalypse.

Don’t get me wrong. I love all that these companies have achieved. I order practically everything except haircuts from Amazon. I write this column on a MacBook Pro. I communicate with my kids via Facebook. It’s 6:55 a.m., and I’ve already run six searches on Google. Did I forget to mention that I’ve already received 29 emails and sent 14?

I also really like the Netlords. They are among the smartest guys on the planet. Yet they are also self-deprecating and sometimes very funny. (OK, not Steve Jobs.) So my question for them is a real question, not some kind of Luddite rant: does the incredible network you have created, with its unprecedented scale and speed, not contain a vulnerability? I’m not talking here about the danger of its exploitation by Islamist extremists or its incapacitation by Chinese cyberwarriors, though I worry about those things too. No, I mean the possibility that the global computer network formed by technologically unified human minds is inherently unstable—and that it is ushering in an era of intolerable volatility.

The communications revolution we are living through has been driven by two great forces. One is Gordon E. Moore’s “law” (which he first proposed in 1965) that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every 18 months. In its simplified form, Moore’s Law says that computing power will double every two years, implying a roughly 30-fold increase in 10 years. This exponential trend has now continued for more than half a century and is expected by the techies to continue until at least 2015 or 2020.
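
The arithmetic behind that figure is simple exponential growth. Here is a minimal sketch (in Python, purely for illustration; the two doubling periods are the assumptions quoted in the paragraph above):

```python
# Illustrative arithmetic for the growth factors cited above.
# Assumptions (from the paragraph): computing power doubles every
# 2 years; transistor counts double roughly every 18 months.

def growth_factor(years: float, doubling_period: float) -> float:
    """Multiplicative growth after `years`, given a doubling period in years."""
    return 2 ** (years / doubling_period)

# Doubling every 2 years for a decade: 2**5 = 32, i.e. "roughly 30-fold".
print(growth_factor(10, 2.0))   # 32.0
# An 18-month doubling period compounds even faster over the same decade.
print(growth_factor(10, 1.5))   # about 101.6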

The other force is the exponential growth of human networks. The first email was sent at the Massachusetts Institute of Technology in the same year Moore’s Law was born. In 2006 people sent 50 billion emails; last year it was 300 billion. The Internet was born in 1982. As recently as 1993 only 1 percent of two-way telecommunication went through it. By 2000 it was 51 percent. Now it’s 97 percent. Facebook was dreamed up by an über-nerd at my university in 2004. It has 800 million active users today—eight times the number of three years ago.

Russian venture capitalist Yuri Milner sees this trend as our friend (it has certainly been his). As the number of people online doubles from 2 billion to 4 billion over the next 10 years and the number of Internet-linked devices quadruples from 5 billion to 20 billion, mankind collectively gets more knowledge—and gets smarter. Speaking at a conference in Ukraine in mid-September, Milner asserted that data equivalent to the total volume of information created from the beginning of human civilization until 2003 can now be generated in the space of just two days. To cope with this information overload, he looks forward to “the emergence of the global brain, which consists of all the humans connected to each other and to the machine and interacting in a very unique and profound way, creating an intelligence that does not belong to any single human being or computer.”

In the future as imagined by Google, this global brain will do much of our thinking for us, telling us (through our handheld devices) which of our friends is just around the next corner and where we can buy that new suit we need for the best price. And if the best price is on Amazon, we’ll just click once and look forward to its next-day delivery. Maybe it’ll already be there when we get home.

That’s the kind of sci-fi scenario that gets a true nerd out of bed in the morning. But is it just a bit too utopian?

Exhibit one for a contrarian view is the recent behavior of global financial markets, the area of human activity furthest down the road of computerization and automation. According to math wonk Kevin Slavin, algorithms with names like the “Boston Shuffler” are the new masters of the financial universe. Whole tower blocks have been hollowed out to accommodate the computing power required by high-frequency (and very high-speed) trading. So how is this brave new world of robot traders doing?

Well, the VIX index of volatility—Wall Street’s so-called fear gauge, which infers the expected volatility of the U.S. stock market from options prices—reached an all-time high of 80 in the aftermath of Lehman Brothers’ failure and surged back up above 30 in early 2010 and again this summer. Part of this is just a good old-fashioned, man-made financial crisis, of course. But some of the volatility we’ve seen in the past four years is surely attributable to technology: think only of the “flash crash” of May 6 last year, when the Dow Jones industrial average plummeted 9 percent and then rallied in a matter of minutes.

Could the same kind of volatility spread into other markets as these become as wired and as integrated as Planet Finance? The answer must be yes. Consider how Greece’s fiscal woes have destabilized markets across Europe and around the world in recent months. Then there’s the market for consumer durables. We know that the speed with which new technologies have been adopted by American households has increased around eightfold over the past hundred years. But that speed of adoption has its obverse in the speed of obsolescence. Consumers are becoming ever more fickle. Millions bought RIM’s BlackBerry after its advent in 1999. But today the iPhone is the hotter handheld device, and I am far from alone in having a dead BlackBerry in my bottom desk drawer. In late September Amazon launched the Kindle Fire in a bid to challenge the iPad’s dominance of the tablet market. The name is appropriate. The market for such devices is on fire. The whole world is on wi-fire.

In politics, too, online electorates are becoming more volatile. The current race to find a Republican candidate for the presidency is a case in point. Only the other day Sarah Palin was a serious contender. Then Mitt Romney was a shoo-in. Until Rick Perry came along. Until Chris Christie came along. Meanwhile, the number of independent voters who have uncoupled themselves from the traditional parties has reached a historic high of 37 percent. Floating voters are the high-frequency traders of the political market.

Computing power has grown exponentially. So has the human network. But the brain of Homo sapiens remains pretty much the same organ that evolved in the heads of African hunter-gatherers 200,000 years ago. And that brain has a tendency to swing in its mood, from greed to fear and from love to hate.

The reality may be that by joining us all together and deluging us with data, the Netlords have ushered in a new Age of Volatility, in which our primeval emotions are combined and amplified as never before.

We are LinkedIn, but StressedOut. And that “cloud” of downloadable data may yet turn out to be a thundercloud.

The Palestinian leader Mahmoud Abbas’s bid for full U.N. membership was dead on arrival in New York. So why bother even raising the subject? The answer: to drum up international sympathy for the plight of the Palestinians. Yet other defeated peoples have suffered far more than they. Think only of how—and at whose expense—the U.N. itself began.

Born in the gently foggy city of San Francisco, the U.N. was conceived in the Ukrainian resort of Yalta. Though nestled amid the green Crimean hills and lapped by the Black Sea’s languid waves, the city was severely battle-scarred in February 1945; Winston Churchill dubbed it “the Riviera of Hades.” Its diabolical master was the Soviet despot Joseph Stalin, who acted as host to Churchill and the ailing American President Franklin Roosevelt.

Of the Big Three, as Sergei Plokhy shows in his riveting study Yalta: The Price of Peace, Roosevelt alone truly believed in the dream of a world parliament, and even he knew the U.N. would need to give greater weight to the great powers than its ill-starred predecessor, the League of Nations. Thus it was Roosevelt who proposed a Security Council on which the war’s victors—plus France and China—would be permanently represented and armed with veto powers.

Churchill and Stalin were realists. They saw the postwar world in terms of “spheres of influence.” Though perfectly capable of such realism in practice, Roosevelt still yearned for the idealist’s world of peace based on collective security. Yet Churchill was deeply reluctant to accept that Stalin’s postwar sphere of influence would include Poland. His predecessor had acquiesced in the destruction of Czechoslovakia at Munich but had gone to war when Hitler (and Stalin) carved up Poland between them. Was Yalta to be the Poles’ Munich?

“We can’t agree,” grumbled Churchill, “that Poland shall be a mere puppet state of Russia, where the people who don’t agree with Stalin are bumped off.” But that was exactly what postwar Poland became.

A staggering 19 percent of the prewar population of Poland had been killed as a result of World War II, including a huge proportion of the country’s large Jewish population. Yalta inflicted further punishment. The country not only shrank; it was also shifted westward so that Stalin could keep his gains from the 1939 Nazi-Soviet Pact. And it became a Soviet vassal state for the next half century. After Yalta, chess players devised a variant of their game for three players, using a six-sided board. As at the conference, in the game “Yalta” two players can join forces against the third, but all such alliances are temporary. Briefly, Churchill got Roosevelt on his side over Poland, but the American cared more about getting Stalin to agree to join the U.N.; Poland was a pawn to be sacrificed.

Having got what he wanted, Roosevelt left Yalta early. His destination? The Middle East, which he was intent on adding to ... the American sphere of influence. The conflicting commitments he made on that trip—to the Arabs and the Jews—have bedeviled U.S. foreign policy ever since. Asked by Roosevelt if he was a Zionist, Stalin replied elliptically that he “was one in principle, but he recognized the difficulty.”

That “difficulty” remains that a Jewish state could be created only at the expense of non-Jews living in Palestine. The Arabs resisted Israel’s creation, but they lost. So it goes. A trip to Yalta provides a salutary reminder that throughout history those who lose at war generally lose land, too, and sometimes sovereignty with it. By comparison with what the Poles endured last century, the Palestinians have got off lightly.

They will get their own state eventually. But not until all the permanent members of the Security Council are convinced the Palestinians will not abuse the privileges of statehood.

Like it or not, that was how the U.N. was meant to work when the Big Three conceived it on Hell’s Riviera.

After years when young Americans yearned only to be occupied on Wall Street, suddenly they have taken to occupying it. It’s easy to scoff at this phenomenon. I know, because I have.

This is certainly not America’s answer to the Arab Spring—the Bobo Fall perhaps, unmistakably both bohemian and bourgeois. But it’s still worth taking seriously. What is it that makes evidently educated young people yearn to adopt leftist positions that are eerily reminiscent of the ones their parents adopted in 1968?

Check out the protesters’ website, which on Monday featured a speech by Slovenian critical theorist Slavoj Žižek. At first I thought this must be some kind of parody, but no, he really exists—red T-shirt, Krugman beard, and all: “The only sense in which we are communists is that we care for the commons. The commons of nature. The commons of what is privatized by intellectual property. The commons of biogenetics. For this and only for this we should fight.”

Yeah, man. Property is theft. Ne travaillez jamais (“never work”). And all that.

There are three possible explanations for this retrogression to the language of ’68. 

1. Increasing inequality exemplified by Wall Street is worth protesting against.

2. So is the fact that only a handful of bankers have been prosecuted for their part in the financial crisis.

3. Demonstrating is way cool.

Yet if I were a young American today, occupying Wall Street would not be my objective. Just reflect for a minute on the unbridled economic mayhem that would ensue if the protesters actually succeeded. The headline “Goldman Sachs Under Control of Hip Teenage Revolutionaries” would be the last straw for an already fragile economic recovery.

Now ask yourself what the financial crisis really means for today's 15- to 24-year-olds. Not only has it raised the probability that they will be unemployed after graduation. More seriously, it has massively increased the debt that they will have to service when they do get jobs.

Never in the history of intergenerational transfers has one generation left such a mountain of IOUs to another as the baby boomers are leaving to their grandchildren.

When you do the math, there is only one logical political home for today’s teens and 20-somethings ... and that is the Tea Party. For who else is promising to slash Medicare and Social Security and keep the tax burden at its historical average?

Let’s just remind ourselves of the report of the Trustees of the Social Security and Medicare trust funds back in 2007, which projected a rise in the cost of these two programs from 7.3 percent of gross domestic product to 17.5 percent by 2030. The trustees warned that to achieve actuarial balance—in other words, solvency—for these two programs would require (for Social Security) an increase of 16 percent in payroll tax revenues or an immediate reduction in benefits of 13 percent. For Medicare we are talking a 122 percent increase in payroll taxes or a 51 percent cut in spending.
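
To make those percentages concrete, here is a minimal sketch (in Python, purely for illustration): it applies the trustees' required revenue increases to the statutory payroll tax rates of the time, an assumption on my part, since the report quotes only the percentage increases.

```python
# Illustrative arithmetic for the trustees' options quoted above.
# Assumptions (mine, for illustration): the increases apply to the
# statutory payroll tax rates of the time, 12.4% for Social Security
# (OASDI) and 2.9% for Medicare hospital insurance (HI).

def raised_rate(current_rate: float, required_increase: float) -> float:
    """Payroll tax rate after scaling revenues up by `required_increase`."""
    return current_rate * (1 + required_increase)

print(f"Social Security: 12.4% -> {raised_rate(0.124, 0.16):.1%}")  # ~14.4%
print(f"Medicare (HI):    2.9% -> {raised_rate(0.029, 1.22):.1%}")  # ~6.4%
```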

As Laurence Kotlikoff and Scott Burns pointed out in The Coming Generational Storm, by 2030 there will be twice as many retirees as there are today but only 18 percent more workers. Unless there is really radical reform of entitlement programs, especially Medicare, the next generation of American workers will be paying roughly double the taxes their parents and grandparents paid. This is what Kotlikoff and Burns mean by “fiscal child abuse.”

Of these harsh realities the occupiers of Wall Street seem blissfully unaware. Fixated on the idea that they somehow represent the 99 percent of people who scrape by on 80 percent of total income, they fail to see that the real distributional conflict of our time is not between percentiles, much less classes, but between generations. And no generation has a keener interest in slashing future spending on entitlements than today’s teens and 20-somethings.

So occupying Wall Street is not the answer to this generation’s problems. The answer is to occupy the Tea Party—and wrest it from the grumpy old men who currently run it.

Call it the Iced Tea Party.

Way cool.

This essay is not about Steve Jobs. It is about the countless individuals with roughly the same combination of talents of whom we’ve never heard and never will.

Most of the 106 billion people who’ve ever lived are dead—around 94 percent of them. And most of those dead people were Asian—probably more than 60 percent. And most of those dead Asians were dirt poor. Born into illiterate peasant families enslaved by subsistence agriculture under some or other form of hierarchical government, the Steves of the past never stood a chance.

Chances are, those other Steves didn’t make it into their 30s, never mind their mid-50s. An appalling number died in childhood, killed off by afflictions far easier to treat than pancreatic cancer. The ones who made it to adulthood didn’t have the option to drop out of college because they never went to college. Even the tiny number of Steves who had the good fortune to rise to the top of premodern societies wasted their entire lives doing calligraphy (which Jobs briefly dabbled in at Reed College). Those who sought to innovate were more likely to be punished than rewarded.

Today, according to estimates by Credit Suisse, there is approximately $195 trillion of wealth in the world. Most of it was made quite recently, in the wake of those great political and economic revolutions of the late 18th century, which, for the first time in human history, put a real premium on innovation. And most of it is owned by Westerners—Europeans and the inhabitants of the New World and the Antipodes settled by their descendants. We may account for less than a fifth of humanity, but we Westerners still own two thirds of global wealth.

A nontrivial portion of that wealth ($6.7 billion) belonged to Steve Jobs and now belongs to his heirs. In that respect, Jobs personified the rising inequality that is one of the striking characteristics of his lifetime. Back in 1955 the top 1 percent of Americans earned 9 percent of income. Today the figure is above 14 percent.

Yet there is no crowd of young people rampaging through Palo Alto threatening to “Occupy Silicon Valley.” The huge amounts of money made by Jobs and his fellow pioneers of personal computing are not resented the way the vampire squids of Wall Street are. On the contrary, Jobs is revered. One eminent hedge-fund manager (who probably holds a healthy slice of Apple stock as well as the full array of iGadgets) recently likened him to Leonardo da Vinci.

So the question is not, how do we produce more Steves? The normal process of human reproduction will ensure a steady supply of what Malcolm Gladwell has called “outliers.” The question should be, how do we ensure that the next Steve Jobs fulfills his potential?

An adopted child, the biological son of a Syrian Muslim immigrant, a college dropout, a hippie who briefly converted to Buddhism and experimented with LSD—Jobs was the type of guy no sane human resources department would have hired. I doubt that Apple itself would hire someone with his résumé at age 20. The only chance he ever had to become a chief executive officer was by founding his own company.

And that—China, please note—is why capitalism needs to be embedded in a truly free society in order to flourish. In a free society a weirdo can do his own thing. In a free society he can even fail at his own thing, as Jobs undoubtedly did in his first stint in charge of Apple. And in a free society he can bounce back and revolutionize all our lives.

Somewhere in his father’s native Syria another Steve Jobs has just died. But this other Steve was gunned down by a tyrannical government. And what wonders his genius might have produced we shall never know.

It was a scene to curdle liberal blood. A ballroom full of New York hedge-fund managers playing poker…to raise money for charter schools.

That’s where I found myself last Wednesday: at a Texas Hold ’Em tournament to raise money for the Success Charter Network, which currently runs nine schools in some of New York’s poorest neighborhoods.

While Naomi Wolf was being arrested for showing solidarity with the Occupy Wall Street movement, there I was, consorting with the 1 percent the protesters hate. It’s no surprise that the bread-heads enjoy gambling. But to see them using their ill-gotten gains to subvert this nation’s great system of public education! I was shocked, shocked.

Except that I wasn’t. I was hugely cheered up. America’s financial elite needs a compelling answer to Occupy Wall Street. This could be it: educate Harlem…with our poker chips.

Life, after all, is a lot like poker. No matter how innately smart you may be, it’s very hard to win if you are dealt a bad hand.

Americans used to believe in social mobility regardless of the hand you’re dealt. Ten years ago, polls showed that about two thirds believed “people are rewarded for intelligence and skill,” the highest percentage across 27 countries surveyed. Fewer than a fifth thought that “coming from a wealthy family is essential [or] very important to getting ahead.” Such views made Americans more tolerant than Europeans and Canadians of inequality and more suspicious of government attempts to reduce it.

Yet the hardships of the Great Recession may be changing that, giving an unexpected resonance to the Occupy Wall Street movement. Falling wages and rising unemployment are making us appreciate what we ignored during the good times. Social mobility is actually lower in the U.S. than in most other developed countries—and falling.

Academic studies show that if a child is born into the poorest quintile (20 percent) of the U.S. population, his chance of making it into the top decile (10 percent) is around 1 in 20, whereas a kid born into the top quintile has a better than 40 percent chance. On average, then, a father’s earnings are a pretty good predictor of his son’s earnings. This is less true in Europe or Canada. What’s more, American social mobility has declined markedly in the past 30 years.

A compelling explanation for our increasingly rigid social system is that American public education is failing poor kids. One way it does this is by stopping them from getting to college. If your parents are in the bottom quintile, you have a 19 percent chance of getting into the top quintile with a college degree—but a miserable 5 percent chance without one.

Your ZIP code can be your destiny, because poor neighborhoods tend to have bad schools, and bad schools perpetuate poverty. But the answer is not to increase spending on this failed system—nor to expand it at the kindergarten level, as proposed by Nicholas Kristof in The New York Times last week. As brave reformers like Eva Moskowitz know, the stranglehold exerted by the teachers’ unions makes it almost impossible to raise the quality of education in subprime public schools.

The right answer is to promote the kind of diversity and competition that already make the American university system the world’s best. And one highly effective way of doing this is by setting up more charter schools—publicly funded but independently run and union-free. The performance of the Success Charter Network speaks for itself. In New York City’s public schools, 60 percent of third, fourth, and fifth graders passed their math exams last year. The figure at Harlem Success was 94 percent.

The American Dream is about social mobility, not enforced equality. It’s about competition, not public monopoly. It’s also about philanthropy, not confiscatory taxation.

I’ll cheer up even more when I hear those words at a Republican presidential debate. Or maybe next week we should just tell the candidates to shut up and play poker.

Ferguson, Niall. 2011. “Murder on the EU Express.” Newsweek.
Ferguson, Niall. 2011. “Sale of the Century.”
In my favorite spaghetti western, The Good, the Bad and the Ugly, there is a memorable scene that sums up the world economy today. Blondie (Clint Eastwood) and Tuco (Eli Wallach) have finally found the cemetery where they know the gold is buried. Trouble is, they’re in a vast Civil War graveyard, and they don’t know where to find the loot. Eastwood looks at his gun, looks at Wallach, and utters the immortal line: “In this world, there are two kinds of people, my friend. Those with loaded guns … and those who dig.”

In the post-crisis economic order, there are likewise two kinds of economies. Those with vast accumulations of assets, including sovereign wealth funds (currently in excess of $4 trillion) and hard-currency reserves ($5.5 trillion for emerging markets alone), are the ones with loaded guns. The economies with huge public debts, by contrast, are the ones that have to dig. The question is, just how will they dig their way out?...

The U.S. needs to do exactly what it would if it were a severely indebted company: sell off assets to balance its books.

There are three different arguments against such asset sales. The first concerns national security. When Dubai Ports World bought the shipping company P&O in 2006—which would have given it control of facilities in a number of U.S. ports—the deal was killed in Congress in a fit of post-9/11 paranoia. The second argument is usually made by unions: private or foreign owners will be tougher on American workers than good old Uncle Sam. Finally, there’s the chauvinism that surfaced back in the 1980s when the Japanese were snapping up properties like Pebble Beach. How could the United States let its national treasures—the family silver—fall into the hands of inscrutable Asian rivals?

Such arguments were never very strong. Now, in the midst of the biggest crisis of American public finance since the Civil War, they simply collapse....

“The statesman can only wait and listen until he hears the footsteps of God resounding through events; then he must jump up and grasp the hem of His coat, that is all.” Thus Otto von Bismarck, the great Prussian statesman who united Germany and thereby reshaped Europe’s balance of power nearly a century and a half ago.

Last week, for the second time in his presidency, Barack Obama heard those footsteps, jumped up to grasp a historic opportunity … and missed it completely.

In Bismarck’s case it was not so much God’s coattails he caught as the revolutionary wave of mid-19th-century German nationalism. And he did more than catch it; he managed to surf it in a direction of his own choosing. The wave Obama just missed—again—is the revolutionary wave of Middle Eastern democracy. It has surged through the region twice since he was elected: once in Iran in the summer of 2009, the second time right across North Africa, from Tunisia all the way down the Red Sea to Yemen. But the swell has been biggest in Egypt, the Middle East’s most populous country.

In each case, the president faced stark alternatives. He could try to catch the wave, Bismarck style, by lending his support to the youthful revolutionaries and trying to ride it in a direction advantageous to American interests. Or he could do nothing and let the forces of reaction prevail. In the case of Iran, he did nothing, and the thugs of the Islamic Republic ruthlessly crushed the demonstrations. This time around, in Egypt, it was worse. He did both—some days exhorting Egyptian President Hosni Mubarak to leave, other days drawing back and recommending an “orderly transition.”

The result has been a foreign-policy debacle. The president has alienated everybody: not only Mubarak’s cronies in the military, but also the youthful crowds in the streets of Cairo. Whoever ultimately wins, Obama loses. And the alienation doesn’t end there. America’s two closest friends in the region—Israel and Saudi Arabia—are both disgusted. The Saudis, who dread all manifestations of revolution, are appalled at Washington’s failure to resolutely prop up Mubarak. The Israelis, meanwhile, are dismayed by the administration’s apparent cluelessness.

Last week, while other commentators ran around Cairo’s Tahrir Square, hyperventilating about what they saw as an Arab 1989, I flew to Tel Aviv for the annual Herzliya security conference. The consensus among the assembled experts on the Middle East? A colossal failure of American foreign policy.

This failure was not the result of bad luck. It was the predictable consequence of the Obama administration’s lack of any kind of coherent grand strategy, a deficit about which more than a few veterans of U.S. foreign policy making have long worried. The president himself is not wholly to blame. Although cosmopolitan by both birth and upbringing, Obama was an unusually parochial politician prior to his election, judging by his scant public pronouncements on foreign-policy issues.

Yet no president can be expected to be omniscient. That is what advisers are for. The real responsibility for the current strategic vacuum lies not with Obama himself, but with the National Security Council, and in particular with the man who ran it until last October: retired Gen. James L. Jones. I suspected at the time of his appointment that General Jones was a poor choice. A big, bluff Marine, he once astonished me by recommending that Turkish troops might lend the United States support in Iraq. He seemed mildly surprised when I suggested the Iraqis might resent such a reminder of centuries of Ottoman Turkish rule.

The best national-security advisers have combined deep knowledge of international relations with an ability to play the Machiavellian Beltway game, which means competing for the president’s ear against the other would-be players in the policymaking process: not only the defense secretary but also the secretary of state and the head of the Central Intelligence Agency. No one has ever done this better than Henry Kissinger. But the crucial thing about Kissinger as national-security adviser was not the speed with which he learned the dark arts of interdepartmental turf warfare. It was the skill with which he, in partnership with Richard Nixon, forged a grand strategy for the United States at a time of alarming geopolitical instability.

The essence of that strategy was, first, to prioritize (for example, détente with the Soviets before human-rights issues within the U.S.S.R.) and then to exert pressure by deliberately linking key issues. In their hardest task—salvaging peace with honor in Indochina by preserving the independence of South Vietnam—Nixon and Kissinger ultimately could not succeed. But in the Middle East they were able to eject the Soviets from a position of influence and turn Egypt from a threat into a malleable ally. And their overtures to China exploited the divisions within the Communist bloc, helping to set Beijing on an epoch-making new course of economic openness.

The contrast between the foreign policy of the Nixon-Ford years and that of President Jimmy Carter is a stark reminder of how easily foreign policy can founder when there is a failure of strategic thinking. The Iranian Revolution of 1979, which took the Carter administration wholly by surprise, was a catastrophe far greater than the loss of South Vietnam.

Remind you of anything? “This is what happens when you get caught by surprise,” an anonymous American official told The New York Times last week. “We’ve had endless strategy sessions for the past two years on Mideast peace, on containing Iran. And how many of them factored in the possibility that Egypt moves from stability to turmoil? None.”

I can think of no more damning indictment of the administration’s strategic thinking than this: it never once considered a scenario in which Mubarak faced a popular revolt. Yet the very essence of rigorous strategic thinking is to devise such scenarios and to think through the best responses to them, preferably two or three moves ahead of actual or potential adversaries. It is only by doing these things—ranking priorities and gaming scenarios—that a coherent foreign policy can be made. The Israelis have been hard at work doing this. All the president and his NSC team seem to have done is to draft touchy-feely speeches like the one he delivered in Cairo early in his presidency.

These were his words back in June 2009:

America and Islam are not exclusive and need not be in competition. Instead, they overlap, and share common principles—principles of justice and progress; tolerance and the dignity of all human beings.

Those lines will come back to haunt Obama if, as cannot be ruled out, the ultimate beneficiary of his bungling in Egypt is the Muslim Brotherhood, which remains by far the best organized opposition force in the country—and wholly committed to the restoration of the caliphate and the strict application of Sharia. Would such an outcome advance “tolerance and the dignity of all human beings” in Egypt? Somehow, I don’t think so.

Grand strategy is all about the necessity of choice. Today, it means choosing between a daunting list of objectives: to resist the spread of radical Islam, to limit Iran’s ambition to become dominant in the Middle East, to contain the rise of China as an economic rival, to guard against a Russian “reconquista” of Eastern Europe—and so on. The defining characteristic of Obama’s foreign policy has been not just a failure to prioritize, but also a failure to recognize the need to do so. A succession of speeches saying, in essence, “I am not George W. Bush” is no substitute for a strategy.

Bismarck knew how to choose. He understood that riding the nationalist wave would enable Prussia to become the dominant force in Germany, but that thereafter the No. 1 objective must be to keep France and Russia from uniting against his new Reich. When asked for his opinion about colonizing Africa, Bismarck famously replied: “My map of Africa lies in Europe. Here lies Russia and here lies France, and we are in the middle. That is my map of Africa.”

Tragically, no one knows where Barack Obama’s map of the Middle East is. At best, it is in the heartland states of America, where the fate of his presidency will be decided next year, just as Jimmy Carter’s was back in 1980.

At worst, he has no map at all.

2010
The Shock of the Global: The 1970s in Perspective
Ferguson, Niall, Erez Manela, Charles S. Maier, and Daniel Sargent. 2010. The Shock of the Global: The 1970s in Perspective. Harvard University Press.

From the vantage point of the United States or Western Europe, the 1970s was a time of troubles: economic “stagflation,” political scandal, and global turmoil. Yet from an international perspective it was a seminal decade, one that brought the reintegration of the world after the great divisions of the mid-twentieth century. It was the 1970s that introduced the world to the phenomenon of “globalization,” as networks of interdependence bound peoples and societies in new and original ways.

The 1970s saw the breakdown of the postwar economic order and the advent of floating currencies and free capital movements. Non-state actors rose to prominence while the authority of the superpowers diminished. Transnational issues such as environmental protection, population control, and human rights attracted unprecedented attention. The decade transformed international politics, ending the era of bipolarity and launching two great revolutions that would have repercussions in the twenty-first century: the Iranian theocratic revolution and the Chinese market revolution.

The Shock of the Global examines the large-scale structural upheaval of the 1970s by transcending the standard frameworks of national borders and superpower relations. It reveals for the first time an international system in the throes of enduring transformations.

2009
Ferguson, Niall. 2009. “Dead Men Walking.”

There is nothing like a really big economic crisis to separate the Cassandras from the Panglosses, the horsemen of the apocalypse from the Kool-Aid-swigging optimists. No, the last year has shown that all is not for the best in the best of all possible worlds. On the contrary, we might be doomed.

At such times, we do well to remember that most of today’s public intellectuals are mere dwarves, standing on the shoulders of giants. So, if they had e-mail in the hereafter, which of the great thinkers of the past would be entitled to send us a message with the subject line: “I told you so”? And which would prefer to remain offline?

It has, for example, been a bad year for Adam Smith (1723-1790) and his “invisible hand,” which was supposed to steer the global economy onward and upward to new heights of opulence through the action of individual choice in unfettered markets. By contrast, it has been a good year for Karl Marx (1818-1883), who always maintained that the internal contradictions of capitalism, and particularly its tendency to increase the inequality of the distribution of wealth, would lead to crisis and finally collapse. A special mention is also due to early 20th-century Marxist theorist Rudolf Hilferding (1877-1941), whose Das Finanzkapital foresaw the rise of giant “too big to fail” financial institutions.

Joining Smith in embarrassed silence, you might think, is Friedrich von Hayek (1899-1992), who warned back in 1944 that the welfare state would lead the West down the “road to serfdom.” With a government-mandated expansion of health insurance likely to be enacted in the United States, Hayek's libertarian fears appear to have receded, at least in the Democratic Party. It has been a bumper year, on the other hand, for Hayek's old enemy, John Maynard Keynes (1883-1946), whose 1936 work The General Theory of Employment, Interest and Money has become the new bible for finance ministers seeking to reduce unemployment by means of fiscal stimuli. His biographer, Robert Skidelsky, has hailed the “return of the master.” Keynes's self-appointed representative on Earth, New York Times columnist Paul Krugman, insists that the application of Keynesian theory, in the form of giant government deficits, has saved the world from a second Great Depression.

The marketplace of ideas has not been nearly so kind this year to the late Milton Friedman (1912-2006), the diminutive doyen of free-market economics. “Inflation,” wrote Friedman in a famous definition, “is always and everywhere a monetary phenomenon, in the sense that it cannot occur without a more rapid increase in the quantity of money than in output.” Well, since September of 2008, Ben Bernanke has been printing dollars like mad at the U.S. Federal Reserve, more than doubling the monetary base. And inflation? As I write, the headline consumer price inflation rate is negative 2 percent. Better throw away that old copy of Friedman's Monetary History of the United States, 1867-1960 (co-authored with Anna J. Schwartz, who is happily still with us).
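Friedman's dictum is shorthand for the quantity theory of money, and a minimal sketch of the arithmetic (an illustration using the standard textbook identity, not anything in Ferguson's text) shows why a doubled monetary base need not produce inflation:

\[
MV = PY \quad\Longrightarrow\quad \pi \approx g_M + g_V - g_Y
\]

% M is the money stock, V its velocity of circulation, P the price level,
% Y real output; the g's are growth rates and \pi is inflation.
% Inflation tracks money growth only if velocity is roughly stable. If the
% relevant M is broad money rather than the monetary base, and velocity
% collapses in a panic, prices can fall even as the base doubles.

That, as the discussion of Friedman and Bernanke below suggests, is roughly what happened after September 2008.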

Invest, instead, in a spanking new edition of The Great Transformation by Karl Polanyi (1886-1964). We surely need Polanyi's more anthropological approach to economics to explain the excesses of the boom and the hysteria of the bust. For what in classical economics could possibly account for the credulity of investors in Bernard Madoff's long-running Ponzi scheme? Or the folly of Richard Fuld, who gambled his personal fortune and reputation on the very slim chance that Lehman Brothers, unlike Bear Stearns and Merrill Lynch, could survive the crisis without being sold to a competitor?

The biggest intellectual losers of all, however, must be the pioneers of the theory of efficient markets—economists still with us, such as Harry M. Markowitz, the University of Chicago-trained economist who developed the theory of portfolio diversification as the best protection against economic volatility, and William Sharpe, inventor of the capital asset pricing model. In two marvelously lucid books, the late Peter Bernstein extolled their “capital ideas.” Now, with so many quantitative hedge funds on the scrap heap, their ideas don't seem quite so capital.

And the biggest winners, among economists at least? Step forward the “Austrians”—economists like Ludwig von Mises (1881-1973), who always saw credit-propelled asset bubbles as the biggest threat to the stability of capitalism. Not many American economists carried forward their work into the later 20th century, but one heterodox figure has emerged as a posthumous beneficiary of this crisis: Hyman Minsky (1919-1996). At a time when other University of Chicago-trained economists were forging the neoclassical synthesis—Adam Smith plus applied math—Minsky developed his own math-free “financial instability hypothesis.”

Yet it would surely be wrong to make the Top Dead Thinker of 2009 an economic theorist. The entire discipline of economics has flopped too embarrassingly for that to be appropriate. Instead, we should consider the claims of a historian, because history has served as a far better guide to the current crisis than any economic model. My nominee is the financial historian Charles Kindleberger (1910-2003), who drew on Minsky's work to popularize the idea of financial crisis as a five-stage process, from displacement and euphoric overtrading to full-fledged mania, followed by growing concern and ending up with panic. (If those five steps to financial hell sound familiar, they should. We just went down them, twice in the space of 10 years.)

Of course, history offers more than just the lesson that financial accidents will happen. One of the most important historical truths is that the first draft of history—the version that gets written on the spot by journalists and other contemporaries—is nearly always wrong. So though superficially this crisis seems like a defeat for Smith, Hayek, and Friedman, and a victory for Marx, Keynes, and Polanyi, that might well turn out to be wrong. Far from having been caused by unregulated free markets, this crisis may have been caused by distortions of the market from ill-advised government actions: explicit and implicit guarantees to supersize banks, inappropriate empowerment of rating agencies, disastrously loose monetary policy, bad regulation of big insurers, systematic encouragement of reckless mortgage lending—not to mention distortions of currency markets by central banks.

Consider this: The argument for avoiding mass bank failures was made by Friedman, not Keynes. It was Friedman who argued that the principal reason for the depth of the Depression was the Fed's failure to avoid an epidemic of bank failures. It has been Friedman, more than Keynes, who has been Bernanke’s inspiration over the past two years, as the Fed chairman has honored a pledge he made shortly before Friedman's death not to preside over another “great contraction.” Nor would Friedman have been in the least worried about inflation at a time like this. The Fed's balance sheet may have expanded rapidly, but broader measures of money are growing slowly and credit is contracting. Deflation, not inflation, remains the monetarist fear.

From a free market perspective, the vital thing is that legitimate emergency measures do not become established practices. For it cannot possibly be a healthy state of affairs for the core institutions of the Western financial system to be effectively guaranteed, if not actually owned, by the government. The thinker who most clearly discerned the problems associated with that kind of state intervention was Joseph Schumpeter (1883-1950), whose “creative destruction” has been one of this year's most commonly cited phrases.

“[T]his evolutionary…impulse that sets and keeps the capitalist engine in motion,” wrote Schumpeter in Capitalism, Socialism and Democracy, “comes from…the new forms of industrial organization that capitalist enterprise creates…This process of creative destruction is the essential fact about capitalism.” This crisis has certainly unleashed enough economic destruction in the world (though its creativity at this stage is still hard to discern). But in the world of the big banks, there has been far too little destruction, and about the only creative thing happening on Wall Street these days is the accounting.

“This economic system,” Schumpeter wrote in his earlier The Theory of Economic Development, “cannot do without the ultima ratio [final argument] of the complete destruction of those existences which are irretrievably associated with the hopelessly unadapted.” Indeed, he saw that the economy remained saddled with too many of “those firms that are unfit to live.” That could serve as a painfully accurate description of the Western financial system today.

Yet all those allusions to evolution and fitness to live serve as a reminder of the dead thinker we should all have spent at least part of 2009 venerating: Charles Darwin (1809-1882). This year was not only his bicentennial but the 150th birthday of his paradigm-shifting On the Origin of Species. Just reflect on these sentences from Darwin's seminal work:

“All organic beings are exposed to severe competition.”

“As more individuals are produced than can possibly survive, there must in every case be a struggle for existence.”

“Each organic being…has to struggle for life and to suffer great destruction.... The vigorous, the healthy, and the happy survive and multiply.”

Thanks in no small measure to the efforts of his modern heirs, notably Richard Dawkins, we are all Darwinians now—except in the strange parallel worlds of fundamentalist Christianity and state-guaranteed finance.

Neither Cassandra nor Pangloss, Darwin surely deserves to top any list of modern thinkers, dead or alive.
