Publications by Type: Newspaper Article

2011

The rugged Sanriku Coast of northeastern Japan is among the most beautiful places in the country. The white stone islands outside the port town of Miyako are magnificent. The Buddhist monk Reikyo could think of nothing but paradise when he first saw them in the 17th century. “It is the shore of the pure land,” he is said to have uttered in wonder, citing the common name for nirvana.

Reikyo’s name for the place stuck. Jodogahama, or Pure Land Beach, is the main gateway to the Rikuchu Kaigan National Park, a crenellated seashore of spectacular rock pillars, sheer cliffs, deep inlets and narrow river valleys that covers 100 miles of rural coastline. It is a region much like Down East Maine, full of small, tight-knit communities of hardworking people who earn their livelihoods from tourism and fishing. Sushi chefs around the country prize Sanriku abalone, cuttlefish and sea urchin.

Today that coast is at the center of one of the worst disasters in Japanese history. Despite the investment of billions of yen in disaster mitigation technology and the institution of robust building codes, entire villages have been swept out to sea. In some places little remains but piles of anonymous debris and concrete foundations.

I taught school in Miyako for more than two years in the 1990s, and it was while hiking in the mountains above one of those picturesque fishing villages that I came across my first material reminder of the intricate relationship between the area’s breathtaking geography, its people — generous and direct — and powerful seismic forces.

On a hot summer day, a group of middle-school boys set out to introduce me to their town, a hamlet just north of Pure Land Beach. While I started up the steep mountainside, the children bounced ahead of me, teasing me that I moved slowly for someone so tall. “Are you as tall as Michael Jordan, Miller-sensei?” yelled one boy as he shot past me up the trail.

“Not quite,” I told him, pausing on a spot of level ground to look out over the neat collection of tile roofs and gardens that filled the back of a narrow, high-walled bay.

“What is this?” I asked, pointing to a mossy stone marker that occupied the rest of the brief plateau. A chorus of young voices told me that it was the high-water mark for the area’s biggest tsunami: more than 50 feet above the valley floor.

“When was that?” I asked, but the boys couldn’t say. They had learned about it in school, they said, but like children everywhere they had little sense of time. Everything seemed like ancient history to them, but the thought of a wave reaching so high over the homes of my friends sent a chill down my spine, and I began to investigate the region’s history.

A major tsunami has hit the Sanriku Coast every few decades over the last century and a half. Waves swept the area in 1896, 1933 and 1960. The small monument was put there, high above the village, to mark the crest of the 1896 tsunami. The wave killed more than 20,000 people. The boys’ village, a place called Taro, was almost entirely destroyed. Seventy-five percent of the population died.

The force of those waves was amplified by the area’s distinctive geography. The same steep valley walls and deep inlets that make Sanriku so beautiful also make its villages and towns especially hazardous. The valleys channel a tsunami’s energy, pushing swells that are only a few feet high in the open ocean up to stunning heights. Fast-moving water topped 120 feet in one village in 1896.

In a landscape where earthquakes are a regular occurrence but major tsunamis happen irregularly, people naturally forget. The small monument — one of several commissioned for towns up and down the coast — was a mnemonic whose purpose was not commemoration but vigilance. “When there is an earthquake, watch for tsunami,” reads the rather practical poem engraved into one such slab.

Japan became a modern industrial state between the 1896 tsunami and the next major one, in 1933. The country’s radio and newspapers brought the story of rural fisher-folk swept out to sea to metropolitan audiences. Three thousand people died in the disaster and the humanitarian crisis elicited strong feelings of sympathy. The Sanriku region was portrayed as the nation’s heartland, a place where tradition remained intact, and the disaster threatened that preserve. Once again, Taro was particularly hard hit: all but eight of its homes were destroyed and nearly half of the village’s population of 1,800 souls went missing. The hamlet became an embodiment of agrarian loss.

It is paradoxical that the response to this threat to traditional ways was the application of cutting-edge engineering and technology. A huge concrete seawall was planned for Taro. Completed in 1958, that wall, 30 feet high at points, stretches over 1.5 miles across the base of the bay.

Faith in technology over nature appeared to be vindicated in 1960 with the great Chilean earthquake, a 9.5-magnitude quake that remains the largest ever recorded, which set off a Pacific-wide tsunami that killed 61 people in Hilo, Hawaii, before surging unannounced into the Sanriku Coast seven hours later. More than 120 Japanese died, but Taro remained largely unaffected, safe behind its sluice gates and concrete wall. Based in part on this success, a new program of coastal defense was initiated.

The Sanriku Coast is now one of the most engineered rural coastlines in the world. Its towns, villages and ports take shelter behind state-of-the-art seawalls and vast assemblages of concrete tetrapods designed to dissipate a wave’s energy. The region is home to one of the world’s best emergency broadcast systems and has been at the forefront of so-called “vertical evacuation” plans, building tall, quake-resistant structures in low-lying areas.

In 2003 Taro announced that it would become a “tsunami preparedness town.” Working with teams from the University of Tokyo and Iwate University, the town instituted a direct satellite link to accelerate the arrival of tsunami warnings. Public education was expanded and mayors from other towns visited to study this model village. Detailed maps showing projected maximum tsunami heights — using 1896 as a baseline — informed the selection of evacuation markers: a reassuring thick line defined the projected maximum reach of a tsunami. Evacuation sites were placed above that line on the maps. Similar calculations were made up and down the coast.

The lines were drawn in the wrong place. Despite the substantial infrastructure and technological investments in Sanriku, the wave on March 11 overwhelmed large portions of Taro and Miyako. Some of the evacuation points were not high enough. The walls were not tall enough. And the costs are still being tallied.

Thousands of people are missing along this beautiful, injured coast, hundreds in the town that I called home. I am still waiting to hear from one of the groomsmen from my wedding, the owner of Miyako’s best coffee shop and a sometime reader of this newspaper. Google’s people-finder app tells me he is alive, but I have no idea where he is or how our other friends fared. As for those rambunctious boys and all of my other students, I can only hope for the best.

Technology allowed me to learn my friend’s fate. It has also helped to inspire a worldwide humanitarian response. It may be, however, that a greater application of technology in the same direction is not the answer to the problems posed by the March 11 tsunami. As a historian, I am forced to recognize that there is nothing purely natural about this catastrophe. It is the result of a far longer negotiation between human culture and physical forces. Disasters have the counterintuitive tendency to reinforce the status quo. As the terrifying events at the Fukushima Daiichi nuclear plant continue to underline, there are very real costs to an uncritical application of technology.

I look forward to returning to my old Japanese home, but I also look forward to finding something new and different when I make that journey.


Ferguson, Niall. 2011. “Murder on the EU Express.” Newsweek.
Aldrich, Daniel P. 2011. “With a Mighty Hand.”

As the ongoing nuclear crisis at the Fukushima Daiichi plant in the town of Ohkuma continues, and plant engineers and first responders endanger their lives to keep fuel rods and containment units cool, it is critical to consider how Japan’s commitment to nuclear power arose in the first place. It was no twist of fate or invisible market-hand that created 55 nuclear reactors in a seismically active country smaller than the state of California. Japanese bureaucrats and politicians have made it a priority to create an indigenous source of power that provides an alternative to imported oil and coal.

Despite this understandable rationale, it’s still surprising that the only nation in the world against which atomic weaponry has been used—twice, no less—has created the world’s most advanced commercial nuclear program. It is especially surprising when you consider that countries like France, Germany, and the United States have given up their attempts at fast breeder reactor technology because of concerns about proliferation and hazard. What has driven Japan to pursue its advanced program of nuclear power, and why have nuclear power plants ended up in incredibly vulnerable positions along Japan’s coasts?

While the United States has provided some support for nuclear power (for example, through the Price-Anderson Act, which commits the federal government to absorbing some of the financial costs of potential nuclear accidents), it has never fully embraced nuclear energy as an alternative power source. This tradition of fence-sitting continues today, as seen in the Obama administration’s decision to end funding for the planned high-level radioactive waste repository at Yucca Mountain after two and a half decades of struggle over the site. Completion of the facility would have helped pro-nuclear groups convince skeptics that offsite waste disposal was possible. (The difficulty in securing permanent storage sites for nuclear waste is what leads U.S. and Japanese nuclear power plants to store their used fuel rods on site, where they are most vulnerable.)

In contrast, the Japanese government has gone far beyond this approach. The origins of its enthusiasm can be traced to the postwar period. In 1955, at the urging of then-Diet Member and later Prime Minister Yasuhiro Nakasone, the Japanese government granted more than 5 billion yen—about $14 million in 1955 currency—to the Japan-based Agency for Industrial Science and Technology to begin research under the aegis of the “atoms for peace” banner raised by Dwight Eisenhower. In recent decades, the Japanese central government has supported the regional power utilities—including Tokyo Electric Power Company (TEPCO), which runs Fukushima, and its counterparts—through research funding, risk amortization, and financial and logistical support. Unlike France, for example, which explicitly nationalized and then partly privatized its main nuclear-power-promoting utility company, Japan did not nationalize its energy industry, and some Japan experts have characterized the relationship between the state and utility firms as contentious. Yet both sides got what they wanted over the past 50 years: The government guided Japanese firms to produce nearly one-third of the nation’s power through nuclear plants, and the utilities obtained credible commitment against risk and financial backing for their expensive investments.

The Japanese people, as readers might imagine, have not been solid supporters of these government-initiated policies. Just as the government hoped to start its nuclear program in 1954, a highly publicized accident—in which crew members aboard the poorly named Lucky Dragon Number 5 were exposed to radioactive fallout from an American hydrogen bomb test—resulted in the death of a radioman from radiation exposure. This spurred the creation of one of the world’s first national anti-nuclear movements (known as Gensuikyō) and a petition against nuclear weapons that obtained more than 20 million signatures. Attempts to build a number of plants around the country since the mid-1950s have resulted in petitions, public outcries, demonstrations, sit-ins, marches, and the occupation of town councils by anti-nuclear activists. While the anti-nuclear movement has not demonstrated the kind of violence shown by the anti-Narita Airport protests in the 1960s—during which a number of police officers and anti-airport farmers and students were killed in the struggle over land expropriation—the success rate for the building of nuclear power plants has been roughly 50 percent. (That is, for every two attempts to construct a new plant, only one has gone forward.)

To minimize these fights over nuclear power in a society where people are deeply sympathetic to victims of atomic energy, the government has taken a two-pronged approach. First, it has worked tirelessly with the regional utilities to map out villages and towns that are the best locations for plants, according to the utilities’ needs. Bureaucrats within MITI (the Ministry of International Trade and Industry, which became METI, the Ministry of Economy, Trade, and Industry, as of 2001) provided funding for geographical and demographic surveys of potential grounds. Power companies have often targeted rural, depopulated coastal communities, where the population of local fishermen is declining. But, while legitimate criteria, such as distance from high-population areas, shock-resistant bedrock, and access to cooling water, have played a role in such plant site selections, the inability of the local population to coordinate anti-nuclear mobilization has often been the dominant factor.

Second, the government has created an extensive framework of policy instruments to manage and dampen anti-nuclear contestation. Where the Japanese authorities have been content to use standard, Weberian tools against anti-facility movements in other areas—such as dam construction and airport building—they have never resorted to land-expropriation in struggles over nuclear power plants, despite clear legal precedent for them to do so. Rather, the government has created a series of hard- and soft-control tools alongside deep incentives for communities willing to take on nuclear reactors. For example, students in Japanese middle schools may take science courses emphasizing the safety and necessity of nuclear power plants, with curricula written by government bureaucrats rather than teachers. Farmers and fishermen in these communities are regularly offered jobs at government-sponsored facilities to compensate for signing away sea rights in the surrounding fishing area. To further assuage the resistance that fishermen and farmers have shown in the past (because of concerns over “nuclear blight”—potential customers avoiding crops or fish because of fears of nuclear contamination), the government sponsors a yearly fair in Yokohama, in which only communities that host nuclear power plants can display and sell their goods. Finally, the government has created a monumental program called The Three Power Source Development Laws (Dengen Sanpō), which funnels roughly $20 million per year to acquiescent host communities. The money—which comes not from the politically vulnerable and annually vetted budget, but, instead, from an invisible tax on all electricity use across the nation—purchases roads, buildings, job re-training, medical facilities, and good will. In these far-flung rural communities that are, by and large, dying through depopulation and aging, these funds can provide vital support.

Japan’s choices—to sway public opinion through subsidies, social control tools, and manipulation—have left little room for public debate on the issue of nuclear power. The local residents—whom we see bearing the heaviest burden of the ongoing crisis in Fukushima and who have been exposed to radiation by past accidents at the Monju FBR, the fatal accident at Tokaimura, and elsewhere—are seen not as partners, but as targets for policy tools. A plan-rational approach, as Chalmers Johnson might have called it, has placed reactors in areas vulnerable to the threat of tsunami and pushed rural communities into dependence on the economic side payments that accompany these facilities. Now, as Japan struggles to avert catastrophe, it is time for a real discussion between civil society and state over the future of nuclear power.

Daniel P. Aldrich is a former Advanced Research Fellow (2006–2007) of the Program on U.S.-Japan Relations and a former Graduate Student Associate of the Weatherhead Center (2002–2005).

Hosni Mubarak’s promise this week to initiate constitutional reform in Egypt and then step down at the end of his presidential term in September did little to mollify the anger of the demonstrators protesting his rule. Many protesters seemed to agree with the assessment of the opposition leader Mohamed ElBaradei that it was “a trick” intended to buy time. With the regime-sponsored ugliness now engulfing Tahrir Square, demands for Mr. Mubarak’s immediate resignation have grown only more urgent, and the risk of a violent conclusion appears to have grown.

But there may still be a chance to effect the “orderly transition” that Secretary of State Hillary Rodham Clinton has called for. Paradoxically, it requires that Mr. Mubarak stay on, but only for a short time, to initiate the election of an entirely new Parliament that could then amend all the power out of the presidency or even abolish it.

This would no doubt disappoint those who want to put Mr. Mubarak on the next plane to Saudi Arabia, but there are two risks associated with his leaving so abruptly. The first is that the demonstrations might diminish or dissipate, leaving Mr. ElBaradei and his coalition trying to negotiate with the military or Vice President Omar Suleiman without the force of the crowds behind them.

The second risk stems from the Egyptian Constitution, which gives the power to dissolve Parliament and call new elections only to an elected president. Mr. Mubarak’s successor, as an acting president, would be specifically prohibited from getting the parliamentary elections under way. A new Parliament is crucial to democratic reform, because only Parliament has the power to defang the Egyptian presidency, stripping it of its dictatorial powers through constitutional amendment. The current Parliament — bought and paid for by Mr. Mubarak’s National Democratic Party — is not fit for that task.

Egypt’s next scheduled presidential election is only months away. If the Constitution isn’t amended before it is held, the notorious Article 76, which makes it difficult for independents like Mr. ElBaradei to get on the ballot, will still be in place. More important, the new president would have the same imperial powers Mr. Mubarak has had — the very powers that the Egyptian public wants taken away.

The constitutionally sanctioned timeline would be this: Mr. Mubarak dissolves Parliament, forcing a new election within 60 days (international observers would be required to make sure the election is fair). Once the new Parliament is seated, Mr. Mubarak resigns, and an acting president, probably the new Parliament’s speaker, takes charge until a new president is elected. The new Parliament would work around the clock to amend the Constitution in ways that would put Mr. Suleiman or any would-be strongman out of a job. The final step is a national referendum on the amendments.

For American policymakers, the most frightening possibility is that the Muslim Brotherhood would sweep the parliamentary elections and institute a constitution based on Islamic holy law. This is unlikely. The political momentum in Egypt is not with the Islamists. Moreover, the Brotherhood’s members have never sought to compete for a majority of seats in Parliament, and during the current protests have impressed people across the Egyptian political spectrum with their self-effacement. Brotherhood adherents know that a victory for them could be used by the military as an excuse to short-circuit the birth of democracy in Egypt.

A likelier outcome is that the Islamists would join a coalition slate of candidates, becoming part of an ideologically diverse Parliament. The greater danger now is that Mr. Mubarak would corrupt the electoral process by unleashing the same thugs who are now attacking the peaceful protesters of Tahrir Square.

One might wonder why, at this moment of change and tumult, anyone would talk about amending a constitution that everyone recognizes as a deformed confection of a corrupt regime. But by working with even a flawed constitution, the opposition would be helping to entrench and deepen a constitutionalist principle that has been steadily eroded. And with its built-in deadlines, the constitutional route also makes it harder for the military to draw out the transition and consolidate its hold.

For any of this to happen, Mr. Mubarak must remain briefly in office, and he must agree to the changes as an answer to his people’s legitimate cry for democracy. The demand that can make him comply must come from President Obama.

It has often been said in recent days that the United States can do nothing to affect the progress of democracy in Egypt, but the military’s dependence on American money and matériel suggests that this is untrue. The more the United States can make clear that continued military support depends on how the Egyptian Army conducts itself during this transition, the more likely the military is to play midwife to democracy.

Much could go wrong, but finding an orderly way to get not just Mr. Mubarak but also the armed forces out of political life should be a more important priority than ensuring that Islamists don’t hijack the revolution. All that is required of us is to remind ourselves that democracy in Egypt, or any other part of the world, is not something we should fear.

The seeds of Mary Lewis’ fascination with France were planted early. Her father spent a few years there as a young man, working in the offices of the Marshall Plan, so she grew up hearing a steady stream of stories about that country.

“I had never been out of North America,” said the newly tenured professor of history, “but when my father would talk about France’s history, it sparked an interest that is still with me.”

The geopolitically tense years of the Reagan administration were her political coming-of-age, and the native Californian went to college wanting to understand the Cold War, studying international relations at the University of California, Davis. She spent her junior year abroad in France, becoming increasingly interested in the diversity of its society.

The final seed that would eventually bear Lewis’ intellectual fruit was planted during a political science class she took upon her return from studying abroad. It was November 1989, the month the Berlin Wall fell.

“We were discussing the theory of mutually assured destruction,” she said. “A young man raised his hand and asked the professor, ‘Can we talk about Berlin?’ The professor was completely thrown. The real world was confronting his theoretical model, and he didn’t know what to do.”

Lewis remembers the professor dismissing the question by telling the student to read The New York Times. That, she said, was the moment she knew she wanted to study history.

“At that point, history had suddenly caught up to political science,” she said. “I realized you really needed history to understand politics.”

After graduating from the University of California, Davis, and before beginning a Ph.D. program in history at New York University, Lewis spent two years working for the U.S. Department of Education in its Office for Civil Rights, an experience she said greatly affected how she studies and thinks about history today.

“I learned a lot about bureaucracy and the layers of bureaucracy,” she said. “If I wrote a letter, it would go through six different levels of editing and end up with someone else’s signature on it.”

“I got a sense of how policy and decisions are layered. It helped me become the kind of historian that I am today.”

Lewis’ improbable interest in bureaucracy informed her first book, “The Boundaries of the Republic: Migrant Rights and the Limits of Universalism in France, 1918-1940” (Stanford University Press, 2007), recently translated into French as “Les Frontières de la République” (Éditions Agone, 2010). The book demonstrates how local actions — far removed from Parisian edicts — redefined the boundaries between French citizens and outsiders in the early decades of the 20th century. By focusing on the limits of legislation in a pluralistic society, the book challenges the common vision of France as a highly centralized nation.

“We tend to think of France as a centralized country with uniform rights decreed in Paris,” Lewis said. “But the actions of immigrants themselves in the provinces, by forcing officials to recognize that they were going to stay in the country, instigated an expansion of those rights. In a sense, today’s diverse French society is a product of that history.”

Today, Lewis’ studies are intersecting anew with current events: She is working on a book about Tunisia, using the case of the little-studied French protectorate there to examine how imperial rivalry affected French colonial governance from the 1880s to the 1930s. Pent-up public unrest in the North African country exploded and brought down its government last month.

“Having researched my forthcoming book there, I was surprised that the protests would lead so suddenly to a change in regime,” she said of Tunisia’s overthrow of its president. “It’s a police state.  People have conditioned themselves to be very guarded in conversation when speaking about politics because they know they’re being watched, so the fact that they would have the nerve to protest as they did is remarkable.”

Lewis is also planning a new research project on intercolonial movement by studying colonial passports.

“We think of these societies as being hermetically sealed, because we tend to study them from an imperialist point of view, but in fact people were on the move, and we can see challenges to imperial control based on these varied movements.”

One of Lewis’ favorite parts of working at Harvard is interacting with students.

“They make you think,” she said. “Even if you’ve taught a class before, you’ll get something new out of it because of the student participation. This is positive feedback on a whole other level.”

Ferguson, Niall. 2011. “Sale of the Century.”
In my favorite spaghetti western, The Good, the Bad and the Ugly, there is a memorable scene that sums up the world economy today. Blondie (Clint Eastwood) and Tuco (Eli Wallach) have finally found the cemetery where they know the gold is buried. Trouble is, they’re in a vast Civil War graveyard, and they don’t know where to find the loot. Eastwood looks at his gun, looks at Wallach, and utters the immortal line: “In this world, there are two kinds of people, my friend. Those with loaded guns … and those who dig.”

In the post-crisis economic order, there are likewise two kinds of economies. Those with vast accumulations of assets, including sovereign wealth funds (currently in excess of $4 trillion) and hard-currency reserves ($5.5 trillion for emerging markets alone), are the ones with loaded guns. The economies with huge public debts, by contrast, are the ones that have to dig. The question is, just how will they dig their way out?...

The U.S. needs to do exactly what it would if it were a severely indebted company: sell off assets to balance its books.

There are three different arguments against such asset sales. The first concerns national security. When Dubai Ports World bought the shipping company P&O in 2006—which would have given it control of facilities in a number of U.S. ports—the deal was killed in Congress in a fit of post-9/11 paranoia. The second argument is usually made by unions: private or foreign owners will be tougher on American workers than good old Uncle Sam. Finally, there’s the chauvinism that surfaced back in the 1980s when the Japanese were snapping up properties like Pebble Beach. How could the United States let its national treasures—the family silver—fall into the hands of inscrutable Asian rivals?

Such arguments were never very strong. Now, in the midst of the biggest crisis of American public finance since the Civil War, they simply collapse....

What is it about those robes? They are only flimsy bits of wool, enlivened in a few cases by some very European lace at the collar. Yet the moment our Supreme Court justices put them on, a segment of the concerned public imagines that they have become priests consecrated to the sacred order of the Constitution.

Recently, Justice Antonin Scalia has been criticized for meeting with a group of (gulp) conservative members of Congress and accused of participating in an event organized by the conservative billionaire Charles Koch. Justice Clarence Thomas has been excoriated because his wife, Virginia, last year took a leading role in organizing Liberty Central, a Tea Party offshoot that received anonymous, First Amendment-protected donations (she has since stepped down). He also belatedly amended 13 years’ worth of disclosure reports to include details of his wife’s employment.

Justices are required to disclose their income sources and those of their spouses. But the core of the criticisms against Justices Thomas and Scalia has nothing to do with judicial ethics. The attack is driven by the imagined ideal of the cloistered monk-justice, innocent of worldly vanities, free of political connections and guided only by the gem-like flame of inward conscience.

It was not ever thus. John Marshall, undoubtedly the greatest chief justice ever, spent his first month on the court as the secretary of state of the United States. That’s right, the chief justice and the secretary of state were the same person — an arrangement permitted by the Constitution, which only prohibits members of Congress from holding other offices. Marshall’s most famous decision — Marbury v. Madison, which established the principle of judicial review — arose from Marshall’s own failure as secretary of state to deliver the obscure William Marbury his commission as justice of the peace in the waning hours of the Adams administration. No one cared.

The political activities of the justices increased over time. Charles Evans Hughes, who would later become another great chief justice, resigned from his first stint as associate justice on June 10, 1916, to run for the presidency on the Republican ticket. Although this represented a separation from his judicial role, the Republican convention had begun at the Chicago Coliseum on June 7; Hughes did not resign until the nomination was in the bag.

In 1948, Americans for Democratic Action tried to draft Justice William O. Douglas as a Democratic presidential candidate. In their political literature, they used excerpts from his Supreme Court opinions, which (his colleagues noted privately) sounded suspiciously like stump speeches. (In the end, he decided against a run.)

Equally important, in the pre-monastic age, justices often took on politically charged government responsibilities when the world needed them. Their experiences in public service not only helped the country, but informed their subsequent jurisprudence.

Justice Robert Jackson, a valued player in Franklin Delano Roosevelt’s regular poker game (and a hero to many court observers today), took a year away from the court to serve as the chief prosecutor at Nuremberg, a presidential appointment. Later, when the Supreme Court had to decide whether German detainees convicted by United States war crimes tribunals were entitled to habeas corpus rights, Jackson did not recuse himself. Instead, he wrote the opinion in Johnson v. Eisentrager, the case that formed the precedent for the extension of habeas rights to the detainees at Guantánamo Bay.

Justice Owen Roberts was chosen by Roosevelt to head the commission investigating the attack on Pearl Harbor. What he learned made him one of only three justices to defy Roosevelt and dissent from the court’s shameful decision to uphold the wartime internment of more than 100,000 Japanese-Americans who had been convicted of no crime at all.

The 1970s saw the beginning of a retreat by the justices from public engagement with national affairs. Some of this was defensive. In 1969, Justice Abe Fortas, one of Lyndon Johnson’s closest advisers on Vietnam even while on the court, had to resign after revelations that he had been on retainer to a financier under investigation for securities violations. The next year, Gerald Ford, then the House minority leader, sought unsuccessfully to impeach Douglas for taking money from a nonprofit foundation.

Yet, probably the greater reason for the justices’ growing circumspection by the early 1970s was that the Supreme Court was taking its most active role ever in running the nation’s affairs: when the court ruled against Richard Nixon in the Watergate tapes case, it effectively forced a president from office. Empowered to break a president (making one had to wait until Bush v. Gore in 2000), the justices sought to deflect attention from the obvious fact that they were political.

The disengagement from public life that followed has had real costs. Isolated justices make isolated decisions. It is difficult to imagine justices who drank regularly with presidents deciding that a lawsuit against a sitting executive could go forward while he was in office, or imagining that the suit would not take up much of the president’s time. Yet that is precisely what the court did by a 9-to-0 vote in the 1997 case of Clinton v. Jones. The court’s mistaken practical judgment opened the door to President Bill Clinton’s testimony about Monica Lewinsky and the resulting impeachment that preoccupied the government for more than two years as Osama bin Laden laid his plans.

Today, even the justices’ minimal extrajudicial activities come in for public condemnation — some of it suspiciously partisan. Does anyone seriously think Justice Thomas would become more constitutionally conservative (if that were somehow logically possible) as a result of his wife’s political activism? It is true that Justice Thomas voted to protect the anonymity of some corporate contributions in the Citizens United case. But this vote reflected his long-established principles in favor of corporate speech. The personal connection was nowhere near close enough to demand recusal, any more than a justice who values her privacy should be expected to recuse herself from a Fourth Amendment decision.

After all, Martin Ginsburg, a model of ethical rectitude until his death last year, was for many years a partner in an important corporate law firm. But surely no one believes that his career made his wife, Justice Ruth Bader Ginsburg, more positively inclined toward corporate interests on the court than she would already be as a member in good standing of America’s class of legal elites.

Justice Antonin Scalia, for his part, naturally spends time with like-minded conservatives including Representative Michele Bachmann and Charles Koch. But when the brilliant, garrulous Justice Scalia hobnobs with fellow archconservatives, he is not being influenced any more than is the brilliant, garrulous Justice Stephen Breyer when he consorts with his numerous friends and former colleagues in the liberal bastion of Cambridge, Mass.

A few years ago, many insisted that Justice Scalia should not sit in judgment of Vice President Dick Cheney’s claims to enjoy executive privilege, noting that the two had been on the same duck-hunting trip. Justice Scalia memorably explained that the two men had never shared the same blind. He could as easily have pointed out that before President Harry Truman nationalized the steel mills, he asked Chief Justice Fred Vinson, a poker buddy and close friend, if the court would find the action constitutional. (Vinson incorrectly said yes.)

The upshot is that the justices’ few and meager contacts with the real world do little harm and perhaps occasionally some good. Justice Anthony Kennedy makes an annual trip to Salzburg, Austria, to discuss ideas with European and other global judges and intellectuals. This contact is often invoked to explain why Justice Kennedy occasionally cites foreign law (a taboo for Justice Scalia) and why his jurisprudence has been relatively liberal on such matters as gay rights and Guantánamo.

It is absurd for conservatives to criticize the cosmopolitan forums where judges from around the world compare notes. And it is absurd for liberals to criticize the conservative justices for associating with people who share or reinforce their views. The justices are human — and the more we let them be human, the better job they will do. Let the unthinkable be said! If the medieval vestments are making people think the justices should be monks, then maybe, just maybe, we should do away with those robes.


When Barack Obama became US president, one of his top foreign policy priorities was to improve relations with China. Yet on the eve of President Hu Jintao's state visit to Washington, US-China relations are worse, rather than better.

Administration officials feel their efforts to reach out to China have been rebuffed.

Ironically, in 2007, President Hu Jintao had told the 17th Congress of the Communist Party that China needed to invest more in its soft, or attractive, power.

From the point of view of a country that was making enormous strides in economic and military power, this was a smart strategy.

By accompanying the rise of its hard economic and military power with efforts to make itself more attractive, China aimed to reduce the fear and tendencies to balance Chinese power that might otherwise grow among its neighbours.

But China's performance has been just the opposite, and China has had a bad year and a half in foreign policy.

Rising nationalism

For years, China had followed the advice of Deng Xiaoping to keep a low profile.

However, with its successful economic recovery from the recession, China passed Japan as the world's second largest economy, and America's slow recovery led many Chinese to mistakenly conclude that the United States was in decline.

Given such beliefs, and with rising nationalism in China as it prepares for the transition of power to the fifth generation of leaders in 2012, many in China pressed for a more assertive foreign policy.

In 2009, China was justly proud of its success in managing to emerge from the world recession with a high 10% rate of economic growth.

But many Chinese believed that this represented a shift in the world balance of power, and that China should be less deferential to other countries, including the US.

Chinese scholars began writing about the decline of the US. One dated the year 2000 as the peak of American power.

"People are now looking down on the West, from leadership circles, to academia, to everyday folks," said Professor Kang Xiaoguang of Renmin University.

This Chinese view is seriously mistaken and China is unlikely to equal American economic, military or soft power for decades to come.

Nonetheless, this over-confidence in power assessment (combined with insecurity in domestic affairs) led to more assertive Chinese foreign policy behaviour in the last two years.

China miscalculated by deviating from the smart strategy of a rising power and violating the wisdom of Deng Xiaoping who advised that China should proceed cautiously and "skilfully keep a low profile".

But perceptions matter, even when they are wrong. China's new attitudes alienated the Obama administration.

China stage-managed President Obama's trip to Beijing in November 2009 in a heavy-handed way; it over-reacted to Obama's meeting with the Dalai Lama and to the administration's long-expected and relatively modest arms sales to Taiwan.

When asked why they reacted so strongly to things they had accepted in the past, some Chinese responded, "because we were weaker then".

Obama administration officials began to believe that efforts at co-operation or conciliation would be interpreted by the Chinese as proof that the US was in decline.

Alienation and irritation

China's new assertiveness affected its relations with other countries as well.

Its policies in the South China Sea created fear among the Asean nations; and its over-reaction to Japan's actions after a ship collision near the Senkaku Islands put an end to the Democratic Party of Japan's hopes for a closer relationship with China. Instead, the Kan administration reaffirmed the American alliance.

Beijing alienated South Korea by failing to criticise North Korea's shelling of a South Korean island; irritated India over border and passport issues; and embarrassed itself in Europe and elsewhere by over-reacting to the Nobel Peace Prize granted to the jailed dissident Liu Xiaobo.

How will these issues play out in the coming year?

It is likely that China's leaders will draw back somewhat from the overly assertive posture that has proven so costly.

President Hu Jintao's stated desire to co-operate on terrorism, non-proliferation and clean energy will help to lead to a reduction of tensions, but powerful domestic interest groups in the export industries and in the People's Liberation Army will limit economic or naval co-operation.

And most important, given the nationalism that one sees on the blogosphere in China, it will be difficult for Chinese top leaders to change their policies too dramatically.

Mr Hu's state visit will help improve matters, but the relationship will remain difficult as long as the Chinese suffer from hubris based on a mistaken belief in American decline.

“The statesman can only wait and listen until he hears the footsteps of God resounding through events; then he must jump up and grasp the hem of His coat, that is all.” Thus Otto von Bismarck, the great Prussian statesman who united Germany and thereby reshaped Europe’s balance of power nearly a century and a half ago.

Last week, for the second time in his presidency, Barack Obama heard those footsteps, jumped up to grasp a historic opportunity … and missed it completely.

In Bismarck’s case it was not so much God’s coattails he caught as the revolutionary wave of mid-19th-century German nationalism. And he did more than catch it; he managed to surf it in a direction of his own choosing. The wave Obama just missed—again—is the revolutionary wave of Middle Eastern democracy. It has surged through the region twice since he was elected: once in Iran in the summer of 2009, the second time right across North Africa, from Tunisia all the way down the Red Sea to Yemen. But the swell has been biggest in Egypt, the Middle East’s most populous country.

In each case, the president faced stark alternatives. He could try to catch the wave, Bismarck style, by lending his support to the youthful revolutionaries and trying to ride it in a direction advantageous to American interests. Or he could do nothing and let the forces of reaction prevail. In the case of Iran, he did nothing, and the thugs of the Islamic Republic ruthlessly crushed the demonstrations. This time around, in Egypt, it was worse. He did both—some days exhorting Egyptian President Hosni Mubarak to leave, other days drawing back and recommending an “orderly transition.”

The result has been a foreign-policy debacle. The president has alienated everybody: not only Mubarak’s cronies in the military, but also the youthful crowds in the streets of Cairo. Whoever ultimately wins, Obama loses. And the alienation doesn’t end there. America’s two closest friends in the region—Israel and Saudi Arabia—are both disgusted. The Saudis, who dread all manifestations of revolution, are appalled at Washington’s failure to resolutely prop up Mubarak. The Israelis, meanwhile, are dismayed by the administration’s apparent cluelessness.

Last week, while other commentators ran around Cairo’s Tahrir Square, hyperventilating about what they saw as an Arab 1989, I flew to Tel Aviv for the annual Herzliya security conference. The consensus among the assembled experts on the Middle East? A colossal failure of American foreign policy.

This failure was not the result of bad luck. It was the predictable consequence of the Obama administration’s lack of any kind of coherent grand strategy, a deficit about which more than a few veterans of U.S. foreign policy making have long worried. The president himself is not wholly to blame. Although cosmopolitan by both birth and upbringing, Obama was an unusually parochial politician prior to his election, judging by his scant public pronouncements on foreign-policy issues.

Yet no president can be expected to be omniscient. That is what advisers are for. The real responsibility for the current strategic vacuum lies not with Obama himself, but with the National Security Council, and in particular with the man who ran it until last October: retired Gen. James L. Jones. I suspected at the time of his appointment that General Jones was a poor choice. A big, bluff Marine, he once astonished me by recommending that Turkish troops might lend the United States support in Iraq. He seemed mildly surprised when I suggested the Iraqis might resent such a reminder of centuries of Ottoman Turkish rule.

The best national-security advisers have combined deep knowledge of international relations with an ability to play the Machiavellian Beltway game, which means competing for the president’s ear against the other would-be players in the policymaking process: not only the defense secretary but also the secretary of state and the head of the Central Intelligence Agency. No one has ever done this better than Henry Kissinger. But the crucial thing about Kissinger as national-security adviser was not the speed with which he learned the dark arts of interdepartmental turf warfare. It was the skill with which he, in partnership with Richard Nixon, forged a grand strategy for the United States at a time of alarming geopolitical instability.

The essence of that strategy was, first, to prioritize (for example, détente with the Soviets before human-rights issues within the U.S.S.R.) and then to exert pressure by deliberately linking key issues. In their hardest task—salvaging peace with honor in Indochina by preserving the independence of South Vietnam—Nixon and Kissinger ultimately could not succeed. But in the Middle East they were able to eject the Soviets from a position of influence and turn Egypt from a threat into a malleable ally. And their overtures to China exploited the divisions within the Communist bloc, helping to set Beijing on an epoch-making new course of economic openness.

The contrast between the foreign policy of the Nixon-Ford years and that of President Jimmy Carter is a stark reminder of how easily foreign policy can founder when there is a failure of strategic thinking. The Iranian Revolution of 1979, which took the Carter administration wholly by surprise, was a catastrophe far greater than the loss of South Vietnam.

Remind you of anything? “This is what happens when you get caught by surprise,” an anonymous American official told The New York Times last week. “We’ve had endless strategy sessions for the past two years on Mideast peace, on containing Iran. And how many of them factored in the possibility that Egypt moves from stability to turmoil? None.”

I can think of no more damning indictment of the administration’s strategic thinking than this: it never once considered a scenario in which Mubarak faced a popular revolt. Yet the very essence of rigorous strategic thinking is to devise such scenarios and to think through the best responses to them, preferably two or three moves ahead of actual or potential adversaries. It is only by doing these things—ranking priorities and gaming scenarios—that a coherent foreign policy can be made. The Israelis have been hard at work doing this. All the president and his NSC team seem to have done is to draft touchy-feely speeches like the one he delivered in Cairo early in his presidency.

These were his words back in June 2009:

America and Islam are not exclusive and need not be in competition. Instead, they overlap, and share common principles—principles of justice and progress; tolerance and the dignity of all human beings.

Those lines will come back to haunt Obama if, as cannot be ruled out, the ultimate beneficiary of his bungling in Egypt is the Muslim Brotherhood, which remains by far the best organized opposition force in the country—and wholly committed to the restoration of the caliphate and the strict application of Sharia. Would such an outcome advance “tolerance and the dignity of all human beings” in Egypt? Somehow, I don’t think so.

Grand strategy is all about the necessity of choice. Today, it means choosing between a daunting list of objectives: to resist the spread of radical Islam, to limit Iran’s ambition to become dominant in the Middle East, to contain the rise of China as an economic rival, to guard against a Russian “reconquista” of Eastern Europe—and so on. The defining characteristic of Obama’s foreign policy has been not just a failure to prioritize, but also a failure to recognize the need to do so. A succession of speeches saying, in essence, “I am not George W. Bush” is no substitute for a strategy.

Bismarck knew how to choose. He understood that riding the nationalist wave would enable Prussia to become the dominant force in Germany, but that thereafter the No. 1 objective must be to keep France and Russia from uniting against his new Reich. When asked for his opinion about colonizing Africa, Bismarck famously replied: “My map of Africa lies in Europe. Here lies Russia and here lies France, and we are in the middle. That is my map of Africa.”

Tragically, no one knows where Barack Obama’s map of the Middle East is. At best, it is in the heartland states of America, where the fate of his presidency will be decided next year, just as Jimmy Carter’s was back in 1980.

At worst, he has no map at all.


In his new book The Future of Power, Joseph S. Nye Jr. analyses the changing nature of power in the 21st century as upheavals man-made and environmental alter the global terrain and as both state and non-state entities jostle for dominance. Nye is a proponent of “smart power,” a term he coined in 2004 to describe the strategic combination of coercion and persuasion.

Nye, a former assistant secretary of defense, is a professor and former dean of the Harvard Kennedy School of Government. His other books include Soft Power: The Means to Success in World Politics and The Powers to Lead. He spoke from his home in Boston.

Q. Has the term “smart power” been corrupted over time?

A. The term has been picked up by the Obama administration and used by Hillary Clinton to describe US foreign policy. But it is the older term “soft power” that is more often corrupted when it’s mistakenly used to describe anything that is not military power. More correctly, it refers to the ability to get what you want through attraction and persuasion. The Chinese president, for example, declared in 2007 that China needed to increase its soft power, and China has invested billions of dollars to that end.

Q. Are you saying that smart power is soft power backed up by hard power?

A. I think of smart power as the ability to combine hard and soft power. There may be situations in which you don’t want any hard power, and there may be others where soft power is not effective: stopping North Korea’s nuclear weapons program, for example.

Q. Does it require an underlying belief in American dominance?

A. Soft power and smart power are both available to any size country, not just the US or China. But the US, when it lives up to its values, probably has more soft power than a small country and certainly has more hard power. Our leadership resides in our ability to create the right combinations in the right circumstances.

Q. You use the term “values.” But don’t you characterize smart power as morally neutral?

A. Well, smart power is neutral in the sense that it can be used by bad states as well as good states. But it does depend in part on values which are more often the sources of soft power. Ironically, Osama bin Laden had soft power when he inspired people to fly into the World Trade Center and the Pentagon. They did so because they believed in bin Laden’s values. In that sense values matter. They can, however, be used as instruments by bad as well as good people.

Q. Do you have a moral position that you edited out of this book?

A. I try to write as an analyst when I argue that there is something to be said for soft power as a more ethical means. For example, even if I have bad ends and want to steal your money, I can use hard power — shoot you and take your money — or soft power — persuade you that I’m a guru and that you should give me your money. In the first case you don’t have anything to say about it, in the second case you do. If one believes in the value of individual autonomy and choice, as I do, soft power allows more of that individual autonomy even if the overall action is a bad one.

Q. Your strategy has been called the friendly face of American imperialism. How do you respond to that?

A. That criticism is often made by people who don’t understand the theory. Other countries besides the US can use soft power; therefore, it’s not an apology for the US or an instrument of American imperialism. In this book I try to describe the role of military, economic and soft power in an information age and to persuade people that we need to think in a more sophisticated way about what power means, whether it be American, Chinese, or otherwise.

Q. With all that you’ve seen, do you find it hard to write a phrase such as “winning hearts and minds” without irony?

A. There is a risk of trivializing ideas. “Winning hearts and minds” has been around since the Vietnam War. On the other hand, when one tries to understand General Petraeus’s counterinsurgency doctrine (and whether he’ll succeed or not, we don’t know), it is interesting to note that what he is trying to do is save civilian lives. The idea is not to kill as many people as possible but to win the minds of those who form the sea in which the insurgents swim. The insight is an important one and has long standing in history.

Q. Do you see the uprising in Tunisia and now in Egypt as a test case of US commitment to smart power?

A. Smart power in this case will require US foreign policy to align with the aspirations of people seeking democracy while at the same time not creating chaos in the region, which would undercut our support for Israel and our efforts to prevent the spread of nuclear weapons. Smart power would aim to accomplish both a human rights and democracy agenda and a more traditional agenda.

Q. Do you see that happening?

A. I’m always hopeful.


2010

The twenty-first century began with a very unequal distribution of power resources. With five percent of the world's population, the United States accounted for about a quarter of the world's economic output, was responsible for nearly half of global military expenditures, and had the most extensive cultural and educational soft-power resources. All this is still true, but the future of U.S. power is hotly debated. Many observers have interpreted the 2008 global financial crisis as the beginning of American decline. The National Intelligence Council, for example, has projected that in 2025, "the U.S. will remain the preeminent power, but that American dominance will be much diminished."

Power is the ability to attain the outcomes one wants, and the resources that produce it vary in different contexts. Spain in the sixteenth century took advantage of its control of colonies and gold bullion, the Netherlands in the seventeenth century profited from trade and finance, France in the eighteenth century benefited from its large population and armies, and the United Kingdom in the nineteenth century derived power from its primacy in the Industrial Revolution and its navy. This century is marked by a burgeoning revolution in information technology and globalization, and to understand this revolution, certain pitfalls need to be avoided.

First, one must beware of misleading metaphors of organic decline. Nations are not like humans, with predictable life spans. Rome remained dominant for more than three centuries after the peak of its power, and even then it did not succumb to the rise of another state. For all the fashionable predictions of China, India, or Brazil surpassing the United States in the next decades, the greater threat may come from modern barbarians and nonstate actors. In an information-based world, power diffusion may pose a bigger danger than power transition. Conventional wisdom holds that the state with the largest army prevails, but in the information age, the state (or the nonstate actor) with the best story may sometimes win.

For the complete article go to the Foreign Affairs website.
Lamont, Michèle, and Bruno Cousin. 2010. “The Multiple Crises of French Universities.”

Allison, Graham T., Jr. 2010. “A Failure to Imagine the Worst.” Foreign Policy.

The violence tearing apart Jamaica, a democratic state, raises serious questions not only about its government’s capacity to provide basic security but, more broadly and disturbingly, about the link between violence and democracy itself.

The specific causes of the turmoil are well known. For decades political leaders have used armed local gangs to mobilize voters in their constituencies; the gangs are rewarded with the spoils of power, in particular housing and employment contracts they can dole out. Opposition leaders counter with their own gangs, resulting in chronic violence during election seasons.

These gangs eventually moved into international drug trafficking, with their leaders, called “dons,” becoming ever more powerful. The tables turned quite some time ago, with the politicians becoming dependent on the dons for their survival.

A case in point is the reliance of Prime Minister Bruce Golding on one notorious don, Christopher Coke, whose refusal to surrender for extradition to the United States to stand trial on gun and drug charges led last week to virtual warfare on the streets of the capital, Kingston, and the deaths of scores of civilians.

Endemic political corruption is hardly Jamaica’s only problem. Add to it paltry rates of economic growth, widespread poverty and income inequality, vast urban slums and a police force considered brutal and despised by the poor, and it is little surprise that the island nation’s homicide rate is always among the handful of the world’s highest.

Yet Jamaica, to its credit, has by global standards achieved a robust democracy. However great the violence during elections, voting is fair and governments change at the national level regularly and fairly smoothly. The judiciary, if overburdened, is nonetheless independent and relatively uncorrupt. There is a vigorous free press, and a lively civil society. Freedom House has continuously categorized the island as a “free” country.

For most observers of democracy, Jamaica’s violence seems an anomaly. Democracy is held to be inherently prone to good order and peace. According to this “democratic peace” doctrine, democracies do not go to war with each other, and in domestic life they provide nonviolent means of settling differences. Violence, writes the political theorist John Keane, is anathema to democracy’s “spirit and substance.”

It may or may not be true that democracies do not wage war with each other, but a growing number of analysts have concluded that, domestically, democracies are in fact more prone to violence than authoritarian states, measured by incidence of civil wars, communal conflict and homicide.

There are many obvious examples of this: India has far more street crime than China; the countries of the former Soviet Union are more violent now than they were under Communism; the streets of South Africa became more dangerous after apartheid was dismantled; Brazil was safer before 1985 under its military rule.

Three good explanations are offered for this connection between democracy and violent crime. First, it has been persuasively shown by social scientists like David Rapoport of the University of California at Los Angeles and Leonard Weinberg of the University of Nevada at Reno that the electoral process itself tends, on balance, to promote violence more than peace.

Sometimes the ballot can substitute for the bullets of civil wars, as in Nicaragua in 1990 when the Sandinista government was voted out peacefully. However, the opposite is more often the case, as in Greece in 1967, when electoral uncertainty led to a military coup, and Algeria in 1992, when elections were canceled in the face of a certain victory by a fundamentalist Islamic party, leading to civil war.

Another well-supported argument is that democracies are especially vulnerable to ethnic conflict and organized crime. In diverse democracies, the temptation of leaders to exploit ethnic identity for political ends is an all too frequent source of major conflict, sometimes culminating in oppression of minorities and even genocide. We saw this happen in Rwanda in 1994 and the former Yugoslav states in the 1990s. Dennis Austin, who has studied political strife in India and Sri Lanka, has concluded that in such societies “democracy is itself a spur to violence,” adding “depth to the sense of division.”

Organized crime, especially international trafficking in drugs, has become a serious threat to democracies worldwide. Felia Allum and Renate Siebert, the editors of an important scholarly collection, “Organized Crime and the Challenge to Democracy,” argue that “it is by exploiting the very freedoms which democratic systems offer that organized crime is able to thrive ... although mortifying democratic rights, these kinds of crimes need the democratic space to flourish.”

A third, more nuanced argument is suggested by the work of the Norwegian political scientist Håvard Hegre, who has shown that nondemocratic regimes become more prone to civil unrest, and more likely to threaten or start wars with neighboring countries, as they enter the transition period toward becoming democratic. The arc to democratic peace is therefore U-shaped. Authoritarian regimes can tyrannize their citizens into less violence. But as their states become more democratic, the mix of persisting authoritarian traditions and democratic freedoms can be lethal, sometimes resulting in complete state collapse, as in Yugoslavia.

It is only when such countries get very close to democratic maturity that social violence rapidly declines. At least that is the conclusion that my Harvard colleague Ethan Fosse and I came to after examining the relationship between homicide rates and Freedom House’s democracy rankings.
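In statistical terms, the pattern described above is a claim that violence rises and then falls as countries move along the democracy scale. As a purely illustrative sketch of how such an inverted-U relationship can be tested (the numbers below are invented and the variable names are placeholders, not the Freedom House rankings or homicide data the authors actually examined), one could fit a simple quadratic:

import numpy as np

# Illustrative only: invented democracy scores (1 = most authoritarian,
# 10 = most democratic) and homicide rates per 100,000 population.
democracy_score = np.array([1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
homicide_rate = np.array([4.0, 7.0, 12.0, 18.0, 22.0, 24.0, 21.0, 15.0, 8.0, 3.0])

# Fit homicide_rate = a*score**2 + b*score + c. A negative coefficient on
# the squared term indicates an inverted U: violence peaks mid-transition
# and declines as countries approach democratic maturity.
a, b, c = np.polyfit(democracy_score, homicide_rate, deg=2)
peak = -b / (2 * a)  # democracy score at which predicted violence peaks

print(f"quadratic term: {a:.2f} (negative implies an inverted U)")
print(f"predicted violence peaks near democracy score {peak:.1f}")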

Yet even in these countries on the cusp of democracy there is a complicating factor — they are usually also going through the transition from a poor economy to a more developed one. The expectations of citizens in these transitional economies often outrun the capacity of society to meet them; people get frustrated and feel unfairly treated, leading to high risks of violence.

The worst possible situation for a state, however, is for its economic transition to stall or fail before the transition to mature democracy is complete. And this is what Jamaica now faces. For the first dozen years after independence from Britain in 1962, progress toward democracy and self-sustained economic growth moved nicely in tandem. But then the oil crisis and recession of 1973, and the efforts by the democratic socialist government of Prime Minister Michael Manley to deal with hard times, knocked the wind from the sails of economic progress, and Jamaica has never really recovered. (Disclosure: I was an adviser to Prime Minister Manley at that time.)

To see what happens when a country accomplishes both transitions, we need only look at the neighboring Afro-Caribbean island of Barbados. It has a similar colonial past, and became independent just three years after Jamaica. Yet Barbados’ per capita income is now more than twice that of Jamaica, its standard of living puts it among the developed world and Freedom House places it on a par with Western Europe in terms of the maturity of its democracy. Sure enough, Barbados also has one of the lowest homicide rates in the hemisphere.

Barbados, unfortunately, is not typical. Jamaica, though an extreme case, is more in line with other democracies of the hemisphere, including Colombia, Venezuela, Trinidad and Tobago, Mexico and even Brazil, where the treacherous joint transition to democratic maturity and economic security has been accompanied by horrendous levels of crime.

As the American government decides how to respond to the crisis in Jamaica, a product of the (proper) insistence on the extradition of Christopher Coke, it would do well to view developments there in these broader terms. The problem of Jamaica might not seem so insoluble if Americans were to have a more sympathetic understanding of the transitional plight of this little country, brought on in large part by its unwavering struggle toward a more mature and equitable democracy.

Orlando Patterson is a professor of sociology at Harvard.

Seen from Tokyo, America’s relationship with Japan faces a crisis. The immediate problem is deadlock over a plan to move an American military base on the island of Okinawa. It sounds simple, but this is an issue with a long back story that could create a serious rift with one of our most crucial allies.

When I was in the Pentagon more than a decade ago, we began planning to reduce the burden that our presence places on Okinawa, which houses more than half of the 47,000 American troops in Japan. The Marine Corps Air Station Futenma was a particular problem because of its proximity to a crowded city, Ginowan. After years of negotiation, the Japanese and American governments agreed in 2006 to move the base to a less populated part of Okinawa and to move 8,000 Marines from Okinawa to Guam by 2014.

The plan was thrown into jeopardy last summer when the Japanese voted out the Liberal Democratic Party that had governed the country for nearly half a century in favor of the Democratic Party of Japan. The new prime minister, Yukio Hatoyama, leads a government that is inexperienced, divided and still in the thrall of campaign promises to move the base off the island or out of Japan completely.

The Pentagon is properly annoyed that Mr. Hatoyama is trying to go back on an agreement that took more than a decade to work out and that has major implications for the Marine Corps’ budget and force realignment. Secretary of Defense Robert Gates expressed displeasure during a trip to Japan in October, calling any reassessment of the plan “counterproductive.” When he visited Tokyo in November, President Obama agreed to a high-level working group to consider the Futenma question. But since then, Mr. Hatoyama has said he will delay a final decision on relocation until at least May.

Not surprisingly, some in Washington want to play hardball with the new Japanese government. But that would be unwise, for Mr. Hatoyama is caught in a vise, with the Americans squeezing from one side and a small left-wing party (upon which his majority in the upper house of the legislature depends) threatening to quit the coalition if he makes any significant concessions to the Americans. Further complicating matters, the future of Futenma is deeply contentious for Okinawans.

Even if Mr. Hatoyama eventually gives in on the base plan, we need a more patient and strategic approach to Japan. We are allowing a second-order issue to threaten our long-term strategy for East Asia. Futenma, it is worth noting, is not the only matter that the new government has raised. It also speaks of wanting a more equal alliance and better relations with China, and of creating an East Asian community—though it is far from clear what any of this means.

When I helped to develop the Pentagon’s East Asian Strategy Report in 1995, we started with the reality that there were three major powers in the region—the United States, Japan and China—and that maintaining our alliance with Japan would shape the environment into which China was emerging. We wanted to integrate China into the international system by, say, inviting it to join the World Trade Organization, but we needed to hedge against the danger that a future and stronger China might turn aggressive.

After a year and a half of extensive negotiations, the United States and Japan agreed that our alliance, rather than representing a cold war relic, was the basis for stability and prosperity in the region. President Bill Clinton and Prime Minister Ryutaro Hashimoto affirmed that in their 1996 Tokyo declaration. This strategy of “integrate, but hedge” continued to guide American foreign policy through the years of the Bush administration.

This year is the 50th anniversary of the United States–Japan security treaty. The two countries will miss a major opportunity if they let the base controversy lead to bitter feelings or the further reduction of American forces in Japan. The best guarantee of security in a region where China remains a long-term challenge and a nuclear North Korea poses a clear threat remains the presence of American troops, which Japan helps to maintain with generous host nation support.

Sometimes Japanese officials quietly welcome “gaiatsu,” or foreign pressure, to help resolve their own bureaucratic deadlocks. But that is not the case here: if the United States undercuts the new Japanese government and creates resentment among the Japanese public, then a victory on Futenma could prove Pyrrhic.

 

Joseph S. Nye Jr., a professor of government at Harvard and the author of The Powers to Lead, was an assistant secretary of defense from 1994 to 1995.
2009

“Growing up, I don’t know if I ever thought of becoming a teacher,” said Erez Manela, recently tenured professor of history in the Faculty of Arts and Sciences. “I was always supposed to become a doctor or a lawyer.”

Manela actually began by studying foreign languages as an undergraduate at the Hebrew University of Jerusalem, not far from his hometown of Haifa, Israel. He soon discovered that courses in East Asian and Middle Eastern history complemented his interests in Chinese, Arabic, and Persian. Though he saw a future in academia, history was not a field he had expected to pursue.

“I didn’t yet conceive of it as something you could do as a profession, but rather something you might study to know more about the world,” he said.

Previously the Dunwalke Associate Professor of American History, Manela now specializes in modern international history and the history of the United States in the world. His first book, “The Wilsonian Moment: Self-Determination and the International Origins of Anticolonial Nationalism” (Oxford, 2007), explored the impact of President Woodrow Wilson’s rhetoric on nationalist movements in Asia and the Middle East in the wake of World War I. Manela’s current research revolves around the global campaign to eradicate smallpox in the 1960s and ‘70s.

Manela considers his undergraduate experience the source of some of his main intellectual questions.

In college, “I realized that in the modern period—during the 19th and early 20th centuries—there were really fascinating parallels between the history of the Ottoman Empire and the history of East Asia, particularly China,” Manela said. “That really intrigued me…I kept wondering how I could study these parallels in a way that wasn’t simply comparative. I wanted to put everything into one big framework and tell the story as connected.”

Manela decided to concentrate on international history as a graduate student at Yale, partly because he was reluctant to give up studying any of the countries or languages he had embraced in college.

“I couldn’t bear the thought of focusing on just one of them,” he said. “I wanted to put it all to good use.”

Studying international history has allowed Manela to break free of the nation as an analytical framework and devote new attention to transnational actors, organizations, and themes. When Manela arrived at Harvard in 2003, history professors Akira Iriye (now emeritus) and the late Ernest May served as inspirational figures for him, as they too were concerned with these pioneering directions in international history. (May even taught Manela the basics of PowerPoint by jotting a few commands on an index card during his first semester at Harvard.)

“Teaching with them…was a tremendously formative experience for me,” Manela said. “Together, they established an amazing tradition of international history in this department.”

A member of Harvard’s Weatherhead Center for International Affairs, Manela considers the interdisciplinary community of scholars and students his “second home” within the University. He served as the Weatherhead’s director of undergraduate student programs for four years, and is now director of graduate student programs.

Manela spends most of his free time with his three daughters, but sometimes revisits an old interest: chess. Occasionally he challenges the regulars in Harvard Square.

“Once upon a time I used to play chess fairly well,” he said, “but that’s history.”

Manela is grateful to have the chance to study a subject that many people can pursue only as a hobby, even though he never did live up to expectations of becoming a doctor or a lawyer.

“I think this is a decent alternative,” he said with a grin.

Ferguson, Niall. 2009. “Dead Men Walking.”

There is nothing like a really big economic crisis to separate the Cassandras from the Panglosses, the horsemen of the apocalypse from the Kool-Aid-swigging optimists. No, the last year has shown that all is not for the best in the best of all possible worlds. On the contrary, we might be doomed.

At such times, we do well to remember that most of today’s public intellectuals are mere dwarves, standing on the shoulders of giants. So, if they had e-mail in the hereafter, which of the great thinkers of the past would be entitled to send us a message with the subject line: “I told you so”? And which would prefer to remain offline?

It has, for example, been a bad year for Adam Smith (1723-1790) and his “invisible hand,” which was supposed to steer the global economy onward and upward to new heights of opulence through the action of individual choice in unfettered markets. By contrast, it has been a good year for Karl Marx (1818-1883), who always maintained that the internal contradictions of capitalism, and particularly its tendency to increase the inequality of the distribution of wealth, would lead to crisis and finally collapse. A special mention is also due to early 20th-century Marxist theorist Rudolf Hilferding (1877-1941), whose Das Finanzkapital foresaw the rise of giant “too big to fail” financial institutions.

Joining Smith in embarrassed silence, you might think, is Friedrich von Hayek (1899-1992), who warned back in 1944 that the welfare state would lead the West down the “road to serfdom.” With a government-mandated expansion of health insurance likely to be enacted in the United States, Hayek's libertarian fears appear to have receded, at least in the Democratic Party. It has been a bumper year, on the other hand, for Hayek's old enemy, John Maynard Keynes (1883-1946), whose 1936 work The General Theory of Employment, Interest and Money has become the new bible for finance ministers seeking to reduce unemployment by means of fiscal stimuli. His biographer, Robert Skidelsky, has hailed the “return of the master.” Keynes's self-appointed representative on Earth, New York Times columnist Paul Krugman, insists that the application of Keynesian theory, in the form of giant government deficits, has saved the world from a second Great Depression.

The marketplace of ideas has not been nearly so kind this year to the late Milton Friedman (1912-2006), the diminutive doyen of free-market economics. “Inflation,” wrote Friedman in a famous definition, “is always and everywhere a monetary phenomenon, in the sense that it cannot occur without a more rapid increase in the quantity of money than in output.” Well, since September of 2008, Ben Bernanke has been printing dollars like mad at the U.S. Federal Reserve, more than doubling the monetary base. And inflation? As I write, the headline consumer price inflation rate is negative 2 percent. Better throw away that old copy of Friedman's A Monetary History of the United States, 1867-1960 (co-authored with Anna J. Schwartz, who is happily still with us).
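Friedman's dictum rests on the quantity-theory identity, which is worth spelling out to see why a doubled monetary base need not produce inflation. A minimal statement in standard notation (the equation itself is not in Ferguson's text) is:

\[ MV = PY \quad\Rightarrow\quad \frac{\Delta M}{M} + \frac{\Delta V}{V} \approx \frac{\Delta P}{P} + \frac{\Delta Y}{Y}, \]

where M is the money stock, V its velocity of circulation, P the price level and Y real output. Inflation rises only if growth in the money that people actually spend outruns real output with velocity roughly stable; if velocity collapses, or if broad money barely grows even as the base doubles, prices can stagnate or fall, which is the point taken up a few paragraphs below.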

Invest, instead, in a spanking new edition of The Great Transformation by Karl Polanyi (1886-1964). We surely need Polanyi's more anthropological approach to economics to explain the excesses of the boom and the hysteria of the bust. For what in classical economics could possibly account for the credulity of investors in Bernard Madoff's long-running Ponzi scheme? Or the folly of Richard Fuld, who gambled his personal fortune and reputation on the very slim chance that Lehman Brothers, unlike Bear Stearns and Merrill Lynch, could survive the crisis without being sold to a competitor?

The biggest intellectual losers of all, however, must be the pioneers of the theory of efficient markets—economists still with us, such as Harry M. Markowitz, the University of Chicago-trained economist who developed the theory of portfolio diversification as the best protection against economic volatility, and William Sharpe, inventor of the capital asset pricing model. In two marvelously lucid books, the late Peter Bernstein extolled their “capital ideas.” Now, with so many quantitative hedge funds on the scrap heap, their ideas don't seem quite so capital.

And the biggest winners, among economists at least? Step forward the “Austrians”—economists like Ludwig von Mises (1881-1973), who always saw credit-propelled asset bubbles as the biggest threat to the stability of capitalism. Not many American economists carried forward their work into the later 20th century, but one heterodox figure has emerged as a posthumous beneficiary of this crisis: Hyman Minsky (1919-1996). At a time when other University of Chicago-trained economists were forging the neoclassical synthesis—Adam Smith plus applied math—Minsky developed his own math-free “financial instability hypothesis.”

Yet it would surely be wrong to make the Top Dead Thinker of 2009 an economic theorist. The entire discipline of economics has flopped too embarrassingly for that to be appropriate. Instead, we should consider the claims of a historian, because history has served as a far better guide to the current crisis than any economic model. My nominee is the financial historian Charles Kindleberger (1910-2003), who drew on Minsky's work to popularize the idea of financial crisis as a five-stage process, from displacement and euphoric overtrading to full-fledged mania, followed by growing concern and ending up with panic. (If those five steps to financial hell sound familiar, they should. We just went down them, twice in the space of 10 years.)

Of course, history offers more than just the lesson that financial accidents will happen. One of the most important historical truths is that the first draft of history—the version that gets written on the spot by journalists and other contemporaries—is nearly always wrong. So though superficially this crisis seems like a defeat for Smith, Hayek, and Friedman, and a victory for Marx, Keynes, and Polanyi, that might well turn out to be wrong. Far from having been caused by unregulated free markets, this crisis may have been caused by distortions of the market from ill-advised government actions: explicit and implicit guarantees to supersize banks, inappropriate empowerment of rating agencies, disastrously loose monetary policy, bad regulation of big insurers, systematic encouragement of reckless mortgage lending—not to mention distortions of currency markets by central bank intervention.

Consider this: The argument for avoiding mass bank failures was made by Friedman, not Keynes. It was Friedman who argued that the principal reason for the depth of the Depression was the Fed's failure to avoid an epidemic of bank failures. It has been Friedman, more than Keynes, who has been Bernanke’s inspiration over the past two years, as the Fed chairman has honored a pledge he made shortly before Friedman's death not to preside over another “great contraction.” Nor would Friedman have been in the least worried about inflation at a time like this. The Fed's balance sheet may have expanded rapidly, but broader measures of money are growing slowly and credit is contracting. Deflation, not inflation, remains the monetarist fear.

From a free market perspective, the vital thing is that legitimate emergency measures do not become established practices. For it cannot possibly be a healthy state of affairs for the core institutions of the Western financial system to be effectively guaranteed, if not actually owned, by the government. The thinker who most clearly discerned the problems associated with that kind of state intervention was Joseph Schumpeter (1883-1950), whose “creative destruction” has been one of this year's most commonly cited phrases.

“[T]his evolutionary…impulse that sets and keeps the capitalist engine in motion,” wrote Schumpeter in Capitalism, Socialism and Democracy, “comes from…the new forms of industrial organization that capitalist enterprise creates…This process of creative destruction is the essential fact about capitalism.” This crisis has certainly unleashed enough economic destruction in the world (though its creativity at this stage is still hard to discern). But in the world of the big banks, there has been far too little destruction, and about the only creative thing happening on Wall Street these days is the accounting.

“This economic system,” Schumpeter wrote in his earlier The Theory of Economic Development, “cannot do without the ultima ratio [final argument] of the complete destruction of those existences which are irretrievably associated with the hopelessly unadapted.” Indeed, he saw that the economy remained saddled with too many of “those firms that are unfit to live.” That could serve as a painfully accurate description of the Western financial system today.

Yet all those allusions to evolution and fitness to live serve as a reminder of the dead thinker we should all have spent at least part of 2009 venerating: Charles Darwin (1809-1882). This year was not only his bicentennial but the 150th birthday of his paradigm-shifting On the Origin of Species. Just reflect on these sentences from Darwin's seminal work:

“All organic beings are exposed to severe competition.”

“As more individuals are produced than can possibly survive, there must in every case be a struggle for existence.”

“Each organic being…has to struggle for life and to suffer great destruction.... The vigorous, the healthy, and the happy survive and multiply.”

Thanks in no small measure to the efforts of his modern heirs, notably Richard Dawkins, we are all Darwinians now—except in the strange parallel worlds of fundamentalist Christianity and state-guaranteed finance.

Neither Cassandra nor Pangloss, Darwin surely deserves to top any list of modern thinkers, dead or alive.

Lamont, Michèle, and Bruno Cousin. 2009. “The French Disconnection.”

Bruno Cousin and Michèle Lamont say academics at France's public universities need to rethink their strategy after this year's protests alienated the public and had little impact on the Government.


Between February and June 2009, French universities were the theatre of an exceptional protest movement against the latest flavour of governmental reform concerning academic careers. Protest sometimes seems to be a way of life in the French academy, and in France at large, but this time the situation is serious, with potentially huge consequences for the future of the sector. Indeed, the nation that gave birth to je pense, donc je suis is in a deep crisis on the intellectual front, and nowhere is this as obvious as in academic evaluation.

The protest movement did not take off in the grandes écoles (which train much of the French elite), or in professional and technical schools. Instead, it took off in the 80 comprehensive universités—the public institutions that are the backbone of the French educational system. Until two years ago, they were required to admit any high-school graduate on a first-come, first-served basis. A selection process was recently introduced, but even today most students are there because they could not gain entry elsewhere. Faculty work conditions are generally poor, as their institutions are chronically underfunded. Classes are large and programmes are understaffed. More than half of all students leave without any kind of diploma.

Public universities can be very different from each other and are research-intensive in varying degrees, but they carry out the bulk of French scientific research. Research is largely conducted in centres that are located within these institutions, and which often bring together overworked university teachers and full-time researchers who are attached to national institutes such as the Centre National de la Recherche Scientifique (CNRS). In a context where the output of these joint centres is not, or is only partially, covered by international ratings, French academics feel doubly underrated owing to the combination of low salaries and low ratings.

This feeling was exacerbated on 22 January when President Nicolas Sarkozy declared that the poor performance of French universities in international rankings was, above all, the consequence of the absence of continuous evaluation, which encourages sloth. Of course, he was displeased that the extensive set of higher education reforms undertaken by his Government during the preceding two years was met with opposition by large segments of the academic community.

Everyone agrees that the current system poses a great many problems, but there is no agreement on how to improve it and get beyond the current gridlock. It is la société bloquée all over again. To wit:

- While most academics believe that the system is far too centralised, a 2007 law establishing the progressive financial “autonomy” (and accountability) of universities has been met with criticism and resistance, because it is perceived to be part of a strategy of withdrawal on the part of the State that will result in fewer resources being available for higher education. A number of scholars also fear that the increased decision-making power conferred on university presidents is a threat to the autonomy of faculty members.

- While there is a need to design new, more universalistic procedures for evaluating performance and distributing resources, many academics are sceptical of the new institutions recently created to do this, namely the national agencies for the evaluation of universities and research units (Agence d'Evaluation de la Recherche et de l'Enseignement Superieur, or AERES) and research projects (Agence Nationale de la Recherche, or ANR). The former, in particular, has been criticised for its reliance on bibliometrics (publication and citation counts), even if the agency is now moving towards using less quantitative standards. Moreover, whereas the former mechanisms for distributing research funds depended on the decisions of elected peers (for instance, on the national committee of the CNRS), AERES appoints its panel members, and this is seen as a blow to researchers' autonomy. For this and other reasons, many academics have refused to serve on its evaluation panels.

- While academics often agree that the old CNRS needed further integration with the universities, many denounce its gradual downsizing and transformation from a comprehensive research institution to a simple funding and programming agency as the work of uninformed politicians and technocrats intent on dismantling what works best in French research. In 2004, a widespread national protest arose against this dismantling, with 74,000 scholars signing a petition against it. Critics also say that the ongoing reorganisation of the CNRS into disciplinary institutes will reinforce the separation between the sciences, reorient research towards more applied fields and work against the interdisciplinary collaborations that are crucial to innovation in many fields.

- While many agree on the need to improve teaching, moves to increase the number of teaching hours are among the most strongly contested reforms. French academics, who very rarely have sabbaticals, already perceive themselves as overworked in a system where time for research is increasingly scarce. These factors help to explain the resistance to expanded classroom hours and new administrative duties.

In the longest strike ever organised by the French scientific community, tens of thousands of lecturers and researchers began in early February to hold protests over a period of several weeks, demonstrating in the streets and (with the support of some students) blocking access to some university campuses. Many also participated in a national debate via print, online and broadcast media, and in general meetings. Some faculty members held teach-ins and action-oriented “alternative courses” for students. Several universities saw their final exams and summer holidays delayed and many foreign exchange students were called back by their home institutions. Despite this frontal assault, the Government did not back down: the much disparaged decree reorganising academic careers (with regard to recruitment, teaching loads, evaluations and promotions) and giving more prerogatives and autonomy to university presidents came into effect on 23 April.

This outcome will probably lead academics and their unions to rethink their strategies and repertoires of collective action. The traditional protest forms are losing legitimacy. As the dust settles, it is becoming clear that demonstrating has little traction in a context where the French public increasingly perceives academics as an elite bent on defending its privileges, even if it requires depriving students of their courses. Negotiation is also perceived as ineffectual, as many suspect that governmental consultations were conducted to buy time until the end of the academic year, when mobilisation would peter out. A third strategy—the radical option that would have prevented the scheduling of exams and the handing out of diplomas at the end of this spring—was ruled out even on the campuses most committed to the cause for fear of alienating the public even further.

As yet, however, no clear alternative has surfaced. We are now witnessing a cleavage between those who voice their opposition (in the main, scholars in the humanities) and the increasing number of academics (primarily scientists) who espouse a “wait-and-see” or a collaborative position as the only realistic path to improving the situation in their own universities. If the majority of academics appear to share the same diagnosis about what needs to be changed in the French system, they disagree on the solution (and on its scale—national or local). The root of the crisis lies not only in the Government's difficulties in generating consensus, but also in the academics' own scepticism, cynicism or fatalism about meritocracy, the absence of the administrative resources needed to support proper evaluation, the possibility of impartial evaluation, and the system's ability to recognise and reward merit.

Deep problems remain in the institutions charged with evaluating the work of academics. The interference of political power, and the (admittedly diminishing) influence of trade unions and corporatist associations have long been viewed as obstacles to a collegial system of academic evaluation. The legitimacy of the 70 disciplinary sections of the Conseil National des Universites (CNU)—charged with certifying individuals as eligible for faculty positions, and with directly granting some promotions—is under question. Some of its committee members are appointed by the Government and as such are suspected of being second-rate, of benefiting from governmental patronage, or of defending governmental interests. Others are chosen from electoral lists that include a disproportionate number of partisan members, who are often perceived to be there because of their political involvement rather than because of their scientific status.

The legitimacy of these committees is further called into question because they include only academics employed by French institutions and are often viewed as perpetuating a longstanding tradition of favouritism. To give only one particularly scandalous example: in June, panellists in the sociology section allocated to themselves half of the promotions that they were charged with assigning across the entire discipline of sociology. This led to the resignation of the rest of the commission and to multiple protests. Such an occurrence sent deep waves of distrust not only between academics, but also towards the civil servants charged with reforming a system that is increasingly viewed as flawed.

Peer review is also in crisis at the local level. While selecting young doctoral recipients to be maîtres de conférences (the entry level permanent position in the French academy, similar to the British lecturer), French universities on average fill 30 per cent of available posts with their own graduates, to the point where local clientelism is often decried as symbolising the corruption of the entire system. The typical (and only) job interview for such a post lasts 20 to 30 minutes—probably the European record for brevity and surely too short to determine whether an individual deserves what is essentially a lifelong appointment. Many view the selection process as little more than a means to legitimise the appointment of pre-selected candidates—although the extent to which this is genuinely the case varies across institutions.

What is to be done? Because both the CNU and the local selection committees have recently been reorganised or granted new responsibilities, it seems the right moment to think about how to improve the evaluation processes in very practical ways. As part of a new start, academics should aim to generate a system of true self-governance at each level, grounded in more explicit principles for peer review. This would put them in a position to defend academic autonomy against the much-feared and maligned governmental or managerial control. While this is certainly occurring in some disciplines and institutions, progress is far from being equally spread across the sector.

Obvious and costless regulatory measures could easily be implemented—for instance, discouraging universities from hiring their own PhD graduates (as AERES recently started to), or forbidding selection committees from promoting their own members. One could also look abroad for examples of “best practice”. The UK's Economic and Social Research Council has created colleges of trained academic evaluators who are charged with maintaining academic and ethical standards in peer review; although not all aspects of the British approach to academic reform should be emulated, this one is particularly worthy.

The Deutsche Forschungsgemeinschaft (German Research Foundation) uses teams of elected experts to evaluate proposals, and academic reputation weighs heavily in determining which names will be put on electoral lists and who will serve on evaluation panels. Canada's Social Sciences and Humanities Research Council recently asked an independent panel of international experts to evaluate its peer review process in order to improve impartiality and effectiveness.

In a recent book on peer review in the US, one of the present authors (Michèle Lamont) showed the ways in which American social scientists and humanists operate to maintain their faith in the idea that peer review works and that the academic system of evaluation is fair. In this case, academics exercise their right as the only legitimate evaluators of knowledge by providing detailed assessment of intellectual production in light of their extensive expertise in specialised topics. The exercise of peer evaluation sustains and expresses professional status and professional autonomy. But it requires significant time (and thus good working conditions) and moral commitment—time spent comparing dossiers, making principled decisions about when it is necessary to withdraw on the grounds of personal interest, and so forth. Of course no peer review works perfectly, but US academics, while being aware of its limitations, appear to view the system as relatively healthy and they engage in many actions that contribute to sustaining this faith.

In our view, fixing the current flaws in the French system does not merely demand organisational reforms, including giving academics more time to evaluate the research of colleagues and candidates properly. It may also require French academics to think long and hard about their own cynicism and fatalism concerning their ability to make judgments about quality that would not be driven by cronyism or particularism, and that would honour their own expertise and connoisseurship.

Not that proper governmental reform is not needed, but sometimes blaming the Government may be an easy way out. Above all, it is increasingly a very ineffectual way of tackling a substantial part of the problem. A little more collaborative thinking and a little less cynicism among both academics and administrators—if at all possible—may very well help French universities find a way out of the crisis. And it will help the French academic and research community to become, once again, much more than the sum of its parts.

Bruno Cousin is postdoctoral research scholar in sociology at Harvard University and Sciences Po Paris. Michèle Lamont is Robert I. Goldman professor of European studies and professor of sociology and African and African-American studies at Harvard University. She is the author of How Professors Think: Inside the Curious World of Academic Judgment (2009). She chaired the 2008 international panel of experts evaluating peer review practices at the Social Science and Humanities Research Council of Canada.
Putnam, Robert D., and Thomas H. Sander. 2009. “How Joblessness Hurts Us All.”

The unemployment rate has topped 10% for the first time in a quarter-century. More than one in six adults are unemployed or underemployed, the most since the Great Depression. By any measure this is troubling, but the long-term effects of unemployment are more devastating than most Americans grasp. Economists warn that high unemployment may persist for years.

Misery, it turns out, doesn't love company. Distressing new research shows that unemployment fosters social isolation not just for the unemployed but also for their still-employed neighbors. Moreover, the negative consequences last much longer than the unemployment itself. Policymakers have focused on short-term help for the jobless, but they must address these longer-term community effects, too.

Impact studied

Recent studies confirm the results of research during the Great Depression—unemployment badly frays a person's ties with his community, sometimes permanently. After careful analysis of 20 years of monthly surveys tracking Americans' social and political habits, our colleague Chaeyoon Lim of the University of Wisconsin has found that unemployed Americans are significantly less involved in their communities than their employed demographic twins. The jobless are less likely to vote, petition, march, write letters to editors, or even volunteer. They attend fewer meetings and serve less frequently as leaders in local organizations. Moreover, sociologist Cristobal Young's research finds that the unemployed spend most of their increased free time alone.

These negative social consequences outlast the unemployment itself. Tracking Wisconsin 1957 high school graduates, sociologists Jennie Brand and Sarah Burgard found that in contrast to comparable classmates who were never unemployed, graduates who lost jobs, even briefly and early in their careers, joined community groups less and volunteered considerably less over their entire lives. And economist Andrew Clark, psychologist Richard Lucas and others found that, unlike almost any other traumatic life event, joblessness results in permanently lower levels of life satisfaction, even if the jobless later find jobs.

Equally disturbing, high unemployment rates reduce the social and civic involvement even of those still employed. Lim has found that Americans with jobs who live in states with high unemployment are less civically engaged than workers elsewhere. In fact, most of the civic decay in hard-hit communities is likely due not to the jobless dropping out, but to their still-employed neighbors dropping out.

Moreover, beyond civic disengagement, places with higher joblessness have more pervasive violence and crimes against property. They have more fragile families with harsher parenting, and higher rates of mental disorder and psychological distress among both the unemployed and the employed. These social consequences are a powerful aftershock to communities already reeling economically.

What might explain the civic withdrawal during recessions? The jobless shun socializing, shamed that their work was deemed expendable. Economic depression breeds psychological depression. The unemployed may feel that their employer has broken an implicit social contract, deflating any impulse to help others. Where unemployment is high, those still hanging onto their jobs might work harder for fear of further layoffs, thus crowding out time for civic engagement. Above all, in afflicted communities, the contagion of psychic depression and social isolation spreads more rapidly than joblessness itself.

What to do?

The lasting social consequences of unemployment demand remedy. President Obama has extended individual unemployment benefits. These new findings call for special aid to communities with high and persistent unemployment. Government and business should ensure that unemployment is a last cost-cutting move, not the first.

Millions of employed Americans are silently thankful that the sword of Damocles has not yet fallen on them. Meanwhile, they and their leaders overlook the fact that unemployment is causing long-run social disintegration in their communities. Albert Camus was right: “Without work, all life goes rotten.”


Thomas H. Sander is executive director of the Saguaro Seminar: Civic Engagement in America, at Harvard Kennedy School. Robert D. Putnam is Peter & Isabel Malkin Professor of Public Policy at Harvard Kennedy School and co-author of the forthcoming American Grace: The Changing Role of Religion in America.
Patterson, Orlando. 2009. “A Job Too Big for One Man”.

In the year since his election, as he has since he first appeared on the national stage, Barack Obama has embodied the fundamental paradoxes of race in America: that we live in a still racially fragmented society; that we share a public culture with an outsized black presence, but that in the privacy of homes and neighborhoods we are more segregated than in the Jim Crow era; that we worship more fervently than any other advanced nation, in churches and synagogues that define our separate ethnic identities and differences, to gods proclaiming the unity of mankind. Why are we this strange way? Is President Obama the ultimate expression of our peculiarities? Has he made a difference? Can he? Will he?

We became this way because of the peculiar tragedies and triumphs of our past. Race and racism scar all advanced nations, but America is peculiar because slavery thrived internally and race became a defining feature of personal identity.

Slavery was quintessentially an institution of exclusion: the slave first and foremost was someone who did not belong to and had no claims on the public order, nor any legitimate private existence, since both were appropriated by the slaveholder. The Act of Emancipation abolished only the first part of slavery, the master’s ownership; far from removing the concept of the ex-slave as someone who did not belong, it reinforced it. The nightmare of the Jim Crow era then extended and reinforced the public slavery of black Americans right up through the middle of the 20th century.

At the same time, the status of blacks as permanent outsiders made whiteness a treasured personal attribute in a manner inconceivable to Europeans. Whiteness had no real meaning to pre-immigration Swedes or Irishmen because they were all white. But it became meaningful the moment they landed in America, where it was eagerly embraced as a free cultural resource in assimilating to the white republic. In America race had the same significance as gender and age as defining qualities of personhood.

The great achievement of the civil rights movement was to finally abolish the lingering public culture of slavery and to create the opportunities that fostered the black middle class and black political leadership. This was a sea change. But Mr. Obama, by virtue of his unusual background as a biracial child reared by loving, though not unprejudiced, white caregivers, is acutely aware that the crude, dominating racism of the past simply morphed into a subtler cultural racism of the private sphere—significantly altered though hardly less damaging.

Seeing blacks as culturally different—a perception legitimized by the nation’s celebration of diversity and identity—permits all kinds of complicated attitudes and misjudgments. Their differences can be celebrated on playing fields, dance floors and television, in theaters, hip-hop and cinema, and not least of all in that most public and ambivalently regarded arena of mass engagement: politics. But in the disciplined cultural spaces of marriages, homes, neighborhoods, schools and churches, these same differences become the source of Apollonian dread.

What then can we expect of Mr. Obama? One thing we can be sure of is that he will not be leading any national conversations on race, convinced as he must be that they exacerbate rather than illuminate. During the campaign last year he spoke eloquently on the subject, but only when he was forced to do so by the uproar over the Rev. Jeremiah Wright. And since he took office, his one foray into racial politics—his reaction to the arrest of Henry Louis Gates Jr.—was a near political disaster that must have reinforced his reluctance.

Mr. Obama’s writings, politics and personal relations suggest instead that he prefers a three-pronged strategy. First, he is committed to the universalist position that the best way to help the black and Latino poor is to help all disadvantaged people, Appalachian whites included. The outrage of black over-incarceration will be remedied by quietly reforming the justice system.

Second, Mr. Obama appears convinced that residential segregation lies at the heart of both black problems and cultural racism. He is a committed integrationist and seems to favor policies intended to move people out of the inner cities.

Third, he clearly considers education to be the major solution and has tried to lavishly finance our schools, despite the fiscal crisis. More broadly, he will quietly promote policies that celebrate the common culture of America, emphasizing the extraordinary role of blacks and other minorities in this continuing creation.

At the same time, Mr. Obama seems to believe that the problems of black Americans are in part attributable to certain behaviors among them—most notably absentee fathers, dropping out of school and violence—which not only constrain their choices but rationalize the disfiguring processes of white cultural racism that extend the pathologies of the few to all black Americans. As a deeply committed family man, Mr. Obama has already made clear that he will use the bully pulpit of the presidency to encourage internal cultural reformation.

All of these approaches are likely to alienate the identity-steeped segment of black leadership, and they will not prevent the extreme cultural right from accusing him of overplaying race, whatever he does.

The uniqueness of Mr. Obama provides both obstacles and opportunities. My students have found that many young inner city blacks, while they admire him, find him too remote from their lives to be a role model. His policies, if properly carried out, might very well improve their chances in life, but in the end he is more likely to influence the racial attitudes of middle-class blacks and younger white Americans. This is all we can reasonably expect. It will take far more than a single presidency to fully end America’s long struggle with race.

Orlando Patterson, a professor of sociology at Harvard, is the author of The Ordeal of Integration: Progress and Resentment in America’s "Racial" Crisis.
