This book examines a variety of comparative and historical experiences in which irregular armed forces (ranging from militias, paramilitaries, guerrillas, bandits, mercenaries, vigilantes, and police forces to armed veteran groups) have struggled against or on behalf of national states. The study hopes to raise questions about the new political relevance of these types of armed forces. It considers the conditions under which they are more significant than conventional military personnel in supplanting or undermining states, and their broader role in national political development.
Published just as the United States went to war in Iraq, After Jihad put Noah Feldman "into the center of an unruly brawl now raging in policy circles over what to do with the Arab world" (The New York Times Book Review).
A year later, the questions Feldman raises - and answers - are at the center of every serious discussion about America's role in the world. How can Islam and democracy be reconciled? How can the United States sponsor emerging Islamic democrats without appeasing radicals and terrorists? Can we responsibly remain allies with stable but repressive Arab regimes, chaotic emerging democracies, and Israel as well?
After Jihad made Feldman, in a stroke, the leading Western authority on emerging Islamic democracy - and the most prominent adviser to the Iraqis drafting a constitution for their newly freed nation. This paperback edition - which includes a new preface taking account of recent events - is the best single book on the nature of Islam today and on the forms Islam is likely to take in the coming years.
War and its aftermath serve as powerful motivators for the elaboration and transmission of individual, communal, and national histories. These histories both reflect and constitute human experience as they contour social memory and produce their truth effects. These histories use the past in a creative manner, combining and recombining elements of that past in service to interests in the present. In this sense, the conscious appropriation of history involves both memory and forgetting—both being dynamic processes permeated with intentionality.
In this essay I explore the political use of the narratives being elaborated in rural villages in the department of Ayacucho regarding the internal war that convulsed Peru for some fifteen years. I suggest that each narrative has a political intent and assumes both an internal and external audience. Indeed, the deployment of war narratives has much to do with forging new relations of power, ethnicity, and gender that are integral to the contemporary politics of the region. These new relations impact the construction of democratic practices and the model of citizenship being elaborated in the current context.
Sept. 29 issue—The Swedes reject the euro by the resounding margin of 56 percent to 42 percent.
Politicians and pundits proclaim the European Union has reached a crisis. Well, yes and no. The
Swedish vote was no isolated bout of Nordic crankiness. To the contrary, it was a sign of the deep
disillusionment sweeping Europe. Yes, Estonia just voted overwhelmingly to join the Union, and Latvia
probably will do the same this week. But note how, in recent referenda, a slew of existing members
have either voted down a variety of proposed community reforms or approved them by only slender
margins. In Sweden’s case, a broad coalition of female, working-class, poor and rural voters rose to
defend the welfare state against what they perceived as collusion between neo-liberal business and
government elites in Stockholm and distant and unaccountable technocrats in Brussels.
It's hard to interpret this as anything but a resounding sign of the times. Over the coming year the
leaders of at least five countries have promised to submit the landmark new European constitution, currently in
the final stages of negotiation, to national votes. Even a single no could consign the whole lofty enterprise to
failure. It will be a rough ride, underscoring the EU’s accelerating fragmentation. In Britain last week, Tory euroskeptics
watched gleefully as government officials admitted that adopting the euro is “completely off the radar
screen” until 2007. In Brussels, the president of the European Commission, Romano Prodi, warned that the EU
might split, relegating Sweden (along with fellow Eurozone holdouts Britain, Denmark and some East European
members-in-waiting) to a sort of second tier of financially less-influential countries. Acrimonious splits between
“old” and “new” Europeans over the Iraq war have undermined any semblance of unified foreign and security
policies. And when it comes to common economic policy, budget-busting fiscal “unilateralists”—France, Italy
and Germany—have all but killed the EU stability pact.
This is not the end (or even the beginning of the end) for European integration. But it is the end of the
cherished Euro-federalist dream (and Euro-skeptic nightmare) of a single European state, centralized in Brussels.
To be sure, every member accepts certain core commitments—the single market, the European Parliament and the
supremacy of European law in most economic matters. In a plethora of regulatory areas—from anti-monopoly
policy to cellular-phone design to food-safety standards—EU rules are the rules. Still, the EU has crossed the
Rubicon. Once a relatively small and homogeneous organization, cherishing “ever closer union” above all else, it
has become a large and far more diverse, flexible and pragmatic one. “Multispeed” arrangements, whereby
countries advance toward integration at their own pace, have become the norm. Eurocrats have even invented a
positive term for it: “enhanced cooperation.”
Many West Europeans from the old core of the European Community are still ambivalent about the
swathe of 10 eastern and southern European countries about to join their club. Their reservations, albeit to a lesser
degree, are shared by the new entrants. Yet for all the bumps encountered along the way, enlargement is an
admirable achievement. Over the next decade the EU will likely become a zone of enviable peace and democracy
from the Arctic Circle to the border of Iraq. The lesson of the Swedish referendum is that the success of the EU
cannot be measured by polls. Public opinion may swing back and forth, but the thousands of small decisions taken
in common cannot be reversed. The EU remains the most ambitious and the most successful example of peaceful
international cooperation in world history.
No issue better reveals the American tension between principle and pragmatism than the debate over affirmative action. This week the Supreme Court is expected to enter the debate with a widely anticipated ruling on the University of Michigan's admissions policies, which favor black and other minority applicants. More important than the decision the court reaches will be the reasoning it uses.
As pragmatic public policy, it is easy to show that the benefits of affirmative action far outweigh its social or individual costs. It ensures the integration of our best universities and thereby promotes (if indirectly) a heterogeneous professional elite. In conjunction with anti-discrimination laws, it has directly fostered the growth of an African-American and Latino middle class.
Corporate America has also embraced the policy, mostly by choice. As a result, minorities make up a large part of the middle and top ranks at many of the country's most recognizable firms. On Fortune magazine's latest list of the 50 best companies for minorities, for example, 24 percent of officials and managers are minorities. Affirmative action has transformed the American military, making it, at every level of its organization, the most ethnically varied of all the world's great forces. And, along with changing ethnic and racial attitudes, affirmative action has helped promote a powerful global popular culture, many areas of which are dominated by minorities.
Negative achievements—that is, what affirmative action has spared us—are hard to prove. But it is surely reasonable to attribute the relative infrequency of ethnic or racial riots in America to the presence of minority leadership in many of the nation's mainstream institutions.
All these gains have been achieved at very little cost to America's economic or political efficiency: our economy dominates the world; our army is history's most awesome; our great universities have few equals; our arts, science and scholarship are the envy of the world.
There are indeed costs at the individual level, borne by those whites who may not have gained places or jobs as a result of preferences for minorities. But nearly all research indicates that these costs are minuscule. Repeated surveys indicate that no more than 7 percent of Americans of European heritage claim to have been adversely affected by affirmative action programs, and it has been shown that affirmative action reduces the chances of whites getting into top colleges by only 1.5 percentage points.
For all its achievements, however, many critics fear that affirmative action violates fundamental principles that have guided this country. It is indeed difficult to reconcile affirmative action with the nation's manifest ideals of individualism and merit-based competition. But America's history is replete with just such pragmatic fudging of these ideals.
In foreign policy the United States has defended dictators, destabilized democracies and invaded other countries in the pragmatic promotion of the national interest. Domestically, Congress regularly passes laws that favor special interests—veterans, millionaire ranchers, farmers, oil-well owners, holders of patents about to expire, people with home mortgages—many with no economic justification, all costing billions of tax dollars.
Why, then, the obsession with the principle of colorblindness, especially among right-wing activists who otherwise exhibit little enthusiasm for the equality principle enshrined in the Declaration of Independence? It is hard to resist the conclusion that principles are invoked in public life to rationalize the control of the vulnerable. In relations among equals, meanwhile, pragmatism trumps virtue.
Yet these critics miss a more compelling, and more subtle, argument against affirmative action. In spite of its benefits, there are serious problems in the long run for its beneficiaries if affirmative action is not decisively modified.
First, while diversity is a goal that deserves to be pursued in its own right, it was a major strategic error for African-American leaders to have advocated it as the main justification for affirmative action. In doing so, they greatly expanded the number of groups entitled to preferences—including millions of immigrants whose claims on the nation pale in comparison to those who have been historically discriminated against. Such a development understandably alarmed many whites who were otherwise prepared to turn a pragmatic blind eye to their principled concerns about affirmative action.
Using diversity as a rationale for affirmative action also distorts the aims of affirmative action. The original, morally incontestable goal of the policy was the integration of African-Americans in all important areas of the public and private sectors from which they had been historically excluded. But if diversity is the goal, the purpose of affirmative action shifts from improving the condition of blacks to transforming America into a multicultural society. Thus the pursuit of inclusion is replaced by the celebration of separate identities.
In a more profound sense, the diversity rationale undermines a hopeful view of America. If the purpose of affirmative action is to redress past wrongs, then it requires both the minority and the majority to do the cultural work necessary to create what Martin Luther King Jr. called the "beloved community" of an integrated nation. Instead, many of its supporters see affirmative action as an entitlement, requiring little or no effort on the part of minorities.
Another consequence of this view is that it allows no recognition of the brute historical fact that the very patterns of social, educational and cultural adjustments that ensured survival, and even conferred nobility, under the extreme conditions of racist oppression no longer apply. In fact, now they may even be dysfunctional.
The gravest danger, however, and what perhaps alarms the majority most, is the tendency to view affirmative action as a permanent program for preferred minorities and, simultaneously, the refusal even to consider it a topic for public discourse. Indeed, among the black middle class, especially on the nation's campuses, blind support for affirmative action has become an essential signal of ethnic solidarity and commitment.
The nation needs this policy, but it must be modified. For starters, it should exclude all immigrants and be confined to African-Americans, Native Americans and most Latinos. It should include an economic means test. Only those who are poor or grew up in deprived neighborhoods should benefit. At the same time, poor whites from deprived neighborhoods should be phased into the program, a development that would counter the arguments of right-wing critics.
Finally, affirmative action should be severed from the goal of diversity—which, as the legal scholar Peter Schuck has argued, is best left to the private sector. Middle-class blacks and Latinos would continue to benefit from such voluntary programs, properly understood as a sharing of diverse experiences and perspectives rather than a withdrawal into ethnic glorification. There is every reason to believe the nation's corporations and universities will continue to find such a policy to be in their own best interests, and the nation's.
Americans have always recognized that high ideals, however desirable, inevitably clash with reality, and that good public policy requires compromise. But only through the struggle of affirmative action are they coming to realize that such compromises, wisely pursued, can actually serve a higher principle: the supreme virtue of being fair to those who have been most unfairly treated.
After months of resistance, Iran has agreed to accept stricter international inspections and temporarily suspend its production of enriched uranium. That is progress, but not enough to stop an Iranian nuclear bomb.
Last summer, while Americans were searching fruitlessly for evidence of a nuclear weapons program in Iraq, international inspectors found disturbing evidence next door in Iran. According to the International Atomic Energy Agency, Iran began enriching uranium at a pilot centrifuge plant last August and is also constructing larger underground enrichment facilities. Within a year, the plant could make enough highly enriched uranium for one bomb a year and the larger facilities could make 15 to 20 times as much.
Iran said that its program was for the peaceful generation of nuclear energy, but inspectors found traces of weapons-grade, highly enriched uranium. The IAEA set an October 31 deadline for full compliance. And now, under pressure from Europe, Iran has called a temporary halt to enrichment. But it would be a mistake to stop at this point.
Iran correctly claims that, as a party to the non-proliferation treaty, it has the right to enrich uranium for peaceful purposes. In a sense, the NPT was born with a loophole. Even if a country allows inspections, it can legally accumulate enriched uranium (or reprocessed plutonium) under the guise of a peaceful energy program and then suddenly declare that circumstances have changed, and withdraw from the treaty. It could then produce nuclear weapons at short notice. If Iran were to do this, it would add to the dangers in an unstable region and would be likely to begin unraveling the non-proliferation regime worldwide.
For these good reasons, President George W. Bush has declared an Iranian nuclear weapon unacceptable. Our unilateral options, however, are limited. Not only is our military plate full with the effort to stabilize and reconstruct Iraq, but it would also be difficult to recruit allies. Fortunately, there is a precedent from something we began in the mid-1970s.
India had recently exploded a nuclear device, France was selling a reprocessing plant to Pakistan and Germany began to sell enrichment technology to Brazil. Many parties to the NPT planned to import or develop enrichment and reprocessing facilities. We feared that the non-proliferation regime was unraveling. The Jimmy Carter administration successfully persuaded France and Germany to curtail their exports, and countries as diverse as the Soviet Union and Japan joined us to form a Nuclear Suppliers Group. In 1977, we signed an agreement to restrict the export of enrichment and reprocessing facilities. We were able to plug part of the loophole without amending the treaty.
Mr. Bush should build on recent progress by approaching Europe, Russia and others and persuading them to offer Iran a deal that would fully plug this loophole. Russia, which is helping Iran to construct a nuclear energy reactor at Bushehr, would offer it a guarantee of low-enriched uranium fuel and reprocessing of the reactor's spent fuel in Russia. The deal would be given teeth by a United Nations Security Council resolution, endorsed by the US. The resolution would include a stick declaring that further proliferation of nuclear weapons would be a threat to peace under the Charter and that any country moving in such a direction would be subject to sanctions. The resolution would also include a carrot—guaranteeing Iran access to the non-dangerous aspects of nuclear energy technology. The deal could be sweetened by offers to relax existing sanctions and a security guarantee if Iran remains non-nuclear. At best, such a proposal would head off a looming danger in the Middle East. At worst, if Iran rejected the deal, it would have served notice of its real intentions.
European foreign ministers have produced a useful first step. Russia has refused our requests to cancel the Bushehr reactor but has indicated it would be willing to provide such fuel services. Given Iran's suspicion of the US, subtle American diplomacy would be needed to persuade Europe and Russia to launch the proposal, and we would then announce our support. There have been few recent issues on which the US, Russia, Europe and the UN are in close agreement. Such a proposal offers a rare opportunity to co-operate on an issue of vital concern.
THE GAZA STRIP has been called "a refugee state." Indeed it is humbling to wander through the Beach Camp, built to the edge of the Mediterranean Sea in Gaza City. A lonely child, severely underfed, sits on an oil drum, talking to himself and staring at the sea. An old, toothless woman invites me to sip mint tea in her dark shack, then shows me photographs of herself as a beautiful young girl. "Where was this taken?" I ask. In Jaffa, she explains, now part of Israel. "I fled in 1948. I've lived here ever since."
Gaza City is an odd place. Dusty, hot, and teeming with children, it offers scenes of squalor juxtaposed with bursts of opulence. Donkey carts coexist with Jaguars roaring by. Futuristic buildings colored pink and periwinkle—built recently when the Palestinian Authority came here to govern and rich Arabs invested in the city—tower over neighborhoods where older buildings have been ripped apart by Israeli bombs.
Despite the unrest, people carry on. Shops sell furniture and mobile telephones; people go out to dinner. One afternoon, I passed a school when the gates opened. Children gushed out, the boys in T-shirts and blue denim, the girls in head scarves and long smocks—separate from the boys since the society is Muslim.
As I traveled throughout the Gaza Strip, I kept encountering multitudes of children. In the north at Beit Hanoun, they played soccer in streets chewed up by Israeli bulldozers, hide and seek in the rubble of demolished houses, and even romped on jagged water pipes and bridges that the Israelis had bombed.
I thought it sad that the Israeli government found collective punishment necessary, inflicting wounds on neighborhoods, villages, and towns for the crimes of single terrorists. The policy is disproportionate, and it has not worked.
Most Palestinians are ambivalent about Hamas and the other militant Muslim groups. On the one hand, they admire Hamas for its fierce resistance to Israel and the United States; on the other they want peace and quiet, work to feed their families, and they hold Hamas responsible for the paralysis of their economy and the horrors of collective punishment.
Yet these huge Palestinian families may one day unlock the riddle of the Arab-Israeli conflict. I mentioned the mobs of children throughout Gaza—and the same is so in the West Bank. Palestinians generally do not practice birth control; it is not unusual to meet a Palestinian father who has eight, 10, or even 15 children. Israeli families normally do not exceed two or three children.
This phenomenon is what sociologists call a "demographic time bomb"—and it terrifies Israelis across the political spectrum. Within the next two decades, Arabs between the Mediterranean and the Jordan River will surpass Jews in numbers. Many Palestinians consider the road map a bad joke, and are willing to wait until Arabs command a majority in all of historical Palestine.
Michael Tarazi, a legal adviser to the Palestinian Authority, urges patience until Arabs and Jews become in effect one entity throughout Israel and the West Bank and Arabs can demand equal rights. He argues that eventually the world will impose a one man, one vote system on Israel. Soon enough, with an Arab majority, a Palestinian will be elected president of a new state embracing all of Israel, Gaza, and the West Bank.
The argument is full of holes, but demographically it contains a certain logic. Liberal Israelis are determined to retain Israel's mostly Jewish character. They realize that time for a two-state solution is running out. They agree with the Arabs that Ariel Sharon's plan for a minimal Palestinian state will never work.
Thus they are thrown back on President Bush's "road map." This process—already so vague about final borders and how a Palestinian state will be achieved—will inevitably fail unless Bush makes decisions from which his predecessors have recoiled. Of course he will put pressure on the Palestinian Authority to crack down on terrorism. But how will he confront the obstructions and delaying tactics of Sharon, who palpably has no intention of dismantling major settlements or surrendering East Jerusalem?
One wonders whether Bush has a clear idea of the mountains he must move. Does he realize that to move Sharon he may have to withhold spare parts for the Apache helicopters, F-16 aircraft, and other military equipment that the United States sells or gives to Israel?
Polls show that most Israelis still want to exchange land for peace. Reports reaching Israel suggest that Bush is furious with the Sharon government for impeding the road map. Possibly he senses that this is the last chance for a two-state solution. Those multitudes of children in Gaza and the West Bank will soon grow up, and then the demographic time bomb may explode.
Archbishop Sean P. O'Malley, demonstrating the high priority he places on services to the poor, yesterday named a priest who is a national leader in Catholic social services and the former dean of Harvard Divinity School to oversee Catholic Charities of the Archdiocese of Boston.
O'Malley's choice of the Rev. J. Bryan Hehir as president of the largest private social services agency in the state places an outspoken and highly regarded intellectual in the inner circle of the church's hard-pressed local administration. Hehir, who will now be a member of O'Malley's cabinet, has spoken widely about the sexual abuse crisis in the church, and has been critical of the hierarchy, declaring just last week at Boston College that "we've got to treat adults as adults in the church."
He succeeds current Catholic Charities president Joseph Doolin, who had announced plans to retire.
Hehir, 63, who currently serves as president and chief executive of Catholic Charities USA in Alexandria, Va., will assume his new post in January. The Boston agency provides social services to hundreds of thousands of poor people, most of whom are not Catholic and many of whom are immigrants.
"There is rising human need, and there are declining resources to meet those needs, rooted in the fact that we're coming out of a recession, state budgets all across the country have been devastated, and the church across the country has been impacted by the [abuse] crisis," Hehir said in a telephone interview. "We're working on housing, nutrition, helping people with employment searches, but in Boston we're doing that under an overarching reality, which is the need for a pastoral rebuilding of trust within the church, and a public rebuilding of trust within society as a whole."
O'Malley, who as a Franciscan Capuchin friar has expressed a pastoral preference for the poor, declared privately shortly after his installation July 30 that he wanted a priest to take over the agency upon the retirement of Doolin, the social worker who has headed the agency since 1989. O'Malley consulted with the agency's board of trustees, but made the choice himself.
"Archbishop O'Malley, as the bishop of Boston, has a great concern for the poor and needy of Boston, and his own charisma as a Franciscan gives him a deeper love for the poor and underprivileged," said O'Malley's spokesman, the Rev. Christopher J. Coyne. "The appointment of Father Hehir to this position is going to strengthen the church's mission to the neediest among us."
Many church observers were surprised that such a prominent figure would move from the national organization to a local one. But it was hailed in Boston.
"In the world that Bryan Hehir lives in, he's a giant," said Neal F. Finnegan, chairman of the Catholic Charities board of trustees. "This certainly makes a statement about the importance of Catholic Charities in the array of things this archbishop is trying to strengthen."
Peter Meade, the board's vice chairman, called the appointment "an incredible signal to the priests of the diocese that the archbishop wants to hear from everyone, and a signal to us of how important he believes Catholic Charities is to Boston."
Voice of the Faithful, a lay organization that has been contributing money to Catholic Charities because the Archdiocese of Boston has refused to accept the group's contributions, also welcomed the move.
"This is . . . an extraordinary coup for the Archdiocese of Boston," said James E. Post, the president of the Voice of the Faithful. "Father Hehir is a scholar, a passionate advocate for social justice, and a gifted leader who well understands the impact of this crisis on Catholics and Catholic institutions."
Hehir will face numerous challenges at Catholic Charities, a 100-year-old agency that served 200,000 people last year and had a budget of approximately $40 million, most of which came from the state government and private fund-raising. Doolin is credited with building up the board of the organization, which includes numerous prominent business leaders, and with spearheading an ongoing but painful reorganization in which the agency expects to consolidate about 75 facilities around Greater Boston to fewer than 10.
Hehir is not a well-known public figure, but in academia and in church circles he is highly regarded for his intellect and communication skills. He is one of the leading Catholic experts on the theory of just war.
Educated at St. John's Seminary in Boston and ordained in 1966, Hehir worked from 1973 to 1992 on the staff of the National Conference of Catholic Bishops, where he was regarded as the architect of the bishops' influential 1982 pastoral letter on nuclear weapons, which called for reducing the nation's nuclear arsenal. In 1984 he won a so-called genius grant from the MacArthur Foundation, and he has won more than 25 honorary degrees from various universities.
While at the bishops' conference, Hehir taught at Georgetown, and then he moved back to Boston, where from 1993 to 2001 he was a professor of the practice of religion and society at Harvard Divinity School. From 1998 to 2001, he served as the first Catholic priest to lead Harvard Divinity School, but he refused to take the title of dean or to live in the dean's mansion to demonstrate that his first duty was to the church.
Cardinal Bernard F. Law, then archbishop of Boston, made it clear he was unhappy with Hehir being stationed at Harvard, a historically Unitarian school with a reputation for progressive theology. In 2001 Hehir left to head the national Catholic Charities agency, which does not provide direct services but rather serves as an umbrella organization advocating in Washington for the interests of Catholic Charities agencies around the country.
Over the course of the church crisis, Hehir has been much in demand as a public speaker, and has addressed the crisis at multiple venues, including Regis College and Boston College. He has also spoken on the subject to the Boston Priests Forum.
At BC last week, he declared that the church is still in crisis, saying, "We've had the nuclear explosion, the blast is over, but the radiation is still in the air," and warning that "if we lose the people who have become disillusioned by this, the next two or three generations of their families are gone, too. . . . That will be devastating."
Hehir said yesterday that even as the archbishop's secretary for social services, he expects to continue to speak his mind.
"You need to know when you're working in a given framework, that you're not a freelancer. But I've always found people give you the scope to speak as honestly as you can to problems, and that you should try and be as direct and clear as you can," he said.
Next month the Federal Communications Commission in the US is likely to lift remaining restrictions on media cross-ownership. Michael Powell, chairman of the FCC and son of Secretary of State Colin Powell, has made it clear he believes the marketplace should be the mechanism by which the media business is managed. It sounds fair. But in a country where about six major conglomerates control the bulk of all forms of news and entertainment media, from books and film to newspapers, radio and television, the June decision will open the way for further consolidation, in which a single company could own newspapers, radio and network/cable TV in the same city.
The debate in the US has been limited, since the public's main source of news—television—has not covered how media monopolies affect the news the public receives; the networks themselves are owned by the main media giants. For the American public this obscured debate—which occasionally makes the business pages of mainstream newspapers—highlights the very dangers of a hyper-commercialisation of news. It is not so much bias as the absence of information that truly disenfranchises people.
The recent Iraq conflict was a showcase for the commoditisation of news. The key media winner in the US was Fox News—the 24/7 cable news channel owned by Rupert Murdoch's company, News Corporation, which claimed a 300 per cent increase in viewers during the war.
The network channels ABC, NBC and CBS all lost viewers, while the cable channels, led by Fox's example, won audiences by being there on demand and also for selling the war news through a patriotic filter where the US flag was put front of screen and both anchors and reporters embedded with the troops used "we" throughout their coverage. The battlefield war for the cable channels was a means to fight the ratings war. They used it as a hook to break the news connections with the network houses, and CNN, a very different product in the US to what is broadcast in Europe and the Middle East, chased Fox in an opinion-led war coverage style.
In the US, the weakness of public broadcasting—particularly on TV, where PBS has neither the resources nor the newsroom to compete on war coverage—means that the counterbalance to the market-led extremities is absent. NPR and PBS relied on relays of the BBC, on radio and television, to maintain a semblance of full coverage, and hence the BBC's man in Baghdad, Rageh Omaar, was dubbed the "Scud Stud" by the New York Post—ironically, another Murdoch house. Once CNN was thrown out by the Iraqis at the end of the first week, no US TV crew remained to report the news.
The BBC is using the opportunity to expand its news profile in the US. Its audiences are up 28 per cent, and it will put its 24-hour news channel on cable here later in the year. The BBC director general, Greg Dyke, bluntly told an audience at Harvard Business School during the war that the US electronic media had "wrapped the flag around itself and given up the ghost of real journalism". And that's the point. For the AOL Time Warners, Disneys, Viacoms and News Corporations of the world, it is not "real journalism" or informed democracy that is the end game—it is the rewards of the market, the profit margin. It is unrealistic to expect a commercial conglomerate without legal requirements (the fairness doctrine no longer applies in the US) to behave like a not-for-profit trust or public service broadcaster. But for the US, which has the most deregulated news media market and the weakest public broadcasting sector in the developed world, the real losers are the citizens, who at the end of the "war" were still left in the dark about key events: the use of cluster bombs, the impact on civilians, and the conflicting statements of record on what happened in a range of incidents, including the market bombings.
This week both the BBC and the Guardian reported that their websites had taken off in the US during the war. Traffic to BBC News online increased by 47 per cent, while traffic to the Guardian news site was up 83 per cent (admittedly from a relatively low base). While this is good news for the two UK media groups, the real trend is that the gap in public service or not-for-profit news provision in the US is increasingly being met by overseas sources—indeed, by British licence-fee payers in the case of BBC News.
It is ironic that, at a time when the US is debating its lonely superpower status, it lacks a super news media. France's President Jacques Chirac has already set aside a budget for the creation of a French-language 24/7 global news television channel to compete with the growth of global "soft power" forces like the BBC. For Europeans, and indeed small countries like Ireland, the Iraq war, with its image-saturated 24/7 coverage from embedded, largely US/British journalists (two thirds of the 600 embedded reporters were American), has underscored the need to debate the fate of the public's access to information.
Market forces can lead to monopolies that are just as undesirable for citizens as state-controlled monopolies. The threat to public access to information was clear in totalitarian states but the threat may be just as significant in a news media world run solely to generate profit rather than spread information and seek the truth. A world where a handful of business interests dominate the provision of news needs a balance.
It may be too late for the US public to engage the FCC in debate before the June deadline, but for those of us on the other side of the Atlantic, where the wave of global media is hitting, it is time to start learning lessons—not just from the inspiration of Thomas Jefferson, who said the right to information protected every other right, but also from the reality of the limits of a free-falling market.
Observing the second anniversary of Al Qaeda's assault on the World Trade Center and Pentagon, administration spokesmen sought to highlight progress in the war on terrorism to support President Bush's claim that we're getting safer every day. But if one stands back and asks whether Americans are actually safer from terrorist attacks than we were 12 months ago, a serious answer requires a net assessment. Our safety is a function not only of what our government does, but also of changes in our adversaries' capabilities and motivation.
Assessing that balance, I conclude that the threat of terrorist attacks on America in the year ahead remains at least as high as it was last year. Consider four fronts in this war: the international campaign against Al Qaeda; homeland security; preventing nuclear and biological terrorism; and Iraq.
In the first 12 months after the attack, America organized an extraordinary worldwide campaign against terrorism. Universal sympathy for American victims of 9/11 led governments and citizens around the world to proclaim solidarity under the banner, "We are all Americans." Through the UN, the United States enlisted more than 100 nations in a global effort to share intelligence information, enforce antiterrorist legislation in each local setting, and stop terrorists' flow of funds.
In the past year, this global coalition has frayed. America's standing in the world has fallen further and faster than at any time in the history of polling. When asked recently whom they trust to "do the right thing regarding world affairs," more Pakistanis, Indonesians, and Jordanians chose Osama bin Laden than President Bush.
Why does this matter? The essence of effective counter-terrorism is intelligence and police enforcement at the local level. The mastermind of Al Qaeda's Southeast Asian operations was captured in Thailand as a result of a tip from suspicious neighbors combined with active cooperation between local Thai agents and the CIA. The "hearts and minds" of governments and citizens provide either a sympathetic sea in which terrorists swim and hide, or alternatively, millions of eyes and ears from which terrorists cannot escape. Although the United States caught three of Al Qaeda's top leaders in the past year, bin Laden and his second-in-command are still on the lam and Al Qaeda recruitment is up.
Second, in defending the American homeland, the US government has taken many significant actions, most notably the creation of the Department of Homeland Security. In time, this will yield security dividends. But realism requires recognition that in the first year after such disruptive changes, capabilities are more likely to decline than increase. Former Senator Warren Rudman's recent Council on Foreign Relations report finds a shortfall of $98 billion in priority investments in homeland security over the next five years.
Third, the campaign to prevent a nuclear or biological 9/11 is failing. Despite lots of talk, North Korea is today producing plutonium that could be sold to terrorists for use in a nuclear attack. Russian and Pakistani nuclear weapons and materials remain as vulnerable to theft as they were last year. Orphaned research reactors in a dozen countries around the world, including Libya and Ghana, contain highly enriched uranium sufficient to make a number of nuclear weapons. And while Saddam no longer rules Iraq, he has gone missing, and with him a sophisticated smuggling network and whatever biological weapons Iraq had.
While the administration rightly recognized the threat of a bioterrorist attack with smallpox or other agents, the campaign to protect American citizens flopped. The announced program to vaccinate 500,000 first responders against smallpox was recently abandoned after fewer than 40,000 had been vaccinated. Long term, Project Bioshield will create new vaccines and therapeutics. Short term, the public health system's capacity to identify and respond to bioterrorism remains dangerously inadequate.
Iraq is the wild card. Analytically, Iraq was tangential to the immediate war on terrorism. The CIA found no links between Saddam and 9/11 and no evidence of Iraqi support for terrorist attacks on American targets since the early 1990s. But President Bush's decision to market war on Iraq as a centerpiece of the war on terrorism has now forged an inextricable link.
While the demonstration of America's military might has surely sobered regimes that wish America harm, the administration's incompetence in postwar Iraq has attracted both terrorists and jihadi wannabes in what now threatens to become a terrorist incubator akin to Afghanistan during Soviet occupation.
For the year ahead, Americans are at least as vulnerable to terrorist attacks as we were in the year past. After the next mega-terrorist attack, Americans will find our government's failure to mount a more effective war on terrorism as inexplicable as they found the failure to prevent 9/11.
Five years ago, Haven Acres—a largely African American neighborhood of Tupelo, Miss.—showed familiar signs of urban decay: gangs, drugs, trash, dilapidated housing, for-sale signs, distrust and hopelessness. Haven Acres was so dangerous that armed cops were afraid to enter the neighborhood unless they had backup.
But last month I walked at dusk through a miraculously transformed Haven Acres. Crime rates have fallen by 86%, and the drug dealers and for-sale signs are gone. Larry Otis, the white Republican mayor of Tupelo, understandably brags about this grass-roots accomplishment as the proudest success of the city during his tenure.
At the heart of this change is the community center that houses a booming Boys and Girls Club, created to give the kids of Haven Acres a positive alternative to gang life.
Ron Green, who grew up in Haven Acres and now runs the Boys and Girls Clubs of northern Mississippi, says the mentors at the Haven Acres club served 260 kids each day this summer, saving them from drugs, crime and maybe even death as they learned self-esteem and civic virtue.
Youths whom he would not have trusted with the keys to his office six months ago now stay there late to clean up, proud of the community they are creating.
And then Green's face falls. More than half of the mentors at Haven Acres were supported by the AmeriCorps program, which offers young people a modest living allowance and a promise of help toward college expenses in return for a year of full-time community service.
Cutbacks in AmeriCorps have forced him to fire all those mentors and slash his services to Haven Acres children.
Haven Acres is not alone. Federal politicking has forced AmeriCorps to renege on its commitments to community service programs across the country. In the three-state Mississippi Delta, the nation's poorest region, AmeriCorps' support for programs has been chopped by 90%. And thousands of other programs from coast to coast have been similarly cut.
Like Green, the harried leaders of those programs have been forced to eliminate positions for volunteers—nationally, 20,000 such AmeriCorps slots have been cut, nearly half the total planned for this year.
During the closing decades of the 20th century, as I wrote three years ago in "Bowling Alone," Americans became steadily less connected with one another and with collective life. We voted less, joined less, gave less, trusted less, invested less time in public affairs and disengaged from friends and neighbors and even from our own families. Our "we" steadily shriveled.
In this context, the attacks of 9/11 represented not just a national tragedy but also an extraordinary opportunity for civic renewal. After these decades of disengagement from our communities, Americans surprised themselves in their post-9/11 solidarity.
Many people recognized that this was a "teachable moment." If we caught that moment to build the foundation for a new culture of service and civic engagement, we could create a new "greatest generation." President Bush articulated this national yearning in his 2002 State of the Union address.
The president proposed a substantial expansion of the federal government's support for national and community service through AmeriCorps.
The program was already a proven success at helping young people from all walks of life—not just the wealthy who can afford full-time volunteering—invest a year of their lives in community service. Now it would be expanded to 75,000 posts a year.
Bush's call to action caught the imagination of the nation's youth. Applications to AmeriCorps jumped, and organizations in thousands of communities worked to absorb and direct these new energies. But now Washington is betraying the hopes of these idealistic youngsters.
Despite support for AmeriCorps from 44 governors, 79 senators, 250 corporate leaders and the vast majority of the American public, right-wing ideologues in the House have blocked the funding needed to keep the program even close to the growth path that the president laid out nearly two years ago. The House and Senate appropriations committees recently proposed allocating $340 million to $345 million to AmeriCorps for next year, but this initiative does nothing to forestall the draconian cuts facing its programs this year.
The president has so far remained above the fray, seemingly reluctant to commit himself to defending his own promise. Discussing his call to service two days after the 2002 State of the Union address, Bush told volunteers in Daytona Beach: "I'm one of these accountability guys. I understand sometimes in political process all you hear are words. I like to back them up with action. It's one thing to lay it out; it's another thing to follow up. I'm a follow-up guy."
Mr. President, now's the time for you personally to follow up.
The nation's young people responded enthusiastically to your leadership two years ago, and now's the time to make good on your commitments to them. Now's the time for you to persuade the House to act.
This is indeed a "teachable moment," and American youth—in Haven Acres and across the country—are listening. What civics lesson will you teach?
We know what the benefits of a war on Iraq would be: the ouster of a cruel tyrant and the elimination of his weapons of mass destruction. But we also know what the costs would be: prohibitive. The Bush administration believes that the overthrow of Saddam Hussein could bring democracy to Iraq, which would then spread to the rest of the Arab world. This is a fantasy. It will be difficult to introduce democracy in a heterogeneous country that has never experienced it. After 30 years of repression, there could be violence between ethnic and religious groups that US forces would have to cope with; moreover, a prolonged occupation and military rule would squander the good will that we, as liberators, expect to enjoy at first. Arab governments will try to contain the spread of democracy in order to stay in power. Moreover, as long as we decline to intervene decisively in the Israeli-Palestinian conflict and suggest that it may have to wait until Arab regimes have changed, Arab suspicion of the United States will mount.
Indeed, American control of Iraq could contribute to Muslim terrorism and foster a xenophobic fundamentalism aimed at US "imperialism." Already the administration's obsession with Iraq and its bullying way of obtaining support have provoked considerable anti-American resentment abroad.
The image of the United States has been tarnished by our manipulation of the United Nations and our alliances with NATO countries, our efforts to split the European Union, and our disdain for public opinion abroad.
More seriously, the Bush administration has recklessly attacked the very foundations of world order that this country helped put in place: the UN, international law, and the EU have been the casualties of a team that has repudiated a distinction we had wisely preserved throughout the Cold War, between leadership and dictation. This may result in something we had avoided: an anti-American ganging-up of countries threatened by the growth of unchecked American power.
Then there are the economic costs of the war as well as those of rebuilding Iraq and not neglecting its needs, which would encourage those who doubt American good intentions.
Are these costs worth it?
We have two main reasons to go to war. Both are shaky. Iraq is effectively defanged and incapable of constituting a real threat to us or its neighbors as long as we operate freely in two no-fly zones, the Kurds have autonomy under Anglo-American protection, and inspectors roam freely.
Contrary to the administration's assertions, containment can continue to work in the long run, especially if the no-fly zones and the inspections are maintained, a tight naval blockade prevents military imports into Iraq, and ground forces remain stationed at Iraq's borders. War is not necessary to render Iraq harmless to others.
The more difficult issue is that of Saddam Hussein. Like preventive war, forcible regime change violates international law. The exceptions to state sovereignty that were made legitimate through the so-called humanitarian interventions in the 1990s have never involved "regime change" and had to be justified by the argument that the violation of human rights constituted a threat to international and regional security. This could have been an argument for intervening against Saddam Hussein's regime in 1991; the opportunity was missed, and since then he has not committed mass crimes against humanity. Still, his regime is based on fear and terror.
Sooner or later all governments may realize their citizens are entitled to basic human rights. But this principle will need international support, not unilateral action, to be established, and a clear understanding of the differences between "ordinary" bad regimes and truly evil ones.
Meanwhile, those who sympathize with the plight of the Iraqi people yet do not want to increase their suffering through war should do all they can to make Saddam Hussein's position increasingly difficult. The International Criminal Court or a special criminal tribunal can try him for crimes against humanity, order a ban on travel by all top officials, deprive them of their fortunes abroad, and indicate that a recovery by Iraq of its full sovereignty will depend on its compliance with those decisions. Furthermore, we can provide covert assistance to groups of Iraqis willing to act against Saddam Hussein and overt aid if they try to overthrow him.
For these purposes, we could lead a "coalition of the willing" far bigger, and less resentful, than the one we're trying to force into supporting us for a risky and unpopular war. The democracy we say we champion should begin by recognizing that listening to what the public says is not tantamount to appeasement and inaction. Gaining more victories in the difficult war against terrorism is a far more widely shared goal, and one that could be imperiled by American hubris in the war against Iraq.
Here’s how to think about the EU’s new constitution
Alas, to be born into the world without friends. That seems to be the fate of the draft constitution of the new European Union, to be presented at the summit in Thessaloniki on Friday. To a contemptuous Romano Prodi, the head of the European Commission who dreamed of building a stronger "federal" union, the document "lacks vision and ambition." By contrast, British Tories and tabloids are positively twitching with Europhobia. The proposed constitution will "sweep away 1,000 years of history," proclaims the Sun.
DON’T BE FOOLED. Prodi and his federalists are right. The Convention on the Future of Europe has deliberated—and delivered a mouse. When they began their work 18 months ago, Prodi and other Euro-insiders were convinced that they would dominate. They intended to exploit the bold rhetoric of constitutionalism to craft an idealistic document that centralized ever more power and democratic control in Brussels. There was heady talk of a new name, "United Europe." There were proposals for introducing majority voting on sensitive issues of foreign and defense policy. Brussels’s influence would be extended over national fiscal and social policies. National vetoes would be eliminated. A new EU president would be directly elected, by and for the people. Farewell, faceless Eurocracy. Welcome, democracy.
Little of this has come to pass. The convention’s canny chieftain, former French president Valery Giscard d’Estaing, knows where real power in the EU lies. Any radical draft would be picked apart by national governments when they begin reviewing the document this autumn. To encourage them to accept his draft intact, Giscard faced down the federalists and struck backroom deals with the member states. Case in point: last week’s wrangling over whether to allow EU foreign policy to be determined via qualified majority voting. Bottom line? It won’t be.
Most of the reforms that remain are simply good public management. With enlargement and a union of 25 member states, many of them tiny, it was high time to replace the revolving six-month council presidency, responsible for organizing the overall agenda. Thus there will be a new EU president, appointed by the states for a five-year term. Both "old" and "new" Europeans agree that crises like Iraq require greater integration. Thus the two top EU foreign-policy posts—currently held by Javier Solana and Chris Patten—will be consolidated into the new position of EU foreign minister. For most European countries the management of asylum, crime and defense procurement justifies a minimum set of intergovernmental standards. Thus a smattering of consensual policies will be newly governed by qualified majority voting and European Parliament oversight.
Otherwise, the new constitution merely consolidates current practice—a fact that should be welcomed rather than scorned. For too long, the debate over union has been divided between radical federalists who would move forward toward a centralized Europe and excessively cautious skeptics who would roll it back. What has emerged from the constitutional convention is a Europe in equilibrium. Henceforth, policies of greatest concern to citizens—social welfare, taxation, pensions, health care, education, culture, infrastructure—will remain essentially national and local. Those of less interest—trade, banking, industrial standardization, technical harmonization—will be European. In the middle, a few policies—immigration, policing and defense—will be shared. This is a heartening outcome. In coming to terms with its strengths as well as its limitations, the European Union has at long last entered its maturity.
For many transatlantic pundits, the Iraq crisis is further proof that Europe needs an autonomous military force. This view was forcefully expressed last week by Laurent Fabius, the former French prime minister, who said in the Financial Times that Europe "was unable to make its voice heard in the US because it was divided and lacked a unified defence force".
For some years now politicians have found European defence irresistible. European public opinion strongly favours it. European federalists want the European Union to have greater powers. French Gaullists, long convinced that military might means great power prestige, trumpet the idea. Tony Blair, Britain's prime minister, has exploited it to become more "European"; Joschka Fischer, Germany's foreign minister, has exploited it to become more military.
The logic is seductive: if the US respects only military power, a European army will surely command respect. Yet European defence is a dangerous pipe dream. And the Iraq crisis demonstrates why.
A co-ordinated military force with the global capability to fight a high-technology, low-casualty war would require Europeans to increase military spending, currently 2 per cent of gross domestic product, to more than the US rate of 4 per cent if they are to overcome a decades-long US lead. No European public would accept this.
However heavily they were deployed, European transport aircraft, satellites and multilingual soldiers would not add up to an effective policy response to US unilateralism. Do Europeans propose to use military force against the US? Launch "pre-preventive" interventions?
Or is the goal to reduce European dependency on Nato? If so, the result would be to encourage precisely the withdrawal from Europe advocated by US hawks. A European rapid reaction force might be useful for peacekeeping but neither it nor a larger force would reverse determined US unilateralism.
The entire notion is in fact incoherent. Europeans have claimed from the start of the Iraq crisis that non-military means should be used more intensively. Yet when Washington sends in the marines, Europeans call for a stronger defence.
The real problem is that European defence schemes distract Europe from its true comparative advantage in world politics: the cultivation of civilian and quasi-military power. Europe is the "quiet superpower". There are at least five ways in which Europe can wield influence over peace and war as great as that of the US.
First, EU accession—perhaps the single most powerful policy instrument for peace and security in the world today. In 10-15 potential member states, authoritarian, intolerant or corrupt governments have recently lost elections to democratic, market-oriented coalitions held together by the promise of eventual EU membership.
Second, Europeans provide more than 70 per cent of all civilian development assistance. This is four times more than the US and is far more equitably disbursed, often by multilateral organisations. When the shooting stopped in Kosovo and Afghanistan, it was the Europeans who were called on to rebuild, reconstruct and reform.
Third, European troops, generally under multilateral auspices, help keep the peace in trouble spots as disparate as Guatemala and Eritrea. EU members and applicants contribute 10 times as many peacekeeping troops as the US. No one outside Washington believes US troops will be able to do the job after the Iraq war.
Fourth, monitoring by international institutions, supported by Europe, builds the global trust that is needed to manage crises. The Iraq crisis might have developed very differently if the Europeans had been able to offer the option of sending in, say, 10 times as many weapons inspectors 10 months earlier.
Last, the Iraq crisis demonstrates the extraordinary effect of multilateral institutions on global opinion. In country after country, polls have shown that a second United Nations Security Council resolution would have given public opinion a 30-40 per cent swing towards military action. With the US stance apparently lacking international legitimacy, American troops have been unable to open a second front from Turkish territory; and the bill for the war is likely to fall largely to the US.
Americans are not just unwilling but also—for complex domestic, cultural and institutional reasons—apparently unable to deploy civilian power effectively. That is the true weakness of US strategy today, for without trade, aid, peacekeeping, monitoring and legitimacy, no amount of unilateral military might can stabilise an unruly world.
Rather than criticising US military power, or hankering after it, Europe would do better to invest its political and budgetary capital in a distinctive complement to it. European civilian power, if wielded shrewdly and more coherently, could be an effective and credible instrument of modern European statecraft, not just to compel compliance by smaller countries but perhaps even to induce greater American understanding. Europe might get its way more often—and without a bigger army.
President Bush used three main arguments to justify sending American troops into Iraq. The first tied Saddam Hussein to Al Qaeda, but the evidence that was presented publicly remained thin. Public opinion polls show that many Americans accepted the administration's word on the connection, but overseas responses were more skeptical.
The second argument was that replacing Saddam Hussein with a democratic regime was a way to transform the politics of the Middle East.
A number of neo-conservative members of the administration had urged this before taking office but were unable to turn it into policy during the first eight months of the administration.
After Sept. 11, however, they quickly moved through the window of opportunity (even though North Korea posed a more imminent danger). President Bush spoke often of regime change.
The plausibility of this argument was debated at home and abroad, and the merits probably lie somewhere between the proponents and skeptics.
Our dedication to the broad values of democracy and human rights is part of our ''soft'' or attractive power and an essential part of our foreign policy.
But democracy is a fragile plant that requires carefully cultivated soil. It is not easily transplanted. Of the places where the United States has sent troops in the last half-century, only a minority of the interventions resulted in democratic governments.
Optimists cite the role of American military occupation in the democratization of Germany and Japan after World War II.
But conditions in the Middle East today are not like Germany and Japan in 1945. Both of those countries had large middle classes, prior experience with democracy, and a prolonged and largely unopposed American military presence.
Given its history and internal divisions, postwar Iraq is unlikely to look like democracy as we know it, but it will be a better and more pluralistic regime than now exists.
American military success may lead Iran and Syria to temper their policies, but if President Bush is unable to persuade Israeli Prime Minister Ariel Sharon to reach a compromise acceptable to the Palestinians, the map of the region may change less dramatically than the optimists hope.
The third argument was the clearest and most widely accepted. It focused on preventing Saddam Hussein from possessing weapons of mass destruction.
Most countries agreed that Saddam had defied UN Security Council resolutions for a dozen years.
Moreover, Resolution 1441 unanimously put the burden of proof on him to demonstrate what had happened to weapons that UN inspectors had been concerned about before they left Iraq in 1998.
If President Bush had focused on this third argument and been willing to work out a compromise along the lines suggested by Canada, he could have built a far broader coalition for the war. (Canada, and later Britain, suggested an agreement to give the UN inspectors more time in return for clear benchmarks and a date certain to declare the end of the process.)
I believe France might have gone along, but even if it had used its veto, American actions would have been legitimized by a majority in the Security Council similar to that we enjoyed when we intervened in Kosovo in 1999.
Unfortunately, the administration sent mixed messages. The more the unilateralists in the administration talked about regime change and going ahead no matter what the UN did, the more other countries became convinced we were not serious about cooperation, and the more the issue became the legitimacy of American power rather than the transgressions of Saddam Hussein.
The result is the right war at the wrong time, but the point is now moot.
Those of us who are critical of the clumsy handling and timing of the war must admit that indefinite containment was unlikely to succeed.
Saddam Hussein had a record of taking high risks, a clear intention to develop weapons of mass destruction, and a proven willingness to use them.
Enforcing Security Council Resolutions 687 and 1441 is better than returning to the evasive politics of the 1990s when Saddam Hussein successfully defied a divided United Nations.
We multilateralists must now hope that the war is brief, that the Iraqi people will visibly welcome the removal of a tyrant, and that the reconstruction of Iraq will involve many countries and a United Nations role.
Perhaps that will allow us to recover some of the legitimacy after the fact that the administration squandered before the war.
There would be something charming—quaintly reminiscent of Trollope perhaps—in the image of Britons "from pub landlords to vicars" forming a queue to vote in the Daily Mail "referendum" on the proposed EU constitution. Charming, that is, if it were not so corrosive of proper democratic debate.
The current campaign for a referendum shows just what is wrong with plebiscitary democracy. It is a clever campaign because it uses and abuses two of the highest political values in the west: limited government and democracy. Limiting government by blocking activities of "foreign" institutions may seem prudent, yet it is impractical in an interdependent world. Plebiscitary democracy—politics by referendum—seems unimpeachably "democratic" on the surface, yet in fact it empowers the rich, the ignorant, the negative, and the ideological. Voters lack the time, commitment or expertise to engage fully in complex issues—particularly when, as in the case of the EU, their main concerns are not on the agenda. Referendums in the US have shown that under such circumstances, huge amounts of money, slick consultants and access to the media are required to win...
Economic historians continue to debate the causes of the 'great divergence' of economic fortunes which has characterized the last half millennium. In this debate, the role of colonialism—and specifically the British Empire—must needs play a crucial role. If geography, climate and disease provide a sufficient explanation for the widening of global inequalities, then the policies and institutions exported by British imperialism were of marginal importance; the agricultural, commercial and industrial technologies developed in Europe from 1700 onwards were bound to work better in temperate regions with good access to sea routes. However, if the key to economic success lies in the adoption of legal, financial and political institutions favourable to technical innovation and capital accumulation—regardless of location, mean temperature and longevity—then it matters a great deal that by the end of the nineteenth century a quarter of the world was under British rule.
Also Development Research Institute Working Paper Series No. 2, RR# 2003-02.
The Israeli security cabinet's decision to "remove" Palestinian Authority President Yasser Arafat is ominous. Though the stated justification for the decision is the assertion that Arafat is an obstacle to peace, it actually seems designed to ensure the failure of the peace process envisaged by the road map.
This is not a time for passivity, subtlety, or ambiguity in Washington's response. Vehement US opposition to such a project is essential to averting reckless actions that are likely, at the very least, to set back the Israeli-Palestinian peace process for a long time to come.
There is no Palestinian leader who would be able to negotiate a peace agreement in the wake of Arafat's expulsion. Worse yet, if the expulsion causes the death of Arafat and/or the deaths of Palestinians who gather to protect their leader, the likely result is a further escalation of violence, with disastrous consequences for both communities.
Washington, unfortunately, prepared the ground for this dangerous turn of events by framing the appointment of Prime Minister Mahmoud Abbas (Abu Mazen) as another case of "regime change" in the Middle East—as the replacement of Arafat with Abu Mazen—and by shunning and seeking to isolate Arafat and pressuring European officials to break all contacts with him.
This approach has compromised Abu Mazen's legitimacy in the eyes of the Palestinian population and seriously undermined his
position. He came to be seen by many as a tool and accomplice in US efforts to reorder the Middle East and Israeli efforts to
perpetuate the occupation. It also encouraged a power struggle between Abu Mazen and Arafat, who had an incentive to block
some of Abu Mazen's initiatives in order to maintain his personal control.
Defining Abu Mazen, and now his designated successor Ahmed Qurei (Abu Ala), as replacements for and rivals of Arafat flies in the face of an important reality: both men are long-term, close associates of Arafat, and from this association they derive whatever domestic legitimacy they have and can potentially enhance.
Furthermore, the political strategy pursued by these two men—and, indeed, by all of the Palestinian leaders who have been committed to negotiating a historic compromise with Israel, in the form of a two-state solution—is ultimately the strategy of Arafat.
Where Arafat's leadership has gone awry is in his tactics: his continuing embrace of the bankrupt idea that violence can be used as a bargaining tool; his unwillingness to share control and credit.
The appeal of Abu Mazen and Abu Ala is that they are ready to advocate an end to violence and to engage in realistic negotiations in pursuit of the goal that Arafat has enunciated and persuaded the majority of the Palestinian population to accept: an independent Palestinian state in the West Bank and Gaza, with its capital in Jerusalem, in peaceful co-existence with the State of Israel.
The irony of the situation is that, by the time of Abu Mazen's appointment as prime minister, there was growing dissatisfaction with Arafat's leadership—including, significantly, his inability to end the violence and advance the negotiations—within the Palestinian political elite and general population. The idea of appointing a prime minister was generated within the Palestinian community itself and represented a significant accomplishment under trying circumstances.
US pressure no doubt contributed to Arafat's decision to accept his Legislative Council's recommendation. But by defining the action as a US-sponsored regime change, we were in effect saying to the Palestinian leadership: "We will force you to appoint a new prime minister, even if you have independently decided to do so."
As a result, not only was the legitimacy of the new prime minister undermined, but Palestinians felt further humiliated because their elected president, the father of their nation, and the symbol of their collective identity was being denied the respect and dignity due to his status.
Not surprisingly, Arafat's popularity has increased—and will increase even further if he is forcibly removed.
With the designation of Abu Ala as the new prime minister, we have another chance to frame this political development constructively, in a way consistent with the intentions of the Palestinian reformers and the dignity of the Palestinian public. Far from removing Arafat or replacing him with the new prime minister, we should endorse a redefinition of the role of the Palestinian president. Arafat, as president, would become the head-of-state, occupying a position that is essentially ceremonial and symbolic (similar to the position of the president of Israel), while the prime minister would be the head-of-government, responsible for conducting the internal and external affairs of the Palestinian Authority, including internal security and peace negotiations.
As head-of-state, Arafat would be treated with the full respect that is due to his office and would be recognized in his historical role as the symbol and father of the Palestinian people and its incipient state.
There is no guarantee that, under such an arrangement, Arafat would readily abandon his life-long habit of maintaining personal control. But his incentive to cooperate with his prime minister would be greatly enhanced. Above all, the prime minister would start out with the domestic legitimacy that he needs to function effectively and would have the opportunity to enhance his legitimacy by pursuing a meaningful peace process and achieving positive changes in the daily lives of his constituency.