Research Library

2012
Frankel, Jeffrey. 2012. “Internationalization Of The RMB And Historical Precedents.” Journal of Economic Integration 27 (3): 329-365. Website Abstract
The possibility that the renminbi may soon join the ranks of international currencies has generated much excitement. This paper looks to history for help in evaluating the factors determining its prospects. The three best precedents in the twentieth century were the rise of the dollar from 1913 to 1945, the rise of the Deutsche mark from 1973 to 1990, and the rise of the yen from 1984 to 1991. The fundamental determinants of international currency status are economic size, confidence in the currency, and depth of financial markets. The new view is that, once these three factors are in place, internationalization of the currency can proceed quite rapidly. Thus some observers have recently forecast that the RMB may even challenge the dollar within a decade. But they underestimate the importance of the third criterion, the depth of financial markets. In principle, the Chinese government could decide to create that depth, which would require accepting an open capital account, diminished control over the domestic allocation of credit, and a flexible exchange rate. But although the Chinese government has been actively promoting offshore use of the currency since 2010, it has not done very much to meet these requirements. Indeed, to promote internationalization as national policy would depart from the historical precedents. In all three twentieth-century cases of internationalization, popular interest in the supposed prestige of having the country’s currency appear in the international listings was scant, and businessmen feared that the currency would strengthen and damage their export competitiveness. Probably China, likewise, is not yet fully ready to open its domestic financial markets and let the currency appreciate, so the renminbi will not be challenging the dollar for a long time. We begin, however, by asking: What is international currency status, and why does it matter?
Download Paper
Campante, Felipe, and David Chor. 2012. “Why Was The Arab World Poised For Revolution? Schooling, Economic Opportunities, And The Arab Spring.” Journal of Economic Perspectives 26 (2): 167-188. Publisher's Version Abstract
In December 2010, the self-immolation of a Tunisian fruit vendor sparked what has come to be termed the “Arab Spring.” What first appeared as an isolated act of protest against local authorities quickly gained broader significance, as it was followed by a series of demonstrations that has shaken the grip of autocratic regimes across the Arab world. A year later, three longstanding dictators - Zine El Abidine Ben Ali of Tunisia, Hosni Mubarak of Egypt, and Muammar el-Qaddafi of Libya - have been ousted, after varying degrees of violence. Syria, Yemen, and Bahrain have all witnessed extensive turmoil, raising serious questions about the legitimacy and survival of their rulers. Elsewhere, the political leaders of Morocco, Algeria, and Jordan have also been pressured into enacting reforms to try to assuage public demands.
Download Paper
Campante, Felipe, and David Chor. 2012. “Schooling, Political Participation, And The Economy.” The Review of Economics and Statistics 94 (4): 841-859. Publisher's Version Abstract
We investigate how the link between individual schooling and political participation is affected by country characteristics. Using individual survey data, we find that political participation is more responsive to schooling in land-abundant countries and less responsive in human capital - abundant countries, even while controlling for country political institutions and cultural attitudes. We find related evidence that political participation is less responsive to schooling in countries with a higher skill premium, as well as within countries for individuals in skilled occupations. The evidence motivates a theoretical explanation in which patterns of political participation are influenced by the opportunity cost of engaging in political rather than production activities.
Download Paper
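The opportunity-cost mechanism in the abstract above lends itself to a one-line statement. The display below is an illustrative sketch under assumed notation, not the paper's model: let B(s) be the perceived benefit of political participation and w(s) the market wage for an individual with schooling s, so that

\[ \text{participate} \iff B(s) \ge w(s). \]

Where the skill premium is high, w(s) rises steeply in s, so additional schooling raises the opportunity cost of participation faster, consistent with the weaker schooling-participation link the authors report in human capital-abundant countries.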
Good, Mary-Jo DelVecchio, and Byron J Good. 2012. “Significance Of The 686 Program For China In Global Mental Health.” Shanghai Archives of Psychiatry 24 (3): 175-177. Shanghai Archives of Psychiatry Abstract
Quietly, with little apparent notice from even the strongest advocates for global mental health, China is undertaking the world’s largest - and arguably most important - mental health services demonstration project, a project focused on providing comprehensive care for persons with severe mental illnesses. As Professor Ma indicates in her short report, the ‘686 Project’ was launched as part of China’s commitment to rebuild its public health infrastructure following the SARS epidemic, and has now moved beyond the initial pilot phase into a process of scaling up community mental health services throughout the country. China is currently moving toward passage of its first national mental health law, so the project has profound implications for mental health policy in the country. It will also provide useful models for the development of mental health policies in other countries with limited mental health personnel.
Download Paper
2011
Antràs, Pol. 2011. “Offshoring And The Role Of Trade Agreements.” American Economic Review. Publisher's Version Abstract
The rise of offshoring of intermediate inputs raises important questions for commercial policy. Do the distinguishing features of offshoring introduce novel reasons for trade policy intervention? Does offshoring create new problems of global policy cooperation whose solutions require international agreements with novel features? In this paper we provide answers to these questions, and thereby initiate the study of trade agreements in the presence of offshoring. We argue that the rise of offshoring will make it increasingly difficult for governments to rely on traditional GATT/WTO concepts and rules—such as market access, reciprocity and non-discrimination—to solve their trade-related problems.
How different would the world be today if there had been no 9/11? What if the attacks had been foiled or bungled? One obvious answer is that Americans would probably care a lot less than they do about the rest of the world. Back on the eve of destruction, in early September 2001, only 13 percent of Americans believed that the U.S. should be “the single world leader.” And fewer than a third favored higher defense spending. Now those figures are naturally much higher. Right? Wrong. According to the most recent surveys, just 12 percent of Americans today think the U.S. should be the sole superpower—almost exactly the same proportion as on the eve of the 9/11 attacks. The share of Americans who want to see higher spending on national security is actually down to 26 percent. Paradoxically, Americans today seem less interested in the wider world than they were before the Twin Towers were felled. In the past 10 years, the U.S. has directly or indirectly overthrown at least three governments in the Muslim world. Yet Americans today feel less powerful than they did then. In 2001 just over a quarter felt that the U.S. had “a less important role as a world leader compared to 10 years ago.” The latest figure is 41 percent. Three explanations suggest themselves. First, wielding power abroad proved harder in practice than in neoconservative theory. Second, the financial crisis has dampened American spirits. A third possibility is that 9/11 simply didn’t have that big an impact on American opinion. Yet to conclude that 9/11 didn’t change much is to misunderstand the historical process. The world is a seriously complex place, and a small change to the web of events can have huge consequences. Our difficulty is imagining what those consequences might have been. So let’s play a game like the one my friends at the Muzzy Lane software company are currently designing, which has the working title “New World Disorder.” The game simulates the complex interaction of economics, politics, and international relations, allowing us to replay the past. Let’s start in January 2001 and thwart the 9/11 attacks by having Condi Rice and Paul Wolfowitz heed Richard Clarke’s warnings about Al-Qaeda. The game starts off well. Al-Qaeda is preemptively decapitated, its leaders rounded up in a series of covert operations and left to the tender mercies of their home governments. President Bush gets to focus on tax cuts, his first love. But then, three years later, the murky details of this operation surface on the front page of The New York Times. John Kerry, the Democratic candidate for the presidency, denounces the “criminal conduct” of the Bush administration. Liberal pundits foam at the mouth. Ordinary Americans, unseared by 9/11, are shocked. Osama bin Laden issues a fierce denunciation of the U.S. from his Saudi prison cell. It triggers a wave of popular anger in the Middle East that topples any regime seen as too close to Washington. The government of Qatar - gone. The government of Kuwait - gone. Above all, the government of Saudi Arabia - gone. True to form, the experts are soon all over network TV explaining how this fundamentalist backlash against the U.S.-backed oil monarchies had been years in the making (even if they hadn’t quite gotten around to predicting it beforehand). “Who lost the Middle East?” demands Kerry, pointing an accusing finger at George W. Bush. (Remember, prior to 9/11 Bush favored a reduction of U.S. overseas commitments.) 
The Democrats win the 2004 election, whereupon bin Laden’s new Islamic Republic of Arabia takes hostages at the U.S. Embassy in Riyadh… In other words, if things had happened differently 10 years ago - if there had been no 9/11 and no retaliatory invasions of Afghanistan and Iraq - we might be living through an Islamist Winter rather than an Arab Spring. Replaying the history game without 9/11 suggests that, ironically, the real impact of the attacks was not on Americans but on the homelands of the attackers themselves.
I survey the influence of Grossman and Hart's (1986) seminal paper in the field of International Trade. I discuss the implementation of the theory in open-economy environments and its implications for the international organization of production and the structure of international trade flows. I also review empirical work suggestive of the empirical relevance of the property-rights theory. Along the way, I develop novel theoretical results and also outline some of the key limitations of existing contributions.
The Palestinian leader Mahmoud Abbas’s bid for full U.N. membership was dead on arrival in New York. So why bother even raising the subject? The answer: to drum up international sympathy for the plight of the Palestinians. Yet other defeated peoples have suffered far more than they. Think only of how—and at whose expense—the U.N. itself began. Born in the gently foggy city of San Francisco, the U.N. was conceived in the Ukrainian resort of Yalta. Though nestled amid the green Crimean hills and lapped by the Black Sea’s languid waves, the city was severely battle-scarred in February 1945; Winston Churchill dubbed it “the Riviera of Hades.” Its diabolical master was the Soviet despot Joseph Stalin, who acted as host to Churchill and the ailing American President Franklin Roosevelt. Of the Big Three, as Sergei Plokhy shows in his riveting study Yalta: The Price of Peace, Roosevelt alone truly believed in the dream of a world parliament, and even he knew the U.N. would need to give greater weight to the great powers than its ill-starred predecessor, the League of Nations. Thus it was Roosevelt who proposed a Security Council on which the war’s victors—plus France and China—would be permanently represented and armed with veto powers. Churchill and Stalin were realists. They saw the postwar world in terms of “spheres of influence.” Though perfectly capable of such realism in practice, Roosevelt still yearned for the idealist’s world of peace based on collective security. Yet Churchill was deeply reluctant to accept that Stalin’s postwar sphere of influence would include Poland. His predecessor had acquiesced in the destruction of Czechoslovakia at Munich but had gone to war when Hitler (and Stalin) carved up Poland between them. Was Yalta to be the Poles’ Munich? “We can’t agree,” grumbled Churchill, “that Poland shall be a mere puppet state of Russia, where the people who don’t agree with Stalin are bumped off.” But that was exactly what postwar Poland became. A staggering 19 percent of the prewar population of Poland had been killed as a result of World War II, including a huge proportion of the country’s large Jewish population. Yalta inflicted further punishment. The country not only shrank; it was also shifted westward so that Stalin could keep his gains from the 1939 Nazi-Soviet Pact. And it became a Soviet vassal state for the next half century. After Yalta, chess players devised a variant of their game for three players, using a six-sided board. As at the conference, in the game “Yalta” two players can join forces against the third, but all such alliances are temporary. Briefly, Churchill got Roosevelt on his side over Poland, but the American cared more about getting Stalin to agree to join the U.N.; Poland was a pawn to be sacrificed. Having got what he wanted, Roosevelt left Yalta early. His destination? The Middle East, which he was intent on adding to ... the American sphere of influence. The conflicting commitments he made on that trip—to the Arabs and the Jews—have bedeviled U.S. foreign policy ever since. Asked by Roosevelt if he was a Zionist, Stalin replied elliptically that he “was one in principle, but he recognized the difficulty.” That “difficulty” remains that a Jewish state could be created only at the expense of non-Jews living in Palestine. The Arabs resisted Israel’s creation, but they lost. So it goes. A trip to Yalta provides a salutary reminder that throughout history those who lose at war generally lose land, too, and sometimes sovereignty with it. 
By comparison with what the Poles endured last century, the Palestinians have got off lightly. They will get their own state eventually. But not until all the permanent members of the Security Council are convinced the Palestinians will not abuse the privileges of statehood. Like it or not, that was how the U.N. was meant to work when the Big Three conceived it on Hell’s Riviera.
Lost Decades: The Making of America's Debt Crisis and the Long Recovery
Chinn, Menzie D, and Jeffry Frieden. 2011. Lost Decades: The Making Of America's Debt Crisis And The Long Recovery. W. W. Norton & Company. Publisher's Version Abstract
Two acclaimed political economists explore the origins and long-term effects of the financial crisis in historical and comparative perspective. Welcome to Argentina: by 2008 the United States had become the biggest international borrower in world history, with almost half of its $6.4 trillion federal debt in foreign hands. The proportion of foreign loans to the size of the economy put the United States in league with Mexico, Pakistan, and other third-world debtor nations. The massive inflow of foreign funds financed the booms in housing prices and consumer spending that fueled the economy until the collapse of late 2008. The authors explore the political and economic roots of this crisis as well as its long-term effects. They explain the political strategies behind the Bush administration's policy of funding massive deficits with the foreign borrowing that fed the crisis. They see the continuing impact of our huge debt in a slow recovery ahead. Their clear, insightful, and comprehensive account will long be regarded as the standard on the crisis.
Chandra, Amitabh. 2011. “Massachusetts' Health Care Reform And Emergency Department Utilization.” New England Journal of Medicine. Publisher's Version Abstract
Does an expansion of health insurance increase or decrease use of the emergency department (ED)? Both predictions can be justified logically. On the one hand, research on patient cost sharing predicts that by reducing the out-of-pocket costs of an ED visit, expanded insurance coverage, especially in the face of physician shortages, could result in increased ED utilization. This view has been echoed by elected leaders: Senator Jon Kyl (R-AZ), citing the Massachusetts experience with health care reform, claimed that if anything, universal coverage brought even higher rates of emergency room visits due to increased difficulty in getting appointments for outpatient physician visits. Others have predicted that expanded coverage would actually reduce ED use, since previously uninsured patients would now have access to preventive care. The relative importance of these countervailing forces is a question that clearly weighs on physicians: in a survey of emergency physicians conducted in April 2010, about 71 percent said they expected emergency visits to increase after the passage of the Affordable Care Act (ACA). To explore the importance of these effects, we examined the Massachusetts experience. The state's 2006 health care reform was a model for the ACA and reduced the proportion of Massachusetts adults under the age of 65 who were uninsured by 7.7 percentage points between the fall of 2006 and the fall of 2009. To determine whether any changes in ED utilization in Massachusetts reflected the effect of Massachusetts' reform or were merely representative of broader regional trends in ED utilization, we used New Hampshire and Vermont as control states.
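The comparison described at the end of the abstract, Massachusetts against New Hampshire and Vermont before and after the 2006 reform, is a difference-in-differences design. The snippet below is a minimal sketch of that estimator, not the authors' analysis; the state-period visit rates are invented placeholders:

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical panel of ED visit rates per 1,000 residents by state and period
# (illustrative numbers only; "post" = 1 after the 2006 Massachusetts reform).
df = pd.DataFrame({
    "state":  ["MA", "MA", "NH", "NH", "VT", "VT"],
    "post":   [0, 1, 0, 1, 0, 1],
    "visits": [430.0, 444.0, 410.0, 413.0, 401.0, 405.0],
})
df["treated"] = (df["state"] == "MA").astype(int)

# The coefficient on treated:post is the difference-in-differences estimate,
# valid under the assumption that the states share a common time trend.
fit = smf.ols("visits ~ treated * post", data=df).fit()
print(fit.params["treated:post"])

The interaction term nets out both the fixed level difference between Massachusetts and the control states and the time trend common to all three, which is exactly the logic of using New Hampshire and Vermont as controls.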
Pharmaceutical Reform: A Guide to Improving Performance and Equity
Reich, Michael R, and Marc J Roberts. 2011. Pharmaceutical Reform: A Guide To Improving Performance And Equity. World Bank Publications. Publisher's Version Abstract
This book applies an established analytical framework for health sector reform (Getting Health Reform Right, Oxford, 2004) to the performance problems of the pharmaceutical sector. The book is divided into three sections. The first section presents the basic ideas for analysis. It begins by insisting that reform start with a clear understanding of the performance deficiencies of the current system. Like all priority setting in the public sector, this 'definition of the problem' involves both ethical choices and political processes. Early chapters explain the foundations of these ideas and apply them to the pharmaceutical sector. The relationship of ultimate outcomes (like health status or risk protection) to classic health systems concepts like efficiency, access and quality is also explored. The last chapter in the first part is devoted to 'diagnosis'—explaining how to move from the definition of a problem to an understanding of how the functioning of the system produces the undesirable outcomes in question. The second part of the book devotes one chapter to each of five 'control knobs': finance, payment, organization, regulation and persuasion. These are sets of potential interventions that governments can use to improve pharmaceutical sector performance. Each chapter presents basic concepts and discusses examples of reform options. Throughout we provide 'conditional guidance'—avoiding the approach of a 'one size fits all' model of 'best practices' in these five arenas for reform. Instead we stress the need for local knowledge of political systems, administrative capacities, community values and market conditions in order to design pharmaceutical sector policies appropriate to a country’s particular circumstances. The last part of the book is a set of teaching cases. Each is preceded by questions and is followed by a brief note on the lessons to be learned. The goal is to help readers develop the skills they need to deal effectively with pharmaceutical sector reform problems in their own countries.
The good news is that today’s teenagers are avid readers and prolific writers. The bad news is that what they are reading and writing are text messages. According to a survey carried out last year by Nielsen, Americans between the ages of 13 and 17 send and receive an average of 3,339 texts per month. Teenage girls send and receive more than 4,000. It’s an unmissable trend. Even if you don’t have teenage kids, you’ll see other people’s offspring slouching around, eyes averted, tapping away, oblivious to their surroundings. Take a group of teenagers to see the seven wonders of the world. They’ll be texting all the way. Show a teenager Botticelli’s Adoration of the Magi. You might get a cursory glance before a buzz signals the arrival of the latest SMS. Seconds before the earth is hit by a gigantic asteroid or engulfed by a super tsunami, millions of lithe young fingers will be typing the human race’s last inane words to itself: C u later NOT :( Now, before I am accused of throwing stones in a glass house, let me confess. I probably send about 50 emails a day, and I receive what seem like 200. But there’s a difference. I also read books. It’s a quaint old habit I picked up as a kid, in the days before cellphones began nesting, cuckoolike, in the palms of the young. Half of today’s teenagers don’t read books—except when they’re made to. According to the most recent survey by the National Endowment for the Arts, the proportion of Americans between the ages of 18 and 24 who read a book not required at school or at work is now 50.7 percent, the lowest for any adult age group younger than 75, and down from 59 percent 20 years ago. Back in 2004, when the NEA last looked at younger readers’ habits, it was already the case that fewer than one in three 13-year-olds read for pleasure every day. Especially terrifying to me as a professor is the fact that two thirds of college freshmen read for pleasure for less than an hour per week. A third of seniors don’t read for pleasure at all. Why does this matter? For two reasons. First, we are falling behind more-literate societies. According to the results of the Organization for Economic Cooperation and Development’s most recent Program for International Student Assessment, the gap in reading ability between the 15-year-olds in the Shanghai district of China and those in the United States is now as big as the gap between the U.S. and Serbia or Chile. But the more important reason is that children who don’t read are cut off from the civilization of their ancestors. So take a look at your bookshelves. Do you have all - better make that any - of the books on the Columbia University undergraduate core curriculum? It’s not perfect, but it’s as good a list of the canon of Western civilization as I know of. Let’s take the 11 books on the syllabus for the spring 2012 semester: (1) Virgil’s Aeneid; (2) Ovid’s Metamorphoses; (3) Saint Augustine’s Confessions; (4) Dante’s The Divine Comedy; (5) Montaigne’s Essays; (6) Shakespeare’s King Lear; (7) Cervantes’s Don Quixote; (8) Goethe’s Faust; (9) Austen’s Pride and Prejudice; (10) Dostoevsky’s Crime and Punishment; (11) Woolf’s To the Lighthouse. Step one: Order the ones you haven’t got today. (And get War and Peace, Great Expectations, and Moby-Dick while you’re at it.) Step two: When vacation time comes around, tell the teenagers in your life you are taking them to a party. Or to camp. They won’t resist. Step three: Drive to a remote rural location where there is no cell-phone reception whatsoever. 
Step four: Reveal that this is in fact a reading party and that for the next two weeks reading is all you are proposing to do—apart from eating, sleeping, and talking about the books. Welcome to Book Camp, kids.
Allison, Graham T., Jr. 2011. “The Costs Of America’s Choices”. Publisher's Version Abstract
America's last 10 years might be called “The Decade the Locusts Ate.’’ A nation that started with a credible claim to lead a second American century lost its way after the terrorist attacks of September 11, 2001. Whether the nation will continue on a path of decline, or, alternatively, find our way to recovery and renewal, is uncertain. The nation began the decade with a growing fiscal surplus and ended with a deficit so uncontrolled that its AAA credit rating was downgraded for the first time in its history. Ten years on, Americans’ confidence in our country and the promise of the American Dream is lower than at any point in memory. The indispensable superpower that entered the decade as the most respected nation in the world has seen its standing plummet. Seven out of every 10 Americans say that the United States is worse off today than it was a decade ago. While many of the factors that contributed to these developments were evident before 9/11, this unprecedented reversal pivots on that tragic day - and the choices made in response to it. Those choices had costs: the inescapable costs of the attack, the chosen costs, and the opportunity costs. Inescapable costs of 9/11 must be counted first in the 3,000 innocent lives extinguished that morning. In addition, the collapse of the World Trade Center and part of the Pentagon destroyed $30 billion of property. The Dow plunged, erasing $1.2 trillion in value. Psychologically, the assault punctured the “security bubble’’ in which most Americans imagined they lived securely. Today, 80 percent of Americans expect another major terrorist attack on the homeland in the next decade. Were this the sum of the matter, 9/11 would stand as a day of infamy, but not as an historic turning point. Huge as these direct costs are, they pale in comparison to costs of choices the United States made in response to 9/11: about how to defend America; where to fight Al Qaeda; whether to attack Iraq (or Iran or North Korea) on grounds that they had chemical or biological weapons that could be transferred to Al Qaeda; and whether to pay for these choices by taxing the current generation, or borrowing from China and other lenders, leaving the bills to the next generation. Unquestionably, much of what was done to protect citizens at home and to fight Al Qaeda abroad has made America safer. It is no accident that the United States has not suffered further megaterrorist attacks. The remarkable intelligence and Special Forces capabilities demonstrated in the operation that killed Osama bin Laden suggest how far we have come. But the central storyline of the decade focuses on two choices made by President George W. Bush - his decision to go to war with Iraq and his commitment to cut taxes, especially for wealthy Americans, and thus not to pay for the wars in Iraq and Afghanistan. The cost of his decision to go to war with Iraq is measured in 4,478 American deaths, 40,000 Americans gravely wounded, and a monetary cost of $2 trillion. Bush justified his decision to attack Iraq on the grounds that Saddam Hussein might arm terrorists with weapons of mass destruction, arguing that “19 hijackers armed by Saddam Hussein…could bring a day of horror like none we have ever known.’’ In retrospect, even Bush supporters agree that we went to war on false premises—since we now know that Saddam had no chemical or biological weapons. Suppose, however, that chemical weapons had been found in Iraq. Would that have made Bush’s choice a wise decision? 
What about the many other states that had chemical or biological weapons that could have been transferred to Al Qaeda, for example Libya, or Syria, or Iran? What about the state that unquestionably had an advanced nuclear weapons program, North Korea, which took advantage of the US preoccupation with Iraq to develop an arsenal of nuclear weapons and conduct its first nuclear weapons test? As for cutting taxes for the wealthy, Bush’s decision left the nation with a widening gap between government revenues and its expenditures. Brute facts are hard to ignore: having entered office with a budgetary surplus that the CBO projected would total $3.5 trillion through 2008, Bush left office with an annual deficit of over $1 trillion that the CBO projected would grow to $3 trillion over the next decade. Finally, and most difficult to assess, are opportunity costs, what could be Robert Frost’s “road not taken.’’ In the immediate aftermath of 9/11, the United States was the object of overwhelming international sympathy and solidarity. The leading French newspaper declared: “We are all Americans.’’ Citizens united behind their commander in chief, giving him license to do virtually anything he could plausibly argue would defend us against future attacks. This rare combination of readiness to sacrifice at home plus solidarity abroad sparked imagination. Would Americans have willingly paid a “terrorist tax’’ on gas that could kick what Bush rightly called America’s “oil addiction’’? Could an international campaign against nuclear terrorism or megaterrorism have bent trend lines that leave Americans and the world increasingly vulnerable to future biological or nuclear terrorist attacks? What impact could $2 trillion invested in new technologies have had on American competitiveness? That such a decade leaves Americans increasingly pessimistic about ourselves and our future is not surprising. American history, however, is a story of recurring, impending catastrophes from which there is no apparent escape—followed by miraculous recoveries. At one of our darkest hours in 1776 when defeat at the hands of the British occupying Boston seemed almost certain, the general commanding American forces, George Washington, observed: “Perseverance and spirit have done wonders in all ages.’’
The United States is in the third year of a grand experiment by the Obama administration to revive the economy through enormous borrowing and spending by the government, with the Federal Reserve playing a supporting role by keeping interest rates at record lows. How is the experiment going? By the looks of it, not well. The economy is growing much more slowly than in a typical recovery, housing prices remain depressed and the stock market has been in a slump—all troubling indicators that another recession may be on the way. Most worrisome is the anemic state of the labor market, underscored by the zero growth in the latest jobs report. The poor results should not surprise us given the macroeconomic policies the government has pursued. I agree that the recession warranted fiscal deficits in 2008–2010, but the vast increase of public debt since 2007 and the uncertainty about the country’s long-run fiscal path mean that we no longer have the luxury of combating the weak economy with more deficits. Today’s priority has to be austerity, not stimulus, and it will not work to announce a new $450 billion jobs plan while promising vaguely to pay for it with fiscal restraint over the next 10 years, as Mr. Obama did in his address to Congress on Thursday. Given the low level of government credibility, fiscal discipline has to start now to be taken seriously. But we have to do even more: I propose a consumption tax, an idea that offends many conservatives, and elimination of the corporate income tax, a proposal that outrages many liberals. These difficult steps would be far more effective than the president’s failed experiment. The administration’s $800 billion stimulus program raised government demand for goods and services and was also intended to stimulate consumer demand. These interventions are usually described as Keynesian, but as John Maynard Keynes understood in his 1936 masterwork, “The General Theory of Employment, Interest and Money” (the first economics book I read), the main driver of business cycles is investment. As is typical, the main decline in G.D.P. during the recession showed up in the form of reduced investment by businesses and households. What drives investment? Stable expectations of a sound economic environment, including the long-run path of tax rates, regulations and so on. And employment is akin to investment in that hiring decisions take into account the long-run economic climate. The lesson is that effective incentives for investment and employment require permanence and transparency. Measures that are transient or uncertain will be ineffective. And yet these are precisely the kinds of policies the Obama administration has pursued: temporarily cutting the payroll tax rate, maintaining the marginal income-tax rates from the George W. Bush era while vowing to raise them in the future, holding off on clean-air regulations while promising to implement them later and enacting an ambitious overhaul of Wall Street regulations while leaving lots of rules undefined and ambiguous. Is there a better way? I believe that a long-term fiscal plan for the country requires six big steps. 
Three of them were identified by the Bowles-Simpson deficit reduction commission: reforming Social Security and Medicare by increasing ages of eligibility and shifting to an appropriate formula for indexing benefits to inflation; phasing out “tax expenditures” like the deductions for mortgage interest, state and local taxes and employer-provided health care; and lowering the marginal income-tax rates for individuals. I would add three more: reversing the vast and unwise increase in spending that occurred under Presidents Bush and Obama; introducing a tax on consumer spending, like the value-added tax (or VAT) common in other rich countries; and abolishing federal corporate taxes and estate taxes. All three measures would be enormously difficult—many say impossible—but crises are opportune times for these important, basic reforms. A broad-based expenditure tax, like a VAT, amounts to a tax on consumption. If the base rate were 10 percent, the revenue would be roughly 5 percent of G.D.P. One benefit from a VAT is that it is more efficient than an income tax—and in particular the current American income tax system. I received vigorous criticism from conservatives after advocating a VAT in an essay in The Wall Street Journal last month. The main objection—reminiscent of the complaints about income-tax withholding, which was introduced in the United States in 1943—is that a VAT would be a money machine, allowing the government to readily grow larger. For example, the availability of easy VAT revenue in Western Europe, where rates reach as high as 25 percent, has supported the vast increase in the welfare state there since World War II. I share these concerns and, therefore, favor a VAT only if it is part of a package that includes other sensible reforms. But given the likely path of government spending on health care and Social Security, I see no reasonable alternative. Abolishing the corporate income tax is similarly controversial. Any tax on capital income distorts decisions on saving and investment. Moreover, the inefficiency is magnified here because of double taxation: the income is taxed when corporations make profits and again when owners receive dividends or capital gains. If we want to tax capital income, a preferred method treats corporate profits as accruing to owners when profits arise and then taxes this income only once—whether it is paid out as dividends or retained by companies. Liberals love the idea of a levy on evil corporations, but taxes on corporate profits in fact make up only a small part of federal revenue, compared to the two main sources: the individual income tax and payroll taxes for Social Security and Medicare. In 2009-10, taxes on corporate profits averaged 1.4 percent of G.D.P. and 8.6 percent of total federal receipts. Even from 2000 to 2008, when corporations were more profitable, these taxes averaged only 1.9 percent of G.D.P. and 10.3 percent of federal receipts. If we could get past the political fallout, we could get more revenue and improve economic efficiency by abolishing the corporate income tax and relying instead on a VAT. I had a dream that Mr. Obama and Congress enacted this fiscal reform package—triggering a surge in the stock market and a boom in investment and G.D.P.—and that he was re-elected. This dream could become reality if our leader were Ronald Reagan or Bill Clinton—the two presidential heroes of the American economy since World War II—but Mr. Obama is another story. 
To become market-friendly, he would have to abandon most of his core economic and political principles. More likely, his administration will continue with more of the same: an expansion of payroll-tax cuts, short-term tax credits, promises to raise future taxes on the rich, and added spending on infrastructure, job training and unemployment benefits. The economy will probably continue in its sluggish state, possibly slipping into another recession. In that case, our best hope is for a Republican president far more committed to the principles of free markets and limited government than Mr. Bush ever was.
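The revenue figure in the piece above, that a 10 percent VAT would raise roughly 5 percent of GDP, follows from one line of arithmetic. As an illustrative sketch, with the taxable consumption base assumed (as the quoted numbers imply) to be about half of GDP after exemptions:

\[ R = \tau B \approx 0.10 \times 0.5\,Y = 0.05\,Y, \]

where \tau is the VAT rate, B the taxable base, and Y is GDP.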
Tingley, Dustin. 2011. “The Dark Side Of The Future: An Experimental Test Of Commitment Problems In Bargaining.” International Studies Quarterly. Publisher's Version Abstract
While most existing theoretical and experimental literatures focus on how a high probability of repeated play can lead to more socially efficient outcomes (for instance, using the result that cooperation is possible in a repeated prisoner’s dilemma), this paper focuses on the detrimental effects of repeated play—the ‘‘dark side of the future.’’ I study a resource division model with repeated interaction and changes in bargaining strength. The model predicts a negative relationship between the likelihood of repeated interaction and social efficiency. This is because the longer shadow of the future exacerbates commitment problems created by changes in bargaining strength. I test and find support for the model using incentivized laboratory experiments. Increases in the likelihood of repeated play lead to more socially inefficient outcomes in the laboratory.
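For contrast with the cooperative benchmark the abstract invokes, the textbook condition for sustaining cooperation in an infinitely repeated prisoner's dilemma is worth recalling (a standard illustration, not the paper's model): with stage-game payoffs T > R > P > S and continuation probability \delta, grim-trigger cooperation is an equilibrium when

\[ \frac{R}{1-\delta} \;\ge\; T + \frac{\delta P}{1-\delta} \quad\Longleftrightarrow\quad \delta \;\ge\; \frac{T-R}{T-P}. \]

The paper's result runs the comparative static the other way: when bargaining strength shifts over time, a larger \delta lengthens the horizon over which today's strong party fears tomorrow's reversal, so a longer shadow of the future worsens, rather than solves, the commitment problem.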
Reich, Michael R, Keizo Takemi, and Naoki Ikegami. 2011. “Fifty Years Of Pursuing A Healthy Society In Japan.” The Lancet. Publisher's Version Abstract
In this Series in The Lancet, we review the past 50 years of Japan’s universal health coverage, identify the major challenges of today, and propose paths for the future, within the context of long-term population aging and the devastating crises triggered by the March 11 earthquake. Japan is recognised internationally for its outstanding achievements during the second half of the 20th century, in both improving the population’s health status and developing a strong health system. At the end of World War 2, life expectancy at birth in Japan was 50 years for men and 54 years for women; by the late 1970s, Japan overtook Sweden as the world’s leader for longest life expectancy at birth. Japanese women have remained in the number one slot for 25 years, reaching a life expectancy of 86.4 years in 2009 (while Japanese men slipped to fifth longest living that year, at 79.6 years). In 2011, Japan celebrates 50 years of kaihoken: health insurance for all. Universal health insurance was achieved in 1961, assuring access to a wide array of health services for the whole population. Since then, benefits have become more egalitarian while health expenditures have remained comparatively low: 8.5% of the gross domestic product, ranking 20th among countries in the Organisation for Economic Co-operation and Development in 2008. This achievement is all the more remarkable because the percentage of the population aged 65 years or older has increased nearly four-fold (from 6% to 23%) over the past 50 years.
Hanna, Rema, and Michael Greenstone. 2011. “Environmental Regulations, Air And Water Pollution, And Infant Mortality In India.” Harvard Kennedy School. Publisher's Version Abstract
Using the most comprehensive data file ever compiled on air pollution, water pollution, environmental regulations, and infant mortality from a developing country, the paper examines the effectiveness of India’s environmental regulations. The air pollution regulations were effective at reducing ambient concentrations of particulate matter, sulfur dioxide, and nitrogen dioxide. The most successful air pollution regulation is associated with a modest and statistically insignificant decline in infant mortality. However, the water pollution regulations had no observable effect. Overall, these results contradict the conventional wisdom that environmental quality is a deterministic function of income and underscore the role of institutions and politics.
HKS Faculty Research Working Paper Series RWP11-034, John F. Kennedy School of Government, Harvard University.
Harvard DASH repository: http://nrs.harvard.edu/urn-3:HUL.InstRepos:5131505
Download PDF
President Obama should take a page from Ronald Reagan’s playbook in winning the final inning of the Cold War. Obama can challenge President Mahmoud Ahmadinejad to put his enriched uranium where his mouth is—by stopping all Iranian enrichment of uranium beyond the 5 percent level. A quarter-century ago, Soviet leader Mikhail Gorbachev was touting a new “glasnost”: openness. President Reagan went to Berlin and called on Gorbachev to “tear down this wall.” Two years later, the Berlin Wall came tumbling down and, shortly thereafter, the Soviet “evil empire” fell as well. While in New York for the opening of the UN General Assembly in September, Ahmadinejad on three occasions made an unambiguous offer: He said Iran would stop all enrichment of uranium beyond the levels used in civilian power plants—if his country is able to buy specialized fuel enriched at 20 percent, for use in its research reactor that produces medical isotopes to treat cancer patients. Obama should seize this proposal and send negotiators straightaway to hammer out specifics. Iran has been enriching uranium since 2006, and it has accumulated a stockpile of uranium enriched at up to 5 percent, sufficient after further enrichment for several nuclear bombs. Iran is also producing 20 percent material every day, and it announced in June that it planned to triple its output. Halting Iran’s current production of 20 percent material and its projected growth would be significant. A stockpile of uranium enriched at 20 percent shrinks the potential timeline for breaking out to bomb material from months to weeks. In effect, having uranium enriched at 20 percent takes Iran 90 yards along the football field to bomb-grade material. Pushing it back below 5 percent would effectively move Tehran back to the 30-yard line - much farther from the goal of bomb-grade material. Even more important, extracting from Iran a commitment to a bright red line capping enrichment at 5 percent would stop the Islamic Republic from advancing on its current path to 60 percent enrichment and then 90 percent. Stopping Iran from enriching beyond 5 percent is not, in itself, a “solution” to its nuclear threat. Nor was Reagan’s proposal to Gorbachev. The question for Reagan was whether we would be better off with the Berlin Wall or without it. Iran today is the most sanctioned member of the United Nations; it has been the target of five Security Council resolutions since 2006 demanding that it suspend all uranium enrichment. The United States and Europe have organized their own, tougher economic sanctions forbidding businesses from trading with Iranian companies and limiting Iran’s access to financial markets. But Iran does not require the permission of the United Nations or, for that matter, the United States to advance its nuclear program within its borders. Nor are current or future sanctions likely to dissuade Iran from progressing steadily toward a nuclear weapon. So far, Obama has essentially continued the Bush administration’s policy toward Iran with one addition: an authentic offer from the start of his administration to begin negotiations. Negotiations, however, have not been feasible because of sharp divisions within Iran. Those rifts were exacerbated after the June 2009 elections, in which Iran’s ruling powers (Supreme Leader Ayatollah Ali Khamenei, Ahmadinejad and the Revolutionary Guard) rigged the presidential vote and then moved to suppress the opposition Green Movement protests. In the last two years, they have tightened control over their society. 
Enter Ahmadinejad’s proposal to stop all enrichment at the 5 percent level—without preconditions. Although differences between Ahmadinejad and the supreme leader have become evident, the United States should pay attention to the president’s offer. Arguments against testing the offer are easy to make. An embattled Ahmadinejad may not be able to deliver. Iran will use negotiations to seek to relax or escape current sanctions. If a deal were reached, it would be more difficult to win international support for the next round of sanctions. An agreement that stops only the 20 percent enrichment could imply a degree of acceptance of Iran’s ongoing enrichment up to 5 percent. Recognizing all of these negatives, however, the policy question remains: Would the United States be better off with Iran enriching its uranium to 20 percent or without it? President Obama should act now to test Ahmadinejad’s word.