Informal payments are a frequently overlooked source of local public finance in developing countries. We use microdata from ten countries to establish stylized facts on the magnitude, form, and distributional implications of this “informal taxation.” Informal taxation is widespread, particularly in rural areas, with substantial in-kind labor payments. The wealthy pay more, but pay less in percentage terms, and informal taxes are more regressive than formal taxes. Failing to include informal taxation underestimates household tax burdens and revenue decentralization in developing countries. We discuss various explanations for and implications of these observed stylized facts.
They were abolitionists, speculators, slave owners, government officials, and occasional politicians. They were observers of the anxieties and dramas of empire. And they were from one family. The Inner Life of Empires tells the intimate history of the Johnstones—four sisters and seven brothers who lived in Scotland and around the globe in the fast-changing eighteenth century. Piecing together their voyages, marriages, debts, and lawsuits, and examining their ideas, sentiments, and values, renowned historian Emma Rothschild illuminates a tumultuous period that created the modern economy, the British Empire, and the philosophical Enlightenment.
One of the sisters joined a rebel army, was imprisoned in Edinburgh Castle, and escaped in disguise in 1746. Her younger brother was a close friend of Adam Smith and David Hume. Another brother was fluent in Persian and Bengali, and married to a celebrated poet. He was the owner of a slave known only as "Bell or Belinda," who journeyed from Calcutta to Virginia, was accused in Scotland of infanticide, and was the last person judged to be a slave by a court in the British Isles. In Grenada, India, Jamaica, and Florida, the Johnstones embodied the connections between European, American, and Asian empires. Their family history offers insights into a time when distinctions between the public and private, home and overseas, and slavery and servitude were in constant flux.

Based on multiple archives, documents, and letters, The Inner Life of Empires looks at one family's complex story to describe the origins of the modern political, economic, and intellectual world.
During the past decade, a variety of intermediaries have emerged to facilitate the trading of patents: brokers, non-practicing entities (NPEs), defensive aggregators, online platforms, auctions and unique entities such as Intellectual Ventures. We discuss the fundamental causes for the lack of liquidity in the IP market and analyze the merits and shortcomings of the various business models used by patent intermediaries. A key conclusion is that platform-type intermediaries (who facilitate transactions without taking possession of assets) have struggled, whereas merchant-type intermediaries (who acquire patents and seek to monetize them directly) have reached significant scale and influence in the technology industries that fall under the incidence of their assets. We also discuss some efficiency issues raised by the growing prominence of patent merchants.
Most social scientists would like to believe that their profession contributes to solving pressing global problems. There is today no shortage of global problems that social scientists should study in depth: ethnic and religious conflict within and between states, the challenge of economic development, terrorism, the management of a fragile world economy, climate change and other forms of environmental degradation, the origins and impact of great power rivalries, the spread of weapons of mass destruction, just to mention a few. In this complex and contentious world, one might think that academic expertise about global affairs would be a highly valued commodity. One might also expect scholars of international relations to play a prominent role in public debates about foreign policy, along with government officials, business interests, representatives of special interest groups, and other concerned citizens. Yet the precise role that academic scholars of international affairs should play is not easy to specify. Indeed, there appear to be two conflicting ways of thinking about this matter. On the one hand, there is a widespread sense that academic research on global affairs is of declining practical value, either as a guide to policymakers or as part of broader public discourse about world affairs. On the other hand, closer engagement with the policy world and more explicit efforts at public outreach are not without their own pitfalls. Scholars who enter government service or participate in policy debates may believe that they are "speaking truth to power," but they run the risk of being corrupted or co-opted in subtle and not-so-subtle ways by the same individuals and institutions that they initially hoped to sway. The remainder of this essay explores these themes in greater detail.
HKS Faculty Research Working Paper Series RWP11-030, John F. Kennedy School of Government, Harvard University.
Like corruption, crime, and asbestos, “inflation” is a word that many
Americans imagine in all-red capital letters, flashing across TV screens
amid warnings of crisis. For anyone who remembers the gloomy, scary
1970s, when the inflation rate in the United States reached double
digits, the word is shorthand for an economy that has spiraled out of
control, the dollar losing value and prices climbing feverishly.
“Inflation is as violent as a mugger, as frightening as an armed robber,
and as deadly as a hit man,” said Ronald Reagan in 1978, as nervous
citizens imagined the day when they’d have to push a wheelbarrow full of
cash to the grocery store in order to buy a loaf of bread.

That
particular nightmare never came to pass, thanks to drastic measures
taken by the Federal Reserve. For the better part of the past 30 years,
the dollar has stayed stable, reassuring American families and the
nation’s trading partners, with the central bank standing guard over the
economy and doing everything necessary to keep inflation low.

You
might say that Kenneth Rogoff has been one of the guards. As a research
economist at the Federal Reserve during the first half of the 1980s, he
helped ensure that the word “inflation” would never again flash across
American TV screens. His reputation as a conservative-minded inflation
hawk followed him from the Fed to the International Monetary Fund to his
current position in the economics department at Harvard.

But then
came the financial crisis of 2008, and the ensuing slump. And as the
economy has continued to stagnate, Rogoff, 58, has become the
flag-bearer for an unlikely position: that as we struggle to help the
economy find its way out of the darkness, inflation could be the answer.
It’s time, Rogoff says, to put Reagan’s “hit man” to work for the good
guys.

Over the past several years, Rogoff has emerged as one of
the world’s leading experts on the history of financial crises and how
they work, a unique perch that has given him a long view on what is
happening to our economy and what lies ahead. In the bestselling 2009
book “This Time Is Different,’’ he and Carmen Reinhart, currently a
senior fellow at the Peterson Institute for International Economics,
laid out a detailed analysis of financial crises that have taken place
around the world going back 800 years, and they put forth an alarming
idea about our current predicament. What we’re going through, they
argued—what we’ve been going through ever since the subprime mortgage
crisis—has not been just a typical recession, as our leaders have been
treating it, but something much worse, something that demands
altogether different tools to stop it.

One
of these tools, Rogoff believes, is a temporary burst of inflation. And
for the past several weeks, as the stock market has convulsed and
debate raged over the Fed’s next move, he has been making his case
publicly, through syndicated opinion columns, high-profile TV
appearances, and numerous interviews. It’s an argument that Rogoff
himself admits is “radical,” and one he says he’d rather not be making.
But as he sees it, what’s holding the country back from recovery is not
just a lack of consumer confidence or suppressed demand, as in a normal
recession, but an immense overhang of debt: thanks to the collapse of
the real-estate bubble, millions of American families owe so much to
banks that they’re focusing all their energy on paying down their debts
instead of spending their money on new investments. There will be no
recovery until the painful process of working through that debt is
behind us, Rogoff argues, and an increase in the annual inflation rate,
which has floated around 2 percent since the early 1990s, would make it
easier for debtors to pay down what they owe.

“There’s
no penicillin for this,” he said in an interview. “There’s no quick
getting better. What you’re really talking about is taking the edge off
the downturn and coming back to normal growth somewhat faster.”

Rogoff’s
call to raise inflation has come under attack from several different
directions. Some economists think it wouldn’t do any good—that trying
to raise inflation wouldn’t create demand or spur growth the way Rogoff
thinks—while others believe that, given that prices actually seem to
be in danger of falling at the moment, the Fed couldn’t make it happen
if it wanted to. But perhaps the biggest problem for Rogoff is that, for
most policymakers, elected and otherwise, the idea of courting
inflation on purpose sounds downright crazy—not to mention politically
disastrous.

“Going around the country saying, ‘We need more
inflation’ is not going to be a big seller,” said Michael Mussa, a
senior fellow at the Peterson Institute and a former adviser to Reagan.
“Inflation means that the costs of everybody’s goods and services are
going up … And I believe it’s a substantial symbol of mismanagement by
the government and the central bank.”

Rogoff, however, remains
convinced that as the situation grows more desperate, our leaders will
feel pressure to start considering their options with more open minds.
“As more and more people realize that we’re not quickly going back to
normal,” he said, “they become more flexible.”

Though Rogoff speaks with unflinching steadiness, hearing him explain how
badly our leaders misdiagnosed the economy after the crash, one imagines
a doctor banging his fists against the door of a surgery ward, trying
to warn his colleagues that he has checked their patient’s chart and
realized they’re about to make a huge mistake.

The mistake we all
made, as Rogoff sees it, was thinking this was going to be nothing more
than a regular recession, the same kind of thing that has happened in
the United States once or twice every decade for the past 150 years.
These cyclical recessions come and go, and we have a pretty reliable
playbook for dealing with them: usually, an increase in government
spending and lower interest rates to encourage money to flow. Recessions
tend to end after about a year, at which point unemployment starts to
fall and normal growth resumes.

Far
less frequently, something more serious grips an economy: a financial
crisis that breaks the pattern, and from which it is much more difficult
to recover. Rogoff and Reinhart’s book suggests that such contractions
are characterized above all by severe, widespread debt, which leads to
long periods of economic stagnation and uncertainty. Rogoff puts our
current situation in that category, along with the Great Depression, and
he fears that if we do not act quickly and creatively to dig ourselves
out of it, we risk settling into a long-term slowdown along the lines of
what Japan has been going through since the 1990s. Mistaking this
crisis for a typical recession, he says, is like mistaking pneumonia for
a stubborn cold. “They’re very, very different animals.”

The
animal we’re wrestling with today, of course, was born of the vastly
overheated real estate market that collapsed in 2007, temporarily
paralyzing the global financial system and taking some powerful banks
down with it. Today, its legacy is a towering mountain of consumer debt,
government debt, and millions of underwater mortgages that are gumming
up the economy and preventing it from coming back to life.

“It’s
very unlikely that all these debts are going to get repaid in full,”
Rogoff said. Banks have loans on their books that people simply don’t
have—won’t have—the money to pay off, and expecting it to happen
means we’ll just stay frozen in place, waiting. What needs to happen,
Rogoff says, is “some transfer from creditors to debtors.” The ideal way
for that to happen, he says, would be through loan renegotiation,
whereby banks would forgive some homebuyers and strike repayment deals
with others. But that sort of piecemeal renegotiation has proved very
difficult to carry out.

A
more viable way to start fixing the nation’s balance sheets, Rogoff
argues, is by inducing a temporary bout of inflation. If the Federal
Reserve raises its target inflation rate by several percentage points—up from around 2 percent, where it’s been for the past decade, to
somewhere in the neighborhood of 4 to 6 percent—and injects new money
into the economy until it gets there, then debtors will get some relief
and the wheels of the economy will once again start to turn.

Rogoff
first laid out the argument for embracing inflation in one of his
columns in December of 2008—a move that came as such a surprise to
people who knew his reputation that he got letters from central bankers
who were sure they’d misunderstood him. Rogoff had worked at the Fed
under none other than Paul Volcker, whose mandate as Fed chairman was to
drive inflation down at any cost. Under Volcker’s watch, inflation fell
from 13.3 percent in December of 1979 to just 3.8 percent four years
later. And though Rogoff at the time was just starting out as an
economist—indeed, he was still transitioning from his first career as a
professional chess player—he soon became an intellectual force in the
movement to make central banks the economy’s first defense against
inflation. In 1985, he published what would become one of his most
widely cited academic papers in the Quarterly Journal of Economics,
arguing that healthy economies depended on central banks being reliably
committed to holding inflation down in all but the most extreme
circumstances.

Rogoff says he hasn’t changed his mind on how
central banks should behave, and still thinks our fears of runaway
inflation are well-founded. He just thinks that right now, it’s a risk
worth taking. “There’s certainly some benefit in a society having a
very, very strong conviction about keeping low inflation,” Rogoff said.
“But I think right now it’s not helpful. You can have a very strong
conviction that you don’t want to take medicines … And I respect that,
but there are times when there’s really no choice.”

Though
Rogoff’s idea about raising inflation has so far not gained much
purchase in the economics profession—Mussa, for instance, called it “a
hare-brained crackpot scheme”— he is not alone in his thinking.
Versions of the same call have been taken up by several prominent
economists across the political spectrum, including Olivier Blanchard,
the chief economist at the International Monetary Fund; Joshua Aizenman,
co-editor of the Journal of International Money and Finance; Harvard’s
Greg Mankiw, a former adviser to George W. Bush; and Paul Krugman, the
Nobel Prize-winning New York Times columnist.

Those who disagree with Rogoff cite several key objections. One is that
inflation can be hard to stop once it starts: if the Fed turned on the
spigot, there’s no guarantee they’d be able to turn it off before
inflation got out of hand. Another objection is that if the Fed does
raise its inflation target, pumping more money into the system and
allowing the dollar to lose some of its value, lenders here and abroad
will lose faith in the currency and respond by raising interest rates,
which would ultimately make it harder for Americans to borrow money. A
third objection is practical: that even if the Fed tried to trigger
inflation, it simply might not be able to. The problem with the economy
right now, some critics say, is a lack of demand for workers and
products, and blowing air into the money supply would not change that.

“This
idea that there’s some separate policy instrument called ‘creating
inflation,’ I think, is a little problematic,” said Lawrence Summers,
the former secretary of the Treasury and Harvard president who also
served as the director of President Obama’s National Economic Council.
Increasing demand should be the primary goal, with inflation a possible
byproduct, Summers said. “I don’t think the idea that you could simply
get more inflation by saying you want more inflation is a promising
one.”

Rogoff is not swayed by these arguments. He emphasizes that
the level of inflation he is calling for is very modest—and that
there’s really no reason to think that the Fed would be incapable of
inducing it or reining it in at will. As for damaging the central bank’s
credibility, Rogoff reiterates the extraordinary nature of the present
circumstances. “This is a very exceptional situation—a once in
Halley’s Comet kind of phenomenon,” he said. As he wrote in his column
earlier this month, “These are times when central banks need to spend
some of the credibility that they accumulate in normal times.”

Trying
to persuade central bankers to go for that plan involves a different
kind of problem: a political one. Inflation devalues the dollar and
makes things more expensive, making it an easy political target. Earlier
this month, as Wall Street and Washington waited to hear how the Fed
would approach monetary policy going forward, Texas Governor Rick Perry
more or less threatened Fed chairman Ben Bernanke with violence if he
“prints more money” before the next election. What he was talking about
wasn’t even inflation, but a policy called “quantitative easing,” in
which the Federal Reserve injects new money into the economy by buying
billions of dollars worth of Treasury bonds from banks. The Fed has
already tried this twice since 2008, and each time it has been
controversial. While Rogoff’s plan to raise the inflation rate target is
conceptually different from quantitative easing, it would involve the
same mechanism, and would push the same political buttons in an even
more extreme way.

Underlying
that opposition is more than just patriotism: it’s also a moral
objection. Transferring the debt burden from borrowers to creditors,
after all, effectively bails out borrowers by punishing the banks that
lent them money, as well as devaluing the savings of their more prudent
neighbors. That kind of rescue plan strikes many as fundamentally
unfair.

Rogoff understands this objection, and doesn’t dispute
that what he’s proposing is on some level unfair. But ultimately, he
argues, this contraction is dragging us all down together, and even
those lenders and savers will be better off if America’s debt overhang
is taken care of swiftly. Once that happens, and the economy starts to
recover properly, we’ll be able to focus on designing better policies
that will make us less vulnerable to financial crisis in the future. For
now, a little inflation might just be the cost of getting us to where
that might be possible.“One way or another,” said Rogoff, “we’re going to be doing things we would not dream we would ever do before this is over.”
You probably missed the recent special issue of China Newsweek, so let me bring you up to date. Who do you think was on the cover—named the “most influential foreign figure” of the year in China? Barack Obama? No. Bill Gates? No. Warren Buffett? No. O.K., I'll give you a hint: He's a rock star in Asia, and people in China, Japan and South Korea scalp tickets to hear him. Give up?

It was Michael J. Sandel, the Harvard University political philosopher.

This news will not come as a surprise to Harvard students, some 15,000 of whom have taken Sandel's legendary “Justice” class. What makes the class so compelling is the way Sandel uses real-life examples to illustrate the philosophies of the likes of Aristotle, Immanuel Kant and John Stuart Mill.

Sandel, 58, will start by tossing out a question, like, “Is it fair that David Letterman makes 700 times more than a schoolteacher?” or “Are we morally responsible for righting the wrongs of our grandparents' generation?” Students offer competing answers, challenge one another across the hall, debate with the philosophers—and learn the art of reasoned moral argument along the way.

Besides being educational, the classes make great theater—so much so that Harvard and WGBH (Boston's PBS station) filmed them and created a public television series that aired across the country in 2009. The series, now freely available online (at http://www.JusticeHarvard.org), has begun to stir interest in surprising new places.

Last year, Japan's NHK TV broadcast a translated version of the PBS series, which sparked a philosophy craze in Japan and prompted the University of Tokyo to create a course based on Sandel's. In China, volunteer translators subtitled the lectures and uploaded them to Chinese Web sites, where they have attracted millions of viewers. Sandel's recent book—Justice: What’s the Right Thing to Do?—has sold more than a million copies in East Asia alone.
This is a book about moral philosophy, folks!

Here's The Japan Times describing Sandel's 2010 visit: “Few philosophers are compared to rock stars or TV celebrities, but that's the kind of popularity Michael Sandel enjoys in Japan.” At a recent lecture in Tokyo, “long lines had formed outside almost an hour before the start of the evening event. Tickets, which were free and assigned by lottery in advance, were in such demand that one was reportedly offered for sale on the Web for $500.” Sandel began the lecture by asking: “Is ticket scalping fair or unfair?”

But what is most intriguing is the reception that Sandel (a close friend) received in China. He just completed a book tour and lectures at Tsinghua and Fudan universities, where students began staking out seats hours in advance. This semester, Tsinghua started a course called “Critical Thinking and Moral Reasoning,” modeled on Sandel's. His class visit was covered on the national evening news.

Sandel's popularity in Asia reflects the intersection of three trends. One is the growth of online education, where students anywhere now can gain access to the best professors from everywhere. Another is the craving in Asia for a more creative, discussion-based style of teaching in order to produce more creative, innovative students. And the last is the hunger of young people to engage in moral reasoning and debates, rather than having their education confined to the dry technical aspects of economics, business or engineering.

At Tsinghua and Fudan, Sandel challenged students with a series of cases about justice and markets: Is it fair to raise the price of snow shovels after a snowstorm? What about auctioning university admissions to the highest bidder?
“Free-market sentiment ran surprisingly high,” Sandel said, “but some students argued that unfettered markets create inequality and social discord.”

Sandel's way of teaching about justice “is both refreshing and relevant in the context of China,” Dean Qian Yingyi of Tsinghua’s School of Economics and Management explained in an e-mail. Refreshing because of the style, and relevant because “the philosophic thinking among the Chinese is mostly instrumentalist and materialistic,” partly because of “the contemporary obsession on economic development in China.”

Tsinghua's decision to offer a version of Sandel's course, added Qian, “is part of a great experiment of undergraduate education reform currently under way at our school. … This is not just one class; it is the beginning of an era.”

Sandel is touching something deep in both Boston and Beijing. “Students everywhere are hungry for discussion of the big ethical questions we confront in our everyday lives,” Sandel argues. “In recent years, seemingly technical economic questions have crowded out questions of justice and the common good. I think there is a growing sense, in many societies, that G.D.P. and market values do not by themselves produce happiness, or a good society. My dream is to create a video-linked global classroom, connecting students across cultures and national boundaries—to think through these hard moral questions together, to see what we can learn from one another.”
Keynesian economics—the go-to theory for those who like government at the controls of the economy—is in the forefront of the ongoing debate on fiscal-stimulus packages. For example, in true Keynesian spirit, Agriculture Secretary Tom Vilsack said recently that food stamps were an "economic stimulus" and that "every dollar of benefits generates $1.84 in the economy in terms of economic activity." Many observers may see how this idea—that one can magically get back more than one puts in—conflicts with what I will call "regular economics." What few know is that there is no meaningful theoretical or empirical support for the Keynesian position.

The overall prediction from regular economics is that an expansion of transfers, such as food stamps, decreases employment and, hence, gross domestic product (GDP). In regular economics, the central ideas involve incentives as the drivers of economic activity. Additional transfers to people with earnings below designated levels motivate less work effort by reducing the reward from working.
In addition, the financing of a transfer program requires more taxes—today or in the future in the case of deficit financing. These added levies likely further reduce work effort—in this instance by taxpayers expected to finance the transfer—and also lower investment because the return after taxes is diminished.
This result does not mean that food stamps and other transfers are necessarily bad ideas in the world of regular economics. But there is an acknowledged trade-off: Greater provision of social insurance and redistribution of income reduces the overall GDP pie.
Yet Keynesian economics argues that incentives and other forces in regular economics are overwhelmed, at least in recessions, by effects involving "aggregate demand." Recipients of food stamps use their transfers to consume more. Compared to this urge, the negative effects on consumption and investment by taxpayers are viewed as weaker in magnitude, particularly when the transfers are deficit-financed.
Thus, the aggregate demand for goods rises, and businesses respond by selling more goods and then by raising production and employment. The additional wage and profit income leads to further expansions of demand and, hence, to more production and employment. As per Mr. Vilsack, the administration believes that the cumulative effect is a multiplier around two.
If valid, this result would be truly miraculous. The recipients of food stamps get, say, $1 billion but they are not the only ones who benefit. Another $1 billion appears that can make the rest of society better off. Unlike the trade-off in regular economics, that extra $1 billion is the ultimate free lunch.
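The multiplier logic Barro is attacking can be written as a geometric series: each dollar of new spending is partly re-spent at the marginal propensity to consume (MPC), so total activity sums to 1/(1 - MPC). The sketch below is the standard textbook Keynesian-cross illustration, not the administration's econometric models; the MPC value is simply whatever makes the claimed multiplier come out.

```python
# Textbook Keynesian spending multiplier as a geometric series:
# one dollar of new spending is re-spent at rate c (the MPC), so
# total activity = 1 + c + c**2 + ... = 1/(1 - c).

def spending_multiplier(mpc: float, rounds: int = 1000) -> float:
    """Sum successive rounds of re-spending of one initial dollar."""
    total, spend = 0.0, 1.0
    for _ in range(rounds):
        total += spend
        spend *= mpc
    return total

def implied_mpc(multiplier: float) -> float:
    """Invert 1/(1 - c) to find the MPC a claimed multiplier implies."""
    return 1.0 - 1.0 / multiplier

# Vilsack's $1.84-per-dollar figure implies an MPC near 0.46.
print(round(implied_mpc(1.84), 3))
```

In other words, the $1.84 claim is not a free-standing fact but the output of assuming that roughly 46 cents of every dollar is re-spent at each round with no offsetting reduction elsewhere, which is exactly the assumption Barro disputes.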
How can it be right? Where was the market failure that allowed the government to improve things just by borrowing money and giving it to people? Keynes, in his "General Theory" (1936), was not so good at explaining why this worked, and subsequent generations of Keynesian economists (including my own youthful efforts) have not been more successful.
Theorizing aside, Keynesian policy conclusions, such as the wisdom of additional stimulus geared to money transfers, should come down to empirical evidence. And there is zero evidence that deficit-financed transfers raise GDP and employment—not to mention evidence for a multiplier of two.
Gathering evidence is challenging. In the data, transfers are higher than normal during recessions but mainly because of the automatic increases in welfare programs, such as food stamps and unemployment benefits. To figure out the economic effects of transfers one needs "experiments" in which the government changes transfers in an unusual way—while other factors stay the same—but these events are rare.
Ironically, the administration created one informative data point by dramatically raising unemployment insurance eligibility to 99 weeks in 2009—a much bigger expansion than in previous recessions. Interestingly, the fraction of the unemployed who are long term (more than 26 weeks) has jumped since 2009—to over 44% today, whereas the previous peak had been only 26% during the 1982-83 recession. This pattern suggests that the dramatically longer unemployment-insurance eligibility period adversely affected the labor market. All we need now to get reliable estimates are a hundred more of these experiments.
The administration found the evidence it wanted—multipliers around two—by consulting some large-scale macro-econometric models, which substitute assumptions for identification. These models were undoubtedly the source of Mr. Vilsack's claim that a dollar more of food stamps led to an extra $1.84 of GDP. This multiplier is nonsense, but one has to admire the precision in the number.
There are two ways to view Keynesian stimulus through transfer programs. It's either a divine miracle—where one gets back more than one puts in—or else it's the macroeconomic equivalent of bloodletting. Obviously, I lean toward the latter position, but I am still hoping for more empirical evidence.
This article examines the compelling enigma of how
the introduction of a new international law, the North American Agreement on
Labor Cooperation (NAALC), helped stimulate labor cooperation and collaboration
in the 1990s. It offers a theory of legal transnationalism—defined as processes
by which international laws and legal mechanisms facilitate social movement
building at the transnational level—that explains how nascent international
legal institutions and mechanisms can help develop collective interests, build
social movements, and, ultimately, stimulate cross-border collaboration and
cooperation. It identifies three primary dimensions of legal transnationalism
that explain how international laws stimulate and constrain movement building
through: (1) formation of collective identity and interests (constitutive
effects), (2) facilitation of collective action (mobilization effects), and (3)
adjudication and enforcement (redress effects).
At the end of the American Revolution, sixty thousand Americans loyal to
the British cause fled the United States and became refugees throughout
the British Empire. This groundbreaking book offers the first global
history of the loyalist exodus to Canada, the Caribbean, Sierra Leone,
India, and beyond. Following extraordinary journeys like that of
Elizabeth Johnston, a young mother from Georgia, who led her growing
family to Britain, Jamaica, and Canada, questing for a home; black
loyalists such as David George, who escaped from slavery in Virginia and
went on to found Baptist congregations in Nova Scotia and Sierra Leone;
and Mohawk Indian leader Joseph Brant, who tried to find autonomy for
his people in Ontario, Liberty’s Exiles challenges conventional
understandings about the founding of the United States and the shaping
of the postrevolutionary world. Based on original research on four
continents, this book is at once an intimate narrative history and a
provocative new analysis—a story about the past that helps us think
about migration, tolerance, and liberty in the world today.
We study patterns of FDI in a multi-country world economy. First, we present evidence for a broad sample of countries that firms direct FDI disproportionately to markets with income levels similar to their home market. Then we develop a model featuring non-homothetic preferences for quality and monopolistic competition in which specialization is purely demand-driven and the decision to serve foreign countries via exports or FDI depends on a proximity-concentration trade-off. We characterize the joint patterns of trade and FDI when countries differ in income distribution and size and show that FDI is more likely to occur between countries with similar per capita income levels. The model predicts a Linder Hypothesis for FDI, consistent with the patterns found in the data.
Co-author Gene Grossman is a professor of economics at Princeton University. Co-author Pablo Fajgelbaum is an assistant professor of economics at the University of California, Los Angeles. Working Paper 17550, National Bureau of Economic Research, October 2011.
Line in the Sand details the dramatic transformation of the western US-Mexico border from its creation at the end of the Mexican-American War in 1848 to the emergence of the modern boundary line in the first decades of the twentieth century. In this sweeping narrative, Rachel St. John explores how this boundary changed from a mere line on a map to a clearly marked and heavily regulated divide between the United States and Mexico. Focusing on the desert border to the west of the Rio Grande, this book explains the origins of the modern border and places the line at the center of a transnational history of expanding capitalism and state power in the late nineteenth and early twentieth centuries.

Moving across local, regional, and national scales, St. John shows how government officials, Native American raiders, ranchers, railroad builders, miners, investors, immigrants, and smugglers contributed to the rise of state power on the border and developed strategies to navigate the increasingly regulated landscape. Over the border's history, the US and Mexican states gradually developed an expanding array of official laws, ad hoc arrangements, government agents, and physical barriers that did not close the line, but made it a flexible barrier that restricted the movement of some people, goods, and animals without impeding others. By the 1930s, their efforts had created the foundations of the modern border control apparatus.

Drawing on extensive research in US and Mexican archives, Line in the Sand weaves together a transnational history of how an undistinguished strip of land became the significant and symbolic space of state power and national definition that we know today.
For nearly 2,000 miles, it runs alongside California, Arizona, New Mexico, and Texas. It begins in the east in Brownsville, Texas, and marches west along the Rio Grande, halting at the Pacific, in the town of Tijuana, notorious for its drug violence and reputation as a party spot for frat boys.

Whatever the cause, the mythic US-Mexico border draws millions of people to it each year. It’s the most frequently crossed international border in the world, and is one of the most intriguing unseen lines in history.

Just ask Rachel St. John. In her new book, Line in the Sand: A History of the Western U.S.-Mexico Border, the Harvard associate professor of history traces the border’s origins to its modern-day consequences.

The eastern US-Mexico border was easy to establish: the Rio Grande forms a natural divide. But after the Treaty of Guadalupe Hidalgo ended the Mexican-American War in 1848, diplomats gathered with maps to configure the western border. According to St. John’s research, they drew arbitrary lines, following no existing geographical feature, but connecting a few known spots like El Paso, the Gila and Colorado rivers, and San Diego Bay.

Armed with maps and equipment, a US-Mexico boundary commission next set out into the barren and inhospitable desert with the task of formally surveying and demarcating that part of the border.

“There’s this idea that you can draw a boundary line on paper,” mused St. John, “but that’s much harder to put into effect when you get on the ground.”

Some of the men’s maps proved incorrect, which spurred on-the-spot compromises—just one more stress on top of contending with everything from heat and rough terrain to getting lost, Apache attacks, and sometimes death.

“The one part of the boundary line that corresponded to a natural geographic feature, the Gila River, was made obsolete by the renegotiation of the border in the Gadsden Treaty of 1853,” wrote St. John. 
“From that point on, with the exception of a small stretch of the boundary line that runs along the Colorado River, the western border was made up of a series of imaginary lines.” Finalized in 1854, “the boundary line as it exists today was in place,” she said.

But what St. John finds remarkable is the shift in the border’s meaning over time.

“When the border was first drawn, the government thought, ‘No one’s ever going to come out here. This is the middle of the desert—who cares what happens,’” she said. “But there’s a massive change in economics that begins in the US and spreads into Mexico in the late 19th century. And as you have the development of this capitalist economy, the border takes on different meanings.”

Cattle ranching and mining became big industries, and a railroad was built on the US side. The exchange of goods prompted the US government to send customs agents to the border, and, said St. John, “It’s really the change in the economy that causes the government to care about maintaining the border.”

According to St. John, most people assume today that the border is there to regulate the movement of people, “but the sense from both the US and Mexican governments that they needed to regulate the movement of people is a 20th-century phenomenon.”

The first people the US government wanted to control weren’t Mexicans, but Chinese immigrants, St. John discovered. “I find it really interesting that during the first decade of the 20th century, you have Chinese immigrants disguising themselves as Mexicans so they can cross the border.”

St. John grew up in Southern California, and as a teenager sometimes trekked to Tijuana herself. “I remember one day thinking, ‘It’s really interesting how the border is one way on this side, and on the other side it’s totally different,’” she recalled.

The border now is political, policed, and unpredictable. 
“All the attention on the border, in some ways, is not a very effective way of dealing with larger problems of managing immigration and other smuggling. Many people in the U.S. without proper documentation entered legally and overstayed their visas. This emphasis on building a wall doesn’t necessarily match up with the issues people are trying to address,” said St. John.

“But one thing studying the border has taught me is that it hasn’t always meant the same thing, and so it’s very possible that in the future it won’t mean the same thing,” she noted. “At no time that I’ve seen does anyone want a totally closed or open border. It’s all about creating a border that’s a force field—it lets in the things you want and lets out the things you don’t.”
Two acclaimed political economists explore the origins and long-term effects of the financial crisis in historical and comparative perspective.
Welcome to Argentina: by 2008 the United States had become the biggest international borrower in world history, with almost half of its $6.4 trillion federal debt in foreign hands. The proportion of foreign loans to the size of the economy put the United States in league with Mexico, Pakistan, and other third-world debtor nations. The massive inflow of foreign funds financed the booms in housing prices and consumer spending that fueled the economy until the collapse of late 2008.
The authors explore the political and economic roots of this crisis as well as its long-term effects. They explain the political strategies behind the Bush administration's policy of funding massive deficits with the foreign borrowing that fed the crisis. They see the continuing impact of our huge debt in a slow recovery ahead. Their clear, insightful, and comprehensive account will long be regarded as the standard on the crisis.
Lost in Transition tells the story of the “lost generation” that came of age in Japan's deep economic recession in the 1990s. The book argues that Japan is in the midst of profound changes that have had an especially strong impact on the young generation. The country's renowned “permanent employment system” has unraveled for young workers, only to be replaced by temporary and insecure forms of employment. The much-admired system of moving young people smoothly from school to work has frayed. The book argues that these changes in the very fabric of Japanese postwar institutions have loosened young people's attachment to school as the launching pad into the world of work and loosened their attachment to the workplace as a source of identity and security. The implications for the future of Japanese society—and the fault lines within it—loom large.
Does democratic governance expand wealth and prosperity? There is no consensus about this issue, despite the fact that for more than half a century rival theories about the regime-growth relationship have been repeatedly tested against the empirical evidence, using a variety of cases, models, and techniques. To consider the issues, Part I of this paper reviews and summarizes theories about why regimes are expected to influence economic growth directly, either positively or negatively. After considering these debates, Part II discusses the technical challenges facing research on this topic and how the study proposes to overcome them. Part III presents the results of the comparative analysis for the effects of democratic governance on economic growth during recent decades. The descriptive results illustrate the main relationships. The multivariate models check whether these patterns remain significant after controlling for many other factors associated with growth, including geography, economic conditions, social structural variables, cultural legacies, and global trends. The evidence supports the equilibrium thesis, suggesting that regimes combining both liberal democracy and bureaucratic governance are most likely to generate growth, while patronage autocracies display the worst economic performance. The conclusion considers the implications.
This thesis asks how uses of city space among a minority demonstrate engagement and identification with the city overall. Research focused on ethnographic interviews with second-generation Turkish women in Copenhagen about their use of the city throughout different stages of their lives. This was supplemented by participant observation across Copenhagen’s public spaces and interviews with urban planners and leaders of various women’s centers. I find that the second-generation Turkish women demonstrate multiple uses and understandings of city spaces based on their multiple, fluid identities, so that through a particular identity a physical space becomes a meaningful place. Because people are situated and related through space, space and place play an important role in an individual’s identification with and against others. I define space as composed of the built and physical environment across all scales. Place, on the other hand, is space made especially meaningful, interpreted by individuals based on their histories, use, and perceptions of the space. Place is thus a product of a particular identity and its respective ways of being in the city. Different contexts and spaces become the platform for enacting different identities. The result is to conceive of space as dynamically constructed into different places as different identities play out across the city. I demonstrate this first by describing how Turkish immigrants claim and appropriate urban space in Copenhagen, recreating their cultural uses of space within the context of Copenhagen. I continue by contextualizing these Turkish practices within the diverse repertoire of identities of second-generation Turkish women and their accompanying diverse understandings of place. Ultimately, allowing for an open, fluid sense of identities and place creates a more inclusive framework for belonging in a multicultural, transnational city.
Does an expansion of health insurance increase or decrease use of the emergency department (ED)? Both predictions can be justified logically. On the one hand, research on patient cost sharing predicts that by reducing the out-of-pocket costs of an ED visit, expanded insurance coverage, especially in the face of physician shortages, could result in increased ED utilization. This view has been echoed by elected leaders: Senator Jon Kyl (R-AZ), citing the Massachusetts experience with health care reform, claimed that if anything, universal coverage brought even higher rates of emergency room visits due to increased difficulty in getting appointments for outpatient physician visits. Others have predicted that expanded coverage would actually reduce ED use, since previously uninsured patients would now have access to preventive care. The relative importance of these countervailing forces is a question that clearly weighs on physicians: in a survey of emergency physicians conducted in April 2010, about 71 percent said they expected emergency visits to increase after the passage of the Affordable Care Act (ACA). To explore the importance of these effects, we examined the Massachusetts experience. The state's 2006 health care reform was a model for the ACA and reduced the proportion of Massachusetts adults under the age of 65 who were uninsured by 7.7 percentage points between the fall of 2006 and the fall of 2009. To determine whether any changes in ED utilization in Massachusetts reflected the effect of Massachusetts' reform or were merely representative of broader regional trends in ED utilization, we used New Hampshire and Vermont as control states.
With the monetary union coming apart, the finger-pointing has begun. Who really killed Europe?

You remember Agatha Christie’s classic whodunit Murder on the Orient Express? The problem for the great Belgian sleuth Hercule Poirot was that there were far too many suspects. The strange death of the European Union may prove to be a rather similar case.

So used are we to hearing the process of European
integration likened to an unstoppable train that we discount the idea it
could ever stop in its tracks. Yet the reality is that Europe has been
quietly disintegrating for some time.

Outwardly, it’s true, Europe’s leaders still appear to be
inching toward their long-cherished goal of “ever closer union.” Last
month they agreed to set up a new European Stability Mechanism to deal
with future financial crises. It’s still a long way from being the
United States of Europe, but most Americans assume that’s the ultimate
destination: a truly federal system like their own. Think again. Not
only has the economic crisis blown holes in the finances of nearly all
EU states, it has also revealed a deep reluctance on the part of those
least affected to bail out the hardest hit.

Americans bemoaning their own economy’s sluggish
recovery should look on the bright side: it’s worse in Europe. The
International Monetary Fund projects growth of 3 percent for the United
States this year but just half that for the euro zone. Even more
striking is the extent of economic divergence within the euro area.
While the German economy is currently growing at an annualized rate of
around 6 percent, Greek growth in the fourth quarter of last year was
minus 6 percent. So much for the convergence monetary union was meant to
bring.

The underlying problem is the euro’s failure to
create a truly integrated market for labor. In the decade after the
euro’s creation in 1999, German unit labor costs rose by less than 40
percent; the equivalent figure for Spain was 80 percent. Workers in the
periphery took monetary union to mean they should be paid as well as
workers in the German core. But their productivity didn’t rise to German
levels. At the same time, people in countries like Ireland took the
post-1999 reduction in interest rates—one of the most obvious benefits
to the periphery of euro membership—as a signal to go on a borrowing
binge. The result: Ireland and Spain behaved a lot like Florida and
Nevada. House prices bubbled, then burst.

In the wake of the American crisis, some banks
failed—most spectacularly Lehman Brothers—but most were bailed out, and
the federal deficit soared. Dollars were transferred by the U.S.
Treasury from Texan taxpayers to welfare recipients in New Mexico. In
Europe the story was different. There was no big bank failure; all “too
big to fail” institutions were rescued. National deficits soared. But
when some countries ran into fiscal trouble—when financial markets
started to demand sharply higher interest rates—things got ugly, because
there is no mechanism to transfer euros between countries other than in
tiny amounts.

The crisis has driven not
just one but two divisive wedges into the European economy. First there
is the fundamental political rift between the 17 EU members who joined
the monetary union and the 10 who didn’t. Then, within the euro zone,
there is the widening economic rift between the German-dominated core
and the ailing periphery—the countries cursed with the unflattering
acronym PIGS (Portugal, Ireland, Greece, and Spain).

In this whodunit, the prime suspect is not the real
culprit. At first sight, the fingerprints on the murder weapon belong
to feckless finance ministers of the PIGS. It’s true that those
countries had been heading for fiscal trouble even before the onset of
the financial crisis. The Bank for International Settlements was
forecasting that by 2040 they would all have public debts equal to at
least 300 percent of gross domestic product.

In the cases of Greece and Ireland, the financial
markets decided some months ago that they were likely to default; hence
the surge in their borrowing costs as investors sought compensation for
this risk in the form of higher rates; hence the need for bailouts from
the other EU members.

But why exactly is Ireland’s deficit so huge? Step
forward suspect No. 2: Europe’s banks. For it was by bailing out the
country’s bloated banking sector—the total assets of which now exceed
Irish GDP by a factor of 10—that the last Irish government created the
present fiscal crisis. In much the same way, worries about Spain have
much more to do with the still-uncertain losses of the country’s cajas (savings banks) than with the government’s own fiscal health.

Nor is it only the banks of Euroland’s periphery
who are suspects. Equally culpable are the banks of the core. German
banks, for example, have close to €500 billion of exposure to the PIGS.
The dirty little secret of euro-zone finance is that if one of the
periphery countries were to default, German banks—in particular the
state-owned Landesbanken—would be among the biggest losers. And that, of
course, is why it makes sense for the core to bail out the periphery:
in truth, they are all in this banking crisis together.

It is the political difficulty of selling this
proposition to German voters that is set to derail the EU train. A
Eurobarometer poll last year revealed that only 34 percent of Germans
thought the euro had mitigated the effects of the financial crisis.
Germans are overwhelmingly for fiscal austerity—88 percent favor a
policy of deficit reduction, much higher than for the EU as a whole.
That is why the German government keeps insisting that the recipients of
bailout money impose painful austerity measures on themselves.

The mood of the German voter can be summed up as
follows: No More Herr Nice Guy. So the tax-dodging Greeks, the feckless
Irish, and the bone-idle Portuguese expect the thrifty German worker to
write them yet another check? For five decades after World War II, a
penitent Germany paid up. The Federal Republic was the single biggest
net contributor to the process of European integration. But the era of
war guilt is now over—witness the humiliating electoral defeat inflicted
on Germany’s governing parties in Baden-Württemberg at the end of last
month. No matter how tough Chancellor Angela Merkel seems to the
hard-pressed Greeks, to her own people she seems way too soft.

For years the train of European integration ran on
German subsidies. No longer. So as the process of disintegration
accelerates this year—as the economies of the periphery languish and
their governments topple—don’t blame the victim. It’s the German voter
who dun it.
When NAFTA went into effect in 1994, many feared it would intensify animosity among North American unions, lead to the scapegoating of Mexican workers and immigrants, and eclipse any possibility for cross-border labor cooperation. But far from polarizing workers, NAFTA unexpectedly helped stimulate labor transnationalism among key North American unions and erode union policies and discourses rooted in racism. The emergence of labor transnationalism in North America presents compelling political and sociological puzzles: How did NAFTA, the concrete manifestation of globalization processes in North America, help deepen labor solidarity on the continent? In addition to making the provocative argument that global governance institutions can play a pivotal role in the development of transnational social movements, this book suggests that globalization need not undermine labor movements: collectively, unions can help shape how the rules governing the global economy are made.
The way to restoring America's AAA credit-rating starts with President Obama moving beyond
blaming the economy on the admittedly inept George W. Bush.
Standard & Poor's recent downgrade of the U.S. government shows how far the world has moved into a crisis of governments.
The official reactions to the S&P action have not been promising. The Obama administration attacked S&P's competence, and the U.S. Congress has threatened hearings, apparently aimed at bullying S&P and the other agencies from further downgrades.
The main substantive criticism was that S&P made a $2 trillion mistake in its baseline projection of 10-year deficits. Of course, these projections came from the Congressional Budget Office, which lost its credibility in these matters when it scored President Obama's health care reform plan as reducing 10-year deficits - mostly because of the inclusion of phantom reductions in Medicare payments to doctors.
In truth, S&P's downgrade stemmed mainly from its legitimate concern that the U.S. government has no coherent medium- or long-term plan to eliminate budget deficits and stabilize the path of public debt. This judgment is accurate and courageous and goes some distance in offsetting the hit to S&P's reputation that came from the AAA ratings that it gave not so long ago to mounds of mortgage-backed securities built on subprime garbage.
Unfortunately, Obama's main response to S&P's downgrade and the economic crisis more generally has been to continue blaming almost everything on his admittedly inept predecessor, George W. Bush, and on the Republican Congress.
Another familiar theme is the unwillingness of the evil rich to pay more taxes. (I have one modest proposal that could save the President valuable time in this regard. Rather than continuing to repeat the long phrase "millionaires and billionaires," I suggest a merger: "mibillionaires." I know it looks funny and is hard to say on a first try, but after three or four repetitions it becomes strikingly mellifluous.)
The way forward to restoring our AAA rating begins with Obama taking seriously the surprisingly sound report by his recent bipartisan debt and deficit commission. Building on those recommendations, I have constructed a fiscal plan:

• Make structural reforms to the main entitlement programs, starting with increases in ages of eligibility and a shift to an economically appropriate indexing formula.
• Eliminate the unwise increases of federal spending by Bush and Obama, including added outlays for education, farm and ethanol subsidies, and expansions of Medicare and Medicaid.
• Lower the structure of marginal tax rates in the individual income tax.
• Pay for the rate cuts by gradually phasing out the main tax-expenditure items, including preferences for home-mortgage interest, state and local income taxes, and employee fringe benefits.
• Permanently eliminate federal corporate and estate taxes, levies that are inefficient and raise comparatively little money.
• Introduce a broad-based expenditure tax, such as a value-added tax (VAT). Depending on the structure of exemptions, a rate of 10% should raise about 5% of GDP in revenue.
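The revenue figure in the last item is simple arithmetic: revenue as a share of GDP equals the VAT rate times the taxable consumption base as a share of GDP. A minimal back-of-the-envelope sketch, where the 50 percent base is an illustrative assumption (US household consumption runs near two-thirds of GDP, and exemptions for items like food and housing shrink the taxable base):

```python
# Back-of-the-envelope VAT revenue as a share of GDP.
# The 0.50 base share is an assumed, illustrative figure: household
# consumption is roughly two-thirds of GDP, and typical exemptions
# shrink the taxable base to about half of GDP.
def vat_revenue_share(rate, taxable_base_share):
    """Revenue as a fraction of GDP = VAT rate x taxable base (share of GDP)."""
    return rate * taxable_base_share

share = vat_revenue_share(rate=0.10, taxable_base_share=0.50)
print(f"A 10% VAT on a base of 50% of GDP raises ~{share:.0%} of GDP")
```

The sensitivity to exemptions is the point of the column's caveat: narrow the base to 40 percent of GDP and the same 10 percent rate yields only 4 percent of GDP.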
The VAT system is present in most developed countries and can be highly efficient because it has a flat rate, falls on consumption and has built-in mechanisms for ensuring compliance. However, a VAT is also a magnet for criticism by conservatives - who worry about the promotion of a larger government.
I share this concern and would defend a VAT only if it can be firmly linked to the other parts of the reform package. But more fundamentally, given the projected path of entitlement spending, I see no reasonable alternative.
It is hard to imagine President Obama becoming the leader of this kind of broad fiscal initiative. Though he has endorsed some pieces of some of these components, the embrace has been halting. He is hedging, not leading.
Thus, as S&P observed, uncertainty about our fiscal path will likely not be resolved at least until the outcomes of next year's crucial elections.
The one person with the power to eliminate part of this uncertainty is the President, who could nobly decide not to stand for reelection, thereby following in the footsteps of Lyndon Johnson and Calvin Coolidge. Johnson was forced out by a different type of crisis, Vietnam, and he hung on too long, delaying his announcement until he saw his poor performance in the New Hampshire primary and in subsequent electoral polls. Coolidge is a more dignified model, as he opted out in 1927 while things were going fine. In fact, Obama could borrow Coolidge's memorable phrase, "I do not choose to run."
President Obama should take a page from Ronald Reagan’s playbook in winning the final inning of the Cold War. Obama can challenge President Mahmoud Ahmadinejad to put his enriched uranium where his mouth is—by stopping all Iranian enrichment of uranium beyond the 5 percent level.
A quarter-century ago, Soviet leader Mikhail Gorbachev was touting a new “glasnost”: openness. President Reagan went to Berlin and called on Gorbachev to “tear down this wall.” Two years later, the Berlin Wall came tumbling down and, shortly thereafter, the Soviet “evil empire” fell as well.
While in New York for the opening of the UN General Assembly in September, Ahmadinejad on three occasions made an unambiguous offer: He said Iran would stop all enrichment of uranium beyond the levels used in civilian power plants—if his country is able to buy specialized fuel enriched at 20 percent, for use in its research reactor that produces medical isotopes to treat cancer patients.
Obama should seize this proposal and send negotiators straightaway to hammer out specifics. Iran has been enriching uranium since 2006, and it has accumulated a stockpile of uranium enriched at up to 5 percent, sufficient after further enrichment for several nuclear bombs. Iran is also producing 20 percent material every day, and it announced in June that it planned to triple its output. Halting Iran’s current production of 20 percent material and its projected growth would be significant.
A stockpile of uranium enriched at 20 percent shrinks the potential timeline for breaking out to bomb material from months to weeks. In effect, having uranium enriched at 20 percent takes Iran 90 yards along the football field to bomb-grade material. Pushing it back below 5 percent would effectively move Tehran back to the 30-yard line - much farther from the goal of bomb-grade material. Even more important, extracting from Iran a commitment to a bright red line capping enrichment at 5 percent would stop the Islamic Republic from advancing on its current path to 60 percent enrichment and then 90 percent.
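The football-field analogy reflects the physics of enrichment: most of the separative work, measured in SWU (separative work units), is spent getting from natural uranium to 20 percent. A rough sketch using the standard separative-work value function; the feed assay (natural uranium, 0.711%) is standard, while the 0.3% tails assay is an illustrative assumption:

```python
import math

def value(x):
    """Separative-work value function V(x) = (2x - 1) ln(x / (1 - x))."""
    return (2 * x - 1) * math.log(x / (1 - x))

def swu(product_kg, xp, xf, xw):
    """SWU needed to turn feed at assay xf into product_kg at assay xp, tails xw."""
    feed = product_kg * (xp - xw) / (xf - xw)
    waste = feed - product_kg
    return product_kg * value(xp) + waste * value(xw) - feed * value(xf)

NATURAL, TAILS, LEU20, HEU90 = 0.00711, 0.003, 0.20, 0.90

# Two-stage route to 1 kg of 90% (bomb-grade) uranium:
# first natural -> 20%, then 20% -> 90%.
feed_20pct = 1.0 * (HEU90 - TAILS) / (LEU20 - TAILS)   # kg of 20% material needed
stage1 = swu(feed_20pct, LEU20, NATURAL, TAILS)
stage2 = swu(1.0, HEU90, LEU20, TAILS)
print(f"stage 1 (natural -> 20%): {stage1:.0f} SWU")
print(f"stage 2 (20% -> 90%):     {stage2:.0f} SWU")
print(f"share of work already done at 20%: {stage1 / (stage1 + stage2):.0%}")
```

Under these assumptions, reaching 20 percent accounts for roughly 90 percent of the total separative work, which is exactly the 90-yards-downfield picture in the text.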
Stopping Iran from enriching beyond 5 percent is not, in itself, a “solution” to its nuclear threat. Nor was Reagan’s proposal to Gorbachev. The question for Reagan was whether we would be better off with the Berlin Wall or without it.
Iran today is the most sanctioned member of the United Nations; it has been the target of five Security Council resolutions since 2006 demanding that it suspend all uranium enrichment. The United States and Europe have organized their own, tougher economic sanctions forbidding businesses from trading with Iranian companies and limiting Iran’s access to financial markets.
But Iran does not require the permission of the United Nations or, for that matter, the United States to advance its nuclear program within its borders. Nor are current or future sanctions likely to dissuade Iran from progressing steadily toward a nuclear weapon.
So far, Obama has essentially continued the Bush administration’s policy toward Iran with one addition: an authentic offer from the start of his administration to begin negotiations. Negotiations, however, have not been feasible because of sharp divisions within Iran. Those rifts were exacerbated after the June 2009 elections, in which Iran’s ruling powers (Supreme Leader Ayatollah Ali Khamenei, Ahmadinejad and the Revolutionary Guard) rigged the presidential vote and then moved to suppress the opposition Green Movement protests. In the last two years, they have tightened control over their society.
Enter Ahmadinejad’s proposal to stop all enrichment at the 5 percent level—without preconditions. Although differences between Ahmadinejad and the supreme leader have become evident, the United States should pay attention to the president’s offer.
Arguments against testing the offer are easy to make. An embattled Ahmadinejad may not be able to deliver. Iran will use negotiations to seek to relax or escape current sanctions. If a deal were reached, it would be more difficult to win international support for the next round of sanctions. An agreement that stops only the 20 percent enrichment could imply a degree of acceptance of Iran’s ongoing enrichment up to 5 percent.
Recognizing all of these negatives, however, the policy question remains: Would the United States be better off with Iran enriching its uranium to 20 percent or without it?
President Obama should act now to test Ahmadinejad’s word.
Killing terrorists with drones is great politics. To the question, “Is it legal?” a natural answer might well be, “Who cares?”
But the legal justifications in the war on terrorism do matter - and not just to people who care about civil liberties. They end up structuring policy. As it turns out, targeted killing, now the hallmark of the Barack Obama administration’s war on terrorism, has its roots in rejection of the legal justifications once offered for waterboarding prisoners.
The leaking of the basic content (but not the text) of an Obama administration memo authorizing the drone strike that killed US citizen Anwar Al-Awlaki therefore calls for serious reflection about where the war on terrorists has been - and where it is headed next.
The George W. Bush administration’s signature anti-terror policy after the September 11 attacks (apart from invading countries) was to capture suspected terrorists, detain them, and question them aggressively in the hopes of gaining actionable intelligence to prevent more attacks.
In the Bush years, after the CIA and other agencies balked at the interrogation techniques being urged by Vice President Dick Cheney, the White House asked the Department of Justice to explain why the most aggressive questioning tactics were legal. Lawyers at the Office of Legal Counsel—especially John Yoo, now a professor at the University of California at Berkeley—produced secret memos arguing that waterboarding wasn’t torture.

The Torture Memos
What was more, the memos maintained, it didn’t matter if it was torture or not, because the president had the inherent constitutional authority to do whatever was needed to protect the country.
Some of the documents were leaked and quickly dubbed “the torture memos.” A firestorm of legal criticism followed. One of the most astute and outraged critics was Marty Lederman, who had served in the Office of Legal Counsel under President Bill Clinton. With David Barron, a colleague of mine at Harvard, Lederman went on to write two academic articles attacking the Bush administration’s theories of expansive presidential power. Eventually, Jack Goldsmith, who led the Office of Legal Counsel in 2003–2004 (and is now also at Harvard), retracted the most extreme of Yoo’s arguments about the president’s inherent power.
In the years leading to the 2008 election, all this technical criticism of the Bush team’s legal strategy merged with domestic and global condemnation of the administration’s detention policies. The Supreme Court weighed in, finding that detainees were entitled to hearings and better tribunals than were being offered. As a candidate, Obama joined the bandwagon, promising to close the prison at Guantanamo Bay, Cuba, within a year of taking office.
Guantanamo is still open, in part because Congress put obstacles in the way. Instead of detaining new terror suspects there, however, Obama vastly expanded the tactic of targeting them, with eight times more drone strikes in his first year than in all of Bush’s time in office. Barron and Lederman, the erstwhile Bush critics, were appointed to senior positions in the Office of Legal Counsel—where they wrote the recent memo authorizing the Al-Awlaki killing.
What explains these startling developments? If it’s illegal and wrong to capture suspected terrorists and detain them indefinitely without a hearing, how exactly did the Obama administration decide it was desirable and lawful to target and kill them?
The politics were straightforward. Obama’s team observed that holding terror suspects exposed the Bush administration to harsh criticism (including their own). They wanted to avoid adding detainees at Guantanamo or elsewhere.
A Father’s Appeal
Dead terrorists tell no tales—and they also have no lawyers shouting about their human rights. Before Al-Awlaki was killed, his father sued the government for putting the son on its target list. The Obama Justice Department asked the court to dismiss the claim as being too closely related to government secrets. The court agreed—a result never reached in all the Guantanamo litigation. Anwar Al-Awlaki now has no posthumous recourse.
In the bigger picture, Obama also wanted to show measurable success in the war on terrorism while withdrawing troops from Iraq and Afghanistan. But even here the means were influenced by legal concerns.
Osama bin Laden is the best example. The US forces who led the fatal raid in Abbottabad almost certainly could have taken him alive. But detaining and trying him would probably have been a political disaster. So they shot him on sight, as the international law of war allows for enemies unless they surrender.
The authority for targeted killing—as expressed in the Lederman-Barron memo—offers the legal counterpart to the political advantages of the Obama targeting policy. According to the leaks, the memo holds that the U.S. can kill suspected terrorists from the air not because the president has inherent power, but because Congress declared war on Al-Qaeda the week after the September 11 attacks.
The logic is that once Congress declares war, the president can determine whom we are fighting. The president found that Yemen-based Al-Qaeda in the Arabian Peninsula, which didn’t exist on September 11, had joined the war in progress. He determined that Al-Awlaki was an active member of the Yemeni groups with some role in planning attacks. And, the memo says, it’s not unlawful assassination or murder if the targets are wartime enemies.
From a formal legal standpoint, Lederman and Barron can claim consistency with their attacks on the Bush administration. They relied on Congress and international law; Yoo’s “torture memos” didn’t.
But this argument misses the more basic point: Most critics rejected Bush’s policies not on technical grounds based on the Constitution, but because they thought there was something wrong with the president acting as judge and jury in the war on terrorism.
No Defense Allowed
Anwar Al-Awlaki was killed because the president decided he was an enemy. Like the Bush-era Guantanamo detainees, he had no chance to deny this—even when his father tried to go to court while he was still alive.
Naturally, a uniformed soldier in a regular war also wouldn’t get a hearing. But like the Guantanamo detainees, Al-Awlaki wore no uniform. Nor was he on a battlefield, except according to the view that anywhere in the world can be the battlefield in the war on terrorism.
Al-Awlaki might have maintained that he was merely a jihadi propagandist exercising his free speech rights as a U.S. citizen. Which might well have been a lie. Yet we have only the president’s word that he was an active terrorist—and that is all we will ever have. The future direction of the policy is therefore clear: Killing is safer, easier and legally superior to catching and detaining.
Sitting beside Al-Awlaki when he was killed was another US citizen, Samir Khan, who was apparently a full-time propagandist, not an operational terrorist. Khan was, we are told, not the target, but collateral damage—a good kill under the laws of war.
Legal memos are weapons of combat—no matter who is writing them.
The rise of offshoring of intermediate inputs raises important questions for commercial policy. Do the distinguishing features of offshoring introduce novel reasons for trade policy intervention? Does offshoring create new problems of global policy cooperation whose solutions require international agreements with novel features? In this paper we provide answers to these questions, and thereby initiate the study of trade agreements in the presence of offshoring. We argue that the rise of offshoring will make it increasingly difficult for governments to rely on traditional GATT/WTO concepts and rules—such as market access, reciprocity, and non-discrimination—to solve their trade-related problems.