December 4, 2022

Economix Blog: Comparing the Job Losses in Financial Crises



Dollars to doughnuts.

For a while I was regularly updating a chart each month showing how far employment plummeted in the latest recession and how little ground had been recovered since the recovery began in June 2009. Compared with other recent recessions and recoveries, the last few years have looked especially disastrous:

Source: Bureau of Labor Statistics

But that may be the wrong comparison to make.

As the Harvard economists Carmen Reinhart and Kenneth Rogoff have written, financial crises are always especially disastrous. Over the course of a dozen financial crises in developed and developing countries going back to the Great Depression, the unemployment rate rose an average of 7 percentage points over 4.8 years, they found.

And actually, when shown alongside the track records of other financial crises, the American job losses caused by the recent financial crisis don’t look quite as horrifying:

Courtesy of Josh Lehner.

The chart above was put together by Josh Lehner, an economist for the state of Oregon. The red line shows the change in employment since December 2007, when the most recent recession officially began.

As you can see, drastic as American job losses have been in recent years, they were far worse and lasted much longer in the aftermath of the financial crises that struck, for example, Finland and Sweden in 1991 and Spain in 1977, not to mention the United States during the Great Depression.

Looking at unemployment rates (which refer to the share of people who want to work but can’t find jobs, as opposed to just the total number of jobs) also shows that things in the United States could have been much worse:

Courtesy of Josh Lehner.

In the United States, the unemployment rate rose from an average of 4.5 percent in the year before the crisis to a peak of 10 percent. In other words, the jobless rate more than doubled. After previous financial crises, however, some countries saw their unemployment rates triple, quadruple, even quintuple.
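The "more than doubled" claim is just the ratio of the peak unemployment rate to the pre-crisis average; a quick check using the figures above:

```python
# Ratio of the peak unemployment rate to the pre-crisis average.
pre_crisis = 4.5  # average U.S. unemployment rate in the year before the crisis (%)
peak = 10.0       # peak U.S. unemployment rate (%)

multiple = peak / pre_crisis
print(round(multiple, 2))  # above 2, i.e. "more than doubled"
```

The tripling, quadrupling or quintupling seen after other financial crises corresponds to multiples of 3 to 5 on the same calculation.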

It’s not clear why the United States came out of this financial crisis less scathed than history might have predicted.

Mr. Lehner attributes this to “the coordinated global response to the immediate crises in late 2008 and early 2009,” referring to both monetary and fiscal stimulus. He notes that the United States and the global economy had been tracking the path of the Great Depression in 2008-9, until a number of major government interventions kicked in.


Economix Blog: Wealth, Taxes and Public Opinion




Last week I wrote about a new Pew Research Center report on the ailing middle class. Today, Pew has come out with a comparable report about the wealthy and how Americans feel about this upper-income class.

When respondents were asked how much a family of four would need to earn to be considered wealthy “in your area,” the median response was $150,000. The responses varied by geographic region, though, with people in the Northeast (where the cost of living is higher) giving a median response of $200,000.

The survey also included a pointed question about whether upper-income people pay their “fair share” in taxes. About 26 percent of respondents said they did, with another 8 percent saying the rich paid too much in taxes and 58 percent saying the rich paid too little.

If that sounds like a lot of people complaining that the wealthy don’t contribute enough to Uncle Sam, note that Americans’ attitudes toward the tax obligation of the rich have become much less demanding over the last two decades.

When this question was first asked by Gallup, in March 1992, 77 percent of respondents said upper-income Americans paid too little in taxes. Yet the average income tax burden of the wealthy was actually higher then.

Sources: Pew Research Center, Tax Policy Center. The blue line, which shows the percent of Americans who say the upper-income pay too little in taxes, refers to the left-hand axis. Note that this axis does not start at zero, to better show the change. The red line, which shows the average federal income tax rate for a family of four earning twice the median income, refers to the right-hand axis.

In 1992, when more than three-quarters of Americans said that rich people should be paying more, a family of four earning twice the median household income paid an average federal income tax rate of 14.79 percent, according to the Tax Policy Center.

As of 2011, the average tax burden for a family of four earning twice the median income (which came to $151,296) was 12.93 percent.

Over the same period, however, Americans have become much more demanding about how much the poor pay.

In 1992, 8 percent of Americans said lower-income people paid “too little” in taxes. Today that share has risen to 20 percent.

As with the wealthy, the tax burden of the poor has also fallen considerably in the last two decades. In 1992, a family earning half the median income paid an average tax rate of 4.55 percent. Last year, a family in that position had a negative tax burden of 6.84 percent; that is, thanks to refundable tax credits, the family received money from the federal government equal to 6.84 percent of its income.
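A negative tax burden falls out of the arithmetic whenever refundable credits exceed the tax otherwise owed. In the sketch below, only the income figure (half of the $151,296 twice-median amount cited above) comes from the post; the gross tax and credit amounts are hypothetical, chosen to illustrate the mechanism:

```python
# Effective federal income tax rate; refundable credits can push it below zero.
income = 151296 / 4            # half the median income, from the post's figures
gross_tax = 1500.00            # hypothetical tax before credits
refundable_credits = 4087.16   # hypothetical refundable credits (EITC, etc.)

net_tax = gross_tax - refundable_credits   # negative: a net payment to the family
rate = net_tax / income
print(f"{rate:.2%}")
```

With these illustrative amounts the rate comes out close to the post's -6.84 percent.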

For whatever reason, Americans have become much less tolerant of lower tax rates for the poor than they have for the rich.

Addendum on methodology: Pew’s survey was done through telephone interviews conducted July 16-26, 2012, with a nationally representative sample of 2,508 adults ages 18 and older. The margin of sampling error is plus or minus 3 percentage points.
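For reference, the textbook margin-of-error formula for a simple random sample gives a somewhat smaller figure than Pew's reported 3 points; survey weighting and design effects account for the difference. A sketch of the basic calculation only (the design-effect adjustment is not shown):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(2508):.1%}")  # roughly 2 points before design effects
```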


Economix Blog: The Role of Austerity



Thoughts on the economic scene.

The chart here offers one of the better recent snapshots of the American economy that you will find.

The blue line shows the rate at which the government — federal, state and local — has been growing or shrinking. The red line shows the same for the private sector.

Annualized growth in the public and private sectors. Source: Bureau of Economic Analysis, via Haver Analytics. Quarterly change at seasonally adjusted annual rate.

The brief version of the story is that the government, which helped mitigate the recession, has been a significant drag on growth for more than a year now.

In 2007, both the private sector and government were growing. The government continued growing through 2008 and most of 2009, with the exception of one quarter when military spending fell. The private sector, though, began to shrink in 2008 and by late 2008, as the financial crisis took hold, it was shrinking rapidly.

The government — first the Federal Reserve and the Bush administration and then, more aggressively, the Fed and the Obama administration — responded with various stimulus programs. They are the reason for the blue line’s upward spike in 2009.

The private sector began to recover in 2009. The recovery slowed in 2010 and again in 2011, as the dips in the red line show. But by the end of last year, the private sector was expanding at a healthy 4.5 percent annualized pace.

Why, then, wasn’t economic growth in the most recent quarter better than the 2.8 percent that the Commerce Department reported today?

Because the economy is the combination of the private and public sectors. The public sector has been shrinking for the last year and a half — mostly because of cuts in state and local government, with some federal cuts, especially to the military, playing a role as well. In the fourth quarter, government shrank at an annual rate of 4.5 percent.

Over the last two years, the private sector grew at an average annual rate of 3.2 percent, while the government shrank at an annual rate of 1.4 percent.

The combined result has been economic growth of 2.3 percent.
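The combined figure is roughly a share-weighted average of the two sector growth rates. The 80/20 private/government split below is an assumption for illustration, not the BEA's exact weights:

```python
# Share-weighted combination of sector growth rates (shares are assumptions).
private_growth = 3.2      # average annual private-sector growth (%)
government_growth = -1.4  # average annual government growth (%)
private_share, government_share = 0.8, 0.2

combined = private_share * private_growth + government_share * government_growth
print(round(combined, 1))  # close to the 2.3 percent combined growth in the post
```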

People can obviously have a spirited debate about cause and effect here. I’m not aware of much research or evidence suggesting that short-term declines in government activity — at least in a largely free-market economy — cause short-term growth in the private sector.

Lacking such evidence, the obvious conclusion seems to be that economic growth, and employment growth, would have been significantly stronger over the last two years without government cuts. But I’d invite readers to point us to any research that bears on the question, one way or the other.


Economix: Why Hasn’t Employment of the Elderly Fallen?


Casey B. Mulligan is an economics professor at the University of Chicago.

While employment rates have fallen sharply among the general population, they have not done so among the elderly. This result is difficult to reconcile with Keynesian characterizations of the labor market.

Today’s Economist

Perspectives from expert contributors.

The red line in the chart below displays an index of per capita employment for the general population. For example, a value of 93 for 2010 means that the fraction of people employed in 2010 was 7 percent less than it was in 2007, before the recession began. The red line shows what we all know by now: many fewer people have jobs now than did a few years ago.

The other two series in the chart are for specific age groups: ages 65-69 and ages 70-74. Both groups have a somewhat greater fraction working now than in 2007 (the increase is even more for ages 75+, but that group is small, so it is omitted from the chart).

Recent studies have looked at the labor-market experiences of the elderly during the first half of the recession. The authors emphasize that, while the recession by itself might reduce elderly employment, the elderly have become increasingly willing to work. I agree.

Many elderly people, for example, saw the market values of their homes and retirement assets plummet in 2008 and feel they can no longer afford to be retired. Naturally, many of them react by looking for work.

The blue and green lines in the chart show that the elderly have been much more successful than the general population at obtaining and retaining jobs.

These findings contradict the Keynesian narrative of the labor market, in which the marketplace fails to recognize the degree to which people would like to have a job. (The Keynesian narrative helps rationalize, among other things, the assertion that unemployment insurance did not reduce employment during the recession, because “what’s limiting employment now is lack of demand for the things workers produce. Their incentives to seek work are, for now, irrelevant.”)

Employment, even during a recession, is not solely the result of lucky few finding available positions. All else being the same, the market tends to create and allocate jobs for those people who are most interested in working.

That’s why, if government is to avoid making employment any less than it has to be, it’s so important to pay attention to the incentives created by taxes, subsidies and government regulations.


Economix: A History of College Grade Inflation

We’ve written before about some of the work of Stuart Rojstaczer and Christopher Healy, grade inflation chroniclers extraordinaire. They have put together a new, comprehensive study of college grading over the decades, and let me tell you, it is a doozy.




The researchers collected historical data on letter grades awarded by more than 200 four-year colleges and universities. Their analysis (published in the Teachers College Record) confirms that the share of A grades awarded has skyrocketed over the years. Take a look at the red line in the chart below, which refers to the share of grades given that are A’s:

Source: Stuart Rojstaczer and Christopher Healy. Note: 1940 and 1950 (nonconnected data points in figure) represent averages from 1935 to 1944 and 1945 to 1954, respectively. Data from 1960 onward represent annual averages in their database, smoothed with a three-year centered moving average.

Most recently, about 43 percent of all letter grades given were A’s, an increase of 28 percentage points since 1960 and 12 percentage points since 1988. The distribution of B’s has stayed relatively constant; the growing share of A’s instead comes at the expense of a shrinking share of C’s, D’s and F’s. In fact, only about 10 percent of grades awarded are D’s and F’s.
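The chart's note mentions that the annual series is smoothed with a three-year centered moving average. A minimal sketch of that smoothing, on made-up grade-share data:

```python
def centered_moving_average(values, window=3):
    """Centered moving average; endpoints without a full window become None."""
    half = window // 2
    out = []
    for i in range(len(values)):
        if i - half < 0 or i + half + 1 > len(values):
            out.append(None)  # not enough neighbors at the ends of the series
        else:
            out.append(sum(values[i - half:i + half + 1]) / window)
    return out

share_of_as = [30, 31, 33, 36, 40, 43]  # hypothetical annual shares of A's (%)
result = centered_moving_average(share_of_as)
print(result)
```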

As we have written before, private colleges and universities are by far the biggest offenders on grade inflation, even when you compare private schools to equally selective public schools. Here’s another chart showing the grading curves for public versus private schools in the years 1960, 1980 and 2007:

Source: Stuart Rojstaczer and Christopher Healy. Note: 1960 and 1980 data represent averages from 1959–1961 and 1979–1981, respectively.

As you can see, public and private school grading curves started out relatively similar and gradually pulled apart. Both types of institutions made their curves easier over time, but private schools made their grades much easier.

By the end of the last decade, A’s and B’s represented 73 percent of all grades awarded at public schools, and 86 percent of all grades awarded at private schools, according to the database compiled by Mr. Rojstaczer and Mr. Healy. (Mr. Rojstaczer is a former Duke geophysics professor, and Mr. Healy is a computer science professor at Furman University.)

Southern schools have also been less generous with their grading than institutions in other geographic regions, and schools that focus on science and engineering tend to be stingier with their A’s than liberal arts schools of equal selectivity.

What accounts for the higher G.P.A.’s over the last few decades?

The authors don’t attribute steep grade inflation to higher-quality or harder-working students. In fact, one recent study found that students spend significantly less time studying today than they did in the past.

Rather, the researchers argue that grade inflation began picking up in the 1960s and 1970s, probably because professors were reluctant to give students D’s and F’s. After all, poor grades could land young men in Vietnam.

They then attribute the rapid rise in grade inflation in the last couple of decades to a more “consumer-based approach” to education, which they say “has created both external and internal incentives for the faculty to grade more generously.” More generous grading can produce better instructor reviews, for example, and can help students be more competitive candidates for graduate schools and the job market.

The authors argue that grading standards may become even looser in the coming years, making it increasingly difficult for graduate schools and employers to distinguish among excellent, good and mediocre students.

More disturbing, they argue, are the potential effects on educational outcomes.

“When college students perceive that the average grade in a class will be an A, they do not try to excel,” they write. “It is likely that the decline in student study hours, student engagement, and literacy are partly the result of diminished academic expectations.”


Economix: Keynesians Miss the Point, for Now


Casey B. Mulligan is an economics professor at the University of Chicago.

Our labor market has long-term problems that are not addressed by Keynesian economic theory. New Keynesian economics is built on the assumption that employers charge too much for the products that their employees make and are too slow to cut their prices when demand falls. With prices too high, customers are discouraged from buying, especially during recessions, and there is not enough demand to maintain employment.

When the financial crisis hit in 2008, the New Keynesian “sticky price” story had some plausibility because economic conditions were, in fact, deflationary (although I have my doubts about other aspects of their theory). That is, the demand for safe assets surged in 2008, which means that those assets had to become expensive or, equivalently, goods had to get cheaper in order to clear the market.

Normally the Federal Reserve could expand the money supply to satisfy the extra demand for safe assets, so consumer prices wouldn’t have to fall to maintain employment. But the financial crisis was so severe that the Fed’s best efforts were not enough.

At the time, New Keynesian fears seemed to have been realized: consumer prices had to fall to maintain employment, but too few employers were willing or able to cut their prices quickly enough. The result was going to be a severe recession that could be partly cured in the short term by fiscal stimulus or, in the longer term, as more companies had the time needed to cut their prices.

The red line in the chart below shows the consumer price index that, according to New Keynesian theory, was needed to maintain employment. I have rescaled the index to be based in December 2007, when the recession began: a value of 96 means consumer prices were 4 percent below what they were in December 2007.
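Rebasing an index so that December 2007 equals 100 is a one-line transformation; the CPI levels below are made up for illustration (the real series comes from the BLS):

```python
def rebase(series, base_key):
    """Scale a series so the value at base_key equals 100."""
    base = series[base_key]
    return {k: 100.0 * v / base for k, v in series.items()}

cpi = {"2007-12": 210.0, "2008-07": 218.4, "2008-12": 201.6}  # hypothetical levels
rebased = rebase(cpi, "2007-12")
print(rebased)  # a rebased value of 96 reads as "4 percent below December 2007"
```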

In theory, the index of consumer prices had to fall to eight points below its summer 2008 peak (of almost 104) to maintain employment. (I measure all consumer prices here, not merely the “core” price index that excludes fuel and many other items, because the excluded items provide jobs, too.)

The blue line shows actual consumer prices. We readily see the “downward pressure” on consumer prices at the end of 2008, because, in fact, prices stopped rising and actually fell a couple of percent.

But New Keynesians say that, to maintain employment, the blue line needed to fall as far as the red one, and a drop that large would take more time. In the meantime, employment would be low and employers would enjoy cheap labor for a while, as so many unemployed people were desperate to work.

But the availability of cheap labor would eventually give employers room to cut their prices – in theory the blue and red series would converge and employment would eventually return to previous levels.

Early on, I thought the New Keynesian theory was wrong because I didn’t see that employers perceived labor to be cheap. Federal law had increased minimum wages three times in and around the recession. A number of other public policies made labor more expensive. My fellow blogger Nancy Folbre has written that American labor looks increasingly expensive compared with potential workers abroad.

The price chart above shows little or no tendency for the blue series to converge with the red one, because, contrary to the theory, high unemployment rates have not caused employers to perceive labor as cheap.

The low employment rates we have today are too persistent to be blamed on price adjustment lags (I have similar reservations about another business-cycle theory: “job search” theory says that jobs are there to be found, but that unemployed people have not been lucky enough to look in the right places).

Our labor-market problems may not disappear by themselves and are not addressed by New Keynesian theory.


Economix: Human Capital Follows the Thermometer


Edward L. Glaeser is an economics professor at Harvard and the author of “Triumph of the City.”

Over the last decade, population growth in the fifth of American counties where January temperature averaged above 43 degrees was over 9 percent, while population growth in the fifth of American counties where January temperature averaged below 22 degrees was less than 2 percent. Population growth was over 13 percent in the fifth of counties where more than 21 percent of adults had college degrees in 2000, while growth in the least educated three-fifths of counties was below 3 percent.

The powerful pull of skills reminds us that human capital is the bedrock of local and national success. The message of the Sun Belt is more complicated: its success tells us a bit about the pleasures of warmth, a bit about the importance of natural resources and a bit about the impact of limited government.

Population data is from the 2000 and 2010 Census.
Skills data (share of 25+ population with a college degree) is from the 2000 Census.
January temperature comes from ICPSR (Interuniversity Consortium for Political and Social Research) Study No. 2896, “Historical, Demographic, Economic, and Social Data: The United States, 1790-2002,” by Michael R. Haines, which compiles data from various Census sources over many years.

The chart shows population growth across American counties between 2000 and 2010. I have ranked counties both by average January temperature and by share of the adult population with college degrees as of the year 2000. Each point represents one-tenth of America’s counties. The blue line shows the powerful connection between skills and population growth; the red line shows the also-strong connection between January temperature and population growth. Both trends represent longstanding patterns.
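Ranking counties into fifths by a variable and averaging growth within each fifth can be sketched as follows; all the data here is invented for illustration:

```python
def quintile(values):
    """Assign each value to a fifth: 0 = lowest fifth ... 4 = highest."""
    ranked = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    for rank, i in enumerate(ranked):
        bins[i] = rank * 5 // len(values)
    return bins

jan_temp = [15, 22, 28, 33, 38, 44, 47, 52, 58, 63]            # hypothetical degrees F
growth = [1.2, 0.8, 2.1, 3.0, 4.2, 6.5, 8.8, 9.4, 11.0, 12.6]  # hypothetical % growth

bins = quintile(jan_temp)
for b in range(5):
    members = [g for g, k in zip(growth, bins) if k == b]
    print(b, sum(members) / len(members))  # average growth within each fifth
```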

Last week, I discussed a new paper of mine jointly written with Giacomo Ponzetto and Kristina Tobio, looking at population growth over the last two centuries. Our longer-run investigation focused on counties in the eastern United States, roughly bordered by the Mississippi, in order to focus on an area that was populated at the time of the Civil War. For more recent decades, we also look at metropolitan areas throughout the United States.

We cannot look at the correlation between growth and historic skills — at least as typically measured by the share of the population with college degrees — over very long time horizons, because it was only in 1940 that the Census began measuring educational attainments at the county level. We are therefore limited, like earlier researchers, to looking at 1940 education levels. This is somewhat problematic since growing areas might have attracted more educated people.

Looking only at counties in the eastern United States, we find that education as of 1940 is essentially unrelated to county growth during the 19th century. Perhaps education in 1940 is not very correlated with the relevant skill level as of 1830 or 1860, or perhaps skills just weren’t important in generating local success during the era of the railroad and the mechanical reaper.

Starting in 1900, however, skills as of 1940 predict faster county population growth during eight of the next 10 decades. During the decades when skills don’t predict county growth in the eastern United States — the 1970s and the 1990s — skills are still powerful predictors of metropolitan-area and city population growth across the entire nation.

Our paper then seeks to understand why skills have predicted metropolitan-area population growth throughout the United States since 1970. In line with previous papers on this topic by Jesse Shapiro, we find that the skills-growth link occurs primarily because more educated places have become steadily more productive. In the West, there is also some evidence that skilled areas have quality-of-life amenities that people increasingly value, but that isn’t true across the nation more generally.

The impact of sunshine, as measured by average January temperature, on population growth is more complicated. From the 1790s to the 1860s, population grew more quickly in the colder counties east of the Mississippi, which helped ensure that the Union enjoyed a healthy demographic advantage at the start of the Civil War. But after 1870, warmer states grew more quickly for four decades. Between 1870 and 1910, the population of the South increased by nearly 140 percent, while the population of the Midwest increased by 130 percent and the Northeast increased by 110 percent. The growth was particularly strong in the least dense Southern states, which may reflect the increasing spread of railroads into those areas after the Civil War.

But after 1910, the connection — within Eastern counties — between January temperature and population growth disappeared until the 1960s. During those decades, the Great Lakes areas expanded. Areas that had once been centers for shipping natural resources grew as great manufacturing hubs. Proximity to the Great Lakes predicts population growth before 1870 and then after 1910, but not between. Chicago, Cleveland, Detroit and their surrounding areas all boomed as Americans moved from farms to factories.

After 1960, trends changed again. The South came roaring back after World War II and the Great Lakes region became known as the Rust Belt. While the pre-World War II South was hardly known for being friendly to outsiders, the post-World War II South adopted right-to-work laws that, together with its lower costs, helped lure manufacturing. The decline of Jim Crow, a victory for racial justice, also made the South more politically competitive and less frightening to outside investors.

Over the last decade, January temperature has continued to predict growth throughout the entire United States. This connection reflects economic productivity, but also the ease of construction in less regulated areas. Atlanta, Dallas, Houston and Phoenix have grown more than any other metropolitan areas since 2000 because they combined economic productivity with a regulatory environment that encouraged, rather than stifled, new building.

Sun and skills are not opposites. There are plenty of skilled metropolitan areas, like Atlanta and Charlotte, in the Sun Belt. But at the extreme, sunshine and skills do represent two models of American success. Colder, skilled areas, like Boston, succeed because education makes up for a difficult regulatory environment, especially toward new construction. Warmer, less skilled areas succeed because limited regulation and natural resources make up for limited human capital.

For America to be successful in the 21st century, it is going to need the power inherent in both Houston and Boston. It will need unleashed human capital, and that’s why our nation needs to invest heavily in our children and in policy reforms that will make entrepreneurship easier and less expensive.
