Minimum Wage Effects with Non-Living Wages

I’m teaching “Economics for Non-Economists” this semester. This is an interesting experiment, and is strongly testing my belief that you can teach economics without mathematics so long as people understand graphs and tables. (It appears that people primarily learn how to read graphs and tables in mathematics-related courses. Did everyone except me know this?)

Since economics is All About Trade-offs, our textbook notes that minimum wage increases should also mean some people are not employed. Yet, as I noted to the students, in the past several decades, none of the empirical research in the United States shows this to be true. (From Card and Krueger (1994) to Card and Krueger (2000) to the City of Seattle, in fact, all of the evidence has run the other way, as noted by the Forbes link.)

Part of that is intuitive. If you’re running a viable business and able to generate $50 an hour, it hardly makes sense not to hire someone for $7.25, or even $9.25, to free up an hour of your time. The tradeoff is that your workers make more and your customers can afford to pay or buy more. Ask Henry Ford how that worked for him.
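The arithmetic behind that intuition is a one-liner. Here is a minimal sketch using the figures from the paragraph above (the $50 an hour of value generated is the hypothetical; the function name is mine):

```python
# Employer's hourly surplus from hiring at a given wage,
# using the hypothetical numbers from the paragraph above.
VALUE_GENERATED_PER_HOUR = 50.00

def hourly_surplus(wage: float) -> float:
    """What the owner keeps per hour after paying the worker."""
    return VALUE_GENERATED_PER_HOUR - wage

for wage in (7.25, 9.25):
    print(f"wage ${wage:.2f}/hour -> surplus ${hourly_surplus(wage):.2f}/hour")
```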

The generic counterargument (notably not an argument well-grounded in economic theory) was summarized accurately by Tim Worstall in one of his early attempts to hype the later-superseded initial UW study for the Seattle Minimum Wage Study Team.

[T]here is some level of the minimum wage where the unemployment effects become a greater cost than the benefits of the higher wages going to those who remain in work.

This seems intuitive in the short term and problematic in the long term, even ignoring the sketchiness of the details and the curious assumption of an overall increase in unemployment (or at least underemployment) if you assume a rising Aggregate Demand environment. To confirm the assumptions would seem to require either a rather more open economy than exists anywhere or a rather severe privileging of capital over labor.*

On slightly more solid ground is the assumption that the minimum wage should be approximately half of the median hourly wage. But then you hit issues such as median weekly real earnings not having increased much in almost forty years, while pegging the minimum wage to half the median nominal wage suggests that the Federal minimum wage should be somewhere between about $12.75 and $14.25 an hour. (Links are to FRED graphics and data; per-hour derivations are based on the 35-hour work week standard for “full-time.”)
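To make that derivation concrete, here is a minimal back-of-the-envelope sketch, assuming the 35-hour “full-time” week noted above; the weekly-earnings figures are illustrative placeholders chosen to bracket the cited range, not actual FRED values:

```python
# Half-the-median benchmark: derive an hourly wage from median weekly
# nominal earnings at 35 hours, then take half of it.
HOURS_PER_WEEK = 35  # the "full-time" standard used above

def implied_minimum_wage(median_weekly_nominal: float) -> float:
    """Half of the median hourly wage implied by weekly earnings."""
    median_hourly = median_weekly_nominal / HOURS_PER_WEEK
    return median_hourly / 2

# Weekly medians of roughly $890-$1,000 bracket the $12.75-$14.25 range above.
for weekly in (892.50, 997.50):
    print(f"median weekly ${weekly:,.2f} -> implied minimum "
          f"${implied_minimum_wage(weekly):.2f}/hour")
```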

So all of the benchmark data indicates that reasonable minimum wage increases will have virtually no negative effect on employment, and none on established, well-managed businesses. The question becomes: why would that be so?

One baseline assumption of economic models is that working full-time provides at least the income necessary to cover basic expenses. Employment and Income models assume it, and it’s either fundamental to Arrow-Debreu or you have to assume that people (a) are not rational, (b) die horrible deaths, or (c) both.

If you test that assumption, it has not obviously been so for at least 30 years:


The last two increases of the Carter Administration slightly lag inflation, but they are during a period of high inflation as well; the four-year plan may just have underestimated the effect of G. William Miller. (They would hardly be unique in this.)

By the next Federal increase, though—more than nine years of inflation, major deficit spending, a shift to noticeably negative net exports, and a couple of bubble-licious rounds of asset growth (1987, 1989) later—the minimum wage was long past the possibility of paying a living wage, so any relative increase in it would, by definition, increase Aggregate Demand as people came closer to being able to subsist.

The gap is greater than $1.50 an hour by the end of the 1991 increase. The 1996-1997 increase barely manages to slow the acceleration of the gap (to nearly $1.70), and the ten-year pause in increases that follows requires three 70-cent increases just to get the gap back down to $1.86 by their end in 2009.

Nine years later, almost another $1.50 has been eroded, even in an inflation-controlled environment.
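The gap discussed above is measured against a subsistence wage, but even the pure-inflation component of the erosion can be sketched quickly. A minimal sketch, assuming a nominal minimum frozen at $7.25 since 2009 and an illustrative 1.8 percent annual inflation rate (an assumption, not an official CPI figure):

```python
# Purchasing power of a frozen nominal minimum wage, in start-year dollars.
NOMINAL_MINIMUM = 7.25            # unchanged since July 2009
ASSUMED_ANNUAL_INFLATION = 0.018  # illustrative rate, not an official series

def real_value_after(years: int) -> float:
    """Deflate the frozen nominal wage by compound inflation."""
    return NOMINAL_MINIMUM / (1 + ASSUMED_ANNUAL_INFLATION) ** years

for years in (0, 5, 9):
    real = real_value_after(years)
    print(f"after {years} years: ${real:.2f} in 2009 dollars "
          f"(erosion ${NOMINAL_MINIMUM - real:.2f})")
```

This captures only general price inflation; the gap in the text is measured against subsistence costs, which the sketch does not model.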

Card and Krueger, in the context of the increasing gap between “making minimum wage” and “making subsistence wage,” appear to have discovered not so much that minimum wage increases are not negatives to well-run businesses as that any negative impact of an increase, under the condition that the minimum wage does not provide for subsistence income, will be more than ameliorated by the increase in Aggregate Demand at the lower end.

My non-economist students had very little trouble understanding that.

*The general retort of “well, then, why not $100/hour” would create a severe discontinuity, making standard models ineffective in the short term and requiring recalibration to estimate the longer term. Claiming that such a statement is “economic reality,” then, is empirically a statement of ignorance.