# More Thoughts on Minimum Wage

In part inspired by Kevin Erdmann's **comment** on my previous post, I've done a little more work with the data on teen unemployment and minimum-wage hikes. I previously noted that teen unemployment and total unemployment track each other very well; in this post I'll try to tease out any changes to that relationship that occur during minimum-wage hikes. The short answer is "kinda, but not really," and I think **this post** from Arin Dube does a good job of explaining why it's hard to analyze the minimum wage using nothing but national trendlines.

It's about to get boring up in here. And of course, the usual caveat is required: I was a journalism major and I'm using spreadsheets. I'll update this post with any necessary corrections, and you can see my spreadsheet **here**.

Here is a graph I created using monthly unemployment data since 1948. As you can see, the overall unemployment rate is a great predictor of teen unemployment. The regression line running through the data is y=2.14x+3.74. This indicates that when the overall rate goes up by a point, the teen rate goes up by 2.14 points. The line explains about 83 percent of the variance in the data.
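For anyone who wants to replicate the exercise without a spreadsheet, a fit like this takes only a few lines of Python. The numbers below are a small made-up series, not the actual BLS data, so they only roughly mimic the post's slope and intercept:

```python
import numpy as np

# Hypothetical illustration -- NOT the actual monthly BLS series:
# overall unemployment rates paired with teen unemployment rates.
overall = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0])
teen    = np.array([12.1, 14.5, 16.9, 18.6, 21.3, 22.8])

# Ordinary least-squares fit: teen = slope * overall + intercept
slope, intercept = np.polyfit(overall, teen, 1)

# R-squared: share of the variance in the teen rate the line explains
predicted = slope * overall + intercept
ss_res = np.sum((teen - predicted) ** 2)
ss_tot = np.sum((teen - teen.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

print(round(slope, 2), round(intercept, 2), round(r_squared, 3))  # 2.16 3.66 0.995
```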

Erdmann writes that minimum-wage hikes dramatically decrease employment starting about three months before they go into effect and continuing about 27 months afterward -- by the end of that period, teen employment is still 6 percent lower than it would have been if the previous trend had continued, and then it rebounds. Since teens are **5 percent of workers but around 30 percent of minimum-wage earners**, an effect this big might show up as an increased gap between teen and total unemployment; the normal disparity is 2.14:1, but disparities driven by minimum-wage workers' losing their jobs should be more like 6:1. That is, the teen unemployment rate during these periods should run higher than the rate predicted by the formula above, and perhaps the relationship should have a steeper slope.
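The 6:1 figure is just the ratio of those two shares; as a quick sanity check:

```python
# Back-of-envelope for the 6:1 disparity: if job losses fall on
# minimum-wage earners, teens absorb them in proportion to their share
# of that group (~30%) rather than their share of all workers (~5%).
teen_share_of_min_wage = 0.30
teen_share_of_workers = 0.05
implied_disparity = teen_share_of_min_wage / teen_share_of_workers
print(round(implied_disparity, 1))  # 6.0
```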

So, I pulled all of the data points from the seven time periods Erdmann focuses on. Using the formula, the average teen unemployment rate during these periods should have been 15.8 percent. In fact it was 15.9 percent. And if I create a new regression line based just on the data that's affected by minimum-wage hikes, the relationship is y=1.85x+5.51 (explaining 76 percent of the variance). Not only is this a very similar line in the range we're talking about, but, far from tending to run above the line representing the overall data, the line derived from minimum-wage-hike data *intersects* with it pretty close to the perfectly average data point (which is roughly [6,16] in both data sets). The slope is less steep, too; the line is red here:
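The intersection is easy to verify directly from the two fitted equations:

```python
# Solving 2.14x + 3.74 = 1.85x + 5.51 for the point where the
# overall-data line meets the wage-hike-period line.
slope_all, int_all = 2.14, 3.74      # regression over all months
slope_hike, int_hike = 1.85, 5.51    # regression over wage-hike periods

x = (int_hike - int_all) / (slope_all - slope_hike)
y = slope_all * x + int_all
print(round(x, 1), round(y, 1))  # 6.1 16.8 -- near the average point [6, 16]
```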

In other words, the relationship between total unemployment and teen unemployment looks pretty normal during minimum-wage hikes.

(I also tried splitting the data in two instead of comparing the wage-hike periods to the overall period. For the non-wage-hike periods the line is 2.23x+3.18, explaining 85 percent of the variance. The gap widens slightly: that formula predicts an average teen rate of 15.7 percent during the wage-hike periods, vs. the actual 15.9 percent.)

[*Update*: It also occurred to me that by lumping all the wage-hike eras together, I might have obscured trends within each period -- a **Simpson's paradox** kind of situation. Here are the regression lines and R-squareds for the seven periods, in consecutive order: 1.15x+6.27, .60; 1.33x+8.39, .18; 1.84x+5.81, .28; 1.40x+8.15, .93; 2.58x+1.05, .92; 2.68x+2.57, .71; 1.66x+8.59, .93. Obviously, the relationship between teen and overall unemployment is *much* stronger in some periods than others, but nothing else jumps out at me here.]

There's another possibility here: These two data sets might reflect opposite minimum-wage effects that happen to produce similar regression lines. Maybe the minimum wage abruptly decreases employment during the times it's in flux and then gradually increases employment as inflation eats away the wage's real value. Perhaps we should look for a more general trend.

So -- even though I am completely unqualified to do this -- I started with a **list** of the real value of each minimum-wage hike and then adjusted (admittedly crudely) each subsequent month for inflation based on the **average annual inflation rate** of the decade it fell in. I then did a multiple regression, using the real minimum wage and the overall unemployment rate as X variables and the teen unemployment rate as the Y variable -- basically, looking to see if the minimum wage explains anything about teen unemployment that the overall unemployment rate doesn't.
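A multiple regression of this kind can be sketched as follows. The series here are synthetic, built from a known rule whose coefficients I chose only for illustration (they roughly echo the magnitudes discussed in this post), so the fit recovers them exactly; the real data would of course be much noisier:

```python
import numpy as np

# Synthetic monthly series -- NOT the actual data. The teen rate is
# generated from a known rule so we can confirm the regression finds it.
overall = np.array([5.0, 6.0, 7.0, 8.0, 6.0, 7.0, 8.0, 9.0])  # overall rate, %
real_mw = np.array([7.0, 6.8, 6.6, 6.4, 7.5, 7.3, 7.1, 6.9])  # real min. wage, $
teen = 3.7 + 2.1 * overall + 0.1 * real_mw                    # known rule

# Design matrix with a constant term: teen = b0 + b1*overall + b2*real_mw
X = np.column_stack([np.ones_like(overall), overall, real_mw])
b0, b_overall, b_mw = np.linalg.lstsq(X, teen, rcond=None)[0]

# b_mw is the estimated change in the teen rate per extra real dollar of
# minimum wage, holding the overall unemployment rate fixed.
print(round(b0, 2), round(b_overall, 2), round(b_mw, 2))  # 3.7 2.1 0.1
```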

The results: The regression still explains 83 percent of the variance -- but the coefficient on the minimum wage is statistically significant. It suggests that, after accounting for changes in the overall unemployment rate, raising the wage by a dollar is associated with an increase in the teen unemployment rate of one-tenth of a percentage point. So, nothing huge, but something, I guess.

Erdmann says that employers start responding to wage hikes three months in advance, so I also tried pulling all of the real-minimum-wage data up by three months. Essentially the same result (actually slightly weaker).
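Mechanically, that three-month shift just means pairing each month's unemployment rates with the real minimum wage three months later (toy numbers again; the real exercise uses the full monthly series):

```python
import numpy as np

# Toy series -- not the actual data. Pair month t's teen rate with the
# real minimum wage at month t+3, since employers may respond in advance.
real_mw = np.array([7.25, 7.25, 7.25, 8.00, 8.00, 8.00, 8.00, 8.00])
teen    = np.array([15.0, 15.2, 15.6, 16.1, 16.0, 15.8, 15.7, 15.5])

led_mw   = real_mw[3:]   # wage level three months in the future
teen_now = teen[:-3]     # current teen rate, trimmed to match
print(len(led_mw), len(teen_now))  # 5 5 -- the shifted series stay aligned
```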

As I said in my earlier post, I tend to share Erdmann's instinct that the minimum wage costs jobs to some degree, and I've **pointed out before** that it's a pretty bad way of targeting poor people for aid. But there's a reason that high-caliber economists **disagree** on this topic: The effects are pretty small if they exist at all.

*Robert VerBruggen is editor of RealClearPolicy. Twitter: @RAVerBruggen*