Have Prison Terms Kept Steady?
That's the claim of scholar John Pfaff, and it's rather intriguing considering all the attention lavished on mandatory minimums. (He says the rise in incarceration can instead be blamed on prosecutors' becoming much more likely to file felony charges after arrests.) The theory has been getting a lot of attention lately, including from David Brooks.
Here's the main contention from one of Pfaff's papers:
In practice, sentence lengths have generally remained relatively short, and evidence suggests that sentence lengths do not explain much of the increase in the U.S. prison population. For example, I have shown that in eleven predominantly northern states (chosen solely due to limitations in the data) median time spent in prison hovered around one year from the late 1980s through the early 2000s, with lows of six months in states like California and Illinois.
Furthermore, in a recent paper I demonstrate that these findings are generalizable to the country as a whole. Moreover, data from these states clearly demonstrate that trends in admissions, not releases, drove their prison growth. Changes in sentence lengths had no noticeable effects on prison populations in these states, but prison populations in all eleven states would have flattened, and sometimes even fallen, by the mid- to late-1990s had admissions levels not grown. There is actually a fairly simple way to show that increased severity is unlikely to be the primary engine of population growth. Figure 3 plots annual admissions and releases from prison. If sentences were getting significantly longer, we should expect to see the dotted releases line grow more slowly relative to the solid admissions line — the gap between the two should widen. But except for a brief period in the early 1990s, that simply does not happen; as we enter the 2000s, the gap actually narrows.
Here's the chart (and here's the paper it comes from, which features some interesting simulations):
Pfaff is a law professor and I'm a journalist, and I'm more than willing to accept that aggressive prosecutors played a huge role. But I'm hesitant to dismiss the role of sentence length entirely.
First, his findings don't jibe with the (admittedly imperfect) official statistics collected by the Justice Department. The DOJ has numbers on the time served by people released from state prisons for the first time (as opposed to returning after being released to community supervision). These are broken down into a whopping 31 categories, from the broad (violent offenses) to the specific (nonnegligent manslaughter). Between 1993 and 2009, the mean time served for all offenses climbed from 21 to 29 months, an increase of almost 40 percent, which is comparable to the rise in incarceration during that period. Strikingly, there was an upward trend in each of these measures individually as well. Here's all the data dumped onto one messy, ugly graph with the 1993 numbers set to 1; the crime with the weird dip in the middle is "unspecified homicide":
(On a side note, I do think we should focus on the mean time served, not the median. The mean counts every year served equally, and thus is pulled upward if a few people serve very long sentences. We want to factor in that skew if we're concerned about our total prisoner count, because someone serving a long sentence will contribute to the incarceration rate for a longer period of time.)
You can see time-served data that reaches farther back here from Vox, via the National Academies. The National Academies report cites an estimate splitting the blame for the rise of incarceration in state institutions about 50-50 between longer sentences and higher admissions, with admissions playing a bigger role during the '80s and sentences pulling more weight during the '90s. Pew also found a 36 percent increase in time served from 1990 to 2009.
How to reconcile this with Pfaff's chart, which shows no widening gap between admissions and releases? Here's my theory: Precisely because most sentences are short, changes to sentences won't show up as a gap for very long. Imagine 100 people commit a crime every year and are sentenced to two years each. Every year, 100 people will be admitted and 100 more — those who committed the crime two years earlier — will be released. If you raise the sentence to three years, no one will get out two years later, but the following year you'll be right back to 100 people being admitted and 100 people released (those who committed the crime three years before). And yet the number of people imprisoned for the crime will have permanently grown by 50 percent right along with the sentence. At any given time you'll be imprisoning three years' worth of offenders instead of two years'.
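For readers who want to check the arithmetic, here's a minimal simulation of that thought experiment — a fixed flow of 100 offenders per year, with the sentence raised from two to three years partway through. All the numbers are illustrative, not real corrections data:

```python
# Toy model of the thought experiment above: constant admissions,
# with the sentence length raised from 2 to 3 years in year 10.

def simulate(years, admissions_per_year, sentence_for_year):
    """Track admissions, releases, and the standing prison population."""
    population = 0
    releases_due = {}  # year -> number of people scheduled for release
    history = []
    for t in range(years):
        sentence = sentence_for_year(t)  # applies to this year's admissions
        releases_due[t + sentence] = (
            releases_due.get(t + sentence, 0) + admissions_per_year
        )
        released = releases_due.get(t, 0)
        population += admissions_per_year - released
        history.append((t, admissions_per_year, released, population))
    return history

history = simulate(
    years=20,
    admissions_per_year=100,
    sentence_for_year=lambda t: 2 if t < 10 else 3,
)

for t, admitted, released, pop in history:
    print(f"year {t}: admitted {admitted}, released {released}, population {pop}")
```

The output shows exactly the pattern described: the population sits at 200 (two years' worth of offenders), releases drop to zero for a single year after the sentence change, admissions and releases are back in balance the year after that, and the population settles permanently at 300 — a 50 percent increase with no lasting gap between the two lines.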
Simply put, the higher level of incarceration stays even when the admissions/releases gap closes back up. You don't need a widening gap, but just a sustained gap — which is what we see — to keep adding prisoners to the total, because the gap accumulates year after year.
Importantly, this gap can be created by rising admissions or by lagging releases, so it's not a very useful way to distinguish between the two. One quick-and-dirty test, however, might be to look at each year's gap during the period when the incarceration rate was growing fastest, and see how much of it is explained by the rise in admissions since the previous year. On average during the period 1980-2000, rising admissions explained about 55 percent of the gap each year: if the gap was 100, meaning the prison population grew by 100 that year, admissions had risen by only 55 since the year before, leaving a whole lot that might result from longer sentences. And some admissions growth is just U.S. population growth, and therefore doesn't contribute to a rising per-capita rate.
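The quick-and-dirty test above amounts to a few lines of arithmetic. Here's a sketch of the calculation using made-up numbers purely for demonstration — these are not the DOJ figures the article discusses:

```python
# Hypothetical annual admissions and releases, chosen only to
# illustrate the decomposition; not real Justice Department data.
admissions = [200, 230, 255, 275, 290]
releases   = [190, 200, 215, 235, 255]

shares = []
for t in range(1, len(admissions)):
    gap = admissions[t] - releases[t]          # that year's net prison growth
    rise = admissions[t] - admissions[t - 1]   # rise in admissions since last year
    shares.append(rise / gap)

avg_share = sum(shares) / len(shares)
print(f"average share of the gap explained by rising admissions: {avg_share:.0%}")
```

Whatever the average comes out to, the remainder of each year's gap is growth that admissions alone can't account for — the residual that longer stays behind bars would help explain.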
There's obviously no question that admissions rose, as seen in Pfaff's chart above. But he may overstate his case when he minimizes the additional role played by longer sentences.
Robert VerBruggen is editor of RealClearPolicy. Twitter: @RAVerBruggen