New Driverless Car Rules Will Stifle Innovation, Cost Lives

Grant Broadhurst - September 23, 2016

Three numbers: 35,200 people were killed in auto accidents last year; 94 percent of car crashes are due to human error; 613,501 lives have been saved by advances in auto safety over the past 50 years. These numbers form the basis of the National Highway Traffic Safety Administration head’s argument for autonomous vehicles and a friendly regulatory environment.

Ironically, though, the National Highway Traffic Safety Administration (NHTSA) is also considering premarket approval and post-sale regulations that would restrict the development and improvement of autonomous vehicles even more than “dumb” vehicles, potentially leading to the unnecessary loss of life.

In a speech on Monday at the Automated Vehicles Symposium in San Francisco, NHTSA Administrator Mark Rosekind said that his agency’s goal is to create “a framework that will speed the development and deployment of technologies with significant lifesaving potential.” However, the very next day, his agency released the long-promised NHTSA guidelines for autonomous vehicles, proposing two new authorities that would do the exact opposite. These new authorities are only options, and the NHTSA is seeking public comment.

The first proposal, the “Considered New Authority” of premarket approval, would require manufacturers to have their models approved before they hit showrooms — a departure from the current process of self-certification. A premarket approval process, the guidelines say, would help the public accept autonomous vehicles. However, this is a long-term solution to a short-term problem, and this new authority goes against not only Rosekind’s own expressed approach but also the way automobiles are made.

“If we wait for perfect, we’ll be waiting for a very, very long time,” Rosekind said of autonomous vehicle technology in general. “How many lives might we be losing while we wait?”

The problem is that approving every single model for every single manufacturer would be a monumental task — and a slow one. Do we really want an FDA-style premarket approval process when delays could cost lives? (Look what’s happened with EpiPens.)

Moreover, models don’t just change every 12 months. Toyota makes thousands of improvements to its manufacturing processes every year, and manufacturers regularly tweak and improve their models. Even the parts themselves come from thousands of suppliers, each of which should be free to improve. Given that autonomous vehicles rely on software, manufacturers need the capability to implement changes swiftly, up to the moment of release.

The NHTSA is also considering establishing an authority to regulate post-sale software updates and is even considering “new measures and tools” such as prerelease simulation. At the moment, a company like Tesla can send software updates over the air — which Tesla did a week ago, making over two hundred enhancements of varying importance. Rosekind saw this as a positive development, since it means that safety can be continuously improved.

However, the need for up-to-the-minute updates not only illustrates why a premarket approval process for software would be unsound but also calls into question the wisdom of heavily regulating post-sale software enhancements. If the NHTSA decides to regulate post-sale updates, its regulations should come in the form of self-certifications and post-release assessments. A pre-release approval process for security updates makes no sense.

Rosekind was right when he said, “technology is changing so rapidly that any rule we write today would likely be woefully irrelevant by the time it took effect years later.” Let’s just hope that the actual regulations will reflect this reality.

If not, the NHTSA could undermine its own mission, and the highway death toll will remain at its current high level.

Grant Broadhurst’s work has appeared in The American Spectator and Watchdog News. He graduated summa cum laude from the University of North Florida and is a Young Voices Advocate. Find him on Twitter: @GWBroadhurst 

The Risks of Ignorance in Chemical and Radiation Regulation

James Broughel & Dima Yazji Shamoun - September 21, 2016

The Nuclear Regulatory Commission sought comments last June on whether it should switch its default “dose-response model” for ionizing radiation from a linear no threshold model to a hormesis model. This highly technical debate may sound like it has nothing to do with the average American, but the Nuclear Regulatory Commission’s (NRC) decision on the matter could set the stage for a dramatic shift in the way health and environmental standards are set in the United States, with implications for everyone.

Regulators use dose-response models to explain how human health responds to exposure to environmental stressors like chemicals or radiation. These models are typically used to fill gaps where data is limited or non-existent. For example, analysts might have evidence about health effects in rodents that were exposed to very high doses of a chemical, but if they want to know what happens to humans at much lower exposure levels, there might not be much available information, for both practical and ethical reasons. 

The linear no threshold (LNT) model has a tendency to overestimate risk because it assumes there’s no safe dose — or “threshold” — for an environmental stressor. (We discuss the LNT model in our new Mercatus Center research, “Regulating Under Uncertainty: Use of the Linear No Threshold Model in Chemical and Radiation Exposure.”) The response (cancer, in most cases) is assumed to be proportional to the dose at any level, even when exposure is just a single molecule. LNT is popular with regulators in part because of its conservative nature. When setting standards, the logic goes, better to be safe than sorry. That is, it’s better to assume that there is no threshold and be wrong than to assume a safe dose exists when one does not.

But does the use of the LNT model really produce the “conservative” results its proponents claim? There are very good reasons to doubt it.

The first is that there are no absolute choices; there are only tradeoffs. Regulations that address risk induce behavioral responses among the regulated. These responses carry risks of their own. For example, if a chemical is banned by a regulator, companies usually substitute another chemical in place of the banned one. Both the banned chemical and the substitute carry risks, but if risks are exaggerated by an unknown amount, then we remain ignorant of the safer option. And because LNT detects — by design — low-dose health risks in any substance where there is evidence of toxicity at high doses, businesses are led to use newer, not-yet-assessed chemicals.

The economic costs of complying with regulations also produce “risk tradeoffs.” Since compliance costs are ultimately passed on to individuals, lost income from regulations means less money to spend addressing risks privately. When their incomes fall, people forgo buying things such as home security systems, gym memberships, healthier food, new smoke detectors, or safer vehicles. And when regulators inflate publicly addressed risks but leave private risks unanalyzed, it becomes impossible to weigh the pros and cons of public versus private risk mitigation.

But the most compelling reason to doubt that LNT is a “conservative” standard is simply that it’s likely to be wrong in so many cases. The assumption that “any exposure” causes harm is contradicted not only by common sense, but by a growing body of research. In the decades since LNT was first adopted by regulatory agencies, more and more evidence supporting a threshold — or even a “hormetic” — model of dose response has been found.

Hormesis occurs when low doses of exposure actually cause beneficial health outcomes, and, coincidentally, the scientific evidence for hormesis appears strongest in the area where the LNT was first adopted before its use spread to other areas: radiation. For example, low doses of radiation exposure have been shown to have protective effects against kidney damage in diabetic patients, and low doses of X-rays have been associated with an anti-inflammatory response to treat pneumonia. There is now evidence of hormesis in hundreds of experiments, but the LNT rules out — by assumption — the possibility of these kinds of beneficial health responses.

Unfortunately, the way regulators typically respond to these problems is simply by ignoring them. Hence a better moniker for the use of the LNT model might be “Ignorance Is Bliss.” So long as regulators ignore the inconvenient truths posed by the possibilities of hormesis and risk tradeoffs, they can continue going to work every day maintaining the belief they are protecting public health. But the uncertainty in their risk assessments is so great that, in fact, regulators often have no idea whether they’re improving public health or doing just the opposite.

A reconsideration of the LNT is long overdue. At the very least, risk analysts should characterize uncertainty using multiple dose-response models — including a threshold model or a hormetic model — when no single model has the overwhelming support of the scientific evidence. And analyzing risk tradeoffs should be a routine part of rulemaking.

The NRC should be commended for acknowledging the doubts about the LNT. When the time comes for the agency’s decision, let’s hope they choose knowledge over ignorance.

James Broughel is a research fellow with the Mercatus Center at George Mason University. Dima Yazji Shamoun is the associate director of research at the Center for Politics and Governance and a lecturer at the economics department at the University of Texas at Austin. They are coauthors of new Mercatus Center research on “Regulating Under Uncertainty: Use of the Linear No Threshold Model in Chemical and Radiation Exposure.”

Derek Cohen & Randy Petersen - September 17, 2016

The classic Daniel Patrick Moynihan quote that “everyone is entitled to his own opinion, but not his own facts” is an important maxim in public policy debates. This is doubly so in criminology, where billions of dollars, quality of life in communities, and — most importantly — the very safety of law-abiding citizens rest on policymakers getting it right.

But despite the facts, critics still maintain that Texas’ criminal-justice reforms have failed to reduce crime and recidivism. The reforms at issue began with a spate of legislation passed during the 80th Texas Legislature in 2007, including a sweeping reorganization of the state’s community correction system under HB 1678. Facing prison and jail capacity overruns with no space to house violent offenders, the legislature prioritized probation and parole for low-risk offenders. We’ve illustrated time and again that once these policies were in place, crime rates continued to fall in tandem with the reforms, despite similar protests from critics of the day that the opposite would happen.

For instance, in a recent Real Clear Policy op-ed, Sean Kennedy argues that despite Texas’ reform efforts, the re-arrest rate for state prisons and state jails (a Texas-specific type of short-term state incarceration facility) has not changed significantly since 2004. Even assuming that re-arrest rates are a good measure of recidivism, the problem with this argument is that the composition of the prison population before and after the reforms differs in important ways. Why? The 2007 reforms focused only on nonviolent and low-level offenders. (See graphic.)

Texas’ 2007 criminal-justice reforms shifted many non-violent offenders away from incarceration while increasing resources for proven rehabilitation programs and supervision, such as probation officers, resulting in improved public safety. 

It’s almost a cliché that limited prison capacity should be reserved for “those who we’re afraid of, not for those we are mad at” — that is, bed space should be prioritized for violent or high-risk offenders. This was not the case in Texas prior to 2007, when violent offenders made up only 22 percent of admissions to state facilities for violent, property, and drug offenses. By 2015, that share had grown to 27 percent, meaning the admitted population had become weighted considerably more heavily toward violent offenders. In fact, contrasting admissions from the two years, the only population that grew in raw terms was violent offenders.

Was this because Texas was suddenly beset with bands of violent super-predators marauding the state after having been given lenient sentences? Clearly not. Texas’ 2007 reforms didn’t address violent crimes. But had we kept the status quo, we likely would not have had the space to house new offenders. On pace, we would have been 11,464 inmates over operational capacity by 2010. (“Operational” capacity, as opposed to design capacity, means milking every last square foot of residential space in a facility.) Texas’ state beds now hold a greater number of violent offenders than they did before the reforms.

The bottom line is that Texas has successfully focused criminal-justice resources on violent offenders while diverting non-violent offenders — whenever appropriate — to alternatives to incarceration. The data show that the non-violent offenders who received probation or who were diverted to rehabilitation programs are now less likely to reoffend than before when they were housed with violent offenders in the general prison population. Texas’ criminal-justice reforms have made Texas safer. They are — and should remain — a model for the rest of the nation.

Derek Cohen is the Deputy Director of Right on Crime. During his Ph.D. coursework, he taught several undergraduate sections of criminal justice research methods and statistics. Randy Petersen is a senior researcher with the Right on Crime initiative and a veteran of 21 years of law enforcement as a sworn officer.

First-to-Market Battle May Prove Decisive for Autonomous Vehicles

Joshua Baca - September 16, 2016

Autonomous Vehicles (AVs) are no longer a figment of science fiction. They are the future. The technology behind the biggest transportation market disruptor in decades is advancing rapidly, with the “first to market” battle well underway. As the legislative and regulatory challenges in this emerging market are sure to intensify, whichever company can avoid the obstacles and cross the finish line first may reap all of the rewards. We could be on the cusp of the next Model T.  

Uber, in partnership with Volvo, is already deploying AVs on the streets of Pittsburgh. General Motors is working with Lyft on a fleet of driverless cars, and Ford has announced that its AVs will be on the market by 2021. Foreign automakers are also looking to penetrate the U.S. market: Audi is testing AVs in Washington, D.C., and Toyota is pledging to spend over $22 million to develop driverless vehicles. Given the nature of AVs and the inherent loss of control on the part of the “driver,” questions of trust, privacy, and safety will dominate the market.

In June 2016, DDC Public Affairs conducted an online survey of 500 registered voters, in partnership with Axis Research, to better understand the political environment surrounding AVs. Findings show that awareness of AVs is high, at 89 percent, while support is much lower: Only 24 percent feel strongly that AVs are a good thing for the future. Soft support creates an opportunity for voters to be swayed in either direction, and, as such, both auto manufacturers and technology companies have some work to do in order to establish a strong base.

The survey also found that the typical AV supporter is a high-income man between the ages of 35 and 54 — a group also more likely to speak out on the issue. The typical opponent, by contrast, is a woman over 55, a group less likely to speak out. This presents a challenge, as a cornerstone argument for AVs is their promise to provide increased mobility for groups such as seniors and women — both of whom, according to the survey, are hesitant to buy into the technology.

When asked which companies they trusted to bring this technology to market, voters gave domestic automakers the strongest support, with General Motors and Ford leading the pack. Trust in foreign automakers and in technology companies such as Google and Apple to develop this revolutionary technology was much lower. And ride-sharing companies Uber and Lyft, which have an increasingly visible stake in the industry, proved to be the least trusted, with only 9 and 5 percent of voter trust, respectively. This lack of independent support only underscores the importance of ride-sharing companies’ partnerships with auto manufacturers.

Barring federal preemption, state and municipal laws and regulations will dictate how AVs for consumer use are introduced and tested for the market. Seven states have enacted legislation or adopted regulations that govern the testing and use of AVs. These policies vary drastically: Some states, such as Michigan and Florida, are seen as AV-friendly, while others, such as California, are deemed much more restrictive. While 59 percent of survey respondents believe state officials should welcome AVs to market, they also see a need for additional regulations and for human passengers to be able to take over control.

AVs have a promising future. But despite the promise of increased mobility, safety, and environmental benefits, the emerging industry faces major challenges. Insurance groups, labor unions, and privacy groups are developing arguments that question AV safety, highlighting pitfalls in the technology and in actual market implementation. Our research shows voters are not yet convinced of the technology’s viability.

The AV industry was rocked when a recent report came out showing that autonomous technology was responsible for a fatal Tesla crash. An incident like that is exactly why average Americans are hesitant to embrace this futuristic technology. Removing the human element from the driving experience is a massive societal change; only time will tell if we are ready to embrace the accompanying challenges. In the meantime, the companies competing for this golden goose would be well advised to address the concerns of the general public and adapt accordingly. 

Joshua Baca is senior vice president of DDC Public Affairs, where he leads the company’s technology practice group. Formerly, Baca was National Coalitions Director for Governor Mitt Romney’s 2012 Presidential campaign. 

How to Make American Manufacturing Great Again

Susan Helper - September 15, 2016

Both the Republican and Democratic candidates for president claim to have plans to make manufacturing great again — but neither candidate goes far enough.

Donald Trump mistakenly says the United States doesn’t make anything anymore and promises to restore (somehow) thousands of high-paying manufacturing jobs to U.S. shores if he’s elected. Hillary Clinton wants to invest in training and technology for advanced manufacturing via grants and tax cuts. These are good ideas, but we can do more. 

While issues such as trade agreements and tax policy are certainly important, here are six additional points policymakers should consider in their efforts to create a better future for American workers.

1. Fewer Americans work in manufacturing than before — but the sector is regaining strength. Between 2000 and 2010, the U.S. manufacturing sector lost 5.8 million jobs — over one-third of all jobs in the sector. Since then, we’ve gained back more than 800,000 manufacturing jobs. Over half the value of all the manufactured goods we consume today in the United States is produced right here in our own country. 

2. Manufacturing jobs on average pay more than jobs in other sectors of the economy, but a significant percentage of jobs in manufacturing do not pay a living wage. Most manufacturing jobs pay well because the production process is capital intensive — meaning that most manufacturers depend upon highly skilled and motivated employees to develop advanced processes and keep expensive equipment up and running. On the other end of the scale, however, one-third of manufacturing production workers or their families are enrolled in public safety-net programs such as Medicaid or food stamps.

3. Neither these good jobs nor these low-paying jobs are inevitable. Manufacturers compete with each other using very different “production recipes.” Even within narrow industries, the top 25 percent of firms measured by compensation level pay more than twice as much per worker as the bottom 25 percent. The high-wage firms often can remain profitable because they adopt practices that yield high productivity — but only with a skilled and motivated workforce. These practices include increasing automation while having all workers participate in design and problem-solving. For decades, unions helped ensure both a supply of skilled workers and a fair distribution of the value they helped create; the decline of unions is an important factor facilitating the adoption of low-wage strategies by some employers.

4. Contrary to popular belief, gains in productivity can actually increase the number of jobs. It’s true that when productivity rises, fewer workers are required to make a given number of products. However, demand for those products usually rises with productivity. In fact, those manufacturing industries with greater productivity growth have often seen greater employment growth. Robots and other forms of automation are substituting for production workers, but new jobs are created designing and maintaining robots. Overall, manufacturing has a very large multiplier effect: A dollar more of final demand for U.S. manufactured goods generates $1.48 in other services and production — the highest multiplier of any sector.

5. Smart policy in other areas could increase the number of good manufacturing jobs. In the current environment where real interest rates (interest minus inflation) are actually negative, we can invest in areas of need, such as rebuilding our transportation, water, sewer, energy, and Internet infrastructure, with little or no cost of financing. Seriously fighting climate change would create a large number of manufacturing jobs, too, as we move toward “manufacturing” more of our energy from renewable sources such as wind and solar (instead of buying imported oil), and we invent new energy-efficient products, such as cars and appliances, which will have to be manufactured.  

6. Good jobs for ordinary workers do not have to be limited to manufacturing. Service jobs can also be organized to benefit greatly from skilled and motivated workers. Retailers like Trader Joe’s and Costco combine investment in their employees with low prices, financial success, and industry-leading customer service. As in manufacturing, these companies benefit from having well-trained, flexible workers who can shift with little supervision to do whatever is needed at the moment. Ultimately, companies create a virtuous cycle, paying higher wages, which increases workers’ loyalty and productivity, which, in turn, increases revenue and offsets higher compensation costs.

In debates such as this, which focus on the future of one sector of the economy, people often tend toward extremes — fearing that manufacturing employment will continue to shrink and eventually reach zero, or hoping to regain the millions of good jobs we lost and restore the sector to its former level. In reality, neither scenario is likely to occur. But good policymaking could bring us closer to the latter than the former.

Well-designed policies for job creation and innovation can have a positive, long-term impact on all sectors of our economy.

Susan Helper is the Frank Tracy Carlton Professor of Economics at Weatherhead School of Management, Case Western Reserve University. She served as the Chief Economist of the U.S. Department of Commerce from 2013-2015, and as a Senior Economist at the White House Council of Economic Advisors in 2012-2013.

The Trouble With Accountable Care Organizations

James C. Capretta - September 13, 2016

Dr. Ashish Jha did us all a favor recently by pulling back the curtain on the Obama administration’s recent press release touting the supposed success of the Accountable Care Organization (ACO) effort.

The administration claimed that ACOs operating under the Medicare Shared Savings Program (MSSP) — as opposed to the Pioneer ACO or “Next Gen” ACO demonstration programs — reduced Medicare’s costs in 2015 by $429 million. But that figure excludes the payments the federal government made to ACOs whose savings exceeded the threshold that made them eligible for bonuses. Include these bonus payments in the calculation, and the MSSP ACO program actually increased Medicare spending by $216 million in 2015 — a rather different bottom line from the one implied by the press release.
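
Spelled out, the arithmetic implied by those two figures: roughly $645 million in bonus payments, minus $429 million in claimed gross savings, leaves Medicare about $216 million worse off.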

Furthermore, as Dr. Jha notes, nearly as many MSSP ACOs lost money in 2015 as saved money. And the ones that saved enough money to be eligible for bonuses were concentrated in markets with high benchmarks, raising the possibility that only ACOs in excessively costly regions are able to reduce costs in any significant way.

The continuing underperformance of the MSSP ACO program is complicating the Obama administration’s preferred narrative of recent health-care history. The administration has gone to great lengths to suggest to the media that cost escalation is slowing down throughout the health sector, that the Affordable Care Act's (ACA) “delivery system reforms” are an important reason for this development, and that the ACO initiative is the most important of the ACA’s delivery system reforms.

Unfortunately for the administration, its explanation of the cost story doesn’t stand up to the slightest scrutiny. For starters, to the extent that there’s been a slowdown in cost growth over the past decade, it predates the enactment of the ACA by several years. And the Congressional Budget Office has estimated that the much-discussed “delivery system reforms” of the ACA are minor events at best, even if they work as planned. The biggest cuts in spending in the ACA aren’t from these provisions but from blunt, across-the-board cuts in Medicare that almost no one believes can be sustained over the long run.

As for the ACOs, if they’ve produced any savings at all — which is questionable — the total over the previous four years is less than a rounding error in the nation’s massive $3 trillion per year health system.

ACOs were conceived as an alternative to insurance-driven managed care. Prior to the ACA, Medicare beneficiaries already had the option to enroll in private insurance plans through Medicare Advantage (MA), including scores of HMOs with decades of experience in managing care. The authors of the ACA wanted to give beneficiaries another option: provider-driven managed care. ACOs must provide the full spectrum of Medicare-covered services, which means hospital and physician groups must work together to deliver that full range of care. But there’s no requirement for ACOs to accept a capitated payment or operate like an insurance plan.

The fundamental problem with the MSSP ACO effort is the method of beneficiary enrollment. The ACA stipulates that a beneficiary is to be assigned to an ACO if the beneficiary’s primary physician has joined the ACO. Beneficiaries don’t really have a say in the matter and are never really informed in a clear way about their assignment to an ACO. They are under no obligation to get care from the providers within the ACO’s network and can see any physician they want to under the usual rules of traditional fee-for-service Medicare. (The Next Gen ACO demonstration is testing the payment of incentives to beneficiaries for staying within the ACO network for care.)

This assignment of beneficiaries to ACOs has undermined the ability of the MSSP ACOs to operate like genuine managed-care entities. The patients have no incentive for, or interest in, complying with the plan’s effort to control costs, and many times the physicians don’t have any real idea who among their patients is in the ACO.

The recently enacted “doc fix” legislation, called the Medicare Access and CHIP Reauthorization Act, upped the ante on ACO coercion. In future years, physicians will get paid more by Medicare only if they join an alternative payment model, which effectively means that they will have to join an ACO to get any kind of reasonable increase in their fees. And when physicians join an ACO, their patients will automatically come with them.

The administration is hoping eventually to herd all of the nation’s physicians — and thus the vast majority of Medicare beneficiaries — into ACOs by effectively giving them no other choice. But this won’t lead to “delivery system reform” or a more efficient health system. Rather, it will lead to widespread resentment among physicians and beneficiaries alike because neither will have fully consented to participate in the ACO model. The result will be a care delivery system that is indistinguishable in reality from unmanaged and inefficient fee-for-service, albeit with lower payments from the government.

A better approach would be to trust the Medicare beneficiaries to make choices for themselves. ACOs should be converted and rebranded into genuine, provider-driven integrated delivery networks (IDNs), with less regulation by the government but stronger incentives to cut costs to attract enrollment. Beneficiaries would be given the option to enroll in competing IDNs, Medicare Advantage plans, or the traditional fee-for-service program and would pay higher premiums for enrolling in the more expensive options. Competition based on price and quality would push the IDNs and the MA plans to improve their performance each year.

The MSSP ACO program has been in place now for four years, which is long enough to see that it’s not going to deliver what was promised. The problem is fundamental: Managed care plans that are formed based on assignment of beneficiaries, rather than consumer choice, will never have the legitimacy that comes from a patient’s genuine consent to submit to far-reaching changes in the care-delivery process. A very different approach to provider-driven managed care is required for that — one that relies less on government regulation and more on strong competition in the marketplace to deliver higher-value care for patients.

James C. Capretta is a resident fellow and holds the Milton Friedman chair at the American Enterprise Institute.

Adam Andrzejewski - September 13, 2016

The mission of the U.S. Small Business Administration (SBA) is to lend to entrepreneurs with great ideas who can’t find financing in the private marketplace. The public image is one of financing the American Dream. But the reality is that the SBA is costly for taxpayers and — even worse — imposes a painful human cost on the workers it dislocates.

OpenTheBooks recently published its Snapshot Oversight Report, “Truth in Lending: The U.S. Small Business Administration’s $24.2 Billion Bad Loan Portfolio.” At Forbes, we documented $160 million in lending to private country clubs; $350 million in failed lending to just four hotel chains; $562 million lent to the convenience store/gas station industry — many of them franchises of Big Oil; and $2.2 billion in failed lending to restaurants, breweries, and wineries, including Quiznos ($58.1 million) and Cold Stone Creamery ($49.1 million).

In general, we found that when the SBA (or any agency) gets into the business of picking winners and losers, the losers always win. And there’s always a cost to taxpayers, either direct or indirect, because wasting scarce dollars on unproductive enterprises diverts them away from productive ones. It’s a rigged game in which the house — the politicians — always wins.

Adam Andrzejewski (say: And-G-F-Ski) is the CEO of OpenTheBooks.com, the world’s largest private database of government spending. Learn more at OpenTheBooks.com.

Farmers Deserve Free Markets, Not Handouts

Stephen Moore - September 10, 2016

Just when it seems federal spending couldn’t get more preposterous, Congress gives us the “cheese bailout.” In order to support dairy farmers, the feds are buying an estimated 11 million pounds of surplus cheese. The cost to taxpayers? About $20 million.

Buying cheese that nobody wants is just a small slice of federal risk-related subsidies for agricultural producers. Designed to serve as a “safety net” for farmers, these subsidies cost taxpayers about $15 billion a year. This federal intervention is both unnecessary and unaffordable.

Why not let the free market work here as in every other industry, from computers to cars? We can do this in a way that maintains the safety and reliability of our food supply, protects farmers from droughts and other natural disasters, and shields taxpayers from routine bailouts for special agricultural interests.

Let’s dispel some misconceptions about modern farming. The struggling family farmer of Steinbeck’s “The Grapes of Wrath” is not today’s reality. In most cases, taxpayers are subsidizing well-to-do farm households, with a median net worth of about $800,000 — 10 times greater than the median net worth of all U.S. households. Farm subsidies are Robin Hood in reverse: Tax the middle class to prop up the rich.

Moreover, large producers dominate the agricultural sector. The latest Census of Agriculture shows that about 4 percent of farms (those with sales of $1 million or greater) account for two-thirds of all agricultural sales, while more than half of all farms (54 percent) account for less than 1 percent. Not surprisingly, the subsidies meant to address agricultural risk generally go to large agricultural producers.

There’s no reason why agricultural producers can’t be expected to manage their risk — such as bad crop seasons — just like other businesses. Agricultural prices can be volatile, but so too are other commodities’ prices. The federal government doesn’t bail out oil producers, who have seen prices fall more than 60 percent in two years. Nor was the construction industry bailed out when the housing collapse threw thousands of contractors out of business.

After natural disasters, such as hurricanes, there’s no reason why farms can’t be treated like other businesses and receive the same types of assistance. In any case, our farmers and our financial markets are now far better able to adjust to and insure against bad weather and other contingencies. Still, Washington gives agricultural producers assistance whether crops are good (which means lower prices) or bad.

The $15-billion “safety net” consists of commodity programs (e.g., price and income supports) and the federally subsidized crop insurance program. And these programs create bad side effects. For example, agricultural producers “farm the subsidies,” basing their planting decisions on how to maximize the subsidies they receive rather than on how to meet the needs of consumers.

The subsidies can also distort prices. For instance, the federal sugar program restricts the sugar supply, driving up prices and imposing a hidden tax on consumers, estimated at over $3 billion a year. And subsidies can crowd out less expensive risk-management solutions, such as private insurance products.

A new Heritage Foundation report recommends moving away from harmful subsidies by phasing them out, giving farmers time to adjust. During the transition, subsidies would only protect farmers from deep crop losses connected to natural disasters and the like. While most farm safety-net programs should be eliminated, disaster assistance programs and a properly focused federal crop insurance program should remain, at least for now.

But it’s folly to try to justify existing policy as a “safety net.” Under the current program, farmers can receive payments just because they received slightly less revenue than they expected. They can have bumper crops and still collect indemnities from the federal crop insurance program. Heritage recommends ending these unjustified subsidies so that farmers are no longer insulated from market forces.

American farmers are productive and business savvy. Free trade and less regulation ‎would help them far more than handouts.

Taxpayers shouldn’t be in the business of ensuring that farmers are successful. For the good of consumers, farmers, and taxpayers, government should get out of the farming business.

Stephen Moore is a Distinguished Visiting Fellow at The Heritage Foundation, where he directs the think tank’s Project for Economic Growth.

Save Public Pensions With School Choice

Lewis M. Andrews - September 9, 2016

It’s no secret that America’s 564 state and local public pension plans are in serious trouble. Joshua Rauh of Stanford’s Hoover Institution puts the cumulative underfunding at $3.4 trillion. Less widely known is that the shape of a long-term cure for this deficit has already begun to emerge — along with the unexpected opportunity finally to give parents with children in poorly performing public schools the right to pick alternatives. 

How are these things related? Consider first the underlying pattern of recently negotiated settlements to salvage financially shaky public pensions. Although union officials are reluctant to concede a permanent trend, union members have nevertheless made a self-serving concession: allow new workers to be hired with 401(k)-style defined-contribution plans, which reduce taxpayers’ long-term pension obligations, in return for giving current and retired workers something close to what they were originally promised.

In 2011, Atlanta Mayor Kasim Reed and his city council negotiated a plan that saved $25 million in annual pension contributions, mostly through reduced pension payments for new hires. A year later, municipal officials and union representatives in Lexington, Kentucky came to a similar agreement, which was passed by the city in January of 2013.

One of the best-kept secrets about Detroit’s bankruptcy reorganization is that new hires who retire after 30 years will receive pensions costing 40 percent less in inflation-adjusted dollars than those who retired before the settlement. In total dollars, new workers contribute 10 times more to Detroit’s 2013 reorganization than do those grandfathered into defined benefit pensions.

Of course, with today’s low interest rates many pension funds are so far underwater that it’s not enough even to deny future workers the same defined benefits enjoyed by their predecessors. For instance, as part of their agreement in Lexington, active police officers and firefighters still had to give up 1 percent of their paychecks and accept lower cost-of-living increases. And with many pension trustees making riskier investments to compensate for the low return on bonds, even past contributions cannot be considered “safely in the bank.”

This is where school choice comes into play.

Widely seen by education reformers as a way to improve K-12 teaching through competition, voucher systems have a second and underappreciated advantage: They cost less than traditional public schools. And, if greatly expanded, voucher systems could free up more than enough savings to cover projected pension deficits without requiring either higher taxes or cuts to non-educational public services.

Consider that more than half a million schoolchildren are already enrolled in experimental voucher programs across the U.S. Several of these programs have been studied to determine their fiscal impact. In 2014, Jeff Spalding of the Friedman Foundation looked at the 10 largest to get the most accurate picture to date of their overall financial benefit.

Spalding’s study went well beyond comparing the face value of a private school voucher to the per-pupil cost at nearby public schools. He also took into account a number of complicating factors, including the fact that students already in a private or parochial school become eligible for newly established voucher programs; that voucher amounts vary across existing school choice programs; that learning-disabled students require special services; and that the sources of public-school funding differ state to state.

Accounting for all these variables, the average annual per-pupil savings for voucher programs turns out to be $3,400 — a figure that confirms the relative efficiency of school choice. But if we go one step further than Spalding and multiply his calculated savings by America’s 50.4 million schoolchildren, we find that vouchers have the potential to reduce public pension deficits at the rate of $171 billion annually.

That’s a $1 trillion reduction in less than six years.
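
The back-of-the-envelope arithmetic behind those figures: $3,400 in savings per pupil × 50.4 million schoolchildren ≈ $171 billion per year, and $1 trillion ÷ $171 billion per year ≈ 5.8 years.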

This does not mean that public union officials will eagerly embrace the idea of funding member retirements with savings from school choice. But what they have already signaled through their acceptance of 401(k) plans for new hires is that, when pushed to the wall to preserve promised benefits for current and past workers, they’re willing to negotiate the structure of public services going forward.

Furthermore, it’s possible to expand significantly the number of voucher programs in ways that minimize the immediate disruption to most of the nation’s public-school teachers. This may be accomplished by taking advantage of the sharp employment fall-off caused by retiring baby boomers and by focusing first on large cities with the worst-performing schools and most distressed pension funds.

For the present, government unions will continue to insist that both their K-12 education monopoly and past pension promises are sacrosanct. But as it becomes clear that reforming the former is the only way to guarantee the latter without risking a taxpayer backlash, one of these sacred cows will give way. The recent history of union bargaining tells us which one it will be. 

Dr. Andrews was executive director of the Yankee Institute for Public Policy from 1999 to 2009. He is the author of To Thine Own Self Be True: the Relationship between Spiritual Values and Emotional Health (Doubleday).

How Texas Reduced Both Crime & Incarceration

The Hon. Jerry Madden & Marc Levin - September 8, 2016

As the idiom “you can’t have your cake and eat it too” suggests, we’re properly conditioned to detect stories that are too good to be true. But Texas has proven it’s possible to have both much lower crime and a lower rate of imprisonment. Indeed, Texas’ FBI index crime rate, which accounts for both violent crime and property crime, has fallen more sharply than it has nationally, posting a 29 percent drop from 2005 to 2014, the latest full year for which official data is available.  

So how did Texas close three prisons while making its streets safer? By making alternatives to incarceration for nonviolent offenders more available and effective. The two of us had front-row seats to the action in 2007, when the chairman of the House Corrections Committee was tasked by Texas House Speaker Tom Craddick (R-Midland) with finding a way to protect public safety without building new prisons. We collaborated with then-Governor Rick Perry, Senate Criminal Justice Chairman John Whitmire, and other leaders to push through this transformation.

To understand how it worked, we must first examine the arc of the prison boom. One major reason the prison population in Texas and across the nation grew six-fold from the mid-1970s to the mid-2000s is that proven alternatives were scarce, with prison construction and operation costs consuming ever-greater portions of state criminal-justice budgets. As more money went into prisons, less and less was left for probation and other alternatives to incarceration. Texas showed the nation how to escape this cycle while improving public safety.

There are three main reasons prosecutors and judges might send someone to prison: 1) They believe imprisonment is the most just and effective sentence; 2) They’re required to do so by a statute that specifies a mandatory minimum prison term; or, 3) There are no proven and effective alternatives to prison available. The third reason is key to appreciating what Texas — and, subsequently, many other conservative states such as Georgia and South Carolina — achieved in both crime reduction and incarceration.

In January 2007, Texas projected that the state would need to allocate billions of dollars for the construction and operation of prisons, including more than 17,000 additional prison beds by 2012. Legislators also heard testimony from prosecutors and judges stating that low-risk, nonviolent offenders were often sent to prison for lack of effective alternatives. These criminal-justice professionals cited unwieldy probation caseloads along with lengthy waiting lists for drug courts and mental health treatment options, which made it difficult to supervise and treat offenders effectively.

Given Texas’ long tradition of prosecutorial and judicial discretion with little in the way of mandatory minimums, the path forward became clear. Texas policymakers worked with experts and across party lines to assemble a budget package that provided a historic expansion of diversion and treatment programs. The front-end items included 800 new residential substance abuse treatment beds and 3,000 more outpatient substance abuse treatment slots — all to be used as initial options after sentencing and for those whose addiction problems undermine their compliance with community supervision.

The other piece of the package involved the back end of the criminal-justice system. Lawmakers found that the Board of Pardons and Paroles had been paroling fewer inmates because they weren’t confident parole candidates were getting the necessary treatment in prison. The Board was also revoking an increasing number of parolees because it had few other options. In fact, thousands of inmates approved for parole had to be wait-listed for either halfway houses or in-prison treatment programs — conditions of their parole. The result? Overflowing prisons. So lawmakers filled the gap, adding 2,700 substance abuse treatment beds behind bars, 1,400 new intermediate-sanction beds (a 90-day program for probationers and parolees with technical violations such as missing appointments), and 300 halfway-house beds. They also capped parole caseloads at 75 to ensure closer supervision.

This was a major shift from the previous three decades, during which Texas lawmakers responded to projected increases in the need for prison space simply by building more lockups. As a result, the state’s prison population rose from 19,000 in 1975 to more than 155,000 in 2007. The new approach has exceeded expectations for both public safety and cost control. Rather than build more prisons, Texas has since closed three and is looking at additional closures as the population continues to shrink. Most importantly, crime has declined more in Texas than it has nationally or in states without significant criminal-justice reform programs.

An even better measure of the efficacy of these reforms is the performance of those placed on supervision. In 2007, 15.9 percent of probationers failed and were revoked to prison, a figure that fell to 14.7 percent by 2015. Thus, even as more nonviolent offenders were diverted to probation instead of prison, probation success rates climbed. This likely stemmed from a combination of improved supervision — for instance, the use of graduated sanctions such as curfews to promote compliance — and court officials’ assessment that many of these individuals could succeed given the right resources in the community. The gains in parole are even more impressive: Even with 11,000 more people on parole today than in 2007, more than 17 percent fewer crimes are being alleged against parolees now than then.

It’s true that the national index crime rate fell 20 percent from 2007 to 2014. But Texas did even better. And its 26 percent drop was effected without spending money on new prisons and while shuttering some old ones.

Texas’ pioneering success has not gone unnoticed. But it’s not the only example of reducing crime and incarceration simultaneously. In fact, from 2008 to 2013, states that reduced incarceration rates achieved a 13 percent drop in crime, whereas those that increased rates saw an 11 percent drop.

Texas’ diversion efforts target nonviolent, low-risk offenders who are typically given short prison sentences. The reason is that recidivism in this group can often be lowered by interventions such as drug court in lieu of prison. Indeed, many of these offenders become more threatening to the public when incarcerated because they become disconnected from employment and family and exposed to a tough prison crowd. In this way, incarceration can be “criminogenic” for certain offenders — prison itself can create more criminals.

With advancements in technologies and techniques ranging from electronic monitoring to non-narcotic treatments for opioid addiction, states have choices beyond mere imprisonment or a toothless response to lawbreaking. Texas is proof positive that by filling the spectrum in between these two extremes with effective monitoring and treatment programs, we can both enhance taxpayer responsibility and increase the safety of our communities.

The Hon. Jerry Madden (R-Plano) served as Texas House Corrections Chairman at a time when Texas passed major criminal-justice reforms into law; he served in the Texas House of Representatives from 1993 to 2013, and is a senior fellow for the Texas Public Policy Foundation’s Right on Crime initiative. Marc Levin is the director of the Center for Effective Justice at the Texas Public Policy Foundation where he leads the foundation’s Right on Crime initiative.

Teachers' Unions Profit at Students' Expense

Robert Fellner - September 7, 2016

Now that the California Supreme Court has declined to hear a constitutional challenge to tenure laws that make it nearly impossible to fire bad teachers, California’s children will continue to pay the price.

Tenure rules are so skewed in favor of protecting incompetent teachers — some of whom are merely reassigned to classrooms in poor and minority communities — that they “shock[ed] the conscience” of the Superior Court judge who first ruled on them. 

Because experts cite teacher quality as the most important factor in student learning, even those who are normally supportive of government unions have criticized their inflexible opposition to reform — as seen in recent editorials in both The Los Angeles Times and The Sacramento Bee.

But simply to ask legislators to oppose one of California’s most powerful special interest groups is to ignore the very forces that got us here in the first place.

The special interest effect — first popularized in 1986 when economist James Buchanan won the Nobel Prize for his work in this area — says that lawmakers will serve those who can provide them the most political support, not the public interest. Thus, as long as the state’s top political spender — the California Teachers Association (CTA) — opposes reform, California’s lawmakers will, too.

Because it is, like most government unions, chiefly concerned with protecting the jobs of its dues-paying members, the CTA often ends up supporting demonstrably harmful policies. Such is the case for California’s tenure rules, which have led to the “diminishment of the teaching profession,” according to TNTP, a national nonprofit dedicated to improving teacher quality. 

Unfortunately, while supporters of government unions are frequently — and correctly — critical of special interest groups in the private sector, such as those in the banking and pharmaceutical industries, this critical lens is shattered when it comes to government special interest groups.

The basis for government unions rests on shaky ground. The original labor movement was created to prevent the exploitation of workers by profit-hungry corporations. Because the government has no profits over which to negotiate, the case for government unionization remains dubious, which is why, historically, many of the labor movement’s greatest leaders emphatically rejected the idea of government unions.

In other words, the argument for labor unions in the private sector — namely, that owners can profit by paying workers less — does not apply to government, which has no owners.

But government unions aren’t merely permitted in California. California state law grants them compulsory bargaining power, forcing elected officials to negotiate and sign union contracts.

This coercive power has left taxpayers on the hook for nearly $20 billion in superfluous spending in 2012 alone, according to an analysis by the Heritage Foundation. 

Such outlandish examples of government excess fall hardest on those least able to afford them.

Consider the Orange County city of Santa Ana. In 2014, median earnings for full-time, year-round private-sector workers there were only $27,391, while the median city worker pocketed over $90,000 — with city manager David Cavazos’s $341,798 salary topping the list, according to TransparentCalifornia.com.

Clearly, California’s government unions have taken full advantage of their monopolistic powers. But at what cost?

Repealing compulsory collective bargaining laws would yield more than just the estimated $20 billion in annual cost savings. It would also allow the state Legislature to enact desperately needed reforms without having to battle such powerful adversaries as the CTA — which would be forced to become more efficient and less willing to fight for universally reviled laws.

The sad truth, however, is that legislators are unlikely to advocate for tenure reform as long as the political cost of doing so remains high. For reform to have a shot, Californians should reconsider the sacrosanct status granted to government unions, especially now that there’s so much at stake. 

Robert Fellner is director of transparency research at the Nevada Policy Research Institute, where he manages the TransparentNevada.com and TransparentCalifornia.com public pay databases.

The Jobless Rate for Young Black Men Is a National Disgrace

Robert Cherry - September 2, 2016

The rioting in Milwaukee over a police killing of an armed black man has reanimated the issue of longstanding black joblessness. Between 2010 and 2014, the average jobless rate among Milwaukee’s young black men was 54 percent, compared with rates of only 17 and 26 percent, respectively, among the city’s young white and Latino men. But the pervasive joblessness of young black men goes well beyond such deindustrialized cities, and its effects are devastating both for the young men themselves and for their families.

The U.S. Census Bureau’s American Community Survey gathered data on the jobless rate of non-institutionalized men, 20 to 34 years old, averaged over the period 2010-2014, for 34 major U.S. cities. The data paint a grim picture for black men, particularly in the Midwestern industrial and Mid-Atlantic cities. Like Milwaukee, Chicago, Detroit, Cleveland, Philadelphia, Baltimore, and D.C. all had black jobless rates above 45 percent. In these cities, more young black men were either jobless or imprisoned than employed.

Black joblessness in southern and western cities was modestly lower — as low as 31.7 and 23.3 percent, respectively, in Dallas and Seattle. Overall, the national jobless rate for young black men was 39 percent, versus only about 22 percent for both young white and young Latino men. These statistics show that weak labor markets cannot be the most important reason for such high black jobless rates. It follows that simply expanding employment opportunities would likely have only a modest effect on the jobless rates in this population.

The black joblessness data have direct ramifications for the black family. Kathryn Edin has linked joblessness to family break-up, which leads many mothers to enter into sequential sexual relationships. One outcome is multi-partner fertility: A growing share of black mothers are having children with more than one partner. And once biological fathers move out, a large percentage of them abandon their children — which has particularly harmful effects on boys.

As these men enter into new relationships, they father additional children. Often these men are caring towards their new biological children, but harsh with children from previous relationships. Statistics indicate that the rate of child maltreatment is three times higher for a mother living with a partner who is not the father of all her children than if she is without a partner.

In a recently published study, Chun Wang and I verified that male joblessness is strongly linked to child maltreatment. Using state-level data, we found that for each 1 percent increase in the male jobless rate, the overall child maltreatment rate increased by almost 1 percent. The differential jobless rates among racial groups thus go a long way toward explaining the racial disparities in child maltreatment rates.

Because of these employment disparities, black men are jobless out of proportion to their numbers. In many cities, they account for more than 40 percent of all jobless young men. When young black men make up such a high proportion of the jobless, racial stereotypes among employers and the police are reinforced. This is certainly the case in Milwaukee, where employer biases and biased transportation policies contribute to much of the racial disparity in joblessness.

These large disparities, however, are also an index of the social isolation of young black men. Often living in high-poverty neighborhoods, they have deficient networks of contacts who can recommend them for jobs. While direct discrimination is in play, this lack of social networks helps explain why young black men have substantially higher rates of joblessness than other minority groups, such as Latinos.

The importance of such networks is vividly depicted in Clint Eastwood’s 2008 film "Gran Torino," in which an elderly white man befriends his teenage neighbor, who comes from a family of Hmong immigrants. Wanting to help the boy gain employment, the older man (played by Eastwood) contacts a friend who owns a salvage company. By prepping the boy on how to look and what to say, Eastwood’s character helps ensure that he gets hired. The episode illustrates the value of such connections and, by extension, how difficult it can be for young black men who lack them to be considered for available jobs, even when employers have no overt racial animus.

What can be done about this state of affairs? Twenty years ago, the answer would have been to increase teen employment so that disadvantaged youth could gain the interpersonal skills that aid long-term employment. Over the last two decades, however, teen employment rates have plummeted, particularly for black youth, making this strategy no longer viable.

As a result, many favor improving college access in the hope that many of these young men will attain college degrees. And yet, despite substantial expenditures, the vast majority of these young men do not gain community college degrees, let alone four-year degrees. A better strategy is to pay more attention to certificate programs that run from 8 to 15 months, particularly those offered by the public sector or by best-practice for-profit colleges. These programs enable students to avoid the remediation hurdles they encounter in community colleges and provide success markers in a shorter period of time — successes that can be built upon.

We must also take seriously the legal barriers to employment, especially given the breathtaking reality that one-quarter to one-third of all black men have been incarcerated. The expanded use of “ban the box,” i.e., not checking criminal justice information until the end of the hiring process, has improved employment prospects for previously incarcerated black men seeking jobs that carry no restrictions on hiring them. But a wide range of occupations remain unavailable to the previously incarcerated — including most jobs in health and educational institutions — regardless of the quality of their lives since conviction.

To change the employment trajectory of young black men, we must look beyond traditional academic tracks and eliminate blanket restrictions that limit employment of the previously incarcerated.

Robert Cherry is an economist at Brooklyn College and the CUNY Graduate Center. His publications on black labor include "Race and Opportunity," National Affairs (Winter 2016).

Stanford Case Shows Danger of Judicial Discretion

Thomas R. Ascik- September 2, 2016

When Congress returns this month, will the summer uproar over a Stanford swimmer’s six-month sentence for sexual assault affect the criminal justice reforms being advanced by a bipartisan coalition of U.S. senators?

In that California case, 20-year-old Brock Turner was convicted by a jury of three counts of felony sexual assault against an intoxicated and apparently unconscious woman. The prosecutor asked for a sentence of six years, but in June the judge sentenced him instead to six months in jail. Turner is scheduled to be released today after serving only three months.

Though his decision was widely criticized for being too lenient, the judge, Aaron Persky, exercised the discretion allowed him by California sentencing law. And, ironically, it’s precisely such “judicial discretion” that advocates of federal sentencing reform want more of.

When a version of the Sentencing Reform and Corrections Act was approved by the Senate Judiciary Committee in October 2015, Committee Chairman and chief Republican co-sponsor Charles Grassley spoke of the need to reduce “over-incarceration” resulting from current sentencing law, which judges are required to follow, and sentencing guidelines, which they are required to consider. According to Democratic co-sponsor Patrick Leahy, mandatory-minimum sentences unfortunately “remove discretion from our criminal justice system.” And Deputy Attorney General Sally Quillian Yates testified at a Senate Judiciary Committee hearing that judges should be given more latitude so that sentences could be “tailored to the facts and circumstances of the crime.”

Marc Mauer of the Sentencing Project, the group that advocates for “alternatives to incarceration,” testified at the same hearing that sentencing reform for federal crimes would “restore a greater measure of discretion to federal judges.” And Families Against Mandatory Minimums (FAMM), the influential lobby for imprisoned criminals — and a prime supporter of the current legislation — has long advocated for virtually untrammeled judicial discretion in sentencing. As FAMM says: “Give judges discretion to fit the punishment to the individual. Because judges have an intimate and impartial understanding of each case, only they — not legislators, prosecutors, or defense attorneys — should determine appropriate sentences based on the facts of the crime.”

However unpopular the result, the Turner sentencing hearing was, in fact, a model of what advocates of judicial discretion are seeking. There was a thorough review of Turner’s life: Thirty-nine people wrote letters to the judge on his behalf, Turner addressed the court himself, and the victim submitted a long statement. As part of its lengthy analysis, the California state probation office recommended six months based on state sentencing guidelines. In stating his considerations in open court, the judge went into great detail, and in imposing the sentence, he took into account Turner’s youth, intoxication at the time of the crime, and lack of a criminal record. He then asked, “Is state prison for this defendant an antidote” to the seriousness of the crime? His approach sounds exactly like what the senators are now advocating for federal sentences.

Opposition to the sentence has gone beyond the furious public outcry. The district attorney in the Turner case subsequently chose to file a new and different sexual-assault case before another judge. And, in an unprecedented act, a number of prospective jurors refused to serve in another case tried before the judge who presided over the Turner case. A group is now gathering qualifying signatures to put a referendum aimed at recalling the judge from office on the November ballot.

The Turner case brings to mind the even more notorious 2013 case in which a drunk Texas teenager, Ethan Couch, killed four people while driving. The prosecution asked for a sentence of 20 years, but the sentencing judge, citing expert testimony that Couch, who came from a wealthy family, suffered from “affluenza,” a purported psychological condition caused by excessive affluence and consumption, sentenced him only to probation and therapy.

Thirty years ago, public sentiment supported the notion that equal sentencing — not individualized sentencing — leads to fair and just results. The federal government, subsequently followed by almost all the states, set the standard with Congress’s passage of the Comprehensive Crime Control Act in 1984. As stated in the original Guidelines Manual (1987) published by the U.S. Sentencing Commission, a major purpose of the legislation was to “narrow…the wide disparity in sentences imposed by different federal courts for similar criminal conduct by similar offenders.”

But equalized sentencing according to statutory guidelines is now being dismantled under a new consensus shared by both parties in Congress, the Department of Justice, the federal Sentencing Commission, several states, and a host of outside groups — including conservative ones. The irony is that, under this new consensus, there’s no obvious basis for criticizing the Turner and Couch sentences. On the contrary, that kind of individualized sentencing based on judicial discretion is exactly what’s being called for.

The problem is that “discretion” can appear subjective or even arbitrary. And defendants whose families have the resources to put on costly, sophisticated, and elaborate sentencing presentations — as in the Couch case — may have an unfair advantage. Finally, giving judges maximum discretion over criminal sentencing is arguably undemocratic: if the public has no knowledge of or expectations about what sentences are legal and appropriate, it cannot exercise any role in monitoring the decisions of the judiciary.

If we want to avoid more unfair, arbitrary, or unjust sentences, we might want to reconsider criminal justice reform.

Thomas R. Ascik recently retired as a federal prosecutor.

Getting More Bang For Our Transportation Buck

Tracy C. Miller- September 1, 2016

With the federal role in highway funding set to decline, states may find they have to take up the slack. While costs of highway construction and maintenance are rising, Congress has disagreed about whether to raise the federal gas tax or otherwise find a sustainable alternative revenue source to fund highways. In new research from the Mercatus Center at George Mason University, I suggest a few things states can do to cushion the blow to their budgets.

States already play an important role both in collecting taxes to pay their share of highway funding and in managing the highway funds provided to them by the federal government. With fewer federal funds and competing demands on their budgets, states must find ways to make their highway spending go further by reducing regulations, finding better ways to deal with congestion, and improving maintenance.

Regulations like prevailing wage laws and in-state preference policies raise highway construction and maintenance costs in many states. Eliminating these and other regulations could reduce costs substantially in some states.

Congestion is more than just an annoyance on our daily commutes. It’s also an economic problem caused when drivers do not pay the full price for using highways during certain times of the day. During congested periods, each driver imposes costs on surrounding drivers by increasing the time it takes them to reach their destinations. Because drivers don’t pay for their respective contributions to the congestion, they lack incentive to change their schedules, take public transportation, work from home, or find other solutions. 

Inadequate highway maintenance, which damages vehicles, results from the way funds are allocated among the different roads and highways within a state and from choices made about highway durability. Better management could help: Increasing the role of private firms in managing highways is one way to bring this about.

Adjustable-rate tolls, in which prices vary based on factors such as the current congestion level or time of day, can significantly reduce congestion. Tolls can be set high enough during rush hour to virtually eliminate congestion, as demonstrated by the use of high-occupancy toll lanes in a number of metro areas. Besides having an incentive to construct and maintain highways efficiently, private firms also prioritize ways to reduce congestion on the highways they manage.

Because their profit depends on how well the highways they manage satisfy the preferences of drivers, private highway managers find ways to keep costs down while providing well-maintained highways. Enabling drivers to reach their destinations quickly — even if they have to pay a little extra during rush hour — is in everyone’s best interest.

State highway agencies could also consider giving local governments greater responsibility to fund and manage local roads and transit systems. When a large share of the funding for a local road or transit system comes from the federal or state government, local governments compete against each other for funds and pursue projects even when the benefits do not cover the costs, because much of the bill falls on taxpayers elsewhere.

If set high enough, user fees and fares give drivers and transit riders incentives to use the most cost-effective mode of transportation available. To the extent that residents and businesses gain additional benefits from highways or transit lines that are unrelated to how often they are used, it makes sense to charge users less than cost while subsidizing highways and transit. If the tax revenue used to subsidize local roads and transit systems comes primarily from local residents, they have an incentive to vote only for really necessary projects.

Skeptics may argue that tolls deny us an important public good: the free public roads we all rely on. But we must not forget that every taxpayer is already paying a high (and sometimes unnoticed) cost for these roads, and in most large cities, we are not getting a good return on investment. Ultimately, what we pay in tolls for efficient roads must be balanced against what we would otherwise pay in taxes as well as the time we waste navigating poorly managed roadways. 

Relying more heavily on market forces and federalism will get us more bang for our transportation buck. While controversial, privatization — together with more local control — will benefit taxpayers and drivers by reducing congestion, improving the quality of our roads, and lowering overall costs.

Tracy C. Miller is a senior policy research editor with the Mercatus Center at George Mason University. He is coauthor (with Megan E. Hansen) of a new working paper, “Getting More Out of State Transportation Infrastructure Spending.”

Soda Taxes Leave Bitter Taste, Empty Wallets

Jesse Hathaway- August 31, 2016

Lawmakers in cities such as Boulder, Colorado, and Oakland, California, are asking voters to approve ballot measures that would increase the cost of soda and other sweetened beverages. These so-called “health-related taxes” are supposed to improve public health by making drinks lawmakers deem unhealthy more expensive to buy, thereby reducing consumer demand for those drinks.

But instead of harnessing economics and social engineering to improve people’s health, soda taxes end up abusing taxpayers and improving only the health of governments’ revenue streams.

In Boulder and Oakland, for instance, the tax will be paid directly by grocery distributors. But consumers will be the ones footing the bill, in the form of price hikes passed along to cover the distributors’ added cost of doing business.

Not only will consumer prices go up, but the price increases will almost certainly fail to achieve their stated goal: reducing consumer demand for unhealthy products. Scholarly research has shown the link between lawmakers’ intentions and consumers’ reactions to be as fleeting and tenuous as the fizz escaping from a freshly opened two-liter bottle of soda.

In 2015, University of Massachusetts-Amherst researchers Francesca Colantuoni and Christian Rojas examined datasets consisting of nearly a decade’s worth of consumer data collected in two states in order to study how consumers in the real world have reacted to health-related taxes on soda. They found that instead of reducing consumption of sugary drinks, health-related taxes had no measurable impact on consumers’ purchases — even though the taxes did increase the price of the goods they bought. As they put it: “after the tax is applied, there is an overall increase in the tax-exclusive price in the treatment city that does not translate in a decrease in consumption either.” In other words, soda taxes don’t make consumers buy less soda, but they do make groceries more expensive.

A 2011 study reached similar conclusions. U.S. Department of Agriculture Senior Economist Biing-Hwan Lin and University of Florida-Gainesville Department of Food and Resource Economics professor Jonq-Ying Lee used data on consumer buying habits collected by Nielsen Holdings PLC, an international consumer research organization operating in more than 100 countries, and the Centers for Disease Control and Prevention, to model how a hypothetical national sin tax hiking the price of soda by 20 percent would impact consumers’ health. They found that big taxes on soda have a small effect on public health, but give a big boost to government revenues. A national 20 percent tax on soda, for instance, would spur only an “average daily reduction of 34–47 calories among adults and 40–51 calories among children,” but it would expand government’s budget waistline, boosting revenue by an estimated $5.8 billion.

Instead of going on a tax-and-spend bender, lawmakers should admit that they’re the ones with the consumption problem. A real prescription for a healthier economy and happier consumers is lower government spending and a lighter tax burden.

Jesse Hathaway (jhathaway@heartland.org) is a research fellow with The Heartland Institute.
