
Crop Insurance Lottery

A Lottery That’s a Sure Bet

February 10, 2016

Crop Insurance Lottery: Odds of Winning Vary Between Regions and Crops

Table 1 presents average per-acre returns and average rates of return for various crop/state combinations in three regions. The top section includes important Corn Belt states and crops. The middle section shows some Great Plains states and crops. The bottom section presents Southern states and crops.

The data clearly show that farmers in different regions growing different crops have very different odds of achieving positive returns from crop insurance. Not one of the Corn Belt states has a rate of return greater than 100 percent. In the Great Plains, by contrast, only North Dakota soybeans have had a rate of return of less than 100 percent since 2000, and cotton in Oklahoma and Texas has rates of return greater than 200 percent. In the South, only Arkansas soybeans and North Carolina cotton and soybeans have rates of return of less than 100 percent, while Arkansas corn and wheat and North Carolina wheat have rates of return greater than 200 percent.

Table 1. Rates of return vary between crops and states

State / Crop        Rate of Return            Net Return ($/acre)
                    1989-1999    2000-2014    1989-1999    2000-2014
Iowa
  Corn              15%          80%          0.48         19.00
  Soybeans          -5%          43%          -0.37        4.72
Illinois
  Corn              -5%          88%          -0.89        19.97
  Soybeans          -33%         12%          -1.58        1.26
Minnesota
  Corn              53%          60%          8.51         13.50
  Soybeans          40%          49%          3.78         5.80
  Wheat             155%         17%          7.68         -0.19
Kansas
  Wheat             134%         142%         3.50         12.26
North Dakota
  Corn              119%         137%         8.15         23.84
  Soybeans          54%          83%          2.35         9.45
  Wheat             76%          111%         3.73         10.70
Oklahoma
  Corn              94%          170%         8.90         27.92
  Cotton            266%         237%         28.41        74.47
  Wheat              96%          170%         6.10         18.43
Texas
  Corn              243%         189%         16.82        23.73
  Cotton            159%         235%         22.32        54.35
  Soybeans          115%         185%         9.26         22.54
  Wheat             163%         241%         7.33         24.06
Arkansas
  Corn              282%         221%         24.27        34.59
  Cotton            179%         120%         31.77        20.02
  Soybeans          95%          96%          7.23         7.84
  Wheat             223%         220%         13.62        20.78
Georgia
  Corn              95%          132%         8.94         20.25
  Cotton            74%          116%         14.70        26.37
  Soybeans          78%          105%         6.60         12.04
  Wheat             47%          159%         2.38         14.62
North Carolina
  Corn              72%          112%         6.78         19.33
  Cotton            78%          91%          11.29        13.94
  Soybeans          47%          87%          4.44         9.09
  Wheat             37%          203%         1.98         12.13

There are also regional differences in average per-acre returns. Since 2000, the highest average net return was $74.47 per acre for Oklahoma cotton, with Texas cotton the next highest at $54.35 per acre. Minnesota wheat has experienced a negative average per-acre return, and the Illinois soybean average of $1.26 per acre is close to zero. The Iowa and Minnesota average net returns are also quite low compared to net returns in other states and regions.

Various explanations have been given for these regional variations. Some argue that USDA has set premiums too low in the Corn Belt and too high elsewhere.[1] An alternative explanation is that Corn Belt weather has been abnormally benign in recent years, leading to lower-than-expected claims payouts.

Odds Are In Farmers’ Favor

The data in Table 1 clearly show that the odds of winning the bet on a crop insurance policy vary among crops and regions. But the data also show that the odds of winning are quite good nearly everywhere. From 2000 to 2014, the rate of return was positive for all crop/state combinations and net returns were positive for all but Minnesota wheat.

The odds of a farmer collecting more in claims payouts than he or she paid in premiums are in the farmer’s favor because the size of the bet — the premium paid — is less than the average payoff. This results from taxpayers paying more than half of the premium. It is as if half of a gambler’s bet came from the casino’s money. The effect of premium subsidies on rates of return and net returns is clearly evident in Table 1.
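The arithmetic behind that analogy is straightforward. The sketch below is a minimal illustration, not the program's actual rating methodology: it assumes an actuarially fair policy (expected claims equal the total premium) and a flat, hypothetical subsidy share, and computes the farmer's expected rate of return, defined here as the expected net payout divided by the premium the farmer actually paid.

```python
# Minimal sketch (illustrative only): an actuarially fair policy, so expected
# claims equal the total premium, and a flat, hypothetical subsidy share.

def expected_rate_of_return(total_premium: float, subsidy_share: float) -> float:
    """Expected net payout to the farmer divided by the premium the farmer paid."""
    farmer_premium = (1.0 - subsidy_share) * total_premium
    expected_claims = total_premium            # fair-premium assumption
    return (expected_claims - farmer_premium) / farmer_premium

for share in (0.0, 0.30, 0.60):
    ror = expected_rate_of_return(100.0, share)
    print(f"subsidy share {share:.0%}: expected rate of return {ror:.0%}")
# subsidy share 0%  -> 0%    (a break-even bet)
# subsidy share 30% -> 43%
# subsidy share 60% -> 150%
```

Under these assumptions, an unsubsidized fair policy is a break-even bet, while a 60 percent subsidy turns it into an expected 150 percent return on the farmer's own money.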

Passage of the Agricultural Risk Protection Act (ARPA) in 2000 greatly increased premium subsidies, doubling the share of premiums paid by the government for most policies and coverage levels. In Table 1, returns on crop insurance are calculated for two periods: 1989 to 1999 (before ARPA) and 2000 to 2014 (after ARPA).

For nearly all state and crop combinations, both the average rate of return and the average per-acre net returns were much higher in the post-ARPA period than earlier. This reflects both the higher premium subsidies available under ARPA and higher commodity prices since 2006.

Insurance Returns versus Direct Payments

The uncertainty of gaining a positive net payout makes the payoffs from crop insurance more like a lottery with favorable odds than the sure thing that farmers enjoyed through the discredited direct payment program. Ending direct payments was touted as a major accomplishment of the 2014 farm bill. Payments would now go to growers only when they had suffered an actual “loss.”

Despite the variability, however, the data show that at least some – and perhaps many – farmers will achieve similar or better net returns from crop insurance than they got in direct payments.

Before direct payments were reduced for some farmers in 2009, the average direct payment per planted acre was about $27 for corn, $8 for soybeans, $20 for wheat and $40 for cotton. Table 1 shows that since 2000, crop insurance net returns per acre for corn ranged from $13.50 in Minnesota to $34.59 in Arkansas. Net returns for soybeans ranged from $1.26 in Illinois to $22.54 in Texas; for wheat from minus $0.19 in Minnesota to $24.06 in Texas; and for cotton from $13.94 in North Carolina to $74.47 in Oklahoma.

Between 2001 and 2015, crop insurance total net returns averaged about $3.6 billion a year, less than the $5 billion sent out annually in direct payments. In every year since 2011, however, total net returns exceeded the value of total direct payments, ranging from just over $5 billion in 2014 to over $13 billion in 2012.

Lottery versus Risk Management

The result of purchasing a subsidized crop insurance policy looks more like a lottery than a safety net designed to step in when farmers suffer a serious financial loss. Studies of the choices farmers make when deciding which crop insurance policy to buy also suggest that risk management is not the deciding factor.

Researchers at Iowa State University and the University of Wisconsin have shown that if farmers had to rely solely on crop insurance to manage their risk, and if risk management were the sole motivation for buying the insurance, nearly all farmers would buy at least 80 percent coverage in light of the high rate at which premiums are subsidized.[2] Many would buy 85 percent coverage. If premiums were not subsidized, farmers would buy the maximum amount of coverage available to them.

However, in most regions of the country farmers buy much less than the maximum available level of coverage. In 2014, the average coverage levels for the top four insured crops were 75 percent for corn, 74 percent for soybeans, 70 percent for wheat and 66 percent for cotton. This discrepancy means that either the standard model economists use to explain risk management decisions is wrong, or farmers have ways to manage risk that are more cost-effective than crop insurance.
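A rough simulation helps show why a standard expected-utility model points toward high coverage levels. The sketch below is not the Iowa State or Wisconsin model; the yield distribution, price, risk aversion and the flat subsidy share are all assumptions (actual subsidy rates vary by coverage level). It compares certainty equivalents for a risk-averse farmer at several coverage levels, with and without a premium subsidy.

```python
# Rough illustration only: a risk-averse farmer choosing among coverage levels
# under a simple expected-utility (CRRA) model. Yield distribution, price and
# the flat subsidy share are assumptions, not numbers from the article.
import numpy as np

rng = np.random.default_rng(0)

EXPECTED_YIELD = 180.0                 # bu/acre (assumed)
PRICE = 4.00                           # $/bu (assumed)
OTHER_WEALTH = 200.0                   # other income, keeps simulated wealth positive
yields = rng.normal(EXPECTED_YIELD, 35.0, 200_000).clip(min=0.0)

def indemnity(y: np.ndarray, coverage: float) -> np.ndarray:
    """Yield-protection style payout: value of any shortfall below the guarantee."""
    return np.maximum(coverage * EXPECTED_YIELD - y, 0.0) * PRICE

def certainty_equivalent(wealth: np.ndarray, rho: float = 2.0) -> float:
    """CRRA certainty equivalent; a higher value means the farmer is better off."""
    eu = np.mean(wealth ** (1.0 - rho) / (1.0 - rho))
    return ((1.0 - rho) * eu) ** (1.0 / (1.0 - rho))

for subsidy_share in (0.0, 0.60):
    print(f"--- farmer pays {1 - subsidy_share:.0%} of an actuarially fair premium ---")
    for coverage in (0.50, 0.65, 0.75, 0.85):
        payouts = indemnity(yields, coverage)
        farmer_premium = (1.0 - subsidy_share) * payouts.mean()
        wealth = OTHER_WEALTH + yields * PRICE + payouts - farmer_premium
        print(f"coverage {coverage:.0%}: certainty equivalent ${certainty_equivalent(wealth):,.2f}")
```

Under these simplifying assumptions the certainty equivalent rises with the coverage level whether or not premiums are subsidized, which is the qualitative prediction that the observed coverage choices contradict.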

In a recent study, I applied the framework developed by Daniel Kahneman and Amos Tversky to crop insurance to see if its predictions on how much coverage farmers would buy lined up well with what they actually did – rather than what economists said they should do.[3]

Kahneman and Tversky, both psychologists, waded into the realm of mathematical economics in the mid-1970s with a new way of explaining how people make decisions that involve uncertain outcomes. Their chief insight was that in decisions involving risk, people place a higher value on avoiding losses than they do on getting the same amount of gains. For example, most people would choose not to play a gamble where there is a 50 percent chance of winning $1,050 and a 50 percent chance of losing $1,000, even though the gain from a win is larger than the potential loss. Kahneman and Tversky call people’s reluctance to take gambles of this sort “loss aversion.”
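A quick calculation shows how loss aversion flips the decision on that gamble. The sketch below uses the value-function form from Kahneman and Tversky's later work on cumulative prospect theory, with parameter estimates commonly attributed to them (a loss-aversion coefficient of about 2.25 and curvature of about 0.88); probability weighting is left out to keep the arithmetic simple.

```python
# Illustration only: evaluating the 50/50 gamble from the text with a
# Tversky-Kahneman style value function (commonly cited parameter estimates);
# probability weighting is omitted for simplicity.

ALPHA = 0.88    # curvature for gains
BETA = 0.88     # curvature for losses
LAMBDA = 2.25   # loss-aversion coefficient: losses loom ~2.25x larger than gains

def pt_value(x: float) -> float:
    """Prospect-theory value of an outcome relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** BETA

gamble = [(0.5, 1050.0), (0.5, -1000.0)]    # (probability, dollar outcome)

expected_value = sum(p * x for p, x in gamble)
prospect_value = sum(p * pt_value(x) for p, x in gamble)

print(f"Expected dollar value:  {expected_value:+.2f}")   # +25.00, favorable on average
print(f"Prospect-theory value:  {prospect_value:+.2f}")   # negative, so the gamble is declined
```

The gamble is worth $25 on average, but once losses are weighted roughly twice as heavily as gains its prospect-theory value is sharply negative, so a loss-averse person turns it down.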

Another key aspect of Kahneman and Tversky’s framework for explaining how people choose between risky ventures (called prospect theory) is selecting the so-called financial “reference point.” An outcome from a risky prospect that is greater than the reference point is viewed as a “gain,” whereas one that is less than the reference point is viewed as a “loss.” I set one reference point so that the model would predict coverage choices made by farmers if they used crop insurance as a risk management tool. I set a second reference point so that the model would predict farmers’ choices if they treated crop insurance more like a lottery than a risk management tool.[4]

My study carefully modeled the coverage choices of a representative corn farmer in York County, Neb., a cotton farmer in Lubbock County, Texas, and a wheat farmer in Sumner County, Kansas. In all three cases, modeling crop insurance as a lottery rather than as a risk management tool predicted coverage level choices that were more consistent with the levels actually chosen by farmers in these counties.



[1] See, for example, Woodard, J.D., B.J. Sherrick, and G.D. Schnitkey. 2011. “Actuarial Impacts of Loss Cost Ratio Ratemaking in U.S. Crop Insurance Programs.” Journal of Agricultural and Resource Economics, 36(1):211-228.

[2] Du, X., H. Feng, and D.A. Hennessy. 2014. “Rationality of Choices in Subsidized Crop Insurance Markets.” Working Paper 14-WP 545, Center for Agricultural and Rural Development, Iowa State University, February 2014.

[3] Babcock, B.A. 2015. “Using Cumulative Prospect Theory to Explain Anomalous Crop Insurance Coverage Choice.” American Journal of Agricultural Economics doi:10.1093/ajae/aav032.

[4] For example, see Collins, K., and T. Zacharias. 2014. “A 25 Year Milestone in Farm Policy: A Look Back at the 1989 GAO Report.” Crop Insurance Today, February 2014, Vol. 47, No. 1. Available at https://www.ag-risk.org/NCISPUBS/Today/2014/Feb_2014_Today.pdf