Commentary

Using Cost-effectiveness Analysis to Make Policy Decisions

Author
Fiona Hollands
Teachers College, Columbia University

Education policy and program evaluation has largely focused on estimating the effectiveness of educational alternatives to inform policymakers about reforms that produce student gains in learning. Despite the fact that almost one trillion dollars of public funding is spent each year on education in the United States, little attention has been focused on evaluating the costs of interventions. Cost studies are needed in conjunction with effectiveness studies to allow policymakers to examine effectiveness relative to costs. Otherwise, research evidence that omits cost analysis may promote educational interventions that have high costs relative to their effectiveness instead of those that have lower costs but obtain similar results. It can also lead to policy implementation without sufficient resources to ensure that gains found in research settings can be realized in the field, as occurred with class size reduction in California. Evidence from economic analysis can help policymakers maximize gains in student learning from the investment of a given level of resources.

One important educational goal that has become a national priority is increasing the high school graduation rate. More than 28 million American citizens aged 18 or over are high school dropouts. In California, the dropout rate in 2013 was 11.2%, but as high as 19.9% for African-Americans. The economic consequences of dropping out of school are substantial, with estimates exceeding $250,000 in losses of tax revenue and public costs for crime, public health, and welfare, and over $750,000 in total social burden per dropout. Reducing the dropout rate makes economic sense, but education budgets are limited. Rather than simply asking how to reduce the dropout rate, policymakers might justifiably seek the most efficient ways to increase the number of high school graduates within available resources.

To demonstrate the use of cost-effectiveness analysis to answer such policy questions, we combined evidence obtained from the What Works Clearinghouse on the effectiveness of five promising dropout prevention programs—Talent Search, JOBSTART, New Chance, National Guard Youth Challenge (NGYC), and Job Corps—with cost data for each program to calculate cost-effectiveness ratios. Participants in all five programs earned a high school diploma or a GED at higher rates than similar peers not participating in these programs. However, the costs of the programs differed substantially, ranging from as low as $3,290 (in 2010 dollars) per participant for Talent Search, a supplementary program that targets low-income middle and high school students, to as high as $22,290 per participant for Job Corps, which, like JOBSTART, NGYC, and New Chance, serves youth who have already dropped out of school. Within each program, costs and effectiveness varied significantly among sites, indicating that site-level analysis is more accurate and informative for decision-makers than pooled estimates derived from multiple sites, which mask differences in context, implementation, resource allocation, and effectiveness.

We used the percentage difference in graduation rates between participants in each of the five programs and their respective control groups to calculate the number of graduates produced by each program above and beyond the number expected to graduate in its absence. For the four non-school-based programs, the cost per “extra” graduate ranged from around $70,000 to $195,000, while the cost per extra graduate for Talent Search was $30,520. Remedial programs that aim to help dropouts complete high school thus appear very expensive relative to preventative programs such as Talent Search, which targets students still in school. These high costs may reflect how difficult it is for education programs alone to overcome barriers to educational success stemming from both academic and out-of-school influences.
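In rough terms, the arithmetic divides a program's per-participant cost by the incremental graduation rate it produces. The figures below are hypothetical illustrations, not the study's site-level estimates:

\[
\text{cost per extra graduate} \;=\; \frac{\text{cost per participant}}{p_{\text{program}} - p_{\text{control}}}
\]

For example, a hypothetical program costing $4,000 per participant that raises the graduation rate from 60% to 70% would yield a cost per extra graduate of $4,000 / 0.10 = $40,000.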

In order to inform educational policymaking, such cost-effectiveness analyses should be routinely incorporated into evaluations of educational programs, allowing for comparisons among viable alternatives for achieving educational goals. Without such comparisons, even the highest quality research evidence presents an incomplete and perhaps misleading picture to decision-makers.

The full study is available online, and a gated, peer-reviewed version was published in Educational Evaluation and Policy Analysis, Volume 36, No. 3 (September 2014), pp. 307–326.

Suggested citation: Hollands, F. (2014, October). Using cost-effectiveness analysis to make policy decisions [Commentary]. Policy Analysis for California Education. https://edpolicyinca.org/newsroom/using-cost-effectiveness-analysis-make-policy-decisions