Washington State Institute for Public Policy
Vocational and employment training
Juvenile Justice
Benefit-cost estimates updated May 2017.  Literature review updated September 2015.
Vocational and employment training programs for juvenile offenders can be delivered in community-based residential or non-residential settings or during incarceration. Training typically consists of classroom-based instruction or unpaid job experiences that teach juveniles employable skills such as construction and carpentry trades, landscaping, or culinary arts. Most programs combine vocational skills training with academic education or tutoring and provide job search assistance, such as interview preparation, resume building, or job placement services, over a period of three to ten months.

The studies included in this meta-analysis consist of federal government-initiated workforce training programs that have an offender subgroup, state juvenile justice department programs, and programs operated through private organizations (e.g., the Homebuilders Institute). Using regression analysis on the studies included in the meta-analysis, we tested whether specific program components (vocational education, employment experiences, academic education, etc.) have differential effects on crime. Programs with a vocational education component show greater, statistically significant reductions in crime (p = 0.0001). However, the interaction between participation in vocational education and months spent in the program has a significant detrimental effect: the longer a participant spends in vocational education, the greater the increase in crime (p = 0.0087). Programs with an academic education component also show reductions in crime (p = 0.0531), with no statistically significant interaction with months in the program. Programs that use unpaid employment experiences show statistically significant increases in crime (p = 0.0001).
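For readers interested in how such a component-level test can be set up, the sketch below shows one way to run an inverse-variance-weighted meta-regression with component indicators and a component-by-duration interaction. The data frame, variable names, and values are illustrative placeholders, not WSIPP's actual dataset or model.

    import pandas as pd
    import statsmodels.formula.api as smf

    # One hypothetical row per study-level crime effect size: the effect size,
    # its variance, indicators for program components, and months in program.
    studies = pd.DataFrame({
        "es_crime":   [-0.10, -0.05,  0.02, -0.15, -0.08,  0.04, -0.12,  0.01],
        "es_var":     [0.010, 0.020, 0.015, 0.008, 0.012, 0.025, 0.018, 0.009],
        "voc_ed":     [1, 1, 0, 1, 0, 0, 1, 0],   # vocational education component
        "acad_ed":    [1, 0, 1, 1, 1, 0, 0, 1],   # academic education component
        "unpaid_emp": [0, 1, 0, 0, 1, 1, 0, 1],   # unpaid employment experiences
        "months":     [4, 6, 3, 10, 8, 5, 7, 6],  # months in program
    })

    # Inverse-variance-weighted regression with a vocational-education-by-duration
    # interaction, analogous to the component test described above.
    results = smf.wls(
        "es_crime ~ voc_ed + acad_ed + unpaid_emp + months + voc_ed:months",
        data=studies,
        weights=1.0 / studies["es_var"],
    ).fit()
    print(results.summary())
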
The estimates shown are present-value, life-cycle benefits and costs. All dollars are expressed in the base year chosen for this analysis (2016). The chance that the benefits exceed the costs is derived from a Monte Carlo risk analysis. The details on this, as well as the economic discount rates and other relevant parameters, are described in our Technical Documentation.
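As a rough illustration of how a Monte Carlo risk analysis yields the "chance the benefits exceed the costs," the sketch below draws total benefits and program costs from assumed distributions centered on the point estimates reported here. The spreads are placeholders, not WSIPP's modeled input distributions.

    import numpy as np

    rng = np.random.default_rng(0)
    n_draws = 10_000

    # Draw total benefits and net program cost from assumed normal distributions
    # centered on the point estimates in this report ($12,822 and $7,609).
    # The benefit spread is chosen only so the example lands near the reported
    # 58% figure; it is not WSIPP's modeled uncertainty.
    benefits = rng.normal(loc=12_822, scale=26_000, size=n_draws)
    costs = rng.normal(loc=7_609, scale=0.10 * 7_609, size=n_draws)  # + or - 10% cost range

    net = benefits - costs
    print(f"Mean net benefit: ${net.mean():,.0f}")
    print(f"Chance benefits exceed costs: {(net > 0).mean():.0%}")
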
Benefit-Cost Summary Statistics Per Participant
Benefits to:
  Taxpayers                 $5,912
  Participants              $2,707
  Others                    $5,874
  Indirect                 ($1,670)
Total benefits             $12,822
Net program cost           ($7,609)
Benefits minus costs        $5,213
Benefit-to-cost ratio       $1.69
Chance the program will produce benefits greater than the costs: 58%
¹ In addition to the outcomes measured in the meta-analysis table, WSIPP measures benefits and costs estimated from other outcomes associated with those reported in the evaluation literature. For example, empirical research demonstrates that high school graduation leads to reduced crime. These associated measures provide a more complete picture of the detailed costs and benefits of the program.

² “Others” includes benefits to people other than taxpayers and participants. Depending on the program, it could include reductions in crime victimization, the economic benefits from a more educated workforce, and the benefits from employer-paid health insurance.

³ “Indirect benefits” includes estimates of the net changes in the value of a statistical life and net changes in the deadweight costs of taxation.
Detailed Monetary Benefit Estimates Per Participant
Benefits from changes to:¹    Taxpayers    Participants    Others²    Indirect³    Total
Crime $2,270 $0 $5,991 $1,135 $9,396
Labor market earnings associated with employment $1,612 $3,551 $0 $0 $5,163
Property loss associated with alcohol abuse or dependence $0 $3 $5 $0 $8
Public assistance $1,921 ($816) $0 $959 $2,064
Health care associated with educational attainment $109 ($30) ($122) $47 $3
Adjustment for deadweight cost of program $0 $0 $0 ($3,811) ($3,811)
Totals $5,912 $2,707 $5,874 ($1,670) $12,822
Detailed Annual Cost Estimates Per Participant
                      Annual cost   Year dollars
Program costs         $7,500        2014
Comparison costs      $0            2014
Summary
Present value of net program costs (in 2016 dollars): ($7,609)
Cost range (+ or -): 10%
We calculated the cost per participant from the literature in the meta-analysis, based on 6.5 months of programming and weighted by the number of youth served by these programs. Our weighted average cost estimate also incorporates the cost per participant of youth served by a similar (non-residential) program in Washington.
The figures shown are estimates of the costs to implement programs in Washington. The comparison group costs reflect either no treatment or treatment as usual, depending on how effect sizes were calculated in the meta-analysis. The cost range reported above reflects potential variation or uncertainty in the cost estimate; more detail can be found in our Technical Documentation.
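The relationship between the $7,500 annual cost (2014 dollars) and the $7,609 present value (2016 dollars), together with the "+ or - 10%" cost range, can be illustrated with the short sketch below; the deflator is simply the factor implied by the two figures above, not a published price index.

    # The deflator below is the ratio implied by the two figures in this report
    # ($7,500 in 2014 dollars vs. $7,609 in 2016 dollars), not a published index.
    program_cost_2014 = 7_500                 # annual cost per participant, 2014 dollars
    implied_deflator = 7_609 / 7_500          # about 1.0145

    cost_2016 = program_cost_2014 * implied_deflator
    low, high = 0.90 * cost_2016, 1.10 * cost_2016    # "+ or - 10%" cost range
    print(f"Net program cost (2016 dollars): ${cost_2016:,.0f}")
    print(f"Cost range: ${low:,.0f} to ${high:,.0f}")
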
Estimated Cumulative Net Benefits Over Time (Non-Discounted Dollars)
The graph above illustrates the estimated cumulative net benefits per participant for the first fifty years beyond the initial investment in the program. We present these cash flows in non-discounted dollars to simplify identification of the “break-even” point from a budgeting perspective. If the dollars are negative (bars below the $0 line), the cumulative benefits do not yet outweigh the cost of the program at that point in time. The program breaks even when the dollars reach $0; at this point, the total benefits to participants, taxpayers, and others equal the cost of the program. If the dollars are above $0, the benefits of the program exceed the initial investment.
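A minimal sketch of the break-even logic behind such a graph, assuming a made-up stream of annual benefits rather than WSIPP's modeled cash flows:

    # Accumulate non-discounted annual cash flows and report the first year the
    # running total turns positive. The benefit stream is a made-up placeholder,
    # not WSIPP's modeled cash flows.
    initial_cost = 7_609                 # per-participant program cost in year 0
    annual_benefits = [400] * 50         # placeholder: $400 per year for 50 years

    cumulative = -initial_cost
    break_even_year = None
    for year, benefit in enumerate(annual_benefits, start=1):
        cumulative += benefit
        if break_even_year is None and cumulative >= 0:
            break_even_year = year

    print(f"Cumulative net benefit after 50 years: ${cumulative:,.0f}")
    print(f"Break-even year: {break_even_year}")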

^WSIPP’s benefit-cost model does not monetize this outcome.

^^WSIPP does not include this outcome when conducting benefit-cost analysis for this program.

Meta-analysis is a statistical method to combine the results from separate studies on a program, policy, or topic in order to estimate its effect on an outcome. WSIPP systematically reviews all credible evaluations we can locate on each topic. The outcomes measured are the types of program impacts that were measured in the research literature (for example, crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.
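For illustration, the sketch below pools several hypothetical study effect sizes with an inverse-variance random-effects (DerSimonian-Laird) estimator, the general approach behind the "random effects model" figures in the table below; WSIPP's exact procedures are described in its Technical Documentation.

    import numpy as np

    # Hypothetical study-level effect sizes and standard errors (placeholders).
    es = np.array([-0.12, -0.05, -0.20, 0.03])
    se = np.array([0.08, 0.10, 0.15, 0.06])

    # Fixed-effect (inverse-variance) pooling.
    w = 1.0 / se**2
    pooled_fe = np.sum(w * es) / np.sum(w)

    # Between-study variance (tau^2) via the DerSimonian-Laird estimator.
    q = np.sum(w * (es - pooled_fe) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(es) - 1)) / c)

    # Random-effects weights add tau^2 to each study's variance.
    w_re = 1.0 / (se**2 + tau2)
    pooled_re = np.sum(w_re * es) / np.sum(w_re)
    pooled_se = np.sqrt(1.0 / np.sum(w_re))
    print(f"Random-effects ES: {pooled_re:.3f} (SE {pooled_se:.3f})")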

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases.
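For example, a standardized mean difference is one common effect size metric; the sketch below uses made-up values, and WSIPP's exact formulas are described in its Technical Documentation.

    # Standardized mean difference: (treatment mean - comparison mean) / pooled SD.
    # All values are made-up placeholders.
    mean_treatment, mean_comparison = 0.45, 0.52   # e.g., share with an outcome
    sd_pooled = 0.50

    es = (mean_treatment - mean_comparison) / sd_pooled
    print(f"Effect size: {es:.3f}")   # negative: the outcome decreases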

Adjusted effect sizes are used to calculate benefits in our benefit-cost model. WSIPP may adjust effect sizes based on methodological characteristics of the study. For example, we may adjust effect sizes when a study has a weak research design or when the program developer is involved in the research. The magnitude of these adjustments varies depending on the topic area.

WSIPP may also adjust the second ES measurement. Research shows that the magnitude of some effect sizes decreases over time. For those effect sizes, we estimate outcome-based adjustments, which we apply between the first time ES is estimated and the second time ES is estimated. We also report the unadjusted effect size to show the effect sizes before any adjustments have been made. More details about these adjustments can be found in our Technical Documentation.

Meta-Analysis of Program Effects
Each row reports: the outcome measured; the number of effect sizes; Treatment N; the adjusted effect sizes (ES) and standard errors (SE) used in the benefit-cost analysis, at the first time ES is estimated (ES, SE, Age) and at the second time ES is estimated (ES, SE, Age); and the unadjusted effect size from a random effects model (ES, p-value).
Alcohol use in high school 2 344 -0.125 0.140 18 -0.125 0.140 28 -0.125 0.373
Crime 12 2413 -0.084 0.042 19 -0.084 0.042 29 -0.082 0.052
Earnings^^ 4 1065 0.075 0.047 22 0.000 0.018 23 0.075 0.115
Employment^^ 3 431 0.140 0.202 18 0.140 0.202 28 0.140 0.488
GED attainment^ 4 869 0.282 0.135 19 0.282 0.135 29 0.282 0.037
High school graduation 2 419 0.010 0.323 19 0.010 0.323 29 0.010 0.975
Illicit drug use in high school 2 344 0.110 0.173 18 0.110 0.173 28 0.110 0.526
Public assistance 3 1032 -0.132 0.074 19 -0.132 0.074 29 -0.132 0.073
Citations Used in the Meta-Analysis

Bloom, H.S., Orr, L.L., Bell, S.H., Cave, G., Doolittle, F., Lin, W., & Bos, J. M. (1996). The benefits and costs of JTPA Title II-A programs: Key findings from the National Job Training Partnership Act study. The Journal of Human Resources, 32(3), 549-576.

Cave, G., Bos, H., Doolittle, F., & Toussaint, C. (1993). JOBSTART: Final report on a program for school dropouts. New York, NY: Manpower Demonstration Research Corporation.

Gruenewald, P.J., Laurence, S.E., & West, B.R. (1985). National evaluation of the New Pride replication program, final report–Volume II: Client impact evaluation. Pacific Institute for Research and Evaluation (PIRE).

Johnson, B.D., & Goldberg, R.T. (1982). Vocational and social rehabilitation of delinquents–A study of experimentals and controls. Journal of Offender Counseling, 6(3), 43-60.

Miller, M., Drake, E.K., & He, L. (2015). The King County Education and Employment Training (EET) Program: Effect on recidivism of juvenile offenders (Document No. 15-12-3901). Olympia: Washington State Institute for Public Policy.

National Council on Crime and Delinquency. (2009). In search of evidence-based practice in juvenile corrections: An evaluation of Florida's Avon Park Youth Academy and STREET Smart Program. Madison, WI: National Council on Crime and Delinquency.

Piliavin, I., & Masters, S.H. (1981). The impact of employment programs on offenders, addicts, and problem youth: Implications from supported work. Madison: Institute for Research on Poverty, University of Wisconsin–Madison.

Quay, H.C., & Love, C.T. (1977). The effect of a juvenile diversion program on rearrests. Criminal Justice and Behavior, 4, 377-396.

Schaeffer, C.M., Henggeler, S.W., Ford, J.D., Mann, M., Chang, R., & Chapman, J.E. (2014). RCT of a promising vocational/employment program for high-risk juvenile offenders. Journal of Substance Abuse Treatment, 46(2), 134-143.

Schochet, P.Z., Burghardt, J., & Glazerman, S. (2001). National Job Corps study: The impacts of Job Corps on participants' employment and related outcomes (Document No. PR00-67). Princeton, NJ: Mathematica Policy Research.

Schochet, P.Z., Burghardt, J., & McConnell, S. (2008). Does Job Corps work? Impact findings from the National Job Corps study. The American Economic Review, 98(5), 1864-1886.

For more information on the methods used, please see our Technical Documentation.
360.664.9800
institute@wsipp.wa.gov