Washington State Institute for Public Policy
Training with work experience for adult welfare recipients
Workforce Development
Benefit-cost estimates updated May 2017. Literature review updated November 2015.
Adult TANF/AFDC recipients may receive job search and placement assistance, adult basic education, ESL and GED preparation, vocational training, or support services such as child care and housing support. All participants in these programs also receive some type of work experience, paid or unpaid. Most studies define the adult population to be age 18 and over. Treatment may be sequential, where participants first undergo training and then receive work experience, or follow individualized employment plans for each participant. These programs sometimes take the form of "welfare-to-work" programs, where participants must participate in employment activities to receive welfare benefits. Community organizations, welfare agencies, and federally or state-funded programs administered by state, county, or local government agencies typically provide these services. Programs last anywhere from two months to one year.
The estimates shown are present-value, life-cycle benefits and costs. All dollars are expressed in the base year chosen for this analysis (2016). The chance that the benefits exceed the costs is derived from a Monte Carlo risk analysis. The details of this analysis, as well as the economic discount rates and other relevant parameters, are described in our Technical Documentation.
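WSIPP's full model is specified in its Technical Documentation; purely to illustrate the Monte Carlo idea, a simplified sketch might draw total benefits and costs from assumed distributions and count how often net benefits are positive. The point estimates below come from the summary table; the spread parameters are hypothetical.

```python
import random

random.seed(0)

def chance_benefits_exceed_costs(mean_benefits, mean_costs,
                                 benefit_sd, cost_sd, n_draws=100_000):
    """Illustrative Monte Carlo: draw benefits and costs from assumed
    normal distributions and count the share of draws with positive
    net benefits. (WSIPP's actual model varies many more inputs.)"""
    wins = 0
    for _ in range(n_draws):
        benefits = random.gauss(mean_benefits, benefit_sd)
        costs = random.gauss(mean_costs, cost_sd)
        if benefits - costs > 0:
            wins += 1
    return wins / n_draws

# $6,967 and $4,218 are the report's point estimates; the standard
# deviations here are hypothetical (cost_sd loosely reflects the
# +/- 43% cost range reported below).
chance = chance_benefits_exceed_costs(6_967, 4_218,
                                      benefit_sd=3_000,
                                      cost_sd=4_218 * 0.43 / 2)
```

Under these assumed spreads the simulated probability lands near the reported 78%, but that agreement is incidental; the actual figure depends on WSIPP's modeled input distributions.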
Benefit-Cost Summary Statistics Per Participant

Benefits to:1
  Taxpayers              $3,805
  Participants           $4,579
  Others2                    $0
  Indirect3             ($1,417)
Total benefits           $6,967
Net program cost        ($4,218)
Benefits minus costs     $2,749

Summary statistics
  Benefits minus costs                $2,749
  Benefit-to-cost ratio                 1.65
  Chance the program will produce
  benefits greater than the costs        78%
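The summary statistics follow directly from the component figures; as a quick arithmetic check, using the rounded point estimates from the table:

```python
# Benefit components per participant, from the summary table above.
benefits = {"taxpayers": 3805, "participants": 4579,
            "others": 0, "indirect": -1417}

total_benefits = sum(benefits.values())            # $6,967
net_program_cost = -4218                           # ($4,218)

benefits_minus_costs = total_benefits + net_program_cost   # $2,749
benefit_cost_ratio = total_benefits / -net_program_cost    # ~1.65
```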
1 In addition to the outcomes measured in the meta-analysis table, WSIPP measures benefits and costs estimated from other outcomes associated with those reported in the evaluation literature. For example, empirical research demonstrates that high school graduation leads to reduced crime. These associated measures provide a more complete picture of the detailed costs and benefits of the program.

2 “Others” includes benefits to people other than taxpayers and participants. Depending on the program, it could include reductions in crime victimization, the economic benefits from a more educated workforce, and the benefits from employer-paid health insurance.

3 “Indirect benefits” includes estimates of the net changes in the value of a statistical life and net changes in the deadweight costs of taxation.
Detailed Monetary Benefit Estimates Per Participant
Benefits from changes to:1                          Taxpayers  Participants  Others2  Indirect3     Total
Labor market earnings associated with employment       $2,443        $5,380       $0         $0    $7,823
Public assistance                                        $896        ($381)       $0       $446      $962
Food assistance                                          $465        ($420)       $0       $232      $276
Adjustment for deadweight cost of program                  $0            $0       $0   ($2,095)  ($2,095)
Totals                                                 $3,805        $4,579       $0   ($1,417)    $6,967
Detailed Annual Cost Estimates Per Participant
Annual cost
  Program costs       $4,154   (2014 dollars)
  Comparison costs        $0   (2014 dollars)

Summary
  Present value of net program costs (in 2016 dollars)   ($4,218)
  Cost range (+ or -)                                        43%
These programs typically last between two months and one year. We estimated the average annual cost of treatment per participant using data from studies in our meta-analysis that report cost estimates (Auspos et al., 1988; Bell & Orr, 1994; Blomquist, 1995; Bloom et al., 2000; Farrell, 2000; Freedman et al., 2000; Freedman et al., 1995; Hamilton et al., 1997; Riccio et al., 1986; Scrivener et al., 2002; Scrivener et al., 2001; Scrivener et al., 1998; Storto et al., 2000). Costs vary by study but may include administrative costs, employment services, case management, eligibility-related services, foregone earnings, tuition payments, allowances, support services such as transportation assistance and child care costs, and wage subsidies.
The figures shown are estimates of the costs to implement programs in Washington. The comparison group costs reflect either no treatment or treatment as usual, depending on how effect sizes were calculated in the meta-analysis. The cost range reported above reflects potential variation or uncertainty in the cost estimate; more detail can be found in our Technical Documentation.
Estimated Cumulative Net Benefits Over Time (Non-Discounted Dollars)
The graph above illustrates the estimated cumulative net benefits per participant for the first fifty years beyond the initial investment in the program. We present these cash flows in non-discounted dollars to make the “break-even” point easy to identify from a budgeting perspective. If the dollars are negative (bars below the $0 line), the cumulative benefits do not yet outweigh the cost of the program. The program breaks even when the dollars reach $0; at that point, the total benefits to participants, taxpayers, and others equal the cost of the program. If the dollars are above $0, the benefits of the program exceed the initial investment.
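The break-even logic can be sketched in a few lines. The annual benefit stream below is hypothetical; only the $4,218 net program cost is taken from this report.

```python
def break_even_year(net_cost, annual_benefits):
    """Return the first year in which cumulative non-discounted
    benefits cover the up-front program cost, or None if the program
    never breaks even within the horizon given."""
    cumulative = -net_cost
    for year, benefit in enumerate(annual_benefits, start=1):
        cumulative += benefit
        if cumulative >= 0:
            return year
    return None

# Hypothetical flat benefit stream of $600/year against the $4,218
# net cost: cumulative benefits first cover the cost in year 8.
year = break_even_year(4218, [600] * 50)
```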

*The effect size for this outcome indicates percentage change, not a standardized mean difference effect size.

Meta-analysis is a statistical method to combine the results from separate studies on a program, policy, or topic in order to estimate its effect on an outcome. WSIPP systematically reviews all credible evaluations we can locate on each topic. The outcomes measured are the types of program impacts that were measured in the research literature (for example, crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases.
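One common standardized effect-size metric is Cohen's d, the standardized mean difference; WSIPP's exact formulas are given in its Technical Documentation, so treat this as a generic illustration rather than WSIPP's procedure.

```python
from statistics import mean, stdev

def cohens_d(treatment, comparison):
    """Standardized mean difference: (mean_t - mean_c) / pooled SD.
    A positive value means the treatment group's outcome is higher."""
    n_t, n_c = len(treatment), len(comparison)
    s_t, s_c = stdev(treatment), stdev(comparison)
    pooled_sd = (((n_t - 1) * s_t**2 + (n_c - 1) * s_c**2)
                 / (n_t + n_c - 2)) ** 0.5
    return (mean(treatment) - mean(comparison)) / pooled_sd

# Toy data: treatment earnings one unit higher on average -> positive d.
d = cohens_d([5, 6, 7, 8], [4, 5, 6, 7])
```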

Adjusted effect sizes are used to calculate the benefits in our benefit-cost model. WSIPP may adjust effect sizes based on methodological characteristics of the study. For example, we may adjust effect sizes when a study has a weak research design or when the program developer is involved in the research. The magnitude of these adjustments varies depending on the topic area.

WSIPP may also adjust the second ES measurement. Research shows the magnitude of some effect sizes decreases over time. For those effect sizes, we estimate outcome-based adjustments, which we apply between the first and second ES measurements. We also report the unadjusted effect size to show the effect size before any adjustments have been made. More details about these adjustments can be found in our Technical Documentation.
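The “random effects model” noted in the table below is a standard meta-analytic technique. The following is a minimal sketch of DerSimonian-Laird random-effects pooling, which may differ in detail from WSIPP's implementation.

```python
def random_effects_pool(effect_sizes, standard_errors):
    """Pool study-level effect sizes with DerSimonian-Laird
    random-effects weights, 1 / (SE^2 + tau^2), where tau^2 is the
    estimated between-study variance."""
    k = len(effect_sizes)
    w = [1 / se**2 for se in standard_errors]        # fixed-effect weights
    sw = sum(w)
    fixed = sum(wi * es for wi, es in zip(w, effect_sizes)) / sw
    # Cochran's Q statistic measures between-study heterogeneity.
    q = sum(wi * (es - fixed)**2 for wi, es in zip(w, effect_sizes))
    c = sw - sum(wi**2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)               # between-study variance
    w_star = [1 / (se**2 + tau2) for se in standard_errors]
    return sum(wi * es for wi, es in zip(w_star, effect_sizes)) / sum(w_star)
```

With homogeneous studies tau-squared collapses to zero and the result matches a fixed-effect average; heterogeneity widens the weights toward equality.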

Meta-Analysis of Program Effects
                                              Adjusted effect sizes (ES) and standard errors (SE)
                                              used in the benefit-cost analysis                      Unadjusted effect size
                      No. of                  First time ES           Second time ES                 (random effects model)
Outcomes              effect     Treatment    is estimated            is estimated
measured              sizes      N            ES       SE      Age    ES       SE      Age           ES        p-value
Earnings*             36         95,653       0.146    0.026   39     0.000    0.018   40            0.149     0.001
Employment            32         95,650       0.091    0.014   39     0.000    0.018   40            0.094     0.001
Food assistance       19         42,878      -0.055    0.010   39     0.000    0.018   40           -0.058     0.001
Public assistance     38         91,383      -0.064    0.015   39     0.000    0.028   40           -0.065     0.001
Citations Used in the Meta-Analysis

Auspos, P., Cave, G., & Long, D. (1988). Final report on the Training Opportunities in the Private Sector Program. New York, NY: Manpower Demonstration Research Corporation.

Bell, S.P., & Orr, L.L. (1994). Is subsidized employment cost effective for welfare recipients? Experimental evidence from seven state demonstrations. The Journal of Human Resources, 29(1), 42-61.

Bloom, D., Kemple, J.J., Morris, P., Scrivener, S., Verma, N., Hendra, R., . . . Walter, J. (2000). The Family Transition Program: Final report on Florida's Initial Time-Limited Welfare Program. New York, NY: Manpower Demonstration Research Corporation.

Bloom, D., Miller, C., & Azurdia, G.L. (2007). Results from the Personal Roads to Individual Development and Employment (PRIDE) Program in New York City. New York, NY: Manpower Demonstration Research Corporation.

Farrell, M. (2000). Implementation, participation patterns, costs, and two-year impacts of the Detroit welfare-to-work program. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families, Office of the Assistant Secretary for Planning and Evaluation.

Fein, D.J., Beecroft, E., & Blomquist, J.D. (1994). Ohio Transitions to Independence Demonstration. Final Impacts for JOBS and Work Choice. Cambridge, MA: Abt Associates.

Freedman, S., Friedlander, D., Lin, W., & Schweder, A. (1996). The GAIN evaluation: Five-year impacts on employment, earnings and AFDC receipt. New York, NY: Manpower Demonstration Research Corporation.

Freedman, S., Knab, J.T., Gennetian, L.A., & Navarro, D. (2000). The Los Angeles Jobs-First GAIN Evaluation: Final report on a work first program in a major urban center. New York, NY: Manpower Demonstration Research Corporation.

Friedlander, D., Hoerz, G., Long, D., & Quint, J. (1985). Final report on the Employment Evaluation. New York, NY: Manpower Demonstration Research Corporation.

Hamilton, G., Brock, T., Farrell, M., Friedlander, D., Harknett, K., Hunter-Manns, J-A., . . . Weissman, J. (1997). Evaluating two welfare-to-work program approaches: Two-year findings on the labor force attachment and human capital development programs in three sites. New York, NY: Manpower Demonstration Research Corporation.

Jacobs, E., & Bloom, D. (2011). Alternative employment strategies for hard-to-employ TANF recipients: Final results from a test of transitional jobs and preemployment services in Philadelphia (OPRE Report 2011-19). Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

Pawasarat, J., & Quinn, L.M. (1993). Wisconsin welfare employment experiments: An evaluation of the WEJT and CWEP programs. Milwaukee, WI: Employment and Training Institute, Division of Outreach and Continuing Education Extension, University of Wisconsin-Milwaukee.

Riccio, J.A., Cave, G., Freedman, S., & Price, M. (1986). Final report on the Virginia Employment Services Program. New York, NY: Manpower Demonstration Research Corporation.

Scrivener, S., Hamilton, G., Farrell, M., Freedman, S., Friedlander, D., Mitchell, M., . . . Schwartz, C. (1998). Implementation, participation patterns, costs, and two-year impacts of the Portland (Oregon) welfare-to-work program. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families, Office of the Assistant Secretary for Planning and Evaluation.

Scrivener, S., Walter, J., Brock, T., & Hamilton, G. (2001). Evaluating two approaches to case management: Implementation, participation patterns, costs, and three-year impacts of the Columbus welfare-to-work program. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families, Office of the Assistant Secretary for Planning and Evaluation.

Scrivener, S., Hendra, R., Redcross, C., Bloom, D., Michalopoulos, C., & Walter, J. (2002). WRP: Final report on Vermont's Welfare Restructuring Project. New York, NY: Manpower Demonstration Research Corporation.

Storto, L., Hamilton, G., Schwartz, C., & Scrivener, S. (2000). Oklahoma City’s ET & E Program: Two-year implementation, participation, cost, and impact findings. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families and Office of the Assistant Secretary for Planning and Evaluation.

For more information on the methods used, please see our Technical Documentation.
360.664.9800
institute@wsipp.wa.gov