Washington State Institute for Public Policy

Teacher professional development: Targeted

Pre-K to 12 Education
Benefit-cost methods last updated December 2024.  Literature review updated June 2014.
This program was archived December 2024.
Generally, professional development (PD) for K–12 teachers includes activities such as workshops, conferences, summer institutes, and time set aside during the school year for staff development. Targeted PD focuses on improving teaching in a particular content area (such as reading, math, and science) and/or a particular grade level. The specific types of PD evaluated and included in this meta-analysis are (in no particular order) Language Essentials for Teachers of Reading and Spelling (LETRS), Pacific Communities with High Performance in Literacy Development (Pacific CHILD), Cognitively Guided Instruction, Math & Science Partnerships (MSP), Teaching Science, Mathematics and Relevant Technologies (Teaching SMART), Discovery Model Schools Initiative, the Integrated Mathematics Assessment, Teaching Cases, and Metacognitive Analysis. Most forms of targeted PD include a summer institute in addition to training provided during the regular school year.
 
For an overview of WSIPP's Benefit-Cost Model, please see this guide. The estimates shown are present-value, life-cycle benefits and costs. All dollars are expressed in the base year chosen for this analysis (2022). The chance that the benefits exceed the costs is derived from a Monte Carlo risk analysis. The details of this analysis, as well as the economic discount rates and other relevant parameters, are described in our 2023 Technical Documentation.
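To make the Monte Carlo idea concrete, here is a minimal sketch (in Python) of how such a risk analysis can estimate the chance that benefits exceed costs: key inputs are drawn repeatedly from assumed distributions, net benefits are computed for each draw, and the share of draws with positive net benefits is reported. This is not WSIPP's model; every distribution and dollar figure below is a hypothetical placeholder.

import numpy as np

rng = np.random.default_rng(0)

def chance_benefits_exceed_costs(n_draws=10_000):
    # Illustrative only: hypothetical distributions, not WSIPP's parameters.
    # Draw an effect size around its estimated mean and standard error.
    effect_size = rng.normal(loc=0.071, scale=0.055, size=n_draws)
    # Hypothetical present-value dollars of benefit per unit of effect size.
    dollars_per_es = rng.normal(loc=160_000, scale=40_000, size=n_draws)
    # Program cost, varied within the reported +/- 10% cost range.
    cost = 315 * rng.uniform(0.9, 1.1, size=n_draws)
    net_benefits = effect_size * dollars_per_es - cost
    return (net_benefits > 0).mean()

print(f"Chance benefits exceed costs: {chance_benefits_exceed_costs():.0%}")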
Benefit-Cost Summary Statistics Per Participant
Benefits to:
  Taxpayers: $2,542
  Participants: $5,987
  Others: $3,156
  Indirect: ($158)
Total benefits: $11,527
Net program cost: ($315)
Benefits minus costs: $11,212
Benefit-to-cost ratio: $36.59
Chance the program will produce benefits greater than the costs: 79%
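The summary statistics above are simple arithmetic on the per-participant figures; the short check below reproduces them (values copied from the table; small differences elsewhere on this page reflect rounding).

# Reproduce the summary statistics from the per-participant figures above.
taxpayers, participants, others, indirect = 2_542, 5_987, 3_156, -158
net_program_cost = 315

total_benefits = taxpayers + participants + others + indirect  # 11,527
benefits_minus_costs = total_benefits - net_program_cost       # 11,212
benefit_cost_ratio = total_benefits / net_program_cost         # about 36.59

print(total_benefits, benefits_minus_costs, round(benefit_cost_ratio, 2))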

Meta-analysis is a statistical method to combine the results from separate studies on a program, policy, or topic to estimate its effect on an outcome. WSIPP systematically evaluates all credible evaluations we can locate on each topic. The outcomes measured are the program impacts measured in the research literature (for example, impacts on crime or educational attainment). Treatment N represents the total number of individuals or units in the treatment group across the included studies.

An effect size (ES) is a standard metric that summarizes the degree to which a program or policy affects a measured outcome. If the effect size is positive, the outcome increases. If the effect size is negative, the outcome decreases. See Estimating Program Effects Using Effect Sizes for additional information on how we estimate effect sizes.
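To make the pooling step concrete, the sketch below implements a basic inverse-variance random-effects meta-analysis (the DerSimonian-Laird estimator). It is illustrative only; the study-level effect sizes and standard errors are hypothetical, and WSIPP's actual procedures are described in the technical documentation.

import numpy as np

def random_effects_pool(es, se):
    # Pool study effect sizes with a DerSimonian-Laird random-effects model.
    es, se = np.asarray(es, float), np.asarray(se, float)
    w = 1.0 / se**2                              # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * es) / np.sum(w)
    q = np.sum(w * (es - fixed) ** 2)            # heterogeneity statistic
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(es) - 1)) / c)     # between-study variance
    w_star = 1.0 / (se**2 + tau2)                # random-effects weights
    pooled = np.sum(w_star * es) / np.sum(w_star)
    pooled_se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, pooled_se

# Hypothetical study-level effect sizes and standard errors.
print(random_effects_pool([0.25, 0.10, 0.31, -0.05], [0.08, 0.12, 0.10, 0.09]))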

The effect size may be adjusted from the unadjusted effect size estimated in the meta-analysis. Historically, WSIPP adjusted effect sizes for some programs based on the methodological characteristics of the studies. For programs reviewed in 2024 or later, we do not make additional adjustments, and we use the unadjusted effect size whenever we run a benefit-cost analysis.

Research shows that the magnitude of some effects changes over time. For those effect sizes, we estimate outcome-based adjustments, which we apply between the first time the ES is estimated and the second time the ES is estimated. More details about these adjustments can be found in our 2023 Technical Documentation.
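As a rough illustration of carrying an effect measured at one age forward to a later age, the sketch below linearly interpolates between the first and second effect-size estimates. This is a simplification of the outcome-based adjustments described in the technical documentation; the ages and effect sizes are taken from the table below purely for illustration.

def interpolate_effect(age, first_age, first_es, second_age, second_es):
    # Linearly interpolate an effect size between two measurement ages.
    if age <= first_age:
        return first_es
    if age >= second_age:
        return second_es
    frac = (age - first_age) / (second_age - first_age)
    return first_es + frac * (second_es - first_es)

# First estimate: ES 0.071 at age 11; second estimate: ES 0.051 at age 17.
for age in range(11, 18):
    print(age, round(interpolate_effect(age, 11, 0.071, 17, 0.051), 3))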

Meta-Analysis of Program Effects
Outcomes measured: Test scores
Treatment age: 10
No. of effect sizes: 14
Treatment N: 161,229
Effect sizes (ES) and standard errors (SE) used in the benefit-cost analysis:
  First time ES is estimated: ES 0.071, SE 0.055, age 11
  Second time ES is estimated: ES 0.051, SE 0.060, age 17
Unadjusted effect size (random effects model): ES 0.198, p-value 0.008
1 In addition to the outcomes measured in the meta-analysis table, WSIPP measures benefits and costs estimated from other outcomes associated with those reported in the evaluation literature. For example, empirical research demonstrates that high school graduation leads to reduced crime. These associated measures provide a more complete picture of the detailed costs and benefits of the program.

2 “Others” includes benefits to people other than taxpayers and participants. Depending on the program, it could include reductions in crime victimization, the economic benefits from a more educated workforce, and the benefits from employer-paid health insurance.

3 “Indirect benefits” includes estimates of the net changes in the value of a statistical life and net changes in the deadweight costs of taxation.
Detailed Monetary Benefit Estimates Per Participant
Affected outcome: Test scores
  Resulting benefits:1 Labor market earnings associated with test scores
  Benefits accrue to: Taxpayers $2,542; Participants $5,987; Others2 $3,156; Indirect3 $0; Total $11,684
Affected outcome: Program cost
  Resulting benefits:1 Adjustment for deadweight cost of program
  Benefits accrue to: Taxpayers $0; Participants $0; Others2 $0; Indirect3 ($158); Total ($158)
Totals
  Taxpayers $2,542; Participants $5,987; Others2 $3,156; Indirect3 ($158); Total $11,527
Detailed Annual Cost Estimates Per Participant
Annual cost (year dollars):
  Program costs: $260 (2013 dollars)
  Comparison costs: $0 (2013 dollars)
Summary:
  Present value of net program costs (in 2022 dollars): ($315)
  Cost range (+ or -): 10%
In the evaluations included in the meta-analysis, teachers received an average of 63 additional hours of targeted professional development (PD) compared with the usual amount of PD time. We calculated the value of that PD time using average teacher salaries (including benefits) in Washington State as reported by the Office of Superintendent of Public Instruction. To calculate a per-student annual cost, we divided compensation costs by the number of students per classroom in Washington's prototypical schools formula and added per-student materials, supplies, and operating costs to account for the overhead (i.e., facility and administrative costs) associated with providing PD.
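The per-student cost calculation described above can be sketched as follows; the hourly compensation, class size, and overhead figures are hypothetical placeholders, not the actual Washington State values used in the estimate.

def per_student_pd_cost(pd_hours, hourly_compensation, students_per_class, overhead_per_student):
    # Value of the extra PD time, spread across a classroom of students,
    # plus per-student materials, supplies, and operating (overhead) costs.
    teacher_cost = pd_hours * hourly_compensation
    return teacher_cost / students_per_class + overhead_per_student

# Hypothetical figures: 63 extra PD hours, $80/hour compensation (salary plus benefits),
# 25 students per classroom, $50 per student in overhead.
print(round(per_student_pd_cost(63, 80.0, 25, 50.0), 2))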
The figures shown are estimates of the costs to implement programs in Washington. The comparison group costs reflect either no treatment or treatment as usual, depending on how effect sizes were calculated in the meta-analysis. The cost range reported above reflects potential variation or uncertainty in the cost estimate; more detail can be found in our 2023 Technical Documentation.
[Charts: Benefits Minus Costs; Benefits by Perspective; Taxpayer Benefits by Source of Value]
Benefits Minus Costs Over Time (Cumulative Discounted Dollars)
The graph above illustrates the estimated cumulative net benefits per participant for the first fifty years beyond the initial investment in the program. We present these cash flows in discounted dollars. If the dollars are negative (bars below the $0 line), the cumulative benefits do not outweigh the cost of the program up to that point in time. The program breaks even when the dollars reach $0. At this point, the total benefits to participants, taxpayers, and others are equal to the cost of the program. If the dollars are above $0, the benefits of the program exceed the initial investment.
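A minimal sketch of the cumulative-discounted-cash-flow logic behind such a graph appears below. The annual benefit stream and discount rate are hypothetical; WSIPP's actual discount rates and cash-flow timing are described in the 2023 Technical Documentation.

def cumulative_net_benefits(annual_benefits, program_cost, discount_rate):
    # Cumulative discounted net benefits, year by year, relative to an up-front cost.
    running = -program_cost
    series = []
    for year, benefit in enumerate(annual_benefits, start=1):
        running += benefit / (1 + discount_rate) ** year
        series.append(running)
    return series

# Hypothetical: $50 per year in benefits for 50 years, a $315 up-front cost, 3.5% discount rate.
series = cumulative_net_benefits([50] * 50, 315, 0.035)
break_even_year = next(year for year, value in enumerate(series, start=1) if value >= 0)
print(break_even_year, round(series[-1], 2))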

Citations Used in the Meta-Analysis

Abe, Y., Thomas, V., Sinicrope, C., & Gee, K.A. (2012). Effects of the Pacific CHILD professional development program. (NCEE 2013–4002). Washington, DC: National Center for Education Evaluation and Regional Assistance.

Borman, K.M., Cotner, B.A., Lee, R.S., Boydston, T.L., & Lanehart, R. (2009). Improving elementary science instruction and student achievement: The impact of a professional development program. Paper presented at the Second Annual Conference of the Society for Research on Educational Effectiveness, Crystal City, VA.

Borman, G.D., Gamoran, A., & Bowdon, J. (2008). A randomized trial of teacher development in elementary science: First-year achievement effects. Journal of Research on Educational Effectiveness, 1(4), 237-264.

Carpenter, T.P., Fennema, E., Peterson, P.L., Chiang, C.P., & Loef, M. (1989). Using knowledge of children's mathematics thinking in classroom teaching: An experimental study. American Educational Research Journal, 26(4), 499-531.

Foster, J.M., Toma, E.F., & Troske, S.P. (2013). Does teacher professional development improve math and science outcomes and is it cost effective? Journal of Education Finance, 38(3), 255-275.

Garet, M.S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., . . . Silverberg, M. (2008). The impact of two professional development interventions on early reading instruction and achievement. Washington, DC: National Center for Education Evaluation and Regional Assistance.

Garet, M.S., Wayne, A.J., Stancavage, F., Taylor, J., Walters, K., Song, M., . . . Warner, E. (2010). Middle school mathematics professional development impact study: Findings after the first year of implementation. Washington, DC: National Center for Education Evaluation and Regional Assistance.

Heller, J.I., Daehler, K.R., Wong, N., Shinohara, M., & Miratrix, L.W. (2012). Differential effects of three professional development models on teacher knowledge and student achievement in elementary science. Journal of Research in Science Teaching, 49(3), 333-362.

Johnson, C.C., Kahle, J.B., & Fargo, J.D. (2007). A study of the effect of sustained, whole-school professional development on student achievement in science. Journal of Research in Science Teaching, 44(6), 775-786.

McCutchen, D., Abbott, R.D., Green, L.B., Beretvas, S.N., Cox, S., Potter, N.S., . . . Gray, A.L. (2002). Beginning literacy: Links among teacher knowledge, teacher practice, and student learning. Journal of Learning Disabilities, 35(1), 69-86.

Santagata, R., Kersting, N., Givvin, K.B., & Stigler, J.W. (2011). Problem implementation as a lever for change: An experimental study of the effects of a professional development program on students' mathematics learning. Journal of Research on Educational Effectiveness, 4(1), 1-24.

Saxe, G., Gearhart, M., & Nasir, N. (2001). Enhancing students' understanding of mathematics: A study of three contrasting approaches to professional support. Journal of Mathematics Teacher Education, 4(1), 55-79.