Benefit-Cost Summary Statistics Per Participant

| Benefits to: | | Summary statistics | |
|---|---|---|---|
| Taxpayers | $4,181 | Benefits minus costs | $19,028 |
| Participants | $9,850 | Benefit to cost ratio | $147.84 |
| Others | $5,191 | Chance the program will produce benefits greater than the costs | 98% |
| Indirect | ($65) | | |
| Total benefits | $19,158 | | |
| Net program cost | ($130) | | |
| Benefits minus cost | $19,028 | | |
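
The summary statistics above are related by straightforward arithmetic. A minimal sketch (Python, not WSIPP's own code) re-derives them from the rounded per-participant figures; because the published ratio of $147.84 is computed from unrounded present values under WSIPP's discounting conventions, dividing the rounded totals gives only an approximation.

```python
# Arithmetic check using the rounded table values above (not WSIPP's actual code).

benefits = {"taxpayers": 4_181, "participants": 9_850, "others": 5_191, "indirect": -65}
total_benefits = 19_158      # as reported; summing the rounded rows gives 19,157 ($1 rounding)
net_program_cost = -130      # parentheses in the table denote negative values

benefits_minus_costs = total_benefits + net_program_cost
print(benefits_minus_costs)                               # 19028, matching the reported $19,028
print(sum(benefits.values()))                             # 19157, a $1 rounding difference

# The benefit-to-cost ratio reports benefits per dollar of cost; from the rounded
# totals it comes out near, but not exactly at, the published $147.84.
print(round(total_benefits / abs(net_program_cost), 2))   # ~147.37
```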
Meta-Analysis of Program Effects

| Outcomes measured | Treatment age | No. of effect sizes | Treatment N |
|---|---|---|---|
| Test scores (standardized, validated tests of academic achievement) | 10 | 10 | 196,126 |

Effect sizes (ES) and standard errors (SE) used in the benefit-cost analysis:

| | ES | SE | Age |
|---|---|---|---|
| First time ES is estimated | 0.117 | 0.035 | 11 |
| Second time ES is estimated | 0.084 | 0.038 | 17 |

Unadjusted effect size (random effects model):

| ES | p-value |
|---|---|
| 0.190 | 0.001 |
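
The unadjusted pooled effect size above comes from a random-effects model. As a rough illustration of how such a pooled estimate is formed, the sketch below applies the standard DerSimonian-Laird random-effects estimator to hypothetical study-level effect sizes; the ten effect sizes actually underlying the 0.190 estimate, and WSIPP's own adjustment procedures, are not reproduced here.

```python
# Minimal DerSimonian-Laird random-effects pooling sketch (standard textbook method;
# WSIPP's actual estimation and its ES adjustments are not shown in this extract).
import numpy as np

es = np.array([0.25, 0.10, 0.31, 0.05, 0.22])   # hypothetical per-study effect sizes
se = np.array([0.08, 0.06, 0.12, 0.05, 0.09])   # hypothetical standard errors

w_fixed = 1.0 / se**2                            # fixed-effect (inverse-variance) weights
mu_fixed = np.sum(w_fixed * es) / np.sum(w_fixed)

# Between-study heterogeneity (tau^2) via the DerSimonian-Laird moment estimator
q = np.sum(w_fixed * (es - mu_fixed) ** 2)
df = len(es) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_random = 1.0 / (se**2 + tau2)                  # random-effects weights
mu_random = np.sum(w_random * es) / np.sum(w_random)
se_random = np.sqrt(1.0 / np.sum(w_random))

print(f"pooled ES = {mu_random:.3f}, SE = {se_random:.3f}, tau^2 = {tau2:.4f}")
```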
Detailed Monetary Benefit Estimates Per Participant

| Affected outcome | Resulting benefits¹ | Taxpayers | Participants | Others² | Indirect³ | Total |
|---|---|---|---|---|---|---|
| Test scores | Labor market earnings associated with test scores | $4,181 | $9,850 | $5,191 | $0 | $19,223 |
| Program cost | Adjustment for deadweight cost of program | $0 | $0 | $0 | ($65) | ($65) |
| Totals | | $4,181 | $9,850 | $5,191 | ($65) | $19,158 |
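
Because each cell above is rounded from unrounded present values, cross-footing the printed figures can differ from the printed totals by a dollar; a minimal check, using only the rounded cells, is sketched below.

```python
# Cross-footing the benefit table from its rounded cells (illustrative only).
rows = {
    "Test scores":  [4_181, 9_850, 5_191, 0],     # taxpayers, participants, others, indirect
    "Program cost": [0, 0, 0, -65],
}

for outcome, cells in rows.items():
    print(outcome, sum(cells))       # 19,222 vs. the printed $19,223; -65 matches ($65)

column_totals = [sum(col) for col in zip(*rows.values())]
print(column_totals, sum(column_totals))   # [4181, 9850, 5191, -65], 19157 vs. printed $19,158
```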
Detailed Annual Cost Estimates Per Participant

| | Annual cost | Year dollars | Summary | |
|---|---|---|---|---|
| Program costs | $107 | 2013 | Present value of net program costs (in 2022 dollars) | ($130) |
| Comparison costs | $0 | 2013 | Cost range (+ or -) | 10% |
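
The costs are reported in 2013 dollars, while the present value of net program costs is expressed in 2022 dollars. A hedged sketch of that kind of conversion is shown below; the price index, discount rate, and cost timing are placeholder assumptions (the actual WSIPP parameters are not shown in this extract), so it will not reproduce the reported ($130) exactly.

```python
# Illustrative conversion of a cost reported in 2013 dollars to a present value
# in 2022 dollars. All parameters below are hypothetical placeholders.

annual_cost_2013_dollars = 107.0
comparison_cost_2013_dollars = 0.0

inflation_factor_2013_to_2022 = 1.20   # hypothetical cumulative price-index growth
discount_rate = 0.035                  # hypothetical real discount rate
years_before_base_year = 0             # hypothetical: cost assumed incurred in the base year

net_cost = annual_cost_2013_dollars - comparison_cost_2013_dollars
net_cost_2022_dollars = net_cost * inflation_factor_2013_to_2022
present_value = net_cost_2022_dollars / (1 + discount_rate) ** years_before_base_year

print(round(present_value, 2))         # ~128.4 under these placeholder assumptions
```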
[Charts: Benefits Minus Costs; Benefits by Perspective; Taxpayer Benefits by Source of Value; Benefits Minus Costs Over Time (Cumulative Discounted Dollars)]
The Benefits Minus Costs Over Time chart illustrates the estimated cumulative net benefits per participant for the first fifty years beyond the initial investment in the program. We present these cash flows in discounted dollars. If the dollars are negative (bars below the $0 line), the cumulative benefits do not yet outweigh the cost of the program at that point in time. The program breaks even when the dollars reach $0; at this point, the total benefits to participants, taxpayers, and others equal the cost of the program. If the dollars are above $0, the benefits of the program exceed the initial investment.
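
A minimal sketch of the break-even logic described above, using a hypothetical benefit stream (the per-year cash flows behind the chart are not shown in this extract):

```python
# Cumulative discounted net benefits and break-even year, with placeholder numbers.

upfront_cost = 130.0        # net program cost per participant, incurred up front
annual_benefit = 50.0       # hypothetical undiscounted benefit per year
discount_rate = 0.035       # hypothetical discount rate
years = 50

cumulative = -upfront_cost  # bars start below $0: costs precede benefits
for year in range(1, years + 1):
    cumulative += annual_benefit / (1 + discount_rate) ** year
    if cumulative >= 0:
        print(f"breaks even in year {year}: cumulative = {cumulative:,.0f}")
        break
```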
Citations

Al Otaiba, S., Connor, C.M., Folsom, J.S., Greulich, L., Meadows, J., & Li, Z. (2011). Assessment data-informed guidance to individualize kindergarten reading instruction: Findings from a cluster-randomized control field trial. The Elementary School Journal, 111(4), 535-560.
Connor, C.M., Morrison, F.J., Fishman, B.J., Schatschneider, C., & Underwood, P. (2007). The early years: Algorithm-guided individualized reading instruction. Science, 315(5811), 464-465.
Fuchs, L.S., Fuchs, D., Karns, K., Hamlett, C.L., & Katzaroff, M. (1999). Mathematics performance assessment in the classroom: Effects on teacher planning and student problem solving. American Educational Research Journal, 36(3), 609-646.
Heller, J.I., Daehler, K.R., Wong, N., Shinohara, M., & Miratrix, L.W. (2012). Differential effects of three professional development models on teacher knowledge and student achievement in elementary science. Journal of Research in Science Teaching, 49(3), 333-362.
Konstantopoulos, S., Miller, S.R., & van der Ploeg, A. (2013). The impact of Indiana's system of interim assessments on mathematics and reading achievement. Educational Evaluation and Policy Analysis, 35(4), 481-499.
Quint, J.C., Sepanik, S., & Smith, J.K. (2008). Using student data to improve teaching and learning: Findings from an evaluation of the Formative Assessments of Students Thinking in Reading (FAST-R) Program in Boston elementary schools. New York: MDRC.
Slavin, R.E., Cheung, A., Holmes, G.C., Madden, N.A., & Chamberlain, A. (2013). Effects of a data-driven district reform model on state assessment outcomes. American Educational Research Journal, 50(2), 371-396.
Tyler, J.H. (2013). If you build it will they come? Teachers' online use of student performance data. Education Finance and Policy, 8(2), 168-207.