This post was written by Jonathan Nakamoto and Staci Wendt, Research Associates in WestEd’s Health & Justice Program.

We’re excited to announce the publication of an article describing WestEd’s evaluation, which used regression discontinuity and quasi-experimental designs to assess the impact of School Improvement Grants (SIGs). The article appears in SAGE Research Methods Cases Part 2.

The article is designed as a teaching tool for graduate students and practicing evaluators, demonstrating the use of both regression discontinuity and quasi-experimental designs, two methods with different strengths and limitations. Using both methods allowed us to compare findings across the two evaluation designs.

Our study analyzed high school students’ reading, mathematics, and science achievement in a Midwestern state. The high schools that participated in the SIG program were very low performing and received approximately $1 million a year, beginning in 2010, to help turn around their academic performance. Our aim was to evaluate whether the SIG program helped improve student achievement. We did so by employing two research methods: a regression discontinuity design and a quasi-experimental design.

Regression discontinuity designs are generally considered the most rigorous alternative to experimental studies. A regression discontinuity design allows researchers to evaluate the impact of an intervention when assignment to the treatment and control groups is based on a cutoff on a numeric assignment variable. In our study, the schools’ scores on their applications for SIG funding served as the assignment variable.
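The full analysis in the article is more involved, but a minimal sketch may help readers see the basic logic of a local linear regression discontinuity estimate. The data, variable names, cutoff, and bandwidth below are all hypothetical illustrations, not values from our evaluation.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 'score' stands in for a school's application score
# (the assignment variable), and 'achievement' for a school-level outcome.
rng = np.random.default_rng(0)
n = 200
cutoff = 70  # hypothetical funding cutoff
score = rng.uniform(40, 100, n)
treated = (score >= cutoff).astype(int)
achievement = 0.02 * score + rng.normal(0, 1, n)  # simulated outcome with no true effect

df = pd.DataFrame({
    "score_centered": score - cutoff,  # center the assignment variable at the cutoff
    "treated": treated,
    "achievement": achievement,
})

# Local linear RD: restrict to a bandwidth around the cutoff and allow
# separate slopes on each side; the coefficient on 'treated' estimates
# the discontinuity (the treatment effect) at the cutoff.
bandwidth = 15
local = df[df["score_centered"].abs() <= bandwidth]
model = smf.ols(
    "achievement ~ treated + score_centered + treated:score_centered",
    data=local,
).fit()
print(model.summary().tables[1])
```

In practice, bandwidth selection, functional form, and checks for manipulation of the assignment variable all matter; the article walks through these considerations in the context of the SIG evaluation.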

A quasi-experimental design also allows researchers to assess the impact of an intervention, but, unlike an experimental design, it relies on a comparison group that was not formed through random assignment. Quasi-experimental designs are much more commonly used in the field of education than regression discontinuity designs because they are typically easier to implement.
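There are many ways to build a quasi-experimental comparison group; the sketch below illustrates one common approach, propensity score matching on baseline covariates. It uses hypothetical data and variable names and is not drawn from the design described in the article.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Hypothetical data: SIG schools and non-SIG schools with baseline
# achievement and enrollment as matching covariates.
rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "sig": rng.integers(0, 2, n),
    "baseline_achievement": rng.normal(50, 10, n),
    "enrollment": rng.normal(800, 200, n),
    "outcome": rng.normal(52, 10, n),
})

# Step 1: estimate each school's propensity to receive SIG funding
# from its baseline covariates.
X = df[["baseline_achievement", "enrollment"]]
df["pscore"] = LogisticRegression().fit(X, df["sig"]).predict_proba(X)[:, 1]

# Step 2: match each SIG school to its nearest non-SIG school on the
# propensity score.
treated = df[df["sig"] == 1]
control = df[df["sig"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched_control = control.iloc[idx.ravel()]

# Step 3: compare mean outcomes in the matched sample.
effect = treated["outcome"].mean() - matched_control["outcome"].mean()
print(f"Estimated effect in the matched comparison: {effect:.2f}")
```

Because the comparison group is constructed rather than randomized, the credibility of this kind of estimate rests on how well the matching covariates capture the differences between SIG and non-SIG schools.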

The sample for our quasi-experimental design was larger than the sample for the regression discontinuity design because it included SIG-eligible schools that did not apply for funding, as well as schools whose application scores were too far from the funding cutoff to be included in the regression discontinuity design.

Key Takeaways

The purpose of the SAGE Research Methods Cases article is to help readers identify situations where regression discontinuity and quasi-experimental designs could be used to evaluate the impact of interventions. The article includes a small dataset that allows readers to replicate the analyses we conducted for the regression discontinuity design.

In addition, our article shows that researchers can have more confidence in their findings when estimates of program impact from multiple designs are consistent with one another.

The evaluation findings presented in the article indicated that the SIG program did not positively impact high school students’ reading, mathematics, and science achievement during the program’s first two years. In later years of the evaluation, we relied solely on the quasi-experimental design and continued to find no impact of the SIG program on high school students’ achievement.

Want to Learn More?

If you are interested in learning more about how the impact of a program could be evaluated using regression discontinuity or quasi-experimental designs, please contact us at jnakamo@wested.org. Visit the “An Evaluation of School Improvement Grants Using Regression Discontinuity and Quasi-Experimental Designs” abstract page to find out how to gain access to the full article.