To answer key questions about what works, WestEd conducts impact studies that use rigorous research designs, including randomized controlled trials.
Does a new program lead to improved education outcomes? Do students who use one science curriculum do better than students who use another? Impact studies are designed to answer such questions.
Impact studies test whether one thing causes another. Knowing that scores went up after an intervention doesn’t mean the improvement is attributable to that intervention; other changes, such as changes in the school’s schedule or staffing, may also have contributed. Impact studies isolate the effect of an intervention by ensuring a clean comparison between a treatment group that received the intervention and a comparison group that is as similar as possible except that it did not receive the intervention.
A randomized controlled trial (RCT) is the “gold standard” research design for establishing cause and effect: random assignment of participants to the treatment or control group minimizes differences between the two groups. But a well-designed, well-implemented quasi-experimental study can also support strong inferences by carefully matching the treatment group to a comparison group. For example, in regression discontinuity designs, which WestEd researchers have written about, participants are admitted to a program based on a cut score on an eligibility measure, such as an achievement score. Students just above the cut score enter the program, and their outcomes are compared with those of counterparts who scored just below the cut score and therefore did not enter the program.
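The cut-score logic of a regression discontinuity design can be sketched with simulated data. Everything below is invented for illustration: the cut score, the effect size, and the bandwidth are hypothetical, and a real analysis would fit regression lines on each side of the cut rather than simply comparing mean gains in narrow windows.

```python
import random

random.seed(0)

CUT_SCORE = 70       # hypothetical eligibility cut score
TRUE_EFFECT = 5.0    # simulated boost the program adds to outcomes
BANDWIDTH = 5        # window around the cut score for the comparison

# Simulate 2,000 students: baseline achievement determines eligibility.
students = []
for _ in range(2000):
    baseline = random.uniform(40, 100)
    treated = baseline >= CUT_SCORE              # the assignment rule
    gain = random.gauss(0, 3) + (TRUE_EFFECT if treated else 0.0)
    students.append((baseline, treated, baseline + gain))

def mean(values):
    return sum(values) / len(values)

# Compare score gains for students just above vs. just below the cut score;
# near the threshold, the two groups should be alike in every respect
# except that one received the program.
gains_above = [o - b for b, t, o in students
               if CUT_SCORE <= b < CUT_SCORE + BANDWIDTH]
gains_below = [o - b for b, t, o in students
               if CUT_SCORE - BANDWIDTH <= b < CUT_SCORE]

estimate = mean(gains_above) - mean(gains_below)
print(f"Estimated program effect near the cut score: {estimate:.1f}")
```

Because assignment is determined entirely by the cut score, students just on either side of it form the naturally comparable groups that the design exploits.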
Impact studies conducted at WestEd range from large, school-based trials (with dozens of schools, hundreds of teachers, and thousands of students) to more rapid, small-scale studies (involving 20–30 students at a time). WestEd’s studies use state-of-the-art techniques and aim to make practical contributions to professional practice in education and other fields, as well as to advance theoretical understanding of what works and how.
Illustrative examples from our work
Investing in Innovation (i3) studies
Often, WestEd serves as the evaluator on grants funded by state or federal agencies. For example, WestEd researchers have conducted three impact studies of programs funded under the federal Investing in Innovation (i3) grant initiative. One study was an RCT and two used quasi-experimental designs; all three met What Works Clearinghouse standards for their respective designs. The programs studied range from an arts program in Oregon to a STEM initiative in Nevada to the Expository Reading and Writing Course, a California high school course that prepares students for college.
Impact research building theory about how students learn
WestEd and partners are investigating ways that commonly used visuals and diagrams can be best arranged on a page or a screen to maximize learning in mathematics and science. This work involves a series of small-scale RCTs in which middle school students study visualizations of math and science concepts, and their understanding of those concepts is measured afterward. Spatial arrangements are deliberately varied to test general principles of visual processing and cognition. The studies provide guidance to teachers and curriculum developers about the best arrangement of visuals in instructional materials. Findings also build theory about how students process visualizations, which informs new studies.
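A small-scale RCT of this kind can be sketched in a few lines. The group sizes, arrangement labels, and simulated scores below are hypothetical, standing in for the measured comprehension scores an actual study would collect.

```python
import random

random.seed(1)

# Hypothetical post-test scores for 24 middle school students randomly
# assigned to one of two spatial arrangements of the same diagram.
participants = list(range(24))
random.shuffle(participants)                  # random assignment
integrated_group = participants[:12]          # labels embedded in diagram
separated_group = participants[12:]           # labels in a separate legend

# Simulated scores: assume (for illustration) the integrated arrangement
# yields better comprehension on a 0-20 post-test.
scores = {}
for pid in integrated_group:
    scores[pid] = random.gauss(14, 2)
for pid in separated_group:
    scores[pid] = random.gauss(11, 2)

def group_mean(group):
    return sum(scores[pid] for pid in group) / len(group)

difference = group_mean(integrated_group) - group_mean(separated_group)
print(f"Mean difference (integrated - separated): {difference:.1f}")
```

Because assignment to arrangement is random, a difference in mean post-test scores can be attributed to the arrangement itself rather than to preexisting differences between the groups.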
Five university partners and WestEd formed the National Research & Development Center on Cognition & Mathematics Instruction to study whether revising an existing math curriculum using principles from research in cognitive science would improve student math outcomes. The center team revised student and teacher materials for seventh grade and ran a nationwide RCT to determine whether students benefited more from the new versus existing materials.
Impact studies focused on the local level
Educators in districts and at the state level turn to WestEd to help determine the impact of local programs. Studies can be carried out quickly, using outcome data already being collected. For example, REL West partnered with the Silicon Valley Education Foundation and local districts to study a summer school program, Elevate. The study report, released within a year, documented improvements in algebra readiness. A companion research brief was used by the school community to guide discussions of how to further strengthen outcomes.