It is that time of year: This winter, state education agencies (SEAs) are proposing revisions to accountability plans under the ESEA, relying on U.S. Department of Education (ED) revision guidance released last fall.

To assist, school accountability expert Robert Salley, a former ED program officer and current advisor to SEAs as Deputy Director of the Region 13 Comprehensive Center, offers insight into the opportunities this year’s accountability revision cycle presents, national trends, and practical tips.

What opportunities does this year’s Accountability Plan revision cycle present for SEAs?

Salley: I’ll start with this: Accountability systems are meant to be improved. These plans are living, breathing documents. So, [revisions are] an opportunity to take stock of what is working and what is not working and to engage the field to determine the best course of action going forward.

Since the reauthorization of the ESEA in 2015, states have gotten far more comfortable thinking “blue sky” about using flexibility to innovate and improve. That is, beyond the requirements of the law, what can we validly and reliably measure to better understand adjacent student factors that impact learning? How can we fine-tune, tweak, and build in more diagnostics to understand where our system is today and where we are trying to go with our learners?

Take chronic absenteeism—the more we understand who is absent and why, the more we can support districts and schools in doing something about it. Within accountability work, School Quality and Student Success measures provide states with meaningful opportunities to improve student outcomes.

Many states are also resetting the ways their business rules operate and how indicators are measured under the law, and they are looking deeply at what has impacted them—such as decreased enrollment numbers.

Also, states are thinking about setting standards, the front end of this work. As states evaluate which standards and assessment systems have served them well over time versus what they need to tweak, some are looking at whether consortium assessments are working for them or whether the greatest investment is in designing their own assessments—to better reflect their values as a state and to provide strong evidence of student learning.

What trends are we seeing in terms of school ratings?

Salley: There are new data, and a big gap since schools were last assessed. In many states, we are seeing schools that have never before been identified for “support and improvement” starting to show up on radar screens. Some schools may have a decline in performance. So, states have an opportunity to really step back, review the data, make sense of them, and co-create meaning with communities around data to ensure that accountability systems are working, figure out how to better support schools, and assess whether causal factors stemming directly from the pandemic are in play.

Finally, although some schools continue to exit improvement status, on the opposite end of the spectrum other schools are languishing in improvement status. States are trying to figure out how to serve schools better on both ends of the spectrum—to make sure the right supports are in place when schools exit status, and, for those schools languishing, to determine the right levels of support so that they don’t stagnate and stay in improvement status forever.

What specific tips do you have for SEAs on how to unearth potential areas for change?

Salley: Start with these:

  • Think about what is in the best interest of students—the end users. Who are your students, how have demographics shifted over time—and what is the responsibility of the state in meeting needs on the ground?
  • Look further at how well comparable groups of students are performing, and do that analysis year over year. The revision cycle is really an opportunity to evaluate the effectiveness of programs and services in serving each group. Resource Allocation Reviews can also help here.
  • Then think about practical implementation issues. For example: Where do silos exist at the state or district levels? How do we break them down so schools and students really benefit from the full range of expertise inside the agency? How do shifts in leadership impact how practice is happening today in an SEA and future direction setting?

Interest holder engagement regarding proposed revisions is both a legal requirement and a critically important driver of progress. What does it look like when states do interest holder outreach and analysis well?

Salley: First, systems that successfully engage a wide range of people in interpreting school performance data are, as a result, able to understand whether changes to accountability and assessment systems are necessary to propel systems forward. Start here:

  • Consider whether you are eliciting feedback beyond traditional interest holders: schools, districts, parents, students, and board members. To get a comprehensive picture, you’ll need feedback from everyone affected by education, including local government, nonprofits that support families and students, and others. Beyond legal requirements, this is a real opportunity for the state to share its story, vision, and proposed changes.

The more interest holder engagement there is, the more likely states are to construct plans that reflect the values of the state and the realistic needs of its people. Some states do a good job of looking at their calendars of regularly scheduled meetings and using already planned forums to focus interest holder attention on improvements.

  • Think about how to reach multiple audiences—those with technical skills but also parents. It can help to create a companion guide that breaks down what would change, what it would change from and to, how it would change, and what the predicted impact would be. For nontechnical audiences, original “redlined” language, clear direction on how to engage with the text, and plain language in general are all helpful.
  • Consider designing a tool to help you articulate what you are asking for feedback on and to organize that feedback by type. It could be as simple as a spreadsheet that allows you to code the types of responses you are getting from certain interest holder groups. This would make it easier to disseminate feedback to your SEA team and share feedback with leadership, such as boards of education and others. You may unearth powerful broad-brush strokes and themes relevant to changes in accountability systems.
  • Bring together the teams of folks writing the specifications. This will help break down silos within SEAs. Each team can talk about what it is seeing across the feedback, across the agency. Often, what seems like feedback on one particular area will have import across numerous areas of the plan.

Great pointers. Finally, what free resources can SEAs turn to during this process?

Salley: I would recommend the following:

  • Reach out to your program officer at ED who can help guide you through the process and answer any questions. Or just make them aware that your state may undertake some changes.
  • Comprehensive Centers and Regional Educational Laboratories can also help you work through technical elements and receive fair, unbiased recommendations on potential next steps given your own state’s circumstances.
  • Look at peer state data trends and what those peer states have been doing to support schools during this gap in assessment. Also, look ahead to understand how other states are adjusting their systems.
  • Finally, Family Engagement Centers offer an opportunity to reach parents, and Equity Assistance Centers provide opportunities to work with LEAs looking at civil rights issues, for example.

The Region 13 Comprehensive Center works with state education agencies and their regional and local constituents in New Mexico, Oklahoma, and the Bureau of Indian Education to improve outcomes for all children and better serve communities through capacity-building technical assistance.

The contents of this post were developed by the Region 13 Comprehensive Center. The Region 13 Comprehensive Center is funded by a grant from the U.S. Department of Education. However, the contents of this post do not necessarily represent the policy of the Department of Education, and you should not assume endorsement by the federal government.