Effective Innovation Starts With Better Data

Mary Laski, Ph.D., and Calen Clifton | 16 July 2025

“Business as usual” is clearly not working for today’s teachers and students. Addressing the mounting problems in our education system will require bold, innovative solutions. But just as importantly, we will need to study those innovations, both to prove what works and to identify where things might be going wrong. Behind every groundbreaking innovation lies a foundational challenge that can make or break a reform effort: data. Before researchers can rigorously evaluate outcomes, we must accurately track and document the intervention itself.

As straightforward as it sounds, “getting the data right” is often far more complicated than researchers expect. We share three key principles for effective data collection as gleaned from our ongoing evaluation of Next Education Workforce™ (NEW), a strategic school staffing model developed by faculty and staff at Arizona State University and implemented in Mesa Public Schools (MPS). We argue that solid data tracking is an essential, if unglamorous, first step in developing an evidence base for innovative interventions like NEW.


1. Explicitly define the innovation being measured.

Innovative practices are often complex and thus require clear definitions to allow for adequate tracking. In our evaluation of NEW models, we faced a surprisingly thorny question: what exactly does it mean to be implementing the NEW model, and who should be counted as a participant? In trying to answer this question, we discovered that different stakeholders had different definitions of what “counts” as implementing NEW, and these definitions evolved as the initiative grew.

After several years in the field, MPS and NEW sharpened their shared definition. NEW implementation requires (1) at least two professional educators, (2) sharing a common roster of students during the same time period, and (3) spending at least half of their time working in these shared roles. These three metrics specifically define the work and also distinguish NEW from traditional “one teacher, one classroom” instruction.
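The three criteria above can be expressed as a simple, testable rule. The sketch below is illustrative only: the record type, field names, and the 0.5 time threshold encode our reading of the definition, not an actual MPS or NEW data system.

```python
from dataclasses import dataclass

@dataclass
class EducatorAssignment:
    """Hypothetical record tying one educator to one shared roster.

    All field names are illustrative, not taken from any real district system.
    """
    educator_id: str
    roster_id: str               # the common roster of students
    time_period: str             # e.g., "2024-25 Semester 1"
    shared_time_fraction: float  # share of this educator's time spent in the shared role

def is_new_implementation(assignments: list[EducatorAssignment]) -> bool:
    """Apply the three NEW criteria to one candidate team.

    (1) At least two professional educators,
    (2) sharing a common roster during the same time period,
    (3) each spending at least half their time in the shared role.
    """
    if not assignments:
        return False
    # Criterion 2: all records must point at one roster in one time period.
    if len({(a.roster_id, a.time_period) for a in assignments}) != 1:
        return False
    # Criterion 1: at least two distinct educators.
    if len({a.educator_id for a in assignments}) < 2:
        return False
    # Criterion 3: everyone spends at least half their time in the shared role.
    return all(a.shared_time_fraction >= 0.5 for a in assignments)
```

A solo teacher of record, or a team where one member spends only a small fraction of time in the shared role, fails the check, which is exactly what distinguishes NEW from traditional “one teacher, one classroom” instruction.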

Of course, implementation in the real world is never that simple. Schools may implement innovations only partially or tailor them to their unique contexts. While imperfect implementation fidelity is an expected challenge in education research, starting with a shared definition allows us to assess the extent of that challenge in clear, agreed-upon terms.


2. Frame data collection around this definition.

Existing data systems in education agencies are designed for the traditional schooling model, which makes innovation tracking difficult. By default, these systems tie students and courses to a single teacher of record. As such, we can flag teachers as either implementing NEW or not, but as our work progressed, we realized that teacher-level information likely wasn’t the right unit of measurement. Is NEW a teacher-level condition, a student-level experience, or a grade-level reform? It turns out that it can be all three, which complicates how we link the intervention to outcomes. Our aim is to understand student-level experiences of the NEW model, but with the data we have, it’s often challenging to identify which students are actually experiencing the model.

For successful evaluations, tracking data should be tied directly to the innovation definition. In the case of NEW, we have three metrics, outlined above. Better data would identify the specific educators working together, the shared roster of students they are working with, and when the shared rostering is occurring.
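One way to picture why roster-level data matters: if the system records which educators share each roster, student exposure to the model falls out of a simple join. The data below are invented, and the two-educator threshold is an assumption carried over from the definition above, not a query against any real district database.

```python
# Hypothetical linkage tables (illustrative data, not real district records).
# roster_id -> educators sharing that roster in a given time period
team_rosters = {
    "r1": {"t1", "t2", "t3"},  # three educators sharing one roster
    "r2": {"t4"},              # single teacher of record: traditional model
}
# roster_id -> students on that roster
roster_students = {
    "r1": {"s1", "s2", "s3"},
    "r2": {"s4", "s5"},
}

def students_experiencing_new(team_rosters, roster_students, min_educators=2):
    """Return the students on any roster shared by at least `min_educators` educators.

    With teacher-level flags alone, this question cannot be answered; with
    roster-level linkage, it is a straightforward lookup.
    """
    exposed = set()
    for roster_id, educators in team_rosters.items():
        if len(educators) >= min_educators:
            exposed |= roster_students.get(roster_id, set())
    return exposed
```

The point is not the code itself but the shape of the data: each of the three definitional metrics (which educators, which shared roster, which time period) maps to a field that the tracking system must capture.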

The broader lesson is that innovative interventions often defy the neat categories of traditional data systems. To learn about these interventions, we need to not only define “what counts,” but also collect additional data aligned to this definition to ensure we know whom to count.


3. Regularly revisit definitions and data collection strategies.

Schools are not controlled laboratories, and innovations are bound to develop over time. All stakeholders need a shared understanding of what counts, and these criteria should be regularly reviewed and updated. Changes should be clearly documented, both to ensure alignment across stakeholders and to increase research credibility and transparency.

As an example, MPS has refined its internal tracking as NEW has expanded. The district is now identifying course sections, in addition to teachers, that are implementing NEW. Crucially, this information will be integrated into their existing data systems. These changes will further clarify which students experience the NEW model, strengthening our research on student-level outcomes.


Conclusion

The education landscape in America is changing rapidly. Now more than ever, schools seek creative solutions to long-standing problems. Innovative models like NEW offer exciting possibilities, but realizing that promise at scale depends on learning what works, for whom, and under what conditions. That learning starts with quality data. By prioritizing clear definitions and aligned data collection from the outset, education leaders can ensure that when a new idea shows positive results, they can trust those results and build on them. Conversely, if an approach isn’t delivering, good data can help pinpoint why and guide adjustments.

“Getting the data right” may not grab headlines, but it lays the groundwork for every headline that follows. As schools and districts continue to experiment with bold ideas to transform teaching and learning, investing in the unglamorous work of data tracking and documentation will pay off in a robust evidence base—one that can inform policy and practice for years to come.


Mary Laski, Ph.D., is Research Principal at the Center on Reinventing Public Education. Mary’s research focuses on the educator workforce, with an emphasis on partnerships with state and local education agencies. She received her Ph.D. from Harvard University in 2024. Calen Clifton is a Research Analyst at the Center on Reinventing Public Education. A former teacher, Calen received his Ed.M. from Harvard University in 2019 and previously worked as a research analyst at the North Carolina Department of Public Instruction.
