To achieve equity in education, we need the right data

Photo Credit: Jessica Scranton/FHI 360

A version of this post originally appeared on FHI 360’s R&E Search for Evidence blog.

As we work to realize the Sustainable Development Goals (SDGs) related to education, every funding, implementing and research organization working internationally has a responsibility to ask questions about its own contribution to building equity in education. While a great deal of data is produced in the course of education projects, only a fraction provides the detail needed to assess an intervention's impact across different equity dimensions. At the technical and implementation levels, organizations need to capture and use the evidence necessary to understand and respond to inequity in education provision and outcomes.

To do that, we need to be deliberate in building monitoring, evaluation and learning systems that generate the data and analysis needed to answer the question: are we improving education equity through our programming and policy? Disaggregated data are the first step to understanding who is being left behind in obtaining the quality education needed for a successful and productive adulthood. My recent paper, Mainstreaming Equity in Education, outlines the key issues and challenges around equity in education and provides a way forward for mainstreaming equity-oriented programming and data analysis. In this blog post, I show how disaggregated data can change our understanding of program impacts. I then provide evidence that, unfortunately, such disaggregated data are rarely collected.
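To make the idea of disaggregation concrete, here is a minimal sketch of what that first step can look like in practice. It assumes a learner-level data set with hypothetical column names such as reading_score, sex and wealth_quintile; the point is simply that the same indicator is reported by subgroup rather than as a single average.

```python
# Minimal sketch of disaggregation, assuming a learner-level data set with
# hypothetical columns: reading_score, sex, wealth_quintile.
import pandas as pd

df = pd.read_csv("assessment_results.csv")  # hypothetical file name

# One headline number hides who is left behind...
print(f"Overall mean score: {df['reading_score'].mean():.1f}")

# ...while disaggregating by equity dimensions shows where the gaps are.
by_group = (
    df.groupby(["sex", "wealth_quintile"])["reading_score"]
      .agg(["mean", "count"])
      .reset_index()
)
print(by_group)
```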

Let's take a look at a recent example where the analysis of program impact explicitly examines the equity dimensions available in the data. Figure 1 offers an illustration. As the figure shows, the overall positive program impact – which would have been sufficient to identify and report in most cases – masks substantial differences in how different types of learners responded to the program. While the gender gap appears to have been closed, and indeed slightly reversed, a disproportionate share of the gains accrued to the relatively better-off students, who had started out with a substantial advantage. In other words, the program, while creating a positive treatment effect across the board, exacerbated inequality in outcomes along the wealth equity dimension.

Figure 1: Graph showing a positive overall program impact, disaggregated by equity dimensions.

The analysis in Figure 1 was possible because data on equity dimensions were present in the data set. The analysis is still limited in that it does not look at outcomes for children who do not speak the language of instruction at home, or for migrant children. However, it does provide insights that are directly actionable within the confines of the program, and it contributes to a broader knowledge base on the effectiveness of reading interventions for children at the lower end of the socioeconomic spectrum. Ideally, this is the type of analysis that every program and education system would be able to do, and the type of data and evidence that should inform funding, programming and policy decisions.
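For readers who want to see how such an analysis might be set up, the sketch below estimates a treatment effect that is allowed to vary by sex and wealth, using hypothetical variable names (treated as a 0/1 program indicator, reading_score as the outcome). It illustrates the general approach, not the exact model behind Figure 1.

```python
# Sketch of a heterogeneous-treatment-effect analysis with hypothetical
# variable names; treated is assumed to be a 0/1 program indicator.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("endline_data.csv")  # hypothetical evaluation data set

# Average treatment effect only: the "headline" result.
ate = smf.ols("reading_score ~ treated", data=df).fit()
print("Average effect:", ate.params["treated"])

# Effect allowed to differ by sex and wealth quintile; the interaction
# terms show whether gains accrued evenly or mostly to better-off learners.
hte = smf.ols(
    "reading_score ~ treated * C(sex) + treated * C(wealth_quintile)",
    data=df,
).fit()
print(hte.summary())
```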

How much of the existing evidence base includes this type of analysis? Unfortunately, very little. Figure 2 is borrowed from a recent review of evidence in education by the International Initiative for Impact Evaluation (3ie), which covered a wide range of systematic reviews and evaluations at different levels. While the state of the evidence is far from comprehensive, the review found some modalities that were effective across most contexts, a few promising approaches that have been shown to be effective in some contexts, and some that had limited success.

Even when the modalities were well-researched and supported by a wealth of evidence, however, the equity dimensions were not addressed. The second research question of the 3ie review – "Do the effects differ across subpopulations (due to sex, age, or socioeconomic determinants)?" – remained largely unanswered, as researchers struggled to identify equity-relevant variables in the analysis and to bring the disparate sets of evidence together under a common framework.

Figure 2: Infographic summarizing the 3ie review of evidence in education.

Structured pedagogy, for example, may be highly effective for all students, and even more effective for struggling readers who are placed in unfamiliar language environments. Or, on the contrary, it may be effective on average while the most capable learners benefit most from the structure and the additional challenge it creates. Similarly, we learn little about whether school-based management support programs were especially ineffective in lower-resource contexts, where capacity is lacking, or equally ineffective regardless of school type and population characteristics. The lack of depth and consistency in the measurement of equity dimensions limits the utility of the evidence on education interventions.

It is certainly true that even if the underlying studies in the systematic review had included data on equity dimensions, reliable estimates for subgroups and sub-contexts would often have been difficult to generate because of sample size constraints in many studies. Unfortunately, the missed opportunity for 3ie to generate real insight into what works for whom reflects the overall state of monitoring and evaluation of program-level data. As we learned in another paper, Measuring Equity in Education Landscape Review, this level of depth is simply not a common, accepted practice in analysis and reporting. Leaving socioeconomic and demographic predictors out of the main analysis often conceals important insights about how interventions affect the learning and life trajectories of children and youth.
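To see the sample-size constraint concretely, a quick check like the one sketched below (again with hypothetical column names) reports each subgroup's size and the standard error of its mean outcome; cells with only a handful of learners yield estimates too noisy to support firm conclusions about what works for whom.

```python
# Sketch: checking whether subgroup cells are large enough for reliable
# estimates, using hypothetical column names.
import numpy as np
import pandas as pd

df = pd.read_csv("endline_data.csv")  # hypothetical evaluation data set

summary = (
    df.groupby(["treated", "sex", "wealth_quintile"])["reading_score"]
      .agg(n="count", mean="mean", sd="std")
      .reset_index()
)
summary["se"] = summary["sd"] / np.sqrt(summary["n"])  # larger for small cells
print(summary.sort_values("n"))
```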

In my next post, I describe five features that a monitoring, evaluation and learning system needs in order to be geared towards equity in education.
