How to Interrogate Your School's Data

For some time, it has been argued that educational institutions do not have enough data, and that they need to become more data-rich to inform decision-making. As the thinking goes, such an approach would result in higher-quality decisions. We offer three observations (of many) on that argument, based on our work with schools:

  • Most schools are awash in data. Some know it; some don't.
  • Most schools have not invested sufficiently in people's ability to read the data, or to question it. This includes both the schools that know they're awash in data and those that do not.
  • Governing boards and leadership team members need to learn how to interrogate data as part of high-quality decision-making.

We think it's misleading to insist that quality decision-making will follow once schools are rich in data: an abundance of data is no crystal ball. To make quality decisions, certain principles must be followed, starting with the first question: what is the decision to be made, and why must it be made? The second question deals with the nature of the decision (e.g., is it the only decision of its kind, ever? Highly unlikely). How one responds to those questions determines the subsequent actions, which are likely to include analysing the data that the school has to hand. Here we return to our observation about not knowing how to question the data. That not-knowing can easily lead to misinterpretation and/or faulty reasoning, and, in educational institutions, that impacts decision-making at both the governing board and senior leadership team levels.

The co-authors of Decisions Over Decimals (Christopher Frank, Paul Magnone, and Oded Netzer) recommend a series of questions that "fierce interrogators" can use, reflecting a blend of curiosity, critical thinking, and a sound understanding of the organisation's context. We cite them below from their article, "The Art of Data Interrogation", in Rotman Magazine (Spring 2023), with some editorial insertions in [brackets] to make them more useful to schools.

Fierce Interrogators: Questions for Boards and Leadership Teams to Ask

  • What is the source of the data? Data and analyses rarely arrive at your desk at random. There are often intentions (good ones, but also possibly bad ones) behind how and why the data was collected, and how and why the analyses are presented to you. Depending on the source and the intention, there could be agendas behind the data delivered to you. [...] Does the [person providing the data] have a reason not to show me the entire data? If so, what are they likely to hide? For example, is it the marketing team that provides evidence about the success of the advertising campaign? Understanding the source of data, and the intent or possible agenda, can inform [you] about possible issues in the data that you want to pay closer attention to.
  • Are the metrics provided the ones we expected to see? If not, why not? Are [the people providing the data] showing you the right KPIs [for your school's context]? Are you being presented with vanity metrics that make [those providing the data] look good?
  • How were the metrics calculated? Many metrics have no clear definition. For example, when a company reports having 10 million customers, you want to ask yourself how 'customers' are defined. Are customers everyone who ever visited the company's website (even if they never bought anything), people who last bought from the company five years ago, or only active customers who purchased in the past year? Depending on the agenda behind the data, [the person providing the data] may choose different metrics. Make sure that you understand the metrics, particularly those that are critical to your decision-making.
  • When and where was data collected? Are the time period, location, and context relevant to the decision at hand? Should we make decisions about [new admission marketing in 2023] based on [admission success of previous marketing from 2020-21]? We happen to have accurate and reliable data from [previous marketing successes from 2020-21], but no readily available good data [from 2021-22 or 2022-23]. Am I better off with accurate but less relevant and possibly outdated data [...], or less precise but more current data [...]?
  • Are the comparisons being made to relevant and comparable alternatives? Almost every [school] can look good if compared to the right competitor. If comparisons are made, are the metrics comparable across alternatives? Different [schools] may measure the same KPI [e.g., the number of admission inquiries] in different ways.
  • What is missing? Are there other data points that may be relevant? Do you have this data over time, so you can explore possible trends?
  • Is the data I am not seeing similar to the data I am seeing? What data did [the person collecting the data] not capture? Who [or what] was left out? Could [the organisation] have fallen prey to a nonresponse or survival bias?
  • Are there outliers? Was there any data that [the person collecting and providing the data] could not explain (outliers) and therefore did not show? Is there a pattern in the outliers that may prove valuable?
