On Research, Evidence, and Practice

What if research were being undertaken and evidence of 'what works' were noted and shared in public forums, yet those results were neither espoused nor applied in schools? It would seem difficult to believe that such a thing could happen, yet that is precisely the premise of an article in the August 2014 issue of Scientific American.

To cite just one example of what the article highlights, let us consider the Reception (kindergarten) year. Researcher Anna Fisher observed that the walls of early years classrooms were covered in colourful art, maps with multiple colours, and vibrant posters. She then wondered, "What effect [...] does all that visual stimulation have on children, who are far more susceptible to distraction than her students [university students at Carnegie Mellon University]? Do the decorations affect youngsters' ability to learn?" Somewhat fortuitously, Carnegie Mellon University has a primary school on its premises -- a campus laboratory school, and a natural place to let her question(s) guide research and examine the evidence. What she and her graduate assistant found, comparing two groups of pupils, was that those in a bare-walled classroom "were more likely to pay attention and scored higher on comprehension tests" than their counterparts in the decorated classroom.

Admittedly, this is but one snapshot from one America-centric study, undertaken at a university that has its own laboratory school, and the article does not provide the actual assessments used to show how the pupils paid attention or how much better they scored on comprehension tests. I would be curious to know, for instance, whether international schools would find similar results, were we to pose the same question(s) and take the same research tack with comparison groups. From a data perspective, it would be all the more insightful to examine results from member schools that utilise different curricula, and to track them for three to five years.

What is encouraging, though, about the kinds of studies cited in this article is that they go beyond standardised assessments, choosing instead to focus on learning in progress; that is a welcome step. The focus is really on identifying larger questions, such as what patterns exist in classrooms and learning spaces. For example, research is being undertaken on 'discovery learning,' which promotes pupils discovering facts for themselves rather than receiving them passively in a traditional classroom format. One of the important questions at the moment concerns the level of instruction (in the traditional sense) that students truly need. As Paulo Blikstein, assistant professor at the Stanford Graduate School of Education, states, "A lot of what happens in engineering and science is the failure [...]. You try something, it doesn't work, then you reevaluate your ideas; you go back and try it again with a new set of ideas." In sum, "there are levels of frustration and failure that are very productive, are very good ways to learn."

Other ideas being challenged at the moment include class size and 'tell and practice' as a methodology (pupils are told information, then try to put that information into practice); the article also asks why higher education is so slow to adopt changes in teaching methodologies.

The heart of the article, though, is about two things: (1) building an evidence base for what really works and (2) establishing a clear pipeline for getting that research into classrooms. The existing system cited in the article is frustrating to many, but it prompts me to pose the following two-part question: how might we build an evidence base within a global school community, and how might we facilitate the promulgation of this evidence base within that community? Our research would be more complex (and far richer, I would submit) than that cited in the article.
