Our Closed Schools’ Herd Immunity to Evidence
We’re told education policy and practice should be guided by ‘the evidence’. Lately, that has meant a quasi-religious deference to health advice, with education evidence a distant second when it comes to school operations (or the lack thereof). It’s telling that only this week, some months into the pandemic, has education evidence finally been given a seat at the adults’ table.
A succession of five reports commissioned by the National Cabinet illustrates what education researchers and educators already know: there is a huge educational cost to students from learning disruption. Home-based learning may work for some students, but certainly not for all (nor for parents or teachers, for that matter). There is also considerable economic cost: students’ reduced learning translates into lower future earnings, and parents’ work participation and productivity are sapped as they play teachers’ aide.
It wouldn’t hurt, for a change, if education policy were guided primarily by education research, which repeatedly warned that school closures would undermine students’ outcomes.
In recent years, the Productivity Commission has advocated a ‘national education evidence base’, while the Gonski 2.0 education review recommended a ‘national research and evidence institute’. Indeed, such an organisation is being established, though details on its structure and timetable have been scant. Were there such an institute today, would we have dodged the unnecessary closure of schools and the associated disruption to education? Perhaps an authoritative, independent, national body might have cautioned overzealous premiers against imposing haphazard school closures. At the very least, it might have advised how the disruption could be minimised. Or perhaps education evidence is simply consigned to languish further down the pecking order. Observers of education policy are only too aware that evidence is routinely relegated to the bin during decision-making.
It’s often noted the problem with education research is not a lack of evidence, but a dearth of ‘mobilisation’ into practice and policy. There are — rightly or wrongly — concerns about education research’s rigour, independence, and role in practice.
First, educational evidence rarely enjoys the ‘gold standard’ of rigour sometimes attributed to medical research. As in other social sciences, it is hard to conduct ‘randomised controlled trials’, and not everything is quantifiable, objective, and observable. As a result, even the most compelling evidence from classroom practice faces resistance when scaled up to broader education policy.
Second, education research is unfortunately dominated by vested interests. Individual teachers and school leaders can’t stay abreast of every research development, so they rely on synthesis and guidance from departments, teacher educators, accreditation bodies, consultants, instructional leaders, and unions. But this ultimately undermines the independence of educational evidence: research is not filtered objectively, and is habitually rejected on political grounds.
Third, in education, more than anywhere else, experiential knowledge often trumps empirical evidence. Research findings are routinely dismissed as too context-bound (‘it doesn’t apply to me’) or as insufficiently specific (‘no empirical study knows my students and classroom’). And therein lies the enduring division within the education community over the relative merits of teaching as either art or science.
The former argue that each student, class, school, and teacher is unique, and that researchers can’t capture this complexity empirically. Moreover, they argue that through their own experience they have intuitively learned what works ‘for me’. But this rejection of empirical evidence in favour of anecdote is classic confirmation bias. It has also ushered evidence-free fads into classrooms on the strength of little more than wishful thinking and hunches.
And the latter argue that top-down, one-size-fits-all practice should be prescribed by departments and schools. This results in a mechanistic commitment to replicating ‘what works’, irrespective of context and deaf to feedback. A common snag is the acceptance of small ‘effect sizes’, meaning practices with trivial educational impact are endorsed without adequately weighing the cost-effectiveness or reliability of interventions.
Of course, the truth is that teaching requires both art and science. Educators need to be skilled evidence users, capable of accessing, interpreting, and applying research, as well as weighing up sources of evidence and evaluating their reliability. The same should be expected of policymakers. Yet school closures in the crisis were driven not by careful evaluation of the available evidence, but by politics with a nod to ‘medical advice’ that, in reality, soon conceded there was no reason to keep children out of school.
It may be that education evidence never really had a shot against hysterical politics and a public appetite for an overly cautious reading of health advice. But it’s also a cautionary tale for the education research community, telling of its impotence in influencing policy decision-making at the highest levels.
The self-evident educational impact of school closures needed to be communicated sooner, and the case made more forcefully. A national education evidence institute might have provided the coordination and clout to do this.
But for education stakeholders to be taken seriously, we need to immunise against the plague of vested interests. It’s telling that two of the biggest players in Australian education circles, the NSW Education Department and the Teachers Federation, have been busy fighting it out over their respective, discredited readings of medical advice, with education evidence a distant afterthought.
Whatever the imperfections and complexities of education evidence, the belated consideration it received in decisions about school closures is indefensible. Schools now face a formidable task that was avoidable: flattening the curve on a new epidemic of lost learning for Australian students.