The following video and edited transcript are from a presentation I gave at the November 2019 Teach Meet. Teach Meets are informal gatherings of educators "to share good practice, practical innovations and personal insights", and the theme for this session was 'Evidence in Education'.
Over the last few years, I've become increasingly interested in the topic of evidence in education. This presentation involves an important lesson from turkeys and some examples of evidence blindness in Australian education policy.
So let's get started.
Lessons From A Turkey
I want you to imagine for a moment that you're a turkey. As a turkey, you're fed each day by friendly members of the human race, and every day that you are fed firms up your belief that human beings have your best interests at heart.
Now this is what your life looks like: the variable on the y-axis is your well-being. You can see that, day after day, your well-being rises slightly as you continue to be fed and looked after by those caring human beings.
That is, until day 1,001, when something unexpected happens: Thanksgiving.
Sadly, your life as a turkey ends and, at that point, you experience a sudden revision of your beliefs.
What's the moral of this story? This particular example comes from essayist, philosopher and mathematician Nassim Nicholas Taleb, who showed in his book The Black Swan that hidden evidence can change the story entirely. The turkey saw its life from one world view; to the humans, the turkey's role in life looked quite different.
We Can Be Blind To Evidence
Just like turkeys, we can be blind to evidence. This blindness comes from ignoring what's hard to measure and from rejecting what we don't believe.
Ignoring what's hard to measure is bound up in something known as the McNamara fallacy. This is the tendency to make decisions based solely on easily quantifiable constructs – what's easy to measure – while ignoring all other evidence.
Here's an example. Imagine that you want to try out a new practice in the classroom. You implement it in one lesson and, at the end of the lesson, you give your students a test. The test shows that your students haven't learnt anything. A few weeks later, however, you overhear some students talking and you realise that they had, in fact, retained quite a lot from that lesson. If you'd made your decision about that classroom practice based solely on the test results, you would have missed some quite important information. That's what often happens when we look at the short-term gains from an intervention and ignore the harder-to-measure long-term benefits.
Now let's consider our beliefs. Confirmation bias is the tendency to embrace information that supports your beliefs and reject information that contradicts them.
This time, consider a class with one student who you would consider to be strong, perhaps even academically gifted. The student sits an assessment on which you would expect them to do well, but they do quite poorly. You conclude that it was bad luck and that the student was having a bad day. In that same class, you have a student who you'd consider to be weak, who struggles and is usually quite slow to pick things up. On that same assessment this student also does poorly, but this time you conclude that this was what you expected anyway.
Now you might be thinking, "Well, I wouldn't be susceptible to confirmation bias in that situation". That may be so; however, all of us, even when we are aware of this bias, fall prey to it at one point or another.
Evidence Blindness in Australian Education Policy
Let's look at both of these constructs, confirmation bias and the McNamara Fallacy, using actual examples that have played out in Australian education policy in recent months.
Governments across Australia have been talking about banning mobile phones in classrooms. The media release in Example 1 comes from the Victorian Department of Education, and it puts forward some evidence about why mobile phones should be banned. Firstly, it states that a large proportion of young Australians are experiencing cyber-bullying. Secondly, it states that there are concerns about mobile phones causing distractions in the classroom.
What's hard to measure here is whether mobile phones actually cause cyber-bullying – and, indeed, what causes cyber-bullying, or any sort of bullying at all. What's also ignored is whether mobile phones can produce educational benefits and positive learning outcomes – for example, when a phone is used as a calculator, for publishing or as some other learning support. When you take into account what's been ignored or is hard to measure, the situation presented here can look quite different. Importantly, the argument being put forward perhaps isn't the right one.
Here's another example. The New South Wales Department of Education recently indicated that it plans to make maths compulsory for all students up to the end of Year 12. The evidence the department put forward is that maths is important throughout our lives and that parents want their children to leave school mathematically literate.
Let's look at this more closely. The evidence being ignored here is the situation of those Year 10 students who are choosing not to continue with maths beyond the end of that year. Something is happening in the lead-up to Year 10 that makes these students decide, "Maths isn't for me. This important subject isn't for me for these final two years of school". What's hard to measure is whether an additional two years of maths will make the difference for these students, or whether a more expedient initiative could achieve the same goals. Just like the first example, here we have particular evidence being presented in a particular way to support the narrative that this body wants us to accept.
Look For The Big Picture
I'd like to leave you with some take-home lessons. Firstly, each piece of evidence is only part of the overall story – think of the turkey – and this holds even for robust, well-designed research. It's not just government; even researchers are susceptible to confirmation bias, and they may favour easily quantifiable constructs and measurement tools that are easy to use. Lastly, something for you to consider: when you're presented with an idea or argument, instead of accepting it at face value and unquestioningly, ask yourself, "What's the bigger picture being presented here?"