Ed Tech determinism and so-called conventional wisdom

I am always suspicious of things that everyone agrees about – mainly because I usually find myself disagreeing. I would not be so arrogant as to believe that everyone else is wrong and I’m right, but I do often think that it would be good to evaluate what is taken as read in light of our personal experience and pedagogical expertise.

Here are just three examples of what I’m talking about.

Khan Academy

I missed an interview recently with Salman Khan, founder of the Khan Academy. This is, as you know, the organisation that makes available loads of free videos for use in education. Students can watch the videos at home, thereby freeing up time for discussion in the classroom (the “flipped classroom”), or they can use them for self-study.

Apparently, Khan was asked what students could do if they didn’t understand something in the video. He replied that they could rewind it and watch that bit again.

Well, apart from the fact that the idea of rewinding a video is hardly revolutionary, and even if you think education is merely a means of transmitting knowledge, there’s a fundamental problem here. If you didn’t understand something the first time, why would listening to the same explanation again, made in exactly the same way, by the same person, with the same graphics, examples, analogies and so on, make any difference to your level of understanding? That is like saying that if your level of understanding the first time was zero percent, then by watching it over and over again, understanding nothing each time, all those zeros would somehow add up to 100%. Or is the argument that the video acts as a kind of battering ram: watch it often enough and it will make a dent in your brain through which the knowledge can flood in?

The long and short of it is that Khan’s statement encapsulates really bad pedagogy, and it actually reinforces the value of an expert teacher, ie someone who is an expert at teaching. An expert teacher will find other ways to explain the idea, and recommend alternative resources.

It’s actually a really bad use of technology too. We should be aiming for adaptive technology or, at the very least, a range of different types of resources for students to use, not a ‘one technology for all’ approach.

Educational Technology Action Group

The recently established ETAG has put forward a series of proposals for discussion. One of them is:

Rapid technological advances imply that the impact of learning research will be potentially very considerable - children will be able to learn "better", faster, to learn and retain more, to collaborate and problem solve more effectively, to be better placed to support each other, to better evidence learning through doing (as with video analysis in sport), better exchange of effective practice globally. Cognitive science, biological research, nutrition, and more are shedding / will shed considerable light on the complexities of learning performance and open entirely new possibilities.

Well, maybe, but based on my own experience I have yet to be convinced. I’ve been around quite a long time; I read a lot, keep up with scientific developments (especially as they relate to the way the brain works) and use technology all the time, and for the most part the way I learn hasn’t changed much at all. The ways I acquire information and share my thoughts have, but I’m afraid I have not managed to bypass several millennia of human evolution and become some sort of super-genius as a result of all these developments. In fact, the biggest impact on the way I learn came from a course in efficient reading that I attended thirty years ago. If we really want kids to learn better, I suggest making a course like that compulsory.

OK, I know I haven’t done the discussion prompts justice, but I find this kind of statement rather frustrating. Perhaps new developments have the potential to change the way we learn, but that doesn’t mean they will. We’ll probably assimilate new knowledge and new technology, and end up with a wider variety of ways to learn. I’m not sure they will enable us to learn “better”, even in quotation marks.

For me, a big problem with what I call “technological determinism” is that many future-tellers forget about human nature. I remember reading an article many years ago about why “1984” could never happen in England. The writer’s main argument was that, given the choice between arresting someone for having a twitch, and having to go through all the associated paperwork and hassle, or pretending not to see it and getting home for tea on time, most normal members of the “Thought Police” would probably opt for the latter.

I’m sure he was right!

Do contribute to the discussion, using the guidelines given at the link above. For a more detailed critical look at ETAG’s suggestions, see Crispin Weston’s article, What ETAG should say.

The three part lesson

It seems reasonable enough: start by telling the kids what the learning objectives are, run the lesson, then have a plenary at the end in which you discuss and summarise what has been learnt. In a way, that is pretty similar to the age-old definition of teaching:

First I tells ‘em what I’m gonna tell ‘em.

Then I tells ‘em.

Then I tells ‘em what I told ‘em.

Where it all goes wrong, and one of the reasons I disliked the Key Stage 3 ICT Strategy, is when that format is applied regardless of circumstances. Worse, some schools adopt this approach (or some other equally groundless formula) and then apply it rigidly to every teacher for every lesson.

As a case in point, someone told me recently about a school in which teachers are instructed never to talk for more than five minutes at the beginning of the lesson, regardless of subject, stage of learning or anything else. After that, the kids have to learn from each other or by themselves.

Want to explain a particularly tricky bit of programming logic? Better speak fast: you’ve only got five minutes. Want to find out what the kids know and understand, by asking diagnostic questions, before you decide where to pitch the lesson? Set them a task instead. Want to apply the standard assessment-for-learning technique of waiting for an answer, no matter how long it takes? Forget it.

If there were any research evidence to show that speaking to the class for no more than five minutes per lesson resulted in better learning, I might be able to live with this sort of blanket approach. As far as I know, there isn’t. In any case, if you can show or tell the whole class something in ten minutes, say, isn’t that a much more efficient use of time than hoping they’ll discover it for themselves or by talking to each other? You can use the time saved to check that everyone “gets” it, and to do some really interesting stuff to consolidate their new-found knowledge.

What can we do?

As I said, these are only three things that I worry about. Don’t even get me started on MOOCs.

But what can we do about it all? I suggest the best thing to do is think, and read books that make you think. I’ve already recommended Oliver Quinlan’s book “The Thinking Teacher” (see Review of The Thinking Teacher). I’m currently reading Thinking Allowed, by Mick Waters, and I think I’m going to end up recommending that as well; I will be reviewing it soon. After that, I hope to read Tom Bennett’s Teacher Proof, another book that, I believe, takes a hammer to some of the myths and groupthink that pass for wisdom these days.

I believe, somewhat naively no doubt, that if enough people openly questioned so-called “good practice” and various examples of technological determinism (see the Forbes article below for another example), it would at least encourage the proponents of things like the ban on talking for more than five minutes at the start of a lesson to make a better, ie more considered, case.

And I just saw the whole of London Zoo go flying past!

cross-posted at www.ictineducation.org

Terry Freedman is an independent educational ICT consultant with over 35 years of experience in education. He publishes the ICT in Education website and the newsletter “Digital Education”.