It’s a good job everybody had already made their bonfire night plans cos after yesterday I suspect that many schools would have come back after half-term to find smouldering piles of Interactive Whiteboards littering their playgrounds. Certainly wasn’t a lot of love around for the old IWB yesterday on Twitter.
The starting point for the discussion was this article, which looked at the £253k being spent by the EEF on a tablet project. On the face of it (see details on the EEF site here) the project appears to be a proper research effort rather than a way of buying more iPads. Personally, I think this is a step in the right direction. There is at least some sense in coming up with a proposition and testing it. Time will tell if the research design is up to the task.
Consider how most organisations decide whether to change how they do things in response to technological development. Firstly, they often do this out of competitive fear.
Director 1 – “New technology X now exists”
Director 2 – “Competitor Y might use it and gain an advantage.”
Director 1 – “What should we do?”
Director 2 – “Let’s investigate the technology, see if it can help us, and then do a cost-benefit analysis.”
They first look at an area of their organisation where there is a problem. They then consider what possible solutions to that problem any new technology could provide. They then spend money testing to see if the technology will help their organisation. Finally, having tested it, they look to see if the benefits outweigh the costs of implementation. As an example of the scale of such spending, consider that in 2010 Intel invested $8 billion in retooling several manufacturing plants. Imagine the work that went on beforehand to assess the benefits of spending that amount.
Consider how the above conversation has traditionally gone at the DfES/DCSF/DfE:
Civil Servant 1 – “New technology X now exists”
Minister 1 – “How shiny is it?”
Civil Servant 1 – “Very.”
Minister 1 – “Buy shedloads.”
The story of how the schools of England came to be infested with IWBs is a salutary one. There are many versions of how it came to pass that Charles Clarke announced at BETT in 2004 that there would be a £50m starting fund to introduce IWBs into schools, but none of them include any research into their benefits. Possibly £500m later, we have many schools with IWBs in every classroom. Now, before we get our knickers in too much of a twist about the big numbers, we should remember that this expenditure has occurred over a decade. That works out at about £6 per child per year. Yes, we can all think of other ways to spend that sum, but we should also keep some perspective whilst we look at why this was such a bad move.
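For anyone checking the £6 figure, here is the back-of-the-envelope arithmetic. Note the pupil count of roughly eight million in England's schools is my assumption, not a figure from this post, and the £500m total is itself a guess:

```python
# Rough per-pupil cost check for the national IWB spend.
# Assumptions: ~£500m total (speculative), spread over ~10 years,
# ~8 million pupils in England's schools (approximate).
total_spend = 500_000_000   # £, estimated total spent on IWBs
years = 10                  # roughly a decade of expenditure
pupils = 8_000_000          # approximate pupil population

cost_per_pupil_per_year = total_spend / years / pupils
print(f"£{cost_per_pupil_per_year:.2f} per pupil per year")  # → £6.25
```

So "about £6 per child per year" holds up, within the (large) error bars on the inputs.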
I am not going to criticise, or gainsay with evidence, any teacher who wants to use an IWB. The evidence on IWBs is mixed (to say the least), but that is often because the research attempts to show that, across the whole system, IWBs are a good or bad thing. I have seen teachers who use them to enhance learning in wonderful ways. But that can never be an argument for what became an essentially whole-country implementation plan. They were introduced without any clear idea of what problem they were meant to solve, or how they would be used to enhance children's learning. There was scandalously little training for teachers in what remains a complex piece of kit. And they were bolted onto technology environments that were already creaking at the seams due to rapid expansion with little investment. Often the board wouldn’t work because the PC it was attached to was five years old. They were put into rooms with south-facing windows without budgeting for blinds. Projectors were hung from ceilings in such a way as to take out desperately needed seating space in the classrooms. I could go on and on and on.
As an aside, I would say that many of the issues with IWBs that arise in secondary settings are not always replicated in primary. There are a number of reasons for this. The principal one is that in primary the interactivity tends to be between the system and the children, which is where it can often have its best effect. Training 6-12 primary teachers is also an easier task than training the 60-100 in a secondary school. And whole-school approaches are easier to achieve in a smaller setting. There are also curriculum areas in secondary where the IWB has been used to good effect – many Maths departments were early adopters.
Over the past 10 years I have been into many schools (predominantly secondary), often specifically to see their use of technology, and have rarely seen IWBs being widely used, except as a projection surface. In most cases the teachers would have been better served by a projector (which is, IMHO, an essential teaching tool) and a BIG whiteboard. Preferably two BIG whiteboards. I would say that 90% of the affordances of the projector/IWB combo used in schools could be gained by simply having the projector.
Why have IWBs not been a great success? To reiterate, I would suggest it comes down to four reasons:
- At the time they were introduced schools had a generally poor technology infrastructure. Adding more hi-tech equipment just caused more problems, and teachers won’t use equipment they can’t rely on.
- Lack of training.
- Lack of interactive content. There was an assumption that teachers would generate their own content and share it. Well, firstly, interactive content is not easy to produce, so the pool of producers was even smaller than the pool of IWB users. Secondly (treads warily), whilst teachers are natural sharers from a giving perspective, they are not great at using unamended materials from others. Teachers like to use stuff they have modified for their own use. So the pool of potential users got smaller still. That left commercial products, which were expensive.
- No-one really knew what problem they were meant to solve, which meant no-one really knew how they were to be used.
Now, I can understand why such a spectacular failure (and spending that much money without commensurate benefit can only be described that way) would provide a platform for those who want to argue against any and all technology changes. But it shouldn’t. All change costs money, and many suggestions for change will turn out to be dead-ends or, at best, cul-de-sacs. The important thing is that we consider carefully where to spend the money. There are some things we do in schools where there is general agreement that they work, and where possible we should do more of them. So spending £500m on some of those things would be good. And should be prioritised.
However, we should also not pretend that we can always know in advance of trying something that it will not work. So, Hattie as guidance, not dogma. That is where research comes in.
Unfortunately, schools have been very badly served by research up to now, particularly when it comes to looking at technology. I think the problem is this: people expect the big answer. They expect a technology to completely change things. “Which one technology can I buy that would change everything?” This is because there is a recognition of the big changes that technology has made to people’s everyday lives over many years. In many ways, schools have failed to keep up with these changes. Much of the fault for this has been out of the control of individual schools. And schools have not been well served by those who should have been helping them with this issue.
Research into big changes is difficult because of the number of variables that come into play. It becomes almost impossible to control for them in any experimental design, which makes the research meaningless. Research into the changes technology can facilitate needs to concentrate on smaller things – on the more incremental improvements. Modern technology, which is more concerned with the personal, facilitates this.
So whilst it is possible to criticise the BBC report into the EEF research (and many have), this is misguided. Research costs money, and much of it will be spent on things that come to nowt. I would argue that we need much more spent in this way, by more than just the EEF. The EEF is funded by a £125m grant. I would like to see that expanded by a factor of 10 and spread out among other providers (instinctively I don’t trust the single provider model). Bigger projects (such as the one we started this post with) should have smaller precursor research.
My gut feeling (yes, I know, but it does have its own validity), from years of work in industrial and educational settings, is that appropriate use of technology can improve learning and teaching. I know this because I have seen it happen. I have also seen (and lived through) many of the issues that led to failure. Most of those improvements don’t scale. Most of them don’t cross phase well. Most of them don’t cross from one subject to another. Many of them don’t survive changes in technology. Far too many of them have been the carefully-nurtured project of one person in a school, and so fail the sustainability test (they often even fail to follow that person, as their next school can’t implement them). Far, far too many of them pass the “shiny” test but fail the cost-benefit test. These are the “technology looking for a problem” ones rather than “a problem looking for technology”. There is very little software/hardware used in schools that was designed specifically for the purpose of supporting learning and teaching.
If we spend more on (better) research we will have fewer pieces of unused equipment bolted to the walls, fewer cupboards filled with data loggers and, coming to a school near you soon, fewer 3D printers gathering dust on the shelves. We just have to steel ourselves for the inevitable failures and recognise that they are a good thing. Far better to spend £10m on a research project that shows a change has no benefit than £500m to find it out.