Myth 1: Jam Tomorrow
“New technologies are being developed all the time, so the past history of the impact of ICT is irrelevant to what we have now or will be available tomorrow.”
After more than fifty years of digital technology adoption in education this argument is wearing thin. We need a clear rationale for why we think the introduction of (yet another) new technology will be more effective than the last one. The introduction of digital technologies has consistently been shown to improve learning where it has been researched; the trouble is that most things improve learning in schools when they are introduced, and technology is consistently just a little less effective than the average intervention.
Myth 2: Digital natives
“Today’s children are digital natives and the ‘net generation’ – they learn differently from older people.”
There are two problems with this. First, there is no evidence the human brain has evolved in the last 50 years, so our learning capacity remains as it was before digital technologies became so prevalent. Young people may have learned to focus their attention differently and to flit from task to task, but their cognitive capabilities are basically the same as they were 30 years ago. Second, just because young people have grown up with technology does not mean they learn efficiently with it. There is such a vast range of digital technologies available that it is unlikely today’s young people know and understand them all. Being an expert at playing Halo Wars 2 requires different skills and knowledge from having an active Facebook account. Most young people are fluent in the use of some technologies that they use regularly. None are expert at all of them.
Myth 3: Access to knowledge is all learners need
“Learning has changed now we can get to knowledge from the internet, today’s children don’t need to know stuff, they just need to know where to find it.”
This might be true if it really was knowledge that was available on the internet. The web has certainly changed access to information, but this only becomes knowledge when it is used for a purpose. When that purpose requires understanding and judgement, information alone is insufficient. Googling is great for answers to a pub quiz, but would you trust your surgeon if she was only using Wikipedia or YouTube? To be an expert in a field you also need experience of using this information and knowledge, so that you understand where to focus your attention and where new information will help you in making decisions and judgements. Access to information is important, but it is no substitute for experience and expertise.
Myth 4: The illusion of engagement
“Young people are motivated and engaged by technology, so they must learn better when they use it.”
It is certainly true that most young people enjoy using technology in schools. However, this is not the case for every learner, and certainly not all of the time. The assumption that increased motivation and engagement will automatically lead to better learning is also false. It is possible to be motivated and engaged in digital activities without any improvement in learning. Greater engagement or motivation may help increase the time learners spend on learning activities, the intensity with which they concentrate, or their commitment and determination to complete a task. It is only when this motivation and engagement is harnessed in tasks which improve skills, knowledge or understanding that any learning benefit will be seen.
A Goldilocks problem
Using digital technologies for learning is a “Goldilocks” problem: you have to get the choice “just right”. Rather than being seduced by the promise of a new digital device, look for where the technology can improve on something you already do, so that you adopt it as a solution rather than as yet another innovation.
A bit dated now, but this report for the Education Endowment Foundation presents the rationale for these arguments: http://bit.ly/EEFDigiTech.
Author: Steve Higgins
Professor of Education at Durham University