Measuring the impact of mastery: which research should we be looking at?

As a department of the University of Oxford, whose mission is to further educational excellence, OUP believes in the important role of evidence of impact in improving learning outcomes across the country.

We know how important evidence of impact is to support schools’ decisions around how they invest their time and money. Teachers can now draw on an increasing range of information sources to assess the potential impact of their investment.

The Secretary of State for Education, Damian Hinds, gave a speech at the recent World Economic Forum, pointing to the importance of research and proven efficacy in the context of improving maths teaching. He referred to how the 2014 National Curriculum for Mathematics drew on Singapore’s curriculum and textbooks because these were proven methods with evidence of impact.

Last year Ofsted appointed a Head of Research, Professor Daniel Muijs, saying it is “committed to ensuring that [its] new education inspection framework (EIF) is informed by research evidence.”

Earlier this term, the DfE published the results of a three-year longitudinal study carried out by Sheffield Hallam University into the impact so far of the government’s £76m investment in Teaching for Mastery, a programme run by the NCETM and delivered through the Maths Hubs.

The headline finding was little to no impact, with the lead author, Mark Boylan, suggesting that the “big gains” in practice and attainment that the DfE had anticipated from the NCETM Teaching for Mastery project were unlikely to emerge.

It would be a shame if these headlines led to a kneejerk rejection of mastery-based approaches in general as having failed, particularly when there is positive evidence of what real teaching to mastery can achieve for mathematics in our schools.

Governments such as Singapore’s allow a long-term horizon for measuring the overall impact of large-scale transformation projects. They weigh the interim evidence and adjust their investments to focus on what is working, jettisoning what is distracting and intervening where necessary.

This is why OUP chose to work with Singapore’s world-class primary maths textbook series My Pals are Here and introduce this authentic, tried and tested programme to UK primary schools. Published here as Inspire Maths, it has a growing network of UK advocate schools, along with a bank of UK case studies, video testimonials and statistics showing how the Inspire Maths approach to mastery has improved students’ understanding and achievement, as well as teachers’ experience of teaching maths to mastery. This means schools can be sure that Inspire Maths is backed by gold-standard evidence of impact globally and in UK schools.

To see this evidence in action, any teacher in the UK can book an appointment to visit an Inspire Maths Advocate School, providing the opportunity to:

  • Ask questions about resources and be present in lessons, learning directly from teachers who have implemented the programme themselves.
  • Find out about the practicalities of classroom implementation, including timetabling and staffing.
  • Observe lessons to see their impact on learning and behaviour.

Find out more today by booking your place at a free event with an Inspire Maths Advocate School or getting in touch with your local educational consultant.

2 thoughts on “Measuring the impact of mastery: which research should we be looking at?”

  1. I found this a timely blog, as you mention Singapore, from where I am currently awaiting a flight. Can anyone explain why this textbook is not on the DfE approved list, if it so obviously works both in the 85% of primary schools in Singapore (which THEY pointed us towards, after all) and in the UK?

  2. Nithya says:

    Appropriate evidence is vital, especially before making decisions or even running trials. This is a good move towards mastering maths. And it is very encouraging to see how the Government is researching various tried and tested methods from other countries.
