By Matthew Hunter
Like all Heads of Department and GCSE teachers, I was quick to run my analysis on results day. How had the students performed against school targets, their own aspirations and our expectations? Like many, I’m sure, we were pleased with how some of our top-end students performed. However, like most, our eyes were quickly drawn to the bulk of students in the middle, those with target grades four to six: had we been able to help them maximise their potential? As this is the largest attainment group in most schools, the answer to this question often determines whether a department is pleased with its results.
With these students particularly in mind, scrutinising exam performance with the tools available is an important process. All departments will obviously want future cohorts to be better prepared and even more exam-savvy than this year’s cohort, and the examiner reports are clearly a great place to start.
With boards providing question-by-question performance feedback at centre level, I would advise you to use this information to see if your students struggled on particular questions and if you can identify any themes. In addition, buying back some scripts can provide a fantastic professional development opportunity for staff because it allows you to scrutinise answers yourself and build a list of ‘work-ons’ that can be really specific to your department.
Having reviewed the 2018/2019 GCSE PE examiner reports from OCR, AQA and Edexcel, I would like to share my thoughts on the most pressing issues raised by the written exam papers. As I will refer to them regularly throughout this piece, it is worth reminding ourselves of the three (examination) assessment objectives:
- AO1: Demonstrate knowledge and understanding of the factors that underpin performance and involvement in physical activity and sport
- AO2: Apply knowledge and understanding of the factors that underpin performance and involvement in physical activity and sport
- AO3: Analyse and evaluate the factors that underpin performance and involvement in physical activity and sport
One positive that stands out from the reports, perhaps unsurprisingly, is that student performance improved this year compared to last. With the papers said to be of a comparable standard, the quality of student responses improved. As a result, perhaps frustratingly, the grade boundaries crept up.
Reading between the lines, the reports suggest that much of this improvement was down to how well students understood and recalled AO1 content. Definitions and descriptions of concepts were acknowledged as generally good, as was students’ ability to cope with simple data questions. Most pleasingly perhaps, the feeling from the boards was that students were better prepared to structure extended responses appropriately, showing that good work has been done in this challenging area.
When it comes to the main areas that need work, I’ve grouped information together under the following headings:
- Reading the question properly
- Understanding command words
- Having sufficient depth of understanding
- AO2 marks (applying)
- AO3 marks (analysing and evaluating)
It would be fair to say that issues three, four and five relate to the aspects of the exams where boards attempt to separate students. These are the more challenging aspects of a paper and, as a result, it is unsurprising that mid- and lower-level students leave most marks on the table here. That said, issues one and two are frustrating for staff. I’m sure we all spend a lot of time discussing what we perceive to be the real basics, but the boards continue to report issues in these areas for our lower-level students.
1. Reading the question properly
This is the most frustrating feedback I read in the reports. It is likely to affect lower-level students the most, and I assume it is linked to a perceived lack of time, anxiety under pressure and a consequent unwillingness to create a mini-plan for each question. For lower-level learners, rushing the questions they perceive to be easier is a false economy. The time they save for more challenging questions later often doesn’t bear fruit, so the marks they sacrifice by rushing things they feel they know are never earned back. This issue is compounded by students not understanding command words: they often answer the question they think they’ve read rather than the one actually being asked, responding with irrelevant information or in an incorrect context.
My suggestion is to impress even more firmly on lower-level learners the need for a mini-plan for almost all questions. They should underline the command word, circle the topic trigger word and quickly brainstorm the key information around the question before the answer lines are touched. Insisting that students go through this process will slow their reading and ensure they go over a question at least twice before answering it. If this means they run out of time during the extended response, I feel they are still better off: lower-level students often achieve very few marks on the longer-answer questions, and the simple marks they currently sacrifice by rushing their question analysis throughout the paper are likely to outweigh anything lost there.
2. Understanding command words
Students describing rather than explaining, and not evaluating fully, were repeated themes in the reports. Again, this is an irritating thing to read, because I know teachers talk about this a lot in the classroom. The issue for me is that staff and higher-level students instinctively know what a command word wants, and it is easy to assume the same is true for the whole class. Unfortunately, some students have such limited vocabulary that they simply don’t recognise the differences between the command words, and they require more than telling; they need opportunities to practise regularly, under time pressure.
My suggestion would be to group together questions on the same topic and have students revise that topic before tackling questions with different command words linked to it. It appears that lower-level students are scanning questions, seeing the topic trigger and simply writing everything they know about that topic, which often means they waste time on irrelevant information and fail to notice whether the question needs AO2 and AO3 points too, something I will return to later. We need to be actively linking information to command words and teaching students how to select appropriate information when under pressure. This could be something that higher-level students do with lower-level students as a peer task. Provided it’s done regularly at the end of each topic, as part of the preparation for a test, students of all levels should improve their ability to respond appropriately to command words.
3. Having sufficient depth of understanding
This issue links to the first two issues discussed, and to the two to come. In our rush to get through the theory content, deliver and assess NEA and deal with all the admin linked to running a GCSE PE course, it would appear that some of us are delivering simplified AO1 content and assuming that our sporty students can already fill in the application (AO2) and analysis (AO3) points. The reports highlight that students are often making the same point again and again, changing the phrasing and using synonyms but hitting the same mark scheme point each time. This suggests that they simply don’t have enough knowledge available to earn all the marks on offer.
With most of us converting the courses we ran for the old specifications to the new specifications and keenly clinging on to practical time (because it’s the main reason our students select the subject), it may be that we need to consider giving more time to theory. There is an obvious halfway house, where theory work is tackled through practical sessions, which would help with AO2 in particular, but it is clear to me that lower-level students need more information and should be given more opportunities to access it. Good textbooks and resources do provide this, but in an effort to simplify things for lower-level students, are we doing them a disservice and leaving them short of what they need to get to their target grade? Alternatively, are we simplifying things appropriately for lower-level students, but then allowing this to become the default setting for the mid-level students too?
4. AO2 marks (applying)
As mentioned above, there is a lot of feedback in the reports on poor application skills, with students failing to provide practical examples even when they are asked for, or using very basic examples which don’t exemplify the points clearly enough to receive marks.
For example, a lower-level student might write, ‘One function of the skeletal system is protection. We see this in rugby.’ This is correct, but in most instances the question will require students to develop the second point fully, linking the AO1 to the AO2: ‘One function of the skeletal system is protection. We see this in rugby, where the cranium protects the brain from impacts in tackles and helps reduce the risk of injury.’ In most cases, exam questions targeting AO2 require explanation rather than simple identification. Making it clear to students that this is required more often than not, and asking them to practise, should help them improve their AO2 work.
5. AO3 marks (analysing and evaluating)
Many AO3 marks are unclaimed because students lack depth of understanding. There are two particular AO3 elements that are mentioned in examiner reports fairly often:
- Inability to infer things from data: While students are able to cope with basic data questions, perhaps as a result of good maths teaching, they struggle to make inferences from the data provided. In essence, they can pick out what the data says, but perhaps don’t realise (maybe due to their limited understanding of command words) that they often need to discuss or give reasons for what the data shows by making a link between the data and the theory they’ve been taught. This would appear to be something that could be solved with some guidance and practice.
- Providing advantages or disadvantages rather than both when asked to evaluate, and failing to draw a conclusion: Linked again to understanding command words, this is something students will need to be taught explicitly. Alternatively, it may be that they simply don’t know both the advantages and disadvantages in certain areas and need to be asked to consider both sides more often when the relevant theory is delivered. By identifying the points in the course where evaluate questions are most likely, staff can give students a head start here.
While helping students to read questions properly and understand command words is likely to benefit lower-level students most, developing their ability to answer AO2 and AO3 questions more effectively will assist those aiming for the higher grades. Our mid-level students, often the largest group in a cohort, could benefit from improved technique at both ends of this spectrum. If a mid-level student gains three or four more marks through improved AO2 and AO3 work, and adds two or three marks by reading and understanding questions better, that’s a fair number of marks across both papers, which could well add up to a grade hop. If numerous mid-level students improve their overall performance by one grade, consider how different your departmental results might look!
It’s all worth considering. Good luck.
Matthew Hunter is Head of PE at Roundwood Park School, Harpenden, and co-author of OUP’s OCR, Edexcel and AQA GCSE PE resources.
To find out more about Matthew and OUP’s PE resources, click here.