Plus ça change, plus c’est la même chose – uninformed use of 9-1 flight paths for assessment
Secondary schools using 9-1 GCSE summative grades to demonstrate progress against curriculum standards through key stages 3 and 4 are deluding themselves.
This is just a copycat version of the previous system of National Curriculum attainment targets and level descriptors. Flight paths were a phenomenon that developed when levels and FFT estimates were central to the system. That system was flawed, which is why legislation removed levels from it, and why companies were then tasked with developing a more equitable measure. Progress 8 is not based on levels, so returning to flight paths built on 9-1 labelling is like trying to plug an analogue TV into a digital feed and expecting it to work!
“There is overwhelming evidence that levels needed to go and the Commission strongly endorses the decision to remove them. However, the system has been so conditioned by levels that there is considerable challenge in moving away from them. We have been concerned by evidence that some schools are trying to recreate levels based on the new national curriculum. Unless this is addressed, we run the risk of failing to put in place the conditions for a higher-attaining, higher-equity system.”
(Commission on Assessment without Levels, 2015)
9-1 grades cannot be used to track progress from year 7, because the reformed GCSEs are designed as two-year courses starting in year 10. In physical education, for example, core PE is 100% practical, whereas the GCSE PE qualification, first taught in September 2016, is assessed 60% by examination and 40% by non-exam assessment (NEA). The two simply don't equate: analogue and digital again! Indicating that a year 7 pupil is currently performing at a 1 or a 2, when they have not even started the GCSE specification, is therefore a waste of teachers' administrative time and adds meaningless workload at precisely the period when workload is already an issue.
In fact, Ofqual has developed grade descriptors for the reformed GCSEs to assist teachers when using the specification to plan learning, by giving an indication of average performance at the mid-points of grades 2, 5 and 8. Even Ofqual categorically states that "The descriptors are not designed to be used for awarding purposes, unlike the 'grade descriptions' that apply to legacy GCSEs graded A* to G." It appears that when guidance is offered to a highly qualified and experienced profession, it is ignored!
What is very easy to do, however, is to map key language from the National Curriculum Programme of Study for Physical Education, as a reference point, against the average performance descriptors for GCSE. We can then make an informed professional judgement as to whether pupils are on track to meet the physical education curriculum thresholds and, where our assessment has identified gaps, do our utmost to support learners to close them. This ensures that the essential knowledge and key skills are mastered: the foundations for better grades when the GCSE PE specification is eventually followed. It is quite different from professional practice obsessed with converting every bit of a learner's progress into a number or grade for internal records and reporting to parents, and quite different from assigning a 9-1 level in key stage 3 when the GCSE course has not even begun.
One of the criticisms of levelling was that it labelled differential performance and did not encourage a growth mindset. There is a huge rhetoric-reality gap between stating 'we want to encourage a growth mindset' and then using an approach that discourages it! Learning isn't linear, and therefore any measure or practice that treats progress in learning as linear and hierarchical is not fit for purpose and, in many cases, is demoralising for pupils! This point is exacerbated when schools spend thousands of pounds on commercial tracking systems (note: a tracking system is not an assessment system), which then drive their assessment practice. A case of the tail wagging the assessment and learning dog! Assessment approaches are too often driven by the commercial tracking resource to produce 'proving' data rather than 'improving' assessment information. Best-practice methods within effective assessment systems are often disconnected from tracking systems that demand precious time inputting data, time that could be better spent on planning, teaching and learning.
The Government Commission on Assessment recommended expressing outcomes in curriculum terms, and the CIF (Ofsted 2015) even uses the term 'assessment information' in place of 'data'. Sean Harford (National Director, Education, responsible for leading inspection policy and guidance) and John McIntosh CBE (Chairman of the Government Commission on Assessment and a former headmaster of the London Oratory School) have appeared in two videos sharing a message with schools: Ofsted inspectors do not want to see data spreadsheets built from numbers; rather, they want to see how schools use assessment information.
In terms of mastery learning, the profession has somewhat misunderstood the term. Mastery is the expected, inclusive standard. Many schools use the terms "emergent", "expected" and "exceeding", but mastery is the expected standard for all, not something achievable only by a select few. Everything we know about childhood growth and development, and about the performance of other high-performing jurisdictions, indicates that we can expect mastery for all against the new standards, except where children have special educational needs or disabilities. Yes, work back from the new curriculum thresholds, but there is no defined linear route to them; we should therefore not attempt to capture progress through a reinvention of meaningless labelling.
Fischer Family Trust (FFT) started in 2001 with 55 local authorities; by 2004 all LAs were using FFT data. Type D estimates (95% accurate in English and maths to within one estimated grade) did not emerge until there was sufficient 'data' in the system, some seven years after descriptive statistics were introduced. In 'foundation' subjects like PE, accuracy was as low as 70% within one estimated grade. SATs are tests in the core subjects, so is it any wonder that estimates in other subjects were 30% inaccurate? Any statistician will tell you that using data which is 30% inaccurate is a total waste of time. Yet here we go again, plus c'est la même chose: despite the legislative changes, a re-creation of the previously data-obsessed, administration-heavy, standards-plateauing system that the removal of levels was intended to consign to history!
Sign up for our Standards-Based Assessment for Mastery Learning in Physical Education course, running on Wednesday 28th June 2017.