Commentary on and analysis of the 2015 National Senior Certificate results

03 May 2016 | Opinion

In her announcement of the National Senior Certificate (NSC) results for 2015, Minister of Basic Education Angie Motshekga told the country three things:

1. The class of 2015 was the largest ever with over 644 000 candidates writing their school-leaving examinations.

2. Much of this increase was due to the policy of “progressing” through to matric weaker learners who would previously have exited the system in grade 11 due to repeated failure.

3. The results were weaker overall than in previous years as a result of the increased numbers, the presence of progressed learners and the addition of more challenging content to the curriculum and the exam papers.

On closer scrutiny, however, the 2015 NSC results appear more puzzling and more concerning. Question marks over how the larger cohort was composed, and over the extent to which the certification agency Umalusi felt obliged to adjust the raw scores, render comparison with previous years problematic. These are largely technical issues, and while we must trust Umalusi’s professional experience, the extent of the adjustments deemed necessary was, in Umalusi’s own words, “unprecedented”. This raises further concerns about what the results are telling us about the health of the schooling system and about the prospects for addressing obvious weaknesses in learning and teaching.

Where did all the extra learners come from?

Why all the confusion? Let’s start with the question of the numbers writing the NSC in 2015. A total of 644 536 learners actually sat the NSC exams, a whopping 21 percent increase on the number writing in 2014. The overall pass rate of 70,7 percent and the proportion achieving Bachelor passes (25,8 percent) were both down from 2014. The enlarged cohort meant that despite the lower pass rate many more learners obtained their matric, something the Department of Basic Education (DBE) was keen to emphasise. Of course, the corollary is also true: many more learners failed than in previous years.

The Department and Umalusi, the accreditation agency responsible for certifying that the examinations were “free and fair”, have struggled to explain the huge increase in the number of grade 12 learners and the impact this may have had on the results. Nothing in this cohort’s progress through high school suggested it would be so large by grade 12. Half the increase has been identified by the DBE as “progressed learners”: those who would normally have exited the system after failing more than once in grade 10 or 11. In a move to reduce the heavy drop-out of learners at the end of their schooling careers, the DBE advised provinces to move these learners into matric and adopted a number of programmes to address their acknowledged learning weaknesses. These programmes do not appear to have had the desired impact, however, as only 37 percent of this group passed the NSC.

Provinces implemented this policy in different ways. The stronger provinces, like the Western Cape and Gauteng, took very few “progressed learners”. The weaker provinces took on many more, in particular Limpopo, where the 2015 grade 12 class was 40 percent larger than in previous years. Yet only one in three of the additional learners in Limpopo, and half of the national increase, were identified by the DBE as “progressed learners”. So who were the others? Umalusi thinks they may have been other grade 11 learners of borderline academic standing who were also promoted to grade 12 in 2015. The DBE has acknowledged the strain these numbers have placed on schools in many areas. And whereas the DBE argues that the progressed learners it counted had no significant impact on the 2015 NSC results, Limpopo is a case in point where the increased numbers, whatever route they followed to grade 12, helped push the province’s pass rate down by seven percentage points; the pass rate in the Eastern Cape fell by 8,6 percentage points and that of KwaZulu-Natal by nine. This pattern – the weaker provinces bearing the brunt of the fall in NSC results – is repeated in other ways. It should be reason enough for the Department to review the practice, which is damaging the prospects of so many learners.

The role of Umalusi

It is Umalusi’s job to ensure equity for candidates writing the NSC, not just within a single year, but between years. It is vital for the standing of the NSC that insofar as Umalusi can judge, a candidate writing the NSC in a given year would have received similar results had they written in another year. The best measure Umalusi has for regulating this is the spread of results in the various subjects from previous years. When the raw marks from the examination scripts deviate significantly from this, they use carefully considered adjustments to make the corrections they deem necessary. In so doing they are governed by strict rules that are there to prevent the manipulation of outcomes. The former chairman of Umalusi summarised this to Parliament in 2011, saying: “Adjustments cannot be made in excess of ten percent and the raw marks of candidates could not be doubled or halved.”
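Taken literally, the two constraints in that quote amount to a simple clamping rule. The sketch below is purely illustrative: Umalusi’s actual statistical moderation works on whole subject distributions, not individual marks, and the function name and behaviour here are assumptions drawn only from the quoted summary.

```python
def cap_adjustment(raw_mark: float, proposed_mark: float, max_shift: float = 10.0) -> float:
    """Illustrative only: clamp a proposed adjusted mark to the limits the
    former Umalusi chairman described to Parliament (no shift beyond 10
    percentage points; a raw mark may not be doubled or halved)."""
    # Limit the adjustment to at most +/- max_shift percentage points
    shift = max(-max_shift, min(max_shift, proposed_mark - raw_mark))
    adjusted = raw_mark + shift
    # A mark may not be more than doubled...
    adjusted = min(adjusted, raw_mark * 2)
    # ...nor cut to below half its raw value
    adjusted = max(adjusted, raw_mark / 2)
    return adjusted

# A proposed 15-point lift is capped at 10 points: 30 becomes 40, not 45
print(cap_adjustment(30.0, 45.0))  # 40.0
# For very low raw marks the doubling rule bites first: 4 becomes 8, not 14
print(cap_adjustment(4.0, 20.0))   # 8.0
```

Note how the two rules interact: for weak candidates the doubling cap binds before the ten-point cap does, which is precisely why large upward adjustments help mid-range candidates far more than the very weakest.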

Umalusi must first apply its mind to the cohort of learners writing the examinations to determine whether its makeup is different enough from previous years to have produced very different results. For 2015 Umalusi concluded that the presence of identified “progressed learners” was not enough to have unduly influenced the overall results, and therefore applied standard rules when looking at the raw marks, as if 2015 had been a year like any other. For this it has been criticised by prominent education researcher Nic Spaull, who argues strongly that 2015 was different from previous years and that different rules should have been applied in calculating any adjustments. In particular, he argues that the larger cohort was demonstrably weaker and that the final results should have reflected this; otherwise the risk is that NSC certificates are awarded to candidates who would not have qualified for them in other years. Umalusi, for its part, dismissed the “progressed learners” as a significant influence on the results, pointing instead to a number of other possible factors. It did, however, signal the need for an urgent review with the DBE so as to have lessons in place for 2016.

The extent of the adjustments Umalusi deemed necessary

In the event, Umalusi looked through its standard correcting lens to review what it acknowledges was a year in which “learner performance departed quite significantly from historical trends”, something which required “unprecedented” adjustments to produce a final set of results it deemed fair. The weakness of the raw marks was such that upward adjustments were made in half the subjects written. Marks were adjusted upwards in most home language subjects, English first additional language, physical sciences, life sciences, maths, business studies and economics, and to the most marked degree in accounting, geography, history and maths literacy. No marks were adjusted downwards. By contrast, in assessing the IEB results, Umalusi made adjustments in 12 subjects, all of them downwards.

In most NSC subjects, Umalusi’s statisticians have produced final results that closely mirror the spread of marks in the performance curves of previous years – so closely that it is difficult to distinguish the respective lines on a graph. But what does all this mean at the level of actual learning outcomes? And whatever it does mean, something very different is going on in mathematics learning.

What happened in maths literacy?

The biggest shift was in maths literacy, as was also the case in 2014. In that year, raw scores were adjusted to lift the pass rate in maths literacy from 51 to 84 percent. In 2015 the shift was the maximum possible, moving the pass rate on raw scores from 38 percent to a more respectable 71 percent. In the process, maths literacy, once thought of as an “easy option”, became one of the five subjects hardest to pass in 2015.

In two years there has been nothing short of a catastrophe in maths literacy in respect of the final examinations. What Umalusi seems to have tried to do in response was to keep the total number passing maths literacy as close as possible to that of previous years. In the process, however, significant numbers of learners whose raw marks revealed limited numerical ability were upgraded to passes. Calculations suggest that the number moved through in this way exceeded 100 000 in 2014 and 130 000 in 2015. In each of these years, one third of all learners sitting their final maths literacy examinations were upgraded from a fail to a basic pass by the adjustment of the raw marks.
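As a rough sanity check on these figures: the gap between the 38 percent raw pass rate and the 71 percent adjusted pass rate is itself about one third of the cohort, consistent with the estimate above. The article does not state the size of the 2015 maths literacy cohort, so the back-calculated figure below is only an implied estimate, not an official number.

```python
raw_pass = 0.38        # 2015 maths literacy pass rate on raw marks
adjusted_pass = 0.71   # 2015 pass rate after the Umalusi adjustment
upgraded = 130_000     # article's estimate of learners upgraded in 2015

# Share of the cohort lifted from fail to pass by the adjustment
upgraded_share = adjusted_pass - raw_pass
print(f"Upgraded share: {upgraded_share:.0%}")  # about one third

# Implied size of the maths literacy cohort (an estimate only)
implied_cohort = upgraded / upgraded_share
print(f"Implied cohort: {implied_cohort:,.0f} learners")
```

On these assumptions the implied cohort is a little under 400 000 learners – well over half of the total 644 536 NSC candidates, which underlines how widely the adjustment reached.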

In maths literacy there has also been a hollowing out of the middle ranks, even after the Umalusi adjustments. While 71 percent of candidates may have cleared the minimum pass mark, there has been a sharp fall at every other level of performance, with the exception of the very top learners. The pattern is clear: more and more learners are struggling with maths literacy, with significant numbers doing very badly.

And in maths?

Umalusi adjusted maths marks on a more limited basis, but the overall pass rate in the subject fell from 53,5 to 49,1 percent. Both maths and maths literacy show a strong downward slide in learner performance in 2015. Weakness at the bottom has increased in both subjects, but especially in maths, where one third of those who wrote received marks under 20 percent, and eighty percent of learners scored less than 50 percent. In other words, there has been a continual, and now accelerating, decline in performance across the board. No wonder the DBE reviewers referred in their commentary to “a disappointing decline” in performance between 2014 and 2015, despite feeling that both maths papers had been balanced and fair, affording weaker candidates a good chance of passing.

In trying to understand what lies behind this, it is tempting to point to the Curriculum Assessment Policy Statements (CAPS) and the manner in which this attempt by the DBE to help schools cover the curriculum in each subject could be having especially negative outcomes in maths and maths literacy. CAPS has been in place for grade 12 learners for the last two years, exactly the period when maths performance declined so markedly, both before and after the Umalusi adjustments. As the DBE reviewers are at pains to point out (and, to their credit, they make specific remedial suggestions), many of the errors in maths “have their origins in poor understanding of the basics and foundational competencies taught in the earlier grades”. Yet CAPS gives teachers little space to cover these in the depth, and for the duration, their learners clearly require. It whistles them on to the next area of learning before their charges have adequately mastered what went before.

It is surely high time that the country took a hard look at what passes for teaching and learning in both maths and maths literacy and explored some drastic changes. More of the same – which the DBE is working hard to implement in many areas – clearly isn’t good enough. Too many learners are being shunted through (and out of) the system under the pretence of having been given basic numeracy skills. We should stop wasting their time and offer them alternative qualifications that can document other real, relevant learning gains. Learners should be able to exit mathematics altogether at some point in the system. This would free up capable maths teachers to focus their skills on smaller groups of learners who have the aptitude for, and interest in, higher-level maths.

Performance by provinces

The province with the largest number of learners, KwaZulu-Natal (KZN), did badly for the second year in a row on almost every measure. Indeed, it is no exaggeration to talk of a collapse across the board in this vital province. A glance at the percentage pass rates of KZN schools in 2015 tells a gloomy story. In just one year the number of schools passing fewer than 20 percent of their learners doubled, and those in the next category (under 40 percent) increased by 76 percent. At the top end the province had a third fewer schools registering a full house (a 100 percent pass rate), and the same decline was noted among those achieving pass rates of 80 to 100 percent. KZN, with its far larger numbers, is second only to the Eastern Cape as the province with the weakest school profile: less than half its schools were able to pass 60 percent or more of their learners. The same pattern of sharp decline is also evident in key subjects for the second year in a row.

By contrast, there is a group of five provinces that are holding their own and consolidating around earlier gains. One province (Western Cape) continues to push the edge of the envelope academically for all its learners. The Western Cape was the only province to increase the proportion of its learners gaining Bachelor passes. It also increased the number of schools passing more than 80 percent of their learners, and all of its districts registered pass rates of over 80 percent.

Looking at how learners from the various provinces fared in maths and maths literacy throws up further disturbing data. Half of the learners who failed maths literacy live in two provinces – KwaZulu-Natal and the Eastern Cape. In maths these two provinces contributed 60 percent of all those who failed. All learners are obliged to do either maths or maths literacy, a policy that is having devastating consequences for learners in the two weakest provinces. What is happening in schools in these provinces to produce these frightening outcomes and how can they be reversed when the weakness is so widespread?

Observations on the 2015 results

Frustrating as it is not to be able to draw out the full implications of the 2015 NSC, one can note the following:

The DBE has developed a range of interventions and programmes with the aim of improving teaching and learning. These measures are implemented by the provinces. It is clear that while some manage this fairly well, others are really struggling.

1. The Western Cape is clearly getting things right. It is encouraging to see how provinces like Mpumalanga have also managed to register progress, while others (Gauteng, the Free State and North West) are holding their own in stronger positions. Limpopo managed its significantly increased grade 12 numbers relatively well in 2015, but cannot be expected to sustain this without additional resources. The Eastern Cape, bedevilled by internal politics, needs urgent and extensive help, as does KwaZulu-Natal.

2. The thinking behind “progressing” candidates needs to be revisited. The DBE’s desire to address the challenge of falling throughput in the final years of high school is welcome, but this policy needs more care, planning and resources to avoid over-burdening already struggling schools. In particular, Umalusi needs to be fully engaged and well informed so it can adjust its assessment measures with greater accuracy and finesse.

3. Maths and maths literacy learning and teaching are in crisis. This needs urgent, high-level attention. The Umalusi adjustments are camouflaging the desperate state of ignorance with which large numbers of grade 12 learners are left to fend for themselves in the examination hall. Too much energy is going into failing efforts, and too much of the school day is being consumed by the impossible task, for both teachers and learners, of producing candidates capable and confident in both levels of maths. The structure of the NSC, the selection of subjects and the curriculum need to be radically reviewed to stop this wastage.

Margie Keeton
February 2016